I have audited hundreds of websites over the years and I think it’s fair to say that the same issues have a tendency to crop up repeatedly and in different guises. From missing basics, such as GA tracking, to more advanced common mistakes, such as multiple versions of the same page without a defined canonical, the SEO audit process can sometimes feel a little repetitive. That said, no two websites are the same, and the main aim of an audit is not just to identify the issues but also to prioritise them. With this in mind, I thought it might be helpful to outline some of the common mistakes that I discover when auditing websites.
Contents of this post
This post is a companion piece to my SEO Audit Checklist. As such, I have categorised all of the common mistakes using the same 17 key stages of analysis. This is not intended to be an exhaustive list of auditing checkpoints, so if that’s what you’re looking for, take a look at my other post on this topic. As with my other posts though, I’ll happily accept any feedback or suggestions for things to add. If I have missed a common mistake that you find with websites, feel free to send me a message and I might add it to the list below if it’s interesting enough.
You can use the following jump links to head directly to the areas of the audit process where I often find problems that need to be fixed.
- Google set-up mistakes
- Market research mistakes
- Crawling, indexing and ranking related issues
- Crawler error issues
- Page rendering issues
- Essential technical SEO mistakes
- Website performance issues
- Website security mistakes
- Site architecture mistakes
- Structured data mistakes
- Consistency and compliance issues
- On-page SEO mistakes
- Website accessibility & W3C Standards issues
- User Experience and CRO issues
- International SEO mistakes
- Local SEO mistakes
- Off-page SEO issues
A final introductory word. Thanks to all the smart folk who have contributed to the knowledge base via the links I’ve referenced below. I hope these links help people who are new to SEO to fix the common issues with their sites.
Google set-up mistakes
You can’t be successful on Google unless you are set up correctly with their webmaster-related services. Over the years, I have never ceased to be surprised by the number of clients who haven’t set up a Google Analytics account (or any other form of analytics, for that matter). This aside, here are the top mistakes I often find with the Google set-up for clients.
1. Only one unfiltered view set-up in GA
This is a very common mistake made by a lot of website owners: just one default, unfiltered view of website traffic is in place. For businesses with low levels of traffic, this inevitably means that a lot of the data is made up of staff visits and bot hits.
How to fix this
- Basic Google Analytics Filters For Every Site
- How to Identify and Remove Bot Traffic in Google Analytics
2. Goals not set-up in GA
You can’t measure the success of an SEO campaign if you can’t see whether or not there has been an increase in conversions, such as form completions or sign-ups. Defining and setting up goals should be a top priority task before starting any work for a new client.
How to fix this
3. Google Tag Manager is installed but scripts such as GA and Hotjar are not deployed through it
If configured correctly, firing all scripts via Google Tag Manager can help to improve website performance. An issue I see quite often is both the GTM and GA scripts in the source code. If GTM is installed, all other scripts should be deployed through it.
How to fix this
4. Event data is not pulling through to GA from GTM
Websites frequently change but often the parties involved in managing a site forget to update the triggers and tags in GTM in line with those changes. This results in tags failing to fire because the trigger references a CSS class, anchor text or some other value which no longer exists.
How to fix this
5. eCommerce tracking not set-up
Remarkably, I have worked with several eCommerce clients, over the years, who did not have any kind of eCommerce tracking in place when I started working with them. This meant we could initially only see the volume of orders they were receiving, via the ‘Order Complete’ goal, but very little else. For clients on WooCommerce or a similar platform, this is relatively straightforward to set up. However, custom-built eCommerce sites require developer time to get this working properly.
How to fix this
- How to Set Up eCommerce Tracking with Google Tag Manager: Full Guide
- Enhanced eCommerce tracking in Google Analytics – Beginners Guide
6. HTTPS version of the site not added to Google Search Console
This is a very common mistake that I have encountered most frequently since Google set a mid-2018 deadline for getting the web moved over to HTTPS. On the positive side, webmasters have listened to Google and secured their sites. On the negative side, they have forgotten to create a new property for the HTTPS site in GSC. This means when I come to audit their site, I only have the HTTP data to look at.
How to fix this
7. Sitemaps not added or out of date in GSC
Forgetting to add XML sitemaps to Search Console means a host of valuable Google coverage information can’t be seen. One of the most common mistakes I see is a sitemap added, usually /sitemap.xml, but that map no longer exists because the client has moved to WordPress and installed Yoast (forgetting to add the typical /page-sitemap.xml or /post-sitemap.xml files associated with this plugin).
How to fix this
8. A disavow file has been uploaded to GSC containing editorially earnt backlinks
This is a common legacy issue following the initial round of panic in the wake of Google Penguin way back in 2012. Many website owners automatically assumed that a slight dip in rankings could be Penguin related, leading to a mass purge of perfectly acceptable backlinks.
How to fix this
9. Google Ads has not been linked to Google Analytics
Properly linking the two platforms is not just beneficial for the PPC specialist who is working on a site, it has huge SEO advantages too. Being able to see the full search query data within GA, and the engagement metrics for specific search terms, can help to inform strategic SEO decisions. When auditing sites, I often notice that the link has either not been set-up or not configured properly.
How to fix this
- Link/unlink Google Ads and Analytics
- Linking Google Ads to Analytics: A Guide to Using Google Ads & Analytics Together
10. Google My Business profile(s) not verified
Local pack SEO efforts are severely hampered by this very common issue with clients’ GMB profiles. This is an urgent fix for any business that operates at a local level.
How to fix this
11. Google My Business profile contains keyword-stuffed business name
Sadly, it is still possible to use underhand tactics to rank better in Google’s local pack. We’ve all seen the egregious listings titled [KEYWORD] [KEYWORD] [PLACENAME] instead of just the actual name of the business. It’s also a frequent issue I have encountered when auditing a new client’s website, usually because their previous SEO agency had modified the profiles without their knowledge. To help make the web a fairer place, and to help restore some integrity to the local pack, it’s always best to advise the client to de-spam their listings (and report their competitors using the same bad practice).
How to fix this
12. Poor Google reviews left unanswered
This is possibly one of the worst mistakes that any business can make regarding their Google My Business profiles. Yet, it is a common issue that I see time and time again. The client might have generally favourable reviews, however, there are one or two killer one-star takedowns that have been left unanswered. Many people (myself included) pay attention to reviews when ordering local services, so it’s vital that the client has a system in place for responding to customer feedback.
How to fix this
————————————————————————————————
Market research mistakes
When assessing a client’s website, it often becomes apparent early on whether it was built with SEO in mind, or if SEO has been “tacked on” as an afterthought. Sites that are struggling to make any headway in terms of organic visibility are often in that situation because they have not conducted adequate market research prior to the build phase (or sometimes not conducted any research at all). Here are the most common mistakes that I find at this stage of the audit.
13. No keyword research has been carried out
This is quite a common issue that I see with smaller businesses where the client has “had a go” at optimising the website themselves. Usually, the conversation goes something along the lines of “have you carried out any keyword research in the past?” to which the answer is “I didn’t feel that was necessary as it’s fairly obvious which terms people will use to find us”. Nine times out of ten it turns out that they are optimising for a key-phrase that either A) they will never rank for or B) isn’t matched closely enough to user intent, resulting in visits from people who are unlikely to convert.
How to fix this
14. The client has an incorrect understanding of search volumes
Frequently when keyword research has been carried out, the data has been misunderstood causing organic traffic projections to be incorrect. The most common misapprehension is the thought process that a #1 position on Google will send anything up to 100% of the identified search volume to the client’s website. This useful guide to clickthrough rates by position shows that 30% might be an ambitious target, and that’s assuming #1 positions across the board.
How to fix this
- How to Forecast SEO Potential With Google Data Studio
- SEO Forecasting – How to Calculate the Return of your SEO Activity
15. The client wants to target unrealistic keywords
Every SEO consultant on the planet has had this conversation at some point, before taking on the audit for a new client. You ask them what they would like to achieve through on-going SEO activity, and they respond to say “we’d like to be #1 for [insert trophy keyword]”. More often than not, the keyword they mention is a top-of-the-funnel phrase with poor conversion potential.
How to fix this
16. The client’s website is unappealing to the target demographic
Over the years I have seen websites in the entertainment niche that look like they were designed for an accountancy firm. I’ve seen eCommerce stores trying to attract a younger customer whilst displaying a distinctly 90s web vibe. I’ve seen websites that look so glossy and stylish that they end up alienating their target customers, who assume they won’t be able to afford the client’s services. Having a web design based on solid market research and user testing is, therefore, an important foundation stone for future SEO success.
How to fix this
- Research-Backed Web Design Tips: How to Design a Website That Works
- How to Conduct an Effective Market Research for Website Design
17. The client’s website demonstrates low levels of E-A-T compared to key competitors
In the market research phase of an audit, it’s important to understand the specifics of the vertical that the client operates in. Businesses in sectors such as health, finance, law and science are held to much higher standards compared to other industries. Adhering to legal compliance, and adequate qualifications to conduct the work, are imperative for any of these types of businesses. One of the most common issues I see with clients in these sectors is a lack of attention to detail regarding factors that could improve their E-A-T. This can range from sites not having author profile pages to content not being attributed correctly to the author’s central page.
How to fix this
- How to Increase Expertise, Authority and Trust (E-A-T) Signals on your Website
- How YMYL Websites Can Improve E-A-T
- What Does It Mean to Have Good E-A-T?
18. The client is overly reliant on organic search compared to key competitors
This is a big mistake made by a lot of start-up businesses, especially those on a lower budget. The assumption is “SEO is free”, so all of the effort is placed on trying to rank the site in the organic results. However, SEO is not something that exists in isolation. In order for it to be truly effective, it needs to work in tandem with other promotional activities such as PR, social media, paid advertising and email. That’s not to say those things are ranking factors, but they help to create the necessary interest around a website that can lead to improved organic performance (backlinks, for example). The deadliest part of this mistake, though, is the overreliance itself: Google’s frequent algorithm changes sometimes hit sites hard, even sites that appear fully compliant with the guidelines. For any business that is 70%+ reliant on this channel, such an event has the power to close the company down.
How to fix this
- 7 Digital Marketing Strategies and How to Plan Your Own Campaign
- A Step-By-Step Guide to Structuring a Digital Marketing Plan
19. The client is not competitive in terms of value proposition & pricing
Frequently I have worked with clients who have a website in receipt of a considerable amount of traffic, but the site fails to generate enough conversions. They have then made all sorts of attempts to convert more of their visitors through CRO and UX testing but still not made much headway. Sometimes this is because they have not paid enough attention to what is going on in their market. So, for example, they have overestimated their brand value and assumed that they can charge the same premium fee as the established market leader. Or, on the other hand, they have not recognised the discounting or coupon strategy used by other eCommerce platforms in their niche. Also, one of the most common mistakes is simply not having a clear value proposition. Answering the question “why should someone buy from you rather than someone else?” is, in fact, an important SEO question. Because if this question is not asked and answered early on, all of the SEO activity in the world might not help them to win more business.
How to fix this
————————————————————————————————
Crawling, indexing & ranking related issues
Having a website that is open to be crawled by major search engines is the first, most crucial, stepping stone to achieving search visibility. Yet it’s surprising how many sites fail this basic requirement, most frequently due to a technical oversight. Here are some of the most common errors I encounter in relation to crawling, indexing and ranking.
20. The website is blocked in robots.txt
Nine times out of ten this is a botched migration issue, usually attributable to the client’s web development agency. It often occurs in scenarios where the web developer migrates everything from the staging server to the production server, inclusive of all the root files. Therefore, the disallow directive contained in robots.txt (to prevent the staging site from getting indexed) gets carried over to live, preventing search engines from being able to crawl the live site. I encountered this situation a few years ago where a new client approached us, scratching his head, trying to figure out why he was losing rank so catastrophically. We went in and edited his robots.txt file, his rankings bounced back, and he thought we were geniuses. We let him think that, to be honest.
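To illustrate, the culprit is usually a blanket disallow carried over from the staging file. A hedged example of what the broken live robots.txt tends to look like:

```
User-agent: *
Disallow: /
```

Removing the `Disallow: /` line on the production file (or changing it to an empty `Disallow:`) is usually all that is needed to let search engines back in, though it is worth re-testing the file in GSC afterwards to confirm.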
How to fix this
21. The staging site is open to be indexed
On the opposite side of the fence to the error above, this is where the client’s web development agency hasn’t taken any measures at all to prevent the staging version of their website from being accessed by search engines and other crawlers. This is a far more common issue that I see with many sites that I audit. It’s particularly problematic for eCommerce sites, where there is a risk that search engine users will end up clicking on a staging site link, navigating through the sales funnel and attempting to make a purchase. This is obviously a very bad situation indeed. Any web development firm employees reading this – make sure you have a system in place for protecting your client’s staging environments!
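As a rough sketch of one common approach (assuming an Apache server; the file path is a placeholder), staging environments can be locked behind HTTP authentication, which keeps out crawlers and casual visitors alike:

```apache
# .htaccess on the staging server only, never on production
AuthType Basic
AuthName "Restricted staging site"
AuthUserFile /path/to/.htpasswd
Require valid-user
```

An `X-Robots-Tag: noindex` response header is a lighter-touch alternative, but password protection is the only option that also stops real users stumbling into a staging checkout.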
How to fix this
22. Robots.txt is missing xml sitemaps or references old maps
Including a reference to XML sitemap locations within robots.txt makes it easier for search engines to find the maps and crawl the URLs contained within them. There are two very common mistakes that I see clients make on this front: 1) they have not included any XML sitemap locations at all; 2) the robots.txt file contains references to sitemap locations that no longer exist.
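The fix is usually a one-line addition per map. A minimal illustration (with placeholder URLs) of a robots.txt that references the sitemaps the site actually serves:

```
User-agent: *
Disallow:

Sitemap: https://www.example.com/page-sitemap.xml
Sitemap: https://www.example.com/post-sitemap.xml
```

The Sitemap directive takes an absolute URL and can be repeated, so stale entries can simply be swapped for current ones.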
How to fix this
23. Robots.txt is blocking URL parameters
This is a common mistake that I see with a lot of sites that are trying to deal with large scale duplicate content issues. As recently confirmed by Google’s John Mueller, robots.txt should never be used to block URLs with parameters. Instead, use rel=canonical and Google Search Console’s URL Parameter tool.
How to fix this
24. The website is near impossible to crawl in SF
URL parameters can cause difficulties for search engines when crawling a site, if the site is not coded properly. So, it’s important to address the fixes noted above. When it comes to crawling in Screaming Frog (or your preferred crawler tool) it is usually straightforward to exclude such URLs from the crawl results. However, over the years, I have audited sites featuring very complex URL structures outputting unlimited duplicate versions of themselves. These ‘spider traps’ can also make it very tricky, or even impossible, to get a clean crawl of the site. Having a site with never-ending URL traps is obviously very bad for SEO, zapping crawl budget and organic visibility. It’s a priority fix whenever I encounter it.
How to fix this
25. SEO-unfriendly URLs appearing in search engine indices
This is an extremely common mistake that affects perhaps 95% of the websites that I have audited over the years. It’s also a high priority fix, as an abundance of low-quality / thin content / user-unfriendly URLs may place the client’s website at risk of being penalised by one of Google’s content related algorithm components (Google Panda, for example).
How to fix this
- Thin Content: How to Diagnose and Fix the Issue
- Fixing Index Bloat: 9 Ways to Deindex Pages from Google
- Index Bloat: Why Deleting Website Pages Is Great for SEO
26. SEO unfriendly URLs need to be removed from XML sitemaps
This is another very common issue, connected to the one above. The problem usually resides in the fact that someone has lazily reeled off a sitemap using a free generator like this one, but hasn’t bothered to check if all of the resources cited in the map are ones that should be included in search engine indices. The process of fixing this issue is made easier for sites in WordPress via plugins such as Yoast, where adding ‘noindex’ to a resource also removes it from the map. For custom-CMS built sites, consideration needs to be given to developing a dynamic sitemap which only includes the resources that the client wants search engines to crawl and index.
How to fix this
27. URLs missing from XML sitemaps
In contrast to the issue above, another common problem I see, particularly on larger websites, is the omission of valuable resources from the XML sitemaps. Sometimes this can be caused by the fact that the resource is not linked to from anywhere (a so-called “orphan” page), so when tools are used to dynamically generate the map it doesn’t get crawled and is excluded. Other times it is because the map for the site is static, rather than dynamic, so new pages fail to appear within the XML when they are published.
How to fix this
28. Search engines are ranking incorrect pages
When it comes to analysing how a website ranks, one of the most common mistakes I see is a lot of overlap and keyword cannibalisation around the core phrases a client is targeting. So, in order to dominate a specific keyword group, the client has set-up main site pages, informational pages, blog posts, video transcripts and a myriad of other content types; all featuring title tags and copy that overlap greatly. This forces search engines to choose what they deem to be the most relevant resource, often resulting in non-conversion focused pages outranking the primary landing page. Often, in this scenario, you will see ranking flux between the competing pages, where one page will rank highly for a certain key-phrase over a certain period of time, then drop out of the results and get replaced by an alternate page. When the primary landing page converts at a much higher rate compared to the secondary page, this is a significant issue.
How to fix this
- How to Find and Fix Keyword Cannibalization Issues (in Seconds)
- What to Do When Google Is Ranking the Wrong Page for Your Keywords
29. The client is not tracking any of their key-phrase ranking positions
Whilst I personally encourage clients not to obsess over key-phrase rankings (targeted traffic and conversions are always the key metrics), it is useful to see the keyword ranking history for a site before I embark on an audit. If certain keywords drop following an algorithm update, it’s very useful to be able to see precisely which ones so that I can focus my analysis on the relevant pages. However, frequently, when working with a new client, they do not have any historical key-phrase ranking data. Often, they don’t have GSC set-up either, effectively blacking out a large part of the data that can help to diagnose why they have lost traffic.
How to fix this
————————————————————————————————
Crawler error issues
High volumes of crawler errors waste crawl budget and provide a bad experience for the end-user. The most common mistakes that I see on this front are as follows.
30. There is an excessive volume of 404 errors
Situations that can contribute to surges in 404 errors include: poorly handled website migrations (where 301 redirects have not been implemented correctly); expired products or listings on eCommerce, property or classified ad style sites; developers or website editors changing content on the site without adjusting the internal links; external sites changing the structure of their URLs causing outbound links to break.
How to fix this
31. There is an excessive volume of 5xx status server errors
High volumes of server errors cause frustration for end-users of a site and will contribute to diminished search visibility, over time, if they are persistent and left unfixed. I often see clients with sites hosted on inadequate platforms, where a priority task involves either upgrading their hosting solution or moving to a new server.
How to fix this
- How to fix a 500 Internal Server error
- 5xx server errors: why they occur, when, and what you can do about them
32. There is an excessive volume of internal 301 redirects
This situation most frequently occurs when a website is upgraded on a staging environment, all of the URLs are changed and 301 redirects are put in place when the new version is set live. Many developers assume that because 301 redirects are in place, nothing else needs to happen. However, it’s best practice to update all of the internal links within the site, so that they point to the new locations of the pages. This helps to preserve crawl budget, minimising the number of URLs that search engines crawl when they visit the site.
How to fix this
33. 302 redirects are in use when a 301 would be better
The purpose of a 302 redirect can be ascertained, helpfully, by its description “Moved Temporarily”. Likewise, the 301 redirect can easily be understood by its description “Moved Permanently”. A very common mistake I see with sites is the implementation of incorrect redirects, where a 301 is required rather than a 302, or vice versa. One common example I see implemented by developers is a 302 redirect from the non-www version of a site to the www version (or again, vice versa). If a site has hundreds of backlinks pointing to the non-www version, which then 302 redirects to the www version, the www version will not benefit fully from those links.
How to fix this
34. Pages crawled contain malware
Google takes security very seriously. If they crawl your site and discover malware on any of your pages, the issue will be flagged in Google Search Console under “Security & Manual Actions”. It’s telling that they place these two items together, as issues in both areas can have a devastating effect on a client’s search visibility. A common mistake I see with websites that have been hacked is a lack of attention to detail in fixing the issues and securing the site.
How to fix this
- What to Do If Your Website Is Hacked & How to Remove Malware and Fix It
- Recovering Website SEO After a Hack
————————————————————————————————
Page rendering issues
Google’s technology has advanced rapidly, over the years, enabling them to better understand how web pages are structured and displayed to the end-user of the website. That said, I frequently encounter errors and issues with the way websites are coded, effectively preventing Google and other search engines from being able to fully render the pages on the site.
35. CSS & JS files are blocked
Way back in 2015, Google announced their advanced abilities in rendering web pages, stating “as long as you’re not blocking Googlebot from crawling your JavaScript or CSS files, we are generally able to render and understand your web pages like modern browsers”. Yet still, at the time of writing this, I regularly see sites that have blocked access to their CSS & JS.
How to fix this
36. Site features render blocking CSS & JS
This is a very common issue, where all CSS & JS files are loaded in a render-blocking way when some could be deferred to help speed up page load times. Pagespeed is an important ranking factor, so this common issue is usually a high priority fix for a new client.
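To give a hedged example of the kind of change involved (the script path is a placeholder), a script that does not need to run before the page is painted can be deferred rather than loaded synchronously in the head:

```html
<!-- Render-blocking: the browser pauses parsing to fetch and execute this -->
<script src="/js/analytics.js"></script>

<!-- Deferred: downloaded in parallel and executed after the HTML has been parsed -->
<script src="/js/analytics.js" defer></script>
```

Which scripts can safely carry `defer` (or `async`), and which stylesheets can be split into critical and non-critical CSS, depends entirely on the individual site, so changes like this need testing rather than blanket application.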
How to fix this
37. The client has an entirely JS-based site but hasn’t configured the prerender correctly
The majority of the sites I have worked on over the years tend to be traditional HTML-based properties. Rendering these types of sites is much more straightforward for Google compared to JS-based sites, which are growing in popularity. The rendering process is a lot more complicated for sites built entirely with JS. Take a look at this excellent guide to JavaScript SEO to better understand the SEO challenges with this approach. A common issue I see with JS sites is the lack of a pre-render service or a misconfiguration in its set-up.
How to fix this
38. Site has browser compatibility issues and renders differently in each browser
If a site looks odd when using a certain browser, then the chances are Google may also encounter issues when parsing and rendering the pages on the site. Cross-browser compatibility issues are common with a lot of websites and should be flagged in the initial audit.
How to fix this
————————————————————————————————
Essential technical SEO mistakes
I use the phrase “essential technical SEO” to refer to the technical SEO basics that must be adhered to regardless of the site set-up. Take a look at my audit checklist for a list of typical issues. The most common mistakes I see, regarding technical SEO, are usually as follows.
39. The www and non-www versions of the site are accessible
Issues around canonicalisation are among the most frequent technical SEO problems that I encounter. I still regularly see sites where both the www and non-www version are accessible.
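Assuming an Apache server with mod_rewrite enabled (and example.com standing in for the real domain), a sketch of the usual fix is a single 301 rule that forces one canonical hostname:

```apache
RewriteEngine On
# Permanently redirect non-www requests to the www version
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```

The same pattern works in reverse if the non-www version is the preferred one; the important thing is that only one version resolves with a 200.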
How to fix this
40. There are multiple versions of the same page with no rel=canonical
Another very common issue with sites is errors in the application of the canonical link element. Often sites feature filtering systems which generate many dozens of variations of essentially the same content. Sometimes the canonical does not point to the correct, definitive version of the page. Other times, the canonical link element (also referred to as the canonical tag) is missing altogether.
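For reference, the canonical link element is a single line in the head of each variant, pointing at the definitive URL (shown here with a placeholder address):

```html
<!-- On /shoes/?colour=red&sort=price and every other filtered variant -->
<link rel="canonical" href="https://www.example.com/shoes/" />
```

Every filtered or parameterised variation should reference the one preferred URL; a canonical that points at the variant itself, or at the homepage, defeats the purpose.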
How to fix this
41. Legacy redirects in .htaccess causing redirect chains
This is another common issue that occurs following a site migration. The URLs all change on a site, requiring 301 redirects, so the developer adds all the new redirects to .htaccess but forgets to cross-reference them with the legacy redirects that are already in the file. This results in redirect chains and inaccessible web pages.
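A simplified, hypothetical illustration (paths and domain are invented) of how chains creep in, and how they are collapsed:

```apache
# Legacy rule left over from a previous migration
Redirect 301 /old-page/ https://www.example.com/new-page/

# New rule added later without checking the file: creates a two-hop chain
Redirect 301 /new-page/ https://www.example.com/latest-page/

# Better: each old URL points straight at the final destination
Redirect 301 /old-page/ https://www.example.com/latest-page/
Redirect 301 /new-page/ https://www.example.com/latest-page/
```

Cross-referencing new redirects against the rules already in the file, so every hop resolves in one step, is the heart of the fix.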
How to fix this
42. Redirects are being added directly to .htaccess and via a plugin
This issue causes a lot of confusion when it comes to trying to figure out the problems a site has with redirects. Where possible, sites should avoid using plugins and work directly in .htaccess.
How to fix this
43. 410 rules are in .htaccess but not working (pages generate a 404)
A 410 status tells search engines that a certain resource has gone completely. It is useful to apply this status to URLs that the client wants to quickly deindex, or on sites which feature a lot of expired content (estate agents’ property listing pages, for example). Often, when assessing sites, I see that 410s have been added to .htaccess but when you visit the 410’d resource it returns a 404 error.
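Assuming an Apache set-up (and a placeholder path), a quick sketch of two valid ways to return a genuine 410, which can then be confirmed with a header checker or a crawl:

```apache
# mod_alias syntax: returns 410 Gone for the exact path
Redirect gone /expired-property-listing/

# mod_rewrite equivalent, using the G (gone) flag
RewriteEngine On
RewriteRule ^expired-property-listing/?$ - [G,L]
```

If the URL still returns a 404, the usual suspects are rules placed after a catch-all rewrite (such as a CMS front controller) or a pattern that does not match the requested path exactly.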
How to fix this
44. Pages which should be included in search indices have ‘noindex’ applied
This fairly common issue is often related to a purge of thin content that was applied to a site following one of Google’s content quality algorithm updates (such as Google Panda). The client’s SEO agency or developer noindexed large sections of the site, then time and effort was put into improving the quality of the excluded thin content pages. However, when the quality of the pages became sufficient, the noindex directive was not removed.
How to fix this
- Accidental Noindexation Recovery Strategy and Results
- Robots.txt Noindex Update: Everything SEOs Need to Know
45. NoIndex, NoFollow has been applied where NoIndex, Follow would be better
Heavy usage of NoIndex, NoFollow can prevent link equity from being distributed properly through the site. When removing pages from search indices, most of the time it is best to use Noindex, Follow, so that pages linked to from the excluded page continue to receive the benefit of those links.
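For reference, the difference is a single value in the robots meta tag:

```html
<!-- Drops the page from the index but lets link equity flow to the pages it links to -->
<meta name="robots" content="noindex, follow" />

<!-- Drops the page AND tells crawlers to ignore every link on it -->
<meta name="robots" content="noindex, nofollow" />
```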
How to fix this
46. Site fails mobile-friendliness checks
Google’s move to mobile-first indexing reiterated the need for webmasters to ensure that their sites work perfectly for mobile users. Yet, despite Google enabling mobile-first indexing for all new sites since July 2019, many still fail basic mobile-friendliness testing.
How to fix this
47. Site features an excessive volume of soft 404s
Soft 404s often affect larger websites and occur when thin content pages confuse Google: the pages return a 200 status when you check the headers, but are labelled as soft 404s in GSC. The issue also occurs when search engine users are redirected to a resource that is irrelevant to the topic they searched for. For example, when an expired page is redirected to the homepage.
How to fix this
48. Site uses meta refresh
A meta refresh is a roundabout way to redirect one resource to another. Google’s John Mueller says it works fine but is not recommended. It’s inadvisable to use meta refresh as it causes a delay in page rendering and does not provide a good experience for the end-user.
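If you’re not sure whether a site is using one, this is the pattern to look for in the head (a zero-second refresh pointing at another URL, shown here with a placeholder destination):

```html
<!-- Meta refresh "redirect": delays rendering and passes no clear signal -->
<meta http-equiv="refresh" content="0; url=https://www.example.com/new-page/" />
```

A server-side 301 to the same destination achieves the intended result without the rendering delay.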
How to fix this
49. No custom 404 page in place
Generic 404 error pages are bad for users, particularly if the error page is not connected to the site design in any way (preventing users from following links to other pages). A custom 404 is both a good technical SEO and UX fix that every site owner should consider developing.
How to fix this
————————————————————————————————
Website performance issues
Having a fast loading website is not just desirable, it is imperative for any serious business or service provider. Yet one of the most common issues I encounter when auditing a new client’s website is poor performance scores. Specifically, the following problems tend to crop up most frequently.
50. Performance plugins have been installed but are not configured properly
I would say about 80% of the sites that I assess are in WordPress. One of the benefits of this platform is the abundance of plugins that are available to fix common SEO-related issues. Nevertheless, plugins are not a fix-all solution. You can’t just switch them on and hope for the best. You need to extensively test the site using Google Lighthouse, PageSpeed Insights and other tools. The results of these tests can then help to inform how the plugins are configured. For WordPress sites, I find a combination of Async JavaScript, Autoptimize and WP Fastest Cache works really well. You have to be careful that the settings of the latter two plugins do not conflict, though.
How to fix this
- The Ideal Autoptimize Settings With StackPath CDN + Cloudflare Setup Instructions
- How to Defer Parsing of JavaScript in WordPress (4 Methods)
- The Ideal WP Fastest Cache Settings With Cloudflare + StackPath CDN Instructions
51. There’s a 25mb image on the client’s homepage
Sometimes the reason why a website is so slow to load is because the staff working for a company have not been given any guidance on how to optimise images for the web. As a result, non-technical staff end up uploading enormous image files, resulting in the web page taking half a century to download for site visitors.
How to fix this
52. The client has 468 plugins installed on a 6 page WordPress brochure site
Ok, maybe not 468, but I have had some clients approach me saying that their website is a bit slow to load and, upon checking the WordPress admin area, I’ve discovered every plugin under the sun and then some. It can be a bit of a mission trying to figure out which ones can be deactivated and deleted.
How to fix this
53. GZIP compression is not enabled on the server
GZIP compression is enabled server-side and helps to reduce the size of website files. Sometimes I see that clients are using a plugin where they have ticked the GZIP compression box; however, when conducting a performance test on GTMetrix, the results show that this issue still needs to be fixed. This usually involves a discussion with the client’s hosting company to enable it.
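Where the hosting stack is Apache with mod_deflate available, a rough sketch of the server-side configuration looks like this (the exact MIME types worth compressing will vary by site):

```apache
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/plain text/css
  AddOutputFilterByType DEFLATE application/javascript application/json
  AddOutputFilterByType DEFLATE image/svg+xml
</IfModule>
```

On Nginx the equivalent is the `gzip on;` family of directives, and on managed hosting it may simply be a support ticket; re-test afterwards to confirm the responses are actually being compressed.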
How to fix this
- How to Enable GZIP Compression in WordPress
- How to enable Gzip compression in Apache
- Enable gzip compression
54. Too many HTTP requests are slowing the site down
When a visitor loads a web page, excessive volumes of HTTP requests increase the time it takes for the page to load. According to HubSpot, the median number of HTTP requests to load a web page is between 69 and 75. Sites with pages making 100+ requests are quite common, and this is usually a priority fix that I flag in an audit.
How to fix this
55. Site features intrusive interstitials on page load
Strange how this common issue still pops up (see what I did there?) as precisely no one likes intrusive advertising getting in their way of reading the content on a web page. If I see it on a client’s website, I politely suggest finding less irritating ways to capture information about their visitors.
How to fix this
56. CSS and JS files need to be minified
Minifying CSS and JS (removing unnecessary or redundant data from them) can help to save many bytes of data for the end-user, on the page download. This is usually a quick fix that can be implemented straight away for the client.
How to fix this
57. The site is hosted on a very poor quality server
One of the worst examples I encountered on this front was a manufacturing client with a homepage that took around 30 seconds to load! Needless to say, their traffic and bounce rate were diabolical. That said, I also frequently encounter clients who have not chosen the correct hosting solution suited to their business requirements. For clients with larger sites and relatively high levels of traffic, it’s crucial for them to be located on a secure, fast and stable server.
How to fix this
58. Site is targeting international users but a CDN has not been set-up
Sometimes I see clients have configured their international SEO efforts quite well, with correct hreflang markup in place and content tailored to regional audiences and languages. Then you notice that people in certain parts of the world have greater difficulty accessing the site quickly, causing higher bounce rates in those regions. Further checks reveal that a CDN (Content Delivery Network) is not in place and needs setting up.
How to fix this
- CDN for Dummies – A Stupid Easy Guide
- How to Setup CloudFlare Free CDN in WordPress
- Cloudflare and CDN – A Definitive Guide for WordPress
- 5 Best CDN Providers to Speed Up Your Website (In-Depth Comparison)
————————————————————————————————
Website security mistakes
Not adequately securing your website is a costly mistake. I know from personal experience. Some time ago, whilst working full-time for another company, I replaced my website with a temporary mini-site. I forgot to install Wordfence on it, and it got hacked. The hackers injected a load of nonsense spam pages into the blog. Years later, and even after moving domain (albeit using a 301 redirect), there are still some trust issues around my site (you can tell when you are killing it on Bing, but struggling on Google). I probably shouldn’t admit to this, as an SEO consultant, but hey ho. We all make mistakes and I’m here persevering with this blog. Anyway, don’t make the same mistake I made years ago, and avoid these common issues.
59. Site has been hacked and injected with spam pages
As mentioned above, the chief cause of this is a lack of adequate website security.
How to fix this
60. Site is still accessible via HTTP
Despite Google’s big push, in 2018, to get the web moved over to HTTPS, I still frequently encounter sites that are receiving confidential customer information over an insecure connection. This is a significant error for any business operating today and is a priority fix if ever I see it.
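Assuming an SSL certificate is already installed and the server is Apache, a sketch of the usual sitewide fix is a 301 from every HTTP URL to its HTTPS equivalent:

```apache
RewriteEngine On
# Force HTTPS for every request that arrives over plain HTTP
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]
```

Adding an HSTS header on top is a sensible follow-up once the redirect is confirmed to be working.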
How to fix this
- Converting Your Site From HTTP to HTTPS
- How to Properly Move WordPress from HTTP to HTTPS (Beginner’s Guide)
61. There’s an issue with the site’s SSL certificate
Sometimes I see that clients have followed the recommendations and moved to HTTPS, however, the SSL certificate has not been installed correctly leading to “Certificate is not trusted” connection errors.
How to fix this
62. Mixed content issues have not been fixed
This is probably the most common issue that I see in relation to site security. It occurs when a website moves over to HTTPS but other resources within the site (such as video, images, CSS and JS) are loaded using an HTTP connection.
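The tell-tale sign in the page source is an embedded resource still requested over HTTP, for example (placeholder URL):

```html
<!-- Triggers a mixed content warning on an HTTPS page -->
<img src="http://www.example.com/images/hero.jpg" alt="Product photo">

<!-- Fixed: request the asset over HTTPS -->
<img src="https://www.example.com/images/hero.jpg" alt="Product photo">
```

On WordPress this is often a search-and-replace job across the database; the browser console and crawler reports will list the remaining offenders.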
How to fix this
63. Site is not storing customer data safely
A website hack that results in customer data becoming exposed, such as credit card details and email addresses, is not just bad for SEO; it results in significant reputational damage to the organisation in question.
How to fix this
————————————————————————————————
Site architecture mistakes

Image Source: Sitebulb
A carefully considered site structure helps link equity to be distributed well throughout a site, it helps search engines to understand the relationships between pages and, most importantly, it helps users to easily navigate through the content. The biggest mistake many business owners make on this front is to not seek the advice of an SEO consultant prior to the build of the website. Site architecture issues are not a “quick fix” post-build, they are considerations that need to be at the heart of website planning. With this in mind, here are some of the other key problems that I see in relation to the architecture of sites.
64. The navigation menu system is overloaded with links
This is one of the most common issues I see, and it is bad for users and bad for SEO in terms of passing link equity through to internal pages. Frequently I see sites that feature informational pages in the main menu when they should be in the footer. I’ve even seen sites put their Terms & Conditions pages in the top-line menu (even as a drop-down this is unnecessary). It’s important to ensure that the main menu system only features the top-level links that are most essential to a user beginning their journey through the site (secondary, sidebar and internal linking can then help users to find deeper content relating to their interests).
How to fix this
- How to Structure Your Site’s Navigation for SEO and UX
- Website Navigation: Tips, Examples and Best Practices
65. A secondary navigation menu system appears more prominently in the site code
I often see this issue with eCommerce sites, where a “secondary” navigation appears above the primary navigation and contains links to “My Account”, “Login” and “Signup”. The mistake here is not the secondary menu itself but the fact the links are usually followable by search engines, meaning these pages benefit from the link equity flowing into the site. I have often seen sites where the “login” page is the strongest in terms of internal links and authority. This is a quick fix, so always high on the list to remedy straight away.
How to fix this
66. The menu system is not user friendly
Over the years I’ve seen some pretty terrible menu systems featuring, among other things: different menu bars for every web page; tons of unimportant links on the top line or in drop-downs; poorly designed interfaces with hard to read links; a user-baffling array of sub-menu options and menu links that unexpectedly take you out to a sub-domain with a different layout or design.
How to fix this
67. Website content is not arranged in a logical, hierarchical and pyramid style
How content is organised on a website can greatly impact its search visibility. The core themes relating to a site need to be clustered, through the creation of content hubs. This can be achieved by establishing a clear hierarchy to site content, with an overall theme at the top and sub-category themes logically ordered under it. A common issue I see with a lot of sites is disorganised content, poor vertical and horizontal linking, and generally a poor relationship between pages of a similar theme.
How to fix this
- SEO Siloing: How to build a website silo architecture
- Google’s John Mueller on Best Site Structure
- Site structure: the ultimate guide
68. URL structures do not have correct parent and child relationships
This is a contentious one, as I have seen sites that are successful on Google despite placing a lot of their content in the root folder. However, I would argue that for most sites it is important to establish a clear hierarchy through the folder structures. So, for example, your top-level page about SEO should be located at example.com/seo/. Your guide to site architecture should then be located at example.com/seo/site-architecture/. Your specific guide to navigation menu systems at example.com/seo/site-architecture/navigation/, and so on. Many sites that I audit place the majority of their content in the root and feature poor internal linking, making it harder for search engines to understand the relationships between pages.
How to fix this
69. URL structures employ capitalisation, underscores or non-ASCII characters
Another good reason for employing an SEO consultant at the site build phase is to ensure that common SEO-unfriendly web development technologies are not employed from the outset. One of the most common technical issues I see with sites is the use of dynamic URLs or coding technologies that output URLs with unusual characters or structural elements.
How to fix this
70. URL structures are overly long, contain too many sub-folders or repetitive keywords
A common mistake I see in the structure of URLs is excessive repetition of the target key-phrase. For example, myseosite.com/seo/seo-agency/seo-company-london/. Overlong, spammy looking structures like this, with many instances of the target phrase, look terrible in the search results and are unlikely to contribute to trust signals around a site.
How to fix this
71. Breadcrumb functionality has not been set-up or coded properly
Not being able to see where you are in the hierarchy of a website can be very confusing when you land on a specific web page via a search engine. Breadcrumbs are therefore an essential must-have for UX, but they also help search engines to understand the relationships between pages and they facilitate more efficient crawling. Sites without breadcrumbs are very common and, for some clients, they are a quick win to improve internal linking between pages.
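Alongside the visible breadcrumb trail, marking it up helps search engines understand the hierarchy; a minimal JSON-LD sketch (names and URLs are placeholders) looks like this:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "SEO", "item": "https://www.example.com/seo/" },
    { "@type": "ListItem", "position": 2, "name": "Site architecture", "item": "https://www.example.com/seo/site-architecture/" }
  ]
}
</script>
```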
How to fix this
- How To Add Breadcrumbs To Your WordPress Website
- How To Create a Breadcrumb Navigation
- Breadcrumb Navigation: Explanation & Examples
72. Internal linking is inconsistent
Sometimes, with multiple stakeholders working on a website, confusion arises regarding the focus of internal linking to key pieces of content. As a result, different team members link to different resources using the same or similar anchor text. Large scale inconsistency in internal linking can result in diminished performance for the focus web page. It’s also important to ensure that links on any page are matched well to the user’s needs and requirements. Frequently, I audit sites where the client prominently links to a certain landing page they are trying to push, even though it bears no direct relation to the linked from page.
How to fix this
- Internal Link Building: How To Optimise A Website Using Internal Links
- Using Strong Internal Links for SEO
- Optimize Your Internal Linking with Site Audit
73. Internal linking is overloaded towards unimportant pages
This is quite a common issue that crops up at the audit stage. Analysis of internal linking using Screaming Frog frequently reveals that pages which are not important from a ranking perspective (Terms & Conditions and Login pages, for example) are among the most frequently linked to resources within the site.
How to fix this
74. Internal linking is absent within the body text areas of pages
This is an issue which frequently crops up when assessing the client’s web copy. Sometimes the copy will refer to a specific resource but not provide a link to it. Other times, the copy will go into detail about the specifics of a product or service but not provide any links to supporting content relating to it.
How to fix this
75. Sidebar and/or footer links are spammy
From time to time I still see websites where the site owner (or more likely their poorly qualified “SEO company”) has inserted 100+ keyword heavy links in the footer for every town and city that they serve. [Keyword] [Keyword] London, [Keyword] [Keyword] Birmingham, [Keyword] [Keyword] Manchester. And so on. Sitewide boilerplate links like these are not helpful for users and they are certainly not good for SEO.
How to fix this
————————————————————————————————
Structured data mistakes
It’s thought that when Google switched to mobile-first indexing they also moved over to indexing based on entities. I won’t go into an explanation of entities here (take a look at Kevin Indig’s post on semantic content optimization for that), I’ll just say that they are the basis upon which Google builds the Knowledge Graph. As such, the use of structured data within a website’s code has taken on an increasing level of importance in the wake of this dramatic shift. It’s also an area where mistakes are frequently made or where structured data has been omitted altogether. Some of the most common mistakes include the following.
76. Site has not marked up any important business information
There are a multitude of schemas that can help Google (and other search engines) to better understand the content on a website and to serve it to end-users of the search engine. Elements of a site that can be marked up include: general business information using LocalBusiness schema; reviews; FAQs; events; people and many more besides. However, very frequently I see that many sites have not marked up any elements at all.
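As a starting point, a hedged LocalBusiness example covering the basic business details might look like the following (all values are placeholders):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Plumbing Ltd",
  "url": "https://www.example.com/",
  "telephone": "+44 20 0000 0000",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "1 Example Street",
    "addressLocality": "London",
    "postalCode": "EC1A 1AA",
    "addressCountry": "GB"
  },
  "openingHours": "Mo-Fr 09:00-17:30"
}
</script>
```

Where a more specific sub-type exists (Plumber, Dentist, Restaurant and so on), it is usually preferable to the generic LocalBusiness type.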
How to fix this
77. Use of Google’s structured data testing tools returns warnings and errors
You can’t carry out a site audit without this essential testing tool. Used in conjunction with the structured data testing capabilities of your favourite crawler tool (Screaming Frog provide a useful guide here), you can unearth all of the issues relating to every page across a client’s site.
How to fix this
78. Incorrect use of aggregateRating schema
This is a very common issue I see with sites, where markup has been applied to highlight reviews but it has been implemented in a way which conflicts with Google’s guidelines. One of the most common mistakes is the application of markup for reviews which do not actually appear on the page. Also, Google recently announced a change to the use of “self-serving” reviews within LocalBusiness and Organization markup.
How to fix this
79. The client is firing mark-up sitewide when it should only appear on certain pages
Another common mistake, often relating to LocalBusiness or Organization markup. The markup should only be deployed on the most relevant pages. For Organization schema, this is most likely to be the homepage.
How to fix this
80. The structured data is firing via GTM but not showing in the testing tool
More often than not this is down to the fact that the markup is not wrapped in Google Tag Manager friendly code. It may also be caused by validation errors in the code.
How to fix this
————————————————————————————————
Consistency and compliance issues
Consistency in how a business is cited across the web is important for local SEO, but it can also have wider ramifications in terms of how a search engine might perceive the authoritativeness of a particular web property. In addition, businesses operating in certain verticals need to demonstrate their legal and regulatory compliance to the laws that oversee their industries. There are a few key mistakes that business owners make on this front, including the following.
81. Inconsistent business details cited across the web
This issue often occurs when businesses change key details about their set-up, such as their brand name, business address or addresses, web address, opening hours or any other information that helps to clarify their existence and operational activities.
How to fix this
82. The site is not GDPR-compliant
It’s not known if Google, and other search engines, check for signs of GDPR compliance or if evidence of compliance is a ranking factor. Google themselves were hit with a £44m GDPR fine, so if they have taken steps to police compliance on other domains around the web, they haven’t yet brought their own house fully in order. That said, it’s thought that Google does look for a multitude of trust signals around a site, and the presence of GDPR compliant Terms & Conditions or Cookie policies, for example, may play some sort of indirect role in how well a website ranks. Pure speculation, obviously, but it still makes sense to have this base covered. Many sites that I audit in the UK still reference the superseded Data Protection Act, so this is one that I recommend they fix asap.
How to fix this
83. YMYL sites not demonstrating compliance with the law
Google rolled out numerous core updates in 2019 which appeared to affect websites in certain verticals more than others. Within the SEO industry, and within Google’s Search Quality Evaluator Guidelines, these sites are categorised as YMYL (Your Money Your Life). A YMYL website can be defined as any site which provides information that could have a dramatically negative impact on the end user’s well-being if that information is incorrect or misleading. As such, any websites pertaining to health, finance, law, science, or other disciplines that are held to high standards, may be classified by Google as YMYL. I’ve audited numerous sites in these sectors and many of them make mistakes that probably impact on how Google perceives their expertise, authoritativeness and trustworthiness (E-A-T). Common mistakes include: linking out to dubious sources; not linking out to authoritative sources; not creating distinct author profiles for key people within the business; not marking up key business information such as awards and qualifications.
How to fix this
————————————————————————————————
On-page SEO mistakes
On-page SEO continues to play a vital role in how well a web page ranks, yet sites that I audit consistently make mistakes regarding basic on-page optimisation. Some of the most common ones include the following.
84. Title tags & meta descriptions are not optimised (or missing altogether)
I frequently see WordPress sites where Yoast has been installed but the default settings are active, and no custom tags have been written. As a result, the site features the page name and brand name in the title tag but nothing else, and the meta description is missing altogether. As the page name is often the H1 tag, this means duplication of the exact same key-phrase across both tags. Title tags need to be unique from the H1 and every page should have a meta description featuring a strong call to action message.
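For clarity, the two tags in question sit in the head of each page; an illustrative (not prescriptive) example where both have been written deliberately rather than left to defaults:

```html
<title>Emergency Plumber in London | Example Plumbing</title>
<meta name="description" content="24/7 emergency plumbers covering all London postcodes. No call-out fee. Get a free quote in minutes.">
```

The visible H1 on the same page can then carry a different but related phrase, avoiding the duplication described above.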
How to fix this
85. The site logo is contained in an H1 tag
This is a very common mistake that I see with a lot of sites. It seems to be an issue that is prevalent with certain types of WordPress themes (i.e. badly coded ones). The developer of the theme has wrapped the site logo in an H1 tag, so every single H1 across the entire site shows up as the name of the logo when carrying out a site crawl. Every page on a site should have a unique H1 that’s visible on the page, so this is an important error to fix.
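A simplified before-and-after sketch of the theme-level fix (class names and paths are illustrative):

```html
<!-- Problem: the logo is the H1 on every page of the site -->
<h1 class="site-logo"><img src="/images/logo.png" alt="Example Brand"></h1>

<!-- Better: the logo is a plain link, and each page carries its own descriptive H1 -->
<a class="site-logo" href="/"><img src="/images/logo.png" alt="Example Brand"></a>
<h1>Emergency Plumbing Services in London</h1>
```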
How to fix this
86. There are multiple H1 tags on pages across the site
Google’s John Mueller recently said “our systems don’t have a problem when it comes to multiple H1 headings on a page” (source). However, I personally think it is still bad practice and contravenes the basics of semantic HTML. If a client is planning on redeveloping their site, I’d encourage them to ensure that just the one H1 tag is in use for each web page, followed by H2s, H3s etc. for sub-headings.
How to fix this
87. The site features poorly structured web pages
Over the years I have seen many varieties of poorly structured web pages, including: pages with enormous amounts of white space; pages overloaded with huge amounts of unbroken text; pages with ridiculously over-sized sidebar areas; pages that are overloaded with information and links making it difficult to know where to click; pages without any CTA areas making it difficult to know what action you should take. An intuitive, well-designed layout to web pages is an important element of good on-page SEO.
How to fix this
88. The site does not link out to authority sources
Linking out to other websites is the basis of the web itself. There’s no “web” without links. And without links, Googlebot and other crawlers can’t find fresh content. Yet many clients that I speak to seem to have an innate fear of linking out, often believing that it may be harmful to do so, or that they should add rel=nofollow to all outbound links. Various studies have shown that linking to other sites has positive SEO impact for the linking site, so I always encourage clients to link more freely (whilst avoiding linking to low-quality resources).
How to fix this
89. Pages feature an excessive volume of internal and/or external links
Yes, I realise this very post features hundreds of links. For in-depth blogs, I think it is ok to feature lots of outbound links (if they are useful and if they are on-topic). However, I sometimes see clients with enormous sitewide blogrolls featuring 100+ links to all sorts of random places (link exchanges with web directories among the worst!) In these situations, it’s definitely a point to be made in the audit, to suggest a tidy up.
How to fix this
90. Images have not been optimised correctly
Earlier I mentioned the frequently encountered issue of non-technical staff uploading very heavy image files. In addition to this, there are several other common mistakes people make when optimising images for the web, including: applying spammy keyword stuffing in the alt tag instead of just describing the image; choosing inappropriate file formats; and not using lazy load functionality to serve images only when they are viewed by the end-user.
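Pulling those points together, a hedged example of a well-optimised image tag (file name, dimensions and alt text are invented for illustration):

```html
<img
  src="/images/walnut-dining-table-800w.jpg"
  alt="Six-seat walnut dining table in a bright kitchen"
  width="800" height="533"
  loading="lazy">
```

A descriptive alt text, sensible compression and native lazy loading cover most of the common failings; browsers that don’t support `loading="lazy"` simply ignore the attribute.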
How to fix this
91. The site features a lot of content behind tabs for desktop users
Tabs are fine for mobile users, but making people on desktop devices click endlessly in order to read the content in full is not a good user experience. For years, Google stated that content behind tabs would not be given full weight, in contrast to fully visible text. This all changed with the advent of mobile-first indexing. However, it is still important to do what is right for the end-user of the site. Excessive tabs for desktop users is still one to avoid.
How to fix this
92. The site features a high volume of doorway pages
I’ve seen all sorts of variations of this type of content, over the years; the most nefarious implementation being large scale duplication of text, where the target key-phrase is the only unique aspect of each iteration. Sites that have engaged in this practice may face years of ranking oblivion so, in some cases, it may be in the best interests of the site owner to simply start afresh on a new domain. However, smaller scale doorway page issues can often be cleaned up without too much fuss, so this will be a priority recommendation in the audit.
How to fix this
93. Web pages feature forced keyword insertion
Just the other day I found myself in several rounds of communication with a client obsessing over the correct “keyword density” for the new web copy being written for a large industrial manufacturer. As the site has many hundreds of pages, they sent over a template featuring standard text with [keyword] insertions in the H1s, H2s, body text and CTAs. I pointed out that A) this is large-scale doorway page creation (see above) and B) it’s no longer necessary to keep repeating keywords throughout a text document like this (particularly following the roll-out of Google BERT, which dramatically improved Google’s natural language processing capabilities). “Write naturally, for the user” is essentially one of the most common recommendations that springs up in my audits.
How to fix this
94. Site content is not properly attributed to the author
I touched on this earlier but it’s worth pointing out again as it’s important. The authority of a piece of content can be improved dramatically if it is correctly attributed to its author. For many clients that I work with, this usually involves ensuring that they have dedicated team member pages that are marked up with Person schema. Blog content written by that team member should be marked up with Article schema and linked back to their profile page.
How to fix this
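To illustrate, here’s a minimal sketch of the JSON-LD I’d typically suggest: an Article whose author is a Person pointing back to their team profile page. All names and URLs are placeholders, and the exact properties will vary from site to site.

```python
# Build a schema.org Article whose author is a Person linked to their
# profile page. All names and URLs below are placeholders.
import json

article = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "How widget tolerances affect assembly costs",
    "datePublished": "2020-01-15",
    "author": {
        "@type": "Person",
        "name": "Jane Smith",
        "url": "https://www.example.com/team/jane-smith/",  # profile page with Person schema
    },
    "publisher": {"@type": "Organization", "name": "Example Ltd"},
}

print('<script type="application/ld+json">')
print(json.dumps(article, indent=2))
print("</script>")
```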
————————————————————————————————
Website accessibility & W3C Standards issues
Ryan Robinson, writing for Forbes, says “web accessibility and ensuring your content can be consumed by all people, is one of the most important issues facing online businesses today”. Yet many businesses overlook this important element of website optimisation. The most common issues I encounter include the following.
95. There are many validation errors in the site’s code when testing it with the Markup Validation Service
It might not be a ranking factor to have a 100% W3C-compliant site; however, hundreds of validation errors hardly paint a picture of quality code. If these errors cause any issue in terms of search engines being able to index the page, or if they inhibit users with disabilities from being able to understand it, then steps should be taken to find a fix.
How to fix this
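If you want to keep an eye on this over time, here’s a small sketch that runs a page through the W3C Nu HTML Checker’s JSON output and counts errors versus warnings (assuming the public validator.w3.org/nu endpoint and the requests package; the page URL is a placeholder).

```python
# Send a page to the W3C Nu HTML Checker and summarise the results.
# Assumes the public validator.w3.org/nu JSON endpoint; URL is a placeholder.
import requests

page = "https://www.example.com/"
resp = requests.get(
    "https://validator.w3.org/nu/",
    params={"doc": page, "out": "json"},
    headers={"User-Agent": "site-audit-script"},
    timeout=30,
)
messages = resp.json().get("messages", [])

errors = [m for m in messages if m.get("type") == "error"]
warnings = [m for m in messages if m.get("type") != "error"]
print(f"{len(errors)} errors, {len(warnings)} warnings/info messages")
for m in errors[:10]:
    print("-", m.get("message"))
```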
96. Site is not very user-friendly for people with impairments
Common issues on this front include: poor text colour contrast; missing or misleading alt text; inaccessible web forms; and non-inclusive colour schemes. Fixing these issues means better overall engagement metrics and better long-term SEO performance.
How to fix this
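Colour contrast, at least, is easy to check programmatically. Here’s a minimal sketch of the WCAG 2.x contrast-ratio calculation, which you can use to test text/background colour pairs against the 4.5:1 threshold for normal-sized text (the hex values are just examples).

```python
# WCAG 2.x contrast ratio: relative luminance of each colour, then
# (L1 + 0.05) / (L2 + 0.05) with the lighter colour on top.
def relative_luminance(hex_colour: str) -> float:
    rgb = [int(hex_colour.lstrip("#")[i:i + 2], 16) / 255 for i in (0, 2, 4)]
    linear = [c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4 for c in rgb]
    return 0.2126 * linear[0] + 0.7152 * linear[1] + 0.0722 * linear[2]

def contrast_ratio(fg: str, bg: str) -> float:
    l1, l2 = sorted([relative_luminance(fg), relative_luminance(bg)], reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

ratio = contrast_ratio("#767676", "#ffffff")
print(f"{ratio:.2f}:1 -> {'pass' if ratio >= 4.5 else 'fail'} (WCAG AA, normal text)")
```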
97. Contact information is hard to find
It’s surprising how many sites I have audited where I’ve had trouble finding the contact details for the company. In some instances this is deliberate, perhaps because the organisation does not want to field high volumes of phone calls. Frequently though, I’ve worked with clients who have mistakenly made it difficult for people to get in touch. This might be because the phone number and email address sit in the footer, with poor colour contrast obscuring their location. Or it might be because the ‘contact’ page is not prominently linked to, either from the menu system or from within the key landing pages.
How to fix this
————————————————————————————————
User Experience and CRO issues
On-page optimisation and user experience testing are areas of SEO that overlap greatly. The main distinction lies in how directly each affects rankings. There are on-page improvements that can lead to an almost immediate boost in ranking position (tweaking title tags, for example) or an immediate loss (sticking a huge block of keyword over-optimised text on a critical landing page). User experience improvements, on the other hand, usually play a more indirect, subtle role in how a web page performs on search engine results pages. Gradually fine-tuning a page so that it converts more of its users is the end-game of both SEO and CRO. When Google released RankBrain back in 2015, their intention was to put the user first by delivering results based on machine learning. This has ushered in an age of focus on behavioural metrics (bounce rate, pages per session etc.), where it’s not just about winning the click but keeping users on the site and getting them to take action. This is also an area where a large percentage of the websites that I audit make fundamental errors or omissions. Most commonly, these include the following.
98. Call to action messaging is unclear
This is one of the most persistent issues that I see when carrying out the CRO section of an audit. Sometimes the issue is connected to an overdose of CTA messaging, where it is difficult to know which one of the half a dozen actions the page is asking you to take. Other times it is down to the fact that the CTA is not prominent enough, or occasionally missing altogether.
How to fix this
- 31 Call-to-Action Examples You Can’t Help But Click
- How to Create Compelling Calls to Action
- 16 Call To Action examples to inspire your marketing messages
99. The site lacks quick contact forms or easy contact options
With increasing numbers of people using mobile devices to search for local services and book online, it has become vital to make it as easy as possible for people to get in touch whilst they are on the move. People do not want to spend several minutes trying to figure out where the contact page is buried within the burger menu. Quick contact options available on the landing page make it as easy as possible for users to get in touch. These options include: quick forms (just a name and phone number required); live chat; a clickable phone number (so many sites that I audit forget to enable this for mobile users); and information downloads (where just an email address is required to access the info).
How to fix this
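The clickable phone number, in particular, is trivial to check for. Here’s a quick sketch (assuming requests and beautifulsoup4; the URL is a placeholder) that looks for tel: links on a landing page.

```python
# Check whether a landing page exposes a clickable (tel:) phone number.
# Assumes requests + beautifulsoup4; the URL is a placeholder.
import requests
from bs4 import BeautifulSoup

url = "https://www.example.com/contact/"
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

tel_links = [a["href"] for a in soup.find_all("a", href=True) if a["href"].startswith("tel:")]
if tel_links:
    print("Clickable phone numbers found:", tel_links)
else:
    print("No tel: links found - phone numbers may not be tappable on mobile")
```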
100. The quality of the site content is average or below par
Improving the quality of a web page is (or should be) at the heart of UX testing. This doesn’t mean adding another 500 words of keyword-optimised text (I’ve mentioned this already, but it’s worth repeating); in fact, it may involve removing text. It may involve completely rewriting the copy so that it matches the intent of the search engine user and the needs implied by their search activity. It may also involve many other modifications to the layout, formatting and delivery of the content. I’ve worked with many clients in the past where the number one priority has been to improve the quality of their web pages and to get an A/B testing platform up and running.
How to fix this
101. The sales funnel is long and convoluted with no one-page checkout
This is probably the most critical error that I see with eCommerce sites and is always number one on my list of things to fix for these sorts of clients. It is undoubtedly one of the biggest stumbling blocks preventing the client from increasing their revenue online.
How to fix this
102. The site features excessive advertising and/or text ad links
Heavy advertising above the fold, forced ad placements, pop-ups and interstitials drive users mad (myself included), and they can have a very damaging impact on organic search performance. Sometimes the advertising is completely internal, pushing users to take an action within the website, but it is all equally damaging and something I’ll usually highlight in my audit if I encounter it.
How to fix this
————————————————————————————————
International SEO mistakes
Optimising a site for international visitors is one of the most difficult areas of SEO, and it’s also an area where site owners frequently get it wrong. Usually the issues reside in misconfigurations of hreflang tags (the code used to tell search engines which version of a page should be served to which regional user). That said, I have also seen some very bizarre attempts to optimise for international search (including one where the client had gone to enormous effort to create international versions of his pages, but they were all written in the English language). Other common issues that I see include the following.
103. Foreign language translations are available without changing URL
I’ve worked with some clients in the past who have set-up the functionality on their sites so that the page can be translated, without navigating away from it. They have then wondered why this has failed to generate more organic traffic from the countries they have offered translations to.
How to fix this
104. Foreign language translations are on separate URLs but with no hreflang
Sometimes clients have taken the correct steps to set-up international URL structures (for example /en-gb/ for the UK) but not included the hreflang tags within the code.
How to fix this
105. Foreign language translations are on separate URLs but with hreflang errors
Creating international content at scale is a fail-safe way to guarantee errors in the set-up of hreflang (it can even go wrong when using dynamic coding techniques). Regular auditing of hreflang, and fixing the issues found, is a common recommendation that I make in audits for clients with an international audience.
How to fix this
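A basic reciprocity check is a good place to start. Here’s a rough sketch (assuming requests and beautifulsoup4; the URLs are placeholders) that collects the hreflang annotations from each page and flags any reference that isn’t returned by the target page.

```python
# Collect hreflang annotations from each URL and flag missing return tags
# (page A references B, but B does not reference A back).
# Assumes requests + beautifulsoup4; URLs are placeholders.
import requests
from bs4 import BeautifulSoup

PAGES = [
    "https://www.example.com/en-gb/",
    "https://www.example.com/en-us/",
    "https://www.example.com/de-de/",
]

def hreflang_map(url):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    result = {}
    for link in soup.find_all("link", href=True):
        rel = link.get("rel") or []
        if "alternate" in rel and link.get("hreflang"):
            result[link["hreflang"]] = link["href"]
    return result

annotations = {url: hreflang_map(url) for url in PAGES}

for url, targets in annotations.items():
    for lang, target in targets.items():
        if target in annotations and url not in annotations[target].values():
            print(f"{url} -> {target} ({lang}) has no return hreflang tag")
```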
106. The site uses forced redirects to a country/language-specific landing page based on GEO IP
This is bad for user experience as it denies the end-user the option to choose which country or language they prefer. The implementation of IP redirects also has a significant negative effect on how well the site ranks in those target countries.
How to fix this
- Google: Don’t Redirect By IP Location
- SEO Impact of GEO IP Redirects – Don’t Do Them (For Your Own Sake)
————————————————————————————————
Local SEO mistakes
Achieving prominence in the local pack is no mean feat, with its results generated by a completely different algorithm to that which dictates the traditional organic listings. Often mistakes are made by business owners themselves, who are unaware of the implications of using a certain type of phone number, for example. Other times it is a lack of consistency on the part of the SEO agency managing their overall digital activity. The most common mistakes include the following.
107. Inconsistent NAP details across the web
This most commonly occurs when a business changes name or moves address. If changes in business details are not managed properly (*cough* the SEO company), it can diminish the trust Google might have in presenting that business to local users.
How to fix this
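Here’s a very simple sketch of the sort of consistency check I mean: normalise the name, address and phone number from each citation and flag anything that doesn’t match the canonical details (all of the values below are placeholders you’d swap for data pulled from the client’s actual citations).

```python
# Compare NAP details from citation sources against the canonical details.
# All business details below are placeholders.
import re

def normalise(value: str) -> str:
    """Lower-case and strip punctuation/whitespace before comparing."""
    return re.sub(r"[^a-z0-9]", "", value.lower())

canonical = {
    "name": "Example Ltd",
    "address": "1 High Street, Anytown, AB1 2CD",
    "phone": "01234 567890",
}

citations = {
    "Google Business Profile": {"name": "Example Ltd", "address": "1 High St, Anytown AB1 2CD", "phone": "01234 567890"},
    "Yell": {"name": "Example Limited", "address": "1 High Street, Anytown, AB1 2CD", "phone": "0800 123456"},
}

for source, details in citations.items():
    for field, value in details.items():
        if normalise(value) != normalise(canonical[field]):
            print(f"{source}: {field} differs -> '{value}' vs '{canonical[field]}'")
```

Note that a simple comparison like this will also flag harmless abbreviations (St vs Street), which is arguably useful, as consistency is the whole point.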
108. The client is using P.O Box / Virtual Office addresses
This is usually a (fairly) innocent mistake made directly by the client, who serves customers in a certain territory but does not have a physical space there. So, they set up a virtual office address, enabling them to receive the Google verification card needed to prove that they have a base at that location. The problem is, Google is aware of many P.O. Box and virtual office locations. Let’s face it, if your business cites an address which brings up a virtual office provider, chances are Google will be able to make that connection. This is often the key reason why many clients fail to rank in the local pack for the cities where they have set up virtual addresses.
How to fix this
109. Client is using 0800 or premium numbers instead of local
This is a frequent error that I spot when conducting local SEO audits. Often the client uses the same 0800 number across a broad range of their GMB listings, denying each of their business locations a unique NAP profile.
How to fix this
110. GMB profiles are not interlinked to the website landing pages and vice versa
A clear connection between the GMB profile for a specific town or city, and the corresponding location page on the client’s website, helps users to find additional information that may be useful to them.
How to fix this
111. Location page on the site doesn’t add value
The location page on the client’s website provides a good opportunity to rank in the traditional organic search results for region-specific search phrases. Yet frequently I see location pages that are either over-optimised (featuring keyword-heavy chunks of text) or not useful for the end-user. Ways to make location pages more useful include: featuring a map and detailed instructions on how to reach the site; including profile pictures and bios of team members who work at the location; outlining location-specific information – what clients and customers are able to do at the site; including open hours, contact details and any other information that may help the customer.
How to fix this
————————————————————————————————
Off-page SEO issues
A meticulous site audit is not complete without an analysis of the client’s backlink profile and other offline activities. Having an extensive, and authoritative, backlink profile is still a pivotal factor in helping a site to rank well (great content & links is what SEO essentially boils down to). Yet many sites that I audit struggle with this critical part of the process. Common issues that I encounter include the following.
112. The domain has few or no backlinks at all
This issue most often affects freelancers, start-ups and smaller businesses, particularly those without a sizable budget for PR and marketing activity. It can feel like a real stumbling block for business owners that are just starting out. However, with a little bit of imagination and ingenuity, SMEs can still build authority in their niche and attract natural inbound links from relevant sources.
How to fix this
113. The domain has a history of low-quality backlinks
The “mistake” often made here is by the client trusting their SEO company to deliver ethical services. Many clients are aware of the need for backlinks but sometimes lack insight into the acceptable methods for trying to secure them. It’s understandable for business owners to want to hand this over to someone else to manage; they have a business to run after all. Yet this can be a costly mistake if the SEO provider is not grilled deeply enough on their tactics and practices. It can be exceptionally hard to get a domain working well again on Google, following particularly egregious link building work, but there are steps that can be taken to recover.
How to fix this
- Does Disavowing Links Work in 2019?
- Google’s “Disavow Links Tool”: The Complete Guide
- Spam Backlinks: Ultimate Guide to Finding & Fixing Toxic Links for Improved SEO
114. The client has an EMD and excessive exact match anchor text
I still regularly speak to clients who have the perception that possessing an EMD (exact match domain) is inherently valuable no matter what you do with it. Unfortunately, this mindset often results in the client accidentally burning the domain because not enough attention is paid to the link building strategy. In this scenario, the client sets up a fairly good quality site, with above-average content, but then pays an agency to build links to it. The agency goes and builds a ton of guest post and directory links, all over a fairly short period of time, all using the same or similar exact match anchor text (matching the name of the domain). For a while, the site performs quite well before disappearing into oblivion. With so many artificial-looking links occurring over such a short period, this is not surprising.
How to fix this
- Exact Match Domains, Backlinks and Google
- How Google Might Turn Down The Knob On Exact Match Domains
115. The client is in a competitive niche and doesn’t have any tier 1 media backlinks
Not having tier 1 backlinks isn’t a mistake, as such. After all, who wouldn’t want links from top-rated domains if they were readily available? The mistake is more often connected to the fact that the client wants to rank in a very competitive niche but assumes this is possible without backlinks from top tier sites. When the client’s competitors all have links from these types of sites, it’s important to make it clear in the audit that they will need a strategy for developing a similar link profile.
How to fix this
116. The client’s backlink profile is made up entirely of directory submissions or guest posts
In certain niches, it’s possible to rank quite well using old-school manual link building techniques such as directories and guest posts. However, most of the time it is highly inadvisable to rely on these techniques. In addition, a lot of SEO agencies do guest posting very badly (a 500-word article, with just the one keyword link through to the client’s website, is not going to be difficult for Google to spot and discount). Directories can be equally troublesome, as many of the general-themed ones are very low quality. For clients on a low budget, I usually recommend diversifying their link profile by getting involved in real-world activities: sponsoring local events; giving a talk at a local college or university; signing up with the local Chamber of Commerce; helping local journalists with data for a story they are researching; or taking part in fundraising/charitable activities. These are all methods for gaining authoritative, natural links back to a website.
How to fix this
117. The client has a low level of brand searches and mentions
Google trusts brands, which is part of the reason why they often dominate certain verticals in search. A common issue I find when assessing some clients’ sites is a lack of search volume around their company name when looking in GSC or Google Trends. When this issue arises, it’s one to flag in the audit, with suggestions for how the client can grow their brand mentions online (developing their social media profiles, securing more reviews and answering Q&As, for example).
How to fix this
118. The client isn’t using social media effectively
I’ve worked with a lot of social media phobic clients over the years and I can fully understand their reticence about getting involved with it. For smaller businesses, it is a time drain that can also potentially open a can of worms (multiple channels where complaints can be posted, a backlash to what the client thought was an inoffensive tweet or simply the time and effort required to deliver regular content). Among the most common issues I see are prominent links from the website through to social media profiles that have not been updated for months or even years. This does not look good to end-users and doesn’t help to generate mentions of the company online. I usually recommend that the client finds a way to embrace social media, even if it means paying a savvy sixth form student some part-time hours to manage it for them.
How to fix this
————————————————————————————————
That concludes my long list of common mistakes that I see when auditing websites. Congrats if you made it this far! I hope this helps business owners, or people who are new to SEO, to identify issues that are prevalent with websites. If you are a more advanced SEO and you’ve noticed a common mistake that I have omitted, feel free to get in touch to tell me about it or leave a comment below. I might set-up a bonus round of links if the suggestions are interesting enough.