Organic search is, unsurprisingly, a powerful channel that brands look to harness in pursuit of their business objectives, yet for many organisations, hidden issues and flaws in their approach to search hold them back.
The way in which search engines have continually refined their algorithms in recent years, particularly with regard to content and relevancy, is well documented. Search engines depend on their ability to serve the most relevant and useful results for a query as effectively as possible, and that in turn relies on brands putting themselves in the best position to answer that query.
But underlying technical issues, cultural challenges and a lack of agility regularly prevent organisations from doing just that. Search engines change at a much faster pace than most enterprise-level organisations, and that gap is what often proves costly.
Are your foundations flawed?
There are two key warning signs that your SEO foundations may be flawed.
The first is your web traffic. If you’re receiving less traffic than you could reasonably expect, if your traffic seems heavily reliant on brand terms, or if you notice an unseasonal drop in traffic, this should trigger alarm bells.
The second obvious warning sign is your search rankings themselves. Whilst no brand or page has a ‘divine right’ to rank, persistently low rankings in product areas that your brand should really be ranking prominently for (considering your brand strength, reputation and the relevancy of the product you offer) should cause enough concern to trigger some investigation. If your rankings for brand terms in particular are poor, this indicates that your site isn’t being crawled or indexed properly by search engines.
Finding the hidden problems
If you find yourself in that position, the obvious question to tackle is “where do I start?” Getting under the bonnet of your site can be a daunting prospect (for all manner of reasons), but there are some key starting points that could lead to some ‘quick win’ results.
To some this may seem like an obvious place to start when looking into how your site is performing, but if your website doesn’t appear in search engines, organic traffic simply cannot reach it. This is a simple thing to test: typing site:yoursite.com into Google will show how many pages of your site are indexed, allowing you to make a rough judgement on whether that figure is in line with your expectations.
This also allows you to check whether old versions of your site are indexed – such as development properties or old pages that should have been migrated.
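The site: query tells you how much is indexed; for individual pages, one signal worth checking is the robots meta tag. Below is a minimal sketch using only Python’s standard library that flags a page as non-indexable if its HTML carries a noindex directive. It is illustrative only: a fuller audit would also check robots.txt rules and X-Robots-Tag response headers.

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the content of any <meta name="robots"> tags on a page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.directives.append(attrs.get("content", "").lower())

def is_indexable(html):
    """Return False if the page markup carries a noindex directive."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return not any("noindex" in d for d in parser.directives)
```

Run against the HTML of any page you expect to rank, this gives a quick first answer to "could this page be indexed at all?"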
Checking which pages are indexed is an important step, as it allows you to ensure that you don’t have URLs which could be causing you issues. Some of these include:
- Developmental subdomains: if work has been done on your site in a development environment, it may not have been properly ‘no-indexed’. This can result in pages that shouldn’t be accessible being served to users and crawled by search engines. These pages may also contain duplicate content, which can have a negative impact on the content that should be ranking in the live environment.
- Query strings: Another source of duplicate content is indexation of query strings. This can lead to the wrong pages ranking for certain keyword terms.
- Obscure website sections: Old obscure areas of a website might be being indexed, which can often cause issues with duplication or users accessing outdated information.
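The URL patterns above can be triaged programmatically from an export of your indexed URLs. This is a rough, illustrative sketch – the DEV_MARKERS prefixes are hypothetical and should be adjusted to your own hosting conventions – that sorts URLs into development hosts, query-string variants and everything else:

```python
from urllib.parse import urlparse

# Hypothetical hostname prefixes; adjust for your own environment naming.
DEV_MARKERS = ("dev.", "staging.", "test.")

def classify(url):
    """Rough triage of an indexed URL: 'dev', 'query-string', or 'ok'."""
    parts = urlparse(url)
    if parts.hostname and parts.hostname.startswith(DEV_MARKERS):
        return "dev"
    if parts.query:
        return "query-string"
    return "ok"
```

Anything flagged 'dev' or 'query-string' is a candidate for noindexing, canonicalisation or removal rather than automatic deletion – review before acting.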
It is worth noting that the order in which your indexed pages are returned when doing a site search means nothing in terms of the website’s performance.
Previous poorly managed migrations
It is best practice to identify where to redirect URLs whenever they have to change for any reason. However, many organisations have rushed a migration in the past and then suffered the consequences further down the line. This could be a migration from HTTP to HTTPS, or a move to an entirely new site.
Ideally a migration is fully planned ahead: a strategy put in place, and a migration checklist created and followed. Failing to do this can cause damage that proves costly, both in poor user experience from broken links and in link authority lost from links that no longer redirect.
When URLs change for any reason, it is important to ensure that the old URL redirects to the new URL if the page still exists, or to a relevant page if it no longer does (for example, when a product is removed). Similarly, outdated links in your own content can send crawlers through redirects unnecessarily – an issue that is easily rectified by finding those links and updating or removing them. These issues predominantly arise from a site migration, such as moving from HTTP to HTTPS.
It is best practice to use permanent (301) redirects unless the change is genuinely temporary. If a temporary (302) redirect is used on a page that has permanently moved, search engines may continue to index the outdated page. Redirection is a key part of SEO: it ensures that external link value isn’t lost when URLs change, and prevents users being served broken pages.
It is key to test redirects whenever a new batch is created, in order to ensure that any changes are not affecting older redirects and losing the value they have previously provided.
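Testing a batch of redirects can be partly automated before anything goes live. The sketch below (illustrative, not a definitive tool) takes a simple old-URL-to-new-URL map and flags loops and multi-hop chains, both of which waste crawl effort and dilute link value:

```python
def audit_redirects(redirect_map):
    """Follow each source URL through the map.

    redirect_map maps old URL -> new URL. Returns a dict of
    problem URLs: 'loop' for circular redirects, 'chain' for
    redirects that take more than one hop to resolve.
    """
    issues = {}
    for start in redirect_map:
        seen = [start]
        current = start
        while current in redirect_map:
            current = redirect_map[current]
            if current in seen:
                issues[start] = "loop"
                break
            seen.append(current)
        else:
            if len(seen) > 2:  # start plus more than one hop
                issues[start] = "chain"
    return issues
```

Chains are fixed by pointing every old URL directly at the final destination; loops need the offending rule removed entirely.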
Link profile of the site
A link profile refers to the inbound links pointing to a website and the characteristics they have. Alongside the sheer number of links pointing to your site, a link profile is made up of:
- The type of links pointing to your site
- The anchor text of those links
- How those links were acquired
Sites that have evolved over time but retained the same domain are likely to have a variety of backlinks pointing to them. These can potentially be harmful, as Google’s algorithms have changed over time to catch spammy links. Sites that have used link building tactics in the past may be at risk of penalties which can affect their rankings. As such it is important to keep a check on the links into your site, through link removal and by using Google’s disavow tool.
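Google’s disavow tool accepts a plain-text file with one domain: entry or full URL per line, and #-prefixed comments. A small helper along those lines (the function name here is our own) might look like:

```python
def build_disavow_file(domains, urls=()):
    """Emit the text of a disavow file in the format Google's
    disavow tool accepts: '#' comment lines, 'domain:' entries
    to disavow whole domains, and full URLs for single pages.
    """
    lines = ["# Disavow file for upload via Google's disavow tool"]
    lines += [f"domain:{d}" for d in sorted(set(domains))]
    lines += sorted(set(urls))
    return "\n".join(lines)
```

Disavowing is a last resort after attempting link removal, so keep the input lists small and well evidenced rather than disavowing broadly.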
Thin content
Thin content is low-quality page content which adds little value to the reader. Examples of thin content include:
- Duplicate pages
- Automatically generated (spun) content
- Doorway Pages
Thin content is the problem tackled by Google’s Panda algorithm, which targets pages that, whilst full of search keywords, actually offer little value to the reader.
The best way to measure the quality of your pages is through user satisfaction – pages with a high bounce rate are likely not delivering what the user wants. Look at those pages and ask whether you are delivering the right experience for that audience.
Duplicate content
Duplicate content can be a big issue for website performance. It is content that appears on the internet in more than one place (URL). When there are multiple identical pieces of content on the internet, it is difficult for search engines to decide which version is more relevant to a given search query, and therefore which one to index.
Duplicate content can often go unnoticed, as it tends to sit on obscure pages of a website, or on pages using query-string URLs that have not been properly canonicalised or declared within Search Console. A large amount of duplicate content can hinder search performance, so it is essential to give search engines one sole URL to index, with other instances correctly canonicalised.
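One way to surface duplicate-URL candidates before canonicalising them is to normalise each URL and group the ones that collapse to the same form. A minimal sketch, assuming duplicates differ only by query string, trailing slash or host casing:

```python
from urllib.parse import urlparse, urlunparse

def canonical_form(url):
    """Normalise a URL for duplicate detection: lower-case the host,
    drop the query string and fragment, strip a trailing slash."""
    p = urlparse(url)
    path = p.path.rstrip("/") or "/"
    return urlunparse((p.scheme, p.netloc.lower(), path, "", "", ""))

def duplicate_groups(urls):
    """Group URLs that collapse to the same canonical form;
    return only the groups with more than one member."""
    groups = {}
    for u in urls:
        groups.setdefault(canonical_form(u), []).append(u)
    return {k: v for k, v in groups.items() if len(v) > 1}
```

Each group this returns is a candidate for a single canonical URL, with the other variants carrying a rel="canonical" pointing at it.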
Prevention is better than the cure
When it comes to SEO, it is best to be proactive about your website’s health. Rather than dealing with an issue once it has begun to cost you rankings, traffic and revenue, regularly assessing how your website is functioning in search can stop long-term problems from arising.
Continually monitor your visibility and traffic performance, review your site updates to ensure that you haven’t made any amends that could contain hidden errors, and keep an eye on emerging keywords and trends to ensure that your content remains relevant.