What our tool does

Our content optimisation tool removes the laborious process of identifying which pages are best to target for a given keyword, especially across large keyword datasets. Specific URLs can be supplied to the system, or the system can discover all potential URLs itself by crawling the site – similar to how Google discovers the URLs it chooses to rank.

The page crawlers work just like Googlebot: using a headless browser, they render each page’s content before applying our scores. This means that if your site relies heavily on JavaScript or is built in a framework like Angular, we can still properly analyse your content.

Our content optimisation process

The tool works by taking in all potential target keywords, optionally categorised, and cross-referencing these against all potential URLs to determine the overall relevancy relationship. The relevancy between keywords and URLs is determined by a bespoke algorithm which looks at all key optimisation elements and scores how well optimised each is, per page, for the given term.
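To illustrate the idea of cross-referencing keywords against page elements, here is a minimal sketch. The element names and weights are illustrative assumptions, not the tool's actual algorithm:

```python
# Minimal sketch of keyword-to-page relevancy scoring.
# The elements and weights below are illustrative assumptions,
# not the production algorithm.

DEFAULT_WEIGHTS = {"title": 3.0, "h1": 2.0, "body": 1.0}

def element_score(keyword: str, text: str) -> float:
    """Fraction of the keyword's words that appear in the element text."""
    words = keyword.lower().split()
    text = text.lower()
    if not words:
        return 0.0
    return sum(1 for w in words if w in text) / len(words)

def relevancy(keyword: str, page: dict, weights: dict = DEFAULT_WEIGHTS) -> float:
    """Weighted average of per-element scores, in the range 0..1."""
    total = sum(weights.values())
    return sum(weights[el] * element_score(keyword, page.get(el, ""))
               for el in weights) / total

page = {"title": "Blue widgets for sale",
        "h1": "Blue widgets",
        "body": "We sell widgets."}
print(round(relevancy("blue widgets", page), 2))  # → 0.92
```

Scoring every keyword against every URL this way yields a relevancy matrix, from which the best-matching page per keyword can be picked out.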

It even uses Google data – as they have a pretty advanced search algorithm – to supplement our dataset and further improve the accuracy of the output.

The algorithm itself is fully customisable through advanced reporting settings. These allow users to up-weight scores based on current ranking performance, tweak how heavily an element (e.g. the title tag) is scored, and cater for semantic indexation (e.g. pluralisation). The output then determines how well optimised a page is and where content gaps may be present within the site.
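Two of those settings can be sketched in a few lines – overriding a per-element weight and catching plural variants with a naive normalisation. The names and the stemming rule are assumptions for illustration only:

```python
# Sketch of two illustrative reporting settings: per-element weight
# overrides and crude singular/plural matching. These are assumed
# names, not the tool's real configuration.

def normalise(word: str) -> str:
    """Very crude pluralisation handling: strip a trailing 's'."""
    return word[:-1] if word.endswith("s") and len(word) > 3 else word

def matches(keyword: str, text: str) -> bool:
    """True if every (normalised) keyword word appears in the text."""
    kw = {normalise(w) for w in keyword.lower().split()}
    tx = {normalise(w) for w in text.lower().split()}
    return kw <= tx

# Up-weight the title tag relative to a default of 3.0.
weights = {"title": 3.0, "h1": 2.0, "body": 1.0}
weights["title"] = 5.0

print(matches("blue widget", "Blue widgets for sale"))  # → True
```

A real implementation would use proper stemming or lemmatisation rather than stripping a trailing "s", but the principle – matching semantic variants rather than exact strings – is the same.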

Creating optimised content

If you already have content, the tool helps you quickly understand which page is best optimised for your keyword set and how that page could be tweaked to perfection. Where you don’t have content, we can quickly identify those content gaps so that new traffic potential can be unlocked.

As well as a content relevancy report, the tool produces a page data report, a broken links report and a redirect chains report – similar to those of tools like DeepCrawl. Because we use server-side crawling, we avoid the limitations of locally run crawling tools (e.g. Screaming Frog). Analysing these supporting reports uncovers a number of potential SEO issues, specifically areas of the site that should not be indexable or where potential duplicate content issues are occurring.
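A redirect chains report, for example, boils down to following each URL's redirect target until a final page (or a loop) is reached. Here is a minimal sketch, with the crawl output represented as a simple mapping – an assumed stand-in for real crawler data:

```python
# Sketch of building a redirect chains report from crawl data.
# The crawl output is modelled as a {url: redirect_target} dict,
# an illustrative stand-in for real crawler results.

def redirect_chain(url: str, redirects: dict) -> list:
    """Follow redirects from url, stopping at a final page or a loop."""
    chain = [url]
    seen = {url}
    while url in redirects:
        url = redirects[url]
        if url in seen:  # redirect loop detected; stop following
            break
        chain.append(url)
        seen.add(url)
    return chain

redirects = {"/old": "/interim", "/interim": "/new"}
print(redirect_chain("/old", redirects))  # → ['/old', '/interim', '/new']
```

Any chain longer than two entries is a multi-hop redirect worth flattening, since each extra hop wastes crawl budget and dilutes link equity.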

One thing is certain: when we increase a client’s content optimisation score, we see a direct improvement in organic rankings.