Bing’s New Robots.txt Testing Tool

Bing has announced a new robots.txt tester in its Webmaster Tools. The tool analyses robots.txt files and highlights any issues that may hinder Bing from crawling a site optimally.

robots.txt is an essential file that tells search engine crawlers what they may and may not do on a website. Unfortunately, many websites upload robots.txt files containing errors.

The tool analyses robots.txt, identifies problems, guides publishers through the fetch-and-upload process and checks allow/disallow statements. Changes to the file can be tested offline to catch any errors before pushing them live. Bing’s new tool also offers the option to retrieve the latest version of the file, to see whether it has been changed elsewhere.
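Bing hasn’t published the tester’s internals, but you can approximate the same kind of allow/disallow check offline with Python’s standard-library urllib.robotparser. The rules, user agent and URLs below are illustrative placeholders, not anything from Bing’s tool:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt, tested locally before pushing live.
# Note: urllib.robotparser applies the first matching rule, so the
# more specific Allow line is listed before the broader Disallow.
rules = """\
User-agent: bingbot
Allow: /private/public-page.html
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Check individual URLs against the rules, as a tester would.
for url in ("https://example.com/private/secret.html",
            "https://example.com/private/public-page.html"):
    allowed = parser.can_fetch("bingbot", url)
    print(f"{url} -> {'allowed' if allowed else 'blocked'}")
```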

Source: https://blogs.bing.com/webmaster/september-2020/Bing-Webmaster-Tools-makes-it-easy-to-edit-and-verify-your-robots-txt/

Do websites hit by an algorithm update have to wait for the next core update to recover rankings? – John Mueller’s answer

John Mueller explained that Google’s core algorithms are primarily concerned with understanding how relevant web pages are to search queries.

While in the early 2000s it was true that you had to wait for the next monthly update to see whether changes to your website had a positive or negative impact on rankings, nowadays you should be constantly working on improving your website in order to see a recovery before the next update. In fact, since Google might be working in the same direction for its next core update, you might then see even bigger changes in your site’s performance.

Mueller highlights that one of the oldest but still most useful approaches to content improvement is diversifying your content. This means a mix of evergreen and time-sensitive content, as well as a mix of formats such as how-to articles, tutorials and image-heavy pieces. A varied mix addresses a wider range of needs and targets a broader audience, which helps both you and Google understand what is currently “trending” amongst users. This in turn builds stronger authority signals for your website.

Source: https://www.searchenginejournal.com/google-core-update-recovery-time/380110/

Word Count Is Not A Ranking Factor

There has been a discussion on Twitter as to whether word count is a Google ranking factor. Google’s guidelines state: “Content should be factually accurate, clearly written, and comprehensive.” We often see pages with longer content ranking in higher positions, although sometimes content can cover too much ground (recipe pages are a common example). John Mueller confirmed that word count is not a ranking factor. However, setting yourself a word count as an internal guideline may help you write more comprehensively.
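If you do want to track word count as an internal guideline, a trivial sketch follows. It assumes the third-party requests and beautifulsoup4 packages, and the URL and stripped tags are placeholders for whatever your pages actually contain:

```python
import requests
from bs4 import BeautifulSoup

def word_count(url):
    """Count the words in a page's visible text (an internal guideline, not an SEO target)."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    # Strip non-content elements before counting.
    for tag in soup(["script", "style", "nav", "footer"]):
        tag.decompose()
    return len(soup.get_text(separator=" ").split())

# Placeholder URL.
print(word_count("https://example.com/article"))
```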

Source: https://www.searchenginejournal.com/google-comprehensive-content/379652/

Local businesses can highlight COVID-19 ‘health & safety’ measures with Google My Business

The attributes will start appearing in Search and Maps soon. They should give users confidence and help manage their expectations, and will include:

  • Appointment required
  • Mask required
  • Staff get temperature checks
  • Staff wear masks
  • Temperature check required

Google confirms Search Console reports are delayed

Google notified site owners that it is experiencing longer than usual delays with Search Console reporting: “We're currently experiencing longer than usual delays in the Search Console Index Coverage report. This only affects reporting, not crawling, indexing, or ranking of websites. We'll update here once this issue is resolved. Thanks for your patience!”

A delay like this is not unheard of, as GSC reporting has been delayed several times in the past. The only thing site owners can do is remain patient.

Source: https://www.searchenginejournal.com/google-confirms-search-console-reporting-is-delayed/380759/

Google search changes make in-store shopping easier

Google is making changes to shopping searches to help customers shop safely at local stores. As the pandemic continues, Google aims to combine the safety of shopping online with the immediacy of buying in-store, and is rolling out several enhancements to shopping results designed to help customers shop at nearby businesses easily and safely. These include:

  • Filter by local availability – when searching for products, users can now filter results to show only what’s available near them. After searching, click the “nearby” filter (it works the same way as adding “near me” to your query).
  • Comparison shopping – shoppers can compare local retailers without leaving home. For local queries, Google will display a carousel containing pictures and prices of available products.
  • Curbside or in-store pickup – Google is rolling out labels in shopping searches that indicate which option is available. In addition, a direct link to navigation in Maps helps users get to the store quickly.

Source: https://www.searchenginejournal.com/google-search-changes-make-in-store-shopping-easier/381106/

5 Proven ways to keep visitors on your website

  • Lists – readers and search engines love lists, and the rich snippets that appear in the top positions of SERPs are often lists. Whether numbered or bulleted, lists give readers a concise summary of the content and help them grasp it quickly.
  • Graphs and charts – these are essential when comparing things conceptually or over time, and colour helps them stand out. If you credit yourself as the source, you could earn a valuable backlink whenever other websites use them.
  • Mobile compatibility
  • Sections – without proper sections, information gets buried and you will lose visitors before they have read more than two lines. Headers have been an on-page ranking factor from the beginning; they help both search engines and users understand your content better (a quick way to audit your heading outline is sketched after this list).
  • Put in the time – when creating web content, put yourself in the reader’s shoes. Invest extra time to make sure your content is both readable and broken up into sections. Great content answers questions and educates the reader quickly. Great content takes time to create but lasts forever.
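As a rough illustration of the “Sections” point above, the sketch below prints a page’s heading outline so you can spot missing or buried sections. It assumes the third-party requests and beautifulsoup4 packages, and the URL is a placeholder:

```python
import requests
from bs4 import BeautifulSoup

# Placeholder URL; substitute a page you want to audit.
html = requests.get("https://example.com/article", timeout=10).text
soup = BeautifulSoup(html, "html.parser")

# Print the heading outline, indented by level, so gaps in the
# section structure are easy to spot at a glance.
for heading in soup.find_all(["h1", "h2", "h3", "h4", "h5", "h6"]):
    level = int(heading.name[1])
    print("  " * (level - 1) + f"{heading.name}: {heading.get_text(strip=True)}")
```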

Source: https://www.entrepreneur.com/article/348159

Sandbox Site Speed Comparison Tool

This tool lets you compare site speed across the results of a Google search: you simply enter a search query and it brings up speed results for each ranking site. It could be useful because site speed is a ranking factor; seeing which positions have the best and worst speeds helps you gauge how much a speed improvement might lift your own rankings.
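The tool’s methodology isn’t documented here, but the basic idea can be approximated with a script that times a plain HTTP fetch for a hand-entered list of ranking URLs. Note this measures server response time only, not full rendering speed, and the URLs are placeholders:

```python
import time
import requests

# Hand-entered results for a query; the real tool collects these
# from a Google search automatically.
ranking_urls = [
    "https://example.com/",
    "https://example.org/",
]

for position, url in enumerate(ranking_urls, start=1):
    start = time.perf_counter()
    response = requests.get(url, timeout=15)
    elapsed = time.perf_counter() - start
    print(f"#{position} {url}: {elapsed:.2f}s ({len(response.content)} bytes)")
```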

https://tools.sandboxweb.io/site-speed-comparison-tool

Google Rewrites Meta Descriptions Over 70% of the Time

A study found that Google ignores meta descriptions over 70% of the time for pages on the first page of search results. The study examined search results for 30,000 keywords and found the following:

  • Search position rewrite rate - there is a bump in the rate of rewrites from positions 4 to 6.
  • Search volume rewrite rate - the higher the search volume, the less likely Google is to rewrite the meta description. It is speculated that this is because SEOs prioritise meta descriptions for the keywords with the most search volume.
  • Display length - another factor that can vary is how many characters are displayed. On desktop, displayed descriptions peak at 156 characters and drop off at 165; if the description is displayed in a snippet, this drops to around 142. Rewritten descriptions tend to display between 160 and 167 characters.

Takeaways:

  1. Keep meta descriptions between 150 and 160 characters for regular pages.
  2. Keep meta descriptions for blog posts and other pages with publication dates between 138 and 148 characters.
  3. Put the most important information within the first 100 characters.
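As a quick sketch of how you might apply these length takeaways at scale, the function below checks a page’s meta description against the suggested band. It assumes the requests and beautifulsoup4 packages, and the URL is a placeholder:

```python
import requests
from bs4 import BeautifulSoup

def check_meta_description(url, lower=150, upper=160):
    """Flag meta descriptions outside the suggested length band."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    tag = soup.find("meta", attrs={"name": "description"})
    if tag is None or not tag.get("content"):
        return f"{url}: no meta description found"
    length = len(tag["content"])
    verdict = "ok" if lower <= length <= upper else f"outside {lower}-{upper}"
    return f"{url}: {length} characters ({verdict})"

# Placeholder URL; pass lower=138, upper=148 for pages with publication dates.
print(check_meta_description("https://example.com/"))
```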

Source: https://www.searchenginejournal.com/google-rewrites-meta-descriptions-over-70-of-the-time/382140/

How to Handle Out-of-Stock Products on Ecommerce Platforms

There is no “right way” to handle OOS (out-of-stock) or retired products, but your primary goal should be to remove as much negativity as possible from the customer experience. There are a number of options available:

  • Let the URL return a 200 status code. This is common on a few platforms, most often Salesforce Commerce Cloud sites. When a retired product’s URL is requested, the page returns a 200 status code with the header/footer but no body (which is considered a soft 404). On other platforms it’s common to see the same behaviour with a custom template that mentions the product is OOS and lists alternatives instead, which users can find frustrating and confusing.
  • 200 + soft 404. It is common to include text or a call to action showing the product is OOS and capturing the user’s email so they can be notified when it is back in stock. However, including this text snippet can lead to Google treating the page as a soft 404 and dropping it, which is costly because a user searching for a specific product likely has a high intent to purchase.
  • Redirects. Another common method is to redirect OOS or retired product URLs. This is an obvious choice when the page no longer exists, and you may get more benefit by redirecting to the homepage or the product category page. However, it can also be a negative experience: users expect to see a product but end up on a category page instead.

Google says that retired products should return a 404, which allows Google to process the new page status and drop it from the index. Temporarily OOS products should be left as-is. Communicating stock availability to the user is a good idea, and using structured data to show the product isn’t currently available is also recommended so Google can process it.
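As a minimal sketch of that recommendation, the Flask app below returns a real 404 for retired products and keeps temporarily OOS pages live with schema.org availability markup. The framework choice, routes, catalogue data and currency are all illustrative assumptions; the schema.org availability values (InStock/OutOfStock) are the real ones:

```python
from flask import Flask, abort, render_template_string

app = Flask(__name__)

# Toy catalogue; the statuses stand in for whatever your platform stores.
PRODUCTS = {
    "blue-widget": {"name": "Blue Widget", "price": "19.99", "status": "in_stock"},
    "red-widget": {"name": "Red Widget", "price": "24.99", "status": "out_of_stock"},
    "old-widget": {"name": "Old Widget", "price": "9.99", "status": "retired"},
}

PAGE = """
<h1>{{ p.name }}</h1>
{% if p.status == "out_of_stock" %}<p>Temporarily out of stock.</p>{% endif %}
<script type="application/ld+json">
{"@context": "https://schema.org", "@type": "Product", "name": "{{ p.name }}",
 "offers": {"@type": "Offer", "price": "{{ p.price }}", "priceCurrency": "GBP",
            "availability": "{{ availability }}"}}
</script>
"""

@app.route("/products/<slug>")
def product(slug):
    p = PRODUCTS.get(slug)
    # Retired (or unknown) products return a real 404 so Google drops them.
    if p is None or p["status"] == "retired":
        abort(404)
    # Temporarily OOS products stay live, with availability flagged
    # in structured data and communicated to the user on the page.
    availability = ("https://schema.org/InStock" if p["status"] == "in_stock"
                    else "https://schema.org/OutOfStock")
    return render_template_string(PAGE, p=p, availability=availability)
```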

Source: https://www.searchenginejournal.com/how-to-handle-out-of-stock-products-on-ecommerce-platforms/381538/