Improving the ranking: What are on-page tools really good for?

William Bachmann

On-page tools analyze a website for its SEO suitability and help improve its ranking. To do this, they identify excessive loading times, duplicate content and URL problems, and they give recommendations for optimizing the website. However, the tools also have their limits.

They index page by page, follow internal links and check the content against search-engine-relevant criteria: on-page tools help improve a website's ranking on Google. The tools start at a specific page and then crawl the individual subpages. The point is not just to get a list of all pages, but to check the content against certain SEO criteria. The on-page tools then also suggest improvements with which operators can optimize their website.

Which on-page tool is suitable for whom depends on both the intended use and the size of the website. For very large websites with more than a million pages, tools like Audisto or Deepcrawl are worth considering (see table). Other tools can crawl large websites as well, but they are typically used for smaller ones. The Screaming Frog SEO Spider, a free tool that is popular in the SEO scene, runs out of memory in the standard installation at a medium five-digit number of pages. Even if the memory limit is raised, the tool still does not handle large amounts of data particularly well. It is also the only tool mentioned in this article that does not run in the cloud.

The Screaming Frog SEO Spider can now also check the rendering of pages.

The functions of the individual tools differ considerably, and the specific scope partly depends on the package booked. In addition, the effectiveness of some functions is unproven. For example, some tools offer text readability analyses based on the Flesch score. However, since it is not known whether Google rates text the same way, the usefulness of many such features remains a matter of debate.
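For illustration, the classic (English-language) Flesch Reading Ease formula is easy to compute yourself. The sketch below uses a crude vowel-group heuristic for counting syllables; the tools mentioned here may well use a language-specific variant of the formula, so treat this purely as an example of the idea.

```python
import re

def count_syllables(word: str) -> int:
    """Crude heuristic: count groups of vowels as syllables."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text: str) -> float:
    """Classic English Flesch Reading Ease:
    206.835 - 1.015 * (words per sentence) - 84.6 * (syllables per word)."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    avg_sentence_length = len(words) / max(1, len(sentences))
    avg_syllables_per_word = syllables / max(1, len(words))
    return 206.835 - 1.015 * avg_sentence_length - 84.6 * avg_syllables_per_word

print(flesch_reading_ease("On-page tools crawl a site. They check titles, links and status codes."))
```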

Advantages of on-page tools

Even though the exact capabilities depend on the tool in question, there are checks that on-page tools generally make possible:

Important page elements

Page title and meta description are important elements of a page, as they are what is primarily shown in the search result. SEOs can also check elements like <h1> very easily using the tools: Are they too long and therefore displayed truncated? Are they not unique, so that Google frequently discards them? Or are they too short, consisting of only a single word, which can lead to an unattractive search snippet?
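A minimal sketch of such a check for a single URL. The character limits are rough rules of thumb (Google actually truncates snippets by pixel width), and the requests/BeautifulSoup combination is just one possible way to do it:

```python
import requests
from bs4 import BeautifulSoup

# Rough character limits; Google truncates by pixel width,
# so these thresholds are only rules of thumb.
MAX_TITLE_CHARS = 60
MAX_DESCRIPTION_CHARS = 160

def check_snippet_elements(url: str) -> list[str]:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    findings = []

    title = soup.title.string.strip() if soup.title and soup.title.string else ""
    if not title:
        findings.append("missing <title>")
    elif len(title) > MAX_TITLE_CHARS:
        findings.append(f"title probably truncated ({len(title)} chars)")
    elif len(title.split()) == 1:
        findings.append("title consists of a single word")

    meta = soup.find("meta", attrs={"name": "description"})
    description = (meta.get("content") or "").strip() if meta else ""
    if not description:
        findings.append("missing meta description")
    elif len(description) > MAX_DESCRIPTION_CHARS:
        findings.append(f"description probably truncated ({len(description)} chars)")

    return findings

print(check_snippet_elements("https://example.com/"))
```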

Canonicalization, duplicate content

Duplicate content problems are common on websites, even though there is only rarely a reason for the often-feared penalty or devaluation. Nevertheless, website operators should get to the bottom of such problems, for example to bundle important SEO signals on a single page or to avoid wasting crawl budget. After all, the number of pages Googlebot crawls each day is limited.
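For exact duplicates, the basic idea is simple: normalize the visible text of each page and compare fingerprints. A minimal sketch with placeholder URLs; near-duplicates would need fuzzier techniques such as shingling, which the commercial tools typically add on top:

```python
import hashlib
from collections import defaultdict

import requests
from bs4 import BeautifulSoup

def text_fingerprint(html: str) -> str:
    """Normalize the visible text and hash it, so pages with
    identical content map to the same fingerprint."""
    soup = BeautifulSoup(html, "html.parser")
    for tag in soup(["script", "style"]):
        tag.decompose()
    text = " ".join(soup.get_text(" ").split()).lower()
    return hashlib.sha256(text.encode("utf-8")).hexdigest()

def find_exact_duplicates(urls: list[str]) -> dict[str, list[str]]:
    groups = defaultdict(list)
    for url in urls:
        html = requests.get(url, timeout=10).text
        groups[text_fingerprint(html)].append(url)
    return {h: u for h, u in groups.items() if len(u) > 1}

# Example: compare a handful of URLs suspected of being duplicates.
print(find_exact_duplicates([
    "https://example.com/product?color=red",
    "https://example.com/product?color=blue",
]))
```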

Content quality / quantity

Content only ranks well if it has the necessary quality and quantity. Some tools can rate content, for example via WDF*IDF analysis. In this way, they enable website operators to recognize and correct textual deficits to a certain extent.
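A simplified sketch of the WDF*IDF idea, assuming the textbook definitions WDF = log2(frequency + 1) / log2(document length) and IDF = log10(N / document frequency). Commercial tools may use their own variants and much larger comparison corpora, so this only illustrates the mechanism:

```python
import math
import re
from collections import Counter

def tokenize(text: str) -> list[str]:
    return re.findall(r"[a-zäöüß]+", text.lower())

def wdf(term: str, doc_tokens: list[str]) -> float:
    """Within-document frequency: log2(freq + 1) / log2(total terms)."""
    freq = Counter(doc_tokens)[term]
    return math.log2(freq + 1) / math.log2(max(2, len(doc_tokens)))

def idf(term: str, all_docs: list[list[str]]) -> float:
    """Inverse document frequency: log10(N / documents containing the term)."""
    df = sum(1 for tokens in all_docs if term in tokens)
    return math.log10(len(all_docs) / max(1, df))

docs = [tokenize(t) for t in (
    "carnival costumes for children and adults",
    "cheap carnival costumes in all sizes",
    "garden furniture made of wood",
)]
for term in ("carnival", "costumes", "garden"):
    print(term, [round(wdf(term, d) * idf(term, docs), 3) for d in docs])
```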

HTTP status codes

In an internal crawl, all URLs should return an HTTP status code of 200, that is, "content was loaded successfully". Incorrect internal links and misconfigurations can, however, result in internal redirects (3xx codes), errors when accessing pages (4xx codes) or server errors (5xx codes). On-page tools can find these errors so that they can then be corrected.
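A minimal sketch of such a status-code check over a list of URLs collected by a crawl; the example.com URLs are placeholders:

```python
import requests

def check_status_codes(urls: list[str]) -> dict[str, list[str]]:
    """Group internal URLs by the class of status code they return.
    Redirects are not followed, so 3xx hops become visible."""
    report = {"2xx": [], "3xx": [], "4xx": [], "5xx": []}
    for url in urls:
        try:
            response = requests.head(url, allow_redirects=False, timeout=10)
            bucket = f"{response.status_code // 100}xx"
        except requests.RequestException:
            bucket = "5xx"  # treat network failures like server errors here
        report.setdefault(bucket, []).append(url)
    return report

print(check_status_codes([
    "https://example.com/",
    "https://example.com/old-page",       # might 301 to a new URL
    "https://example.com/does-not-exist"  # probably a 404
]))
```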

Click depth

The click depth is the minimum number of clicks that a person or a crawler needs to get from the start page to a specific page by following links. If this click depth is too high, the pages in question may no longer be crawled at all and consequently not indexed. Such errors can arise from incorrect pagination, for example.
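Conceptually, click depth is a breadth-first search over the internal link graph. A minimal sketch with a hypothetical, hard-coded link graph standing in for the crawl data:

```python
from collections import deque

def click_depths(start: str, links: dict[str, list[str]]) -> dict[str, int]:
    """Breadth-first search over the internal link graph: the depth of a page
    is the minimum number of clicks needed to reach it from the start page."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

# Toy link graph: "/deep-article" is only reachable via a long pagination chain.
links = {
    "/": ["/category", "/about"],
    "/category": ["/category?page=2"],
    "/category?page=2": ["/category?page=3"],
    "/category?page=3": ["/deep-article"],
}
print(click_depths("/", links))  # '/deep-article' ends up at depth 4
```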

Performance / loading time

The loading time of a page is becoming more important as a ranking factor. Some tools now also measure this performance data, which can vary greatly from page to page. It should be said, however, that Google Analytics also provides this data from real users, and in many cases that data is much more meaningful.
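As a very rough first check outside of any tool, the HTML download time can be measured directly. This is only a sketch: it ignores rendering, images and scripts, which real-user data such as that from Google Analytics covers far better.

```python
import statistics
import time

import requests

def measure_download_time(url: str, runs: int = 5) -> float:
    """Median time to fetch the raw HTML. This ignores rendering, images and
    scripts, so it is only a rough lower bound for the real loading time."""
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        requests.get(url, timeout=30)
        samples.append(time.perf_counter() - start)
    return statistics.median(samples)

print(f"{measure_download_time('https://example.com/'):.3f} s")
```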

URL problems

Many problems can arise with URLs: special characters, excessively long URLs, mixed use of upper and lower case, GET parameters in varying order. Since such a tool crawls all of a website's URLs, it can quickly find cases that could lead to serious problems.
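A sketch of the kind of checks involved; the length threshold and character list here are arbitrary rules of thumb, not values taken from any particular tool:

```python
from urllib.parse import parse_qsl, urlsplit

def url_findings(url: str) -> list[str]:
    """Flag common URL problems: length, upper case, special characters,
    and GET parameters whose varying order may create duplicate URLs."""
    findings = []
    parts = urlsplit(url)
    if len(url) > 115:
        findings.append("URL is very long")
    if parts.path != parts.path.lower():
        findings.append("path mixes upper and lower case")
    special_characters = set("äöüß ,;'\"")
    if any(ch in special_characters for ch in parts.path):
        findings.append("path contains special characters or spaces")
    parameter_names = [key for key, _ in parse_qsl(parts.query)]
    if parameter_names != sorted(parameter_names):
        findings.append("GET parameters are not in a consistent (alphabetical) order")
    return findings

print(url_findings("https://example.com/Kategorie/Winter-Jacken?size=m&color=blue"))
```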

Comparison with the XML sitemap

A crawl can only find content that is internally linked. Sometimes there are also so-called orphan pages: pages that exist but that a crawler, and thus Google, cannot find. It therefore makes sense to compare the generated XML sitemaps with the real crawl in order to track down such problems with internal linking.
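A minimal sketch of such a comparison, assuming a standard XML sitemap and a set of URLs collected by a crawl like the ones described above:

```python
import requests
from xml.etree import ElementTree

def sitemap_urls(sitemap_url: str) -> set[str]:
    """Read all <loc> entries from an XML sitemap."""
    xml = requests.get(sitemap_url, timeout=10).text
    tree = ElementTree.fromstring(xml)
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    return {loc.text.strip() for loc in tree.findall(".//sm:loc", ns)}

def compare_with_crawl(sitemap_url: str, crawled_urls: set[str]) -> None:
    in_sitemap = sitemap_urls(sitemap_url)
    orphans = in_sitemap - crawled_urls      # in the sitemap, but not internally linked
    unlisted = crawled_urls - in_sitemap     # linked, but missing from the sitemap
    print(f"{len(orphans)} orphan candidates, {len(unlisted)} pages missing from the sitemap")

# 'crawled_urls' would come from a crawl such as the ones described above.
compare_with_crawl("https://example.com/sitemap.xml",
                   {"https://example.com/", "https://example.com/category"})
```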

Search analysis data

Some tools also offer an import of search analytics data from the Google Search Console. The main advantage is access to more history than the 90 days offered in the console: the longer the tool is used, the more data accumulates. This data can be used to generate recommendations, for example keyword opportunities, i.e. rankings in positions eleven to 20 that might be pushed onto the first page, or competing pages, i.e. two or more pages that cover the same search term and take turns ranking for it.
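Even without a tool, both analyses are straightforward on an export of the data. The sketch below assumes a CSV file with columns named "query", "page", "impressions" and "position"; the actual file name and column names depend on how the data is exported.

```python
import pandas as pd

# Hypothetical export of Search Console query data; adjust the file
# and column names to whatever your export actually contains.
df = pd.read_csv("search_console_export.csv")

# Keyword opportunities: queries ranking just behind the first page.
opportunities = (
    df[(df["position"] >= 11) & (df["position"] <= 20)]
    .sort_values("impressions", ascending=False)
    .head(20)
)
print(opportunities[["query", "page", "position", "impressions"]])

# Competing pages: the same query answered by more than one URL.
competing = df.groupby("query")["page"].nunique()
print(competing[competing > 1].sort_values(ascending=False).head(20))
```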

Limits of the tools

These points give an insight into what on-page tools can do. However, there are also areas where the tools cannot help.

Coverage of search terms

It can happen that a page is poorly named and targets completely the wrong search term, one that nobody actually types. And it can happen that relevant, attractive search terms exist for which no suitable content is currently available. In these cases, the tools are usually powerless. It should of course be mentioned that Google is getting better and better at so-called matching, so that a page optimized for "carnival costumes" may also be shown for synonymous queries. In many cases, however, this matching does not work, as many search queries, especially very specific ones, are not fully understood in terms of their content.

Content visibility, rendering

Google no longer just downloads the pure HTML of a page, but renders it completely, like a browser, including JavaScript, CSS and image files. Some tools can now also check this rendering (see figure).

(Source: t3n)

Thanks to rendering, Google can recognize whether a certain text is collapsed ("click to expand") or, for example, "hidden" behind tabs. Google's guidelines are clear on this point:

“Make sure that the most important content on your website is visible by default. Google is able to crawl HTML content that is hidden behind navigation elements such as tabs or areas that can be expanded. However, we classify this content as less accessible to users and believe that the most important information should be visible in the standard page view.”

The common tools cannot tell whether content is hidden and whether that content is relevant. And the question of whether particularly important content sits in the directly visible area ("above the fold") or whether the website displays too much advertising usually has to be checked manually.
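While the question of relevance remains a manual judgment, the raw-versus-rendered comparison itself can be automated. A sketch assuming requests, BeautifulSoup and Playwright are available; a large gap between the two text lengths hints at content that only appears via JavaScript, which is conceptually what tools with a rendering check do with a headless browser.

```python
import requests
from bs4 import BeautifulSoup
from playwright.sync_api import sync_playwright

def visible_text_lengths(url: str) -> tuple[int, int]:
    """Compare the text in the raw HTML with the text after rendering.
    A large gap hints at content that is only added via JavaScript."""
    raw_html = requests.get(url, timeout=10).text
    raw_text = BeautifulSoup(raw_html, "html.parser").get_text(" ", strip=True)

    with sync_playwright() as p:
        browser = p.chromium.launch()
        page = browser.new_page()
        page.goto(url, wait_until="networkidle")
        rendered_text = page.inner_text("body")
        browser.close()

    return len(raw_text), len(rendered_text)

raw_len, rendered_len = visible_text_lengths("https://example.com/")
print(f"raw: {raw_len} characters, rendered: {rendered_len} characters")
```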

Further problems

In addition, the tools are usually not helpful for things like:

  • Implementation of internationalization: International companies in particular, which offer content for many countries and languages, often have deficits in their domain strategy. As a rule, it is not advisable to create a separate country-specific domain for every country. But even suboptimal internal linking of the different country and language versions, or a failed redirect based on the IP address, usually cannot be detected by the tools.
  • Optimization of the internal linking: The tools can of course provide information on internally well or poorly linked pages. But only experienced SEOs can determine the optimal link flow.
  • Missing markup: Are there pages that lack certain markup, even though it would be useful there in principle?
  • User-generated content: Are options for user-generated content, for example comments or a forum, currently going unused?
  • Crawler problems: Occasionally there are problems because the Googlebot is treated differently from a "normal" crawler. Anyone who uses redirects based on the IP address will not see these errors if crawling is only carried out from Germany. But how is the Googlebot treated, whose IP usually comes from the USA? A simple user-agent comparison is sketched below.
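For the last point, at least the user-agent half can be tested with little effort: fetch the same URL once with a normal browser user agent and once with Googlebot's publicly documented user-agent string and compare the responses. This is only a sketch; IP-based redirects can only be verified by crawling from the relevant locations, and a server that verifies Googlebot via reverse DNS will treat this request like any other client.

```python
import requests

GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")
BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"

def compare_user_agents(url: str) -> None:
    """Fetch the same URL as a 'normal' browser and with a Googlebot
    user agent, then compare status code, final URL and response size."""
    for label, ua in (("browser", BROWSER_UA), ("googlebot-ua", GOOGLEBOT_UA)):
        r = requests.get(url, headers={"User-Agent": ua},
                         allow_redirects=True, timeout=10)
        print(f"{label:14s} status={r.status_code} final_url={r.url} bytes={len(r.content)}")

compare_user_agents("https://example.com/")
```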

What website operators should consider

A website operator should not only know what his tool can do, but also where its gaps are, so that he can close them with his own analyses or with other tools. To do this, it is important that he knows his tool's features inside out. Many vendors now offer certifications or online tutorials that operators can use.

This knowledge is not limited to the tool itself. The digital helpers often reveal problems, but do not help much with the solution. An example: the click depth of a page is too high. How is this problem to be solved? An operator should therefore be adequately trained not only in the tool, but above all in SEO, in order to make the right decisions.

It should be clear to him, for example, that on-page tools cannot detect every error. A report with zero findings does not mean the website is error-free: a site could have massive gaps in its keyword coverage without the tool raising the alarm. In addition, many errors are systematic in nature; a single template error can occur 10,000 times across 10,000 pages. Website operators should not be put off by the high number of errors.

Perhaps the most important point: users of on-page tools should trust their own judgment. Recommendations provided by the tools are not necessarily relevant for the individual operator. If there are 10,000 meta descriptions that are too long, it does not make sense to shorten them all by hand; as a rule, that would not be economically worthwhile. In the end, it is primarily up to the user to decide whether a change really brings a ranking advantage or is merely a "nice-to-have".
