Top 4 On-site SEO Audit Tools

Undoubtedly, one of the most critical factors in achieving lasting success in search engines is on-site SEO. Much of this work can be automated, and the state of a given website improved through proper analysis of the results, thanks to complementary tools that specialize in on-site SEO analysis.


A very common task we deal with at SEOZEO is managing optimization projects for large e-commerce websites, and it takes significant effort to run a team solely responsible for on-site SEO operations and for improving such sites. Needless to say, there are plenty of practical tools that reduce the sweat factor in this process. Any site with more than 100 pages needs crawl-analysis tools in order to evaluate its standing properly. Thanks to these tools, the SEO health of a website can be measured with far less time and effort.

In this article, I would like to introduce some tools that we commonly use in client work at SEOZEO. While the pricing of these solutions varies (some are free, some are not), they complement each other well: using them side by side covers the shortcomings of any single tool and produces much more reliable results in the evaluation process. This way, more control can be exerted over SEO adjustments and effective optimization strategies can be formed.

Moz Pro

This tool, provided by Moz, one of our international partners, lets us evaluate on-site SEO conditions with easily comprehensible research reports, regardless of the scale of a project. "Comprehensible" means that anyone, even someone with no background in the field, can assess the state of the reported website without much difficulty. Another advantage of Moz Pro is that it offers a variety of services alongside on-site SEO analysis, such as competitor, ranking and link analyses, although I intend to stay within the scope of on-site SEO for this article.

I’d like to elaborate by walking through the procedure. First, a new project is created by entering the website information. A waiting period follows while the site is crawled; for large e-commerce sites with many subpages, crawling can take a few days. The evaluation phase that follows the analysis is essentially an assessment of the site from Google’s perspective, and it is one of the most critical steps, since this is where the changes to be made on the site are determined.


The Crawl Diagnostics tab gives access to all analyzed pages and their status. The illustration above shows an analysis of a medium-sized e-commerce site. Every part of the website that is faulty or requires attention is visible. On the right is information about which issues were found and how many pages were affected by each. Users may inspect each problem individually by browsing categories via the Show button, and every problem is also grouped under the Errors, Warnings and Notices headings.

As for the issues Moz flags as “errors” to be corrected in its analysis results:


4xx and 5xx status codes, missing titles, duplicate page content, duplicate titles and pages blocked by robots.txt are among what Moz regards as critical problems in need of improvement. However numerous these may seem, they will gradually disappear as the right fixes are implemented. If an error message is confusing, each error code is described in detail on dedicated help pages.
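To make the grouping concrete, here is a minimal sketch of how crawl results might be bucketed into errors, warnings and "ok" pages the way a diagnostics report does. The URLs, status codes and severity rules below are invented for illustration; Moz's actual classification is more nuanced.

```python
# Hypothetical sketch: bucketing crawled pages by severity, roughly
# the way Crawl Diagnostics groups Errors / Warnings / OK pages.
# All URLs and rules here are made-up examples.

def severity(status_code, has_title=True, blocked_by_robots=False):
    """Classify a crawled page by a simplified severity rule."""
    if blocked_by_robots or status_code >= 400:
        return "error"        # 4xx client / 5xx server errors, robots blocks
    if not has_title:
        return "warning"      # a missing <title> is a softer issue
    return "ok"

# (url, status code, has <title>, blocked by robots.txt)
pages = [
    ("https://example.com/",         200, True,  False),
    ("https://example.com/old-page", 404, True,  False),
    ("https://example.com/broken",   500, True,  False),
    ("https://example.com/no-title", 200, False, False),
    ("https://example.com/private",  200, True,  True),
]

report = {}
for url, code, titled, blocked in pages:
    report.setdefault(severity(code, titled, blocked), []).append(url)

for level in ("error", "warning", "ok"):
    print(level, len(report.get(level, [])))
```

A real tool would collect the same tuples from live HTTP responses; the summary loop at the end mirrors the per-category counts shown in the report.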

Moz’s crawl analysis keeps reporting the current state of the site at regular intervals until the project is terminated. If needed, users can export reports as PDF or Excel files and keep receiving regular analyses to adjust accordingly. While the $99/month Pro package suits individual users or in-house teams, larger enterprises are advised to use the Pro Plus package at $199/month or the Pro Elite package at $499/month.

Screaming Frog SEO Spider

It can easily be said that Screaming Frog is an indispensable tool for fully understanding how your website performs in organic search and for making sure Googlebot can crawl your site easily after any changes are made. Screaming Frog is a fully SEO-focused and user-friendly desktop solution, and it has an important advantage over Moz Pro on large projects, since it doesn’t restrict analyses to 10,000 or 20,000 pages. This can allow a large-scale crawl to finish in minutes rather than the many hours it would take with Moz. In fact, considering that even Google doesn’t share such data through its own service, Webmaster Tools, it is fair to say that SEO Spider is the more capable data-mining tool.

After downloading and installing the software, it’s quite easy to begin the analysis phase. All that is needed to start is to enter the URL of the site to be analyzed into the appropriate field and press the start button. After the analysis is completed, various visual aids that illustrate the results are displayed along with parameter indicators.


As the figure above illustrates, the results are listed under multiple headings and thus sorted by category. This data can be filtered to show only the most relevant results. For example, the “Internal” tab lists on-site links and shows which pages link to one another, while the “Status Code” column shows the state of each linked page, such as whether it is broken or redirects properly. The “External” tab covers outbound links and their current state, listing the full URLs, the content type (images, text, videos, etc.) and the redirect method (status code).
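The internal/external split behind those two tabs boils down to comparing each link's host with the site's own. The following standard-library sketch shows that classification; the base URL and HTML snippet are invented examples, not Screaming Frog's actual code.

```python
# Minimal sketch of the link classification behind an "Internal" /
# "External" report. example.com and the HTML below are made up.
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkCollector(HTMLParser):
    def __init__(self, base_url):
        super().__init__()
        self.base = base_url
        self.internal, self.external = [], []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href")
        if not href:
            return
        absolute = urljoin(self.base, href)  # resolve relative links
        same_host = urlparse(absolute).netloc == urlparse(self.base).netloc
        (self.internal if same_host else self.external).append(absolute)

html = """<a href="/about">About</a>
          <a href="https://example.com/shop">Shop</a>
          <a href="https://other.org/partner">Partner</a>"""

collector = LinkCollector("https://example.com/")
collector.feed(html)
print("internal:", collector.internal)
print("external:", collector.external)
```

A crawler would run this collector over every fetched page and then request each discovered URL to record its status code, which is exactly the data the “Status Code” column summarizes.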

Other important information Screaming Frog offers is found under “Page Titles” and “Meta Descriptions”. Evaluating these results makes it easy to address problems such as missing or incomplete meta descriptions and duplicate titles. Beyond these, information on everything from heading usage to image analysis is readily available.
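The duplicate- and missing-title checks mentioned above amount to grouping crawled pages by their title text. Here is a small sketch of that grouping; the crawl data is invented for illustration.

```python
# Sketch of the kind of check behind a "Page Titles" report:
# find pages with missing or duplicate titles. The crawl data
# below is a made-up example.
from collections import defaultdict

crawled = {
    "https://example.com/":        "Example Shop",
    "https://example.com/shoes":   "Buy Shoes Online",
    "https://example.com/shoes-2": "Buy Shoes Online",  # duplicate title
    "https://example.com/contact": "",                  # missing title
}

by_title = defaultdict(list)
for url, title in crawled.items():
    by_title[title].append(url)

missing = by_title.pop("", [])
duplicates = {t: urls for t, urls in by_title.items() if len(urls) > 1}

print("missing titles:", missing)
print("duplicate titles:", duplicates)
```

The same grouping works for meta descriptions; only the field being collected changes.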

While Screaming Frog is in general a very successful piece of software, it arguably has a few shortcomings. The most important, in my view, is the lack of detailed information on the duplicate title problem in its analysis results. As I mentioned earlier, this is exactly why multiple tools are needed: the analyses and reports that Screaming Frog cannot provide for duplicate pages are available in Moz Pro.


BrowSEO

Even though all sorts of analyses, from server errors to duplicate title issues, are accessible with Moz Pro and Screaming Frog, BrowSEO is an indispensable tool for examining a site and fully understanding how Googlebot “sees” it; its thoroughly simple interface and analysis reports make it unique. Although the information BrowSEO provides is quite limited, it offers a much better view of the bigger picture of a website’s structure.


After entering the URL of your website into the address bar, you can view the page’s general HTML outline alongside several reports on the right side. BrowSEO uses a color-coding scheme to convey both qualitative and quantitative information about the site’s link structure (on-site links in yellow, external links in red), which helps with visualization, especially on large sites. Under the “Head” heading is fundamental information about the page title, meta descriptions and the site’s infrastructure.

This service helps users avoid some major pitfalls, since it reproduces Google’s perspective well. For example, one could see that critical content delivered via Flash or AJAX isn’t recognized by bots, and take measures against that.


Other eye-catching features of BrowSEO include (a) information about all headings, including how they would appear to search engines in a search result, and (b) a check dubbed “Cloaking Attempt” that tests whether pages look the same to bots as they do to real visitors.
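The core of such a cloaking check is simply fetching the same page with two different User-Agent headers and comparing the responses. The sketch below shows the comparison step offline; the two HTML strings stand in for real responses and are invented examples, not BrowSEO's actual mechanism.

```python
# Rough offline sketch of a cloaking check: compare the page served
# to a normal browser with the page served to Googlebot. The two
# responses below are hard-coded stand-ins for real fetches made
# with different User-Agent headers.
import hashlib

def fingerprint(html):
    # Normalize whitespace so trivial formatting differences don't count
    return hashlib.sha256(" ".join(html.split()).encode()).hexdigest()

as_browser   = "<h1>Welcome</h1> <p>Our full catalogue...</p>"
as_googlebot = "<h1>Welcome</h1> <p>Keyword keyword keyword</p>"

if fingerprint(as_browser) != fingerprint(as_googlebot):
    print("possible cloaking: bots see different content")
else:
    print("content matches for both user agents")
```

A production check would also ignore legitimately dynamic parts of the page (timestamps, session tokens) before comparing, otherwise every site would look like it cloaks.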

This visualization tool is web-based and provided completely free of charge. Analysis outputs can be exported using the “Download Entire Session” button.

Google Webmaster Tools

It is probably safe to say that the most straightforward way to understand Google’s on-site evaluation algorithms is to get the information from Google itself. For this reason, using Webmaster Tools actively and effectively is essential for long-term, lasting SEO success. Arguably the best part of this tool is how beginner-friendly it is: Google has thoroughly simplified everything, providing a suitable environment for inexperienced users. Other qualities that set Webmaster Tools apart are that it is free of charge and easily accessible from mobile devices.

As mentioned, Webmaster Tools allows users to see their own sites through Google’s eyes. While BrowSEO does this too, Webmaster Tools is more comprehensive in its analysis. In addition, it is possible to see exactly which keywords Googlebot picks up during its analysis. If the targeted keywords are not recognized by Google, the website’s content can be developed further to strengthen the link between the content and the intended keywords.
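Before expecting Google to associate a page with a keyword, it is worth checking whether the keyword appears in the page's visible text at all. A trivial sketch of that sanity check, with invented page text and target keywords:

```python
# Hypothetical sketch: verify target keywords actually occur in a
# page's text before expecting Googlebot to pick them up. The text
# and keyword list are made-up examples.
page_text = "We sell handmade leather shoes and boots in Istanbul."
targets = ["leather shoes", "boots", "sneakers"]

lowered = page_text.lower()
found   = [k for k in targets if k in lowered]      # keywords present
missing = [k for k in targets if k not in lowered]  # content gaps to fill

print("covered:", found)
print("to add:", missing)
```

Real keyword analysis weighs placement (titles, headings, anchors) rather than bare substring presence, but the missing list is the same starting point for content development.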


In addition, Webmaster Tools clearly documents crawl errors on a timeline. As the example above shows, Google reports the information from its bots directly to site owners and lets errors be marked as corrected, so that the bots can be notified.

The “Fetch as Google” tool is another way to visualize how a site’s main and secondary pages are indexed. Using these outputs, organized like a tree diagram, preventive measures can be taken against factors that hurt SEO performance.

Finally, under the “Google Index” tab, it is possible to see the number of pages crawled or added to the index, content blocked by bots, and detailed graphs charting the whole process over time.

To conclude,

All four tools mentioned provide unique and effective services. For a comprehensive and superior analysis, I recommend using them together. Though this may seem time-consuming, with some practice these tools will become invaluable to your business. To learn more, it is a good idea to try the free services directly and sign up for trials of the paid ones.
