What SEO Audits You Can Do in Netpeak Spider
I realize that sometimes Netpeak Spider settings that help deal with a specific range of tasks more effectively may stay out of sight. That's why I made this blog post to scratch beneath the surface and show different types of website audits. It'll give you insights into how you can solve specific tasks and get various reports for a single project.
The Netpeak Spider crawler has a free version with no time limit and no cap on the number of analyzed URLs. Other basic features are also available in the Freemium version of the program.
To get access to free Netpeak Spider, you just need to sign up, download and launch the program 😉
P.S. Right after signup, you'll also have the opportunity to try all paid functionality and then compare all our plans and pick the one most suitable for you.
1. Quick Audit ‘with Search Robot’s Eyes’
In this audit, you assemble the pages that come into the search robot's 'view' first. That way, you'll know which issues on certain pages need to be fixed sooner than others. I'll showcase each type of audit.
First, enable the settings that help the program imitate a search robot’s behavior:
- In the 'Advanced' settings, select the 'Default: bot' template, which triggers all crawling and indexing instructions needed for this audit.
- On the 'User Agent' tab, choose the search engine robot that your website promotion focuses on: 'Yandex Bot', 'Bingbot', 'Googlebot', etc.
- Search robots don’t crawl the entire website during one session, especially if the website is large. That’s why we’ll set restrictions for the maximum URL depth and maximum crawling depth.
When the settings you need are enabled, start crawling. In the end, the main table will contain only those pages that are likely to be included in the search engine's index.
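Under the hood, a depth-restricted crawl is essentially a breadth-first traversal that stops following links past a given click depth. Here is a minimal Python sketch of the idea, using a toy in-memory link graph instead of real HTTP requests (all page paths are made up for illustration):

```python
from collections import deque

def crawl_with_depth_limit(start_url, get_links, max_depth):
    """Breadth-first traversal that stops max_depth clicks from the start page.

    `get_links` is a stand-in for fetching a page and extracting its links.
    """
    seen = {start_url}
    queue = deque([(start_url, 0)])
    order = []
    while queue:
        url, depth = queue.popleft()
        order.append(url)
        if depth == max_depth:
            continue  # don't follow links any deeper
        for link in get_links(url):
            if link not in seen:
                seen.add(link)
                queue.append((link, depth + 1))
    return order

# Toy link graph standing in for a real website
site = {
    "/": ["/catalog", "/about"],
    "/catalog": ["/catalog/item-1"],
    "/catalog/item-1": ["/catalog/item-1/reviews"],
    "/about": [],
    "/catalog/item-1/reviews": [],
}
pages = crawl_with_depth_limit("/", lambda u: site.get(u, []), max_depth=2)
print(pages)  # pages within 2 clicks; "/catalog/item-1/reviews" is excluded
```

This mirrors why deep pages get skipped: a robot with a limited budget simply never reaches them.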
The received data can be enriched with pages from a sitemap. Use the built-in 'XML sitemap validator' tool for that purpose. You can also improve the report with data from Google Analytics and Search Console showing traffic, clicks, and impressions, which will help define:
- compliant pages without traffic, clicks, or impressions
- non-compliant pages with traffic, clicks, and impressions
Netpeak Spider doesn't let duplicate URLs get into the main table. If GA, GSC, or the sitemap contain pages identical to those already in the table, they won't slip in; only pages that weren't detected during crawling will be added. Keep in mind these reasons why pages can still be missed:
- The search robot doesn't see links to these pages. For instance, to open the link, you need to click on a particular button.
- There are no links to these pages on the website.
- The links are nested too far from the homepage. In this case, you'll need to optimize the crawl budget. We’ll approach this type of audit in the next example.
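Conceptually, enriching crawl data from a sitemap boils down to parsing the sitemap XML and keeping only the URLs the crawler never reached. A minimal Python sketch with the standard library (the sitemap contents and URLs are invented for illustration):

```python
import xml.etree.ElementTree as ET

SITEMAP_XML = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/catalog</loc></url>
  <url><loc>https://example.com/hidden-page</loc></url>
</urlset>"""

def sitemap_urls(xml_text):
    """Extract <loc> values from a standard sitemap."""
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.findall("sm:url/sm:loc", ns)]

# URLs the crawler actually reached
crawled = {"https://example.com/", "https://example.com/catalog"}
missing = [u for u in sitemap_urls(SITEMAP_XML) if u not in crawled]
print(missing)  # pages listed in the sitemap but never reached by the crawler
```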
2. Audit for the Crawl Budget Optimization
In this type of audit, we’ll use segmentation. Before crawling, opt for any settings you need, but remember to select these parameters:
- ‘Status Code’
- ‘Crawling and Indexing’ checkbox
- ‘Links’ checkbox
- ‘Click Depth’
After the crawling is complete, check out these issues:
- Compliant pages with low PageRank. Compliant pages should get more link equity than non-compliant pages because they can potentially bring traffic to the website. Consequently, you should detect pages that receive an insufficient amount of link equity and place more internal links leading to them. To detect such pages, apply a segment and sort the PageRank value in ascending order, as shown on the screenshot.
- Non-compliant pages with high PageRank.
- Pages with a click depth greater than 5 from the initial page. To spot which important pages lie far from the initial one, use a click depth segment.
- Pages that don’t receive link equity.
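For intuition, internal PageRank can be approximated by iteratively redistributing link equity across the internal link graph. The sketch below is a simplified version of the classic algorithm, not Netpeak Spider's exact implementation, and the link graph is a made-up example:

```python
def internal_pagerank(links, damping=0.85, iterations=50):
    """Iterative PageRank over an internal link graph.

    `links` maps each page to the pages it links to. Pages with no
    outgoing links ('dead ends') spread their rank evenly across the site.
    """
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1.0 - damping) / n for p in pages}
        for page, outgoing in links.items():
            if outgoing:
                share = damping * rank[page] / len(outgoing)
                for target in outgoing:
                    new[target] += share
            else:  # dead end: distribute evenly
                for p in pages:
                    new[p] += damping * rank[page] / n
        rank = new
    return rank

# Toy internal link graph
links = {
    "/": ["/catalog", "/blog"],
    "/catalog": ["/"],
    "/blog": ["/", "/catalog"],
}
ranks = internal_pagerank(links)
lowest = min(ranks, key=ranks.get)  # candidate for extra internal links
print(lowest)
```

Pages that come out at the bottom of such a ranking are exactly the ones that need more internal links pointing at them.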
Also, I recommend keeping an eye on links that lead to pages with redirects. They also deplete the crawl budget.
In the 'Export' menu, you can export a detailed report on redirects.
As in the previous audit, you can extend data with links from the sitemap to deal with the ones that weren't found during crawling.
3. Audit at the Start of Website Optimization
This type of technical audit is the opposite of the 'with search robot's eyes' audit: when you begin optimizing a project, it's essential to give it a heavy push in the rankings at the very start. To that end, you should find as many optimization issues as possible and fix them sooner rather than later.
To do so:
- Enable the crawling of the external links.
- Disable crawling and indexing instructions.
- Reset restrictions and rules (if they're set).
- Enable all parameters.
With these settings in action, the program will check website pages for all available SEO issues. To export all reports on the detected issues, use the 'Export' menu.
For a quality SEO audit at the starting line, it's important to focus on fixing:
- broken links
- incorrect content in the canonical tag
- duplicate content (title, description, <body> tag)
- empty title and description
- long server response time
- redirects with bad URL format
To find reports on these issues, go to 'Special issue report' in the 'Export' menu. Speaking of special reports, there is also an 'Issue overview + descriptions' report, which can be used as the basis for a technical task for developers. If needed, export this report.
4. Audit at the Pre-Sale Stage
If you have only a few hot minutes to show the client what major optimization shortfalls drag down the website's performance and to give recommendations on tackling them, you'll need:
- White label report
- The issue overview list with issues detected during crawling, their descriptions, and suggestions on how to handle them.
To get this report, you should go to the ‘Export’ menu.
To sound persuasive at the pre-sale stage, carry out the audit with the settings from the previous example so that the program crawls all pages and collects as many issues as possible.
5. Audit of the External Links
Internal linking optimization is crucial, but don’t forget about optimization of external links. This audit will help you track down low-quality external links, namely:
- broken links
- links that lead to the pages with long server response time
- links to low-quality websites and websites with malicious resources (the checks are conducted in Netpeak Checker).
To do this audit in Netpeak Spider:
- Enable crawling of the external links
- Tick the parameters: 'Status Code' and 'Response Time'.
When the crawling is completed, apply the segment for the external links.
In the end, you'll get the list of all external links and reports on issues that were found on the pages your website refers to.
To swing into high gear, you can open this report in Netpeak Checker to make sure that none of these pages is on Google's 'blacklist'. The Google Safe Browsing service will help you find that out.
6. Audit for Websites That Moved to HTTPS Protocol
After your website moved from HTTP to HTTPS protocol, it’s important to ensure that there are no HTTP links left, direct links to HTTPS are set without redirects, and the mixed content issue doesn’t jeopardise your website security.
Before crawling, enable these parameters:
- ‘Status Code’
- ‘Incoming Links’
- ‘Target URL Redirect’
- ‘Page Hash’
- 'Outgoing Links' checkbox to fish out the mixed content
In the settings, you can also enable the crawling of the additional pages and files (JS, CSS, etc.).
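Mixed content is simply a page served over HTTPS that loads some resource (a script, image, stylesheet, or frame) over plain HTTP. As a rough illustration of what such a check looks for, here is a small Python sketch using the standard library's HTML parser; the sample markup is invented:

```python
from html.parser import HTMLParser

# (tag, attribute) pairs that load sub-resources on a page
RESOURCE_ATTRS = {("img", "src"), ("script", "src"),
                  ("link", "href"), ("iframe", "src")}

class MixedContentChecker(HTMLParser):
    """Flags resources loaded over plain HTTP on an HTTPS page."""
    def __init__(self):
        super().__init__()
        self.insecure = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if (tag, name) in RESOURCE_ATTRS and value and value.startswith("http://"):
                self.insecure.append(value)

page = """
<link href="https://example.com/style.css" rel="stylesheet">
<script src="http://example.com/old-widget.js"></script>
<img src="http://example.com/photo.jpg">
"""
checker = MixedContentChecker()
checker.feed(page)
print(checker.insecure)  # only the http:// resources are flagged
```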
When the crawling is completed, look for the report on links with the HTTP protocol on the 'Overview' tab.
To see the list of incoming links for one page, right-click to summon the context menu or use the F1 hotkey. To view all incoming links for all pages in the report → Shift+F1.
You can also check out the report, which shows the URLs with redirects, where they lead, and the incoming links on these pages.
In the technical SEO audit (PDF), you'll find:
- The total number of links that lead to the pages with redirects and the number of broken links.
- The number of pages with HTTP protocol.
7. Audit of Website Speed
Website speed is an indispensable factor in search engine optimization and one of the most significant ranking factors. Especially in the wake of Core Web Vitals coming into play.
To analyze website speed, crawl your website with these parameters turned on:
- ‘Status Code’
- ‘Response Time’
This way, you'll crawl the website in the blink of an eye with minimum resource consumption on your device. As a result, you'll get information on how fast the server responds to requests and what issues may occur with server response time.
By default, pages that take more than 500 ms to respond fall into Netpeak Spider's report on the 'Long Server Response Time' issue. However, you can adjust the limit on the 'Settings' tab → 'Restrictions'.
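The threshold check itself is straightforward: compare each page's measured response time against the limit. A tiny Python sketch with hypothetical measurements (the 500 ms default comes from the text above; the URLs and timings are made up):

```python
SLOW_THRESHOLD_MS = 500  # default limit mentioned above; adjustable in 'Restrictions'

# Hypothetical response times (milliseconds) collected during a crawl
response_times = {
    "/": 120,
    "/catalog": 480,
    "/search": 1350,
    "/blog/long-post": 730,
}

slow_pages = sorted(
    (url for url, ms in response_times.items() if ms > SLOW_THRESHOLD_MS),
    key=response_times.get,
    reverse=True,
)
print(slow_pages)  # worst offenders first
```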
Additionally, you can check these pages in the 'Google Pagespeed Insights' tool. To do so, right-click on the target URL and choose the service.
Also, you have the opportunity to conduct a bulk URL check in the PageSpeed Insights service using Netpeak Checker. Transfer URLs from Netpeak Spider to Checker, select desired parameters in a sidebar, and start the analysis.
In the technical SEO audit, you'll see this information about server response time.
Server response time and content loading time can vary with the load on the website's server. For instance, pages load at flying speed when there are few visitors on the website, but as the number of visitors grows, the server slows down.
In Netpeak Spider, you can increase the number of requests sent to the website's server, creating a denser load and letting you analyze how fast the requests are processed. We'll examine this case in the next audit type.
8. Audit of Website Load
Before you start the audit, turn off all parameters except those needed to check website speed. Crawl the website with the maximum number of threads. This will help you understand how the server behaves under high load.
Tread carefully: such heedless crawling can temporarily knock the server offline.
Apart from the information about speed, I recommend paying attention to the status codes and checking whether any pages with the 'Timeout' status code were spotted.
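A load test is essentially many concurrent requests fired at the server at once, which is what the thread count controls. Here's a Python sketch with a thread pool; the `fetch` function is a stub that simulates a request rather than hitting a real server:

```python
import concurrent.futures
import time

def fetch(url):
    """Stand-in for an HTTP request; swap in a real client to test a live server."""
    start = time.perf_counter()
    time.sleep(0.01)  # simulated server processing time
    elapsed_ms = (time.perf_counter() - start) * 1000
    return url, elapsed_ms

urls = [f"/page-{i}" for i in range(20)]

# Many worker threads at once roughly model many simultaneous visitors
with concurrent.futures.ThreadPoolExecutor(max_workers=10) as pool:
    results = dict(pool.map(fetch, urls))

# Flag anything that exceeds a (hypothetical) 5-second timeout budget
timeouts = [u for u, ms in results.items() if ms > 5000]
print(f"{len(results)} pages checked, {len(timeouts)} timeouts")
```

Be as careful with a homemade script like this as with the crawler itself: against a production server, high concurrency is a real load.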
9. Audit of Metadata
To check the accuracy of metadata (website content audit), turn on these parameters before crawling:
- ‘Status Code’
- ‘Meta Robots’
- ‘Title Length’
- ‘Description Length’
- ‘H1 Content’
In the settings, enable all crawling and indexing instructions.
As a result, the tool will check pages for the issues related to meta tags:
- ‘Duplicate Title’
- ‘Duplicate Description’
- ‘Duplicate H1’
- ‘Missing or Empty Title’
- ‘Missing or Empty Description’
- ‘Multiple Titles’
- ‘Multiple Descriptions’
- ‘Same Title and H1’
- ‘Max Title Length’
- ‘Short Description’
- ‘Max Description Length’
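Most of these checks reduce to grouping pages by their metadata values and flagging groups with more than one URL, or with empty values. A minimal Python sketch over hypothetical crawl rows (all URLs and tag contents are invented):

```python
from collections import defaultdict

# Hypothetical (url, title, description) rows as a crawler might collect them
pages = [
    ("/red-shoes", "Red Shoes | Shop", "Buy red shoes online."),
    ("/blue-shoes", "Blue Shoes | Shop", "Buy blue shoes online."),
    ("/red-shoes?sort=price", "Red Shoes | Shop", "Buy red shoes online."),
    ("/contacts", "", "Get in touch with us."),
]

by_title = defaultdict(list)
empty_title = []
for url, title, description in pages:
    if title.strip():
        by_title[title].append(url)
    else:
        empty_title.append(url)

# Titles shared by more than one URL -> 'Duplicate Title' candidates
duplicate_titles = {t: urls for t, urls in by_title.items() if len(urls) > 1}
print(duplicate_titles)
print(empty_title)  # 'Missing or Empty Title' candidates
```

The same grouping works for descriptions and H1s; length checks are just `len()` comparisons against the limits you set.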
10. Audit of Images and Media Files
Optimization of images is yet another important aspect of website promotion that you should keep a weather eye on. That's why it's important to follow search engines' recommendations concerning visual content on webpages. Netpeak Spider checks these major issues with images and other media files (audio, archives, etc.):
- ‘Images without ALT Attributes’
- ‘Max Image Size’
- ‘Other SEO Issues’
For media files, general issues are detected (redirects, broken links, etc.).
To do an audit of the issues related to images and audio files, it's enough to crawl the website with default program settings.
After the crawling is completed, you can filter results with the help of the segmentation feature. It's shown on the example screenshot below.
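The 'Images without ALT Attributes' check boils down to spotting `<img>` tags whose `alt` is missing or empty. A small Python sketch with the standard library's HTML parser; the sample markup is invented:

```python
from html.parser import HTMLParser

class ImgAltChecker(HTMLParser):
    """Collects <img> tags whose alt attribute is missing or empty."""
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not (attrs.get("alt") or "").strip():
                self.missing_alt.append(attrs.get("src", "(no src)"))

html = """
<img src="/img/logo.png" alt="Company logo">
<img src="/img/banner.jpg" alt="">
<img src="/img/photo.jpg">
"""
checker = ImgAltChecker()
checker.feed(html)
print(checker.missing_alt)  # images that need descriptive alt text
```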
11. Audit of Giant Websites
If you're worried that your computer's resources are limited, follow the recommendations below.
The more parameters are enabled during crawling, the more RAM is spent on displaying results. To decrease the likelihood of an error due to a shortage of RAM, crawl the website twice.
During the first crawl, disable all parameters except 'Status Code'. This way, the program will quickly find all website pages without a significant load on RAM. When the crawling is completed, save the project.
Before you start crawling for the second time, clear the address bar, go to the 'Parameters' tab, tick the necessary points, and hit 'Restart'.
In this case, the crawling will be carried out according to the URLs list, which is less resource-consuming than crawling in the standard mode.
As the crawling continues, you'll see the data on the selected parameters appear in the results table.
12. Audit of Multilingual Website
If the website targets audiences in different countries, multilingual content is a must. The website may have several alternative language versions of each page. For search engines not to mistake them for duplicates, it's vital to specify the foreign-language versions of the website in the hreflang attribute. The use of this attribute entails a range of rules, which makes it easy to make a mistake.
In Netpeak Spider, you can figure out whether all conditions have been met. If the program notices a failure, it will point to the issue's source in the corresponding reports.
Before you start the audit, it's enough to enable the 'Hreflang Language Code' and 'Hreflang Links' parameters from the hreflang group, plus the 'Status Code' parameter.
The crawler will check this tag, and the spotted issues will be displayed in corresponding reports.
As a result, you’ll check if:
- the current pages contain hreflang links
- the hreflang attribute doesn't include broken links
- the language codes are correct and there are no duplicate codes
- there are alternative URLs without duplicates
- there are confirmation links in the hreflang attribute
- hreflang is free of relative links and/or links to non-compliant URLs
- language code in confirmation hreflang links is consistent
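The confirmation (return) link check, in particular, can be pictured like this: every alternate URL a page declares must link back to that page. A simplified Python sketch over made-up hreflang annotations:

```python
# Hypothetical hreflang annotations: page -> {language code: alternate URL}
hreflang = {
    "https://example.com/": {"en": "https://example.com/",
                             "de": "https://example.de/"},
    "https://example.de/": {"de": "https://example.de/"},  # no return link to 'en'
}

def missing_return_links(hreflang):
    """Find (page, code, alternate) triples where the alternate never links back."""
    problems = []
    for page, alternates in hreflang.items():
        for code, alt_url in alternates.items():
            back = hreflang.get(alt_url, {})
            if page not in back.values():
                problems.append((page, code, alt_url))
    return problems

issues = missing_return_links(hreflang)
print(issues)  # the German page forgot its return link to the English one
```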
Besides, you can find a separate report on the links from the hreflang attribute in the program. You can open it in the 'Database' menu.
If a separate domain is used for each language, audit the group of websites to crawl all website versions.
13. Audit of the Group of Websites (Multi-Domain Crawling)
To crawl several domains in different languages:
- Make sure that the hreflang attribute check is enabled.
- Insert the URL of one of the website domains into the main table and crawl it.
- Open the hreflang report, copy the list of all domains and transfer them to the main table.
- Go to the ‘General’ settings, enable the multi-domain crawling and start the program.
In this mode, the program will crawl all pages from the domains listed in the main table during one session. Thus you'll gather complete information about all language domains of your website.
Note that the program will mix the pages from different domains in the issue reports. If you want to export reports on each domain separately, use segmentation.
For instance, to get reports separately, you should go to the 'Overview' tab, choose a necessary domain, and apply it as a segment.
Now you can approach all parameters in the table and export the examples of audit reports for a picked domain. The same steps should be repeated for the remaining domains.
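Segmenting by domain is conceptually just grouping URLs by their host. A quick Python illustration with invented URLs:

```python
from collections import defaultdict
from urllib.parse import urlsplit

# Hypothetical URLs gathered during one multi-domain crawl
crawled = [
    "https://example.com/catalog",
    "https://example.de/katalog",
    "https://example.com/blog",
    "https://example.fr/blog",
]

by_domain = defaultdict(list)
for url in crawled:
    by_domain[urlsplit(url).netloc].append(url)

for domain, urls in sorted(by_domain.items()):
    print(domain, len(urls))  # one group per language domain
```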
Multi-domain crawling is a Pro feature of Netpeak Spider. Eager to have access to this and other PROfessional features? They include:
- white label reports that allow branding
- export of search queries from Google Search Console and Yandex.Metrica
- integration with Google Drive / Sheets, etc.
Hit the button to purchase the Pro plan, and get your inspiring insights!
And if you already have a Standard license but crave the Pro plan, you need to upgrade your subscription 😉
14. Internal Linking Audit
Internal linking contributes to website usability and distributes link equity so that search engines can index landing pages better.
To carry out this audit, it's enough to choose the parameters template 'For PageRank'.
The program will crawl the entire website and fetch all necessary information about internal linking, including:
- Internal links and external links with anchor texts, values in the rel attribute and type of the link in the source code of the page. The reports are stored in the 'Database' module.
- Information about the distribution of link equity inside the website: the number of incoming links that each page gets, important pages that receive insufficient link equity, and vice versa, unimportant pages with high PageRank.
Also, the 'Internal PageRank calculation' tool will show the 'dead ends' that burn link equity, as well as how equity is distributed across the website in general.
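A 'dead end' in this sense is simply a page with no outgoing internal links: it receives link equity but passes none on. A one-liner sketch over a made-up link graph:

```python
# Made-up internal link graph: page -> pages it links to
links = {
    "/": ["/catalog", "/terms"],
    "/catalog": ["/", "/catalog/item-1"],
    "/catalog/item-1": ["/catalog"],
    "/terms": [],  # a dead end: gets link equity but links nowhere
}

dead_ends = [page for page, outgoing in links.items() if not outgoing]
print(dead_ends)
```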
15. Audit of Optimization Priority
The audit of the optimization priority will help detect pages that receive an insufficient volume of organic traffic compared to paid traffic. To find such pages, you should complete two tasks:
- Task #1. Find pages which receive tons of paid traffic compared to organic.
- Task #2. Among these pages, find those that rank poorly according to Google Search Console and analyze their parameters in Serpstat and Netpeak Checker.
Let’s get our hands a bit dirty 😃
- Add your Google account in Netpeak Spider. We've described how to do this in the article 'Integration with Google Analytics and Search Console.'
- Crawl the website or upload the URLs from Google Analytics (see the 'List of URLs' menu).
- Set paid traffic in the segment dropdown menu in the Google Analytics settings.
- Export data on sessions into the main table via the 'Analysis' menu.
- Export the received report.
- Set the organic traffic in the segment settings and repeat the steps described above.
- Merge the two reports and compare the balance of paid and organic traffic. In the end, you'll figure out which pages get more paid traffic than organic.
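The comparison in the last step can be pictured as a simple merge of the two exported session tables. A Python sketch with invented session counts:

```python
# Hypothetical session counts from the two export passes
paid = {"/landing-a": 900, "/landing-b": 150, "/blog/guide": 20}
organic = {"/landing-a": 45, "/landing-b": 300, "/blog/guide": 800}

# Pages where paid sessions dominate organic -> candidates for SEO work
priority = sorted(
    (url for url in paid if paid[url] > organic.get(url, 0)),
    key=lambda u: paid[u] - organic.get(u, 0),
    reverse=True,
)
print(priority)  # biggest paid-over-organic gap first
```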
Upload the crawled URLs into the new project in Netpeak Spider and enable the ‘GSC: Average Position’ parameter.
If you have an active account in Serpstat, you can upload the URLs list and get information about the number of words the website ranks for on particular positions in Google. To do so, turn on the parameters depicted on the screenshot.
By combining different crawling settings and parameters, you can flexibly tailor Netpeak Spider, automatically detect specific optimization issues, and get reports relevant to your tasks. This will help you conduct a complex website audit and tackle detected issues more quickly and effectively.