Netpeak Spider 2.1.3: Improvements Overview
We haven’t updated you on Netpeak Spider for quite a long time – but only because we were fully focused on advancing Netpeak Checker (by the way, have you tried the new version yet? :)). Now I’m eager to tell you about some important changes and improvements to our SEO crawler.
You can spot 80+ SEO issues and use other basic features in the free version of the Netpeak Spider crawler, which has no time limit and no cap on the number of analyzed URLs.
To get access to free Netpeak Spider, you just need to sign up, download, and launch the program 😉
P.S. Right after signup, you’ll also get the chance to try all the paid functionality, then compare all our plans and pick the one that suits you best.
1. Viewing of custom search and extraction results
We implemented the option to search and extract data with Netpeak Spider about five months ago – peep in here to read more about this feature. Now we’ve tweaked the functionality to make it even more convenient. Let’s have a look at what has changed.
Let me remind you that the results of custom search and extraction live in the right side panel of the tool, under the ‘Search’ tab. There are now two buttons for handling the data:
- ‘All’ → all the search and extraction results are stored in one table; I’ll explain how this table works in more detail in a moment
- ‘Selected search’ → shows the results for the selected search only. This is the analog of the ‘Extraction results’ button from previous versions.
In the new table with all results you can see two tabs:
- First Value → only the first value for each search is displayed here. This gives you a one-to-one view of the results: one result per URL. In our experience, this option is useful in the majority of cases, which is why it’s the default.
- All Values → on this tab, you’ll find all the values for all searches; if a search returns more than one result, the extra results are added to separate columns. Note that we’ve capped this table at a maximum of 250 columns. We hope this will be enough for most of your tasks.
To make it easier to find the data you need, we’ve added two extra filter types: by all cells in the table or by URLs only.
All in all, the extended custom search and extraction functionality covers the widest possible range of an SEO specialist’s tasks.
2. Requirement of administrative rights
We’ve been working hard on this issue, and at last we’re glad to announce that you no longer need administrative rights to install, update, or launch our tools. Hooray!
We recommend uninstalling the old versions of the tools (the new installers let you do this right during the update); otherwise, duplicate shortcuts will appear on your desktop. Note that uninstalling will require administrative rights one last time.
3. Other improvements
3.1. Adding domains with http by default
Working in the ‘List of URLs’ crawling mode has become even more convenient: when you add domains/subdomains from a file or the clipboard without a protocol (http/https), the http protocol is now prepended to all such addresses by default. This applies only to hosts like ‘example.com’ or ‘subdomain.example.com’ – it won’t kick in when you add a full page URL like ‘example.com/page.html’.
After the URLs have been added, you can easily edit any of them with a double left-click.
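To illustrate the rule, here is a minimal Python sketch of the described behavior (a hypothetical re-implementation for clarity, not Netpeak Spider’s actual code):

```python
def add_default_protocol(address: str) -> str:
    """Prepend 'http://' to bare hosts; leave everything else untouched."""
    if "://" in address:
        # Protocol already specified (http/https), nothing to do
        return address
    if "/" in address:
        # Full page URL like 'example.com/page.html' – the rule doesn't apply
        return address
    # Bare host or subdomain: add http by default
    return "http://" + address

print(add_default_protocol("example.com"))            # http://example.com
print(add_default_protocol("subdomain.example.com"))  # http://subdomain.example.com
print(add_default_protocol("example.com/page.html"))  # example.com/page.html
```

The sketch mirrors the distinction above: only bare hosts get the default protocol, while full page URLs are passed through unchanged.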
3.2. Enhanced export
Besides sorting, filtering, and column positions, the following settings are now preserved when exporting data:
- column width
- the position of the frozen column
We’ve done our best to make exported tables match the tables in the tool interface as closely as possible.
3.3. Sorting order in the table
Table sorting now starts in descending order. It may seem like a small change, but extensive testing and user feedback showed us that after crawling your website in Netpeak Spider, you often want to sort the following data from largest to smallest:
- status code → you’ll see 5xx and 4xx errors and 3xx redirects first, and only then pages returning a 200 status code
- number of issues → the more issues a page has, the longer your climb to the top of the SERPs, and visitors to the crawled website will run into usability problems as well
- server response time → the longer the response time, the higher the chance that search engines won’t wait and will move on to crawl other pages or even other websites – not to mention users, who will find it hard to buy or order anything
- Title / Description length → these tags should be kept short; otherwise, you risk triggering search engines’ anti-spam filters, especially if you use these tags to enumerate keywords
- number of outgoing links → keep in mind that every page of your website must be optimized and shouldn’t have a large number of outgoing links, as this makes it harder for search engine robots to crawl the website and can also be seen as an attempt to spam your users with unnecessary links
- and so on
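The status-code case can be sketched in a couple of lines of Python (a toy illustration with made-up URLs, not the tool’s actual code): sorting in descending order surfaces the errors first.

```python
# Hypothetical crawl results: (URL, status code) pairs
results = [
    ("https://example.com/", 200),
    ("https://example.com/old", 301),
    ("https://example.com/missing", 404),
    ("https://example.com/broken", 500),
]

# Descending sort by status code puts 5xx and 4xx errors at the top,
# then 3xx redirects, and 200 pages last
results.sort(key=lambda row: row[1], reverse=True)
print([code for _, code in results])  # [500, 404, 301, 200]
```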
In a nutshell
The latest updates of Netpeak Spider let you:
- view all custom search and extraction results in one table
- install, update, and launch all Netpeak Software products without administrative rights
- export tables that preserve as many table adjustments as possible
- sort results in descending order by default, which saves you time and nerves
These updates didn’t dramatically expand the tool’s functionality, but we’re preparing a major update to version 2.2, and it’s coming really soon. Stay tuned – we’ve got a lot of cool stuff in store for you :)