Netpeak Checker 3.0: New Version Overview
Read a detailed post about the update → 'Netpeak Checker 3.0: SERP Scraping'
Our company has released Netpeak Checker 3.0, and in this video I will give you a detailed overview of the new program features. We've expanded the program's capabilities, developed a new tool, and cut RAM consumption by more than half!
The latest updates have turned Netpeak Checker into an essential program for SEO and PPC specialists, marketers, bloggers, webmasters, and sales managers. So stay tuned to get acquainted with all the important features :)
1. SERP Scraping
The most important functional change is the new built-in SERP scraper for Google, Yahoo, Yandex, and Bing. With this tool, you can get data from search engines for any list of queries, and you can refine searches with special operators and phrases.
Let's take these three queries as an example for analysis: 'flight to Amsterdam', 'cheap flights to Netherlands', and 'direct flight to Amsterdam'. I deliberately chose very similar queries to show off a unique feature of our tables – grouping results by any column. You can see that it currently groups by the values in the 'Query' and 'Search engine' columns. When I turn it off, the results appear as a plain list of pages. Let's scroll to the 'Host' parameter and drag its header to the top zone of the table. As a result, we get a report that reflects the current visibility of domains in the search results for the analyzed queries.
By the way, you can enter many more queries here – for example, part of the semantic core for an online shop category – and track positions right in Netpeak Checker. You can add queries manually, from the clipboard, or from a .txt file. To start scraping, press the 'Start' button.
The information you scrape contains not only positions, but also:
- Title and description, scraped from the SERP snippet.
- Highlighted text → the search engine uses bold text to mark exact keyword matches or synonyms.
- Sitelinks found in the results, and snippet ratings created with structured data.
- Featured snippet → shows whether the result is pinned at the top of the results page. If it is, the corresponding column will contain the value 'TRUE'.
The last three columns of the table are 'Host', 'Query', and 'Search engine'. They make it easier to group, filter, and sort results, and to tell them apart.
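For instance, after grouping by 'Host', the report might look like this (the hostnames and positions below are purely illustrative):

    Host                 | Query                        | Search engine | Position
    flights-example.com  | flight to Amsterdam          | Google        | 1
    flights-example.com  | direct flight to Amsterdam   | Google        | 3
    tickets-example.net  | cheap flights to Netherlands | Google        | 2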
Finally, I've added the query «iphone -site:apple.com» to illustrate how search operators work in our program. You can use any advanced operators and special symbols, from '-' or 'site:' to more complex combinations. Note that the supported operators may differ from one search engine to another.
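To give you a feel for how operators combine, here are a few common Google operators (check each engine's documentation, as support varies):

    iphone -site:apple.com           → results about 'iphone', excluding apple.com
    "direct flight to Amsterdam"     → exact-match phrase search
    site:example.com intitle:sale    → pages on example.com with 'sale' in the title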
SERP scraping is a Pro feature of Netpeak Checker. To get access to this and other Pro features, such as website traffic estimation and export to Google Drive / Sheets, you need to subscribe to the Netpeak Checker Pro plan.
Hit the button to purchase the Pro plan and get your inspiring insights!
1.1. 'SE Scraper' Settings
Let's go deeper into the scraper settings. On this tab, you can define:
- The search engine whose results you want to analyze.
- The number of results you need (choose one of the presets or the maximum, or set your own custom value).
- Whether to add other snippet types (video, image, news, or sitelinks) to the report as results.
At the bottom of the sidebar, we've placed four links to the settings that matter most for the scraper. The first one leads to the search engine configuration. For Google, you can refine a query by exact geolocation, country, language, and time frame to get exactly the results you need. This helps with better local targeting (so-called 'Local SEO') and with exploring the search results for a specific region. The Bing and Yahoo settings are not as rich as Google's because of the capabilities of those systems.
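Netpeak Checker's internals are not public, but conceptually a localized Google query boils down to a handful of URL parameters. Here is a minimal sketch: 'hl', 'gl', 'num', and 'tbs' are standard Google search parameters, while everything else is illustrative:

    # Conceptual sketch of a localized Google search request.
    # hl = interface language, gl = country, num = results per page,
    # tbs=qdr:{h,d,w,m,y} = time frame (hour, day, week, month, year).
    from urllib.parse import urlencode

    def build_serp_url(query, language="nl", country="NL",
                       results=50, timeframe="m"):
        params = {
            "q": query,                  # the query, operators included
            "hl": language,              # interface language
            "gl": country,               # country used for geotargeting
            "num": results,              # number of results to return
            "tbs": f"qdr:{timeframe}",   # restrict to the last month
        }
        return "https://www.google.com/search?" + urlencode(params)

    print(build_serp_url("direct flight to Amsterdam"))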
The next tab you will use while working with the scraper is 'CAPTCHA'. Here you can enter a key from the automatic captcha solving service anti-captcha.com. The program also lets you check your balance on this service, so you don't get distracted from your work. Automatic captcha solving is one of the key features for bulk search engine checks.
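Under the hood, a balance check like this is a single API call. Here is a minimal sketch against anti-captcha.com's public getBalance method (the endpoint and fields may change over time, so treat this as an illustration rather than a reference):

    # Query your anti-captcha.com balance, as the 'CAPTCHA' tab does.
    import json
    from urllib.request import Request, urlopen

    def get_balance(client_key):
        payload = json.dumps({"clientKey": client_key}).encode()
        req = Request("https://api.anti-captcha.com/getBalance",
                      data=payload,
                      headers={"Content-Type": "application/json"})
        with urlopen(req) as resp:
            return json.load(resp)["balance"]

    # print(get_balance("YOUR_KEY"))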
Another feature is using proxies to spread a large number of requests across several IPs and reduce the chance of being banned. On the 'List of proxies' tab, you can add a list of proxies and check their internet connection or the status code they return when connecting to search engines. Together, these two features are the best remedy for the captcha problem during large-scale SERP scraping.
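The idea behind spreading requests across proxies is simple rotation. Below is a purely illustrative round-robin sketch – the proxy addresses are placeholders, and the 'Proxy Anti-Ban' algorithm described later in this post is more sophisticated than this:

    # Rotate requests across a proxy pool so that no single IP
    # sends enough traffic to trip a search engine's limits.
    from itertools import cycle

    proxies = cycle([
        "http://203.0.113.10:8080",   # placeholder addresses from the
        "http://203.0.113.11:8080",   # TEST-NET range (RFC 5737)
        "http://203.0.113.12:8080",
    ])

    for query in ["flight to Amsterdam", "cheap flights to Netherlands"]:
        proxy = next(proxies)
        # each request would be sent through a different proxy here
        print(f"{query!r} via {proxy}")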
1.2. 'SE Scraper' Interface Overview
Let's continue our tour with the elements of the tool's control panel that you will use to work with the retrieved information. The first two buttons are 'Transfer URLs' and 'Transfer hosts' → they add the scraping results to the main table for further analysis. Moreover, the tool keeps the results during the active session: even if you close the scraper, the retrieved data is saved. Please note that after you completely quit Netpeak Checker, this data will be erased.
To save the data on your PC, you can use the following functions:
- 'Save URLs' → creates a .txt file with the list of pages scraped from the SERP.
- 'Export' → downloads the current table. If you've applied any filters, they are applied to the export as well.
Some useful options are also available in the context menu. Call it by right-clicking any cell of the table when you want to:
- Open URL in browser → opens the selected page in your default browser.
- Filter by value → immediately cuts off pages whose parameters do not exactly match the value used for filtering. This feature is also available in the main program interface.
- Transfer URLs to the main table → use it when you need to analyze a specific page against the parameters in the main table of Netpeak Checker.
- Open SERP → lets you see the request that was sent to retrieve the data.
2. Changes in Netpeak Checker Main Interface
Let's switch to the main interface of Netpeak Checker. We've redesigned the control panel, and the 'Parameters' sidebar has been moved to the right. Now, when you click on any parameter or group of parameters, its detailed description appears in the bottom info panel. If you click on a table cell, you will see all the data for the selected URL there.
Some improvements have also been made to the 'URL Explorer' table → now you can see all the filter conditions and the percentage of matching pages. If you often apply the same filters to your results, you can create your own filter templates for further use.
All program tables now have a quick search feature, which looks for an entered value across all table columns – quick and accurate, just as you like it!
3. Parameter Updates
We've also added parameter templates → you can use the presets or create your own templates to quickly select many necessary parameters for analysis. By the way, after you click on a parameter, the table immediately scrolls to it – handy for navigating large tables. Try finding a parameter you need and clicking on it.
The most popular combinations have been added as presets for further use:
- 'All free' → created for quick selection of all parameters that do not require a paid subscription. We also ticked the Moz parameters, as you can get this data for free, but only after signing up and receiving a unique access token from Moz.
- 'Link building' → helps you evaluate websites for further link building. It works best with the 'SE Scraper', as you can find promising websites for cooperation in a few clicks.
- 'Dropped domains: basic' and 'Dropped domains: advanced' → allow a two-step search for domains whose registration has expired: use the first template to cut off domains that will not match your search (based on response status code and IP address), then analyze the rest more deeply with the advanced template.
- 'Contacts search' → used mostly by sales teams for lead generation. You can find websites of potential customers using the 'SE Scraper' and extract emails from the page source code using the main interface of the program.
Let me remind you that during the analysis of any webpage you can get a list of emails, links to social networks, and the contents of hreflang link attributes, and check their language.
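Email extraction of this kind usually comes down to a regular expression run over the page source. Here is a minimal sketch, assuming a deliberately simple pattern (real-world extraction, including Netpeak Checker's, handles many more edge cases):

    # Pull email addresses out of a page's HTML source.
    # The regex is simple on purpose and will miss exotic addresses.
    import re
    from urllib.request import urlopen

    EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

    def extract_emails(url):
        html = urlopen(url).read().decode("utf-8", errors="ignore")
        return sorted(set(EMAIL_RE.findall(html)))

    # print(extract_emails("https://example.com/contacts"))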
4. Changes in Netpeak Checker Settings
In Netpeak Checker 3.0, we have visually separated the crawling settings from the external services' parameters. Now, on the 'On-Page' tab, you can configure the analysis speed and User Agent and set credentials for basic authentication – these settings affect only the On-Page parameters.
We've already covered the 'Search engines' tab when we talked about SERP scraping. All the external service tabs (Serpstat, Majestic, Ahrefs, etc.) are for connecting Netpeak Checker to these services' APIs: you need to enter your API tokens there. Note that you can see your limits and current balance right in the settings.
The last two tabs are for checking proxies and adding them to the program. Users often ask us how to avoid the limits imposed by various services (e.g. search engines). That's why we did some research to find the optimal number of requests and proxies for bulk SERP scraping. As a result, we've implemented the 'Proxy Anti-Ban' algorithm: it uses your resources efficiently while keeping the risks much lower.
5. Optimization in Existing Algorithms
We've improved data export from the program. Now you get a so-called 'snapshot': you can export a table with all your sorting, grouping, quick search results, column reordering and pinning, and filtering preserved.
If an exported .xlsx file would contain more than 1,048,575 lines (the Excel worksheet limit, minus the header row), Netpeak Checker automatically splits the data array into several files. Note that you can export reports in both .xlsx and .csv formats.
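The splitting logic is easy to picture. Here is a rough, purely illustrative sketch of chunked export (this is not Netpeak Checker's actual code):

    # Split a data array into .xlsx-sized chunks: Excel allows
    # 1,048,576 rows per sheet, so one header row leaves room for
    # 1,048,575 rows of data per file.
    MAX_ROWS = 1_048_576 - 1   # sheet limit minus the header row

    def chunk_rows(rows, size=MAX_ROWS):
        for start in range(0, len(rows), size):
            yield rows[start:start + size]

    rows = list(range(2_500_000))          # stand-in for report rows
    for i, chunk in enumerate(chunk_rows(rows), 1):
        # each chunk would be written to its own file here
        print(f"report_part_{i}.xlsx: {len(chunk)} rows")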
Finally, I want to show you some features familiar to Netpeak Spider 3.0 users that we've also added to Netpeak Checker 3.0:
- The ability to recheck values / rows / columns has been added to the main program interface so you can restart the analysis of the data you need.
- Memory limit monitoring → if you have less than 128 MB of free RAM or disk space left, the program automatically stops the analysis to prevent data loss (a minimal sketch of this kind of check follows this list).
- If an analyzed page responds with a redirect, Netpeak Checker follows it to get the On-Page parameters from the source code of the target page. Previously, Netpeak Checker did not follow redirects and could not get this data.
- We've also added the 'Open URL in a service' option to the context menu, as we did in Netpeak Spider. This feature opens a selected URL in various services (e.g. Google, Serpstat, Ahrefs, Google PageSpeed, Mobile Friendly Test, etc.). Additionally, we've included the 'Open robots.txt' option, which works the same way as in Netpeak Spider: it opens the robots.txt file from the root folder of the selected host.
- Multi-window mode → this feature lets you work on different projects simultaneously, if your computer is up to it.
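As for the memory limit monitoring mentioned above, the underlying check is straightforward. Here is a minimal sketch using the third-party psutil library; only the 128 MB threshold comes from the description above, and everything else is illustrative:

    # Stop work when free RAM or disk space drops below a threshold,
    # mirroring the safeguard described above.
    # Requires: pip install psutil
    import shutil
    import psutil

    LIMIT = 128 * 1024 * 1024   # 128 MB in bytes

    def resources_ok(path="."):
        free_ram = psutil.virtual_memory().available
        free_disk = shutil.disk_usage(path).free
        return free_ram > LIMIT and free_disk > LIMIT

    if not resources_ok():
        print("Stopping the analysis to prevent data loss")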
In Conclusion
The new version of Netpeak Checker gives you a lot of new possibilities, so don't miss the detailed review of the new features from our CEO Alex Wise, and sign up to get free trial access with no limits and no credit card required. Let me remind you that our Support team is always glad to answer any questions that may come up while you're using our programs. Wishing you bright rays of traffic to your websites ;)