How PPC Specialists Can Benefit from Netpeak Spider
Use Cases
Netpeak Spider is a tool designed primarily for SEO audits, but its functionality lets you tackle tasks far beyond the SEO realm. And I’m going to prove that to you 😃
In this blog post, I'll showcase how Netpeak Spider can help PPC specialists solve their day-to-day tasks.
- 1. Get the List of All URLs to Identify Potential Landing Pages on Large Websites
- 2. Scrape Necessary Data from Pages to Quickly Generate Advertising Campaigns
- 3. Check Status Codes and Response Time of Landing Pages in Ads and Sitelinks
- 4. Monitor Competitors’ Prices to Adjust Bids
- 5. Collect Data for a Dynamic Remarketing Feed
- Recap
1. Get the List of All URLs to Identify Potential Landing Pages on Large Websites
With Netpeak Spider, you can quickly and easily collect a complete list of URLs for a large website. To do so:
- Run the program.
- In the sidebar parameters, enable only the ‘Status Code’ parameter: we aren’t interested in anything other than the list of URLs, and disabling the rest will speed up crawling.
- Enter the website address in the ‘Initial URL’ field and hit the ‘Start’ button.
- Wait until the crawling ends.
- Open the main table and click the ‘Export’ button to save all URLs to your computer in XLSX, CSV, or Google Sheets format.
- Or open the ‘List of URLs’ menu and select ‘Save the list of URLs to file’ to export the results in TXT format.
And voila! Now you have a list of all pages that you can analyze and sift through to pick landing pages for contextual advertising.
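If you’re curious what such a crawl looks like under the hood, here’s a minimal Python sketch of the same idea — a simple same-domain crawler. It assumes the `requests` and `beautifulsoup4` packages are installed, and `https://example.com` is a placeholder; Netpeak Spider handles all of this (and much more, at scale) in the GUI:

```python
# A minimal same-domain crawler sketch (illustrative only; Netpeak Spider
# does this for you). Assumes requests and beautifulsoup4 are installed.
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

def collect_urls(start_url, limit=500):
    domain = urlparse(start_url).netloc
    seen, queue = {start_url}, [start_url]
    while queue and len(seen) < limit:
        url = queue.pop(0)
        try:
            response = requests.get(url, timeout=10)
        except requests.RequestException:
            continue  # skip unreachable pages
        soup = BeautifulSoup(response.text, "html.parser")
        for link in soup.find_all("a", href=True):
            # resolve relative links and drop #fragments
            absolute = urljoin(url, link["href"]).split("#")[0]
            if urlparse(absolute).netloc == domain and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)
    return sorted(seen)

for page in collect_urls("https://example.com"):
    print(page)
```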
2. Scrape Necessary Data from Pages to Quickly Generate Advertising Campaigns
When you get down to creating titles for contextual advertising, you can lean on the page metadata. As a rule, it contains everything you need for a proper title: keywords and calls to action.
To collect metadata from pages, follow this drill in Netpeak Spider:
- Before crawling, select the ‘Title’ and ‘Description’ parameters in the ‘Content’ group.
- Enter the website address and start crawling.
- When the crawling is completed, export the results.

This data can be used to compose ads. To oil the wheels, I’ve prepared a special template for you.
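For a sense of what’s being extracted, here’s a hedged Python sketch of the same title/description collection for a single page (Netpeak Spider exports this out of the box for a whole site); the URL is a placeholder:

```python
# Illustrative sketch: extract the <title> and meta description from one page.
# Assumes requests and beautifulsoup4 are installed.
import requests
from bs4 import BeautifulSoup

def page_metadata(url):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    title = soup.title.string.strip() if soup.title and soup.title.string else ""
    description_tag = soup.find("meta", attrs={"name": "description"})
    description = description_tag.get("content", "").strip() if description_tag else ""
    return {"url": url, "title": title, "description": description}

print(page_metadata("https://example.com"))
```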
3. Check Status Codes and Response Time of Landing Pages in Ads and Sitelinks
All landing pages must return the required 200 OK status code, and the page load speed must meet the standard (up to 0.5 seconds). It’s easy as pie to check these parameters in the crawler, especially when it comes to large websites (more than 100,000 URLs), because the Google Ads script for checking status codes can handle no more than 20,000 URLs at a time.
To start checking, follow the drill:
- In the parameters, tick the ‘Status Code’ and ‘Response Time’ checkboxes.
- Add the list of landing pages → copy them from a file to the clipboard (Ctrl+C) and paste them directly into the program (Ctrl+V), or use the ‘List of URLs’ menu, whichever way is more convenient for you.
- Start crawling.
- When the crawling is completed, review the data in the main table and the issue report → make sure that none of the crawled pages fall under the ‘Broken Pages’ or ‘Long Server Response Time’ issues.
This check should be done before and after you launch advertising campaigns.
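If you ever want to spot-check a handful of URLs outside the GUI, a plain Python sketch of the same check might look like this. It assumes a hypothetical `landing_pages.txt` file with one URL per line; note that `response.elapsed` measures time to first response, a rough proxy for server response time:

```python
# Illustrative status code + response time check (Netpeak Spider does this
# at scale). Assumes requests is installed and landing_pages.txt exists.
import requests

with open("landing_pages.txt") as f:  # hypothetical input file
    urls = [line.strip() for line in f if line.strip()]

for url in urls:
    try:
        response = requests.get(url, timeout=10)
        seconds = response.elapsed.total_seconds()
        flag = "" if response.status_code == 200 and seconds <= 0.5 else "  <-- check!"
        print(f"{response.status_code}  {seconds:.2f}s  {url}{flag}")
    except requests.RequestException as error:
        print(f"FAILED  {url}  ({error})")
```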
4. Monitor Competitors’ Prices to Adjust Bids
Tracking competitors’ prices will help you adjust your bids profitably. For example, when competitors’ prices for a product are higher than yours, it makes sense to increase the bid for the landing page featuring this product.
To collect prices from websites:
- Go to the product page and highlight the price. Then open the source code and copy the class of the element.
If you’re confused at this stage, open our blog post: ‘Comprehensive Guide: How to Scrape Data from Online Stores With a Crawler.’
- In Netpeak Spider, go to the ‘Settings’ menu → ‘Scraping’ and paste the copied class into the field. Then set the search type to ‘CSS selector’.
- In the sidebar parameters, tick ‘Status Code’ and ‘Scraping’.
- Start crawling.
- When the process comes to an end, take a look at the scraping report → you’ll find it in the ‘Database’ menu → ‘Scraping data’ → ‘Price’.

Thus, you can scrape prices from competitors' sites, analyze and compare them, and revise your pricing policy based on the data.
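To make the CSS-selector step concrete, here’s a hedged Python sketch of the same extraction for one page. Both the URL and the `.product-price` selector are made-up examples — use the class you actually copied from the competitor’s source code:

```python
# Illustrative CSS-selector price scraping (same idea as the 'CSS selector'
# search type in the GUI). Assumes requests and beautifulsoup4.
import requests
from bs4 import BeautifulSoup

def scrape_price(url, selector=".product-price"):  # hypothetical selector
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    element = soup.select_one(selector)
    return element.get_text(strip=True) if element else None

print(scrape_price("https://competitor.example/product/123"))
```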
5. Collect Data for a Dynamic Remarketing Feed
From the Ads remarketing tag, you can pull the product catalog data and the elements needed to create a dynamic remarketing feed:
- ID
- Price
- Item Title
- Item Description
- Final URL
- Image URL
I elaborated on metadata scraping in case #2, and the same technique can also be applied to feed generation.
To scrape the id and image_link, open the product page and find these elements in the source code. To collect ids, find the dynamic parameters of the remarketing tag in the code and copy the script tag’s XPath.

On the ‘Scraping’ settings tab in Netpeak Spider, insert XPath and select the ‘All source code’ data extraction type.
To scrape the image_link, also copy its XPath.

In the settings, set ‘Data extraction’ → ‘Attribute’ with the attribute name ‘src’ (it contains a link to the image).
Crawl the website and transfer the received data into a table to keep working with it.
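For reference, here’s a hedged Python sketch of both XPath extractions using lxml. The URL, the `dynx_itemid` script lookup, and the `product-image` class are illustrative — copy the real XPath expressions from your own pages:

```python
# Illustrative XPath extraction: the remarketing script's source ('All source
# code') and an image src ('Attribute'). Assumes requests and lxml.
import requests
from lxml import html

tree = html.fromstring(
    requests.get("https://example.com/product/123", timeout=10).content
)

# 'All source code' of the remarketing script tag, to pull dynx_itemid from it
script_source = tree.xpath("//script[contains(text(), 'dynx_itemid')]/text()")
print(script_source[0] if script_source else "no remarketing tag found")

# 'Attribute' extraction: the src attribute of the product image
image_links = tree.xpath("//img[@class='product-image']/@src")  # hypothetical class
print(image_links[0] if image_links else "no image found")
```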

If there is no remarketing tag on the website, you can collect the data this way:
- Scrape prices as described above, but instead of the ID, use the item name from the product card.
- To set up the Ads remarketing tag, you can use Google Tag Manager. For the item ID in the feed to match the dynx_itemid on the product pages, you need to add the item code to the dynx_itemid.
Recap
It’s tough to collect the necessary data by hand when launching advertising campaigns, so my advice is to automate this process. I’ve described how Netpeak Spider can help you down the road. Here’s what you can do:
- Collect all website pages to pick potential landing pages.
- Collect metadata to create ads quickly and without extra fuss.
- Check the status codes and webpage loading speed.
- Monitor competitors’ prices.
- Collect data to generate product feeds.