How PPC Specialists Can Benefit from Netpeak Spider

Netpeak Spider is a tool designed primarily for SEO audits, but its functionality lets you handle tasks far beyond SEO. And I’m going to prove it to you 😃

In this blog post, I'll showcase how Netpeak Spider can help PPC specialists solve their day-to-day tasks.

  • 1. Get the List of All URLs to Identify Potential Landing Pages on Large Websites
  • 2. Scrape Necessary Data from Pages to Quickly Generate Advertising Campaigns
  • 3. Check Status Codes and Response Time of Landing Pages in Ads and Sitelinks
  • 4. Monitor Competitors’ Prices to Adjust Bids
  • 5. Collect Data for a Dynamic Remarketing Feed
  • Recap

1. Get the List of All URLs to Identify Potential Landing Pages on Large Websites

With Netpeak Spider, you can quickly and easily collect a complete list of URLs for a large website. To do so:

  1. Run the program.
  2. In the sidebar parameters, enable only the ‘Status Code’ parameter: we only need the list of URLs, and fewer parameters means faster crawling.

    How to crawl a website in Netpeak Spider
  3. Add the website address into the ‘Initial URL’ field and hit the ‘Start’ button.
  4. Wait until the crawling ends.
  5. Open the main table and click the ‘Export’ button to export all URLs to your computer in XLSX, CSV, or Google Sheets format.

    How to export data in Netpeak Spider
  6. Or open the ‘List of URLs’ menu and select ‘Save the list of URLs to file’ to export the results in TXT format.

    How to export crawl results in TXT format from Netpeak Spider

And voila! Now you have a list of all pages, which you can analyze and sift through to pick landing pages for contextual advertising.
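
By the way, if you ever want to spot-check this step or script it yourself, the same idea fits into a short Python sketch. Everything below is an illustration, not part of Netpeak Spider: the start URL and the page cap are placeholder assumptions.

```python
# A minimal crawl sketch: breadth-first walk of one site, collecting every
# internal URL it can reach. Placeholder values — adjust for your website.
from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

START_URL = "https://example.com/"  # hypothetical initial URL
MAX_PAGES = 500                     # safety cap for this sketch

def collect_urls(start_url, max_pages=MAX_PAGES):
    host = urlparse(start_url).netloc
    seen, queue = {start_url}, deque([start_url])
    while queue and len(seen) < max_pages:
        url = queue.popleft()
        try:
            resp = requests.get(url, timeout=10)
        except requests.RequestException:
            continue  # skip unreachable pages
        if "text/html" not in resp.headers.get("Content-Type", ""):
            continue  # only HTML pages can link to further pages
        for a in BeautifulSoup(resp.text, "html.parser").find_all("a", href=True):
            link = urljoin(url, a["href"]).split("#")[0]
            if urlparse(link).netloc == host and link not in seen:
                seen.add(link)
                queue.append(link)
    return sorted(seen)

if __name__ == "__main__":
    for page in collect_urls(START_URL):
        print(page)
```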

You can perform this task in the Freemium version of Netpeak Spider and view the results in the program table absolutely for free, but to save, filter, segment, and export data, you’ll need the Standard plan.

Compare Plans

2. Scrape Necessary Data from Pages to Quickly Generate Advertising Campaigns

When you get down to creating titles for contextual advertising, you can start from the page metadata. As a rule, it contains everything you need for a proper title: keywords and calls to action.

To collect metadata from pages in Netpeak Spider, follow these steps:

  1. Before crawling, select the ‘Title’ and ‘Description’ parameters in the ‘Content’ group.

    How to crawl metadata from website pages in Netpeak Spider
  2. Enter the website address and start crawling.
  3. When the crawling is completed, export the results.

This data can be used to compose ads. And to oil the wheels, I’ve prepared a special template for you.

Take a Look at Template
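
For the curious, here is what this extraction looks like as a script. A minimal sketch in Python, assuming a hypothetical list of landing page URLs; it writes the same ‘Title’ and ‘Description’ fields to a CSV file:

```python
# A minimal metadata-scraping sketch: pull <title> and the meta description
# from each page and save them to a CSV. The URLs are placeholders.
import csv

import requests
from bs4 import BeautifulSoup

URLS = [
    "https://example.com/landing-1",
    "https://example.com/landing-2",
]

with open("metadata.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow(["URL", "Title", "Description"])
    for url in URLS:
        soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
        title = soup.title.get_text(strip=True) if soup.title else ""
        meta = soup.find("meta", attrs={"name": "description"})
        description = meta.get("content", "") if meta else ""
        writer.writerow([url, title, description])
```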

3. Check Status Codes and Response Time of Landing Pages in Ads and Sitelinks

All landing pages must return the required 200 OK status code, and the page load speed must meet the standard (up to 0.5 seconds). These parameters are easy as pie to check in the crawler, which especially matters for large websites (more than 100,000 URLs) → the Google Ads script for checking status codes can handle no more than 20,000 URLs at a time.

To run the check, follow these steps:

  1. In the parameters, tick the ‘Status Code’ and ‘Response Time’ checkboxes.

    How to check status codes and page loading speed in Netpeak Spider
  2. Add the list of landing pages → you can copy them from a file to the clipboard (Ctrl + C) and paste them directly into the program (Ctrl + V), or use the ‘List of URLs’ menu, whichever way is more convenient for you.

    How to insert a list of URLs to crawl them in Netpeak Spider
  3. Start crawling.
  4. When the crawling is completed, review the data in the main table and the issue reports → make sure the crawled pages aren’t flagged with the ‘Broken Pages’ and ‘Long Server Response Time’ issues.

This check should be done before and after you launch advertising campaigns.
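
If you want to automate this recurring check, the logic is simple enough to script. A minimal sketch in Python, with placeholder URLs and the 0.5-second standard from above:

```python
# A minimal pre/post-launch check: request each landing page and flag anything
# that isn't 200 OK or takes longer than 0.5 seconds to respond.
import requests

LANDING_PAGES = [
    "https://example.com/landing-1",  # placeholder URLs
    "https://example.com/landing-2",
]
MAX_RESPONSE_TIME = 0.5  # seconds

for url in LANDING_PAGES:
    try:
        # allow_redirects=False so a 301/302 shows up instead of its target
        resp = requests.get(url, timeout=10, allow_redirects=False)
    except requests.RequestException as err:
        print(f"{url}: FAILED ({err})")
        continue
    elapsed = resp.elapsed.total_seconds()
    ok = resp.status_code == 200 and elapsed <= MAX_RESPONSE_TIME
    print(f"{url}: {resp.status_code}, {elapsed:.2f}s -> {'OK' if ok else 'CHECK'}")
```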

4. Monitor Competitors’ Prices to Adjust Bids

Tracking competitors' prices will help you adjust your bids profitably. For example, when a competitor's price for a product is higher than yours, it makes sense to increase the bid for the landing page featuring that product.

To collect prices from websites:

  1. Go to the product page and highlight the price. Then open the source code and copy the class of the element.

    Source code
    If you’re confused at this stage, open our blog post: ‘Comprehensive Guide: How to Scrape Data from Online Stores With a Crawler.’
  2. In Netpeak Spider, go to the ‘Settings’ menu → ‘Scraping’ and paste the copied element into the field. Then set the search type to ‘CSS Selector’.

    How to crawl a website in Netpeak Spider
  3. In the parameters (in the sidebar), tick ‘Status Code’ and ‘Scraping’.

    Crawling settings in Netpeak Spider
  4. Start crawling.
  5. When the process comes to an end, take a look at the scraping report → you’ll find it in the ‘Database’ menu → ‘Scraping data’ → ‘Price’.

How to scrape data from a website using Netpeak Spider

Thus, you can scrape prices from competitors' sites, analyze and compare them, and revise your pricing policy based on the data.
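
As a side note, the same CSS-selector trick works in a script. A minimal sketch in Python; the product URL and the ‘.product-price’ class are hypothetical, so substitute the class you copied from the competitor's source code:

```python
# A minimal price-scraping sketch: select the price element by its CSS class
# on each competitor product page. Selector and URLs are placeholders.
import requests
from bs4 import BeautifulSoup

PRODUCT_PAGES = ["https://competitor.example/product-1"]  # placeholder URL
PRICE_SELECTOR = ".product-price"                          # hypothetical class

for url in PRODUCT_PAGES:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    node = soup.select_one(PRICE_SELECTOR)
    print(url, "->", node.get_text(strip=True) if node else "price not found")
```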

5. Collect Data for a Dynamic Remarketing Feed

From the Google Ads remarketing tag, you can pull the product catalog data and the elements needed to create a dynamic remarketing feed:

  • ID
  • Price
  • Item Title
  • Item Description
  • Final URL
  • Image URL

I elaborated on metadata scraping in use case #2, and that technique can also be applied to feed generation.

To scrape id and image_link, open the product page and find these elements in the source code. To collect the ids, find the dynamic parameters of the remarketing tag in the code and copy the script tag's XPath.

XPath of the script tag

On the ‘Scraping’ settings tab in Netpeak Spider, insert the XPath and select the ‘All source code’ data extraction type.

To scrape the image_link, also copy its XPath.

XPath of the image_link tag

In the settings, set ‘Data extraction’ → ‘Attribute’ with the attribute name ‘src’ (it contains the link to the image).

Crawl the website and transfer the scraped data into the table to continue working with it.

Data in the table for further work
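
To show the two extraction types side by side, here is a minimal Python sketch with lxml. The URL and both XPath expressions are assumptions for illustration — copy the real ones from your product page as described above:

```python
# 'All source code' vs 'Attribute' extraction, sketched with lxml XPath.
# The URL and both XPath expressions are hypothetical placeholders.
import re

import requests
from lxml import html

URL = "https://example.com/product-1"
SCRIPT_XPATH = "//script[contains(text(), 'dynx_itemid')]"   # remarketing tag
IMAGE_XPATH = "//img[@id='main-product-image']"              # product image

tree = html.fromstring(requests.get(URL, timeout=10).content)

# 'All source code' equivalent: grab the whole script tag, then parse the id
# out of it (assuming the tag assigns dynx_itemid: '...').
for script in tree.xpath(SCRIPT_XPATH):
    match = re.search(r"dynx_itemid\s*:\s*['\"]?([\w-]+)", script.text_content())
    print("id:", match.group(1) if match else "not found")

# 'Attribute' equivalent: read the src attribute, which holds the image link.
for img in tree.xpath(IMAGE_XPATH):
    print("image_link:", img.get("src"))
```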

If there is no remarketing tag on the website, you can collect data this way:

  1. Scrape prices the same way, but instead of the ID, use the item code (SKU) from the product card.
  2. To set up the Ads remarketing tag, you can use Google Tag Manager. For the item ID in the feed to match the dynx_itemid on the product pages, you need to pass that item code to the dynx_itemid parameter.

We have a detailed guide on collecting all the necessary data: ‘How to Create a Feed for Google Ads Dynamic Remarketing with Netpeak Spider.’

Recap

Collecting the necessary data with your bare hands is tough when launching advertising campaigns, so my advice is to automate the process. I’ve described how Netpeak Spider can help you along the way. Here’s what you can do:

  • Collect all website pages to pick landing pages.
  • Collect metadata to create ads quickly and without extra fuss.
  • Check the status codes and webpage loading speed.
  • Monitor competitors’ prices.
  • Collect data for generating product feeds.