How to Create a Feed for Google Ads Dynamic Remarketing with Netpeak Spider

Use Cases

This post will be useful for PPC specialists and will help you handle the following cases.

Case #1: The website contains an Ads remarketing tag with dynamic parameters. You have a feed that can be downloaded via a link, but it is not updated automatically, so it quickly becomes outdated.

Case #2: The website contains an Ads remarketing tag with dynamic parameters. The feed is updated automatically, but part of the data is incorrect or missing, so some feed items won't be approved. The incorrect data can be anything, for example, broken links to images or product pages.

Case #3: There is neither a feed nor an Ads remarketing tag on the website.

Let's find out how to handle each of these cases in your project, especially if you can't turn to a developer ;)

1. Case #1

In this case we need to create a feed from scratch. So let's go.

We need the following columns for our feed:

  • ID
  • Item title
  • Item subtitle
  • Final URL
  • Image URL
  • Item description
  • Price
  • Sale price
Elements for Ads dynamic remarketing feed

Keep in mind that not all of them are required in your case. You can check Google's feed specification for the full list of required and optional attributes.
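To make the target format concrete, here is a made-up two-row example of what the finished feed might look like (all values are invented for illustration):

```csv
ID,Item title,Item subtitle,Final URL,Image URL,Item description,Price,Sale price
SKU-4005,Leather backpack,example.com,https://example.com/backpack,https://example.com/img/4005.jpg,Handmade leather backpack,149.99 USD,89.00 USD
SKU-4006,Canvas tote bag,example.com,https://example.com/tote,https://example.com/img/4006.jpg,Everyday canvas tote,54.00 USD,
```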

We'll collect the data for our feed using the scraping feature in Netpeak Spider.

1.1. ID and Price Scraping for Remarketing Feed

  1. Open a product card and find the dynamic parameters of the remarketing tag in the page code (Ctrl+Shift+I).

    Dynamic parameters for remarketing tag
  2. Right-click the <script> tag and copy its XPath. We'll need it for the scraping settings:

    Copy XPath
  3. Open the 'Scraping' tab in Netpeak Spider:
    • name your extraction in field #2
    • choose XPath in field #3
    • paste the XPath from the second step into field #4
    • choose 'Inner HTML content' in field #5

    Scraping settings for ID and price

After scraping, we'll get the script code containing the ID and price for our feed from each page.

Script code with ID data and price

All that's left is to split this data into two columns. Here is an example of how you can do it with a Google Sheets function:

Google Spreadsheets function to split data
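If you prefer to clean the export outside of Google Sheets, the same split can be done with a short Python script. This is a minimal sketch: it assumes the scraped cell contains a retail remarketing script with the standard dynx_itemid and dynx_totalvalue parameters; your tag may use different variable names, so adjust the patterns accordingly.

```python
import re

# A scraped cell: the inner HTML of the remarketing <script> tag.
# The structure below is illustrative; check your actual tag.
scraped = """
var google_tag_params = {
    dynx_itemid: 'SKU-4005',
    dynx_totalvalue: 149.99,
    dynx_pagetype: 'offerdetail'
};
"""

def split_id_and_price(script_text):
    """Extract (item_id, price) from the remarketing script text."""
    item_id = re.search(r"dynx_itemid:\s*'([^']+)'", script_text)
    price = re.search(r"dynx_totalvalue:\s*([\d.]+)", script_text)
    return (
        item_id.group(1) if item_id else None,
        float(price.group(1)) if price else None,
    )

print(split_id_and_price(scraped))  # ('SKU-4005', 149.99)
```

Run it over every scraped cell to get the two feed columns in one pass.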

Scraping can get us all the other columns as well (you can use CSS selectors in addition to XPath). Keep in mind that the XPath query in the scraping settings must correctly point to the corresponding product page elements.

Let's go further and see other examples of data extraction.

1.2. Image URL Scraping for Remarketing Feed

  1. Get XPath:
    Getting image XPath
  2. Set up the 'Scraping' settings:
    • #1 - XPath
    • #2 - the XPath query
    • #3 - choose the 'Attribute' data extraction type
    • #4 - the attribute name 'src' (it contains the link to our image):
    Scraping settings to get image

    And here are the scraping results.

    Scraping results for images
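To see what the 'Attribute' extraction type does under the hood, here is a minimal Python sketch using the standard library. The HTML fragment and class name are invented for illustration; note that ElementTree supports only a subset of XPath, while Netpeak Spider supports it in full.

```python
import xml.etree.ElementTree as ET

# A made-up fragment of a product card; real markup will differ.
html = """
<div class="product-image">
    <img src="https://example.com/images/sku-4005.jpg" alt="Item" />
</div>
"""

root = ET.fromstring(html)

# Equivalent of the scraping settings: locate the element with an
# XPath query, then read its 'src' attribute.
img = root.find(".//img")
image_url = img.get("src")
print(image_url)  # https://example.com/images/sku-4005.jpg
```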

1.3. Final URL

The Final URL is the URL field by default. You don't have to set up anything for scraping: the URL will already be in the results sheet.

1.4. Item Title and Item Description

You can get the item title and item description using Netpeak Spider. Just tick 'Title' and 'Description' in the 'Parameters' tab and start crawling.

Title and description extraction

Of course, this works only if the website contains correct meta tags. In the Item subtitle field, you can specify the domain name.

1.5. Sale Price

If there are discounted products on the website, you can set an additional rule to extract their discounted prices (the Sale price attribute in the feed).

Here is an element we need for extraction:

XPath for sale prices

And here are scraping settings:

Settings in Netpeak Spider to scrape sale price

There is a nuance here. At first, we copied the XPath as shown on the screenshot below.

Copy XPath

And got this result:

//*[@id="product-price-4005"]/span[1]

We couldn't scrape the discounted prices with this XPath, so we rewrote it. The new, correct XPath was specified in the scraping settings:

//p[@class="special-price"]/span[@class="price"]

We face situations like this quite often, which is why I recommend learning basic XPath syntax. Once you do, scraping will become much easier ;)
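To see why the class-based XPath works across all product cards while the id-based one doesn't, here is a small Python sketch. The HTML is invented; the point is that the product id changes on every page, while the class names stay the same (ElementTree covers only an XPath subset, but it supports attribute predicates like the ones used here).

```python
import xml.etree.ElementTree as ET

# Two invented product cards: the id differs per product,
# while the class names are identical on every card.
page = """
<div>
  <p class="special-price" id="product-price-4005">
    <span class="price">$89.00</span>
  </p>
  <p class="special-price" id="product-price-4006">
    <span class="price">$54.00</span>
  </p>
</div>
"""

root = ET.fromstring(page)

# The id-based XPath only ever matches one specific product:
by_id = root.findall(".//p[@id='product-price-4005']/span")
# The class-based XPath matches the sale price on any card:
by_class = root.findall(".//p[@class='special-price']/span[@class='price']")

print([s.text for s in by_id])     # ['$89.00']
print([s.text for s in by_class])  # ['$89.00', '$54.00']
```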

That's all for the settings. After scraping, you'll get two tables:

All results (here we need URL, Title, Description):

All results table

All scraping data:

All scraping data table

Now your main task is to merge these two tables into one. You can do that using the QUERY function in Google Sheets.

We have also prepared a spreadsheet with an example showing how to do it.

Before working with the spreadsheet, make a copy of it. The QUERY function is in the 'Feed Adw - step1' tab, in the cells under column E:

QUERY function
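If you would rather merge the two tables in code than in Google Sheets, the join on the URL column can be sketched in a few lines of Python. The column names below are illustrative; match them to your actual exports.

```python
# Invented rows standing in for the two Netpeak Spider exports.
crawl_results = [
    {"URL": "https://example.com/item-1", "Title": "Item 1", "Description": "First item"},
    {"URL": "https://example.com/item-2", "Title": "Item 2", "Description": "Second item"},
]
scraping_results = [
    {"URL": "https://example.com/item-1", "ID": "SKU-1", "Price": "99.00"},
    {"URL": "https://example.com/item-2", "ID": "SKU-2", "Price": "54.00"},
]

# Index the scraping data by URL, then join it to the crawl data,
# just as the QUERY function matches rows between the two sheets.
by_url = {row["URL"]: row for row in scraping_results}
feed = [
    {**crawl, **by_url.get(crawl["URL"], {})}
    for crawl in crawl_results
]

print(feed[0]["ID"], feed[0]["Title"])  # SKU-1 Item 1
```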

2. Case #2

This situation is similar to the first one. If you decide to create the feed from scratch, simply follow the manual above.

If you don't have to create the feed and only need to add some data (for example, a column absent from the source feed) or fix the links in the Image URL column, the task becomes even easier.

Scrape the needed data from the list of URLs in the source feed and add it to the feed. You can use the QUERY function again to match the results from the two sheets.

3. Case #3

This case differs from the previous ones because the website doesn't contain a remarketing tag, so we can't get the ID and Price data from dynamic parameters.

You can cope with it in two steps:

  1. Scrape all the data according to the manual above, but use the product code (SKU) from the product card instead of the ID. Extract the price from the product card as well.
  2. Set up the Ads remarketing tag via Google Tag Manager yourself. Use the product code from the product card as the dynx_itemid variable. The main task here is to make the ID element in the feed match dynx_itemid on the product card.
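To check that the IDs in the feed really match the dynx_itemid values on the product cards, you can compare the two sets of identifiers. A minimal sketch with invented data; in practice you would load the feed's ID column and the scraped tag values from your exports:

```python
# Invented identifiers: IDs from the feed vs. dynx_itemid values
# scraped from the product cards.
feed_ids = {"SKU-1", "SKU-2", "SKU-3"}
page_ids = {"SKU-1", "SKU-2", "SKU-4"}

# Feed items with no matching tag on the site
# (these items won't trigger dynamic ads):
missing_on_site = feed_ids - page_ids
# Tagged products that are absent from the feed:
missing_in_feed = page_ids - feed_ids

print(sorted(missing_on_site))  # ['SKU-3']
print(sorted(missing_in_feed))  # ['SKU-4']
```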

The last case is the most complicated, and I hope you'll never face it. If you do, it's better to find a developer for it =)

Summary

Creating a feed for dynamic remarketing from scratch can become a real pain in the neck for PPC specialists. But you can make the process less painful by using scraping in Netpeak Spider to extract the necessary feed elements:

  • ID
  • Item title
  • Item subtitle
  • Final URL
  • Image URL
  • Item description
  • Price
  • Sale price

What would you recommend in the cases we've described? Let us know in the comments below =)
