How to Create a Feed for Google Ads Dynamic Remarketing with Netpeak Spider
Use Cases
This post will be useful for PPC specialists and can help you solve the following cases.
Case #1: The website contains an Ads remarketing tag with dynamic parameters. You have a feed that can be downloaded via a link, but it isn't updated automatically, so it quickly becomes outdated.
Case #2: The website contains an Ads remarketing tag with dynamic parameters. The feed is updated automatically, but part of the data is incorrect or missing, so some feed items won't be approved. The incorrect data can be anything, for example, broken links to images or products.
Case #3: There is no feed and no Ads remarketing tag.
We'll find out how to solve these cases in your project, especially if you can't turn to a developer ;)
1. Case #1
In this case we need to create a feed from scratch. So let's go.
We need the following columns for our feed:
- ID
- Item title
- Item subtitle
- Final URL
- Image URL
- Item description
- Price
- Sale price

Keep in mind that not all of them are necessary for you. You can check out Google's specification here.
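Put together, such a feed is usually just a CSV with one row per product. Here's a minimal sketch with all values invented for illustration; check Google's specification for the exact formatting requirements (e.g. whether the price must include a currency code):

```
ID,Item title,Item subtitle,Final URL,Image URL,Item description,Price,Sale price
SKU-12345,Red Shoes,example.com,https://example.com/p/red-shoes,https://example.com/img/red-shoes.jpg,Comfy red shoes,59.99 USD,49.99 USD
```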
We'll get the data for our feed using the scraping feature in Netpeak Spider.
You can test scraping absolutely for free: the Netpeak Spider crawler has a free version with no time limit and no cap on the number of analyzed URLs. Other basic features are also available in the Freemium version of the program.
To get access to free Netpeak Spider, you just need to sign up, download, and launch the program 😉
Sign Up and Download Freemium Version of Netpeak Spider
P.S. Right after signup, you'll also get the opportunity to try all paid functionality, compare all our plans, and pick the one that suits you best.
1.1. ID and Price Scraping for Remarketing Feed
- Go to the product card and find the dynamic parameters of the remarketing tag in the page code (Ctrl+Shift+I).
- Right-click the <script> tag and copy its XPath. We'll need it for the scraping settings:
- Open the 'Scraping' tab in Netpeak Spider:
- name your extraction in field #2
- choose XPath in field #3
- paste your XPath from the second step in field #4
- choose 'Inner HTML content' in field #5

After scraping, we'll get the script code containing the ID and price for our feed from each webpage.

All that's left to do is to split this data into two columns. Here is an example of how you can do it with a Google Spreadsheets function:
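If you'd rather do the split outside of Spreadsheets, here's a sketch in Python. It assumes the scraped script sets Google's standard retail dynamic remarketing parameters ecomm_prodid and ecomm_totalvalue; your site's variable names may differ:

```python
import re

# Example Inner HTML scraped from the remarketing <script> tag
# (ecomm_prodid / ecomm_totalvalue are Google's standard retail
# dynamic remarketing parameters; adjust to your site's names).
scraped = """
var google_tag_params = {
    ecomm_prodid: 'SKU-12345',
    ecomm_pagetype: 'product',
    ecomm_totalvalue: 49.99
};
"""

def split_id_and_price(script_text):
    """Pull the item ID and price out of the scraped script code."""
    item_id = re.search(r"ecomm_prodid:\s*'([^']+)'", script_text)
    price = re.search(r"ecomm_totalvalue:\s*([\d.]+)", script_text)
    return (
        item_id.group(1) if item_id else None,
        float(price.group(1)) if price else None,
    )

print(split_id_and_price(scraped))  # ('SKU-12345', 49.99)
```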

Using scraping, we can get all the other columns (you can use not only XPath but also CSS selectors). Keep in mind that you must specify the XPath request correctly for the corresponding product page elements in the scraping settings.
Let's go further and see other examples of data extraction.
1.2. Image URL Scraping for Remarketing Feed
- Get XPath:
- Set 'Scraping' settings:
- #1 - XPath
- #2 - XPath request
- #3 - choose data extraction 'Attribute'
- #4 - attribute name 'src' (contains link to our image):
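The same 'Attribute' extraction can be sketched in Python. The markup below is a hypothetical product-page fragment (invented class names); real-world HTML usually needs an HTML-tolerant parser such as lxml.html, but the idea is identical:

```python
import xml.etree.ElementTree as ET

# A simplified, hypothetical product-page fragment.
html = """
<div class="product">
    <img class="product-photo" src="https://example.com/img/sku-12345.jpg"/>
    <span class="price">49.99</span>
</div>
"""

root = ET.fromstring(html)
# Same idea as Netpeak Spider's 'Attribute' extraction: an XPath
# request locates the element, then we read its 'src' attribute.
img = root.find(".//img[@class='product-photo']")
print(img.get("src"))  # https://example.com/img/sku-12345.jpg
```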

And here are the scraping results.

1.3. Final URL
Final URL is our URL field by default: you don't have to set anything for scraping, as the URL will already be in the sheet.
1.4. Item Title and Item Description
You can get the item title and item description using Netpeak Spider. Just choose 'Title' and 'Description' in the 'Parameters' and start crawling.

Of course, this will only work if the website contains correct meta tags. For Item subtitle, you can simply specify the domain address.
1.5. Sale Price
If there are discounted products on the website, you can set an additional rule to extract their prices (the Sale price parameter in the feed).
Here is an element we need for extraction:

And here are scraping settings:

This case has a nuance. At first, we copied the XPath as shown in the screenshot below.

And got these results:
This XPath failed to extract the discounted prices, so we rewrote it. The new, correct XPath was then specified in the scraping settings:
We face this situation quite often, which is why I recommend learning basic XPath syntax. Once you do, your scraping will become much easier ;)
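To give a hypothetical illustration of such a rewrite (class names invented): an absolute, index-based XPath copied straight from DevTools breaks whenever the page layout shifts, while a relative one anchored to a stable attribute keeps working:

```
# Brittle - copied straight from DevTools, tied to the exact layout:
/html/body/div[2]/div[1]/section/div[3]/span[2]

# Robust - anchored to a stable class on the element itself:
//span[contains(@class, 'sale-price')]
```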
That's all for the settings. After scraping, you'll get 2 sheets:
All results (here we need URL, Title, Description):

All scraping data:

Now your main task is to combine these two sheets into one. You can do that using the QUERY function in Google Spreadsheets.
We have also prepared a spreadsheet with an example where you can see how to do it.
Before working with the spreadsheet, make your own copy of it. The QUERY function is in the 'Feed Adw - step1' tab, in the cells under column E:

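If you'd rather do the merge outside Google Spreadsheets, the same join by URL can be sketched with Python's standard library (column names below are hypothetical):

```python
import csv
import io

# Sheet 1: crawl results (URL, Title, Description).
crawl_csv = """URL,Title,Description
https://example.com/p/1,Red Shoes,Comfy red shoes
https://example.com/p/2,Blue Hat,Warm blue hat
"""

# Sheet 2: scraping results (URL, ID, Price).
scrape_csv = """URL,ID,Price
https://example.com/p/1,SKU-1,49.99
https://example.com/p/2,SKU-2,19.99
"""

# Index the crawl results by URL, then join each scraping row to them.
crawl = {row["URL"]: row for row in csv.DictReader(io.StringIO(crawl_csv))}
feed = []
for row in csv.DictReader(io.StringIO(scrape_csv)):
    base = crawl.get(row["URL"], {})
    feed.append({
        "ID": row["ID"],
        "Item title": base.get("Title"),
        "Final URL": row["URL"],
        "Item description": base.get("Description"),
        "Price": row["Price"],
    })

print(feed[0]["Item title"])  # Red Shoes
```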
2. Case #2
This situation is similar to the first one. If you create a feed from scratch, you can do everything according to the manual above.
If you don't have to create a feed and only need to add some data (for example, a column that is absent in the source feed) or correct links in the Image URL column, it becomes easier.
Scrape the needed data from the list of URLs in the source feed and add the new data to it. You can use the QUERY function again to match the results from the two sheets.
3. Case #3
This case is different from the previous ones because the website doesn't contain a remarketing tag, so we can't get ID and Price data from dynamic parameters.
You can cope with it in two steps:
- Scrape all data according to the manual above, but use the product item code from the product card instead of the ID. Extract the price from the product card too.
- Set up the Ads remarketing tag via Google Tag Manager yourself. Use the product item code from the product card in the dynx_itemid variable. The main task here is to make the ID element in the feed match dynx_itemid on the product card.
The last case is the most complicated and I hope you will never face it. And it'll be better if you find a developer for it =)
Summary
Creating a feed for dynamic remarketing from scratch can become a real pain in the neck for PPC specialists. But you can make it less painful with scraping in Netpeak Spider, extracting the feed elements you need:
- ID
- Item title
- Item subtitle
- Final URL
- Image URL
- Item description
- Price
- Sale price
What would you recommend for the cases we've described? Let us know in the comments below =)