How to Use Segmentation in Netpeak Spider [with Useful Examples to Ease Your Work]


In this article, we’ll take a closer look at the segmentation feature in Netpeak Spider. It lets you carve out a specific segment of data to work with from the full crawl results, which comes in handy when you need to analyze pages that share a common characteristic. Let’s find out how this feature works and how to use it.

1. What Is Segmentation

First of all, let’s start with a definition. Segmentation means limiting a data set to URLs that meet specified conditions. You can then view all reports based on this data set and find valuable insights into your own or your competitors’ websites. If that sounds abstract, think of segments in Google Analytics.

For example, the ‘Converters’ segment shows analytics only for visitors who have completed a conversion.

Google Analytics as an example of segmentation

Segmentation in Netpeak Spider works the same way: you can filter URLs by the criteria you need and draw conclusions from the result.

2. How to Create a Segment

You can create segments in Netpeak Spider in three ways.

How to create segments in Netpeak Spider

  1. The first one is the ‘Set segment’ button. Set the conditions you need, and the dashboard will show the results that match them.

    Setting segment

  2. You can set a filter on the ‘URL Explorer’ tab and then click the ‘Use as segment’ button. It’s quite similar to the third way.
  3. The third way is to go to the ‘Reports’ tab in the sidebar. Choose the needed row, click it, and then click the ‘Use as segment’ button. Now you’ll work only with the URLs that correspond to it.

Pro tip. You can save templates with the necessary issues and parameters and reuse them whenever you need to.

3. Useful Examples of Segmentation

As we mentioned above, segments come in handy when you need to analyze pages that share a common characteristic. In this chapter, we’ll share a few use cases for this feature.

3.1. Segmenting Pages by Their Status

Two extra ‘no’ letters accidentally added by a developer to the <meta name="robots"> tag (turning ‘index, follow’ into ‘noindex, nofollow’) can significantly hurt your traffic.
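For illustration, here’s a minimal Python sketch (not part of Netpeak Spider) of how a ‘noindex’ directive in the robots meta tag makes a page non-indexable; the sample HTML and function names are hypothetical:

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the directives from any <meta name="robots"> tag."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "robots":
            content = attrs.get("content") or ""
            self.directives += [d.strip().lower() for d in content.split(",")]

def is_indexable(html):
    parser = RobotsMetaParser()
    parser.feed(html)
    return "noindex" not in parser.directives

# A page where a developer slipped in the extra "no" letters:
broken = '<html><head><meta name="robots" content="noindex, nofollow"></head></html>'
print(is_indexable(broken))  # → False
```

Real crawlers also honor the `X-Robots-Tag` HTTP header, which a sketch like this ignores.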

After crawling your website, go to the ‘Overview’ tab and segment pages by their status to start a deeper analysis. For example, you can choose non-compliant pages, use them as a segment, and explore which types of pages can’t be indexed and why.

Segmenting non-compliant URLs

3.2. Segmenting Pages by Certain Click Depth

A shallow website structure is an important aspect of internal linking: ideally, each page should be reachable within 2–3 clicks from the homepage. There are a few reasons to keep it shallow, chiefly user experience and crawl budget. Also keep in mind that your important pages, such as landing pages, should have the minimum click depth.

To analyze pages with a certain click depth, you can set a segment and find out what types of pages they are and whether they are indexable.

In the example below, we can see that 89% of pages have a click depth of 5, and most of them are non-compliant.

Segmenting pages by click depth
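Conceptually, click depth is the length of the shortest internal-link path from the homepage. A minimal sketch using breadth-first search over a link graph; the URLs and graph below are made up for illustration:

```python
from collections import deque

def click_depths(links, home):
    """BFS from the homepage: click depth is the length of the
    shortest internal-link path from the homepage to each page."""
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

# Hypothetical link graph: page -> pages it links to
links = {
    "/": ["/blog", "/products"],
    "/blog": ["/blog/post-1"],
    "/products": ["/products/item-1"],
    "/blog/post-1": ["/products/item-1"],
}
print(click_depths(links, "/"))
# → {'/': 0, '/blog': 1, '/products': 1, '/blog/post-1': 2, '/products/item-1': 2}
```

Pages missing from the result are unreachable from the homepage by internal links, which is itself worth investigating.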

This article may also be useful for you: 8 tips on internal linking.

3.3. Segmenting Pages by the URL Part and Using Site Structure

This use of segments can be helpful when analyzing an e-commerce project. For example, say you want to check the response time of product cards. Let’s set a segment for URLs containing the corresponding part.

Sorting pages by the part of URL

Without segmentation, your selection would include not only product pages but also category pages, filter pages, and so on. Now you can analyze response time for product cards only.
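Under the hood, this kind of segment is just a substring or regex match on the URL. A small Python sketch with made-up crawl data (the URLs and response times are hypothetical):

```python
import re

# Hypothetical crawl results: (URL, response time in ms)
pages = [
    ("https://example.com/product/red-shoes", 320),
    ("https://example.com/category/shoes", 150),
    ("https://example.com/product/blue-hat", 410),
    ("https://example.com/product/blue-hat?filter=size", 380),
]

# Keep only product cards: URLs whose path contains "/product/"
product_pages = [(url, t) for url, t in pages if re.search(r"/product/", url)]

avg = sum(t for _, t in product_pages) / len(product_pages)
print(f"Average response time of product cards: {avg:.0f} ms")  # → 370 ms
```

Note the query-string variant is still matched; depending on your URL scheme you may want a stricter pattern that excludes parameters.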

To make analyzing a certain part of a website even easier, you can use the ‘Site Structure’ tab. This is a fairly obvious but very helpful way to use segmentation: if you need to analyze a certain category on your website, open the ‘Site Structure’ tab and use the needed category as a segment.

In the example below, we crawled a website and want to take a closer look at the blog. We can simply click on it and use it as a segment, so the dashboard and results now cover only this part of the website.

Segmenting pages using site structure

3.4. Segmenting Pages by Word Count

This example will help you find indexable pages with little content. Thin content can keep pages from ranking well, so it’s worth finding and fixing.

So to find pages with potentially thin content among the indexable pages, you just need to set the following segment.

Sorting pages by word count
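The idea behind this segment can be sketched in a few lines: count the words on each page and flag those below a threshold. The pages, texts, and the 300-word cut-off below are assumptions for illustration:

```python
import re

# Hypothetical crawl results: URL -> visible page text
pages = {
    "/blog/long-read": "word " * 1200,
    "/blog/stub": "just a short placeholder page",
    "/product/red-shoes": "word " * 80,
}

THIN_THRESHOLD = 300  # assumed cut-off; tune it for your site and niche

def word_count(text):
    return len(re.findall(r"\w+", text))

thin = [url for url, text in pages.items() if word_count(text) < THIN_THRESHOLD]
print(thin)  # → ['/blog/stub', '/product/red-shoes']
```

In practice you’d combine this with the indexability check, since thin content only matters for pages that can actually rank.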

3.5. Segmenting Pages by Content Type (HTML, Image, JS, CSS, PDF, etc.)

This is a little life hack for analyzing different types of content on your website. If you enable checking other content types on the ‘General’ tab before crawling, it’ll be easy to segment data by content type and check the corresponding issues.

In the example below, we use ‘Images’ as a segment: now the results cover only the images on our website, and we can easily analyze and export the needed info.
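Grouping crawl results by content type boils down to inspecting each URL’s file extension (or, more reliably, the server’s Content-Type header). A small sketch with hypothetical URLs, using the extension as a rough proxy:

```python
from collections import defaultdict
from urllib.parse import urlparse
import posixpath

# Hypothetical crawled URLs
urls = [
    "https://example.com/index.html",
    "https://example.com/img/logo.png",
    "https://example.com/img/banner.jpg",
    "https://example.com/static/app.js",
    "https://example.com/static/style.css",
]

by_type = defaultdict(list)
for url in urls:
    # Extension of the URL path; extensionless paths default to "html"
    ext = posixpath.splitext(urlparse(url).path)[1].lstrip(".") or "html"
    by_type[ext].append(url)

print(sorted(by_type))  # → ['css', 'html', 'jpg', 'js', 'png']
```

A crawler like Netpeak Spider uses the actual Content-Type response header rather than the extension, which handles extensionless image or script URLs correctly.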

Pro tip. After segmenting, check the ‘Site Structure’ tab and make sure everything is in the right place. That’s how we found out that some of our images were in the wrong folder: for example, a blog image was placed on a subdomain where it wasn’t supposed to be.

3.6. Segmenting Pages with Excessive Amount of External Links

Using this segment, you can find pages with an excessive number of external links. Such pages can look suspicious and dilute the page’s PageRank. Just set a segment with the needed parameters.

Sorting pages with a lot of external links
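The check itself is simple: count outgoing links whose host differs from the site’s own and flag pages above a threshold. The host name, threshold, and link data below are assumptions for illustration:

```python
from urllib.parse import urlparse

SITE_HOST = "example.com"  # assumed host of the crawled site
MAX_EXTERNAL = 10          # assumed threshold; tune it to your linking policy

# Hypothetical crawl data: page -> outgoing link URLs
outgoing = {
    "/partners": [f"https://partner{i}.com/" for i in range(25)],
    "/about": ["https://example.com/team", "https://twitter.com/example"],
}

def external_count(links):
    """Number of links pointing outside the site's own host."""
    return sum(1 for url in links if urlparse(url).hostname != SITE_HOST)

flagged = [page for page, links in outgoing.items()
           if external_count(links) > MAX_EXTERNAL]
print(flagged)  # → ['/partners']
```

A production version would also treat subdomains (e.g. `blog.example.com`) as internal, which this sketch deliberately skips.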

Summing up

Segmentation can become a powerful tool for analyzing huge amounts of data. Using Netpeak Spider, you can not only crawl large websites and find issues but also segment the results according to your needs and uncover more insights about the website. You can set a segment using flexible options, and you can use anything from the ‘Reports’ tab as a segment.

We’ve described a few examples of segmentation to ease your work: how to sort pages by content type, status, click depth, word count, URL part, and number of outgoing links. How do you use this feature in your work? Share your examples in the comments below and we’ll add them to the post :)
