In this article, we’ll take a closer look at the segmentation feature in Netpeak Spider. It lets you carve out a specific data segment from the full crawl results and work with it separately. This comes in handy when you need to analyze pages that share a common characteristic. So let’s find out more about this feature and how to use it.
Segmentation and filtering are available in the Netpeak Spider Standard plan, where you can also analyze 80+ SEO parameters, scrape websites, export various reports, save projects, and much more. If you are not familiar with our tools yet, you’ll be able to try all paid features immediately after signing up.
Check out the plans, subscribe to the one that suits you best, and get inspiring insights!
1. What Is Segmentation
First of all, let’s start with a definition. Segmentation means limiting a data set to the URLs that meet specified conditions. You can then view all reports based on this narrowed data set and find valuable insights into your own or competitors’ websites. If that sounds abstract, think of segments in Google Analytics.
For example, the ‘Converters’ segment shows analytics only for visitors who have completed a conversion.
Segmentation in Netpeak Spider works the same way: you can filter URLs by the criteria you need and draw insights from the results.
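Conceptually, a segment is just a filter (a predicate) applied to a table of crawl results. Here is a minimal Python sketch of the idea; the field names and data are hypothetical and don’t reflect Netpeak Spider’s actual internals:

```python
# Illustrative sketch: a segment is a predicate applied to crawl results.
# The field names below are hypothetical, not Netpeak Spider's data model.
crawl_results = [
    {"url": "https://example.com/", "status": "compliant", "click_depth": 0},
    {"url": "https://example.com/blog/post-1", "status": "compliant", "click_depth": 2},
    {"url": "https://example.com/tmp/old", "status": "non-compliant", "click_depth": 5},
]

def apply_segment(pages, condition):
    """Limit the data set to URLs that meet the specified condition."""
    return [page for page in pages if condition(page)]

# Segment: only non-compliant pages.
non_compliant = apply_segment(crawl_results, lambda p: p["status"] == "non-compliant")
print([p["url"] for p in non_compliant])
```

Every report you view afterwards is simply computed over the filtered list instead of the full one.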
2. How to Create a Segment
You can create segments in Netpeak Spider in three ways.
- The first is the ‘Set segment’ button: set the needed conditions, and the dashboard will show results that match them.
- The second is to set a filter on the ‘URL Explorer’ tab and then click the ‘Use as segment’ button.
- The third is to go to the ‘Reports’ tab in the sidebar, choose the needed row, click on it, and then click the ‘Use as segment’ button. From then on, you work only with the URLs that correspond to it.
Pro tip. You can save templates with the necessary issues and parameters and reuse them whenever you need.
3. Useful Examples of Segmentation
As mentioned above, segments come in handy when you need to analyze pages that share a characteristic. In this chapter, we’ll walk through several use cases for this feature.
3.1. Segmenting Pages by Their Status
Two extra ‘no’ letters accidentally added by a developer to the <meta name="robots"> tag (turning ‘index, follow’ into ‘noindex, nofollow’) can significantly hurt your traffic.
After crawling your website, go to the ‘Overview’ tab and segment pages by their status to start a deeper analysis. For example, you can select non-compliant pages, use them as a segment, and explore which types of pages can’t be indexed and why.
3.2. Segmenting Pages by Certain Click Depth
A shallow website structure is a significant aspect of internal linking. Ideally, each page should be within 2–3 clicks of the home page. There are two main reasons to keep the structure shallow: user experience and crawl budget. Also keep in mind that your most important pages, such as landing pages, should have the minimum click depth.
To analyze pages at a certain click depth, set a segment and find out their types and whether they are indexable.
In the example below, we can see that 89% of pages sit at a click depth of 5, and most of them are non-compliant.
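For context, click depth is the minimum number of clicks needed to reach a page from the home page, and crawlers typically compute it with a breadth-first traversal of the internal link graph. A toy sketch (the URLs and link graph here are made up for illustration):

```python
from collections import deque

# Hypothetical internal-link graph: page -> pages it links to.
links = {
    "/": ["/blog", "/products"],
    "/blog": ["/blog/post-1"],
    "/products": ["/products/item-1"],
    "/blog/post-1": [],
    "/products/item-1": ["/products/item-1/reviews"],
    "/products/item-1/reviews": [],
}

def click_depths(start="/"):
    """Breadth-first traversal: depth = minimum clicks from the home page."""
    depth = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depth:
                depth[target] = depth[page] + 1
                queue.append(target)
    return depth

depths = click_depths()
# Segment: pages deeper than 2 clicks from the home page.
deep_pages = [url for url, d in depths.items() if d > 2]
print(deep_pages)
```

Pages that show up in `deep_pages` are candidates for better internal linking.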
This article may also be useful: 8 tips on internal linking.
3.3. Segmenting Pages by the URL Part and Using Site Structure
This use of segments is helpful for analyzing e-commerce projects. For example, say you want to check the response time of product pages. Let’s set a segment for URLs containing the corresponding part.
Without segmentation, your selection would include not only product pages but also category pages, filter pages, and so on. With the segment applied, you can analyze response time for product pages only.
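A URL-part segment boils down to matching a substring or pattern against each URL. A small sketch, assuming product URLs contain a `/product/` path segment (the URLs and response times are invented):

```python
import re

# Hypothetical crawl results: (URL, response time in ms).
pages = [
    ("https://shop.example.com/product/red-shoes", 420),
    ("https://shop.example.com/category/shoes", 180),
    ("https://shop.example.com/product/blue-hat", 650),
    ("https://shop.example.com/category/shoes?color=red", 200),  # filter page
]

# Segment: URLs whose path contains the product part.
product_pattern = re.compile(r"/product/")
product_pages = [(url, ms) for url, ms in pages if product_pattern.search(url)]

# Response-time analysis now covers product pages only.
avg_ms = sum(ms for _, ms in product_pages) / len(product_pages)
print(len(product_pages), avg_ms)
```

Category and filter pages fall outside the segment, so they no longer skew the average.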
An even easier way to analyze a certain part of a website is the ‘Site Structure’ tab. This is an obvious but very helpful use of segmentation: if you need to analyze a certain category on your website, open the ‘Site Structure’ tab and use the needed category as a segment.
In the example below, we crawled a website and want to take a closer look at the blog. Just click on it and use it as a segment; the dashboard and results will then cover only this part of the website.
3.4. Segmenting Pages by Word Count
This example will help you find indexable pages with little content. You can read more about why thin content matters and how to fix it.
To find pages with potentially thin content among indexable pages, just set the following segment.
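The segment combines two conditions: the page is indexable AND its word count is below some cutoff. A sketch with hypothetical field names and an illustrative threshold:

```python
# Hypothetical crawl export: URL, indexability, and word count.
pages = [
    {"url": "/blog/guide", "indexable": True, "words": 1850},
    {"url": "/blog/stub", "indexable": True, "words": 120},
    {"url": "/tag/misc", "indexable": False, "words": 45},
]

THIN_CONTENT_THRESHOLD = 300  # illustrative cutoff; tune it for your site

# Segment: indexable pages with potentially thin content.
thin = [p["url"] for p in pages
        if p["indexable"] and p["words"] < THIN_CONTENT_THRESHOLD]
print(thin)  # ['/blog/stub']
```

Note that non-indexable pages are excluded even when they are short, since search engines won’t rank them anyway.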
3.5. Segmenting Pages by Content Type (HTML, Image, JS, CSS, PDF, etc.)
This is a little life hack for analyzing different types of content on your website. If you enable checking other content types in the ‘General’ tab before crawling, it’ll be easy to segment data by content type and check the corresponding issues.
In the example below, we use ‘Images’ as a segment, so the results cover only the images on our website, and we can easily analyze and export the needed info.
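Under the hood, a content-type segment groups URLs by their detected type. A simplified sketch that guesses the type from the file extension (real crawlers rely on the `Content-Type` response header; the URLs here are hypothetical):

```python
from collections import Counter
from urllib.parse import urlparse

# Hypothetical crawled URLs of mixed content types.
urls = [
    "https://example.com/index.html",
    "https://example.com/img/logo.png",
    "https://example.com/img/banner.jpg",
    "https://example.com/app.js",
    "https://example.com/style.css",
]

def content_type(url):
    """Rough type guess from the file extension (real crawlers use MIME types)."""
    path = urlparse(url).path
    ext = path.rsplit(".", 1)[-1].lower() if "." in path else "html"
    return {"png": "image", "jpg": "image", "js": "javascript",
            "css": "css", "html": "html"}.get(ext, "other")

by_type = Counter(content_type(u) for u in urls)

# Segment: images only.
images = [u for u in urls if content_type(u) == "image"]
print(by_type["image"])  # 2
```

Once the URLs are bucketed like this, each bucket can be inspected for its own class of issues (missing alt text for images, render-blocking scripts, and so on).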
Pro tip. After segmenting, check the ‘Site Structure’ tab and make sure everything is in the right place. That’s how we found out that some of our images were in the wrong folder: for example, a blog image was hosted on a subdomain where it wasn’t supposed to be.
3.6. Segmenting Pages with Excessive Amount of External Links
This segment helps you find pages with an excessive number of external links. Such pages can look suspicious and dilute the page’s PageRank. Just set a segment with the needed parameters.
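In essence, this is a threshold filter on the per-page count of outgoing external links. A sketch with invented data and an illustrative threshold:

```python
# Hypothetical crawl data: URL and count of outgoing external links.
pages = [
    {"url": "/resources", "external_links": 120},
    {"url": "/about", "external_links": 4},
    {"url": "/partners", "external_links": 85},
]

MAX_EXTERNAL_LINKS = 50  # illustrative threshold; pick one that fits your site

# Segment: pages exceeding the external-link threshold.
suspicious = [p["url"] for p in pages if p["external_links"] > MAX_EXTERNAL_LINKS]
print(suspicious)  # ['/resources', '/partners']
```

Pages in the resulting segment are worth a manual review: some (like a curated resources page) are legitimate, while others may indicate spam or hacked content.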
Segmentation is a powerful tool for analyzing huge amounts of data. With Netpeak Spider, you can not only crawl large websites and find issues but also segment the results according to your needs and uncover more insights about the website. You can set a segment using flexible options, and anything from the ‘Reports’ tab can be used as a segment.
We’ve described a few examples of segmentation to ease your work: how to segment pages by content type, status, click depth, word count, URL part, and number of outgoing links. How do you use this feature in your work? Share your examples in the comments below, and we’ll add them to the post :)