On-Page SEO: A Start-to-Finish Guide for SEO Newbies
Over the past few years, Google has released several updates that affected the way webmasters tailor their websites.
Google's ultimate goal is to bring users to websites that provide high-quality content and a good user experience. That's why you can't ignore on-page SEO if you want to be noticed by both Google and your target audience.
In this article, you'll find out what on-page SEO is, what important ranking factors it covers, and how to conduct basic on-page SEO audits.
Let’s dive right in!
1. On-Page SEO vs. Off-Page SEO
On-page SEO, or on-site SEO, is the practice of optimizing individual web pages to improve a website's search engine rankings, increase engagement, and earn relevant organic traffic. It deals with both the textual and graphic information a page consists of, as well as the quality of your HTML source code and many other technical aspects.
On the other hand, there is off-page SEO. So, what's the principal difference?
Off-page SEO focuses on backlinks (links pointing to your pages from other websites) and other external signals beyond your website, such as how often your website is shared on social networks, how often it's mentioned, and all other external marketing activities.
On-page SEO embraces the optimization of both the content and HTML source code of a page. Typical on-page SEO tasks include optimization of:
- HTML meta tags – title and description meta tags
- content – its structure, headers, target keywords
- visuals – their size and alt texts
- internal linking
- anchor texts
- secure HTTPS protocol
- page load speed
With on-page SEO, you control everything within the realm of your own website, while off-page SEO revolves around collaboration with external partners, such as reaching out to a partner for a link exchange to drive traffic to your web pages.
2. Why On-page SEO Matters
Even though off-page and on-page SEO are believed to work together to improve your website's search engine rankings in a complementary way, on-page SEO is the foundation of your entire SEO effort.
Since Google's cornerstone is making websites friendly, relevant, and helpful for users, its search robots analyze the content of web pages and assess whether the pages contain information relevant to what searchers are looking for (this is called search intent).
At this stage, websites that have been through on-page SEO are more likely to stand out, as the search robot gets the gist of what a web page is about and then identifies whether it matches the searcher's query. That's what makes on-page SEO so important for every website.
Search engines estimate the relevance of a web page to the searcher's query based on various factors. Keywords usually dominate the narrative. If your site has keyword-rich content (don't confuse it with keyword-stuffed content, which is a serious blunder), it has a better chance of appearing at the top of the search engine results pages (SERPs).
In simple words, with optimized web pages you can enjoy benefits such as high rankings and growing domain authority and trustworthiness. This, in turn, allows you to build reliable, advantageous relationships with other domain owners and work on off-page promotion. After all, nobody wants to link to a low-authority website with dull, poorly written pages, do they?
3. On-Page SEO Ranking Factors, and How to Shape Them Up
3.1. High-Quality Content
Google's been advocating for quality content for quite a while, introducing many core updates that changed the game on the Internet significantly. The latest update, which took place in May 2020, suggests stepping up efforts to create even more exceptional content.
It probably sounds like the same old same old to you, but let's delve into what quality content actually is. It's information that is easy to read, unique, helpful, and supported by any media it needs, such as audio, video, presentations, and so forth, that helps cover the subject more fully.
Relevance to search intent also plays a major role. Before creating any content, try to answer the question: 'Why should people visit my website?' Search crawlers take behavioral factors into account (how people behave when they land on your web page), such as dwell time, time on page, click depth, etc. So if you write compelling, quality content, use the words your audience uses, structure the text clearly, and cover the topic in depth, you've succeeded.
Ranking factors that affect the quality of your content are:
- Keywords. It's critical to target the right keywords for your content and put them in the right places. Search engines estimate how relevant your web pages are to a searcher's query based on keywords, so you can find ideal keywords by putting yourself in your visitors' shoes. What will they type into search engines when looking for a product or a solution? What intent do they have? Your content should be concise enough to provide the answers.
Remember to always include keywords in the title, description, and H1 heading. Search engines rely on the keywords in the title tag and meta description to identify what a page is about, so it’s essential to mention keywords early in the title.
Let me illustrate: if you want to see pictures of pandas eating bamboo, or read an article explaining why pandas eat bamboo or how these bears survive on a bamboo-only diet, you google something along those lines and see that most titles and descriptions contain the relevant keywords. More about keyword research: 'How to Do Keyword Research: A Go-To Guide for Beginners'.
- Language. Go for natural language and enrich your content with synonyms and close variants, which Google uses to better identify a page's relevance. Shakespearean sophistication is a good thing only if you know your audience shares your taste for it.
- Content length. Writing long content is no use at all if it doesn't provide value to readers. Make sure it's comprehensive and useful. The ideal content length is around 2,500 words.
3.2. URL, Title, and Meta Description
A URL (Uniform Resource Locator) is the location of a page on your website. It's visible to visitors, so they can rely on URLs to get an idea of what your page is about. The URL also provides informative signals that help search engines understand a page's content.
To optimize URLs, you can follow a few tips below:
- Use human-readable, descriptive URLs. The shorter and more descriptive URLs are, the easier it is for both visitors and search engines to grasp your page's content, and the better your chances to rank higher.
This is the kind of URL that gives you an overall summary of what the page is about: https://netpeaksoftware.com/blog/what-does-ctr-mean-in-seo
While this one is twisted up and tells you nothing: https://flowershop.com/bd/id=405050
- Include target keywords in URLs. Add at least one keyword to the URL, but refrain from keyword stuffing.
- Use non-case-sensitive URLs. By nature, URLs are case-sensitive. If the developers didn't configure the server (for example, via an .htaccess file) to automatically redirect uppercase URLs to lowercase, a user is likely to hit a 404 page.
- Don't use underscores to separate words in the URL. Use hyphens instead.
- Add mobile-friendly URLs to your Sitemap. This helps your mobile-friendly pages rank higher in mobile search results.
Another essential element of on-page SEO is the title tag. Title tags are placed in the <head> section of the HTML code on each page.
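For illustration, here's what a title tag might look like; the page topic and wording are hypothetical:

```html
<head>
  <!-- The title tag: shown as the clickable headline in search results -->
  <title>Why Do Pandas Eat Bamboo? A Beginner-Friendly Explanation</title>
</head>
```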
You can see the HTML code of any page by pressing the Ctrl+U shortcut in most desktop browsers.
The title should include keywords. Keywords make it easier for Google to classify the topic of the page, determine its content, and know when to serve it in search results. Make your title unique and descriptive, but keep in mind that Google may adjust it when displaying it in search results.
So to make your title tag effective:
- Use keywords.
- Watch the length – don’t make it too long since Google truncates long titles on the SERP.
- Use brand words – this increases brand awareness and gives a touch of authenticity.
- Avoid using ‘clutter’ words.
- Use brackets to give a brief recap of what's inside the article, for example '[Checklist]' or '[Case Study]'.
Google also checks the meta description. Like the title tag, the meta description is an HTML element located in the <head> section. It can be (though not always) used as the text for snippets (the small paragraph under the title on the SERPs). It should contain your keywords or keyword phrases; usually, the keywords that match the query are highlighted in bold.
Besides, the title tag and meta description are the first information searchers see about your page: they create the first impression and help users decide whether to click on your snippet.
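A meta description sits next to the title tag in the <head> section; this is a hypothetical example, not a prescribed wording:

```html
<head>
  <title>Why Do Pandas Eat Bamboo? A Beginner-Friendly Explanation</title>
  <!-- The meta description: often (though not always) used as the snippet text on the SERP -->
  <meta name="description"
        content="Find out why pandas eat bamboo and how these bears survive on a bamboo-only diet.">
</head>
```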
3.3. Headings
Don't forget to leverage headings. These elements also play a vital role in your site's rankings.
Think of it this way: you're writing a report or a book that needs a table of contents full of headings. Your pages should have distinct, descriptive headings that state precisely what each section is about. Headings show the hierarchical levels of information on your page; for example, an H3 is a sub-heading of an H2 and supports it.
One rule: use only one H1. The H1 is like a title that tells readers what they're going to see on your page. If you use more than one H1, you'll likely confuse your visitors and search robots (the latter even more than the former).
An H1 heading is NOT the same thing as the title tag. The title tag lives at the HTML code level and is used as part of snippets in the SERPs, while the H1 heading is the visible title of your article.
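The difference, sketched in minimal HTML (all page names and headings are made up):

```html
<head>
  <!-- Title tag: lives in the HTML head; used for SERP snippets -->
  <title>Why Do Pandas Eat Bamboo? | Panda Facts Blog</title>
</head>
<body>
  <!-- H1: the visible title of the article; use only one per page -->
  <h1>Why Do Pandas Eat Bamboo?</h1>
  <h2>A Bamboo-Only Diet</h2>
  <h3>How Much Bamboo Pandas Eat Per Day</h3>
</body>
```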
Steering back to keywords (you've probably gathered that they're the 'big boss' of SEO), use keywords in your H1 and H2 headings.
And finally, make sure you're using the right content format. Split the text into small paragraphs and keep sentences a reasonable length. Use bold, underline, or italics, bullet points, and lists in the right parts of your content.
3.4. Visual Content
Visual elements also add their bit to page rankings, since they enliven plain text with eye-stoppers and explain things that can't be explained with words alone. But heavy images and videos are the main enemies of page load speed. With a bit of optimization, though, you can ward off these nasties.
- Compress heavy images with tools like TinyPNG.
- Use widely supported formats: PNG or JPEG.
- Add alt text (alternative text) to your pictures. Alt text describes the content of a picture, which helps readers with visual impairments understand it. Sometimes things go wrong and images don't display correctly, so alt text can also 'tell' visitors what the image is about. With that in mind, think of search robots, which also can't see the content of an image; they read the alt text, which gives them a clue of what's depicted.
- Optimize your videos: craft an engaging title, description, and thumbnail for your video.
- Don't host videos on your website. Hosting video files directly on your website may slow down its load speed and do more harm than good. Instead, embed a YouTube video player on your website and show videos from your YouTube channel.
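Two of the tips above in markup form: an image with descriptive alt text, and an embedded YouTube player instead of a self-hosted video file (the file names and VIDEO_ID are placeholders):

```html
<!-- Compressed image with descriptive alt text for users and robots -->
<img src="/images/panda-eating-bamboo.jpg"
     alt="Giant panda eating bamboo in a forest">

<!-- Embed a YouTube player instead of hosting the video file yourself -->
<iframe width="560" height="315"
        src="https://www.youtube.com/embed/VIDEO_ID"
        title="Pandas and their bamboo diet"
        allowfullscreen></iframe>
```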
3.5. Internal Links
Internal links are links that point to other pages within your own website. Internal linking makes your website more crawlable for robots: when they find a page, start crawling it, and come across a link to another page on your website, they follow it. This passes link equity (ranking weight) to that page and creates a web-like structure that is crawlable and accessible.
Remember to cross-link all your pages: if many links lead to one page but no links go from that page to others, it creates a dead end and stops robots from traveling further through the website.
Speaking of humans, internal links provide your visitors with extra information and keep them exploring your website. This increases the number of page visits and reduces the bounce rate.
3.6. Anchor Text
Another thing to pay attention to is anchor text: the visible, clickable text of a hyperlink, often blue and underlined. Make sure your anchor text is descriptive and relevant to the link you insert. If the anchor text 'teddy bears' links to a page with polka-dotted dresses, it's pretty confusing and irrelevant.
Keywords in anchor text are also good practice, but don't go overboard with them, since that's considered spammy.
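Descriptive vs. confusing anchor text in markup (the URLs and wording are hypothetical):

```html
<!-- Good: anchor text describes the destination page -->
<p>Read more in our guide to <a href="/blog/teddy-bear-history">the history of teddy bears</a>.</p>

<!-- Bad: anchor text is unrelated to where the link leads -->
<p><a href="/shop/polka-dot-dresses">teddy bears</a></p>
```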
3.7. Page Load Speed
Page load speed is like a metaphorical pillar that holds your website’s rankings.
Did you know that 40% of users abandon a website that takes more than 3 seconds to load? You don't want that to happen to your site, so make sure to optimize the loading speed of your pages.
Here are a few tips to speed up your website (spoiler: most of them relate to technical SEO):
- Optimize image sizes. It will reduce the size of the transmitted data by up to 90%.
- Use website caching. With caching, many users can access your web pages at once without waiting for them to be rendered over and over again.
- Reduce redirects. Redirects devour the page load time because each new redirect means a new HTTP request-response cycle.
- Optimize your code. Minify CSS, JavaScript, and HTML.
- Optimize server response time. Server response time is the time between the client's request and the server's response to that request. The shorter the server response time, the faster the page loads. Robots can also leave a website if they have to wait too long; bear in mind that they have a limited crawl budget.
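A couple of related front-end tweaks in the same spirit can be applied right in your markup; this is a sketch assuming modern browser support, with placeholder file names:

```html
<!-- Defer non-critical JavaScript so it doesn't block page rendering -->
<script src="/js/app.js" defer></script>

<!-- Lazy-load below-the-fold images to cut the initial transfer size -->
<img src="/images/gallery-photo.jpg"
     alt="Product gallery photo"
     loading="lazy">
```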
4. The Factors Damaging Your SEO Success
Along with do’s, you should get familiar with the don'ts. Avoid using:
- Duplicate content. It's 'shared' or entirely identical content within one or several domains. Content scraped from other web platforms can cause:
- Indexing issues – Google's robots have a limited crawl budget and will fail to index all the identical content on your website.
- Issues with getting priority pages to show up in organic search.
- Diluting internal and external PageRank (the significance or 'popularity' of a page within a certain site).
Still, there are many legitimate reasons why duplicate content may occur on a website. The remedy is the rel=canonical tag, which literally tells robots, 'dude, don't index this page, go to that one, it's the original.'
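The canonical tag goes into the <head> of the duplicate page and points to the original; the URL here is hypothetical:

```html
<head>
  <!-- Tells robots: the original lives at this URL; consolidate signals there -->
  <link rel="canonical" href="https://flowershop.com/roses">
</head>
```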
I also wholeheartedly want you to steer clear of:
- Keyword stuffing. The practice of inserting as many keywords as possible can turn out to be detrimental to your domain rating. It's considered rank manipulation and puts a website at risk of being penalized by Google's Penguin algorithm.
- Thin content. Short, low-value pieces of content inflated with keywords. Thin content is a manipulative practice, and the Panda algorithm was created intentionally to punish those who disfigure the web.
- Automatically generated content. Content that wasn't composed by humans and, as a result, makes no sense to humans. Users will cringe and bounce at the sight of it, and search robots will penalize it.
- Cloaking. A nasty trick that serves search engine robots content different from what's placed on the page. This practice is part of black hat SEO and can lead to terrible consequences once crawlers recognize the substitution.
5. How to Do On-Page SEO Audit with a Crawler
If you made it to the end of this post and now feel like you have even more questions than at the beginning, I'll try to answer at least one: 'How do I find all these things on my website?'
There are many SEO tools on the market that you can use to conduct at least basic website audits. But there are also more advanced tools that are used for a broader range of SEO purposes. Netpeak Spider perfectly fits both demands.
You can carry out a comprehensive website analysis even in the free version of the Netpeak Spider crawler, which is not limited by term of use or the number of analyzed URLs. Other basic features are also available in the Freemium version of the program.
To get access to free Netpeak Spider, you just need to sign up, download, and launch the program 😉
P.S. Right after signup, you'll also have the opportunity to try all the paid functionality and then compare our plans and pick the most suitable one for you.
I’ll quickly show you how to conduct a basic on-page audit:
- Launch the program.
- Enter the website address into the ‘Initial’ URL field. I’ll inspect www.sephora.com website.
- Go to the sidebar and select parameters that are enough for a basic audit:
- General → ‘Status Code’, ‘Content Type’, ‘Issues’, ‘Response Time’, ‘Content Download Time’.
- Crawling and Indexing → ‘Compliance’, ‘Allowed in Robots.txt’, ‘Canonical URL’.
- Head Tags → ‘Title’, ‘Title Length’, ‘Description’, ‘Description Length’.
- Content → ‘Images’, ‘Content-Length’.
- H1-H2 Headings → ‘H1 Content’, ‘H1 Length’, ‘H1 headings’.
Here's how it looks:
- Start crawling by hitting the 'Start' button.
- When crawling is completed, you'll see the results in the main table, with an issues overview in the sidebar.
- Go to the 'Export' tab in the upper-right corner to export table results, or export the SEO audit in PDF format (available from the main window).
Here's how some parts of the PDF audit look:
On-page SEO plays a pivotal role when it comes to optimizing your website’s rankings. If you don’t take it seriously, all your off-page effort might not pay off as you expected. I hope that with these practical rules, you can leverage on-page SEO and make your site rank higher.
Apart from content optimization, on-page SEO includes many other technical aspects, which require the help of a webmaster or developer.
An on-page audit can be easily conducted with the Netpeak Software SEO tool. It literally takes a few minutes to analyze a whole website, and in the end, you receive a comprehensive report on the detected issues.