82 Resources to Learn SEO Basics from Scratch
If you’re new to SEO and website audits, start here. Everyone was a beginner once, so we decided to collect fundamental terms and links to reliable resources, organized by subject relevance. Some of the blog posts contain more hands-on information about SEO audits that will help you level up.
Follow the jump links to switch to the desired topic.
What Is SEO?
How Search Engines Work
How to Do SEO Audit
Competitive SEO Analysis
How to Measure and Track Results
SEO Community Resources
1. What Is SEO?
Search Engine Optimization (SEO) is a set of practices applied to increase the quantity and quality of traffic to a website. Plainly speaking, it’s all about attracting the right audience by reaching the highest possible position in the organic search engine result pages (SERPs).
Though at first glance SEO may seem like a bunch of dull technical routines that need to be repeated step by step, it actually requires strong analytical and creative skills. Besides a good website structure, a solid technical foundation, and proper keywords in the proper places, you should also think about relevant content that meets your visitors' needs, and about links from other websites that serve as proof of your project's trustworthiness. So it’s a never-ending process.
These days, SEO is more vital than ever, since many businesses are moving online, where there are more opportunities and a broader audience.
On the one hand, there is a user who has an urge or a question, and they type it into a search bar, let’s say, 'how to make a smoothie.' As soon as they hit 'Search,' a list of results appears; they click on the one that seems the best fit and jump to the website to find the answer to their question (in the best-case scenario).
On the flip side, there's backstage magic that makes all these things happen. The main point is to decipher the user’s intent: understand how they behave online, what words they use, and what results they expect to receive.
1. An SEO ‘bible’ from Google
2. Beginner’s guide to SEO from Moz
2. How Search Engines Work
2.1. Crawling, Indexing, Ranking
The web is often compared to an enormous library that stores billions of books. To make sense of this library, search engines process web pages in three stages:
- Crawling. A discovery stage. Google sends out an army of crawlers (in no-fluff terms, software that uses special algorithms to crawl the web), whose mission is to find web pages and follow the links on those pages in order to collect information and bring it back to the search engine's databases.
- Indexing. The process of sorting out and organizing results to furnish them to the users. Once everything is organized, the pages can take part in ranking.
- Ranking. Search engine algorithms estimate stored web pages according to their relevance and serve them up in the search result pages from the most relevant to the least.
3. A visualized theory on YouTube by Matt Cutts. Easily digested 👌
2.2. Is Everything Going to Be Crawled? Crawl Budget Explained
Crawl budget stands for the limited number of URLs a search robot can and wants to crawl during one visit before leaving your website.
Crawl budget optimization mostly matters for large websites, which should make sure it isn't wasted on unimportant pages.
2.3. Ranking Factors
Google's ranking systems are made up of different algorithms that come into play when a query is typed into a search bar. Google keeps up to speed with an extremely volatile world and continually changes its algorithms to improve search.
However, some ranking factors remain significant despite all the changes.
2.4. SERP Features
SERP stands for search engine result pages. On the SERP, you can see organic and paid (ads) results:
In most cases, an SEO's sweet dream is not the first position on the SERP but the 'zero position,' called the 'featured snippet.'
If your web page provides an answer in the form of 'how-to' or 'what is' tutorial and has a structured data markup (more on this later), it is likely to appear on that extremely clickable 'zero’ position.
2. A detailed guide to all SERP features
2.5. How People Search and How to Understand Their Intent
A standard search process looks like this: a user types a query, scans the results, and clicks the link that seems the best fit.
An SEO’s job is to understand the intent behind a search query. Why is the user searching? What do they want to achieve through their search? So 'search intent' is the goal a user wants to accomplish by using a particular keyword. Understanding search intent isn't easy, but these days it's hugely important.
1. How to identify search intent through SERP analysis
2. How to best optimize for user search intent
2.6. Google Algorithms
During the last decade, Google rolled out several algorithm updates that shook up the web immensely and changed the rules of the game forever. Google algorithms were created to ‘purge’ the internet of spammy and low-quality content and to combat thriving Black Hat SEO practices (some of which are still commonly used).
These are the key algorithms and the web evils they fought against:
- Panda – affected duplicate, plagiarized or thin content, user-generated spam, and keyword stuffing.
- Penguin – affected spammy or irrelevant links, and links with over-optimized anchor text.
- Hummingbird – targeted keyword stuffing and low-quality content.
- Mobile – targeted the mobile version of the page and punished websites that didn’t have one.
- RankBrain – dealt with shallow content and bad user experience.
- Possum – tightened up local SEO.
- Fred – took over thin and affiliate-heavy content.
1. About Google algorithms update history
2. The most annoying Google penalties and updates, and how to recover from them
3. The Google algorithm cheat sheet
3. Keyword Research
Keyword research is the bread and butter of SEO. It helps determine what words your audience uses to speak of your business and what keywords can bring you traffic. Keyword research is not a one-time task: it should be repeated at every stage of your website's development to keep it up to speed.
3.1. What Is a Keyword?
One bite at a time, let’s figure out what the term ‘keyword’ actually means. In SEO, keywords are the words and phrases that help users find pages with the necessary information via search engines. Keywords help search engines understand whether web pages are relevant to the users' search intent. SEOs work hard to collect those words to understand their audience's language and show search engines that their content is the most relevant.
Keywords are the foundation for website structure and the content itself.
Go deeper and read the extended definition of a keyword from Moz.
3.2. How to Do Keyword Research
Keyword research starts with discovering your ‘seed’ or ‘primary’ keywords. Since these keywords are extremely competitive, the list should be expanded with keywords that imply the user's intention. Such keywords are called long-tail keywords.
3.3. Long-Tail Keywords
Long-tail keywords consist of more than three words and often look like a sentence: ‘how to make apple juice in a juicer machine’.
They are more specific about the user’s intention. Long-tail keywords have more potential to bring a quality audience – a more intentional one that is more ready to convert.
1. Guide to long-tail keywords from Ahrefs.
2. What does the ‘long-tail’ keyword really mean?
4. On-Page SEO
Now let’s deal with this little stumper.
On-page SEO or on-site SEO is the practice of crafting individual web pages to enhance ranking, increase engagement, and earn relevant organic traffic. It deals with both textual and graphic information a page consists of in addition to the quality of HTML source code, and many other technical aspects such as meta tags and structured data.
4.1. Quality Content
Google will never get tired of fortifying the idea that the web should be a place of great content.
Great content should be unique, well-structured, showcased with images / videos, nourished with keywords, and most importantly – it should bring value to the users.
1. A good starting point to how to build killer content. Whiteboard Friday by Rand Fishkin.
2. We talked about blog optimization and gave a few actionable tips.
3. Here we explained how to create a content strategy.
4. And showed how to detect and optimize old content (since it’s a valuable source of traffic) with Netpeak Spider and other tools.
5. Blogging for business course from Ahrefs. Gives a decent in-depth view of how to work with content.
6. Beginner’s guide to content marketing from Moz. For those who need to get a general grasp.
4.2. Title Tag
What induces you to open and read an article? In most cases, it’s the title. Being the main element of the snippet, catchy titles draw users’ attention in the SERP and help them make the final decision: follow your website or keep looking for another result. Just as important, titles give search engine robots a clue about what the page is about.
A title tag is an HTML element that dwells in the <head> chunk of the code. It’s an essential place for the keywords that you need for promotion.
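For instance, a title tag for the smoothie page mentioned earlier might look like this (the wording is just an illustration):

```html
<head>
  <!-- shown as the clickable headline in the SERP snippet -->
  <title>How to Make a Smoothie: 5 Easy Recipes</title>
</head>
```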
4.3. Meta Description
Meta description is another HTML tag that incorporates a pithy summary of the webpage. Quite often, search engines use them in the snippets, just below the title and URL. Meta description is not a ranking factor itself, but it contributes to CTR (click-through rate) improvement for several reasons:
- A properly-composed description acts as an advertisement in SERP and catches the eye.
- You can sprinkle keywords over the text, and they will be highlighted in bold.
- When you share an article on social media platforms such as Facebook and Twitter, they fetch the page's meta description for the preview.
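In the code, the meta description sits in the same `<head>` section as the title tag; the content text below is a made-up example:

```html
<head>
  <title>How to Make a Smoothie: 5 Easy Recipes</title>
  <!-- a pithy summary that search engines may show in the snippet -->
  <meta name="description" content="Learn how to make a smoothie in five minutes: simple recipes, the right fruit-to-liquid ratio, and blender tips.">
</head>
```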
4.4. Header Tags
Header tags (or heading tags) are HTML elements that are used to create headings on a page. They structure the text and divide it into logical parts. Header tags help catch users’ attention during the first seconds after they land on your page.
Why should you be vigilant about those first seconds? A harsh truth about online reading is that people tend to scan text rather than read it word for word. Here’s recent research that demonstrates it perfectly.
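A typical heading hierarchy looks like this (the outline is hypothetical; indentation is only for readability):

```html
<h1>How to Make Apple Juice</h1>      <!-- one main heading per page -->
  <h2>Choosing the Apples</h2>
  <h2>Pressing the Juice</h2>
    <h3>Using a Juicer Machine</h3>   <!-- a nested subsection -->
```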
4.5. URL
A URL (Uniform Resource Locator) is a web address that points to a unique resource, for example: https://www.tasteofhome.com/recipes/apple-pie.
A URL should be user-friendly, too. It shouldn’t be a random jumble of letters and numbers but a human-readable piece that tells users what the content is about.
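Compare a machine-generated address with a human-readable one (both URLs are hypothetical):

```html
<!-- hard for users (and robots) to interpret -->
<a href="https://example.com/index.php?id_wca=441&amp;page=2">recipe</a>

<!-- human-readable: the path itself describes the content -->
<a href="https://example.com/recipes/apple-pie">apple pie recipe</a>
```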
1. Here’s a guide to SEO-friendly URL structure: ‘10 Tips to Create SEO Friendly URL Structure.’
2. And here's basic dive-in information about URLs from Moz blog.
4.6. Anchor Text
Anchor text is the clickable text that takes a website visitor to another page via an embedded reference; in other words, anchor text is part of a hyperlink. Its key features are clickability and readability. Here’s an anchor text in raw form:
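A minimal example (the URL and wording are placeholders):

```html
<!-- 'complete guide to smoothies' is the anchor text users see and click -->
<a href="https://example.com/smoothie-guide">complete guide to smoothies</a>
```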
4.7. Media Optimization: Image, Video, Alt Text
An article furnished with catchy visuals has a better chance of being read through.
However, reckless image stuffing won't do much good. A few tips:
- Use the right file type: JPEG for photos, PNG instead of GIF to achieve a better compression ratio.
- Compress images – heavy files are the main hazard of slow page loading.
Each image should come with alt text (an alternative text attribute) to ease perception for visually impaired readers and for search robots. Enhance alt text with keywords.
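For example (the file name and description are made up):

```html
<!-- the alt text is what robots and screen readers 'see' instead of the image -->
<img src="/images/green-smoothie.jpg"
     alt="Green smoothie with spinach and banana in a glass jar">
```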
Another tip: don't host videos on your own website, because limited bandwidth may slow down your website's loading speed.
4.8. Internal Linking
Internal links are all links that point to the pages within one domain. Internal linking is immensely important for SEO because it eases search engine crawling and helps Google find new content. It also helps distribute PageRank around your site and pass link equity to new pages that require an additional boost to get visible.
4.9. On-Page Issues
Thin content is a generic term for content that has no value: it is stuffed with meaningless and awkward keywords, duplicated or copied from elsewhere, low in word count, and meant only to push sales.
Duplicate content is another SEO foe. Duplicate content means pieces of information within one or several domains that are entirely identical. It leads to significant indexing problems and thus to ranking drops.
Broken links are links that lead to non-existent pages, files, or images. If everything is set up right on the technical side of the website, the user will see a 404 or 410 error page. Companies often let their creative spirits loose on these pages.
Missing alt attributes leave images unexplained to search engines: robots can't see what is depicted in an image, they ‘read’ the alt text instead.
5. Technical SEO
Now let's plunge into technical SEO, major terms, and how they correlate with SEO.
5.1. Page Speed
5.2. Top-Level Domain, Domain, Subdomain, and Subfolder
Let’s look at the example first:
- https:// – a primary protocol used to send data between a web browser and a website server. ‘S’ in the end stands for ‘secure’.
- www. – most common subdomain choice
- tasteofhome – a domain. It’s a website name bought from registrars and web hosting platforms (such as GoDaddy). A domain name is a unique website location – no two identical domain names can exist.
- .com – a top-level domain. It sits at the highest level of the hierarchical Domain Name System.
- /recipes/apple-pie – subfolders.
A website’s domain structure can make or break its SEO.
5.3. Website Structure
Remember one of the ranking factors – a crawlable website? For search robots, the website's structure is its internal links. If you work out a clear structure, it will be easier for robots to find, crawl, and index the content. Meanwhile, it also positively affects how users perceive your website – a win-win for all parties.
Get a better grasp of how to build and analyze website structure.
5.4. Directives in Meta Robots, X-Robots-Tags, Robots.txt
SEOs can manage search robot behavior on the website, pointing to the most important pages and closing the least important ones.
Robots meta tag, robots.txt, and X-Robots-Tag are members of the Robots Exclusion Protocol (REP) – a group of standards that determine how search robots crawl and index a website.
- Robots.txt is a text file in the root directory of the website.
- Robots meta tag is placed into the <head> section of the HTML code of the page.
- X-Robots-Tag is a part of an HTTP header sent from a web server.
They are located in different places and share common directives, but their influence differs. For instance, directives in robots meta tags have more power than those in robots.txt. Directives in X-Robots-Tags are harder to implement, which is why they're used less often.
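To illustrate, here is the same "don't index this" instruction in each of the three places; the robots.txt and HTTP-header variants are shown inside HTML comments since they live outside the page's markup (paths and values are examples):

```html
<!-- robots.txt (a plain text file in the site root, not HTML):
       User-agent: *
       Disallow: /drafts/
-->

<!-- robots meta tag, placed in the <head> of a specific page -->
<meta name="robots" content="noindex, nofollow">

<!-- X-Robots-Tag, sent by the web server as an HTTP response header:
       X-Robots-Tag: noindex
-->
```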
5.5. Sitemap
Sitemaps are necessary for big websites because they contribute to crawling and indexing.
There are two widespread types of sitemaps: HTML and XML.
An XML sitemap is a text file with links to relevant pages and specific tags.
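A minimal XML sitemap with a single URL might look like this (the address and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/recipes/apple-pie</loc>
    <!-- an optional tag that gives robots an extra hint -->
    <lastmod>2020-06-15</lastmod>
  </url>
</urlset>
```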
HTML sitemap is a clickable list of all subpages in the footer of the website. Its purpose is to show readers and search robots what’s on a site, organize large websites, increase SERP visibility, etc.
1. Google broke down the process of building and submitting a sitemap into three cohesive steps
2. We also covered this topic on our blog: ‘What Is XML Sitemap? How to Create and Validate It.’
3. And more about HTML sitemap here: ‘How to Create the Correct HTML Sitemap.’
5.6. Structured Data
Structured data, or schema markup, provides search robots with more information about the page and its content. By adding extra tags and attributes to your HTML, you can:
- Extend an ordinary-looking snippet to a rich snippet
- Appear in the ‘zero’ position (featured snippet)
- Give clues about what content should be crawled
- Tell more about page type
- Increase CTR (click-through rate)
2. In case you want to explore click-through rate in SEO: 'What Does CTR Mean in SEO?’
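One common way to add structured data is a JSON-LD script in the page's `<head>`. This sketch marks a page up as an Article (all values are placeholders):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "How to Make a Smoothie",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "datePublished": "2020-06-15"
}
</script>
```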
5.7. Status Codes
HTTP response status code is a way a server responds to a browser request. If the request is successfully completed, you (your browser) will be returned a 200 OK response code, literally translated as ‘Everything is okay’.
5.8. Redirects
Redirects are a way to lead users and search robots from the page they initially requested to a new one.
There are many redirect types, but the most common are 301 and 302 (these numbers refer to HTTP status codes).
A 301 ‘Moved Permanently’ redirect is a permanent redirect used when the old URL is taken out of service. This type of redirect is known for passing ‘link equity’ to the new page. That’s why it’s the recommended option.
302 ‘Moved Temporarily’ or ‘Found’ redirect is used to temporarily move one URL to another. This redirect is unlikely to pass link equity, so it should be treated cautiously.
5.9. Canonical Tag
Unlike a 301 redirect, the canonical directive is intended only for robots: it points to the most representative page among identical pages (duplicates). Simply put, canonicals tell search robots which URL is preferable to appear on search result pages.
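The canonical tag lives in the `<head>` of the duplicate page and points to the preferred URL (a made-up example):

```html
<!-- placed on https://example.com/recipes/apple-pie?ref=newsletter -->
<link rel="canonical" href="https://example.com/recipes/apple-pie">
```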
5.10. Server Response Time
Server response time is the time between the moment a browser sends a request to a server and the moment the server responds. It is measured as Time to First Byte (TTFB).
It’s a tentative ranking factor: the less time a server takes to respond, the faster the website loads. And vice versa.
5.11. Mobile-First Indexing, AMP
Mobile-first indexing means that of the two versions of a page – mobile and desktop – robots will choose the mobile one to index. Moreover, Google is continually going all out to improve mobile search. A new Page Experience Update is about to come, and it also includes features such as mobile-friendliness.
AMP stands for Accelerated Mobile Pages. It's an open-source framework used to create a better and more lightweight experience for mobile users.
5.12. HTTPS vs. HTTP
HTTP (Hypertext Transfer Protocol) is a client-server protocol used to transfer data over the World Wide Web. A client (your browser) communicates with the server by sending it individual messages (requests) and receiving the answers (responses).
If you’re more into tech stuff, dive deeper into this subject in this article: ‘An Overview of HTTP.’
HTTPS stands for Hypertext Transfer Protocol Secure. The 'secure' in the end gives a clue that communication between your browser and the server is encrypted and thus protected. Why is this needed? Under HTTP protocol, your private and sensitive data such as logins, credit card numbers, etc. may be intercepted by attackers. That's when HTTPS protocol comes into play. It ensures three levels of protection:
- Encryption
- Data integrity
- Authentication
How do you know whether a website uses the HTTPS protocol? In the address bar, next to the URL, you’ll see a lock icon.
Sometimes a mixed content issue occurs when a website was transferred to HTTPS protocol, but some components such as scripts, images, videos, PDF files, etc. were left on HTTP. Then you'll see that this website is not fully secure.
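A typical mixed content case looks like this (the URLs are illustrative):

```html
<!-- the page itself is served over HTTPS... -->
<!-- ...but this image is still requested over plain HTTP,
     so the browser marks the page as not fully secure -->
<img src="http://example.com/images/photo.jpg" alt="Photo">
```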
5.13. Hreflang Attribute
The hreflang attribute tells a search engine about several versions of the same page in different languages, each with its own URL.
Hreflang in the raw:
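A minimal sketch, placed in the page's `<head>` (the URLs are placeholders):

```html
<link rel="alternate" hreflang="en-US" href="https://example.com/en-us/page">
<link rel="alternate" hreflang="de-DE" href="https://example.com/de-de/page">
```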
Here, "en-US" stands for the American English version of the page.
6. How to Do an SEO Audit
If you’re serious about website performance, regular audits are a core SEO task. SEO audits give you a better understanding of the website, individual pages, and overall traffic. If done in a timely manner, critical issues such as broken pages, duplicate content, broken redirects, or redirect loops will be spotted and fixed before they severely affect the website. In Netpeak Spider, you can detect 100 issues.
Since each issue deserves a separate blog post, we created an SEO academy at Netpeak Software to cover the basics of on-page and technical audits.
7. Off-Page SEO
Off-page SEO focuses on all actions outside the website realm that can improve rankings. Building a backlink profile is a massive part of it (and it's proven worthy). However, many other activities can bring results: linkless brand mentions, NAP citations, shares on social media, etc.
7.1. Backlinks
Backlinks are links that lead to your site from other sites, blogs, and social networks. Google values backlinks a lot, considering them signs of trustworthiness, credibility, and authority. This makes backlinks one of the top ranking factors.
2. For those who don’t like reading and prefer to digest information on YouTube: ‘What are Backlinks and Why are They Important?’
7.2. Link Building
The process of obtaining backlinks is called link building. A robust backlink profile is a great advantage that can lead any website to the SERP's first position.
However, not all backlinks are created equal; some have more value than others. What makes the difference? For instance, a link from an old, authoritative domain is more valuable than one from a freshly minted, developing domain with low traffic or, in the worst-case scenario, with reduced authority.
As you can see, link building is a subtle process that gives tangible results in the end.
That’s why we prepared several guides to link building to help you get a whole picture with all its peculiarities:
8. Competitive SEO Analysis
As an SEO beginner, you should know that learning from your competitors can give you the most valuable insights into what’s working best in your industry. There’s nothing wrong with implementing the practices that were used by competitors in your project. Before becoming a tough guy in SEO, you’ll need to learn from other tough guys 😃
Competitive analysis will help you find their weaknesses and strengths and come up with your own strategy. Analyze these important parts of a competitor's SEO strategy:
- Technical optimization
- Backlink profile
- Top content
But, first and foremost, identify your true competitors on the SERPs.
9. How to Measure and Track Results
Set measurable and specific goals from the outset. Changes without further tracking and analysis won’t give you the main thing: the insights of what optimization efforts worked out and what failed.
To track results, you’ll need some tools:
- Google Search Console – get reports on website issues, performance, and traffic, see which queries bring users and optimize your content.
- Google Analytics – get reports on organic traffic, conversions, and engagement metrics. You can take Google Analytics Courses to learn how to create reports, track business performance, and carry out complex analysis.
- PageSpeed Insights – check out how fast the page loads.
- Structured Data Testing Tool – validate structured data markup.
- Mobile-Friendly Test – check out if the mobile version of the page is working well.
- Netpeak Spider – do website audits, save reports, and keep an eye on your website’s health.
10. SEO Community Resources
We understand that it might be overwhelming to find your feet amid all these SEO trends, techniques, and information. There are many SEO hubs where you can share your concerns, receive hands-on advice, and learn from others. A sense of community is something you’ll need so you don't feel left in the lurch.
So if you’re a Quora fan, we suggest joining these topics:
Reddit community, for the high-brow ones:
The GrowthHackers Community is yet another helpful resource. Anyone who has something valuable to share can post an article there. The articles are gathered under related tags, which you can filter to focus on specific pain points – SEO, Social Media Marketing, or whatever else. If you don’t want to sift through numerous posts, you can subscribe to their newsletter and receive only the editor’s picks.
Search Engine Land is a holy grail for digital marketers. Here you can find articles about digital marketing, news coverage, industry trends, ebooks for beginners, and much more.
To Wrap It Up
SEO isn't as difficult as it might seem at the outset. With this kit, you'll get a grasp of the main aspects of SEO. To dig deeper, do several website audits to put your skills to work. After all, practice makes perfect.