What Is SEO?
SEO stands for Search Engine Optimization. It’s best defined as the steps a webmaster takes to increase the visibility of his/her web pages in a search engine’s organic search results.
Let’s define a couple of terms at this point.
A webmaster is simply a person who is responsible for a website.

Organic search results are those listings in the SERPs (Search Engine Results Pages) that are there on merit because of their relevance to the search term typed in. It’s important to differentiate between organic and paid listings. With paid listings, webmasters pay the search engines to have their pages (in the form of ads) listed at the top of the SERPs. You can identify paid listings in Google because they have the “Ad” label next to them.
There can be several ads at the top of the SERPs before any organic listings appear. You may also see ads at the bottom of each results page.
Before we go any further, I should just mention that this book focuses on Google SEO. Google is the largest search engine on the planet. In fact, most search traffic from around the world comes directly from Google Search. Google has become synonymous with the term “search engine.” It’s an expression that has even found its way into the English dictionary as a verb.
I expect you’ve heard someone describe how they “googled” something or other. Google is the most important place to rank well, and if you rank well on Google, chances are you will rank well on Yahoo and Bing too.
Why Is SEO Necessary?
The answer may be obvious, but let’s start at the beginning.
When you build a website, how are people going to find it?
The sad truth is that you can have the best website on the planet, but if people cannot find it, it’ll be a very lonely place. This brings us back to search engines.
Fact: Search engines are the number one way people find websites.
The good news is that it is easy to get your pages into the search engines.
The bad news is that it is difficult to get your pages to rank high enough to be visible in the search engines.
Data from May 2018, published by Smart Insights on their website, showed that the page ranking in position one of the SERPs typically gets around 30% of all clicks on that results page.
The web page ranked in position two gets 14%. The page in position three gets 10%. In positions 9 to 10, the click-through rate is as low as 2%.
From this information, you can see that it is important not only to rank in Google but to rank as high as possible. Even one or two places can mean big changes in potential search engine traffic. To put numbers on it, a term searched 10,000 times a month would send roughly 3,000 visitors to the page in position one, but only around 1,400 to the page in position two.
I should point out that these figures are just averages. As you will see later in the book, click-through rates are related to how appealing your listing is on Google. Increase the appeal, and you can increase the CTR. If you increase CTR, your rankings may also improve.
What Makes a Page Rank Well?
To rank high in the organic search results, your page needs to be one of the best matches for the search term typed into the search box. Your page needs to match the “intent” of the searcher. Your page also needs a certain level of trust and authority.
How much you need depends on the competition in your niche, and even on the niche itself. For example, since the “Medic Update” in August 2018, it has been very difficult to rank in health and finance niches without a lot of trust and authority.
A little later in this chapter, we’ll look at the top-ranking factors in a little more detail, but to fully understand how SEO is different today, we should consider the SEO of the past.
How We Used to Rank Pages
In the good old days (that’s up to about 2010), ranking in Google was relatively easy.
At that time, Google’s algorithm (the complex code that determines where a web page will rank) was heavily based on two things: the keywords found on the page and the links pointing to it.
These were the main ranking factors, and webmasters knew it. Since both of those factors could be controlled and manipulated by the site owners, many webmasters began to organize and manipulate things so that they could rank well in the SERPs.
Here was the process for doing this:
1. Identify the keywords you want to rank for (the ones people type in at the search engines).
2. Create a page that was “optimized” for that keyword or phrase. The quality of the content was not important back then. You simply needed to include your chosen keyword in as many places within the HTML code as possible. This was aptly named keyword stuffing (see the example page after this list).
The keyword would be placed in the title of the page, the opening header, in the body of the content (maybe five times or more per 100 words), and in the ALT tags (a text alternative for an image or object on a page). Keywords were also sometimes stuffed into the domain name itself. The more times you could get your term on the page, the better your results.
3. Build backlinks to the page. These were often built by the thousand using automated backlinking tools that dropped links into a variety of spammy, worthless pages around the net. These backlinking tools would use your chosen keyword in the link text.
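To make that concrete, here is a simplified sketch of what a keyword-stuffed page from that era might have looked like. The keyword “blue widgets” and the file name are made up for illustration:

```html
<!-- A simplified example of old-school keyword stuffing. Do NOT do this today. -->
<html>
<head>
  <title>Blue Widgets | Best Blue Widgets | Cheap Blue Widgets</title>
</head>
<body>
  <h1>Blue Widgets - Buy Blue Widgets Online</h1>
  <p>Our blue widgets are the best blue widgets around. If you need blue
  widgets, buy blue widgets here, because nobody knows blue widgets like
  our blue widget experts.</p>
  <!-- The keyword was even stuffed into the ALT text of images -->
  <img src="widgets.jpg" alt="blue widgets cheap blue widgets best blue widgets">
</body>
</html>
```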
Basically, that was the strategy, and it worked. You could rank for literally any term using that simple formula.
Webmasters were able to rank in the SERPs for anything they wanted. Back in the very early days, if you got into the top 10 positions on page one of Google, you would remain there for three months, until the next update came out.
Google had lost control.
As you can imagine, the SERPs started to fill up with junk. Pages were ranking because of the spammy techniques employed by webmasters, rather than on merit, and that made Google look bad.
As a search engine, Google’s goal was to return the best, most relevant, high-quality results possible. The reality was very different. In many cases, the top 10 was filled with trashy content: spammy sites that offered little or no value to the web surfer.
Over time, Google refined its algorithm, making it more and more difficult for webmasters to game the system. In Google’s ideal world, its prized algorithm would be based entirely on factors that webmasters could not control or manipulate.
As you will see later in the book, Google’s Panda and Penguin updates (as well as several other major updates) were designed to take back control from the webmasters. By removing the factors that webmasters could easily manipulate from the algorithm, or giving those factors less importance, Google made it increasingly difficult for webmasters to game the system. Bear this in mind when we look at the top-ranking factors later in this chapter.
Personalized Search Results
In recent years, Google has been applying more and more personalization to the search results. So, what does that mean exactly? It means that what you see when you search for a phrase in Google may not be the same as what someone in a different part of the country would see for the same search term. In fact, it might not even be the same set of results that your neighbor sees or your mom sitting next to you on the couch while searching on her mobile phone.
It is important to realize this. Not everyone sees the same results for the exact same search query.
You see, when someone searches on Google, the search giant applies filters so that the searcher sees the most relevant results, based on their own personal circumstances and viewing history.
As a quick example, let’s say you search Google for an antivirus program on your iMac. The chances are you’ll see a bias towards Mac antivirus tools and discussions among Mac users about whether antivirus is even needed on a Mac. Do the same search on a PC, and you’ll get PC antivirus software discussions and reviews. Repeat the search on an Android phone or iPhone, and you’ll get results tailored to those operating systems.
This is one simple example. Google looks at much more than just your operating system. Other factors it’ll try to use include:
- Your location (whether that is your reported location, IP address, or GPS location as given out by mobile devices).
- Your search history, which looks at what you have been searching for in recent days or weeks.
- Your social networks, including your likes, your friends, and your circles.
- Whether you are searching on a mobile device, desktop, or even SmartTV.
- Your preferred language.
Personalization of the search results is meant to make our lives easier, and in many ways, as a consumer, it does. However, as far as SEO is concerned, personalization can be a pain if we don’t know what other people are seeing in their SERPs.
Top Ranking Factors in 2022
Earlier I suggested that in an ideal world, Google would build its algorithm around factors that were not easily controlled (and therefore manipulated) by webmasters.
In this section, we’ll look at some of the factors we know are important. Think about each one in turn and how much control a webmaster has over it. Ranking factors can be split into two groups, namely “on-page” and “off-page.”
On-Page Factors
1. Quality Content
Google is looking for high-quality content that is relevant to the search query. It will look at the language used on a page and for words and phrases that are related to the search query. I call these “theme words.” Longer pages tend to do better, and the inclusion of photos and/or video works to your advantage too.
Furthermore, pictures and videos also help to retain the visitor’s interest. Google also looks at the design of pages and what a visitor will see “above the fold” (before scrolling down) when they land on a page. Good user experience is essential. If a visitor landing on your page sees a bunch of adverts and very little else, you can imagine how Google would view that page.
2. Page Load Time
Nobody likes waiting around for a page to load. If your web pages take five or more seconds to load, your visitors may not wait and will hit the back button instead. According to research, the average web user has an attention span of just a few seconds, less than that of a goldfish. So, it’s important that your pages load quickly.
Slow page load times are unlikely to be directly penalized by Google. It’s more about how a visitor reacts. If the searcher came from Google, a slow-loading page will make that visitor unhappy, and they may even click the back button to return to Google. Google sees both the “bounce” and “exit” rates as negatives for your page. An unhappy visitor from Google means an unhappy Google.
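While this is not a book on web development, a few simple HTML techniques go a long way towards faster pages. The snippet below is just a sketch; the file names and the font host are placeholders:

```html
<!-- Common ways to speed up page loads (file names are placeholders) -->
<head>
  <!-- Let the browser open a connection to a third-party host early -->
  <link rel="preconnect" href="https://fonts.gstatic.com">
  <!-- Load non-critical JavaScript without blocking page rendering -->
  <script src="analytics.js" defer></script>
</head>
<body>
  <!-- Only load images when they are about to scroll into view -->
  <img src="large-photo.jpg" alt="Product photo" loading="lazy" width="800" height="600">
</body>
```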
3. Internal Links from Other Pages on the Site
If you look at a website like Wikipedia, you’ll see a lot of internal links on its pages. Internal links go from one page on a website to a different page on the same site. These should be distinguished from external links, which point to a different website.

Internal links are there to help the visitor navigate around your website’s pages. As someone reads a page on Wikipedia, they might come across a word or phrase they do not understand or simply want to know more about. By “internally” linking keywords or phrases to other pages on Wikipedia, visitors can navigate around the site more easily and quickly find the information they are looking for. Internal links also help Google fully index your website.
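In HTML terms, the difference is simply where the link points. A minimal sketch, with made-up URLs:

```html
<!-- Internal link: points to another page on the same site -->
<a href="/vitamin-a-deficiency/">vitamin A deficiency</a>

<!-- External link: points to a page on a different website -->
<a href="https://en.wikipedia.org/wiki/Vitamin_A">Vitamin A on Wikipedia</a>
```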
4. Bounce Rates
We mentioned bounce rates earlier in the context of fast-loading pages. A “bounce” is simply a visitor who clicks a link in the SERPs and then returns to Google. The quicker the return, the worse it is for your page, as it tells Google the visitor was not satisfied.
Let’s think about how this might work.
Say a visitor on Google searches for “vitamin A deficiency” and visits the first page in the SERPs. Not finding what they want, they then click the browser’s back button to return to Google. They may then click on another site further down the SERP to see if that can provide what they are looking for.
What does this tell Google about the first page?
The visitor did not find the information they wanted on that page.
Google knows this because they returned to the search results and repeated (or refined) their search. If lots of people around the world search for a certain phrase, and an unusually high percentage of them bounce back from the same web page that is ranked #1 in Google for the search term, what do you think Google will do?
Doesn’t it make sense that it would demote that page in the SERPs – for that search phrase – since lots of people are not finding it relevant to their search query?
Bounce rates go hand-in-hand with searcher intent. If visitors find a page relevant, they’ll stay on it for longer. They may even browse other pages on that site rather than bouncing right back. This tells Google the visitor was happy with that recommendation, and that makes Google happy.
6. Time a Visitor Stays on Your Page/Site
Google monitors the time visitors spend on web pages. One of the ways it does this is through its Google Analytics platform. Google Analytics is a freemium web analytics service for site owners that tracks and reports on website traffic. Because it’s free, a lot of webmasters install it on their sites, and that gives Google the ability to accurately track those sites’ visitors. It tracks lots of variables, including the time spent on the site, the route a visitor takes through the site, how many pages they visit, what operating system they use, their screen resolution, the device they are using, and so on. Even if a site does not have Analytics installed, it is possible that Google monitors visitor behavior through its popular Chrome web browser.
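For reference, “installing” Google Analytics usually just means pasting a small tracking snippet into every page of the site. The sketch below follows the current gtag.js pattern; the measurement ID is a placeholder you would replace with your own:

```html
<!-- Google Analytics (gtag.js). G-XXXXXXXXXX is a placeholder ID. -->
<script async src="https://www.googletagmanager.com/gtag/js?id=G-XXXXXXXXXX"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag(){dataLayer.push(arguments);}
  gtag('js', new Date());
  gtag('config', 'G-XXXXXXXXXX');
</script>
```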
6. Trust and Authority
I’ll cover this here even though it is controlled by off-page factors, simply because we tend to think of authority as a property of the site itself.
This factor became huge in 2018. It was always important, but with the introduction of the “Medic Update” in August of that year, high trust & authority is now vital for ranking in health and finance niches (and other niches will follow). Essentially, if a site can hurt your health or your financial well-being with the information (or products) it offers, it will require a lot more trust before Google will rank its pages. It is my opinion that Google measures these factors largely through what other authoritative sites (and people) are saying about you and your site.
As we’ve seen, votes (links from other sites) pass on this authority. Now more than ever, it is important to focus on high-quality, relevant, and authoritative links, rather than high numbers of links. Quality over quantity is the key. While trust and authority are things your site will accrue over time, they are largely controlled by off-page SEO, and we’ll come back to this later.
Those are the main on-page factors used by Google in its ranking algorithm.
Except for the last factor, most of the on-page factors are within the control of the webmaster. Even bounce rates and the time a visitor stays on your site are within your control, to a certain extent. If you provide quality content and the rich experience visitors demand these days, you’ll get lower bounce rates while keeping visitors on your page/site for longer.
Off-Page Factors
1. Click-through Rates (CTR)
As webmasters, we do have a certain level of control over Click-through Rates.
Let’s say a web page ranks in position #5 for a search term, and searchers seem to like that listing because 15% of them click on that link. Usually, a page listed in position #5 would get around 5% of the clicks. When Google sees more people than expected clicking that link, it may give the page a boost in the rankings. After all, it’s apparently what the searchers are looking for and therefore deserves a higher slot on the first page.
On the other side of the coin, imagine a spammer. This is an “official” term used by Google to describe someone trying to manipulate rankings for one of their web pages. Let’s suppose the spammer manages to bypass Google’s algorithm through a “loophole” and ranks #1 for a search term. Remember, in position #1, a link typically gets around 30% of the clicks. However, this #1 ranking page only gets 15% of clicks because searchers are not impressed with the link title or its description.
On top of that, 99% of people who do visit that link bounce right back to Google within 30 seconds or less because it’s rubbish. Google now has clear user signals that the web page ranking #1 is not popular with searchers. Because of this, Google starts moving the page further down the rankings until it finally drops out of the top 10. It will continue to drop. Today, bad content will rarely get to the top of Google, and if it does, it won’t get to stay there for long.
2. Social Signals
Social signals like Tweets, Facebook shares, Pinterest pins, and so on, are clearly used as ranking factors in Google, though they are not major factors. Any boost that social signals might offer your site will be short-lived. This is because of the transient nature of “social buzz.”
For example, let’s say that a new piece of content goes viral and is shared around by thousands of people via social media channels. This is typically done within a relatively short space of time. Google will take notice of this because it realizes the content is something visitors want to see, so it gives it a ranking boost. After the social interest peaks and the shares inevitably start to decline, so does the ranking boost in Google.
Social sharing is a great concept and should be encouraged on your site. Even so, don’t expect the backlinks created from social channels to give you a big or long-lasting ranking boost because they won’t.
3. Backlinks
Re-read what I wrote about trust & authority a minute ago. When “web page A” links to “web page B” on another site, page B gets a “backlink.” Google sees this as page A (on site 1) voting for page B (on site 2). The general idea is that the more backlinks (or “votes”) a page gets from other sites on the web, the more important or valuable it must be.
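In HTML terms, a backlink is just an ordinary link on someone else’s page pointing at yours. A minimal sketch, with a made-up URL and anchor text:

```html
<!-- This link, placed on "web page A" (site 1), is a backlink for "web page B" (site 2) -->
<a href="https://site2.example.com/page-b/">useful guide to blue widgets</a>
```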
Today, and probably for the foreseeable future, backlinks remain one of the most important ranking factors in Google’s algorithm. However, more is not always better. Let me explain.
A web page that has dozens of links from authority sites like CNN, BBC, NY Times, etc., is clearly an important web page. After all, quality, authority sites like the ones above would hardly link to trash.
On the other hand, a page that has thousands of backlinks, but only from spammy or low-quality websites, is most probably not very important at all. Backlinks are a powerful indicator of a page’s value, but the quality and relevance of those backlinks is the most important factor, not the quantity.
High-quality links build authority and trust. Low-quality links have the opposite effect.
A site with hundreds or thousands of low-quality backlinks is helping Google identify it as a spammer.
Google factors in the authority of each backlink. Backlinks from high-quality “authority” web pages will count far more than backlinks from low-quality pages/sites. Therefore, a page that gets relatively few high-quality backlinks can rank above a page that has a lot of low-quality backlinks. Google may even penalize a page (or site) for having too many poor-quality backlinks.