What Is SEO in Terms of Ranking at Google
SEO stands for “Search Engine Optimization”. It’s the practice of increasing both the quality and quantity of website traffic, as well as your brand exposure, through unpaid (aka “organic”) search engine results. In practice, this includes things like keyword research, content creation, link building, and technical audits.
Despite the acronym, SEO is as much about people as it is about search engines themselves. It’s about understanding what people are looking for online, the answers they seek, the words they use, and the type of content they want to consume. Knowing the answers to these questions will help you connect with the people online who are looking for the solutions you offer.
If knowing your audience’s intent is one side of the SEO coin, presenting it in a way that search engine crawlers can find and understand is the other. Expect to learn how to do both in this guide.
SEO Guide To Search Engines
Search engines are answer machines. They sift through billions of pieces of content and evaluate thousands of factors to determine which content is most likely to answer your query. They do this by discovering and cataloging all available content on the web (web pages, PDFs, images, videos, etc.) through a process called “crawling and indexing”, and then ordering results by how well they match a query, in a process we call “ranking”. In the following sections, we’ll discuss crawling, indexing, and ranking in more detail. Search engines crawl billions of pages using web crawlers. Also known as spiders or bots, crawlers browse the Internet and follow links to find new pages. These pages are then added to an index from which search engines pull results.
Methods of Fetching Data
- Crawling
- Indexing
- Ranking
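The three steps above can be sketched as a toy program. This is purely illustrative: the pages, links, and word-overlap scoring below are invented for the example and bear no resemblance to how Google actually crawls, indexes, or ranks.

```python
# Toy sketch of crawl -> index -> rank over a tiny in-memory "web".
# Pages, links, and the scoring function are illustrative assumptions.

WEB = {
    "/home": {"text": "welcome to our seo guide", "links": ["/crawling", "/ranking"]},
    "/crawling": {"text": "crawling finds new pages by following links", "links": ["/ranking"]},
    "/ranking": {"text": "ranking sorts indexed pages by relevance", "links": []},
}

def crawl(start):
    """Discover pages by following links from a start URL."""
    seen, queue = set(), [start]
    while queue:
        url = queue.pop()
        if url in seen:
            continue
        seen.add(url)
        queue.extend(WEB[url]["links"])
    return seen

def index(urls):
    """Store each discovered page's words for later lookup."""
    return {url: set(WEB[url]["text"].split()) for url in urls}

def rank(idx, query):
    """Order indexed pages by how many query words they contain."""
    words = set(query.split())
    return sorted(idx, key=lambda url: len(words & idx[url]), reverse=True)

idx = index(crawl("/home"))
print(rank(idx, "ranking pages"))  # → ['/ranking', '/crawling', '/home']
```

Real search engines replace each step with vastly more sophisticated systems, but the shape — discover via links, store for retrieval, sort by relevance — is the same.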
Every search engine strives to provide the best and most relevant results for its users. This is how they gain market share. Every time someone clicks on a paid search result, the advertiser pays the search engine. This is called pay-per-click (PPC) advertising, and it’s why market share matters: more users mean more ad clicks and more revenue.
Crawling
During crawling, a computer program called a spider visits and downloads known URLs. Google’s crawler is Googlebot. Crawling is the discovery process by which search engines send out a team of robots (called crawlers or spiders) to find new and updated content. Content can vary – it could be a web page, image, video, PDF, etc. – but regardless of the format, content is discovered via links. Googlebot starts by fetching a few web pages and then follows the links on those pages to find new URLs.
By hopping along this path of links, the crawler can find new content and add it to its index, called Caffeine – a massive database of discovered URLs – for later retrieval when a searcher is looking for information that the content at that URL is a good match for. During processing, Google works to understand and extract key information from crawled pages. To do this, it needs to render the page, executing the page’s code to understand how it looks to users. No one outside of Google knows every detail of this process, but that doesn’t matter much. All we really need to know is that it extracts links and stores content for indexing.
Indexing
Search engines process and store the information they find in an index, a huge database of all the content they’ve discovered that they think is good enough to make available to Internet users. Indexing adds processed information from crawled pages to the search index.
The search index is what you search when using a search engine. That’s why getting indexed by major search engines like Google and Bing is so important: users can only find you if you’re in the index. Discovering, crawling, and indexing content is just the first piece of the puzzle. Search engines also need a way to rank matching results when a user searches. This is the job of search algorithms – formulas that match and rank relevant results from the index. Google uses many factors in its algorithms. No one knows all of Google’s ranking factors because Google has not disclosed them, but we do know some key ones. Let’s look at a few.
Ranking
When someone searches, search engines scan their index for highly relevant content and then rank that content in hopes of resolving the searcher’s query. This sorting of search results by relevance is called ranking. In general, you can assume that the higher a website is ranked, the more relevant the search engine considers that website to be for the search query. It is possible to block search engine crawlers from part or all of the website, or to ask search engines not to include certain pages in their index. While there could be reasons for this, if you want your content to be found by people, you first need to make sure it’s crawlable and indexable. Otherwise, it’s almost invisible. By the end of this chapter, you’ll have the context you need to start working with the search engine, not against it!
Why is SEO important?
People are likely searching for what you do, and you can attract customers by ranking for those terms. But you are unlikely to rank without effort, because others are trying to do the same. This is why SEO matters: it helps you show Google that you most deserve to rank.
What are the benefits of SEO?
Most people click on one of the first search results, so higher rankings usually lead to more traffic. Unlike other channels, search traffic tends to be consistent and passive. Indeed, the number of searches is generally quite constant from month to month.
SEO involves five main steps:
- Keyword research. Find what people search for.
- Content creation. Craft content for searchers.
- On-page SEO. Make your content as clear as possible.
- Link building. Build trust and authority from other websites.
- Technical SEO. Help search engines find, crawl, and index your content efficiently.
Keyword Research In SEO
You probably have a few keywords in mind that you want to rank for. These are things like your products, services, or whatever else your site is about, and they are great seed keywords for your search, so start there! You can enter these keywords into a keyword research tool to find average monthly search volume and similar keywords. We’ll dive deeper into search volume in the next section, but during the discovery phase, it can help you determine which variations of your keywords are most popular with searchers.
Once you’ve entered your seed keywords into a keyword research tool, you’ll begin to uncover other keywords, FAQs, and topics for your content that you might otherwise have missed. People type keywords into search engines like Google to find products, services, and information.
Usually, the higher the search volume for a keyword or keyword phrase, the more work it takes to rank well. This is often called keyword difficulty, and it is sometimes affected by SERP features; for example, if many SERP features (like featured snippets, knowledge panels, carousels, etc.) clog a keyword’s results page, difficulty will increase. Big brands often occupy the top 10 results for high-volume keywords, so if you’re just starting out on the web and targeting those same keywords, the uphill battle for rankings can take years.
Generally, the higher the search volume, the greater the competition and effort required to achieve organic ranking success. However, if you go too low, you risk not attracting Internet users to your site. In many cases, it may be more beneficial to target very specific search terms with less competition. In SEO, we call these long tail keywords.
Competitor Analysis With Keywords
You will probably collect a lot of keywords. How do you know what to tackle first? It may be a good idea to prioritize high volume keywords that your competitors are not currently ranking for. On the other hand, you can also see which keywords in your list your competitors are already ranking for and prioritize them. The former is ideal if you want to capitalize on your competitors’ missed opportunities, while the latter is an aggressive strategy that puts you in competition for the keywords your competitors are already successful with.
Knowing seasonal trends can be beneficial when determining a content strategy. For example, if you know that the UK “Christmas Box” will increase sharply from October to December, you can prepare content months in advance and give it a big push during those months.
You can target a specific location more precisely by limiting your keyword search to specific cities, counties, or states in Google Keyword Planner, or by rating “interest by sub-regions” in Google Trends. Geo-specific research can help make your content more relevant to your audience. For example, you might find that in Texas the preferred term for a large truck is “big rig”, while in New York “tractor trailer” is the preferred terminology.
On-Page SEO
On-page SEO is anything you can do on the page itself to improve its rankings. It aims to help Google and readers better understand and digest your content. Google looks at what you can influence with on-page SEO to decide whether your page is a relevant result for a search, including whether the query’s keywords appear on the page and how searchers interact with it.
Title Tags and Headings
Heading tags, including H1, help Google understand the content of your pages. Best practice is to use one H1 per page for the title. To find pages with missing or empty H1 tags, use Ahrefs Site Audit to crawl your site and access the content report. It’s free with an Ahrefs Webmaster Tools (AWT) account.
Improve the visual hierarchy of your content by marking up subheadings with H tags: H2 for subheadings, H3 for sub-subheadings, and so on. This makes your content easier for readers to skim and digest.
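As a sketch, a hypothetical page’s heading structure might look like the following (the page and section titles are invented for illustration):

```html
<h1>Beginner's Guide to SEO</h1>      <!-- one H1 per page: the title -->
  <h2>How Search Engines Work</h2>    <!-- major section -->
    <h3>Crawling</h3>                 <!-- subsection -->
    <h3>Indexing</h3>
  <h2>Keyword Research</h2>           <!-- next major section -->
```

The indentation here is only to make the hierarchy visible; in real HTML the tags sit flush in the document flow.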
Best practices for title tags:
- Keep them short. Under 70 characters is preferable to avoid truncation.
- Match search intent. Tell searchers you have what they want.
- Be descriptive. Don’t be vague or generic.
- Avoid clickbait. Make sure titles match your content.
- Include the keyword. Use a close variant when it makes more sense.
- Include the year. For topics that demand freshness.
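Putting those tips together, a title tag lives in the page’s head section. The wording and year below are illustrative only:

```html
<head>
  <!-- under 70 characters, descriptive, keyword near the front, year for freshness -->
  <title>Keyword Research for SEO: A Beginner's Guide (2024)</title>
</head>
```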
Meta Description
Best practices for meta descriptions:
- Keep them short. Under 160 characters is best to avoid truncation.
- Expand on the title tag. Include USPs that you couldn’t fit there.
- Match search intent. Double down on what searchers want.
- Use an active voice. Address the searcher directly.
- Include your keyword. Google often bolds it in the results.
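A meta description also lives in the head section. This example wording is illustrative, not a recommendation of specific copy:

```html
<head>
  <!-- under 160 characters, active voice, expands on the title tag -->
  <meta name="description"
        content="Learn keyword research step by step. Find what your audience searches for and pick keywords you can actually rank for.">
</head>
```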
URLs
If your website is set up for SEO success, your URL structure must be solid. But you still need a descriptive slug for each page. Google recommends using relevant words for your content. Often the easiest way to do this is to use your target keyword.
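For example, a page targeting “keyword research” might simply use that keyword as its slug (the domain here is hypothetical):

```
https://example.com/blog/keyword-research
```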
Image Optimization
- Use the target keyword in the image file name
- Use hyphens between words in the file name
- Do not stuff keywords
- Use descriptive alt text
- Compress images
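An image tag following these tips might look like this (the file name and alt text are invented for illustration):

```html
<!-- descriptive, hyphenated file name; alt text describes the image itself -->
<img src="/images/red-lace-wedding-dress.jpg"
     alt="Model wearing a red lace wedding dress"
     width="800" height="600">
```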
External Links
Some people think that linking to external resources is bad for SEO. There is no proof of this. According to Google, linking to other websites is a great way to add value to your users. So don’t be afraid to do it where it makes sense.
Tactics You Must Avoid Doing ON Page SEO
Thin content
While it’s common for a website to have unique pages on different topics, a legacy content strategy was to create a page for every iteration of your keywords in order to rank on page 1 for those very specific searches.
For example, if you sold wedding dresses, you might have created separate pages for “wedding dresses”, “wedding gowns”, “bridal dresses”, and “bridal gowns”, even though each page said essentially the same thing. A similar tactic for local businesses was to create multiple content pages for each city or region they wanted customers from. These “geo pages” often had the same or very similar content, with the location name as the only unique element.
Such tactics were clearly not helpful to users, so why did publishers do it? Google hasn’t always been as good at understanding word-phrase relationships (or semantics) as it is today. So if you wanted to rank on page 1 for “bridal gowns” but only had a page on “wedding dresses”, you might not have made it.
This practice produced countless thin, low-quality pages across the web, which Google addressed specifically with its 2011 update known as Panda. This algorithm update penalized low-quality pages, allowing more high-quality pages to occupy the top positions in the SERPs. Google continues to iterate on this process of demoting low-quality content and promoting high-quality content today.
Today, Google understands that a single comprehensive page on a topic serves searchers better than several weaker pages for each variation of a keyword.
Duplicate content
As the name suggests, “duplicate content” refers to content that is shared between domains or between multiple pages of the same domain. “Scraped” content goes a step further: the blatant, unauthorized use of content from other websites. This can include taking content and republishing it as-is, or modifying it slightly before republishing, without adding any original content or value.
There are many legitimate reasons for internal or cross-domain duplicate content, which is why Google recommends using a rel=canonical tag to point to the original version of the content. While you don’t need to know the details of this tag just yet, the main thing to remember for now is that your content should be unique in both wording and value.
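As a minimal sketch, the tag sits in the head of the duplicate page and points at the original (the URL here is hypothetical):

```html
<!-- placed on the duplicate page, pointing to the canonical original -->
<link rel="canonical" href="https://example.com/original-page/">
```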
Cloaking
A fundamental principle of search engine policy is to show search engine crawlers the same content that you would show a human visitor. This means that you should never hide text in your website’s HTML code that a normal visitor cannot see.
When this policy is violated, search engines call it “cloaking” and take action to prevent those pages from ranking in search results. Cloaking can be done in a variety of ways and for a variety of reasons, both positive and negative. Below is an example of a case where Spotify showed users different content than it showed Google.
Keyword stuffing In SEO
If you’ve ever been told, “You need to include {critical keyword} X times on this page,” you’ve seen keyword stuffing in action. Many people mistakenly think that if you include a keyword in your page content X times, you will automatically rank for it. The truth is that while Google looks for mentions of keywords and related concepts within your pages, the page itself should provide value beyond mere keyword usage. If a page is to be valuable to users, it shouldn’t read like it was written by a robot, so include your keywords and phrases naturally, in a way your readers can understand.
Below is an example of a keyword-filled content page that also uses another old-school method: bolding all of your targeted keywords.
Auto-generated content
Arguably one of the most objectionable forms of low-quality content is that which is automatically generated or programmatically created with the intention of manipulating search rankings and not helping users. You might recognize auto-generated content by how little sense it makes when you read it – they’re technically words, but put together by a program rather than a human.
It should be noted that advances in machine learning have produced more sophisticated auto-generated content that is only getting better over time. This is probably why, in its Quality Guidelines for auto-generated content, Google explicitly calls out the kind of auto-generated content that attempts to manipulate search rankings, rather than all auto-generated content.
What is link building?
Link building is the process of getting other websites to link to pages on your own website. The purpose of link building is to increase the “authority” of your pages in the eyes of Google, so those pages rank higher and generate more search traffic.
Why is link building important?
According to Google’s Andrey Lipattsev, links are one of the top three ranking factors on Google. So if you want your site pages to rank well in searches, you almost certainly need links.
Google (and other search engines) consider links from other websites to be “votes”. These votes help them identify which page on a given topic (out of thousands of similar pages) deserves to be at the top of search results.
Generally, pages with more backlinks tend to rank higher in search results. Our own study of one billion pages found a strong positive correlation between the number of websites linking to a page and the search traffic it gets from Google.
Conceptually, most link building tactics and strategies fall into one of the following four buckets:
- Add. Manually add links to websites.
- Ask. Reach out to website owners directly to ask for a link.
- Buy. Exchange money for links.
- Earn. Get organic links from people who visited your page.
Technical SEO
What is technical SEO?
Technical SEO is the process of optimizing your website to help search engines like Google find, crawl, understand, and index your pages. The goal is to be found and to improve rankings.
How Crawling Works
When crawling, search engines fetch page content and use the links within it to find even more pages. There are several ways to control what is crawled on your site. Here are some options.
Robots.txt
A robots.txt file tells search engines where they can and can’t go on your site.
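A minimal sketch of such a file, placed at the site root; the paths below are illustrative, not recommendations:

```
# robots.txt at https://example.com/robots.txt (paths are hypothetical)
User-agent: *
Disallow: /admin/          # keep all crawlers out of this directory
Allow: /admin/public/      # ...except this subdirectory

Sitemap: https://example.com/sitemap.xml
```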
Crawl rate
There is a crawl delay directive that you can use in robots.txt that many crawlers support. This lets you specify how often pages can be crawled. Unfortunately, Google does not respect this. For Google, you need to change the crawl rate in Google Search Console.
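A sketch of the directive, assuming a bot that honors it (the bot name and delay value are illustrative; Googlebot ignores this directive):

```
User-agent: bingbot
Crawl-delay: 10    # wait roughly 10 seconds between requests
```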
How to see crawl activity
Specifically for Google, the easiest way to see what’s being crawled is to use the Crawl Statistics report in Google Search Console, which gives you more insight into how your site is being crawled.
If you want to see all the crawling activity on your website, you need to access your server logs and maybe use a tool to better analyze the data. It can get quite advanced. However, if your hosting has a control panel like cPanel, you should have access to raw logs and some aggregators like AWstats and Webalizer.
Crawl adjustments
Every website has a different crawl budget, which is a combination of how often Google wants to crawl the site and how much crawling your site can handle. More popular pages and pages that change frequently are crawled more often, while pages that appear unpopular or poorly linked are crawled less frequently.
Typically, if crawlers see signs of stress while crawling your site, they will slow down or even stop crawling until conditions improve.
After the pages are crawled, they are rendered and sent to the index. The index is the master list of pages that can be returned for search queries. Let’s talk about the index.
Add internal links
Internal links are links from one page on your website to another page on your website. They help search engines find your pages and also help those pages rank better. We have a tool in Site Audit called Internal Linking Opportunities to help you find these opportunities quickly.
This tool looks for mentions of keywords that you already rank for on your site. Then it offers them as contextual internal linking opportunities.
For example, the tool displays a mention of “faceted navigation” in our duplicate content guide. Since Site Audit knows that we have a page on faceted navigation, it suggests that we add an internal link to this page.
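In HTML terms, acting on such a suggestion just means wrapping the mention in an ordinary anchor to your own page (the URL and surrounding copy are invented for illustration):

```html
<!-- in the duplicate content guide, linking the mention to the related page -->
<p>... a common cause of duplicate URLs is
   <a href="/blog/faceted-navigation/">faceted navigation</a> ...</p>
```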
Technical Tools For SEO
Google Search Console
Google Search Console (previously Google Webmaster Tools) is a free service from Google that helps you monitor and troubleshoot your website’s appearance in its search results.
Use it to find and fix technical errors, submit sitemaps, see structured data issues, and more.
Bing and Yandex have their own versions, and so does Ahrefs. Ahrefs Webmaster Tools is a free tool that’ll help you improve your website’s SEO performance. It allows you to:
- Monitor your website’s SEO health.
- Check for 100+ SEO issues.
- View all your backlinks.
- See all the keywords you rank for.
- Find out how much traffic your pages are receiving.
- Find internal linking opportunities.
It’s our answer to the limitations of Google Search Console.
Conclusion
If your content is not indexed, it will not be found in search engines. If something is broken and affecting search traffic, fixing it can be a priority. But for most websites, it’s probably best to spend some time on your content and links. Many of the most impactful technical projects revolve around indexing or linking.