
Best Crawlers/Indexing Blogger SEO Settings, Updated 2022

Welcome to MBF. As you know, MBF is all about blogging fundamentals.
All published articles are written in simple, short English sentences to teach blogging tips for beginners.
In that spirit, today's tutorial is Best Crawlers/Indexing Blogger SEO Settings, Updated 2022.




2022 Best SEO Blogger Settings

If you're here, that means you are looking for the best SEO settings for Blogger. In today's Blogger tutorial, I'll teach you the best, fastest, 100% practical, and SEO-friendly settings for your Blogspot website. Please stay with the MBF blog until the end of these best crawlers/indexing Blogger SEO settings.

Let's explore the best Blogger SEO settings so that your blog or website can rank faster.
Point to remember: there is no SEO plugin available for Blogger like there is for WordPress, so all SEO on Blogger must be implemented manually.

Blogger SEO Setting

Are Blogger Crawling/Indexing Settings Important?

Yes. Crawling and indexing play the most important role in the ranking of a blog or site. These Blogger SEO settings will be key to your SEO and to ranking in search engines to get traffic. If you want all of your posts indexed quickly, you have to configure them properly.

What are Crawlers?

A crawler is also called a bot, web bot, or spider. In simple words, a crawler is a program run by search engines to crawl the content of pages and posts. A crawler scans all submitted web pages, assigns them keywords, and indexes them, i.e. adds them to the search engine's database.

As a result, those web pages and posts can be shown in the search results.
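To see how a crawler decides which URLs it may fetch, here is a short sketch using Python's standard `urllib.robotparser` with rules similar to Blogger's default robots.txt. The blog address `example.blogspot.com` is a placeholder, not a real site:

```python
from urllib.robotparser import RobotFileParser

# Rules similar to Blogger's default robots.txt:
# block the /search pages, allow everything else.
rules = """\
User-agent: *
Disallow: /search
Allow: /
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Search-query pages are blocked for crawlers...
print(rp.can_fetch("*", "https://example.blogspot.com/search?q=seo"))
# ...but ordinary posts may be crawled and indexed.
print(rp.can_fetch("*", "https://example.blogspot.com/2022/01/post.html"))
```

Running this prints `False` for the search-query URL and `True` for the post URL, which is exactly the behavior the crawler settings below aim for.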

Should we allow Crawlers on Blogger?

Think about the above scenario and decide whether you want your web pages indexed in search engines. If you don't want search engines to index some of your pages (e.g. test pages, experiment pages, search-query pages, Blogger label pages, etc.), you can block them with the robots.txt file. See: Blogger SiteMap and Robots Txt Generator
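As a sketch, a minimal robots.txt that keeps crawlers out of Blogger's search-query and label pages (on Blogspot both live under the `/search` path) while leaving the rest of the blog crawlable looks like this:

```
User-agent: *
Disallow: /search
Allow: /
```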

What is indexing?

In simple terms, indexing means adding a book to a library under a label or title so that it can be found later. It is important for our blog and its posts to get indexed; otherwise, the web pages will not be shown in search results.

Consider these Crawler & indexing settings on Blogger

By default on the Blogger platform, the crawler and indexing settings are turned off.
That doesn't mean your blog will not get indexed and ranked, but it will take more time to index your blog's posts, because crawlers will crawl every page on your blog, including archive pages, search pages, 404 pages, label pages, etc.

So you should configure the advanced Blogger SEO settings. They help crawlers understand which posts and pages to crawl and which to ignore, so your posts get indexed quickly.

Custom Robots.txt for Blogger

As we discussed before, robots.txt plays a vital role in the ranking of a blog. It is the most important step to improve Blogger SEO. Here is a complete article and a free sitemap and robots.txt generator: just visit Blogger Sitemap Generator.

How to add a Custom robots.txt in Blogger

Step 1: Log in to the Blogger dashboard

Step 2: Go to the Settings section (bottom left)

Step 3: Scroll down to the Crawlers and indexing settings.

Step 4: First, you need to enable the custom robots.txt option.

Now just visit the Blogger Sitemap Generator, generate your Blogger sitemap, copy it from the tool, and paste it into the text box, as shown in the figure below.
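For reference, a typical generated robots.txt for a Blogspot blog looks roughly like the sketch below. `example.blogspot.com` is a placeholder for your own address, and the `Mediapartners-Google` section (Google's AdSense crawler) only matters if you run AdSense:

```
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: https://example.blogspot.com/sitemap.xml
```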

Custom robots.txt Blogger

Blogger Custom robots header tags

Once you enable the option for custom robots header tags, you will see that options for home page tags, archive and search page tags, and post and page tags become available. Let me explain the meaning of each tag.

  • all: no restrictions; crawlers may crawl and index the page.
  • noindex: crawl the page, but don't index it.
  • nofollow: blog posts contain both internal and external links. Crawlers automatically follow all the links on a page; if you don't want crawlers to follow the links on a page, enable this option.
  • none: both "noindex" and "nofollow" at once.
  • noarchive: removes the cached copy of the page from search results.
  • nosnippet: stops search engines from showing snippets of your web pages.
  • noodp: stops search engines from using titles and snippets taken from open directory projects such as DMOZ.
  • notranslate: stops search engines from offering translation of the page.
  • noimageindex: the post will be crawled and indexed, but its images will not be indexed.
  • unavailable_after: noindexes the post after a particular date and time.
Enable custom robots header tags in Blogger
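Under the hood, these header-tag choices end up as a robots meta tag in each page's `<head>`. For instance, selecting "noindex" and "nofollow" for a page would produce markup along these lines (illustrative, not copied from Blogger's exact output):

```html
<meta content="noindex, nofollow" name="robots">
```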

Now all the details about the robots tags are in front of you. The decision is yours: choose which to allow and which to block. That said, here are the best Blogger SEO settings for a blog or website.

Best Crawlers and Indexing Blogger SEO Settings

For the home page: all, noodp

For archive and search pages: noindex, noodp

For posts and pages: all, noodp
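In terms of the meta tags each page type would then carry, these recommended settings correspond roughly to the following (illustrative markup):

```html
<!-- Home page -->
<meta content="all, noodp" name="robots">

<!-- Archive and search pages -->
<meta content="noindex, noodp" name="robots">

<!-- Posts and pages -->
<meta content="all, noodp" name="robots">
```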

Custom robots tags for individual Blogger posts

After these changes, the crawlers and indexing settings will also appear in the post editor, at the bottom right-hand side of the post-writing dashboard. The settings can be adjusted for each individual post from the Edit Post section.

Blogger SEO Setting for individual posts
