How to Submit URL to Index Using Fetch as Google Tool

The Fetch as Google feature in Webmaster Tools provides a better way to submit new and updated URLs to Google for indexing. After you fetch a URL as Googlebot, if the fetch is successful, you'll see the option to submit that URL to Google's index. When you submit a URL this way, Googlebot will crawl it, usually within a day, and Google will then consider it for inclusion in the index. Note that Google doesn't guarantee that every URL submitted this way will be indexed; it still uses its regular processes - the same ones applied to URLs discovered in any other way - to evaluate whether a URL belongs in Google's index.

In today's tutorial, we'll show you how to submit a URL to Google's index using the Fetch as Google tool and how to fix crawl errors.

What is Fetch as Googlebot?

Google updated "Fetch as Googlebot" in Webmaster Tools (WMT) with a new feature called "Submit URL to Index." It's a powerful tool that works alongside sitemaps and lets you ask Google to include specific URLs manually. Webmasters typically use it after Webmaster Tools reports a URL-inclusion error for their site. Sometimes Googlebot's inability to crawl and index a specific URL causes Google to notify the site owner about the problem.
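You can roughly preview what Googlebot sees by requesting a page with Googlebot's user-agent string. Here's a minimal Python sketch; the URL is a placeholder, and this only imitates Googlebot's request headers, not its full crawling behavior:

import urllib.request

# Googlebot's documented user-agent string
GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

url = "http://example.com/2014/07/sample-post.html"  # placeholder URL
request = urllib.request.Request(url, headers={"User-Agent": GOOGLEBOT_UA})
with urllib.request.urlopen(request) as response:
    print(response.status, response.reason)  # 200 OK means the page is fetchable
    print(response.read(500).decode("utf-8", "replace"))  # first bytes of the response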

Why Might Googlebot Fail to Crawl and Index Particular Pages?

Googlebot crawls pages only when it has permission from the website owner. That permission is granted through the robots.txt file stored on the site's server. To allow Googlebot to crawl your whole site, format robots.txt like this:
User-Agent: Googlebot
Disallow:
Leaving Disallow empty permits Googlebot to index all of your site's content. If you don't want the crawler to index a particular URL, use the following format instead:
User-Agent: Googlebot
Disallow: /example
The disallowed path (here /example, meaning http://example.com/example) will not be crawled by Googlebot or stored in Google's index.
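If you want to double-check what your robots.txt actually permits, Python's standard library can parse it for you. A minimal sketch, assuming your site is at the placeholder domain example.com:

from urllib.robotparser import RobotFileParser

# Download and parse the live robots.txt (example.com is a placeholder)
parser = RobotFileParser()
parser.set_url("http://example.com/robots.txt")
parser.read()

# Ask whether Googlebot may fetch a given URL
url = "http://example.com/2014/07/sample-post.html"
print(parser.can_fetch("Googlebot", url))  # True if crawling is allowed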

But there are several reasons Googlebot may stop indexing pages even when robots.txt doesn't block them, such as:
  • Your page may contain malware or phishing
  • Your page may have been hacked in a spam attack
  • Your page may include material that can harm a visitor's computer
In these cases Googlebot will stop crawling and indexing your page, and you'll get a notification in Webmaster Tools.

How to Use the Fetch as Google Tool for Manual URL Inclusion?

1. Go to Google Webmaster Tools and log in to your account
2. Select your website to open its site dashboard
3. From the site dashboard, go to Crawl > Fetch as Google
4. The Fetch as Google tool will look empty if you haven't used it before
5. By default your domain name is shown followed by a slash ( / ); append the path of the URL you want to request Google to include manually in its index (see the sketch after these steps for how the full URL is assembled)

For Blogger, the URL structure will look like this:
2014/07/how-to-install-google-fonts-on.html [the permalink, including Blogger's default post-date path]
But if you use WordPress, the URL structure would be:
how-to-install-google-fonts [just the permalink slug, without .html, since WordPress doesn't construct URLs that way]
6. Click the Fetch button
7. When the Fetch Status shows Success, a button appears that says "Submit to index"

[Screenshot: Fetch as Google tool]

8. Click Submit to index
9. A popup appears immediately, letting you choose a submit method
10. Choose "Crawl this URL and its direct links" and hit OK

[Screenshot: Submit URL and all linked pages]

11. You will see a message saying the URL and linked pages were submitted to the index
12. And you're successfully done!
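To make step 5 concrete, here is a minimal sketch of how the path you type gets joined to your domain; the domain and permalinks are placeholders:

from urllib.parse import urljoin

domain = "http://example.com/"  # placeholder; Fetch as Google shows your verified domain

blogger_path = "2014/07/how-to-install-google-fonts-on.html"  # Blogger: date path + .html
wordpress_path = "how-to-install-google-fonts"  # WordPress: just the slug

print(urljoin(domain, blogger_path))    # http://example.com/2014/07/how-to-install-google-fonts-on.html
print(urljoin(domain, wordpress_path))  # http://example.com/how-to-install-google-fonts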

When you submit individual URLs, you have a limit of 50 submissions per week; if you submit URLs with all linked pages, the limit is 10 submissions per month. You can see how many submissions you have left on the Fetch as Google page. Use this for fresh content containing text; if you're trying to submit only images or videos, you should use Sitemaps instead.
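If you go the Sitemaps route, you can also ping Google over HTTP when your sitemap changes. A minimal sketch using Google's sitemap ping endpoint, assuming your sitemap lives at the placeholder location example.com/sitemap.xml:

import urllib.parse
import urllib.request

sitemap_url = "http://example.com/sitemap.xml"  # placeholder sitemap location

# Google's sitemap ping endpoint; the sitemap URL must be percent-encoded
ping = "http://www.google.com/ping?sitemap=" + urllib.parse.quote(sitemap_url, safe="")

with urllib.request.urlopen(ping) as response:
    print(response.status)  # 200 means Google received the ping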

Submit URLs to Google Without Verifying

For webmasters' convenience, Google added a new tool to WMT: the "Add your URL to Google" form. This is an updated, publicly usable tool that lets you submit pages to the index much like the Fetch as Googlebot feature, but it doesn't require verifying ownership of the site in question, so you can submit any URLs you want crawled and indexed.

Check Crawl Errors

Now it's time to check our crawl errors. We have fixed the problems, but Google may still report crawl errors for older or newer URLs; these notices are often nothing more than a reporting lag.

1. Go to Crawl > Crawl Errors
2. Select the URLs whose errors you think you've fixed and click the Mark as Fixed button. Only do this if you're confident you've fixed all the problems.

[Screenshot: Mark as Fixed in Google Webmaster Tools]

3. This will give you a fresh, healthy Crawl Errors report after the fix, which may take up to 24 hours to appear.

Check Crawl Statistics

Afterwards you may want to see an improvement in your crawl report in GWT; using the Crawl Stats tool you can see an up-to-date report of Googlebot's activity on your site.

1. Go to Crawl > Crawl Stats
2. You should see reasonable growth in both Crawl Stats charts

[Screenshot: Crawl statistics report in Google Webmaster Tools]

By now, the pages Google reported errors for should already be indexed and ranking individually in search. You should also see reasonable growth in your site's impressions and clicks reports, as well as improvement in search queries.

Final Thoughts on Fetch as Googlebot and Crawl Errors

Google has upgraded almost all of the tools within GWT, and they work noticeably faster than before. You can now submit multiple sitemaps for a site and see the results reflected in the SERPs on your next searches; that is the power of GWT.

Using Fetch as Googlebot, you can give individual URLs extra weight and prompt Google to re-index pages, so that your URLs present Google with the fresh, clear content your target audience needs. The Crawl URL form can also handle many manual URL submissions without verifying in GWT, but those carry less weight with Google than Fetch as Googlebot.

Hope this tutorial helped you fix a big error in Google Webmaster Tools. Feel free to ask if you have more questions about this article.
