Default - All Robots are:
Crawl-Delay:
Sitemap: (leave blank if you don't have one)
Search Robots Settings:
Google
Google Image
Google Mobile
MSN Search
Yahoo
Yahoo MM
Yahoo Blogs
Ask/Teoma
GigaBlast
DMOZ Checker
Nutch
Alexa/Wayback
Baidu
Naver
MSN PicSearch
Restricted Directories: the path is relative to the root and must end with a trailing slash "/"
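With these settings filled in, the generated file might look something like the following (the restricted directory, delay value, and sitemap URL here are hypothetical placeholders, not the tool's actual defaults):

```
User-agent: *
Disallow: /cgi-bin/
Crawl-delay: 10
Sitemap: https://www.xyz.com/sitemap.xml
```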



Now, create a 'robots.txt' file in your site's root directory. Copy the text above and paste it into the file, or click the 'Create and save as robots.txt' button and upload the result.
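The steps above can be sketched in code. This is a minimal, hypothetical illustration of what such a generator does (not the FreeTool's actual implementation): it turns the form settings into robots.txt text and writes the file.

```python
# Hypothetical sketch of a robots.txt generator: map the form settings
# (default rule, crawl delay, sitemap, restricted directories) to file text.

def build_robots_txt(default_rule="Allow", crawl_delay=None,
                     sitemap=None, restricted_dirs=()):
    lines = ["User-agent: *"]
    # "Allow" maps to an empty Disallow; anything else blocks the whole site.
    lines.append("Disallow:" if default_rule == "Allow" else "Disallow: /")
    if crawl_delay:
        lines.append(f"Crawl-delay: {crawl_delay}")
    for d in restricted_dirs:          # each path needs a trailing slash
        lines.append(f"Disallow: {d}")
    if sitemap:                        # omitted entirely when left blank
        lines.append(f"Sitemap: {sitemap}")
    return "\n".join(lines) + "\n"

if __name__ == "__main__":
    # Write the result as robots.txt, ready to upload to the root directory.
    text = build_robots_txt(crawl_delay=10,
                            restricted_dirs=["/cgi-bin/"],
                            sitemap="https://www.xyz.com/sitemap.xml")
    with open("robots.txt", "w") as f:
        f.write(text)
```

The directory and sitemap values above are placeholders; substitute your own site's paths.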

User-agent: * # all robots
Disallow: / # are blocked from crawling any page
User-agent: Googlebot # except Googlebot,
Allow: / # which may crawl all content

Allow all web crawlers to access all content:
User-agent: *
Disallow:
Block a directory:
User-agent: *
Disallow: /listing-identify/
Block all web crawlers from all content:
User-agent: *
Disallow: /
Some search engine crawlers, such as Googlebot, accept the "Allow" directive as below for explicitly allowing access to all content:
User-agent: Googlebot
Allow: /
Block a single page:
User-agent: *
Disallow: /listing-identify/page-title.html
Combining "Disallow" and "Allow" directives in a single file is also possible; for example, the first snippet above grants access only to Googlebot while blocking all other crawlers.

Custom robots.txt generator - FreeTool

Custom Robots.txt Generator is a FreeTool that instantly generates a robots.txt file for your website. Whenever a search engine crawls a website, it first looks for the robots.txt file located at the domain's root level, e.g. www.xyz.com/robots.txt.
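How a crawler applies the rules in that file can be reproduced with Python's standard-library robots.txt parser. This is a self-contained sketch (the paths are placeholder examples): instead of fetching www.xyz.com/robots.txt over the network, we parse the rules directly.

```python
# Sketch: simulate a crawler's robots.txt check with urllib.robotparser.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
# Normally you'd call rp.set_url("https://www.xyz.com/robots.txt") and
# rp.read(); here we parse the rules inline for a self-contained demo.
rp.parse([
    "User-agent: *",
    "Disallow: /listing-identify/",
])

print(rp.can_fetch("*", "https://www.xyz.com/listing-identify/page-title.html"))  # False (blocked)
print(rp.can_fetch("*", "https://www.xyz.com/index.html"))  # True (allowed)
```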

Is it free to use?
Yes, you can generate and download your robots.txt file for free.

Is there a Chrome extension?
No, there is currently no Chrome extension for generating a robots.txt file.

Are there mobile apps for this?
Yes, there are several such apps available on the Play Store; use whichever you prefer.

Do you store any personal data?
No, we don't store any personal data: no IP addresses, no cookies, nothing.