Hello,
A few days ago I found an awesome tool that all of us can use. Whether you design mobile sites, build web sites, or just own one, you can use this robots.txt generator. Websites use a robots.txt file to tell crawlers how to treat their site. When a crawler visits your site, it looks for that robots.txt file first. If the file says not to crawl, a well-behaved robot will not crawl your site. It's that simple.
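For anyone who hasn't seen one, here's a minimal sketch of what a robots.txt file can look like. The directory and bot names below are just made-up examples, not anything from a real site:

```
# Apply these rules to all crawlers
User-agent: *
# Keep crawlers out of a hypothetical /private/ directory
Disallow: /private/

# Block one specific crawler entirely (the name "ExampleBot" is illustrative)
User-agent: ExampleBot
Disallow: /
```

The file just lives at the root of your site (yoursite.com/robots.txt), and generators like the one above build these rules for you.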

Why wouldn't you want a robot to crawl your site? Give your opinion below.