A robots.txt is a plain text file kept in your website's root directory, i.e. public_html. Search engine spiders check this file before they start crawling your website's content to learn which files, file types and directories they are not allowed to crawl. So by using a robots.txt file you can keep content you don't want appearing in search engine results out of the crawl. It even lets you block crawlers entirely, including legitimate ones like Googlebot.
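As a rough illustration, a minimal robots.txt might look like the sketch below; the directory names and the bot name are placeholders, not recommendations:

    # Rules that apply to every crawler
    User-agent: *
    Disallow: /cgi-bin/
    Disallow: /private/

    # Block one specific (hypothetical) bot from the whole site
    User-agent: BadBot
    Disallow: /

Each User-agent line starts a group of rules, and each Disallow line names a path that the matching crawlers should skip; an empty Disallow value would allow everything.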
To create a robots.txt file, visit the YellowPipe Robots.txt Generator. Fill in the fields: which URLs, files and directories you want to exclude from search engines, which spiders and crawlers to block or allow, and whether to exclude the tool's curated list of 135 additional unsafe robots that crawl for spamming and other unwanted purposes.
Finally, click the Create robots.txt button, save the generated file to your computer, and then upload it to your website's root (public_html) directory via cPanel or FTP.
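Once uploaded, the file must be reachable at the root of your domain for crawlers to find it. A quick way to confirm the upload worked is to open that URL in a browser or fetch it from the command line (example.com below is a placeholder for your own domain):

    curl http://www.example.com/robots.txt

If the command prints the rules you generated, search engine spiders will see the same file the next time they visit your site.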