A robots.txt is a plain text file kept in your website’s root directory (e.g., public_html). Search engine spiders check this file before they start crawling your website’s content to learn which files, file types, and directories they are not allowed to archive. So by using a robots.txt file you can exclude content you don’t want to appear in search engine results. It even lets you block search engine crawlers entirely, including well-behaved ones like Google’s.
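As a minimal sketch (the directory and bot names here are hypothetical placeholders), a robots.txt that keeps one directory and all PDF files out of search results, and blocks one crawler entirely, could look like this:

```text
# Rules for every crawler
User-agent: *
Disallow: /private/
# Wildcard syntax; supported by major crawlers such as Googlebot
Disallow: /*.pdf$

# Block a single (hypothetical) crawler from the whole site
User-agent: BadBot
Disallow: /
```

Note that robots.txt is advisory: well-behaved crawlers honor it, but it is not an access-control mechanism.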
This tutorial explains how you can take full control of the meta robots tag for each WordPress post or page.
The meta robots tag directly influences how search engines index a page. First, install and activate the plugin called WordPress Meta Robots. After activation you will find its meta box on every Add New and Edit page in your WordPress admin dashboard.
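For example, setting a post to “noindex, nofollow” produces a standard meta robots tag in the page’s `<head>` (this is the generic tag format, not something specific to this plugin):

```html
<head>
  <!-- Ask search engines not to index this page and not to follow its links -->
  <meta name="robots" content="noindex,nofollow">
</head>
```

Other common values include `index,follow` (the default behavior) and `noindex,follow`, which keeps the page out of results while still letting crawlers follow its links.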
Here is a tutorial for a quick and easy feature that lets you manage robots.txt files across your WordPress Multisite Network in an advanced yet simple way, without opening any cPanel or FTP interface. First, you should know that Multisite Robots.txt Manager is a free plugin for WordPress Multisite Networks only. It must be network activated, which means you cannot use it by activating it on individual subsites of your WordPress network install.
Installation & Usage: Install and network activate Multisite Robots.txt Manager. Upon activation, visit your Network Admin Dashboard -> Settings -> MS Robots.txt and start working with robots.txt files across your multisite network. The plugin provides the following useful features:
- You can instantly add sitemap URLs to all your robots.txt files.
- Allows you to manage all sites from the Network Administration Area.
- You can quickly publish preset robots.txt files to your network or a subsite.
- Allows you to manage a single website through its Website Settings admin page.
- You can run a mass update across all sites on your Multisite Network in a single click.
- You can easily create custom and unique robots.txt files for each subsite present on your network.
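As a hedged illustration of that last feature (the domain and paths are hypothetical), a custom robots.txt published to one subsite, with the sitemap URL pushed network-wide by the plugin, might read:

```text
User-agent: *
Disallow: /wp-admin/
# WordPress needs this endpoint reachable for front-end AJAX
Allow: /wp-admin/admin-ajax.php

# Sitemap entry added to every site's robots.txt via the plugin
Sitemap: https://example.com/subsite/sitemap.xml
```

Each subsite can carry its own variant of these rules while sharing the network-wide sitemap line.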
The following screenshot shows the plugin’s options page, with a quick view of its collapsible sections such as presets & examples and plugin usage:
More screenshots of the plugin are available here.
You should know that the default “Network Wide” robots.txt file is not a live robots.txt file. Deactivating this plugin doesn’t change or remove the preset option; it simply stops serving the plugin’s robots.txt files. Deleting the plugin, however, removes all of its settings from your database for all your sites.
To make a robots.txt file live for a website, either click the “Publish to Network” button, or select the website from the dropdown and click the “Change Sites” button. Then adjust the displayed robots.txt file and click the “Update this Website” button.
Both methods publish the robots.txt file to the website(s) and make it live. Clicking the [ view robots.txt ] link next to the websites dropdown shows the changes in your web browser. Visit the plugin’s website for its documentation and user guide here.