You may or may not use it.

Robots.txt file examples

The commands above are the building blocks for the rules that make up an entire robots.txt file. Writing such rules may seem difficult at first, but it is actually very simple and takes well under a minute. To sum the whole process up in one sentence: first you specify which robot or search engine you are addressing ("User-agent"), and then you list the rules that robot should follow, using "Allow" and "Disallow". Below we present some examples and describe what exactly they mean.
Blocking a single bot from accessing your site

User-agent: Googlebot
Disallow: /

Blocking the Google search engine from indexing the entire website.

Blocking all bots from accessing the site

User-agent: *
Disallow: /

Blocking all search engines from indexing the entire website.

Blocking a specific folder or file

User-agent: *
Disallow: /images/
Disallow: /support.html

Blocking access to the /images/ folder and the support.html page for all search engines.

Unlocking a single file in a blocked folder

User-agent: *
Allow: /images/zdjecie.jpg
Disallow: /images/

Blocking access to the /images/ folder, except for the zdjecie.jpg file, for all search engines.
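Before uploading anything, you can also sanity-check rules like these locally. The sketch below is only an illustration using Python's standard-library urllib.robotparser; example.com is a placeholder domain. One caveat: this parser applies rules in the order they appear (first match wins), so the Allow line is listed before the Disallow line here, which keeps its answers in line with Google's own "most specific rule wins" behaviour.

from urllib import robotparser

# The "unlock a single file in a blocked folder" rules from above,
# with the Allow line first so the first-match parser reads them
# the way Google would.
rules = """\
User-agent: *
Allow: /images/zdjecie.jpg
Disallow: /images/
"""

parser = robotparser.RobotFileParser()
parser.parse(rules.splitlines())

print(parser.can_fetch("*", "https://example.com/images/zdjecie.jpg"))  # True - explicitly allowed
print(parser.can_fetch("*", "https://example.com/images/other.jpg"))    # False - the folder is blocked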

Blocking different folders for different bots

User-agent: *
Disallow: /info/

User-agent: Bingbot
Disallow: /

User-agent: Googlebot
Disallow: /images/

Blocking the /info/ folder for all search engines, blocking the entire site for the Bing search engine, and blocking the /images/ folder for the Google search engine.

How to test your robots.txt file

You can easily test your robots.txt rules in Google Search Console to make sure the file is configured correctly. Just select your website in the panel at the top and then open the "robots.txt Tester" in the "Crawl" section. At the bottom of the page there is a URL bar where you can enter an address and submit it for testing.
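If you prefer to check the live file from a script rather than (or in addition to) Search Console, the same standard-library module can download and evaluate it. A minimal sketch, assuming example.com stands in for your own domain and Googlebot for the crawler you care about:

from urllib import robotparser

parser = robotparser.RobotFileParser()
parser.set_url("https://example.com/robots.txt")  # replace with your own domain
parser.read()  # fetch and parse the live robots.txt

# Ask whether a given crawler may fetch a given URL:
print(parser.can_fetch("Googlebot", "https://example.com/images/logo.png"))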