
1 Don’t look at Me!

2 There are situations when you don't want search engines digging through some files or indexing some pages. You create a file in the root directory called robots.txt and list those files and directories in it. Examples:
- Dynamic search-results pages that may display improperly without user input
- 404 pages
- Image directories
- Login pages
- General content that you don't want search engines seeing
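A minimal robots.txt covering content like this might look like the sketch below; the paths are hypothetical, chosen only to match the examples above.

User-agent: *
Disallow: /search
Disallow: /404.php
Disallow: /images/
Disallow: /login.php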

3 All spiders automatically look for this file in your root directory, so all you need to do is create it, upload it, and wait for the spiders to read it. This file is not a secure file. Your stuff isn't safe just because it's listed here; the file simply asks spiders not to index it. In fact, anyone can read your robots.txt file by simply adding /robots.txt to the domain name: http://whitehouse.gov/robots.txt
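To see how public the file really is, a few lines of Python from the standard library will fetch and print it; this is just a sketch using the example URL above.

import urllib.request

# Fetch and print a site's robots.txt -- anyone can do this, no login required.
with urllib.request.urlopen("http://whitehouse.gov/robots.txt") as resp:
    print(resp.read().decode("utf-8", errors="replace"))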

4 It’s easy. User-agent names the search spider you want to receive the message; if you use an *, you indicate all spiders. Preventing spiders from indexing content is done with the keyword Disallow, followed by the path to the private content. You can repeat the command for more than one path, for example:

5 User-agent: googlebot  # or * for all spiders
# My private folder path
Disallow: /private-folder/
Disallow: /404.php

If you want to disallow just a photo image folder, then you would do:

User-agent: Googlebot-Image
Disallow: /photos/
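To check how a crawler would interpret rules like these, Python's standard urllib.robotparser can be pointed at the file; a minimal sketch, assuming the rules above are served from example.com:

import urllib.robotparser

# Parse the live robots.txt and ask whether a given spider may fetch a path.
rp = urllib.robotparser.RobotFileParser()
rp.set_url("http://example.com/robots.txt")
rp.read()

# False if the path is disallowed for this agent, True otherwise.
print(rp.can_fetch("googlebot", "http://example.com/private-folder/page.html"))
print(rp.can_fetch("googlebot", "http://example.com/public-page.html"))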

6 http://www.robotstxt.org

