Robots.txt block all
NOTE: This is a very old migrated blog post. It may have incorrect formatting.

I ran into a new situation with my old web portfolio: a subdomain's keywords were affecting my main site's keywords. For search optimization this just wasn't going to fly. I couldn't get rid of the subdomain, however, as it was where I hosted all the web design projects I had in development. I needed a way to prevent robots from indexing those sites.

The solution is a simple one. Create a robots.txt file in the subdomain's root directory and include the following short lines:
User-agent: *
Disallow: /

This tells robots that respect the robots.txt file not to index the subdomain. Most of the crawlers whose search rankings you would care about will honor the file, so it's a functional solution.
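If you want to sanity-check that those two lines really do block every path for every crawler, Python's standard-library robots.txt parser can evaluate them locally. This is just an illustrative sketch; the subdomain URL `dev.example.com` is a hypothetical stand-in, not from the original post.

```python
from urllib.robotparser import RobotFileParser

# The two-line robots.txt from the post: block every user agent
# from every path on the subdomain.
rules = "User-agent: *\nDisallow: /"

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A compliant crawler calls can_fetch() before indexing a URL.
print(parser.can_fetch("Googlebot", "https://dev.example.com/project/"))  # False
print(parser.can_fetch("*", "https://dev.example.com/"))                  # False
```

Both checks come back `False`, meaning a rule-respecting bot would skip the whole subdomain.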