Are search engines finding things you don’t want them to find? Creating a robots.txt file asks their crawlers to skip folders you want to remain off of their listings. At the top of the text file, put a line saying “User-agent: *” (the asterisk means the rules apply to all crawlers).
Each line after that should name a subfolder you wish to remain free from indexing, after “Disallow:”
Example:

User-agent: *
Disallow: /administrator/
Disallow: /classes/
Disallow: /components/
Disallow: /editor/
Disallow: /images/
Disallow: /includes/
Disallow: /language/
Disallow: /mambots/
Disallow: /media/
Disallow: /modules/
Disallow: /templates/
Disallow: /uploadfiles/

Feel free to copy this example and change the folders to fit your situation. Name the file “robots.txt” and put it in the root folder of your website. Keep in mind that robots.txt is only a request: well-behaved crawlers honor it, but it is not access control, so don’t rely on it to hide anything sensitive.
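If you want to double-check your rules before uploading the file, here is a minimal sketch using Python’s standard-library robots.txt parser. The rules and the example.com URLs are placeholders standing in for your own site:

```python
# Sanity-check robots.txt rules with Python's built-in parser.
import urllib.robotparser

# A trimmed version of the example rules above.
rules = """\
User-agent: *
Disallow: /administrator/
Disallow: /images/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# A well-behaved crawler may not fetch disallowed paths...
print(rp.can_fetch("Googlebot", "https://example.com/administrator/index.php"))  # False
# ...but anything not listed is still fair game.
print(rp.can_fetch("Googlebot", "https://example.com/blog/post.html"))  # True
```

The same parser can also read your live file directly with `set_url()` and `read()` once the file is on your server.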

I added a robots.txt file to my server to keep my blog from showing up on search engines after I noticed that a lot of my hits were coming from Google and Yahoo. It took a while for the hits to drop off, but they did. I have since removed my robots.txt file and expect traffic to go back up.

*Note: I “borrowed” the text from someone else’s robots.txt file.