Robots.txt is a great tool, because it essentially lets you throw a lot of “Do Not Enter” signs up for the different bots that crawl your site (we’re mostly concerned with search engines here).
The reason you want to do this is to prevent Google and others from stumbling onto duplicate content, or from crawling pages that don’t belong in search results, like a “Thank You” page. This video will teach you how to create and use Robots.txt, so you can manage the bot traffic on your site as you see fit!
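To give you a feel for what the file looks like, here’s a minimal sketch of a robots.txt that blocks crawlers from a “Thank You” page. The paths shown are just placeholders for illustration; swap in the actual URLs on your own site:

```
# Rules for all crawlers (the * wildcard matches any user agent)
User-agent: *
# Example path only — point this at your real "Thank You" page
Disallow: /thank-you/

# Optional: tell crawlers where your sitemap lives
Sitemap: https://www.example.com/sitemap.xml
```

The file lives at the root of your domain (e.g. example.com/robots.txt), and crawlers check it before fetching anything else. Keep in mind it’s a polite request, not enforcement — well-behaved bots like Googlebot honor it, but it won’t stop bad actors.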
You can also check out the post I wrote about using Robots.txt on a WordPress blog, which has even more specific examples.
If you have any questions, drop them in the comments!