Clients often contact us about unusual bandwidth usage. In many cases the extra bandwidth has a normal cause: the client advertised the site, or worked on its SEO and the site now gets more visits. Sometimes, however, the traffic increase is caused by bad robots scanning the site. On most small and medium sites this does not consume much traffic, because there is usually not much content to crawl, but on a site with many images or videos the bandwidth used to crawl the site can be significant.
This article gives you some tools and information so you can evaluate the situation and apply any of these techniques to block bad robots in your account.
The following links explain the robots.txt file and give some examples of using it to control site indexing.
http://en.wikipedia.org/wiki/Robots_exclusion_standard
http://en.wikipedia.org/wiki/Robots_exclusion_standard#Examples
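As a starting point, here is a minimal robots.txt sketch along the lines of those examples. The user-agent and directory names below are only illustrative; check your access logs for the actual crawler names and heavy paths on your site:

```
# Block a specific crawler entirely
# ("BadBot" is an example name, not a real crawler)
User-agent: BadBot
Disallow: /

# Ask all other crawlers to skip bandwidth-heavy directories
# (adjust the paths to match your site's layout)
User-agent: *
Disallow: /images/
Disallow: /videos/
```

Keep in mind that robots.txt is advisory: well-behaved crawlers honor it, but truly bad robots often ignore it, which is where the .htaccess rules below come in.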
Below you can find an Apache project that maintains .htaccess rules you can add to your own .htaccess file to block many known bad robots.
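Rule sets like that one typically rely on Apache's mod_rewrite to reject requests by User-Agent. A minimal sketch of the technique follows; the bot names in the pattern are placeholders you would replace with the names found in such a list or in your own logs:

```
# Deny (HTTP 403) any request whose User-Agent matches the pattern.
# The names below are examples only; [NC] makes the match case-insensitive.
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (BadBot|EvilScraper|SiteSucker) [NC]
RewriteRule .* - [F,L]
```

Unlike robots.txt, these rules are enforced by the server itself, so they work even against robots that ignore the exclusion standard. Note that User-Agent strings can be faked, so this blocks only robots that identify themselves.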