Bots are software applications that perform automated tasks. They can complete tedious and repetitive work faster and more efficiently than humans.
On the internet, servers use robots.txt files to state the rules for bot behavior on the server. Up to half of all internet traffic is estimated to come from bots performing tasks on the web. Networks of compromised ("zombie") computers controlled by an attacker are called botnets.
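For example, well-behaved bots can check a site's robots.txt rules before crawling it. Here is a minimal sketch using Python's standard-library `urllib.robotparser`; the example rules and URLs are illustrative, not from any real site:

```python
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
# Instead of fetching robots.txt over the network, parse example rules inline.
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# A polite bot asks before fetching a URL.
print(rp.can_fetch("*", "https://example.com/private/data"))  # False (disallowed)
print(rp.can_fetch("*", "https://example.com/public/page"))   # True (allowed)
```

In practice a crawler would call `rp.set_url(...)` and `rp.read()` to download the site's actual robots.txt before checking URLs.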
Use Cases for Bots
- To index web content (e.g. Google’s web crawlers)
- To moderate or monitor forums for undesirable content (e.g. Twitch’s moderator bots)
- To provide round the clock customer service support (e.g. Shopify’s customer support bot)
- To simulate human conversation or entertain users via chat bots (e.g. Apple’s Siri)
Malicious bots perform actions on a server or application that its maintainers did not intend, such as denial-of-service attacks, spam, and other nefarious activities. A number of strategies have been devised to combat them. The following are some steps an organization can take to limit the damage malicious bots can do:
- Static Approach: Analyzes header information to determine a bot’s identity and blocks it if necessary.
- Challenge-based Approach: Use CAPTCHA or other Turing tests to filter bots from humans.
- Behavioral: Compares behavioral signatures to previous, known signatures of bad bots.
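The static approach above can be sketched in a few lines: inspect an incoming request's headers and block any whose `User-Agent` matches a known bad-bot signature. The signature list and the `is_blocked` helper below are hypothetical examples, not part of any real framework:

```python
# Hypothetical signatures of known bad bots (assumed for illustration).
BAD_BOT_SIGNATURES = ("badbot", "scrapybot", "evilcrawler")

def is_blocked(headers: dict) -> bool:
    """Return True if the request's User-Agent matches a known bad bot."""
    user_agent = headers.get("User-Agent", "").lower()
    # A missing or empty User-Agent is treated as suspicious here.
    if not user_agent:
        return True
    return any(sig in user_agent for sig in BAD_BOT_SIGNATURES)

print(is_blocked({"User-Agent": "BadBot/2.1"}))                 # True
print(is_blocked({"User-Agent": "Mozilla/5.0 (Windows NT 10)"}))  # False
```

Real-world bot management layers this kind of header check with the challenge-based and behavioral approaches, since a User-Agent string is trivial for an attacker to forge.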