Bots are software applications designed to perform automated tasks such as compiling analytics, indexing websites, and crawling for search engines (the good ones, anyway). The bad bots used by hackers and spammers try to take control of your system for malicious purposes: distributing spam, hosting phishing sites, and installing spyware and malware. Together, good and bad bots account for the majority of web traffic, according to a study by Incapsula, a web security and traffic-monitoring service.
While humans account for only about 40 percent of traffic, bots cover the rest, split almost equally between good and bad. The findings were based on approximately 1.45 billion bot visits over a 90-day period to thousands of websites on Incapsula's network across 249 countries. That sample is a small fraction of all bot activity on the web, but it gives a reasonable estimate of how web usage is divided between humans and bots.
Legitimate bot activity has increased considerably over the last year. This can be attributed mainly to new and refined algorithms and to Google's proactive stance on cutting down spam. Search engine bots such as those used by Google and Bing constantly scour the web to pick up website changes and deliver consistently relevant search results to users.
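As an illustration (not taken from the Incapsula report), a traffic-monitoring service's first pass at splitting visits into humans, known good bots, and other bots often starts with the request's user-agent string. The sketch below is a minimal, hypothetical version of that check; the crawler tokens are real, but real-world detection goes further, since bad bots routinely spoof user-agents (Google, for example, recommends verifying its crawlers via reverse DNS).

```python
# Hypothetical sketch of first-pass traffic classification by user-agent.
# "googlebot" and "bingbot" are genuine crawler tokens, but user-agent
# matching alone is unreliable: bad bots often impersonate these values,
# so production systems also verify the client's reverse DNS or IP range.

KNOWN_GOOD_BOTS = ("googlebot", "bingbot")

def classify_visit(user_agent: str) -> str:
    """Return a coarse traffic bucket for a raw user-agent string."""
    ua = user_agent.lower()
    if any(token in ua for token in KNOWN_GOOD_BOTS):
        return "good bot"
    # Generic self-identified automation that isn't on the allow-list.
    if any(word in ua for word in ("bot", "crawler", "spider")):
        return "unknown bot"
    return "human"

print(classify_visit("Mozilla/5.0 (compatible; Googlebot/2.1)"))  # good bot
print(classify_visit("EvilScraper/1.0 (bot)"))                    # unknown bot
print(classify_visit("Mozilla/5.0 (Windows NT 10.0)"))            # human
```

A check like this only catches bots that announce themselves; the "impersonators" the report describes are precisely the ones that would land in the "human" bucket here, which is why layered detection matters.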
Although the percentage of bad bots has decreased over the year, they still account for more than 30 percent of all traffic, according to the report. Most of these impersonators scrape market information or degrade service and cause website downtime, while the more dangerous ones install hacking tools and spread spam. They roam the network looking for loopholes to squeeze through and cause trouble.
You will find more statistics at Statista.