Less than half of the world’s web traffic is actually human. According to Internet security experts, more than 50% of website visitors are ‘bots’ and ‘spiders’. These range from benign crawlers to destructive malware that can infest websites and wreak havoc. In this article we’ll look at the different kinds of bots, good and bad, and how to reduce the risk of attacks from malicious bots.
Good Bots Vs Bad Bots
“Google and Facebook account for about 8% of the Internet’s good bots (about 4% each).”
Good bots are those sent out by search engines and social media apps to do jobs that make our internet experience easier. They fetch information from websites and make it available to us in searches and social media feeds.
Behind the picturesque scenes of smooth-scrolling web pages, battles rage between webmasters and bad bots. Website defenses are constantly bombarded by bots that are sent out to do the dirty work of hackers and spammers seeking ways to breach website security.
Up to 50% of the bots visiting websites are nasty little creatures which can seriously harm your website’s health – even forcing your site to be shut down if they are allowed to have their wicked way. The bad bots’ favorite modus operandi is the distributed denial-of-service (DDoS) attack.
As a line of defense, website and hosting server programmers set up ‘trap doors’, ‘sand pits’ and ‘honey pots’ to divert and snare the bad bots. However, the threat from malware is relentless and the bad bots do occasionally get through, no matter how tough the security measures are.
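The trap idea above can be sketched in a few lines. The following is a minimal, hypothetical illustration (the path and IPs are invented): a decoy URL is disallowed in robots.txt and linked nowhere a human would see it, so any client that requests it is by definition ignoring the rules and can be blocked.

```python
# Sketch of a 'honey pot' trap (illustrative; path and IPs are hypothetical).
# A decoy path is disallowed in robots.txt and hidden from human visitors.
# Polite bots skip it; anything that requests it gets flagged and blocked.

TRAP_PATH = "/internal/do-not-crawl/"  # decoy path no human link points to

blocklist = set()

def handle_request(client_ip: str, path: str) -> str:
    """Serve a request, flagging any client that walks into the trap."""
    if client_ip in blocklist:
        return "403 Forbidden"
    if path.startswith(TRAP_PATH):
        blocklist.add(client_ip)  # this client ignored robots.txt
        return "403 Forbidden"
    return "200 OK"

# A legitimate visitor never hits the trap...
assert handle_request("203.0.113.7", "/index.html") == "200 OK"
# ...but a misbehaving crawler that follows the hidden link is blocked,
# for the trap request and for every request after it.
assert handle_request("198.51.100.9", TRAP_PATH + "page") == "403 Forbidden"
assert handle_request("198.51.100.9", "/index.html") == "403 Forbidden"
```

Real traps are more elaborate (hidden form fields, ‘sand pit’ pages of endless links), but the flag-on-forbidden-access principle is the same.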
Search Bots, Crawlers and Feed Fetchers
Search bots, crawlers and feed fetchers are the good bots which scurry around the web. They look for content to update search engine listings and let people share information about websites and feeds. Here are 10 examples of good bots which roam the web.
| BOT NAME | % OF SITES CRAWLED | BOT TYPE |
|---|---|---|
| Baidu Spider | 89% | Search Bot |
| Yandex Bot | 73% | Search Bot |
| Soso Spider | 61% | Search Bot |
| Sogou Spider | 31% | Search Bot |
| Google Plus Share | 24% | Crawler |
| Facebook External Hit | 24% | Crawler |
| Google Feedfetcher | 22% | Feed Fetcher |
Googlebot
Google’s web crawling bot or “spider” visits websites to read sitemap data provided by webmasters. The spider then reports back to Google to update search engine listings.
Baidu Spider
This is the robot for leading Chinese search engine Baidu, which lists websites, audio files, and images.
Bingbot
Formerly MSN Bot (retired October 2010), this web-crawling bot is sent out by Microsoft to feed the Bing search engine.
Yandex Bot
The web crawler for the Internet’s fifth-largest search engine, operated by Russian Internet company Yandex, which has a 60% market share in its home country and serves more than 150 million searches per day for about 25.5 million visitors.
Soso Spider
The web crawler for Chinese search engine Soso, which ranks among the top 40 most-visited websites in the world.
ExaLead
The web crawler for France-based ExaLead, a subsidiary of Dassault Systèmes, which provides search-based applications for consumer and business users.
Sogou Spider
Crawls the web for Chinese search engine Sogou, which has an Alexa global rank of 151 (35 in China) and provides search results for about 10 billion web pages.
Google Plus Share
This is the bot behind Google’s +1 button, which lets you instantly share recommendations with friends, contacts and the rest of the web.
Facebook External Hit
This bot allows Facebook users to share links to websites of interest, fetching the header info from websites so details and images of the shared website can be displayed in Facebook posts.
Google Feedfetcher
This bot grabs RSS or Atom feeds when users add them to their Google homepage or Google Reader. However, this bot doesn’t index the feeds in Google’s other search services.
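Good bots like those above generally announce themselves and obey a site’s robots.txt file, which is how a webmaster tells crawlers what they may and may not fetch. A minimal, hypothetical example (the paths and sitemap URL are placeholders, not from this article):

```
# Hypothetical robots.txt sketch – directives that well-behaved bots honor
User-agent: *
Disallow: /private/       # keep all crawlers out of this directory
Crawl-delay: 10           # note: not every crawler supports Crawl-delay

User-agent: Googlebot
Allow: /                  # let Google's spider crawl everything public

Sitemap: https://www.example.com/sitemap.xml
```

Bear in mind robots.txt is a convention, not a defense: the good bots in the table respect it, but the bad bots discussed below simply ignore it.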
What are the Impacts of Bad Bot Attacks?
If your web pages get infected with malicious code injected by a bad bot, your website can become a source of spam. In this case, your hosting server will likely shut down your website until you clean up your code – a tedious task to fix by yourself, and an expensive job if you employ a web security company to do it for you.
Bad bots may also access private or personal information and make it public. Prevention is better than cure, but sometimes even the toughest security measures can’t stop a determined bot.
Denial of Service Attacks
Really bad bots can cause widespread disruption to web services with denial-of-service (DoS) or distributed denial-of-service (DDoS) attacks.
Both types of attack aim to flood machines or systems with illegitimate requests (traffic) so they become overloaded and cannot service legitimate web traffic. The difference is that a DoS attack is usually from a single source whereas a DDoS attack originates from many different sources.
The attacks often target high-profile web servers such as banks or credit card payment gateways and may be motivated by revenge, blackmail or activism.
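One common first line of defense against the flooding described above is per-source rate limiting, often implemented as a token bucket. The sketch below is illustrative (the rate and burst numbers are arbitrary choices, not recommendations); it helps against a single-source DoS flood, whereas a true DDoS, coming from many sources at once, usually needs upstream filtering by a provider.

```python
import time

class TokenBucket:
    """Per-client token bucket: allows `rate` requests/sec,
    with short bursts of up to `capacity` requests."""

    def __init__(self, rate: float, capacity: float):
        self.rate = rate          # tokens added per second
        self.capacity = capacity  # maximum burst size
        self.tokens = capacity    # start full
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill in proportion to elapsed time, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

# One bucket per source IP (illustrative numbers: 5 req/s, burst of 10).
bucket = TokenBucket(rate=5, capacity=10)
results = [bucket.allow() for _ in range(20)]
assert all(results[:10])          # the initial burst passes
assert results.count(True) < 20   # the flood beyond it is rejected
```

In practice a server keeps a dictionary mapping each client IP to its own bucket, so one flooding source exhausts only its own allowance.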
Bad Bots in Internet History
“Mirai” is a Japanese word meaning ‘future’. The term refers to malware which turns networked devices such as unsecured IP cameras and home routers into remotely controlled bots that can be collectively used to launch large-scale network attacks.
The Mirai botnet virus scans IP addresses looking for devices that are still using their factory default username and password. It can infect tens of thousands of devices to mount distributed denial-of-service (DDoS) attacks.
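The defensive flip side of Mirai’s technique is simple: audit your own devices for factory-default logins before the scanner finds them. This is a hypothetical sketch (the inventory and hostnames are invented; the credential pairs are of the kind found in Mirai’s leaked scan list):

```python
# Defensive sketch: check a device inventory for factory-default logins
# of the kind the Mirai botnet scans for. All data here is hypothetical.

DEFAULT_CREDENTIALS = {   # sample username/password pairs, not exhaustive
    ("admin", "admin"),
    ("admin", "password"),
    ("root", "root"),
    ("root", "12345"),
}

def at_risk(inventory):
    """Return the hosts still using a known factory-default login."""
    return [device["host"] for device in inventory
            if (device["user"], device["password"]) in DEFAULT_CREDENTIALS]

devices = [  # hypothetical inventory of networked devices
    {"host": "camera-01", "user": "admin", "password": "admin"},
    {"host": "router-01", "user": "admin", "password": "x9!TqLm2"},
]

# Only the camera still on factory defaults is flagged.
assert at_risk(devices) == ["camera-01"]
```

Changing those default passwords removes exactly the foothold Mirai relies on.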
A highly versatile and elusive malware bot which can perform many malicious tasks with the aim of holding the victim to ransom. These bots can download and execute additional malware, steal login and password information, and lock and encrypt content.
This naughty bot once infected Windows PCs on the factory production line, meaning customers bought computers already infected with the Nitol bot. It spreads mostly through USB flash drives and is used to carry out DDoS attacks.
A uniquely disruptive bot that took Windows files and randomly scrambled them to create chaos on machines running Microsoft Windows.
GITHUB: 1.35 TBPS
At the time, this was the largest DDoS attack on record. A sudden onslaught of traffic hit the developer platform GitHub on February 28, 2018, choking the system with a record-breaking 1.35 terabits per second of traffic.
OCCUPY CENTRAL, HONG KONG: 500 GBPS
This was a coordinated DDoS attack carried out by five botnets in 2014, targeting web hosting services as well as two independent sites: PopVote, an online mock election site; and Apple Daily news site, which supported a Hong Kong-based grassroots pro-democracy movement known as Occupy Central. The botnet created peak traffic levels of 500 gigabits per second.
CLOUDFLARE: 400 GBPS
Security provider and content delivery network CloudFlare was overloaded by an attack that swamped servers with 400 gigabits per second of web traffic. This malware’s strategy was to mirror and amplify traffic, which was difficult to block because it appeared to come from legitimate sources.
SPAMHAUS: 300 GBPS
This was apparently a revenge DDoS attack on nonprofit anti-spam organization Spamhaus after it blacklisted Dutch company Cyberbunker in 2013. The attack brought down the Spamhaus website and disrupted their email services.
U.S. BANKS: 60 GBPS
Six major US banks were targeted by a string of DDoS attacks in 2012 using hundreds of hijacked servers. The victims included Bank of America, JP Morgan Chase, U.S. Bancorp, Citigroup and PNC Bank. They were flooded with a variety of strategic onslaughts of traffic that totalled 60 gigabits per second.
What Kind of Websites are at Risk from Bot Attacks?
According to research presented at the Saintcon security conference, the popularity of your website determines the amount of bot traffic you’ll get compared to human web traffic. The figures revealed that about half of the web traffic arriving at emerging websites is bot traffic – evenly split between good and bad bots. More popular websites appear to attract a greater ratio of bad bots to good bots. See the table below:
| WEBSITE TYPE | DAILY HUMAN VISITS | BAD BOTS | GOOD BOTS |
|---|---|---|---|
| Small websites | 10 – 1,000 | 27.1% | 42.6% |
| Medium websites | 1,000 – 10,000 | 24.4% | 22.8% |
| Large websites | 10,000 – 100,000 | 21.0% | 16.9% |
| Alexa Minimum Viable Product (MVP) websites | 100,000+ | 18.2% | 14.5% |
Statistics portal Statista analysed data from 100,000 randomly selected domains worldwide over five years, from 2012 to 2016, totalling more than 16.7 billion visits.
In 2016, bots accounted for 51.8 percent of online traffic, comprising 22.9 percent good bots and 28.9 percent bad bots.
1 in 3 Website Visitors is a Bad Bot
Igal Zeifman, marketing director at cyber security firm Imperva, said in a blog post about the company’s web traffic research that the most alarming statistic and most persistent trend is that “every third website visitor was an attack bot”.
Imperva revealed that more than 94 percent of the 100,000 domains it analysed experienced at least one bot attack over a 90-day period.
Learn All about Good Bots and Bad Bots
Watch this comprehensive presentation by Adam Fisher from the Saintcon security conference to get in-depth information on the different types of bots and what you can do to block or encourage them.
VIDEO: Bot, Bots, Everywhere a Bot
How Do I Prevent Bot Attacks?
Website security and bot attack prevention is big business – and repairing the damage done by bad bots that penetrate a website’s defenses is costly.
Prevention is better than a cure, and the more information you want to keep secret, the higher the price you pay for website security. However, there are several ways you can keep an average website secure for little or no expense.
Keeping CMS Websites Secure from Bot Attacks
If you’re running a website with an open source content management system (CMS) such as WordPress, make sure you always have the latest version of WordPress installed on your hosting server. Bad bots and hackers actively seek out outdated versions of WordPress to exploit known vulnerabilities.
Website Security Plugins
There are plenty of free security plugins available for WordPress websites. As long as you regularly scan your website’s database and update the plugins, you should be fairly safe from bot attacks.
3 Free Bot Defense Tools
Here are three free tools for websites and network-connected computers that help prevent bots from gaining unauthorized access to your system.
1. Use CAPTCHA on Contact Forms
Bots love unsecured contact forms. To secure the contact form on your website and prevent bots from spamming you and getting into your system, add a CAPTCHA script, which displays randomly generated text that bots cannot read and which the visitor must type in before clicking ‘send’.
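In production you would use an established service such as reCAPTCHA or hCaptcha rather than rolling your own, but the server-side principle is easy to sketch. This is a simplified, hypothetical illustration (function names are invented): generate a random challenge, render it as a distorted image, and accept the form only if the visitor’s answer matches.

```python
import random
import string

def make_challenge(length: int = 6) -> str:
    """Generate random text to render as a distorted image for the visitor."""
    alphabet = string.ascii_uppercase + string.digits
    return "".join(random.choices(alphabet, k=length))

def verify(expected: str, submitted: str) -> bool:
    """Accept the form only if the typed answer matches the challenge
    (case-insensitive, surrounding whitespace ignored)."""
    return submitted.strip().upper() == expected

challenge = make_challenge()
assert verify(challenge, challenge.lower() + " ")  # human typed it correctly
assert not verify(challenge, "")                   # bot left it blank
```

A real deployment also stores the challenge server-side (e.g. in the session) so a bot cannot read the expected answer out of the page.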
2. Install Anti-Virus Software on Your Computer
Bad bots can infect your website via your computer. Install anti-virus software on your computer to prevent bots getting a foothold. AVG is a free anti-virus program offering basic protection with the option to pay for higher levels of security and automated updates. If you use the free version, make sure you manually update the software and scan your computer regularly.
3. Get Email Spam Protection
Spam emails often contain links to malicious scripts that can infect your computer with bad bots and subsequently infect your website. Activate anti-spam protection and filters from your email service provider to block known spamming bots and malware from even entering your inbox.
Watch this presentation by Cloudflare for expert information about how to protect your website from bot attacks.
Buy Web Traffic Now
Make Sure Your Managed Web Traffic is Bot Free
If you buy web traffic, make sure those visitors are only human visitors. Cheap web traffic deals will likely comprise mostly bots, as do some unscrupulous pay-per-click schemes. So it’s worth paying a little more for a web traffic plan that guarantees traffic consisting of only human visitors.
Web traffic plans by WebTrafficGeeks come with the assurance of 100% human web traffic. When you buy a WebTrafficGeeks plan, you can choose up to three niches to target as well as three different countries to source your web traffic.
Find out more about prices and plans to suit your requirements here.