How To Better Understand The Bot Ecosystem

Bots get a bad rap. They’re paralyzing web sites, scraping unique content, and taking over the Internet, right? It’s true, bad bots can do a lot of damage, but what about the good ones? We’re often talking about how to avoid bots, deter them, and block them. But much opportunity lies in figuring out how to distinguish between good and bad bots, and understanding how the distinctions change across applications and environments.

Nearly all online businesses can be affected by various types of bot traffic. This traffic may include scrapers that grab content or price information, automated “clicks” that fraudulently inflate ad revenue, and “transactional” bots that buy up limited-availability goods and services, making them unavailable to legitimate customers.

Further, there are situations where the impact of bot activity on the business may be beneficial while the impact on site performance is not. Based on analysis of traffic across the Akamai Intelligent Platform, upward of 60% of an organization’s web traffic may be generated by bots: programs that operate as an agent for a user or another program, or that simulate human activity.

Many bots play a legitimate role in online business strategies. Others harm businesses by reducing competitive advantage, getting between an organization and its customers, or committing fraud. As such, organizations need a flexible framework to manage and better understand the wide array of bots accessing their web sites every day.

So, What Is A Bot?

A bot, or web robot, is a software application that runs automated tasks over the Internet. Web bots and screen scrapers, which scrape information from web pages, may be deployed by search engines, competitors, partners, or price comparison engines, as well as many other third parties. Examples range from so-called good bots, such as Googlebot, which collect information to index and display web pages, to bots that overtax servers and can appear to be distributed denial-of-service (DDoS) attacks. Bots can be identified in a variety of ways using behavioral indicators such as the information in a request and the volume and rate of requests made.
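To make that concrete, here is a minimal sketch in Python of scoring a single request on those two indicators: the information in the request (its User-Agent header) and the recent request rate. The token list and the 10-requests-per-second threshold are illustrative assumptions, not values Akamai publishes.

```python
# A minimal, illustrative bot-scoring sketch. The token list and the
# 10-requests-per-second threshold are assumptions, not Akamai values.
from collections import deque
import time

KNOWN_BOT_TOKENS = ("googlebot", "bingbot", "crawler", "spider")

def classify_request(user_agent: str, timestamps: deque, now: float = None) -> str:
    """Label a request using two behavioral indicators: the information
    in the request (User-Agent) and the volume/rate of recent requests."""
    now = time.time() if now is None else now
    timestamps.append(now)

    # Indicator 1: the information in the request itself.
    if any(token in user_agent.lower() for token in KNOWN_BOT_TOKENS):
        return "declared-bot"

    # Indicator 2: volume and rate. More than 10 requests in the last
    # second is far beyond human browsing behavior.
    if sum(1 for t in timestamps if now - t <= 1.0) > 10:
        return "suspected-bot"

    return "likely-human"

history = deque(maxlen=100)  # sliding window of request times for one client
print(classify_request("Mozilla/5.0 (compatible; Googlebot/2.1)", history))
```

Real bot-management platforms combine many more signals than these two, but the shape of the decision is the same: inspect the request, then inspect the behavior.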

Why Are Web Bots A Problem?

From an IT perspective, aggressive or poorly coded bots can overtax the web infrastructure, cause site slowdowns, and increase latency. In some cases observed by Akamai, bots have made several thousand requests per second, far in excess of what human users generate through a web browser, often requesting the exact same content only milliseconds apart. This volume of bot traffic, even from good bots, can impose an undesired load on IT infrastructure.
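One simple way to surface that “identical requests, milliseconds apart” pattern is to track, per client and URL, how recently the same resource was fetched. The sketch below assumes this runs in a single process; the 50 ms gap is an illustrative threshold, not a published one.

```python
# Sketch: flag a client that re-requests the same URL within a tiny
# window. The 50 ms default is an illustrative assumption.
import time

last_seen: dict = {}

def is_duplicate_burst(client_ip: str, url: str, min_gap_ms: float = 50.0) -> bool:
    """True when this client fetched this exact URL less than
    min_gap_ms milliseconds ago."""
    now = time.monotonic()
    key = (client_ip, url)
    prev = last_seen.get(key)
    last_seen[key] = now
    return prev is not None and (now - prev) * 1000.0 < min_gap_ms

print(is_duplicate_burst("203.0.113.7", "/pricing"))  # False: first request
print(is_duplicate_burst("203.0.113.7", "/pricing"))  # True: repeat within 50 ms
```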

For businesses, web bots can have both a positive and negative impact. Good web bots, like those that help Internet users find your site online, are essential. Other bots, such as those operated by competitors, content aggregators or scrapers, get between a business and its customers. This results in less control over the online business strategy, with fewer sales opportunities and the potential to damage customer relationships.

What Is The Best Approach To Reduce The Negative Impacts Of Web Bots?

Traditionally, organizations have applied a one-size-fits-all approach, mitigating or simply blocking any identified bot. This can lower search-engine visibility and negatively affect online business objectives. It is also ineffective over the long term, as blocked bots often mutate and return better disguised. To effectively address bots and avoid lost opportunities, a bot-management strategy should include identification, categorization, and differential treatment of all bot types.
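In code, that “identify, categorize, treat differently” framework might reduce to a mapping from bot category to action. The categories and actions below are illustrative assumptions, not a specific vendor’s taxonomy.

```python
# Illustrative "identify, categorize, treat differently" policy table.
# Category names and actions are assumptions, not a vendor taxonomy.
from enum import Enum

class BotCategory(Enum):
    SEARCH_ENGINE = "search-engine"    # e.g. Googlebot: keep the site visible
    PARTNER = "partner-aggregator"     # e.g. OTA fare checks
    SCRAPER = "competitor-scraper"
    TRANSACTIONAL = "transactional"    # inventory-hoarding checkout bots
    UNKNOWN = "unknown"

POLICY = {
    BotCategory.SEARCH_ENGINE: "allow",
    BotCategory.PARTNER: "serve-cached",
    BotCategory.SCRAPER: "serve-alternate-content",
    BotCategory.TRANSACTIONAL: "throttle",
}

def treatment_for(category: BotCategory) -> str:
    # Unknown bots get a challenge rather than an outright block, so a
    # mutating bot is not tipped off and trained to evade detection.
    return POLICY.get(category, "challenge")

print(treatment_for(BotCategory.SCRAPER))  # serve-alternate-content
```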

For example, an airline needs to show up in the search results of the many different online travel agencies (OTAs). Those OTA bot requests, however, add load on the airline’s systems and can incur query costs on the back end. To manage this while still maintaining a presence in OTA listings, airlines can return content that is cached for minutes or hours, depending on how quickly pricing changes.
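A minimal sketch of that idea: answer bot fare queries from a short-lived cache instead of the pricing back end. Here get_live_fare() is a hypothetical stand-in for the expensive pricing query, and the 15-minute TTL is an illustrative assumption.

```python
# Sketch: serve OTA bots a cached fare quote with a TTL tuned to how
# fast prices move. get_live_fare() is a hypothetical stand-in.
import time

_fare_cache: dict = {}

def get_live_fare(route: str) -> dict:
    # Placeholder for the expensive pricing/availability query.
    return {"route": route, "fare_usd": 199.00}

def fare_for(route: str, is_bot: bool, ttl_seconds: float = 900.0) -> dict:
    now = time.time()
    if is_bot and route in _fare_cache:
        cached_at, quote = _fare_cache[route]
        if now - cached_at < ttl_seconds:  # still fresh enough for a listing
            return quote
    quote = get_live_fare(route)
    _fare_cache[route] = (now, quote)
    return quote

print(fare_for("JFK-LHR", is_bot=True))  # first call hits the back end
print(fare_for("JFK-LHR", is_bot=True))  # served from cache for ~15 minutes
```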

Another common scenario is competitors practicing dynamic pricing, adjusting their own prices based on data gathered by price-scraping bots. Managing these bots by serving them alternate content with static pricing allows retailers to keep offering human customers sale pricing without alerting competitors.
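The alternate-content response itself can be very simple once the scraper has been identified; detection is assumed to happen upstream, for example by a classifier like the earlier sketch.

```python
# Sketch: identified price scrapers see static list pricing; human
# shoppers see the live sale price. Detection is assumed upstream.
def price_response(product_id: str, list_price: float,
                   sale_price: float, is_scraper: bool) -> dict:
    if is_scraper:
        # Static pricing: dynamic-pricing competitors get no signal.
        return {"product": product_id, "price": list_price}
    return {"product": product_id, "price": sale_price}

print(price_response("sku-123", list_price=59.99, sale_price=44.99, is_scraper=True))
print(price_response("sku-123", list_price=59.99, sale_price=44.99, is_scraper=False))
```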

Lastly, transactional bots can be very frustrating for retailers and their customers when long-anticipated product releases are quickly gobbled up by bots, and in turn, show up on auction sites at significantly inflated prices. This results in dissatisfied customers who often blame the brand or retailer for lack of supply.

These kinds of bots can be slowed down, effectively putting them in a slow lane for checkout, which serves two purposes. First, the bot is not alerted that it will be unable to complete the transaction, and second, actual customers compete only against other human traffic, improving their odds of completing the purchase rather than battling the bots.
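A sketch of that slow lane: bot-classified sessions are delayed rather than rejected, so the bot is not tipped off while human shoppers proceed at full speed. The 10-second delay is an illustrative assumption.

```python
# Sketch of a checkout "slow lane": bot-classified sessions are delayed,
# not rejected. The 10-second delay is an illustrative assumption.
import asyncio

async def handle_checkout_step(session_id: str, is_bot: bool) -> str:
    if is_bot:
        await asyncio.sleep(10.0)  # tarpit: stretch out every checkout step
    # ... normal checkout processing would go here ...
    return f"step complete for {session_id}"

async def main():
    results = await asyncio.gather(
        handle_checkout_step("human-1", is_bot=False),  # finishes immediately
        handle_checkout_step("bot-1", is_bot=True),     # crawls through checkout
    )
    print(results)

asyncio.run(main())
```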

By implementing a “manage, don’t mitigate” approach to web bots, site owners can lower operating costs by reducing the infrastructure and IT overhead needed to handle the additional bot traffic, improve the (human) user experience, maintain a competitive advantage, and combat fraudulent activity.

Jason Miller is Chief Strategist of Commerce at Akamai, where he is responsible for leading the company’s e-commerce strategy, including driving thought leadership in the industry, guiding decisions on Akamai’s future technologies, and working with key business partners to drive innovation.
