Astroturfing: how to make thousands of people believe in your product?


It is already impossible to imagine the modern advertising market without the opinions of real buyers who have tried a product or service. Today you can easily find reviews for almost anything, from a toothbrush to an apartment in a new residential complex, in which buyers willingly share their impressions of their purchases.

But how often are these comments checked for accuracy? And is it possible to be sure that these reviews are left by real web users?

Where companies once had to write laudatory odes to their products almost by hand, or pay for each commissioned positive review, modern technology has greatly simplified the task and multiplied the volume of fake comments hundreds of times over.

How does this happen? Let’s figure it out.

What is astroturfing?

Astroturfing is an attempt to give the impression of widespread support for an event, company, product or project. This effect is achieved through the creation of a large number of fake comments or reviews on the Internet with one goal: to mislead the public into believing that this position is a widely held point of view [The Guardian, 2012].

Astroturfing is the practice of disguising the sponsors of a message (such as political, advertising, religious, or civic organizations) to make it appear that the message comes from, and is supported by, disinterested citizens.

The term “astroturf” comes from AstroTurf, a brand of synthetic carpeting designed to mimic natural grass and popular with the athletic community. The first surface was installed in 1966, and over the next 40 years artificial turf improved so much that it became almost impossible to distinguish from the real thing.

The meaning of this term is that instead of a true or natural grassroots effort behind the activity in question, there is a false or artificial appearance of support.

In recent years, consumers have placed less and less trust in advertising and PR campaigns, rightly suspecting that they contain little truth.

The goal of astroturfing is to hide the financial and business ties between a company and the reviews of its products, and to make those reviews as attractive and believable as possible.

Astroturf examples

Although the practice is now associated with the Internet, it has been in widespread use since newspaper editors first invented the reader’s letter page.

During any election campaign, a local newspaper could easily carry a huge number of letters from “concerned residents” objecting to the pernicious policies of one candidate or another. Often these letters were written at the request of competitors.

Today, to reduce the risk of exposure, most fake reviews appear on Internet forums and in the comment sections of blogs and newspaper sites. It is in the online space that one can realistically leave comments under different names without fear of being discovered [Big Commerce, 2020].

Here are some more examples of astroturfing:

  • Bloggers who publish product reviews for companies and pass off the ads as unbiased reviews;
  • Fake personas (sockpuppets) created on popular message boards such as Reddit, Digg, or 4chan that spread similar messages, giving the illusion that a product, service, or idea is popular;
  • Paid social media accounts that promote specific brands [Big Commerce, 2020].

American Todd Rutherford worked in the marketing department of a company that provides services to self-published writers. His task was to convince traditional media and Internet bloggers to review the books of these authors.

Soon, Rutherford came up with the idea to simplify his work and make money from it. In the fall of 2010, he launched a website where he announced that he would review any book for $99, 20 online reviews for $499, and 50 reviews for $999 [The New York Times, 2012].

Complaints quickly surfaced on online forums that the service violated the sacred relationship between reviewer and author. But along with the complaints came a large number of orders. Todd’s business took off; he was soon earning $28,000 a month [The New York Times, 2012].

Of course, no one can prove that these reviews were fictitious. Perhaps Todd and his team actually read all of these books. However, the sheer volume of rave reviews suggests that this was nothing more than astroturfing.

One book by a Rutherford client, for which the author himself ordered hundreds of reviews, became a bestseller soon after they were published. This once again shows that a large number of positive comments draws more attention to a product and increases demand for it.

Why is astroturfing so popular?

The Internet is a valuable resource that has given the world the opportunity to openly communicate, exchange information, and learn new things. But, at the same time, it is a gold mine for corporate lobbyists, viral marketers and government political strategists who can work in cyberspace without any restrictions, liability and fear of detection.

More evidence is accumulating every month that online comment threads and forums are being taken over by people who are not who they say they are.

Often, such deception occurs when the interests of companies or governments conflict with those of society. Instead of changing their strategy, they set about building a positive image with fake accolades. Tobacco companies, for example, have long used astroturfing to counter criticism directed at them [The Guardian, 2011].

As online retailers rely more and more on reviews as a sales tool, an entire industry of scammers and promoters has sprung up to buy and sell reviews for next to nothing. On thematic sites and forums, you can often find ads with offers to write a custom review for a particular product or service.

The boundless demand for positive feedback has turned the review system into an arms race of sorts: as one company accumulates more and more complimentary comments, its competitors feel compelled to collect even more of them in turn [The New York Times, 2011].

The anonymity made available by the Internet has opened up new possibilities for astroturfing operations: fake mass campaigns that give the impression that a large number of people are demanding something or, conversely, opposing certain policies.

The development and active application of new astroturfing technologies is a consequence of the growing power of free speech. It was social media that gave voice to millions and allowed genuine opposition movements to communicate their position to the masses.

Censorship of such movements has not always been effective, and only authoritarian governments have the means and the will to implement it. For businesses and less repressive governments, astroturfing is a far more attractive way to squeeze out opposition online [The Guardian, 2012].

With the help of a couple of computers and a few system administrators, you can launch a whole PR campaign that can easily change the situation in the market or political arena. One can only guess how widespread these methods are, but, according to researchers, over time, the demand for such services will only grow.

In the brutal world of online shopping, where a competing product is just a click away, retailers are willing to go to great lengths to complete a sale.

Some sellers promote themselves by anonymously publishing their own accolades. Today there is an even simpler approach: offering buyers a refund in exchange for a positive review. As the collective mind of the crowd replaces traditional advertising, positive ratings are fueling the engines of e-commerce.

Fake reviews have also caught the attention of regulators. “Advertising disguised as editorials is an old problem, but now it manifests itself in different ways,” notes FTC Associate Director of Advertising Mary C. Engle [The New York Times, 2011].

Researchers such as Bing Liu, a professor of computer science at the University of Illinois at Chicago, are also taking notice as they try to develop mathematical models to systematically debunk false evidence. “More people depend on reviews about what to buy and where to go, so there are more incentives to counterfeit,” notes Liu [The New York Times, 2011].

The professor estimates that about a third of all consumer reviews on the Internet are fake. However, it is almost impossible to tell whether reviews were written by marketers or retailers (or the authors themselves under pseudonyms), by buyers (who may have struck a deal with the seller for a good rating), or by a hired third party or specialist agency.

Linchi Kwok, an associate professor at Syracuse University, studies the relationship between social media and the hospitality industry. He explains that as online shopping has become more “social” and customer reviews have become an important part of the selling proposition, marketers have realized they need to monitor and manage those opinions just like any other marketing campaign. “Everyone is trying to do something to look their best,” he said. “Some of them, if they can’t create credible reviews, they can hire someone to do it” [The New York Times, 2011].

For example, the impact of TripAdvisor on the travel industry as the world’s largest hotel review site has long been evident. A good review can make a hotel great, just as a negative review can ruin its reputation for years to come.

It’s no surprise that many hotels entice guests to write glowing reviews in exchange for money, complimentary rooms, or discounted meals. Some 30 hotels around the world have already been blacklisted over suspicious reviews, and there is a thriving black market where hotels can pay people to leave positive comments about their stay [Mail Online, 2014].

For example, the British media accused The Cove, an upscale hotel in Cornwall, of pressuring guests to post an “honest but positive review” on TripAdvisor in exchange for a 10 percent discount on a future stay. The hotel said the scheme was a misinterpreted loyalty program, but TripAdvisor posted a warning about suspicious positive comments on the hotel’s page [Mail Online, 2014].

Negative reviews also abound on the web, often commissioned by competitors of the restaurants and hotels they target. But, according to Trevor J. Pinch, a sociologist at Cornell University, “there is definitely a bias towards positive comments” [The New York Times, 2011].

Pinch’s interviews with over 100 of Amazon’s top-ranking reviewers revealed that only a few of them wrote anything critical. One reviewer said: “I prefer to praise those I love rather than curse those I didn’t love!”

However, the fact that almost all of the lead reviewers in his study said they received free books and other materials from publishers and others asking for good reviews may also have something to do with it [The New York Times, 2011].

Latest trends in astroturfing

Where companies once had to pay real Internet users to write positive reviews or turn to special agencies, today the process is far more efficient and takes minimal time. New software allows any organization with the means and the technology to astroturf on a much larger scale than ever before.

Some large companies are now using sophisticated software to create entire armies of virtual commentators with fake IP addresses, apolitical interests, and online histories. Convincing profiles are created automatically and take months or years to develop before they can be used in a political or advertising campaign. As the software improves, it will become increasingly difficult to detect these accounts [The Guardian, 2012].

Thanks to improvements in computer technology, the software used for astroturfing performs a number of interesting functions:

  1. Multiplies the effect of each comment, making it seem as if the campaign has strong support from a large fan base.
  2. Fleshes out each fictitious user with a name, email accounts, web pages, and social media accounts, making it difficult to distinguish a bot from a real commentator.
  3. Creates a fictitious social media search and share history, reinforcing the impression that account holders are real and active.

With proper use of social media, astroturfing professionals can create the impression that a person was actually present at the event in question [Daily Kos, 2011].

There are many tricks and techniques thanks to which no one will suspect that the author of a comment first appeared on the Internet a minute ago with the sole purpose of defending a position favorable to their client.

The United States Air Force has solicited bids from companies to supply persona management software that would perform the following tasks:

  1. Create 10 personas per user, complete with backstories, supporting details, and web traces that are technically, culturally, and geographically consistent. Personas should appear to come from almost any part of the world and be able to interact through common online services and social media platforms.
  2. Automatically provide randomly selected IP addresses through which the personas access the Internet, changing every day.
  3. Create static IP addresses for each persona so that the same fake account looks like the same person over time, while allowing operators who frequently visit the same sites to switch IP addresses easily and appear as ordinary users rather than a single entity [FedBizOpps.gov, 2010].

Such software, according to researchers, could destroy the Internet as a forum for constructive discussion and threaten the notion of online democracy.

Fight against astroturfing

Astroturfing is a highly controversial practice that is widely frowned upon and, in some jurisdictions, illegal. How far a business can go to get a good review is a question with no clear answer.

Many forms of online content often walk a fine line between propaganda and astroturfing. Such “efforts” tend to be perceived very negatively by the general public. If the incident becomes widespread or widely publicized enough, then falling into the trap of open astroturfing can embarrass an online business and potentially cause significant damage to its public relations.

Of course, whether to believe what you read is each person’s own choice. As practice shows, the ability to think critically and logically is one of the best defenses against falling into the trap of fake reviews. The online program “Cognitive Science” devotes one of its lessons to making decisions and choices based on your own logical conclusions, but that is an aside.

Many Western countries have clear laws governing overt or particularly misleading forms of astroturfing. For example, in both the US and the European Union, all paid promoters are required by law to disclose their financial relationship with a given company, including bloggers and social media creators.

The Federal Trade Commission has issued guidelines stating that all online endorsements must clearly disclose any financial relationship. Enforcement of these guidelines, however, is minimal, and it is almost impossible to prove that such a financial relationship exists between customer and contractor [The New York Times, 2012].

Researchers estimate that up to one-third of all online reviews are fake, which unfortunately makes some form of astroturfing commonplace, though the exact number of fake reviews on the Web is difficult to determine.

A team of researchers at Cornell University has tackled this issue, publishing a paper on a computer algorithm for detecting fake reviews [Proceedings of the 49th Annual Meeting of the Association for Computational Linguistics, 2011].

The Cornell researchers battled what they call “deceptive opinion spam” by commissioning freelance writers on Mechanical Turk, Amazon’s crowdsourcing marketplace, to produce 400 positive but fake reviews of Chicago hotels.

They then added 400 positive TripAdvisor reviews that they believed were genuine and asked three human judges to tell them apart. The judges performed no better than chance [The New York Times, 2011].

So the team developed an algorithm to distinguish fake reviews from real ones, which worked about 90% of the time [Proceedings of the 49th Annual Meeting of the Association for Computational Linguistics, 2011].

The forgeries tended to describe the author’s hotel experience with many superlatives, yet the descriptions themselves were not very good. The authors of fake comments also used the pronoun “I” more often, as if trying to convince readers of the review’s reliability and truthfulness [The New York Times, 2011].
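The linguistic signals described above can be illustrated with a toy feature counter. This is a deliberately simplified sketch: the word lists, function name, and example reviews are invented for demonstration, and the actual Cornell classifier was a statistical model trained on n-gram features, not a hand-written rule set.

```python
import re

# Toy illustration of two signals associated with fake reviews:
# superlatives and first-person singular pronouns. These word lists
# are invented for demonstration purposes only.
SUPERLATIVES = {"best", "greatest", "amazing", "perfect", "finest"}
FIRST_PERSON = {"i", "me", "my", "mine", "myself"}

def deception_signals(review: str) -> dict:
    """Count crude indicators of a possibly fake review."""
    words = re.findall(r"[a-z']+", review.lower())
    return {
        "superlatives": sum(w in SUPERLATIVES for w in words),
        "first_person": sum(w in FIRST_PERSON for w in words),
        "total_words": len(words),
    }

fake_like = "I loved it! The best, most amazing hotel I have ever seen. Perfect!"
real_like = "The room was clean and the staff answered questions quickly."

print(deception_signals(fake_like))  # high superlative and pronoun counts
print(deception_signals(real_like))  # low counts on both signals
```

A real detector would feed features like these (plus word and bigram frequencies) into a trained classifier rather than relying on fixed word lists.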

The scientists were immediately approached by dozens of companies, including Amazon, Hilton, TripAdvisor, and several specialized travel sites, all strongly interested in curbing the spread of fake reviews.

Conclusion

Consumer reviews are powerful because, unlike old-style advertising and marketing, they give the illusion of truth. They are ostensibly testimonies of real people, although some of them are bought and sold just like everything else on the commercial Internet.

But with effective marketing, customer service, and product quality, astroturfing becomes unnecessary: if discovered, the loss of credibility and the risk of a lawsuit outweigh any potential benefit.

Calvin