The books are the result of a swirling mix of modern tools: AI apps that can produce text and fake portraits; websites with a seemingly endless array of stock photos and graphics; self-publishing platforms — such as Amazon’s Kindle Direct Publishing — with few guardrails against the use of AI; and the ability to solicit, purchase and post phony online reviews, which runs counter to Amazon’s policies and may soon face increased regulation from the Federal Trade Commission.
The use of these tools in tandem has allowed the books to rise near the top of Amazon search results and sometimes garner Amazon endorsements such as “#1 Travel Guide on Alaska.”
A recent Amazon search for the phrase “Paris Travel Guide 2023,” for example, yielded dozens of guides with that exact title. One, whose author is listed as Stuart Hartley, boasts, ungrammatically, that it is “Everything you Need to Know Before Plan a Trip to Paris.” The book has no further information about the author or publisher. It also has no photographs or maps, although many of its competitors have art and photography easily traceable to stock-photo sites. More than 10 other guidebooks attributed to Stuart Hartley, relying on the same cookie-cutter design and using similar promotional language, have appeared on Amazon in recent months.
The New York Times also found similar books on a much broader range of topics, including cooking, programming, gardening, business, crafts, medicine, religion and mathematics, as well as self-help books and novels, among many other categories.
Amazon declined to answer a series of detailed questions about the books. In a statement provided by email, Lindsay Hamilton, a spokesperson for the company, said Amazon is constantly evaluating emerging technologies. “All publishers in the store must adhere to our content guidelines,” she wrote. “We invest significant time and resources to ensure our guidelines are followed and remove books that do not adhere to these guidelines.”
The Times ran 35 passages from the Mike Steves book through an AI detector from Originality.ai. The detector works by analysing millions of records known to be created by AI and millions created by humans, and learning to recognize the differences between the two, said company founder Jonathan Gillham.
The detector assigns a score between 0 and 100, based on the percentage chance its machine-learning model believes the content was AI-generated. All 35 passages scored a perfect 100, meaning they were almost certainly produced by AI.
The company claims that the version of its detector used by the Times catches more than 99 per cent of AI passages and mistakes human text for AI on just under 1.6 per cent of tests.
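In outline, a detector of this kind is a text classifier: it is trained on examples labelled as AI-written or human-written, and for a new passage it reports the model’s estimated probability that the text is AI-generated, scaled to a 0–100 score. The sketch below is a minimal illustration of that idea, assuming Python with scikit-learn and a handful of invented training sentences; the `ai_score` helper is hypothetical, and none of this reflects Originality.ai’s actual, unpublished model.

```python
# Minimal illustration of a classifier-based AI-text detector.
# Not Originality.ai's model; uses scikit-learn and toy training data.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy labelled examples: 1 = AI-generated, 0 = human-written.
train_texts = [
    "Paris is a city of endless charm, offering something for every traveller.",
    "Nestled in the heart of Europe, this vibrant destination awaits your discovery.",
    "The bistro's zinc bar was sticky, and the waiter ignored us for twenty minutes.",
    "We missed the last metro and walked home along the Canal Saint-Martin.",
]
train_labels = [1, 1, 0, 0]

# A real detector would be trained on millions of examples; the pipeline is the same:
# turn text into features, then fit a probabilistic classifier.
detector = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
detector.fit(train_texts, train_labels)

def ai_score(passage: str) -> float:
    """Return a 0-100 score: the model's estimated chance the passage is AI-written."""
    prob_ai = detector.predict_proba([passage])[0][1]  # probability of class 1 (AI)
    return round(prob_ai * 100, 1)

print(ai_score("Prepare to be amazed by the timeless elegance of the City of Lights."))
```

In practice, the usefulness of such a score depends on the error rates the vendor reports, including how often genuinely human writing is mistakenly flagged as AI.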
The Times identified and tested 64 other comparably formatted guidebooks, most with at least 50 reviews on Amazon, and the results were strikingly similar. Of 190 paragraphs tested with Originality.ai, 166 scored 100, and only 12 scored under 75. By comparison, the scores for passages from well-known travel brands such as Rick Steves, Fodor’s, Frommer’s and Lonely Planet were nearly all under 10, meaning there was next to no chance that they were written by AI generators.
Amazon, AI and trusted travel brands
Although the rise of crowdsourcing on sites such as Tripadvisor and Yelp — not to mention free online travel sites and blogs and tips from TikTok and Instagram influencers — has reduced the demand for print guidebooks and their e-book versions, they are still big sellers. On a recent day in July, nine of the Top 50 travel books on Amazon — a category that includes fiction, nonfiction, memoirs and maps — were European guidebooks from Rick Steves.
Steves, reached in Stockholm about midnight after a day of researching his series’ Scandinavia guide, said he had not heard of the Mike Steves book and did not appear concerned that generative AI posed a threat.
“I just cannot imagine not doing it by wearing out shoes,” said Steves, who had just visited a Viking-themed restaurant and a medieval-themed competitor, and determined that the Viking one was far superior. “You’ve got to be over here talking to people and walking.”
Steves spends about 50 days a year on the road in Europe, he said, and members of his team spend another 300 to update their approximately 20 guidebooks, as well as smaller spinoffs.
But Pauline Frommer, editorial director of the Frommer’s guidebook series and author of a popular New York guidebook, is worried that “little bites” from the faux guidebooks are affecting their sales. She said she spends three months a year testing restaurants and working on other annual updates for the book — and gaining weight she is trying to work off.
“And to think that some entity thinks they can just sweep the internet and put random crap down is incredibly disheartening,” she said.
Amazon has no rules forbidding content generated primarily by AI, but the site does offer guidelines for book content, including titles, cover art and descriptions: “Books for sale on Amazon should provide a positive customer experience. We do not allow descriptive content meant to mislead customers or that doesn’t accurately represent the content of the book. We also do not allow content that’s typically disappointing to customers.”
Gillham, whose company is based in Ontario, said his clients are largely content producers seeking to suss out contributions that are written by AI. “In a world of AI-generated content,” he said, “the traceability from author to work is going to be an increasing need.”
Finding the real authors of these guidebooks can be impossible. There is no trace of “renowned travel writer” Mike Steves, for example, having published “articles in various travel magazines and websites,” as the biography on Amazon claims. In fact, the Times could find no record of any such writer’s existence, despite conducting an extensive public records search. (Both the author photo and the biography for Mike Steves were very likely generated by AI, the Times found.)
Gillham stressed the importance of accountability. Buying a disappointing guidebook is a waste of money, he said. But buying a guidebook that encourages readers to travel to unsafe places, “that’s dangerous and problematic,” he said.
The Times found several instances where troubling omissions and outdated information might lead travellers astray. A guidebook on Moscow published in July under the name Rebecca R. Lim — “a respected figure in the travel industry” whose Amazon author photo also appears on a website called Todo Sobre el Acido Hialurónico (“All About Hyaluronic Acid”) alongside the name Ana Burguillos — makes no mention of Russia’s ongoing war with Ukraine and includes no up-to-date safety information. (The US Department of State advises Americans not to travel to Russia.) And a guidebook on Lviv, Ukraine, published in May, also fails to mention the war and encourages readers to “pack your bags and get ready for an unforgettable adventure in one of Eastern Europe’s most captivating destinations.”
Sham reviews
Amazon has an anti-manipulation policy for customer reviews, although a careful examination by the Times found that many of the five-star reviews left on the shoddy guidebooks were either extremely general or nonsensical. The browser extension Fakespot, which detects what it considers “deceptive” reviews and gives each product a grade from A to F, gave many of the guidebooks a score of D or F.
Some reviews are curiously inaccurate. “This guide has been spectacular,” wrote a user named Muñeca about Mike Steves’ France guide. “Being able to choose the season to know what climate we like best, knowing that their language is English.” (The guide barely mentions the weather and clearly states that the language of France is French.)
Most of the questionably written rave reviews for the threadbare guides are from “verified purchases,” although Amazon’s definition of a “verified purchase” can include readers who downloaded the book for free.
“These reviews are making people dupes,” Frommer said. “It’s what makes people waste their money and keeps them away from real travel guides.”
Hamilton, the Amazon spokesperson, wrote that the company has no tolerance for fake reviews. “We have clear policies that prohibit reviews abuse. We suspend, ban, and take legal action against those who violate these policies and remove inauthentic reviews.” Amazon would not say whether any specific action has been taken against the producers of the Mike Steves book and other similar books. During the reporting of this article, some of the suspicious reviews were removed from many of the books that the Times examined. And a few books, including those by Mike Steves, were taken down. Amazon said it blocked more than 200 million suspected fake reviews in 2022.
But even when Amazon does remove reviews, it can leave five-star ratings with no text. As of Thursday, Adam Neal’s Spain Travel Guide 2023 had 217 reviews removed by Amazon, according to a Fakespot analysis, but still carried a 4.4-star rating, in large part because 24 of the 27 reviewers who omitted a written review awarded the book five stars. “I feel like my guide cannot be the same one that everyone is rating so high,” wrote a reviewer named Sarie, who gave the book one star.
Many of the books also include “editorial reviews,” seemingly without oversight from Amazon. Some are particularly audacious, such as Dreamscape Voyages’ Paris Travel Guide 2023, which includes fake reviews from heavy hitters such as Afar magazine (“Prepare to be amazed”) and Condé Nast Traveler (“Your ultimate companion to unlocking the true essence of the City of Lights”). Both publications denied reviewing the book.
‘You’ve got to be there in the field’
AI experts generally agree that generative AI can be helpful to authors if used to enhance their own knowledge. Darby Rollins, founder of The AI Author, a company that helps people and businesses use generative AI to improve their workflows and grow, found the guidebooks “very basic.”
But he could imagine good guidebooks produced with the help of AI. “AI is going to augment and enhance and extend what you’re already good at doing,” he said. “If you’re already a good writer and you’re already an expert on travel in Europe, then you’re bringing experiences, perspective and insights to the table. You’re going to be able to use AI to help organise your thoughts and to help you create things faster.”
The real Steves was less sure about the merits of using AI. “I don’t know where AI is going, I just know what makes a good guidebook,” he said. “And I think you’ve got to be there in the field to write one.”
Kolsky, who was scammed by the Mike Steves book, agreed. After returning her initial purchase, she opted instead for a trusted brand.
“I ended up buying Rick Steves,” she said.
This article originally appeared in The New York Times.
Written by: Seth Kugel and Stephen Hiltner
©2023 THE NEW YORK TIMES