March 28, 2026 ChainGPT

Paid to Post Fake Google Reviews in USDC: How Telegram and Crypto Power an Industrial Scam

“I was paid to write fake Google reviews — then my ‘bosses’ tried to scam me.” That sentence, part confession and part investigation, exposes a shadow industry in which crypto payments, messaging apps and fake branding combine to fuel large-scale online fraud.

The setup was simple and surprisingly ordinary: a Telegram message from a recruiter offering up to £800 a day to post Google Maps reviews. The targets were real hotels — a DoubleTree by Hilton, an ibis budget (part of Accor), a Travelodge, a Hyatt Place — along with dozens of smaller B&Bs and hostels across Europe. The work itself was banal: post a five-star review even if you had never set foot inside. Payment: $5 per review, sent in USDC stablecoin to a crypto wallet on a U.S. exchange.

The author of the piece accepted the work for a few days and earned $30 in total. What they uncovered, however, points to a far bigger problem: an industrialized scam ecosystem that uses human “reviewers,” crypto rails and messaging platforms to launder money, recruit victims and distort the online marketplace.

How the operation worked

- Initial contact came through Telegram from an account using the name “Sharon Roberts,” almost certainly a pseudonym. After days of outreach she passed the author to another contact, “Victoria Castillo,” who acted as a supervisor.
- Victoria coached the author through creating a crypto wallet and receiving USDC payments. When asked about the legal obligation to declare crypto income, Victoria advised: “You can ignore this one.”
- The scammers branded their Telegram channels with the names and logos of legitimate firms — for example, two channels mimicking the listed Quad Marketing Agency, with 16,800 and 14,700 subscribers respectively — making the scheme look more credible to recruits.
- Job posts were abundant and consistent: up to 14 review tasks per day, paid initially at $5 per review. The channels had posted nearly 6,000 requests for fake reviews since March 12.
- In parallel with posting reviews, recruits were sometimes asked to perform “business tasks”: send a small amount of crypto and receive back the same amount plus a commission. Those transactions were a gateway to a classic employment-scam escalation: pay-to-upgrade schemes promising larger returns for larger initial deposits (a chart on the channel showed a path from a $50 payback task up to a $16,000 reward after a $10,000 outlay).

Why crypto matters here

- Payments in USDC make it easy for operators to send large numbers of micro-payments quickly and globally. Chainalysis, the blockchain analytics firm, found that wallets involved in the scheme followed a pattern consistent with industrial fraud: wallets were topped up, then dispersed tens of thousands of small payments to recruits before transferring larger sums onward. Typical wallets paid out between $300,000 and $600,000 in USDC before moving funds.
- Crypto also enables money-obfuscation techniques such as “tumbling,” in which funds are split and recombined to mask their origin — a service criminal groups prize.
- Regulators have tried to keep pace: under UK rules introduced last April, platforms that host reviews must have clear policies to prevent and remove fake or incentivised reviews. Google has agreed to step up detection and removal, and says it has removed more than 240 million fake reviews since 2024 and restricted 900,000 accounts for policy violations.

Who’s behind it — and who’s affected

- The scam appears to be run at scale from low-regulation jurisdictions, with investigators finding similar operations in Cambodia, Myanmar and Russia. In some reported cases, those running the schemes have themselves been coerced or trafficked into scam centers.
- The big hotel chains contacted for comment — Accor, Travelodge, Hilton and Hyatt Place — said they had nothing to do with the fake reviews and would pursue their removal.
- Booking.com said only customers with confirmed stays can post reviews, and that its systems and teams work to block fraudulent content.
- Experts warn that reviews are a valuable commodity for criminals: the UK Competition and Markets Authority (CMA) estimated in 2023 that fake reviews cause between £50 million and £312 million in annual consumer harm in the UK alone. The CMA has also found that 11–15% of reviews in sampled product categories were fake, and it recently opened investigations into five firms over misleading or fake reviews.

The human cost and the scam twist

- Fraud consultant Serpil Hall, who has two decades of experience fighting scams, says online fraud volume has surged over the last six to seven years and that new technologies, including generative and agentic AI, are making scams still more sophisticated. She notes that scammers are shifting to “human bots” — real people doing repetitive tasks — because platforms are getting better at detecting purely automated abuse.
- Chainalysis’s Jacqueline Burns Koven likens the operation to employment scams: initial micro-payouts build trust, then victims are asked to pay for “upgrades” or to release supposed account balances; ultimately the scammers run off with the funds.
- The final, worrying twist: after using recruits to post fake reviews and to move small amounts of crypto, operators then try to extract money from those same recruits via pay-to-participate “business tasks.” That is essentially a smaller-scale version of “pig butchering” scams, in which trust is built and later monetized through a big con.

Platform responses and enforcement

- Telegram insists it has robust anti-spam systems and uses moderators with custom AI tools to remove scams, saying it routinely takes down fraudulent content. Quad Marketing Agency and other brands whose identities were impersonated also deny any connection and say they are investigating the unauthorized use of their names and logos.
- Google says suspicious reviews were flagged and some were blocked by its automated systems. Booking.com and the large hotel groups emphasized review integrity and their existing defenses against inauthentic posts.

What this means for the crypto and review ecosystems

- For crypto users: accepting seemingly easy money via USDC or other stablecoins carries legal and financial risk. In the UK, recipients of crypto payments may have reporting obligations, and accepting funds from criminal enterprises — even unknowingly — can make the recipient part of a laundering chain.
- For consumers: fake reviews still distort purchasing decisions and can cause significant economic harm. Even where major platforms have protections, bad actors adapt.
- For platforms and regulators: this case shows the need for coordinated enforcement across messaging apps, crypto exchanges and review platforms. Blocking accounts, taking down impersonating channels and tracing blockchain flows are complementary but unevenly applied tools.

The author stopped the experiment after making $30 and revealing they were a journalist; the contacts went silent. But other recruiters immediately surfaced on Telegram offering more work. The episode is a compact example of a larger reality: crypto and messaging apps have dramatically lowered the barrier to global, scalable fraud — and the people doing small tasks to prop up scams often become the next targets.

Takeaway for readers in crypto and tech circles: vigilance matters. Don’t accept “easy” crypto gigs from anonymous channels, report impersonating accounts to platform moderators, declare crypto income as required by law, and remember that schemes that look like simple microlabor can be a front for laundering and larger scams.