March 27, 2026 ChainGPT

Paid in USDC: Telegram review gigs turn out to be a crypto-powered laundromat

I was hired to post fake Google reviews, and the people who recruited me tried to scam me out of cash. What I found illustrates how crypto, messaging apps and industrial-scale fraud are colliding to undermine online trust.

What happened

- A Telegram recruiter offering “work” contacted me, promising up to £800 a day to write Google Maps reviews. Within days I was coached to post fake five-star reviews for well-known hotels (DoubleTree by Hilton, Ibis Budget (Accor), Travelodge, Hyatt Place) and for hostels and B&Bs across Europe (Genova, Naples, Maastricht, Krakow, Brussels). I never visited the properties.
- Payments were made in USDC, a dollar-pegged stablecoin, via a US cryptocurrency exchange. The initial pay rate was $5 per review. When I asked about declaring crypto income under UK law, the recruiter advised: “You can ignore this one.”
- After a few legitimate-seeming payouts totalling $30, the operation pivoted to asking me to perform “business tasks” (sending $50 and receiving $60 back) and to “upgrade” accounts by staking ever-larger sums. A published payout chart promised rewards of up to $16,000 for an initial $10,000 outlay.

How the operation was organised

- The recruitment used a layered, division-of-labour approach. A Telegram account named “Sharon Roberts” initiated contact; she passed me to a supposed “receptionist”, “Victoria Castillo”, who walked me through wallets and payments. Reverse-image searches suggested the persona was fake.
- The scammers co-opted the branding of legitimate companies. One Telegram channel used the Quad Marketing Agency name and logo (Quad said it had no connection). That channel had roughly 16,800 subscribers; a near-identical channel had about 14,700, a glimpse of the operation’s scale.
- Channels posted work from about 8am to 7pm UK time, publishing up to 14 review tasks per day. Since 12 March a single channel had posted nearly 6,000 requests for fake reviews.
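The arithmetic of the bait is worth spelling out. Using only the dollar figures reported above (the $30 of early payouts, the $50-for-$60 “business task”, and the $10,000 “upgrade” tier), a minimal sketch of the scammers’ economics looks like this; the framing of “trust-building spend” versus “capture” is my own illustration, not the operation’s actual books:

```python
# Illustrative arithmetic only: the dollar figures are those reported in
# the article; everything else about the operation is unknown.

review_pay = 5             # $ per fake review
early_payouts = 30         # small, legitimate-seeming payouts to build trust
task_bait = 60 - 50        # $10 "profit" returned on the first $50 task
stake = 10_000             # the "upgrade" outlay the payout chart pushes toward
promised_reward = 16_000   # the advertised return that never arrives

cost_to_scammer = early_payouts + task_bait  # total trust-building spend
victim_loss = stake                          # funds that disappear at the end
roi = victim_loss / cost_to_scammer

print(f"Trust-building cost: ${cost_to_scammer}")
print(f"Potential capture:   ${victim_loss} ({roi:.0f}x the bait spend)")
```

On these figures, $40 of bait positions the operation to capture a $10,000 stake, which is why the early payouts are paid promptly and in full.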
Why crypto matters

- Crypto is integral to both recruiting and payouts. Payments in USDC are fast and global, and scammers can split and recombine funds across many wallets, a rudimentary form of “tumbling” that makes illicit proceeds harder to trace.
- Blockchain analysis firm Chainalysis found that the wallets used follow a consistent pattern: they are topped up, then send out tens of thousands of small payments to workers, typically paying out between $300,000 and $600,000 in USDC before moving the bulk on. That pattern signals industrial-scale money flow rather than isolated freelance gigs.
- Chainalysis’ Jacqueline Burns Koven described the scheme as similar to employment scams in which victims are asked to pay to “upgrade” or withdraw balances after small, legitimate-seeming initial payouts: “The end goal is to run away with the funds.”

The broader picture: fake reviews and consumer harm

- Fake reviews are a lucrative enabler for many scams and dodgy sellers. The UK Competition and Markets Authority (CMA) estimated in 2023 that fake product reviews cost UK consumers between £50m and £312m annually; for the products it examined, 11–15% of reviews were fake.
- Major hotel chains contacted for comment disavowed any involvement. Travelodge and Accor said they do not create or commission fake reviews and would try to ensure fraudulent entries were removed. Booking.com said only guests with confirmed stays can post reviews.
- Google says its systems flagged some of the fake reviews I posted, and that it has removed more than 240 million fake reviews since 2024 and restricted about 900,000 accounts. Telegram points to its anti-spam moderation and custom AI tools and says it removes scams reported to it.

What the scammers may actually want

- The fake-review work itself can be useful to criminal marketplaces or unscrupulous small operators seeking positive ratings.
But fraud specialists say the bigger objective is often to recruit people into payment flows that facilitate money laundering, or to shift to overt financial scams.

- Fraud consultant Serpil Hall says scammers are getting craftier, leveraging new technologies such as generative AI and “agentic” bots, and increasingly using human workers as “human bots” to bypass automated detection. Hall also warned that some scam centres operate in countries with weak rule of law, and that in extreme cases the scammers themselves are victims of trafficking or forced labour.

The endgame: trust and regulation

- The UK introduced new rules in April requiring platforms that host reviews (such as Google) to have clear policies to detect and remove fake or incentivised reviews. Under pressure from regulators, Google says it is doing more to block fraudulent reviews.
- Even with platform defences and blockchain transparency, the combination of messaging apps, crypto payments and organised operations creates a persistent problem. For would-be review writers, the risk isn’t just earning a few dollars; it’s being drawn into laundering flows or losing funds to so-called upgrade schemes.

Bottom line

This episode shows a two-tier dynamic: fake reviews are a useful commodity for fraudsters, but more immediately the schemes serve as recruitment and money-movement vectors powered by crypto. Blockchain’s public ledger helps investigators spot patterns, but the speed and global reach of crypto, plus the opacity of messaging-app networks, mean the threat will keep evolving. Platforms, regulators and on-chain analysts will have to stay a step ahead, and consumers should be sceptical of unsolicited “work” that involves sending or receiving crypto.
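The on-chain pattern Chainalysis describes, wallets topped up with large sums that then fan out thousands of small payments to workers, is detectable precisely because the ledger is public. Below is a minimal sketch of how such a fan-out “payout hub” might be flagged, assuming a flat list of (sender, receiver, amount) transfers; the addresses, amounts and thresholds are illustrative inventions, not Chainalysis data or methodology:

```python
from collections import defaultdict

# Hypothetical transfer log: one large top-up, 1,000 small worker payouts,
# then the bulk moved on. All addresses and amounts are made up.
transfers = [
    ("funder", "hub1", 500_000),
    *[("hub1", f"worker{i}", 5) for i in range(1000)],
    ("hub1", "cashout", 480_000),
]

out_small = defaultdict(int)    # count of small outgoing payments per wallet
in_total = defaultdict(float)   # total inflow per wallet

for sender, receiver, amount in transfers:
    if amount <= 100:           # "small payment" cutoff (assumed)
        out_small[sender] += 1
    in_total[receiver] += amount

def looks_like_payout_hub(addr, min_fanout=500, min_inflow=300_000):
    """Flag wallets that receive large top-ups and then fan out many
    small payments -- the shape described in the article."""
    return in_total[addr] >= min_inflow and out_small[addr] >= min_fanout

flagged = [a for a in list(out_small) if looks_like_payout_hub(a)]
print(flagged)  # prints ['hub1']
```

Real chain-analysis tooling is far more sophisticated (clustering, heuristics across chains and exchanges), but the core signal is the same asymmetry: a few large inflows against a very long tail of tiny outflows.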