March 17, 2026 ChainGPT

ChatGPT Didn't Cure a Dog — AI Sped mRNA Research and Offers a Cautionary Tale for Crypto Hype

Did ChatGPT cure a dog’s cancer? The short answer: not exactly — but the story highlights how AI can accelerate research and why credit and nuance matter. The tale that grabbed headlines over the weekend began with Rosie, a seven-year-old Shar Pei from Australia whose owner, AI consultant Paul Conyngham, said vets gave the dog just months to live after discovering late-stage mast cell tumors. Conyngham documented a long, improvised research effort in a November 2024 thread that gained fresh attention when OpenAI co-founder Greg Brockman shared it with his large social following.

Conyngham says he used consumer AI tools, starting with ChatGPT, to map a research plan: perform genomic sequencing of healthy and tumor tissue, identify mutations driving the cancer, and look for an mRNA-based therapeutic strategy. ChatGPT pointed him to institutions and equipment, and a UNSW contact connected him to Dr. Martin Smith at the Ramaciotti Centre for Genomics, which sequenced Rosie’s tissues for roughly $3,000. The centre returned some 320 gigabytes of raw genomic data — essentially a biological “fingerprint” expressed as strings of A, T, C and G, the university reported.

Conyngham homed in on c-KIT, a protein implicated in canine mast cell tumors, and used Google DeepMind’s AlphaFold to model Rosie’s version of that protein. AlphaFold’s rendering suggested mutations consistent with the literature, and Conyngham identified an existing human cancer drug that targets proteins like c-KIT. He later connected with Prof. Palli Thordarson at the UNSW RNA Institute, whose team assembled an mRNA-LNP vaccine candidate based on the identified mutations.
Key clarifications temper the headline-friendly claim that “ChatGPT cured a dog.” The final vaccine construct, Conyngham wrote, was actually designed by xAI’s Grok, with Google’s Gemini also doing “a ton of the heavy lifting.” ChatGPT’s documented role was more narrowly about literature searches, drafting plans, and pointing to labs and sequencing options — effectively a research-navigation tool rather than the lab bench or the molecular designer. The sequencing, wet-lab work, vaccine assembly and clinical oversight were performed by established researchers and university labs trained to do exactly that.

UNSW scientists and others have publicly pushed back on over-claiming AI’s role. Dr. Kate Michie, a structural biologist, noted AlphaFold’s confidence score for the c-KIT model was low (about 54.55) and reminded readers that AlphaFold “can get stuff wrong” and requires lab-based validation. Dr. Smith confirmed AlphaFold was not used to design the mRNA vaccine. Thordarson — who posted that his lab made the mRNA-LNP — cautioned that the treatment “may not have cured Rosie” but likely “bought time,” since some tumors didn’t respond and further analysis is underway to see if those lesions carried different mutations. He also stressed that the vaccine did not act alone: its efficacy depended on co-administration of a checkpoint inhibitor, and the overall resource input (including in-kind labor and infrastructure) makes precise cost estimates difficult and the program expensive.

The episode illustrates both promise and peril in how AI-assisted science is portrayed. Unlike past high-profile AI-health missteps — notably IBM Watson for Oncology, which produced “unsafe and incorrect” recommendations and was abandoned after large expenditures — this case differs in that no one was harmed, the mRNA approach aligns with established science, and credentialed researchers handled the crucial lab work.
Still, attribution matters: presenting consumer AI as the “cure” risks obscuring the human expertise, university infrastructure, and clinical safeguards that actually turned raw genomic data into a candidate therapy. For the crypto and broader tech communities, the story is a useful cautionary tale about hype cycles. AI tools can dramatically speed discovery pipelines and lower entry barriers for curious, motivated individuals — just as decentralized tools can democratize finance and development — but they don’t replace labs, regulation, or domain expertise. Clear, accurate crediting of where AI helped versus where trained researchers did the heavy lifting will be essential as these fields intersect and mature.