Bonifield, Stevie. “Grammarly’s Sloppelganger Saga.” The Verge, 5 Apr. 2026, www.theverge.com/column/906606/grammarly-expert-review-ai-saga.
This article by Stevie Bonifield for The Verge discusses the rapid rise and fall of Grammarly’s “Expert Review” feature. Grammarly, which rebranded as Superhuman in late 2025 after acquiring the AI email platform Superhuman Mail, launched Expert Review in August 2025. The feature generated AI writing suggestions under the names of real academics and authors, such as Stephen King, Neil deGrasse Tyson, and Carl Sagan, and presented them with a verified-style checkmark icon. None of these individuals gave consent, and the feature only came under scrutiny in March 2026, when Wired reported it was using the names of deceased professors and Verge reporters discovered their own colleagues’ names attached to AI-generated advice they never gave. Superhuman’s initial response was to launch an opt-out email inbox, but after mounting backlash, the company disabled the feature entirely. Investigative journalist Julia Angwin simultaneously filed a class action lawsuit alleging violations of privacy, publicity rights, and likeness protection laws in New York and California. In an interview, Superhuman CEO Shishir Mehrotra repeatedly called Expert Review a “bad feature,” yet also floated the idea of eventually relaunching a consent-based version in which experts could train AI agents to represent them commercially.
I chose this article to share because Grammarly feels like one of the most familiar AI-adjacent tools in both college and professional life (at least for me!). With our recent exploration of copyright and AI, it also felt especially relevant. Nearly every student has encountered Grammarly in a browser extension or a Google Docs recommendation. This is a tool many of us have trusted, and this article reveals how the company was monetizing real people’s identities without their knowledge as part of that “helpful” experience. Personally, I never encountered the Expert Review feature because I disabled Grammarly on everything about a year ago, when it started rewriting my sentences and just going a little too far (although I do love a good spell-check!). In this article specifically, the Decoder podcast exchange between Patel and Mehrotra, where Patel pushes back on the CEO’s claim that the fabricated suggestions were mere “attribution,” is especially interesting. It shows how AI-generated content blurs the line between referencing someone’s work and putting words in their mouth. It made me think about the importance of CHECKING YOUR SOURCES! If you are using AI for work or school, don’t just let it hallucinate. Take AI with a grain of salt.
