Grammarly has pulled the plug on its controversial "Expert Review" feature after a class-action lawsuit filed by award-winning journalist Julia Angwin alleged the AI writing tool used the names and identities of prominent writers without their knowledge or consent. The feature was disabled on March 11, one day after the lawsuit was filed in federal court in Manhattan.
How Expert Review Worked
Launched in August 2025, Expert Review was a premium add-on priced at $12 per month. Users could upload their writing and receive real-time feedback ostensibly modeled after the editorial styles of well-known professionals, including journalists like Angwin, authors like Stephen King, and other acclaimed writers and editors.
The problem: many of those professionals had no idea their names were being used. The feature created the impression that real experts were reviewing users' work, when in reality an AI system generated all of the feedback.
The Lawsuit
Angwin's complaint, filed against Superhuman Platform Inc. (Grammarly's parent company), alleges violations of New York's right of publicity law. The suit argues that Grammarly commercially exploited the reputations of writers to market and sell a product, profiting from their professional credibility without compensation or authorization.
The lawsuit seeks damages exceeding $5 million and class-action status on behalf of all writers, journalists, and editors whose identities were used in the feature.
A Growing Pattern
The case highlights a recurring tension in the AI industry: the line between training on public information and commercially exploiting someone's identity. While courts have grappled with questions about AI training data and copyright, this case takes a different angle — it is fundamentally about right of publicity, a legal concept that predates the AI era.
For AI companies, the implications are significant. Using a public figure's name to sell a product has long been legally restricted, and attaching real names to AI-generated output may cross that line regardless of the underlying technology.
Industry Reaction
The writing community's response was swift and overwhelmingly critical. Authors, journalists, and editors took to social media to express outrage, with many reporting they had never been contacted by Grammarly about the feature.
Grammarly's decision to shut down Expert Review within 24 hours of the filing suggests the company judged the legal and reputational risks to be substantial. The lawsuit will proceed regardless of the feature's removal, however, as the complaint covers the period during which the tool was active.
What This Means for AI Products
The case could set an important precedent for how AI companies use real individuals' identities in their products. As AI tools increasingly personalize their outputs, the question of who is being impersonated — and whether they consented — will become a central legal and ethical issue across the industry.