Florida Court Says AI Can Be a “Product”: Why That Matters to You

By Jeremy D. Hollingshead, Esq.
Partner at Hollingshead & Dudley | May 27, 2025

Imagine downloading an AI chatbot app that seems harmless—maybe even helpful. Now imagine that same software becoming a source of emotional manipulation, with tragic results. That's the heartbreaking allegation at the center of a recent Florida federal court opinion that's stirring national interest.

What Was the Case About?

The case involved a 14-year-old boy in Florida who used an AI chatbot app called Character A.I., developed by Character Technologies and loosely affiliated with Google. The teen became emotionally attached—especially to a chatbot portraying a Game of Thrones character—and eventually died by suicide after a final exchange with that bot.

His mother sued, claiming the chatbot was dangerously anthropomorphic (humanlike), encouraged dependency, and lacked proper age restrictions or mental health safeguards. The lawsuit named not just the creators of the app but also Google—claiming Google provided cloud infrastructure and helped design the AI behind it.

What Did the Court Decide?

A federal judge allowed the product liability claims to proceed, ruling that AI software may, under Florida law, qualify as a “product.” This is a big deal. Product liability law usually applies to physical goods—cars, blenders, defective ladders—not code.

If the ruling is upheld on appeal or adopted by other courts, it opens the door to lawsuits where AI apps or bots cause harm—whether through bad advice, manipulative behavior, or design flaws. That would be a seismic shift in how courts treat emerging technologies.

Why It Matters to You

Whether you’re a business using AI tools or a parent of a teen interacting with AI-driven platforms, this case signals a turning point. Companies could face real legal exposure for the harms their AI products cause—even emotional or psychological harms.

At Hollingshead & Dudley, we're closely tracking these legal shifts—and we're prepared to help if AI-related negligence, dependency, or emotional harm becomes part of your legal situation.


📞 Need help navigating legal issues involving AI or technology? Contact our firm to set up a consultation.