Apple to pay $250M to settle lawsuit over Siri's delayed AI features

The story

Apple has agreed to pay $250 million to settle a class action lawsuit alleging it overpromised the arrival of Siri's AI features.
From the source
Lauren Forristal, 8:12 AM PDT · May 6, 2026

Apple has agreed to pay $250 million to settle a class action lawsuit over how it marketed its AI features ahead of the launch of the iPhone 16. The Financial Times was the first to report the news.
The lawsuit alleged that Apple exaggerated the breadth of features Apple Intelligence would bring, including a significantly upgraded version of its assistant, Siri. According to the complaint, the company created the impression that advanced AI capabilities would reach users sooner than they actually did, overstating both the readiness and the functionality of these features, particularly the promised improvements to Siri, which have yet to fully materialize.
Who and what
Key names and topics in this story: Apple, Siri.
Where to follow next
- Read the full piece at techcrunch.com
- More from our AI & prompts coverage

Related stories

Apple plans to make iOS 27 a Choose Your Own Adventure of AI models
With Apple's latest operating system updates, users will reportedly have their pick of which third-party AI models they want to use for a host of tasks.

At TechCrunch Disrupt 2026, all your M&A questions will be answered
Leaders from Coinbase, M13, and Mignano Law Group talk about how M&A is an early-stage strategy at TechCrunch Disrupt 2026. Register to hear this live.

Google updates AI search to include expert advice from Reddit and other web forums
While citing web forums and discussion boards can help users find answers to more niche queries, this design choice could also prove chaotic.

Google's Gemma 4 open AI models use "speculative decoding" to get up to 3x faster
Up to 3x the speed with no loss of quality—is it too good to be true?