According to 9to5Mac, during Apple’s fiscal Q1 2026 earnings call, CEO Tim Cook offered a rare glimpse into the adoption of Apple Intelligence. Cook stated that “the majority of users on enabled iPhones are actively leveraging the power of Apple Intelligence.” He made this comment right after noting Apple has surpassed 2.5 billion active devices globally. Cook specifically called out Visual Intelligence and Live Translation as popular features, sharing that the company has heard “powerful stories” about Live Translation’s impact. However, he provided no specific numbers on compatible iPhones or a clear definition of what “actively leveraging” actually means.
Cook’s Vague Victory
So, the majority of users on enabled devices are using it. That sounds great, right? But here’s the thing: that statement is a masterclass in selective transparency. We have no idea how many “enabled iPhones” are out there. Is it 100 million? 200 million? The 2.5 billion active devices figure is classic Apple misdirection: it counts every old iPad and MacBook sitting in a drawer, and the real addressable market for Apple Intelligence is a fraction of that. Calling out Visual Intelligence as popular is nice, but without hard metrics, it’s just corporate cheerleading. It feels like Apple is trying to create a perception of momentum before it has to show the real numbers.
What Counts as “Active”?
And what does “actively leveraging” even mean? That’s the billion-dollar question Cook left unanswered. Does opening the Writing Tools menu once count? Or do you need to use Clean Up in Photos daily? For a company that loves to tout precise metrics like App Store revenue and Services growth, this vagueness is glaring, and it probably means the threshold is very low. Apple wants that stat to sound impressive, and a strict definition wouldn’t help. My guess: if you’ve used a single AI feature in the last month, you’re counted. It’s a soft-launch metric, not proof of deep, daily integration.
The Real Test Is Coming
Look, the early buzz is probably real for features like Live Translation. That’s a genuinely useful, magic-moment kind of tool. But the trajectory is what matters. Apple Intelligence isn’t fully baked yet; we’re still waiting on server-side features and a wider rollout. The real test comes in six months: will people still be “actively leveraging” it, or will it fade into novelty? Apple’s bet is that deep, contextual AI woven into the OS will win over standalone chatbot apps. They might be right. But this earnings call snippet wasn’t proof; it was just the opening pitch. We’ll need much more than a vague majority claim to see whether this intelligence has lasting power.
Basically, Cook did what he needed to do: plant a flag and signal that the feature is being adopted. For now, that’s enough for Wall Street. For the rest of us? The proof will be in our own daily use. Have you found yourself relying on it yet, or is it still out of sight, out of mind? Let me know what you think over on Twitter or YouTube.
