Thursday, November 7th

    Apple bets on AI

    Apple Intelligence, which aims to perform tasks like paraphrasing text, summarizing mail and messages, generating custom emoji, and searching for images.
    It would have been reasonable to expect Apple to do what it has done with so many other features and apps in AI: wait, take notes, and then out-execute everyone else. Instead, it has arrived slightly behind the technological curve, and it seems the company has hit the same wall as everyone else. Like other AI, Apple Intelligence does nothing. Well, it does something. It does a few things, in fact. But like many AI tools, it looks like a very computationally intensive shortcut for performing common tasks, which isn't necessarily a bad thing, especially when the work in question (actual text analysis or generation, and so on) is more efficient to run on the device itself.

    However, the bar was set quite high. Tim Cook said at the start of Monday's "Glowtime" event that the "disruptive capabilities" of Apple Intelligence will have an "incredible impact." Craig Federighi said it will "change a lot of what you do on your iPhone."

    The promised capabilities:
    • Paraphrase snippets of text
    • Summarize mail and messages
    • Generate custom emoji and clip art
    • Search for images of people, places and events
    • Search for information

    Does this sound like a big step forward to you? Writing aids are a dime a dozen. Summarization comes standard with practically every LLM. The generated art is uninspired, and you can get images like these from any number of services. And our "dumb" voice assistants were pulling up Wikipedia entries a decade ago. Yes, there have been some improvements.

    Admittedly, these things are best done locally and privately, and they open up new possibilities for people who can't easily use a traditional touch interface. There are real gains in accessibility and usability. But literally none of this is new or interesting: there don't appear to be any major changes to these features since the post-WWDC beta release, aside from the expected bug fixes. (We'll know more when we have time to test them.)

    One would hope that "Apple's first phone built from the ground up for Apple Intelligence" would offer much more. As it turns out, the 16 won't even ship with all of the features mentioned; they'll arrive in a later update. A failure of imagination, or of technology? AI companies have already begun repositioning their products as enterprise SaaS tools (it turns out they weren't just repeating things they found online, after all) rather than the "transformative" use cases we've heard so much about. AI models can be enormously valuable in the right place, but that place doesn't seem to be in your hands.

    There's a strange disconnect between how trivial these AI capabilities are and how extravagantly they're described. Apple is increasingly leaning into the kind of breathless hype it once kept in check in favor of understatement. Monday's event was one of the least exciting in recent years, yet the language was more lavish than usual.

    So Apple, like other AI providers, is playing a multi-billion-dollar game of "what if," pretending these models are transformative and revolutionary when hardly anyone believes they are. Indeed, who could justify spending as much as these companies do when the end result does little more than what was already possible five years ago? AI models may be legitimately transformative in some areas of scientific research, some coding tasks, perhaps the design of materials and structures, and maybe (though probably not for the better) the media.

    But if we trust our own eyes and fingers rather than Cook and Federighi's reality distortion field, the features we're supposed to be excited about don't seem to do anything new, let alone revolutionary. Ironically, Apple's announcement failed to give AI an "iPhone moment."