
Is Apple finally about to launch a true voice-activated device?

Expected Features of Apple’s Built-in Apps: Where Are We Going, and What Do We Expect at WWDC 2024?

In May, Apple released a refreshed iPad Pro with a new M4 chip, the first M-series chip not to debut in a Mac. We will see M4 chips in Macs soon, but that is not what I want to discuss here. The Mac Studio and Mac Pro continue to lag behind on M2 chips, even though Apple now offers all of its laptops with M3 chips.

New and improved versions of Apple’s built-in apps are expected as part of the operating system updates for the Mac, iPad, and beyond. The Calculator app is due for a refresh, along with a new password manager and a redesigned Control Center.

At long last, Apple may finally let users arrange apps to their liking: MacRumors reports that you’ll be able to leave blank spaces between app icons in iOS 18. Apple is also reportedly building a theming system into the OS that will let you recolor icons so they match each other.

Apple has finally adopted Rich Communication Services (RCS) as the fallback for iMessage, and it will likely arrive as part of iOS 18. This means that, soon, iPhones and Android phones will be able to send longer text messages and higher-quality photos to one another, even if the bubbles stay blue and green.

Source: What to expect at Apple’s WWDC 2024

How much AI should Apple do? Why Apple never figured out how asking Siri for help should work: the case of the iPhone 4S

Rumor has it that Apple’s operating systems are about to be infused with artificial intelligence. According to Bloomberg, Apple has reworked Siri using large language models to help it better understand what users want and respond more usefully to their queries. The new version of Siri will reportedly be able to take actions for you inside Apple’s own apps, potentially making the assistant far more capable than it is today.

Every year, Apple’s WWDC comes third after Google’s I/O and Microsoft’s Build developer conferences, and Apple has hardly ever needed to announce a product in response. Things are different this time.

Over the past month, Apple’s biggest rivals presented bold plans for AI, with Google showing its latest Gemini models and Microsoft revealing powerful Copilot features like Recall. Now, Apple has to step up to the plate and show that it isn’t far behind in bringing its customers useful generative AI experiences.

This has obviously been the vision for Siri all along. You can even see it in those iPhone 4S commercials: the celebrities ask Siri for help, and Siri almost never actually finishes the job. It gives Deschanel a list of restaurants that mention delivery but doesn’t offer to order anything or show her a menu. It seems to know that Scorsese is going to be late for his meeting, but can it do anything about it? It tells Malkovich to be nice to people but offers no practical help. So far, using Siri is like having a virtual assistant with only one job: to search for you. Which is something! But it isn’t much.

The second reason Siri never quite worked is simply that neither Apple nor third-party developers ever figured out how it should work. How are you supposed to know what Siri can do or how to ask for it? How are developers supposed to hook into it? If you want to add a task to your to-do list app, you can’t just say so. You have to say, “Hey Siri, remind me to water the grass in Todoist,” which is a weird sentence that nobody talks like, and it fails about half the time anyway. The only way to pull off a multistep action is to muck around with Shortcuts, an excellent tool that doesn’t require you to write code but is still too much for most people.

But if Apple has cracked something here, this could be the first time we get to see the real Siri, the Siri we were promised all those years ago. Maybe in the next commercial, Deschanel asks for tomato soup and it magically appears at her house, and the Headspace app fires up on its own to bring some peace. Maybe we’re finally going to get the voice assistant Apple has been trying to make all along.

AI might also give Apple a chance to do an end run around the whole problem. Its researchers published a paper earlier this year detailing a system called Ferret-UI, which uses an AI model to understand the fine details of an onscreen image. The researchers even describe how a multi-model setup might work: OpenAI’s GPT-4 does a good job of broadly understanding what an image shows, while Ferret can pick out small regions and details. In practice, that might mean one system says, “This is the Ticketmaster app!” while the other figures out where the buy button is.
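To make that division of labor concrete, here is a minimal Python sketch of the two-stage idea, with both models mocked as plain functions. Everything here is invented for illustration: the function names, the toy screenshot dictionary, and the bounding boxes are assumptions, not anything from the Ferret-UI paper.

```python
# A toy version of a two-stage screen-understanding pipeline:
# stage 1 (the "GPT-4 role") broadly identifies the app on screen;
# stage 2 (the "Ferret-UI role") grounds a specific UI element to a region.
# Both stages are mocked with dictionary lookups instead of real models.

def identify_app(screenshot):
    """Broad understanding: what app is this screenshot showing?"""
    # A real system would send the image to a large multimodal model.
    return screenshot["app_name"]

def locate_element(screenshot, description):
    """Fine-grained understanding: where is the named element on screen?"""
    # A real system would run region-level grounding on image pixels.
    for element in screenshot["elements"]:
        if element["label"] == description:
            return element["bounds"]  # (left, top, right, bottom)
    return None

def plan_tap(screenshot, target_label):
    """Combine both stages: name the app, then find where to tap."""
    return {
        "app": identify_app(screenshot),
        "tap_at": locate_element(screenshot, target_label),
    }

# A toy "screenshot" standing in for a real image.
screen = {
    "app_name": "Ticketmaster",
    "elements": [{"label": "Buy", "bounds": (300, 640, 380, 680)}],
}

print(plan_tap(screen, "Buy"))
# → {'app': 'Ticketmaster', 'tap_at': (300, 640, 380, 680)}
```

The point of splitting the work this way is that the broad model never needs pixel-precise grounding and the region model never needs world knowledge; each stage answers only the question it is good at.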