Everything announced at Google I/O

MusicFX: A Real-Time AI Music Mixer Brings Party Energy to a Decidedly Non-Party Conference

Tech conferences aren’t exactly energetic or party-like affairs, or at least they aren’t known for that. But Google’s big event this year kicked off with a performance by artist Marc Rebillet, a man who wears many robes.

Rebillet’s onstage demonstration was an entertaining showcase of MusicFX’s capabilities. He typed simple prompts like “viola” into the tool, which then generated a track in that style. Taking suggestions from the audience, Rebillet built a track featuring Persian tar, djembe, and flamenco guitar, layered with improvised vocals about how 9:30 am was too early to be hosting such an event.

Users are presented with a mixer-style interface that generates music from text prompts, layering the prompts together and syncing the resulting track. The music can be changed in real time by adding new prompts to the mix. You can try it out in Google’s Test Kitchen. MusicFX was first introduced more than a year ago and is still under development.

Rebillet is best known for “Night Time Bitch,” a song in which he screams at people to get out of bed while wearing a bathrobe. That small detail of context may explain why he opened I/O by climbing out of a coffee mug, yelling at the silly little nerds to wake up, and firing rainbow-colored robes into the crowd.

AI Overviews Come to Google Search, and Google Lens Helps Repair a Record Player

WIRED’s Lauren Goode talked with Google head of search Liz Reid about all the AI updates coming to Google Search, and what they mean for the internet as a whole.

A new feature called multistep reasoning lets you surface several layers of information about a topic when you’re searching for something with contextual depth. Google used planning a trip as an example: a search in Maps helped find hotels and set transit itineraries, followed by restaurant suggestions and help with meal planning for the trip. You can narrow the search further by looking for specific types of cuisine or vegetarian options, and all of this information is presented in an organized way.

AI Overviews are summaries that pool information from multiple sources to answer the question in the search box. They appear at the top of the results, so you don’t even need to visit a website to get the answers you’re seeking. These overviews are already controversial: publishers and websites fear that a Google search that answers questions without the user needing to click any links may spell doom for sites that already go to extreme lengths to show up in Google’s search results in the first place. Starting today, AI Overviews are rolling out to everyone in the United States.

Lastly, we saw a quick demo of how users can rely on Google Lens to answer questions about whatever they’re pointing their camera at. (Yes, this sounds similar to what Project Astra does, but these capabilities are being built into Lens in a slightly different way.) The demo showed a woman trying to get a “broken” turntable to work. Through the camera, Lens correctly identified the make and model of the turntable, determined that its tonearm simply needed adjusting, and presented her with a few options for video and text-based instructions on how to do just that.

One of the last noteworthy things we saw in the keynote was a new scam detection feature for Android, which can listen in on your phone calls and detect language that sounds like something a scammer would use, such as asking you to move money into a different account. If it suspects you are being fooled, it interrupts the call and gives you an on-screen prompt to hang up. Google says the feature works entirely on-device, so your phone calls aren’t sent to the cloud for analysis, which makes the feature more private. Also, read WIRED’s guide to protecting yourself from scam calls.
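To make the idea concrete, here is a minimal sketch of flagging scam-like language in a call transcript. This is purely illustrative: Google’s feature uses an on-device machine-learning model, not the simple phrase matching and hypothetical phrase list shown here.

```python
# Toy illustration of scam-language detection. NOT Google's method:
# the real feature runs an on-device ML model, not keyword matching.
SCAM_PHRASES = [           # hypothetical phrase list for the sketch
    "move your money",
    "different account",
    "gift card",
    "wire transfer",
]

def flag_scam_language(transcript: str) -> list[str]:
    """Return any suspicious phrases found in a call transcript."""
    text = transcript.lower()
    return [phrase for phrase in SCAM_PHRASES if phrase in text]

# A call asking you to shift funds would trigger an on-screen warning.
hits = flag_scam_language("Please move your money to a different account now.")
```

The point of running this on-device, as Google describes, is that the transcript never has to leave the phone.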

Google has also expanded SynthID, its watermarking tool for identifying media made with AI, which can help people detect misinformation, deepfakes, and phishing spam. The tool embeds a watermark that is imperceptible to the naked eye but can be detected by software that analyzes the pixel-level data in an image. The new updates extend SynthID to content on the web and to Veo-generated videos, and Google expects to release SynthID as an open source tool later this summer.