
Warby Parker and Gentle Monster are Android XR partners

Google Wants Smart Glasses to Look Cool: The Case for Ray-Ban Meta and Gentle Monster

is a senior reporter focusing on wearables, health tech, and more, with 13 years of experience. She previously worked at Gizmodo and PC Magazine.

Google seems to be taking a page out of Meta’s smart glasses playbook. That’s a huge deal, and it’s a nod to the success Meta has had with its Ray-Ban smart glasses. Meta revealed in February that it’s already sold 2 million pairs and has been vocally positioning the glasses as the ideal hardware for AI assistants.

The partnership is indicative of how seriously Google is taking style. Warby Parker is well known as a direct-to-consumer eyewear brand that makes it easy to get trendy glasses at a relatively accessible price. Meanwhile, Gentle Monster is currently one of the buzziest eyewear brands that isn’t owned by EssilorLuxottica. The Korean brand is popular among Gen Z, thanks to its fashion-forward silhouettes and the fact that it’s favored by celebrities like Beyoncé and Billie Eilish. Partnering with both brands suggests the lineup will span everyday styles as well as bolder options.

As for what these XR glasses will be able to do, Google was keen to emphasize that they’re a great vehicle for Gemini. So far, Google’s prototype glasses have had cameras, microphones, and speakers so that the AI assistant can help you interpret the world around you. There were also demos of live language translation, turn-by-turn directions, and photo taking, all of which lines up with what I saw at my hands-on in December.

Whether that pans out remains to be seen, but one thing the Ray-Ban Meta glasses have convincingly argued is that for smart glasses to go mainstream, they need to look cool. Not only do Meta’s glasses look like an ordinary pair of Ray-Bans, Ray-Ban itself is an iconic brand known for its Wayfarer shape. In other words, they’re glasses the average person wouldn’t feel quite so put off wearing. Meta has also created limited-edition versions of its second-gen smart glasses, which play into the same fashion strategy as sneakers. Meta is also rumored to have versions of its smart glasses for athletes in the works.

The 15 Biggest Announcements at Google I/O 2025

If your password is compromised, Chrome will be able to generate a strong replacement for you. The feature launches later this year, and Google says it will always ask for consent before changing your passwords.

A new Google Search feature lets you see how clothes look on you using a full-length photo of yourself. It uses an AI model that “understands the human body and nuances of clothing.”

Gmail’s smart reply feature, which uses AI to suggest replies to your emails, will now use information from your inbox and Google Drive to prewrite responses that sound more like you. The feature will also take your recipient’s tone into account, allowing it to suggest more formal responses in a conversation with your boss, for example.

A new feature that translates your speech into your conversation partner’s preferred language is coming to Google Meet. It only supports English and Spanish for now and has begun rolling out in beta to subscribers.

Source: The 15 biggest announcements at Google I/O 2025

Stitch, a New AI UI Design Tool, and Flow, an AI Moviemaking App

Stitch is a new tool that can generate user interfaces from a theme and a description. You can use rough sketches, prototypes, and other UI designs to guide Stitch’s output. The experiment is available for free.

The screensharing feature is free for all Android users, and iOS users will be able to access it for free as well.

Speaking of Project Astra, Google is launching Search Live, a feature that incorporates capabilities from the AI assistant. By selecting the new “Live” icon in AI Mode or Lens, you can talk back and forth with Search while showing it what’s in your camera’s view.

Google is building its Gemini AI assistant into Chrome. Starting on May 21st, Ultra subscribers will be able to select a button in Chrome to clarify or summarize information on the webpage they’re viewing. The feature can be used with up to two tabs for now, with support for more coming later this year.

Google is also launching a new moviemaking app called Flow. The tool uses Google’s Veo and Imagen models to create short AI-generated videos, and it comes with scene-builder tools to stitch clips together into longer AI films.


Deep Think, Project Starline, and Project Mariner: What’s New at Google I/O 2025

Deep Think is an enhanced reasoning mode for math- and coding-related queries. It’s capable of considering “multiple hypotheses before responding” and will be available to trusted testers first.

The Gemini 2.5 Flash model is now available to everyone in the Gemini app, and improvements to the cost-efficient model are coming to Google AI Studio ahead of wider deployment.

The latest prototype will be able to complete tasks on your behalf, even if you don’t explicitly ask it to. The model can also choose to speak up based on what it’s seeing, such as pointing out a mistake on your homework.

Project Starline, which began as a 3D video chat booth, is taking a big step forward. It’s becoming Google Beam and will soon launch inside an HP-branded device with a light field display and six cameras to create a 3D image of the person you’re chatting with on a video call.

The keynote at I/O 25 just wrapped up. As expected, it was full of AI-related announcements, ranging from updates across Google’s image and video generation models to new features in Search and Gmail.

But there were some surprises, too, like a new AI filmmaking app and an update to Project Starline. If you didn’t catch the event live, you can check out everything you missed in the roundup below.

New AI Mode features will be tested this summer, including Deep Search and a way to generate charts for finance and sports queries. Google is also rolling out the ability to shop in AI Mode in the “coming months.”

I’m sitting in a tiny box in Mountain View. There’s a long line of journalists outside, and we’re all here to try out some of the prototypes. The Project Mariner booth is 10 feet away.

While nothing was going to steal AI’s spotlight at this year’s keynote — 95 mentions! — Android XR has been generating a lot of buzz on the ground. The demos we got were shorter and had more guardrails than what I saw in December, probably because, unlike a few months ago, there are cameras everywhere and these are “risky” live demos.

Project Moohan is up first. Not much has changed since I first slipped on the headset. It still feels like an Apple Vision Pro, but a lot lighter and more comfortable to wear. A dial in the back adjusts the fit, and pressing the top button brings up Gemini. You can ask Gemini to do things, because that’s what AI assistants are here for. Specifically, I ask it to take me to my old college stomping grounds in Tokyo in Google Maps, without having to open the Google Maps app. Natural language and context, baby.

It’s a demo I’ve gotten before. The “new” thing Google has to show me today is spatialized video: you can now give a regular old video 3D depth, without any special equipment. (Never mind that the example video I’m shown was most certainly filmed by someone with an eye for enhancing dramatic perspectives.)

Because of the crowd outside, I’m only given a quick look at the prototype version of the glasses. Emphasis on prototype. The camera and display embedded in the right lens are very difficult to spot when the glasses aren’t on your face. When I slip them on, I can see a tiny translucent screen showing the time and weather. Pressing the temple brings up — you guessed it — Gemini. I ask it to identify the painting in front of me. It fails at first because I’m too far away. (Remember, these demos are risky.) When I ask it to compare two paintings, it offers some fairly obvious observations, noting that both make use of bright colors.

Next, I ask it to pick a travel book from a shelf full of them. (I lie and say I’m an outdoorsy type; I am not.) It picks one. I’m then prompted to take a photo with the glasses, and I get a preview on the display. Now that’s something the Ray-Ban Meta smart glasses can’t do — and arguably one of the Meta glasses’ biggest weaknesses for the content creators who make up a huge chunk of their audience. With a display, you can frame your images. You’re less likely to tilt your head or have the perfect shot ruined because you decided to get curtain bangs.

These are the safest demos Google can do. Though I don’t have video or photo evidence, the things we saw behind closed doors in December made a more convincing case for why someone might want this tech. Those glasses had two built-in displays, giving you a more expansive view, and I got to try live AI translation. The demos felt personalized, proactive, powerful, and pretty dang unnerving. But they also ran on tightly controlled guardrails — and at this point in Google’s story of smart glasses redemption, it can’t afford a throng of tech journalists all saying, “Hey, this stuff? It doesn’t work.”

Meta is the name Google hasn’t said aloud with Android XR, but you can feel its presence looming here at the Shoreline. You can see it in the way Google announced stylish eyewear brands like Gentle Monster and Warby Parker as partners for the consumer glasses that will launch… sometime, later. That’s an answer to Meta’s partnership with Ray-Ban. You can see it in the way both companies are making the case that AI is the killer app for headsets and smart glasses. Meta has been preaching the same thing for months — and it’s already sold 2 million pairs of its Ray-Ban Meta glasses.