I waited an hour to try Google's Android XR smart glasses, and got only 90 seconds

Well, that went well. After Google I/O's two-hour, nonstop Gemini AI keynote, I waited an hour in the press lounge for a chance to try either a pair of Android XR smart glasses or Samsung's Project Moohan mixed reality headset. Naturally, I went for the Android XR smart glasses, to see how they compare to Meta's $10,000 Orion concept and the old Google Glass. Is Android XR the holy grail of smart glasses we've been waiting a decade for? Unfortunately, Google only let me try them for 90 seconds.
I had been promised five minutes with the Android XR prototype, but my session ran closer to three minutes in total, and half of that was a product rep explaining how the smart glasses work. In my actual 90 seconds, I was told to tap the right arm of the glasses to summon Gemini. The AI's sparkle icon appeared in the right lens (this pair of Android XR glasses has only a tiny transparent display), slightly below its center. From there, I was told, I could simply talk to Gemini. I turned to a painting hanging on the wall and asked what I was looking at, who painted it and what the art style was. Gemini answered confidently; I have no idea whether it was correct. I then looked at a bookshelf and asked Gemini to tell me the title of a book, which it did. The rep then used a phone paired with the glasses to load Google Maps. He told me to look down at my feet, and I saw a small section of a map; when I looked back up, Gemini pulled up turn-by-turn directions.
Then the door of the roughly 10 x 10-foot wooden booth opened and I was told I was done. The whole thing went by in a blur, and honestly, I barely registered what Gemini was doing. While the rep was walking me through the Android XR demo, the AI kept talking over him; I'm not sure whether that was an accidental activation or a bug. When I asked about the painting and the books, I didn't have to keep tapping the side of the glasses. Gemini just kept listening and changed gears. That part was neat.
The Android XR glasses don't even come close to Meta's Orion smart glasses, which are also just a prototype concept at this stage. With Orion's silicon carbide waveguide lenses, you can see more and do more. Orion runs multiple app windows, such as Instagram and Facebook Messenger, and even has a "holographic" table tennis game you can play against another person wearing their own pair of AR glasses. Compared to Snapchat's latest AR Spectacles and their ultra-narrow field of view, though, I'd say the Android XR prototype and its single display might actually be better, if you count lower-functioning hardware playing to its strengths.
As for the smart glasses themselves, they feel like any pair of chunky sunglasses, and relatively light ones at that. They did slip down my nose a little, but that's just down to my flatter Asian nose bridge. They didn't seem to slide off the nose of my friend and Engadget arch-nemesis. (I'm kidding; I love Karissa.) There was obviously no way to check battery life in 90 seconds.
So those are my first impressions of my first pair of Android XR smart glasses. It's not a lot to go on, but it isn't nothing either. Part of me wonders why Google chose such a short demo window; my spidey sense tells me the experience may not be as polished as what appeared in the I/O keynote demo. Still, what I saw felt like a better version of Google Glass, with the screen acting like a tiny heads-up display in the center of the right lens instead of above the right eye. But in just 90 seconds, I couldn't form a firm opinion. I need to see more, and I've barely seen any of it. Google, you've got my number; give me a call!