"Hey Meta, look and tell me what you see." "I see a person wearing a black hoodie and jeans standing in front of a large window with a camera held up to their eye." So these are the Meta Ray-Ban smart glasses. I did a video about them earlier this year, but they just got an update that I have early access to, and it makes the smart assistant multimodal. It connects the AI to the camera on the front, and it's pretty good. You always have to start it with the same phrase.
So I have to say, "Hey Meta, look and tell me how to save this plant." "The plant appears to be a Dracaena, which is known for its durability and low maintenance requirements. To save it, make sure it's getting enough indirect sunlight and water it sparingly, allowing the soil to dry out between waterings." So it's pretty basic right now.
But theoretically, in the future you could look at a leaky faucet and it could tell you how to fix it. Or you could look at a restaurant menu, and it would read the whole thing and tell you which dishes have nuts or which ones are spicy. Accessibility aside, that's pretty cool.
We are getting closer to Jarvis
A blind person could use this.
"What do you see?"
"I see a tall man with a 12-gauge shotgun pointed at you"
This seems neat, but the amount of data these things really need to understand the subtle relationship between what we are looking at and what we are talking about is staggering, and there will never be enough.
This could be amazing for sight impaired people
Cool, but useless as well
I want the version of this that can interface with my editor and suggest code changes. It's game over after that.
Wow, Meta hired the "my money don't jiggle jiggle, it folds" guy
The future looks promising and super interesting.
That guilty look when he realized that overwatering is a thing
iRobot is coming soon
Pretty cool, but not when Meta makes it.
This is so dumb, they're running out of ideas
For people with visual impairments, omg
Test it with a blind person and get the AI to direct that individual around using its navigation camera.