Note
All views expressed in this post are entirely my own. I have not been sponsored by Agiga or any other smart glasses manufacturer.
When reading, please keep in mind that while Agiga has confirmed some details about the final production release of EchoVision, the builds users currently have are early versions of the product, and anything discussed in this article may change by the final release.
Introduction
In recent years, smart glasses have become popular for blind and sighted tech users alike. Whether you’re using them for content creation, AI environment descriptions, or hands-free human visual assistance, anyone can benefit from having a pair of smart glasses. But which glasses are right for you?
Whether it’s the original Envision glasses released in 2020, the now discontinued Seleste glasses, or the Meta and EchoVision glasses being discussed in this post, there have been several attempts at smart glasses: some exclusively mainstream, some designed just for the blind, and, in the case of the Meta glasses, a mainstream product that blind and visually impaired users have been able to adapt to meet our needs.
This post does not pretend to cover everything there is to know about either of these glasses or to have all the answers. My intention is to provide an overview of each pair of glasses as someone with exposure to both product lines, explain what each one does well and where its issues are, and help you make an informed choice if you are considering smart glasses for yourself.
Pricing
As we all know, the first and most important factor when considering any product is the cost. While neither pair of glasses can be considered “cheap”, they are more budget-friendly than products like the OrCam and Envision glasses.
If considering a pair of Meta glasses, prices start at $300.00 USD at most retailers. Depending on size, style, or brand, the price may be higher. You may also find lower prices around holidays, so keep an eye on sales offered by different retailers.
If considering the Agiga EchoVision glasses, expect to pay $600.00 USD. Note that there will be a subscription for the AI if you did not pre-order, but at the time of this writing that price is unknown. I will update the post when Agiga makes this information available.
Size and fit
Now that you have looked at pricing, you need to determine what will fit you the best. With the Meta glasses, there are several sizes and styles. While not required, you may find it beneficial to visit a retailer who carries Meta glasses and try them on to determine the best style and fit.
With the EchoVision glasses, Agiga has taken a one-size-fits-all approach. Although the company does include nose pads to help avoid sliding, these may not work for everyone, and people with smaller heads may still find that the glasses feel insecure.
One thing I would like to see changed with the EchoVision glasses is the arms. On the current pioneer units, they are very floppy. This means that unless you are holding the arms, they will not stay open by themselves.
Can I use my prescription lenses?
Unlike the OrCam device, which mounts to an existing pair of glasses, the devices from Ray-Ban, Oakley, and Agiga are all full glasses. While that does not make them impossible to use for those who already wear prescription lenses, there are more considerations.
The good news is that both the EchoVision and Meta glasses allow you to pick your lenses, including adding an existing prescription if you have one. While I have not experienced it myself since I do not use prescription lenses, I have heard from some users online who are unable to use the Meta glasses because their lenses are too thick for the frames. It is unknown whether this problem is present with the EchoVision glasses.
Connection and usage
Once you have your pair of smart glasses, it’s time to start using them. But how? If you are using any Meta glasses, whether from Ray-Ban or Oakley, you will need to download and install Meta AI from the iOS App Store or the Google Play Store. If you are using the EchoVision glasses, you will need the EchoVision Glasses app, which can be found on the App Store. I tried to find the app on the Play Store, but at the time of this writing it is only available through APKPure.
Once you have the appropriate app for your glasses, follow the prompts to connect them with your phone and Wi-Fi. Note that the EchoVision glasses require a Wi-Fi connection. This can be either a mobile hotspot from your phone or a dedicated hotspot device, or your standard home Wi-Fi. It is also important to note that the glasses will not connect to networks that require you to agree to terms and enter information through a sign-in page, such as those found in airports and hotels.
Note for hotspot users: Heavy use of AI or human assistance while on a mobile hotspot may burn through data quickly.
If you are using the Meta glasses, make sure you have the Meta AI app running in the background. Unlike the EchoVision glasses, which connect to the internet directly over Wi-Fi, all Meta AI requests are processed over the Bluetooth connection with your smartphone.
Audio output
When it comes to audio, both the Meta and EchoVision glasses have a stereo speaker setup that fires sound down toward your ears. Unlike the Meta glasses though, EchoVision gives you the ability to connect external audio devices through Bluetooth. This is great if you need to demo the glasses or connect a pair of earbuds in loud environments, and especially for hearing aid users who require direct audio.
Both glasses can act as wireless headphones, but at the time of this writing EchoVision’s ability to stay connected is unstable, and the glasses frequently need to be re-paired to your device. While neither pair of glasses is as good as dedicated headphones or other audio devices, they do not sound bad, and both are bearable for music or other media consumption while out of the house. Keep in mind that in loud environments, both pairs of glasses can be hard to hear when listening to this type of content, and you may have to turn them to max volume.
EchoVision AI features and functions
One of the biggest selling points of smart glasses is the AI. Whether it’s asking general knowledge questions, hands-free navigation, or obtaining descriptions of text, objects, and people in your environment, these AI tools offer several benefits. Like any piece of technology, however, neither tool is perfect, and each has benefits and drawbacks. Meta AI, for example, requires that every request be spoken. With Agiga, AI features are activated with different numbers of presses of the AI button on the top of the right arm of the glasses. Agiga has stated that they will add voice commands for hands-free use, but at this time all modes are selected with the AI button. At the time of this writing, the three available modes on the AI button are:
- Single press: scene description and voice command
- Double press: activate Live AI
- Triple press: activate read mode
Mode descriptions
Voice command
If you press the AI button one time, you will hear two rising tones. This indicates that the AI is listening for a request.
You can ask for scene descriptions, ask some general knowledge questions, and request human assistance, which will be discussed later in this article.
If you do not speak when activating this feature, the glasses will begin a scene description. After each description, you will hear the rising tones again, and you can ask the glasses for a more detailed description of the entire scene or ask about specific items in the picture.
Live AI
When you press the AI button on the right arm of your glasses twice, you are taken into Live AI mode. Live AI provides image descriptions as you walk or turn your head in your environment. Just like the voice command feature, Live AI gives you the ability to ask follow-up questions when you receive an image description. Keep in mind though, the mic is very sensitive when using this feature, so unless you are asking the AI a question or commenting on what it told you, you will want to mute it with a one-finger single tap on the touchpad. The glasses will say “Muted” to confirm they are no longer listening. When you tap the touchpad again, the glasses will say “Unmuted”. While at ATIA, I had the opportunity to catch up with the Agiga team in person, and they stated that they may consider adding a wake word to Live AI in the future so the glasses know not to start listening until you are ready.
It is also important to note that since this feature depends on an internet connection, it may not behave as intended on unstable connections. Remember, too, that Live AI may not be the best option for navigating environments if you are a fast walker. When you move, the glasses take a picture and then send it to the cloud for processing, so you may receive descriptions of things you passed several seconds ago.
If your glasses fall behind in their descriptions, you can make them catch up by performing a one-finger double tap on the touchpad. While this can help, it does not always work as intended. To fix it, double press the AI button and Live AI will restart. To exit Live AI or any other mode, press and hold the AI button. You will hear two tones, and the glasses will return to standby mode.
Read mode
Read mode can be activated by performing a triple press of the AI button. As the name suggests, read mode gives you the ability to scan mail and other types of text in your environment.
When activated, the glasses give you instructions on where to position yourself in order to capture the best view of the text. When they have a good picture, they will instruct you to hold still, and you will then hear the text they just scanned. When the reading ends, you can scan another page with a single press of the AI button, or exit read mode by pressing and holding the same button.
Meta Glasses modes and functions
Meta AI
Meta AI is the voice assistant found on all of the Meta smart glasses produced by Ray-Ban and Oakley, as well as in the Meta AI app on your smartphone. Like other voice assistants, Meta AI can answer general knowledge questions, set reminders, tell you the weather, play music, and do almost anything else you might expect a voice assistant to be capable of.
By default, Meta AI is activated on your glasses by saying “Hey Meta” followed by your question. When the assistant is listening, it will play a chime. In the Meta AI app, you also have the ability to assign presses and holds of either the touchpad or the capture button to bring up the AI. Unlike EchoVision, however, most of the Meta products do not allow you to assign specific AI actions to a button press. For example, you cannot set image description to a two-finger tap and hold on the touchpad. Although you can set the AI itself to be activated this way, you still have to give it an instruction by voice. Prompts can be things such as “What am I holding?”, “What am I looking at?”, or “Is there a door near me?” Keep in mind that Meta AI, just like other tools with this ability, can hallucinate, so always verify that the information you receive is accurate.
Unlike the EchoVision glasses, Meta AI has the ability to place calls. These calls can be made from your phone’s contact list, WhatsApp, Messenger, and, more recently, Instagram. For blind and visually impaired users, we also have direct integration with Be My Eyes, which can be activated by linking your account and then saying “Hey Meta, Be My Eyes”. The Meta glasses also have the ability to work with Be My Eyes groups, but the call has to be initiated on the phone. Once the call is connected though, it is possible to switch to the glasses camera by performing a double tap of the capture button. This works in any app that transmits video from the glasses while in a call.
Live AI
Just like the EchoVision glasses, the Meta glasses include a Live AI feature. To access it, say “Hey Meta, start Live AI”. Unlike EchoVision’s version though, it is not as powerful. Not only does it eat up your battery, but at the time of this writing it is also unable to provide real-time updates as you move or pan around a room; you constantly have to ask it questions. The only advantage is that you do not have to say “Hey Meta” every time you wish to ask something. When you are finished, say “Stop Live AI” or “End Live AI”. Some glasses may be pickier than others with the wording, so try different phrasings until it works.
Receiving human assistance
While AI is a powerful tool that will only get better over time, the reality is it’s not all there yet. Because of that, we still have to rely on human assistance.
“What if I don’t have people near me?” you ask. Well, both glasses give you the ability to receive remote assistance. Remember when I mentioned human assistance in the voice command feature of the EchoVision glasses earlier? EchoVision has direct integration with both Aira and Be My Eyes. They can be accessed by saying “Aira”, “human agent”, “Be My Eyes”, “volunteer”, or other prompts that include these words. Just like on your smartphone, the call will be placed, and a person will be able to see through your glasses camera and communicate with you by voice. Aira agents will also have access to your location. The only issue with human assistance at this time is that EchoVision has no way to integrate with Be My Eyes groups or to set up trusted contacts for sensitive tasks or for those who would prefer not to talk to someone they do not know.
As previously stated, Be My Eyes is already integrated into the Meta platform. Aira has been using the Meta glasses since the fall of 2024, but has had to rely on WhatsApp due to a lack of options for third-party developers to integrate with the platform. In February 2026, however, following Meta’s announcement last year that third-party applications would be allowed, Aira announced that it is working on adding native Meta support and removing WhatsApp entirely, though at this time we have no definitive timeline for when this will ship.
While Aira agents will still have your location with the Meta glasses, they are currently unable to take pictures.
Battery life
When relying on smart glasses, you need to make sure you are purchasing a product whose battery can meet your needs. With the Meta glasses, battery life is estimated at four to eight hours depending on your model. Agiga estimates around six hours with EchoVision, but it is hard to say for sure how accurate this is. Naturally, if you are performing camera-heavy tasks, the battery will drain much faster than if the glasses are in standby or being used as an audio device.
While on the go, you have the ability to charge both pairs of glasses. The Meta glasses have small battery packs that can clip to the left arm, and there are also cords that can connect to a USB-C battery pack or your phone to charge the glasses while in use. EchoVision includes a USB-C charging port on the end of the right arm for charging outside of the case either at home or while walking. If receiving human assistance or relying heavily on AI features, having access to these portable chargers will be extremely valuable.
Content creation
While not everyone will care about this section, we have to consider content creators. Are either of these glasses worth it? Naturally, the Meta glasses, being designed primarily for content creation, are a great tool with their photo and video functions. While EchoVision is able to perform these functions, the audio in videos is not the highest quality. On the plus side, photos and videos appear to be taken in landscape, from what I can tell with the little vision I have. While I understand that EchoVision is not designed to be a content creation tool first, the audio quality is still disappointing. At this price point, videos should not sound like they are coming out of an old telephone.
The final verdict: which should you choose?
After reading this post, you probably still have some questions. The honest answer is that there is no one right product. Both have their strengths and weaknesses, and both are continuing to grow in different ways: EchoVision because Agiga is still a relatively new company, and Meta because the glasses are primarily a mainstream product that we as blind and visually impaired users have been able to adapt to our needs, with third-party app integrations more recently becoming possible.
If you prefer to stick to mainstream products over those designed specifically for users who are blind or visually impaired, the Meta glasses come out on top. If you value products designed specifically for us, developed as accessibility-first products, then the EchoVisions are the clear winner.
If price is a deciding factor for you, the Meta glasses start $300.00 cheaper than the EchoVision glasses, and you will not be required to pay any subscription, so Meta wins here.
Conclusion
As someone who has access to both products, I am unable to declare a clear winner. Since the EchoVision glasses are still in a testing phase, I can get my money back should I choose not to receive a production unit. Once Meta’s app SDK is in the field and third-party apps come on board, it may be an easier call. In the meantime, I will continue using both products in various real-world situations.