Category: Tech

  • EchoVision Vs. Meta: Which Smart Glasses are Right for You?

    Note

    All views expressed in this post are entirely my own. I have not been sponsored by Agiga or any other smart glasses manufacturer.

    When reading, please keep in mind that while Agiga has confirmed some things about the final production release of EchoVision, the current builds that users have are early versions of the product, and anything discussed in this article may change by final release.

    Introduction

    In recent years, smart glasses have become popular for blind and sighted tech users alike. Whether you’re using them for content creation, AI environment descriptions, or hands-free human visual assistance when needed, anyone can benefit from having a pair of smart glasses. But which glasses are right for you?

    Whether it’s the original Envision glasses released in 2020, the now discontinued Seleste glasses, or the Meta and EchoVision glasses being discussed in this post, there have been several attempts at smart glasses: some aimed exclusively at mainstream users, some built just for the blind, and, in the case of the Meta glasses, a mainstream product that blind and visually impaired users have been able to adapt to meet our needs.

    I do not pretend to know everything there is to know about either of these glasses or to have all the answers. My intention is to provide an overview of each pair of glasses as someone with exposure to both product lines, explain what each one does well and what its issues are, and help you make an informed choice if you are considering smart glasses for yourself.

    Pricing

    As we all know, the first and most important factor when considering any product is the cost. While neither pair of glasses can be considered “cheap”, they are more budget-friendly when compared to products like the OrCam and Envision glasses.

    If considering a pair of Meta glasses, prices start at $300.00 USD at most retailers. Depending on size, style, or brand, the price may be higher. You may also find lower prices around holidays, so keep an eye on sales offered by different retailers.

    If considering the Agiga EchoVision glasses, expect to pay $600.00 USD. Note that there will be a subscription for the AI if you did not pre-order, but at the time of this writing that price is unknown. I will update the post when Agiga makes this information available.

    Size and fit

    Now that you have looked at pricing, you need to determine what will fit you the best. With the Meta glasses, there are several sizes and styles. While not required, you may find it beneficial to visit a retailer who carries Meta glasses and try them on to determine the best style and fit.

    With the EchoVision glasses, Agiga has taken a one size fits all approach. Although the company does include nose pads to help avoid sliding, these may not work for everyone, and people with smaller heads may still find that the glasses feel insecure.

    One thing I would like to see changed with the EchoVision glasses is the arms. On the current pioneer units, they are very floppy. This means that unless you are holding the arms, they will not stay open by themselves.

    Can I use my prescription lenses?

    Unlike the OrCam device, which mounts to an existing pair of glasses, the devices from Ray-Ban, Oakley, and Agiga are full glasses. While not impossible to use for those who already have prescriptions, there are more considerations.

    The good news is that both EchoVision and Meta glasses allow you to pick your lenses. This includes adding an existing prescription if you have one. While I have not experienced it since I do not use prescription lenses myself, I have heard from some online who are unable to use the Meta glasses because their lenses are too thick for the frames. It is unknown if this problem is present with the EchoVision glasses.

    Connection and usage

    Once you have your pair of smart glasses, it’s time to start using them. But how? If using any Meta glasses, whether from Ray-Ban or Oakley, you will need to download and install Meta AI from the iOS App Store or the Google Play Store. If using the EchoVision glasses, you will need the EchoVision Glasses app, which can be found on the App Store. I tried to find the app on the Play Store, but could only find it on APKPure.

    Once you have the appropriate app for your glasses, follow the prompts to connect them with your phone and wi-fi. Note that the EchoVision glasses require a wi-fi connection. This can be either a mobile hotspot from your phone or dedicated device or your standard home wi-fi. It is also important to note that in places like airports or hotels that require you to agree to terms and enter information to connect to their networks, the glasses will not connect.

    Note for hotspot users: Heavy use of AI or human assistance while on a mobile hotspot may burn through data quickly.

    If using the Meta glasses, make sure you have the Meta AI app running in the background. Unlike EchoVision, which handles its own internet connection over wi-fi, all Meta AI requests are routed through the Bluetooth connection with your smartphone.

    Audio output

    When it comes to audio, both Meta and EchoVision have a stereo speaker setup that fires sound down toward your ears. Unlike the Meta glasses though, EchoVision gives you the ability to connect external audio devices through Bluetooth. This is great if you need to demo the glasses, connect to a pair of earbuds in loud environments, and especially for hearing aid users that require direct audio.

    Both glasses have the ability to act as wireless headphones, but at the time of this writing EchoVision’s ability to stay connected is unstable, and the glasses may need to be re-paired to your device several times. While neither pair of glasses is as good as dedicated headphones or other audio devices, they do not sound bad, and both are bearable for music or other media consumption while out of the house. Keep in mind that in loud environments, both pairs of glasses can be hard to hear when listening to this type of content, and you may have to turn your glasses to max volume.

    EchoVision AI Features and functions

    One of the biggest selling points of smart glasses is the AI. Whether it’s asking general knowledge questions, hands-free navigation, or obtaining descriptions of text, objects, and people in your environment, these AI tools offer several benefits. Like any piece of technology, however, neither tool is perfect, and each has benefits and drawbacks. Meta AI, for example, requires every request to be spoken. With Agiga, AI features can be activated using different numbers of presses of the AI button on the top of the right arm of the glasses. Agiga has stated that they will add voice commands for hands-free use, but at this time all modes are selected with the AI button. At the time of this writing, the three available modes on the AI button are:

    • Single press: scene description and voice command
    • Double press: activate Live AI
    • Triple press: activate reading mode

    Mode descriptions

    Voice command

    If you press the AI button one time, you will hear two rising tones. This indicates that the AI is listening for a request.

    You have the ability to ask for scene descriptions, some general knowledge questions, and for human assistance which will be discussed later in this article.

    If you do not speak when activating this feature, the glasses will begin a scene description. After each description, you will hear the ascending tones, and you can ask the glasses for a more detailed description of the entire scene, or you can ask about specific items in the picture.

    Live AI

    When you press the AI button on the right arm of your glasses two times, you are taken into Live AI mode. Live AI is a mode that provides image descriptions as you walk or turn your head in your environment. Just like the voice command feature, Live AI gives you the ability to ask follow-up questions when you receive an image description. Keep in mind though, the mic is very sensitive when using this feature, so unless you are asking the AI a question or commenting on what it told you, you will want to mute with a one-finger single tap on the touchpad. The glasses will say “Muted” to confirm they are no longer listening. When you tap the touchpad again, the glasses will say “Unmuted”. While at ATIA, I had the opportunity to catch up with the Agiga team in person, and they stated that in the future they may consider adding a wake word to Live AI so the glasses know not to start listening until you are ready for them to do so.

    It is also important to note that since this feature depends on an internet connection, it may not behave as intended if on unstable connections. It is also important to remember that Live AI may not be the best option for navigating environments if you are a fast walker. When you move, it takes a picture, then has to send it to the cloud for processing. Because of this, you may receive descriptions of things you passed several seconds ago.

    If your glasses are behind in their descriptions, you do have the ability to make them catch up by performing a one-finger double tap on the touchpad. While this can help, it does not always work as intended. To fix it, double tap the AI button and Live AI will restart. To exit Live AI or any mode, press and hold the AI button. You will hear two tones, and the glasses will return to standby mode.

    Read mode

    Read mode can be activated by performing a triple press of the AI button. As the name suggests, read mode gives you the ability to scan mail and other types of text in your environment.

    When activated, the glasses give you instructions on how to position yourself in order to capture the best view of the text. When they have a good picture, they will instruct you to hold still, and you will then hear the text they just scanned. When the reading ends, you can scan another page by performing a single press of the AI button, or you can exit Read mode by pressing and holding the same button.

    Meta Glasses modes and functions

    Meta AI

    Meta AI is the voice assistant found on any of the Meta smart glasses produced by Ray-Ban and Oakley, as well as in the Meta AI app on your smartphone. Like other voice assistants, Meta has the ability to answer general knowledge questions, set reminders, tell you the weather, play music, and anything else you might expect your voice assistant to be capable of.

    By default, Meta AI is activated on your glasses by saying “Hey Meta” followed by your question. When the assistant is listening, it will play a chime and listen for your question. In the Meta AI app, you also have the ability to set presses and holds of either the touchpad or capture button to bring up the AI. Unlike EchoVision, however, most of the Meta products do not allow you to tie specific AI actions to a button press. For example, you cannot assign image description to a two-finger tap and hold on the touchpad. Although you can set the AI to be activated this way, you still have to give it an instruction by voice. Prompts can be things such as “What am I holding?” “What am I looking at?” “Is there a door near me?” Keep in mind, Meta AI, just like other tools with this ability, can hallucinate, so always verify that the information you receive is accurate.

    Unlike the EchoVision glasses, Meta AI has the ability to place calls. These calls can be made from your phone’s contact list, WhatsApp, Messenger, and more recently Instagram. For blind and visually impaired users, we also have direct integration with Be My Eyes. This can be activated by linking your account, then saying “Hey Meta, Be My Eyes”. The Meta glasses also have the ability to work with Be My Eyes groups, but the call has to be initiated on the phone. Once the call is connected though, it is possible to switch to the glasses camera by performing a double tap of the capture button. This works in any app that transmits video from the glasses when in a call.

    Live AI

    Just like the EchoVision glasses, Meta glasses include a Live AI feature. To access the feature, say “Hey Meta, start Live AI”. Unlike EchoVision’s version, though, it is not as powerful. Not only does it eat up your battery, but at the time of this writing it is also unable to provide real-time updates as you move or pan around a room. You constantly have to ask it questions. The only advantage is that you do not have to say “Hey Meta” every time you wish to ask a question. When you are finished, say “Stop Live AI” or “End Live AI”. Some glasses may be pickier than others about the wording, so try different things until it works.

    Receiving human assistance

    While AI is a powerful tool that will only get better over time, the reality is it’s not all there yet. Because of that, we still have to rely on human assistance.

    “What if I don’t have people near me?” you ask. Well, both glasses give you the ability to receive remote assistance. Remember when I mentioned human assistance in the voice command feature of the EchoVision glasses earlier? EchoVision has direct integration with both Aira and Be My Eyes. They can be accessed by saying “Aira”, “human agent”, “Be My Eyes”, “Volunteer”, or prompts including these words. Just like on your smartphone, the call will be placed, and a person will be able to see through your glasses camera and communicate with you by voice. Aira will also have access to your location. The only issue with human assistance at this time is that EchoVision has no way to integrate with Be My Eyes groups or set up trusted contacts for sensitive tasks, or for those who would prefer not to talk to someone they do not know.

    As previously stated, Be My Eyes is already integrated into the Meta platform. Aira has been using the Meta glasses since the fall of 2024, but it has had to rely on WhatsApp due to a lack of options for third-party developers to integrate with the platform. As of February 2026, however, Aira has announced that, following Meta’s decision last year to allow third-party applications, it is working on adding native Meta support and removing WhatsApp entirely, though at this time we have no definitive timeline for when we can expect to see this ship.

    While Aira agents will still have your location with the Meta glasses, they are currently unable to take pictures.

    Battery life

    When relying on smart glasses, you need to make sure you are purchasing a product whose battery can meet your needs. With the Meta glasses, battery life is estimated at four to eight hours depending on your model. Agiga estimates around six hours with EchoVision, but it is hard to say for sure how accurate this is. Naturally, if you are performing more camera-heavy tasks, the battery will drain much faster than if the glasses are in standby or being used as an audio device.

    While on the go, you have the ability to charge both pairs of glasses. The Meta glasses have small battery packs that can clip to the left arm, and there are also cords that can connect to a USB-C battery pack or your phone to charge the glasses while in use. EchoVision includes a USB-C charging port on the end of the right arm for charging outside of the case either at home or while walking. If receiving human assistance or relying heavily on AI features, having access to these portable chargers will be extremely valuable.

    Content creation

    While not everyone will care about this section, we have to consider content creators. Are either of these glasses worth it? Naturally, the Metas, being designed primarily for content creation, are a great tool with their photo and video functions. EchoVision can perform these functions, but the audio in its videos is not the highest quality. On the plus side, photos and videos appear to be taken in landscape, from what I can tell with the little vision I have. While I understand that EchoVision is not designed to be a content creation tool first, the audio quality is still disappointing. At this price point, videos should not sound like they are coming out of an old telephone.

    The final verdict: which should you choose?

    After reading this post, you probably still have some questions. The honest answer is that there is no single right choice. Both products have their strengths and weaknesses, and both are continuing to grow in different ways. EchoVision is growing because Agiga is still a relatively new company, and Meta is growing because its glasses are primarily a mainstream product that we as blind and visually impaired users have been able to adapt to our needs, especially with the recent announcement that third-party app integrations are becoming possible.

    If you prefer to stick to mainstream products over those designed specifically for users who are blind or visually impaired, the Meta glasses come out on top. If you value products designed specifically for us since they are developed as an accessibility first product, then the EchoVisions are the clear winner.

    If price is a deciding factor for you, the Meta glasses are over $200.00 cheaper than the EchoVision glasses, and you will not be required to pay any subscriptions, so Meta wins here.

    Conclusion

    As someone who has access to both products, I am unable to decide on a clear winner. Since the EchoVision glasses are still in a testing phase, I can get my money back should I choose not to receive a production unit. Once Meta’s app SDK is in the field and third-party apps come on board, it may be an easier call. In the meantime, I will continue using both products in various real world situations.

  • How to install Windows 11 on unsupported hardware as a blind or visually impaired computer user

    Timing note

    While this guide was written before October of 2025, all steps and links are still relevant.

    Introduction

    As most people who follow tech news are aware, Microsoft will be ending support for Windows 10 on October 14, 2025. For some users, that means they will no longer receive updates. For others who want to upgrade but can’t, financial barriers make it hard to justify spending hundreds of dollars on new hardware.

    My goal in writing this guide is to help anyone who wants to upgrade to Windows 11 complete the process as smoothly as possible. This guide may also help users already on Windows 11 who want or need to perform a clean install for any reason, though some steps are not required if your hardware is supported.

    What you need

    Before installing Windows 11 on your machine, make sure you have the following items ready:

    • The computer you are installing Windows 11 on.
    • If using a laptop, the power cord.
    • A cloud service or external media to back up important files.
    • A USB flash drive with minimum 8GB of storage to create bootable media.
      Note, make sure any important data on your flash drive is backed up. The drive will be formatted in this process, and all files will be lost.
    • A tool for creating bootable media. I personally recommend Rufus.
    • On some computers, an external USB soundcard for Narrator speech during setup.
    • A smartphone running Be My Eyes and/or Aira. (More on this later).

    What to do

    Confirming whether your PC is officially supported

    Before beginning this process, I recommend using Microsoft’s free PC Health Check application to determine if your system is officially Windows 11 compatible. If it is, you may want to install through Windows Update or the Windows 11 Installation Assistant. If your device is not officially supported by Windows 11, the steps in the sections that follow are for you.
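    If you prefer a command-line alternative to the PC Health Check app, Windows ships with tools that report the two requirements that most often block an upgrade. The sketch below assumes an elevated PowerShell window; PC Health Check remains the more complete test, so treat this as a quick first look.

```shell
# Check TPM status. On a passing system you should see
# "TpmPresent : True" and "TpmReady : True". The TPM version
# can be confirmed in TPM Management (tpm.msc) if you need
# to verify it is 2.0.
Get-Tpm

# Check Secure Boot. Returns True when enabled. On legacy BIOS
# machines this cmdlet throws an error instead, which itself
# tells you the system is not booting through UEFI.
Confirm-SecureBootUEFI
```

    Both commands are read-only, so running them is safe even if your hardware fails the checks.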

    Creating your bootable installer

    • If you have not already done so, download Rufus using the link above and store it in a desired location on your computer.
    • Connect your USB drive and launch Rufus. If User Account Control is enabled, press Alt Y to agree to the prompt.
    • When Rufus opens, you will see any connected USB devices. If your flash drive is the only connected external media, it will automatically be selected.
    • Once you have confirmed the desired device is selected, press Tab to move to the boot selection box. By default, it will be set to Disk or ISO image. We do not need to change anything here, so tab until you hear SELECT subMenu.
    • When you reach this option, press the Down Arrow key. You will have a Select and a Download option. Select is already checked, so move to Download and press Enter.
    • You will be brought back to the main Rufus screen, and the submenu will now say Download. Press Enter or space to activate the button.
    • When Download is activated, you will be placed in a screen that asks what you want to download.
      By default, Windows 11 is selected, but it may not be obvious if using NVDA, since all you hear is “@{Index=0; Version=Windows 11; PageType=windows11} collapsed”. So, if you arrow around and feel like you are not finding what you want, just stop and listen.
    • Once you activate Continue, you will be placed in a screen that lets you choose which version of Windows 11 you want to download. At the time of this writing, 24H2 is the latest version, and this is the only option Rufus lists.
      Note that just like the previous screen, you will hear index information if using NVDA. I do not have JAWS, so I cannot say how it behaves with this application.
    • The next screen will be asking you to choose which edition of Windows you want to use. Rufus has one standard ISO for Home, Education, and Pro. Like other screens in the download process, NVDA will announce extra information, so you will hear “@{Id=System.Object[]; Edition=Windows 11 Home/Pro/Edu} collapsed”. Locate the continue button and move to the next screen.
    • The next screen will ask you to choose your language. By default, it is set to English International. Like other boxes in this download screen, NVDA announces extra information, so just stop and listen when going through the languages. To access the languages in this box, press Alt Down Arrow to expand, then use up and down arrows until the desired language is located. To select a language, press Enter.
    • Once you select a language and hit Continue, you will be asked to choose whether you are installing on ARM or X64. X64 will be selected by default; if you need to change to ARM, use the same combo box steps as in the previous screen. The next option allows you to download from a browser and is unchecked by default. Once your desired Windows version is selected, Tab to the Download button and activate it. A standard File Explorer save dialogue will appear. Use standard navigation commands to choose a location and activate the Save option with Alt S.
      Note, the ISO is a large file, so download times will vary depending on your network speed.
    • Once the download is complete, you will be returned to the main Rufus window. When using NVDA, I have found that you do not get speech when first returned to this window. Press Alt Tab to return to the window, and NVDA should speak normally.
    • Once back in the window, you will have options to select your partition scheme. The available schemes are GPT and MBR. GPT is correct for most modern systems, which boot through UEFI; choose MBR only if you know your computer uses a legacy BIOS.
    • Once your partition scheme is set, press the Tab key until you locate the Select option. Press Space, and you will be asked to select your file. Locate it in the File Explorer window that pops up and select it with Enter.
    • When the file is selected, locate the Start button. You will be presented with the following options:
      • Remove requirement for 4GB+ RAM, Secure Boot and TPM 2.0
        Note, you will want to make sure this option is checked. If it is not, the install will not work properly.
      • Remove requirement for an online Microsoft account
      • Create a local account with username
      • A field that will be filled in with the username for the active user on the computer.
      • Set regional options to the same values as this user’s
      • Disable data collection (Skip privacy questions)
      • Disable BitLocker automatic device encryption
      • Ok
      • Cancel
    • Once your settings are chosen, activate the Ok button. You will be warned that this will completely erase the drive. If you want to continue, press Ok; if you do not, press Cancel.
    • Once you select Ok, Rufus will begin formatting the drive and writing the installer. When the process is complete, eject your drive and move to the next section.
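    One optional extra step: if you want to confirm the ISO downloaded without corruption before writing it, Windows includes certutil, which can compute the file’s SHA-256 hash from a Command Prompt. The file path below is only an example; substitute the name and location of the ISO you actually saved, and compare the result against the hash Microsoft publishes for your version and language.

```shell
rem Compute the SHA-256 hash of the downloaded ISO.
rem The path here is a placeholder; point it at your own file.
certutil -hashfile "C:\Users\You\Downloads\Win11_24H2_English_x64.iso" SHA256
```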

    Completing the installation

    Now that your installation media is ready, it’s time to begin the upgrade process. With your computer plugged into power, shut it down completely and insert your flash drive.

    Important notes

    If you do not already have Be My Eyes and Aira, download one or both of them now and sign up with free accounts. Both apps are available on the App Store and the Play Store. During the next steps, we will have no auditory feedback, so we will need to either rely on the AI functionality of these applications or call for human support. For Seeing AI users, the app will be able to scan the screen, but it will not be able to tell you what is selected, so it will not be practical for this task.

    It is also important to remember that if you set a password for your UEFI screen, you may be unable to boot into your installer. You should also keep in mind that on some computers, Windows may not recognize your computer’s internal drive as an install location. When this happens, power off your computer, turn it on normally, and access the website of your computer manufacturer. On their website, you will need to find storage drivers. You will have the option to directly install the drivers or copy the contents to another location. In our case, we want to copy them to a folder on our installation drive or another drive. Once you have found the needed drivers and copied them over, continue with the installation.

    Once your computer is fully powered off, press the Power button to turn it back on. As it boots, repeatedly press one of your function keys. This varies by computer, but in most cases F12 will bring you into UEFI. There is unfortunately no auditory feedback when UEFI loads, so you may have to try this process a few times to see what works for your computer.

    Once in your UEFI screen, you can navigate using Tab or the arrow keys and select options with Enter. The layout of UEFI screens varies across computers and manufacturers, so a full guide to these screens is not possible. In my experience, HP machines may be more challenging to work with, but it can still be done. The HP I performed this process on required me to rearrange the boot devices.

    Using your AI, Aira agent, or Be My Eyes volunteer, take a photo of the screen to determine which option is currently highlighted and where boot device selection is. If you want the ability to ask for specific options in advance, you may prefer Aira’s Access AI. Access AI also gives you the ability to verify its responses with an Aira agent, so if you want to confirm that you are where you need to be but don’t want to talk on a call with someone, you can do that.

    On some computers, selecting your boot device may be as easy as pressing Tab or an arrow key two or three times, then pressing Enter. On the HP I used, I had to press Space to highlight the option and use the arrows to drag it. Once your desired boot device is selected or positioned correctly in the list, the computer will boot into the standard Windows setup screen. If your computer required you to rearrange boot devices, remember to go back into UEFI afterward and set your internal drive as the primary boot device.

    Once in Windows setup, press Control Windows Enter, and Narrator should start. Follow the prompts, and you will be good to go. If you have no speech, this is where your external soundcard comes in. After the install, make sure to download your manufacturer’s audio drivers through Windows Update or their website.

    For those who run into the issue with their internal drive not showing up as an install location, there will be an option to add the needed driver. The option is a link, but Microsoft did not make it easily discoverable, so finding it will take some exploration. Once found, the file selection part is mostly straightforward. When you find your drive, press Space, then arrow up and down through the vertical list of files and folders, using Space to select what you need. When finished, look for an Open or Continue button. Assuming you found the correct files, you can continue the installation.

    Additional notes and links

    For those who would like an audio walkthrough of the process for using Rufus, you can listen to the following video by Chris Wright on YouTube. While the tutorial is targeted toward users wanting to set up a virtual machine, he does briefly discuss some of what is in this guide for those who prefer audio format.

    It is also important to note that when major Windows versions are released, Windows Update will not find them. When 25H2 comes out, those who upgraded using this method will need to create another bootable drive. Thankfully, most of the steps for booting into the UEFI screen and selecting your USB device are not required. On the USB drive, there is a file called setup.exe. Run this file, and you will be able to perform an in-place upgrade.
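    For in-place upgrades started from inside Windows, Microsoft has also documented a registry value that relaxes the CPU and TPM version checks on unsupported hardware. The sketch below assumes an elevated Command Prompt and that your flash drive mounted as drive D:, which will differ on your system; and since Rufus media is created with the requirements removed, this value may not be needed in every case.

```shell
rem Allow upgrades on CPUs not on the supported list and on TPM 1.2
rem systems. This is the value Microsoft documented; it does not
rem bypass a completely missing TPM.
reg add "HKLM\SYSTEM\Setup\MoSetup" /v AllowUpgradesWithUnsupportedTPMOrCPU /t REG_DWORD /d 1 /f

rem Then launch the in-place upgrade from the flash drive.
D:\setup.exe
```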

    Conclusion

    I would like to thank you for taking the time to read this article, and I hope it was beneficial to you. If there was anything I missed or something you think I should add, please let me know in the comments below, or use one of the methods on my contact page.