We have been following the various Smart Glasses product announcements with a lot of interest. So when the M100 became commercially available, we decided to give it a try and take our image processing algorithms for a spin.

First impressions

The M100 is very well built, light, and comfortable to wear. The unit can be mounted on eyeglass frames, or worn with an over-head band. The display can be placed above or below the eye, at an adjustable distance. It fits on either side, which is handy if your dominant eye is the left one, as it is for approximately one-third of the population. That’s a lot of flexibility.

Another nice touch is the inclusion of a high-capacity battery pack. Apps that continuously use the camera are bound to drain the built-in battery faster than your average smartphone app.

There are four buttons on the main body/ear piece: power, select/back, volume up, volume down. They take some getting used to, as each has more than one function and they can easily be confused with one another. Clearly, the upcoming gesture and spoken controls will be useful.

Finally, an Android companion app is available on Vuzix’s web site. It connects to the M100 via Bluetooth and acts as a remote control to launch and navigate apps, enter text, etc.

Developer perspective

We purchased the SDK from Vuzix at the same time as the glasses. Once the device-specific files are installed in Eclipse, the M100 behaves like any Android device: build, deploy, debug, etc. This gives the developer more power than, say, Google Glass, which restricts what can run on the device or what can be displayed. The M100 is a full-fledged Android 4 device where all the APIs are accessible (except of course APIs to absent hardware, such as the phone or GPS). From a developer perspective, it does not get any better than that!

Barcode and QR code scanning

As James Lavery points out in a recent post, the main difficulty with scanning barcodes on the M100 (or any smart glasses for that matter) is the distance between the camera and the barcode. With a smartphone, one brings the camera to the code, hovering at about 10cm until it is read. With smart glasses, bringing a small object within 10cm of your face in order to scan its barcode would be awkward. Bringing your face 10cm from a barcode in a hard-to-reach place is simply not going to happen.

Unfortunately, the size of a barcode in the camera image is inversely proportional to its distance from the camera. Even just at arm’s length (40-50cm), most barcodes are too small to be read by standard decoders.
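A quick pinhole-camera calculation makes the problem concrete. The focal length below is a hypothetical round number chosen for illustration, not a measured M100 parameter, but the inverse-proportionality it demonstrates holds for any camera:

```python
# Pinhole-camera sketch: the apparent width of a barcode shrinks in
# inverse proportion to its distance from the camera.
# focal_px is a HYPOTHETICAL focal length (in pixels), not an M100 spec.

def image_width_px(object_width_mm, distance_mm, focal_px=1000.0):
    """Apparent width in pixels of an object seen head-on (pinhole model)."""
    return focal_px * object_width_mm / distance_mm

# 2cm version-6 QR code: 41x41 modules, so ~0.49mm per module.
qr_width_mm = 20.0
modules = 41

for d_cm in (18, 25, 50, 75):
    px = image_width_px(qr_width_mm, d_cm * 10)
    print(f"{d_cm} cm: {px:.0f} px wide, {px / modules:.2f} px per module")
```

With these assumed numbers, each QR module spans fewer than two pixels beyond about 25cm, and barely one pixel at 50cm, which is why standard decoders give up well before arm’s length.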

The following images illustrate the problem. The QR code is version 6 (41×41 modules), low error correction, 2cm wide, or about 0.5mm per module. The UPC is 3cm wide. The pictures were taken at (roughly) 18cm, 25cm, 50cm and 75cm. The right column zooms in on the barcodes in each image.



18cm (& detail)



25cm (& detail)



50cm (& detail)



75cm (& detail)

At 18cm, both codes are easy to read. At 25cm, the bars of the UPC start to blend together and may be difficult to read with standard algorithms.

How do we deal with this? The answer is a combination of user interface tweaks and specialized algorithms.

  • Zoom. A digital zoom does not add any information nor does it make the image sharper. However, it helps the user see what they are scanning and correctly position the barcode. It also limits the search area and lets the software focus on the area of interest.
  • Scan line. Our barcode scanner offers the option of displaying a red line at the precise location where the barcode will be read. Again, this gives more control to the user.
  • Specialized algorithms. Our original UPC/EAN scanner was developed for the iPhone 3G, which had far less CPU power and produced images that were, believe it or not, even blurrier than the ones above. Techniques tuned for low resolution and blur carry over directly to the M100.
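To illustrate the scan-line idea in miniature: a 1D barcode decoder ultimately works from a single row of pixels, binarized and collapsed into run-lengths of dark bars and light spaces. The sketch below is a deliberately naive toy, not our production algorithm, which must cope with blur, noise, and uneven lighting:

```python
# Toy "scan line" sketch: sample one row of grayscale pixels, binarize
# it against a simple midpoint threshold, and collapse it into
# run-lengths of bars and spaces -- the raw input to a 1D decoder.
# Real scanners are far more robust; this only shows the principle.

def scanline_runs(row):
    """Binarize one pixel row and return (is_bar, run_length) pairs."""
    threshold = (min(row) + max(row)) / 2
    bits = [pixel < threshold for pixel in row]  # True = dark bar
    runs = []
    for bit in bits:
        if runs and runs[-1][0] == bit:
            runs[-1][1] += 1          # extend the current run
        else:
            runs.append([bit, 1])     # start a new run
    return [(bar, length) for bar, length in runs]

# Synthetic row: space, bar, space, double-width bar, space
row = [200, 200, 200, 30, 30, 210, 205, 25, 20, 28, 22, 215, 220]
print(scanline_runs(row))
# -> [(False, 3), (True, 2), (False, 2), (True, 4), (False, 2)]
```

Drawing the red line exactly where this row is sampled is what gives the user the control mentioned above: they can place the line across the code instead of hoping the software finds it.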

The result? We can read the 3cm-wide UPC in all the images above, and the 2cm-wide QR code up to 50cm. The readable distance scales with the physical size of the code: larger codes can be read from farther away, smaller ones only from closer up.
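The same pinhole model can be turned around to estimate how far away a code of a given size remains readable. The focal length and the 2-pixels-per-module floor below are illustrative assumptions (roughly what a standard decoder needs), not measured M100 figures:

```python
# Inverse of the distance problem: given a minimum number of pixels
# per module that a decoder needs, how far can a code still be read?
# focal_px and min_px_per_module are ASSUMED values for illustration.

def max_distance_cm(code_width_mm, n_modules,
                    min_px_per_module=2.0, focal_px=1000.0):
    """Farthest distance (cm) at which each module still spans enough pixels."""
    module_mm = code_width_mm / n_modules
    return focal_px * module_mm / min_px_per_module / 10

print(max_distance_cm(20.0, 41))   # 2cm version-6 QR (41 modules)  ~24 cm
print(max_distance_cm(30.0, 95))   # 3cm UPC-A (95 modules wide)    ~16 cm
```

Under these generic assumptions a standard decoder gives up well short of the distances quoted above, which is exactly the gap the user-interface tweaks and specialized algorithms have to close.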

Pic2shop demo (available for download)

We have built a custom version of our comparison shopping app pic2shop for the Vuzix M100. It is designed to scan regular UPCs and EANs, as well as QR codes. If the M100 is connected to the internet, pic2shop will show the product name, image and shopping information.

Here is the scanner screen:


and the result:


The application package (apk) is available for download here. It is a time-limited demo, so be sure to check back here for updates.

A few more things to note:

  • Use the volume up or down buttons to launch the scanner.
  • Pic2shop only scans EAN13, EAN8, UPC/A, UPC/E and QR codes.
  • Like the regular pic2shop, this demo exposes a barcode scanner Intent, so you can call it from any other app. A sample “client” application is available on GitHub. Since it is a time-limited demo, please do not use this mechanism in shipping apps, but contact us if you need a long-term barcode scanning solution.


I agree with James Lavery: field service (remote assistance, step-by-step guides, documentation), warehouse picking, and ticket scanning all seem likely application candidates. I firmly believe that the most useful apps will originate from the users and from the developers who work closely with them.

With no keyboard and no touch screen, visual and spoken input will be even more important than on smartphones. It is also clear that very careful user experience design will be critical for adoption. Ideally, one should never have to use the buttons nor the companion app, except maybe at the start or the end of a task. I look forward to trying the gesture interface and Nuance’s speech recognition, which should both help a lot in this regard.


We are very excited to finally have our hands on powerful smart glasses and to be able to freely experiment with them. They are a great platform for the kind of image processing and pattern recognition software we are currently developing. Expect more cool demos in the future, and if you have application ideas, please drop us a line at