Information-rich Eyeballs? Talking to Microsoft’s Desney Tan about the Functional Contact Lens

Tom Cruise’s futuristic contact lenses in the new Mission Impossible movie may not be as far off as you think.  Desney Tan and Microsoft’s Computational User Experiences group have formed a collaboration with Professor Babak Parviz and his Bio-Nanotechnology Lab at the University of Washington to build a contact lens that provides the wearer with a fully configurable display of digital information.

The functional contact lens is part of Microsoft’s work on creating natural user interfaces that make interacting with computers seamless, so that people can focus on completing their everyday tasks while they’re on the go, or in this case, so that people with diabetes can monitor their blood sugar without stopping for a finger prick.

Recently, the functional contact lens team took advantage of the fact that the lens comes into contact with bodily fluid—tears, to be precise—to explore the lens's applicability for continuous glucose sensing. Tests show that blood-glucose levels can be measured via special sensors embedded in the lens. To do this, the team embedded tiny, flexible electronics into the contact lens: control circuits, communication circuits, active components (LEDs or glucose sensors, depending on the use), and an antenna. The contact lens receives power and communicates through the antenna with a nearby mobile device (e.g., an augmented smartphone). Glucose sensing relies on an enzyme-based electrochemical process that is sensitive to the glucose molecule. As the enzyme interacts with the tear fluid, bio-compatible electrodes on the contact lens register a change in current, and the resulting measurement can be sent to the mobile device for display and feedback to the user.
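To make that data path concrete, here is a minimal, hypothetical Python sketch of a single sensing cycle along the lines described above: the enzyme electrode's current is sampled, converted to an estimated tear-glucose value using an assumed linear calibration, and relayed to a paired mobile device. All names, constants, and interfaces are illustrative assumptions, not the team's actual firmware or API.

```python
from dataclasses import dataclass

@dataclass
class CalibrationCurve:
    """Assumed linear relation between electrode current and tear glucose (hypothetical)."""
    slope_mgdl_per_nA: float   # sensitivity of the enzyme electrode
    offset_nA: float           # baseline current with no glucose present

    def current_to_glucose(self, current_nA: float) -> float:
        """Convert a measured current (nA) to an estimated tear glucose (mg/dL)."""
        return max(0.0, (current_nA - self.offset_nA) * self.slope_mgdl_per_nA)


def sensing_cycle(read_electrode_current, transmit, calibration: CalibrationCurve):
    """One measurement cycle: sample the biocompatible electrodes, estimate
    glucose, and send the reading to the paired mobile device."""
    current_nA = read_electrode_current()                     # amperometric measurement
    tear_glucose = calibration.current_to_glucose(current_nA)
    transmit({"tear_glucose_mgdl": round(tear_glucose, 1)})   # over the lens antenna
    return tear_glucose


# Example with made-up calibration values and a fixed fake reading:
cal = CalibrationCurve(slope_mgdl_per_nA=0.2, offset_nA=2.0)
sensing_cycle(lambda: 45.0, print, cal)   # prints {'tear_glucose_mgdl': 8.6}
```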

We had the opportunity to talk to Desney, a senior researcher at Microsoft Research, about his fascinating work:

Assuming a glucose-sensing lens is created in your lab, what are the steps that take the lens from your bench to my bathroom counter? What sort of complications are there in scaling up production of functional contact lenses?

The team knew this was an ambitious project, but that's what made it so appealing. At a high level, there are several challenges that need to be addressed first. The team needs to understand how contact lenses are manufactured today so that they can project what changes to the process would be necessary to embed technology into them. And since this is a device people may actually use, there is a huge amount of work in figuring out how they can easily interact with it, and with the ecosystem of devices and tasks that may be connected to it.

Have you tried on any of the prototypes? What is it like to have an information-rich eyeball?

There is more work to be done before conducting human trials. In preparation, we have been simulating various experiences using technologies available today, such as wearable heads-up displays. The team envisions a way to automatically display important information—including abnormal glucose or insulin alerts—in the lens wearer's view. The functional contact lens gives us displays that we don't have to pull out and look at, and that don't require us to take our attention away from the real world. They also aren't quite as socially intrusive as the goggles that are the state of the art in the field right now.

What kind of algorithms convert the tear-based measurements to comparable blood-glucose values? Are you using algorithms published by other groups who have done research in the tear-glucose field, or are you using a novel system? How well do tear-derived glucose values correspond to blood-glucose values?

There are now various groups working on non-invasive measurement of tear glucose (with sensors embedded in contact lenses). Professor Zhang's lab has largely been using nanostructured optical probes embedded in hydrophilic hydrogel lenses, and they've had some successes recently. Each group is taking a slightly different approach (both in measurement and in conveying information to the user) and exploring a slightly different piece of the puzzle, which will hopefully lead us to better solutions much more quickly.
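As a rough illustration of the kind of conversion the question asks about, here is a minimal, hypothetical Python sketch that fits a simple per-user linear calibration from paired tear and fingerstick readings and then applies it to a new tear measurement. This is not the algorithm used by the team or by any of the groups mentioned above; the data and constants are made up.

```python
from typing import List, Tuple

def fit_linear_calibration(pairs: List[Tuple[float, float]]) -> Tuple[float, float]:
    """Least-squares fit blood = a * tear + b from (tear, blood) pairs."""
    n = len(pairs)
    mean_t = sum(t for t, _ in pairs) / n
    mean_b = sum(b for _, b in pairs) / n
    var_t = sum((t - mean_t) ** 2 for t, _ in pairs)
    cov_tb = sum((t - mean_t) * (b - mean_b) for t, b in pairs)
    a = cov_tb / var_t
    b = mean_b - a * mean_t
    return a, b

def estimate_blood_glucose(tear_mgdl: float, a: float, b: float) -> float:
    """Apply the fitted calibration to a new tear-glucose reading."""
    return a * tear_mgdl + b

# Example with made-up paired readings (tear mg/dL, fingerstick mg/dL):
pairs = [(5.2, 98.0), (7.9, 142.0), (10.4, 185.0)]
a, b = fit_linear_calibration(pairs)
print(round(estimate_blood_glucose(8.5, a, b)))  # rough blood-glucose estimate (about 153)
```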

There are two ways that I can view this project: the first is as an attempt to build a better glucose sensor, in which case questions of accuracy and durability matter a lot. The second is as a proof of principle that this kind of sensor technology can be built into contact lenses, in which case accuracy matters far less than the successful transmission of one set of data and the subsequent display of a converted form of the data. Which is the correct lens (pun intended!) through which I should be viewing this project? Are you building a meter for diabetics, or are you demonstrating the potential of ubiquitous computing?

Both. Our current project is to build a lens to monitor glucose levels for diabetes, but the team is always exploring other uses of this technology. We have only begun to scratch the surface of the opportunities that exist with this type of platform. The most important challenge is really the deep exploration of all the things not yet imagined with this platform, and of the new platforms enabled by this newfound capability to build technology in this form.

How does the collaboration between Microsoft Research and the University of Washington function? Who is responsible for what, and who owns the ideas in the end?

The Computational User Experiences group at Microsoft Research, which has expertise in mobile interfaces and medical sensing, collaborates directly on this research with Professor Babak Parviz and his Bio-Nanotechnology group at the University of Washington, one of the world's top labs working on the engineering and fabrication of very small, flexible devices, including medical sensors.

The Microsoft Research group, as I understand it, is a corporate-owned research center much like Xerox PARC was. Xerox PARC, of course, is famous for its many astounding innovations (Ethernet! The graphical user interface!), but also for never taking much commercial advantage of them. To what extent is this project intended to be commercialized by Microsoft? Is there an intent to eventually market and sell these contact lenses, or is the interest more akin to university research, where commercialization is a nice theoretical endpoint but not a primary concern?

Thus far, the team has mostly conducted bench tests. Given the sensitivity of the application space, as well as of the human eye, they are focused on fully developing the technology before moving to in-situ trials. At this point, the team is working hard to make this technology a reality and hopes to be able to push it out to consumers as soon as everything is ready.

Microsoft Research has been developing new technology and transferring that technology into key Microsoft products. In fact, our work touches nearly every product Microsoft ships, whether by contributing new core technologies, providing new algorithms, developing and sharing code, consulting with product teams, designing new user interfaces, or creating better developer tools.

Desney Tan is a senior researcher at Microsoft Research, where he manages the Computational User Experiences group in Redmond, Washington, as well as the Human-Computer Interaction group in Beijing, China. Desney was named to MIT Technology Review's 2007 list of 35 innovators under 35 for his work on brain-computer interfaces, and was named one of SciFi Channel's Young Visionaries at TED 2009 and one of Forbes' Revolutionaries (radical thinkers and their world-changing ideas) for his work on Whole Body Computing.

For more information, check out Medical Sensing via a Contact Lens and a University of Washington case study.

Emily Patton contributed to this article.

Karmel Allison

Karmel was born in Southern California, diagnosed with Type 1 Diabetes at the age of nine, and educated at UC Berkeley. Karmel now lives in San Diego with her husband, where she is loving the sunshine, working in computational biology at the University of California, San Diego, and learning to use the active voice when talking about her diabetes.
