[Full Transcript] The Upside Talks about how AR contact lenses will revolutionize sports training

This week, we had the pleasure of interviewing Steve Willey, the CEO of Innovega, a great AR startup that has built AR contact lenses. Throughout our conversation, we talked about what it was like to build Innovega and, more importantly, what kind of live sports experiences — AR and VR contact lenses packed with statistical and biometric data — we’re likely to see in the coming years, and how they will revolutionize sports training.

Here’s the full transcript from our conversation with Steve:

Julien Blin – Sports Tech Advisors: Today on our weekly Sports Tech podcast, powered by Sports Tech Advisors, we have the honor of hosting Steve Willey, the CEO of Innovega, an AR startup that is transforming the world of sports training. Steve, welcome to the show!

Steve Willey, CEO of Innovega: Thank you. Thank you. It’s my privilege. Appreciate the opportunity.

Julien Blin: Great. So today what I wanted to do was talk about how you started Innovega. Then I want to get your thoughts on how your technology is relevant to sports like golf and cycling, and what you believe are gonna be the future types of AR training applications used through your AR contact lenses. How does that sound?

Steve Willey: It sounds perfect.

Julien Blin: Great. So for the audience who is listening, Steve, tell us about how you started Innovega.

Steve Willey: Yeah. What we saw was an opportunity to respond to a market we thought was going to be really massive. And this is going back to the inception of the company. We had worked together as a team before, and we had delivered these wearable display systems into the defense and government communities. Even there we delivered great gains, for example, for pilots of airframes and helicopters. But what we saw when we started Innovega was that there would be this fundamental growth from professional and government-related applications, where use is time-critical, literally life and death, exploding into the consumer market, where you now had 100x the number of opportunities in terms of applications and quality of delivery. But at the same time, it was equally clear that what was available to government and defense would not work at all for the consumer. The consumer was in many ways the most challenging customer, because they would expect fully normal, you know, comfortable, stylish wearable displays, what we call glasses. And because we’ve been spoiled by our big-screen TVs and monitors and tablets, they expect panoramic, photoreal resolution. So there we sat, saying, OK, the market that we had a good understanding of, and had delivered successfully into, was going to explode a hundredfold because new media, HD streaming media, was available. But the new user of that media, the consumer or prosumer, the golfer, the cyclist, the new wearer of these displays, was going to have demands that the industry would fail to deliver.

So think about Google Glass. It died because the consumer expected an amazing, useful panoramic display; well, they got a postage-stamp display. Even VR is failing today because consumers also expect something that’s cool and stylish and social, not a box on your head.

So that’s what we saw, and that idea, that a market would explode but the industry would fail to deliver to it, would hit a brick wall, enabled us as a technical engineering group to say: “This is the time. We can deliver a solution that will be entirely unique and of very high value to these consumer and specifically sports markets.” That was a market that excited us a lot, and I can tell you more.

Julien Blin: That’s great. You talk about Google, for example, and Google and some other companies tried to crack the code of AR contact lenses, as people call them, by putting LEDs on the lens, but it didn’t work. So what is your approach? You guys do not put any LEDs on the contact lenses themselves. Can you explain to the audience how you did this?

Steve Willey: Yeah, well, what Google and others realized in this work is that if you took a display, you know, a wearable display, and put it very close to the eye, in fact on the eye in a contact lens, then you would have the benefit of a tiny display embedded in the lens, and it would appear large to the wearer. Take your hand: if you bring it closer to your eye, your hand looks very large.

What they missed was that putting LEDs and electronics and batteries and such in a contact lens is highly problematic. With the FDA, every lens has to be approved by them, and the FDA is most concerned about safety. Putting those types of materials and components so close to the eye, in fact in contact with the eye, essentially, we felt would not be acceptable. And that’s the type of problem they ran into. They ran into another fundamental problem: your eye relies on oxygen. I mean, it’s like your skin; if you take oxygen away from your skin, your skin will die. Well, your eye is the same. So when you develop contact lenses, oxygen has to go through the lens to the eye to keep it healthy. But if you load the lens up with LEDs and electronics and batteries and stuff, you don’t achieve that. You block the oxygen. So for many, many reasons, that solution was never going to work. And after maybe millions, many millions of dollars, I understand it was abandoned.

So our solution was: well, let’s stop for a second. We’ve got eyewear. We’ve got available media, HD, 3D, Wi-Fi, and we can push that into this so-called eyewear. But in the eyewear, there’s a lens that’s required. There’s a media display, but there’s also a lens, and the lens enables the wearer to see the display just half an inch from the eye, and it’s actually the lens that causes the whole problem. So rather than putting the entire display into a contact lens and having all the limitations I described, and there are more…

Julien Blin: You are right… And I don’t think I’d be comfortable putting on AR contact lenses with LEDs that close to my retina.

Steve Willey: Yeah, well, there are heating issues, there are radiation issues. And in fact, you know, there are optics, there are just optical performance issues. It would mess up your normal view of the world. So the evolution was: we’d just take the lens component out of the eyewear, out of these smart glasses, and drop that lens into a contact lens. This is what contact lenses are. I mean, they allow you to view through a lens, and it corrects your view of the world.

So we took a modern approach, a soft, disposable contact lens, and we modified it so that it would correct your vision. But more important, the media that would sit in your glasses half an inch from your eye, full HD, 3D, full see-through, everything you want, the lens would enable you to see that equally well. So now you had kind of a dual-focus lens allowing you to see the world, but also see your media, both in focus, simultaneously, all the time. And so our system is a combination of a pair of cool, normal glasses, like sunglasses, and a contact lens that you would wear. You’re now wearing your optics, wearing your lenses, which enables you to use and stream your media while at the same time improving your view of the world. So that was our approach, and it was the notion of using the contact lens for what it’s for, which is to act like a lens, to improve your view of something, as opposed to what Google and others were trying to do by putting in an entire display.

So it’s now been acknowledged that our hybrid approach was really pretty clever, and that we now have all of the miniature electronics in the glasses where they belong, the electronics and the display component in the glasses. They will get less expensive and tinier as time goes on, literally year by year. Google Glass had a tiny display; we just eliminate that limitation by dropping the lens into a contact lens, and it can now provide all of the visual performance that you would require. That was the epiphany, and that’s what we’ve implemented. And we believe that as we look to the future, this notion of a matched combination of smart lenses, which deliver prescription vision and also stream your media, plus smart glasses that look like normal glasses, is going to become, you know, normal as we move forward. Instead of throwing on sunglasses, which modify your world, and eighty or ninety percent of us have a pair of sunglasses, you’ll have smart glasses. The smart glasses will enable you to view your world better. Lenses and glasses enable you to better view your world, but at the same time they bring you media. Which brings us to the training applications that you touched on for golf, cycling, et cetera.

Julien Blin: Yes, I think your hybrid approach is very smart. This week we have one of the biggest golf tournaments in the world, and you guys intend to accelerate golf training by allowing golfers to get an instant comparison of the real view of the ball along with real-time metrics on how the ball is struck. Can you maybe elaborate a little bit on that?

Steve Willey: Absolutely. And you’ve got it right, as you described it. So there’s this terrific company called Zepp, for example; they put a sensor on your glove, and as you swing it informs you about your swing, which is obviously important. But if you’ve got to, say, pick up a smartphone and view it on a cell phone, the golfer has to stop, put the club down, pick up the phone, look at it, understand it, and that’s not ideal. What we know about athletes is that if you could provide real-time, instant data and analysis to players while the club is still in their hand, so they’re still watching the ball fly, or even putting, it would be ideal to give them the real-world view and the analysis and data instantly. They can correct their swing instantly. So what we do with our hybrid approach is we have the combination of glasses and lenses.

Here we have great sensor data from Zepp that measures the swing. So as the golfer swings, we have this data fully available, I mean the swing speed, the swing dynamics, the angle of attack on the ball. All this information is instantly collected the moment the ball is struck. It’s analyzed and made available, and instead of the golfer having to stop and view it after the game, after the practice, or view it instantly but on a smartphone or tablet, it is now in front of their eyes as they watch the ball fly. They’re being told: ‘Hey, you were three degrees off axis; you came at it at X miles an hour as opposed to Y miles an hour.’ And if you have the virtual coach, it would say: ‘If you had moved your left foot back half an inch…’ So this is instant information provided to the golfer who is watching the ball fly.

There’s this confidence level and an ability to say: OK, this is what the data is telling me, and this is precisely what I’m watching as a result. It’s kind of cause and effect. So now you have the real-time matching of cause and effect that humans are really good at. Humans are very good at responding to reality, if you will, and that matching is unavailable today. So our paradigm is to take the sensor technology that’s available today, and it’s amazing, if anything there’s almost too much data, and too much analysis in certain instances, and to instantly deliver it not only to the glasses but to the golfer or cyclist or runner who is looking to adjust their form, their stroke, to improve. Then on top of that is the whole virtual golfing, where maybe you’re at a driving range and hit five balls; you’ve got the data and the analysis, you watch the ball fly, and you’ve got the virtual coach telling you what to try again: ‘Move that left foot back two inches, rotate your grip five degrees.’ It’s that ability, that richness of instant comparison of the flight of the ball versus the analysis, that would be very powerful. And it’s not available today. Absolutely not available.

Julien Blin: That’s right. So, you know, I like your comment about the virtual coach. We had a podcast interview with Adam Cheyer, the co-founder of Siri. Adam has been a real pioneer in the world of AI and digital assistants, so it’s only a matter of time until we have those digital coaches, right? Coaches the players can get directions from, right? So I think we’re getting to a point where the technology is there.

Steve Willey: There’s absolutely no question. And there’s a whole business in that respect, because you’re putting golfers at the driving range or on the course; they’re striking the ball and they’re kind of lost. They struck it, it goes to the left, it goes to the right, and they really don’t know why. And like I said, the sensors are kind of interesting, because golfers could stop and try to get a sense of what’s going on. But that’s impractical if you’re hitting a hundred balls at a driving range or if you’re out on the course.

So the ability to get that instant feedback, mapping your data and your analysis onto the flight of the ball, with a virtual pro whispering in your ear, is so powerful. And if, in fact, you’ve been on that course before, you’ve got data from the last three times you played on it: you always went, you know, thirty yards to the left, and this is why. And now you could think about all of the golf pros building a business off coaching these amateur golfers who are looking to make progress.

Julien Blin: Yeah, and you mentioned getting real-time feedback on your play and so on, but I think another key component of that is gonna be providing the biometric data of the players, right? So for example, the digital coaching system would say: “You’re getting dehydrated, or your core body temperature is going through the roof, take some Gatorade to hydrate yourself,” right? That would be possible too, because players would be wearing smart patches and watches that would feed the coaching system in real time, right?

Steve Willey: It is not a stretch at all. I mean, we could imagine smart glasses getting real-time information from the user’s smart watch, which would be measuring the user’s blood pressure and heart rate. And, as you said, hydration is absolutely common sense. And you can obviously extend that to concentration.

So we’re tracking the eye. Is the eye fixed on the pin, focused on the club, looking at the ball? I mean, how many of us are told: watch the ball, look at the ball? And how many of us fail to do it? If you’re tracking the eye and you’re watching the eye move to the left and right and everywhere, well, you’ve lost your concentration. You’re not focusing on the two or three fundamental tasks at hand. So there’s no question. I gave an example of the sensor that’s attached to a glove or a club, but you’ll have multiple sensors, inexpensive, fully available, real-time, rich media, and it all comes down to the last inch, the last inch to your eyes and your audio. It’s vital, and that’s our whole interest. It’s that interface, that I/O to the human wearer, who’s got very high expectations, doesn’t want to look goofy, doesn’t want to be antisocial, doesn’t want to mess up their normal vision.

Golf is a frustrating game. I play a little, and it is the most frustrating game if you don’t make progress. So with this sort of technology, sensors and displays and media, the richness of that portfolio of technology, I should say, that’s where it could become fun again.

Julien Blin: Yeah, that makes sense. Some people argue that today there is no consumer market for AR glasses yet, right? And you have a unique form factor. How do you guys expect to change that and bring an AR experience that will enable consumers to wear regular glasses while enjoying a great AR experience, but also even a VR experience, right?

Steve Willey: Yeah, absolutely. I am often asked: what is the killer AR app? Because we’ve seen VR doing well, and Tim Cook or Mark Zuckerberg saying things like, ‘My goodness, AR, VR, it’s everything. It changes how we use technology, it changes how we use media. It will change everything.’ And then we see VR happening here and there, but not the vertical growth that we expected, and the whole AR thing, Google Glass and others, has disappointed us. And so you’re asking yourself, well, OK, I’m not seeing as much happening in this whole AR/VR/MR space as I thought; maybe there are no good applications. The answer is: whatever you’re doing on your smartphone, you would expect that on your glasses. That’s the application. We don’t have to think too hard about how we’re going to use these glasses. It’s more that, rather than pulling your smartphone out, again, you’re holding a golf club, and it would be pretty inconvenient to be taking a smartphone out in the middle of a golf course, sun shining, and expect to be productive and efficient with that. And so those hundreds of thousands of applications on your smartphone would make more sense if you have them inside your glasses at the right moment. They could be dedicated specifically to things like golf training and coaching, or they could be just generic, such as my video chat, my stock quotes, my email, my et cetera. So there will be no shortage of applications. And as we have conversations with the leadership of the industry, they’re being very honest with us and saying it’s not a shortage of applications. It’s an unwillingness of humans to wear something that’s goofy and bulky and heavy and uncomfortable, and to wear something that has a display which is no better than a smartphone. Give me something that’s big and exciting and valuable, but at the same time allows the person to be comfortable, et cetera. So we’re just not worried about that.
As a company, we worry about whether we grow enough and how we scale enough, because we’re now being discovered. The whole virtual coaching doesn’t happen unless you’ve got a very practical way of getting that real-time coaching feature to the athletes.

Our next step is to start demonstrating what we’ve been describing and to have live demonstrations where folks will wear lenses, wear glasses and instantly see the benefit of that combination of style and comfort and display quality.

That’s exactly around the corner for us.

Julien Blin: Great. Obviously you have your own startup, and startups are not easy, right? There are always challenges with startups. So what do you think are the major challenges that need to be overcome in the coming months for you guys?

Steve Willey: Startups, as you pointed out, have challenges, and they have unique challenges. In our situation, we’ve got a lot of experience in growing startups, so some of the pains of cash flow and the challenges around accounting and legal, we don’t have those, because we have done this before. But because we use a contact lens, there are regulations, and there’s a regulatory body called the FDA. So there is one thing we have to do that Google Glass did not have to do, and that Oculus with VR doesn’t have to do:

We have to get approval from the FDA, as we’re going to work with partners and deliver these cool, modern lenses that do something slightly different. So in our situation, we have to go down the path with the FDA, which we’ve begun doing with a very favorable response. The next step for our company is to begin the FDA trials, which are well defined because it’s a contact lens, not a new drug.

It’s a well-defined contact lens, and it’s a well-defined path. The second piece is that the FDA wants to know if we’re going to be demonstrating, if people like yourself are going to wear our lenses and glasses; the FDA would like to know that. So there’s also a separate, related path, which is to get approval to begin demonstrating, not with just sort of modeling or PowerPoints or emulation or ‘hey, here are the components.’ It’s now enabling folks to actually wear and experience our solution. It’s the demonstration phase, and we now have an immediate path to, really within the next several months, being able to demonstrate to those who are interested at that level. And on our side, we sort of have to pick our shots.

Now, you know, the Microsofts, the Googles, the Amazons, they’re all going to be very, very interested to see what we’re delivering, because it’s something that they failed to deliver. For us and for them, as a cooperation, it’s now about what applications would be important. And I love this conversation, because sports and athletic training, or even athletic performance, could be enhanced so radically by this ability to match actual performance, and the effect shown in the data and analysis, to this AR contact lens experience.

So I hope one of the partners that I’ve just described is going to conclude that there should be a sort of athletic program where we are training, and participating in the training and coaching, of athletes who are looking to improve their sport.

Julien Blin: That’s great. We have relationships with lots of world-class trainers and coaches and training centers. So what are your goals in the next 24 months? What do you plan to accomplish? Is it to team up with major companies like Amazon, Microsoft, Google, Magic Leap, and bring the technology to the masses? Is that one of the goals?

Steve Willey: Yeah, that’s a very good question, because our technology, our system if you will, could be used by really a breadth of participants, all the way from defense to government to prosumers, the golfers, to, you know, millennials, eighteen- to thirty-year-olds who just love their media. We have to pick, and to some extent, if you think about us partnering with an Amazon, they have their virtual assistant, or partnering with Apple, they have their virtual assistant too. I could see us having a conversation: what is the best fit, but also the best place to start? And one of the things that we have done on our side is to evaluate these different markets.

The enterprise market, the consumer market, the defense market, and a very interesting one, the patient market. Think about patients who are partially sighted, visually impaired, or hearing impaired; how can we provide benefits to that whole vertical, the whole patient vertical? But that’s just an example. So we’re fully prepared to take our magic, our technology, to a partner and to launch into that patient vertical, assisting patients with impaired senses. But equally, we’d be fully prepared to package up our sports training application with any one of these parties.

And your list is my list.

It’s the Amazons, the Microsofts, the Googles, even the Magic Leaps, who are in our industry. And in our view, we clearly need a better smart glasses solution for consumers and athletes to accept it en masse. So, yeah, we will bring the tools, we’ll bring the optics, the system tools, to much larger partners than ourselves, and they will help us get direction; they in fact will lead that direction. And on your side, Julien, you have so many strong relationships within the sports space. I could come up with names, Adidas, Nike; there’s no reason why they couldn’t come to us and say: “It’s great what you’re doing with patients, it’s wonderful what you’re doing with the defense community, but guess what? We’re interested in offering enjoyment and performance to the sports market in general.”

That would be amazing for us, and it’s something that we would be very responsive to.

Julien Blin: Well, look, we have relationships with some of those people you mentioned, so I would be happy to help you there. We are actually arriving at the end of the interview, but I wanted to thank you for your time. It’s been a pleasure having you on the show.

Steve Willey: Thank you for the opportunity. I really enjoyed it, and good luck.

Julien Blin: Thank you.