[Full Transcript] The Upside Talks With ‘Father of Siri’ About AI, Sports

Last week, we had the honor of chatting with Adam Cheyer, the co-founder of Siri (sold to Apple) and Viv Labs (sold to Samsung), and a pioneer in the world of AI.

Throughout our conversation with the ‘Father of Siri,’ we touched on the legacy of digital assistants (and their role today), what it was like building Siri and working with Steve Jobs, and the implications AI will have for sports in the coming years, many of which will leverage the power of AR, VR, and more.

Here’s the full transcript from our conversation with Adam:

Julien Blin – Sports Tech Advisors: Hi, so today on our weekly Sports Tech Podcast, powered by the Sports Tech Advisors, we have the honor of hosting Adam Cheyer, the founder of Siri, the well-known voice assistant used by millions of iPhone users every day. As some of you may know, Adam is also the founder of Viv Labs, which was acquired by Samsung. So, Adam, welcome to the show.

Adam Cheyer: Thank you, glad to be here.

Julien Blin: Great. So, today, what I wanted to do was talk about how you started Siri at SRI, and then we will get your thoughts on the types of AI-based applications and experiences that you expect to see in the future, in the world of sports. How does that sound?

Adam Cheyer: Excellent.

Julien Blin: Great. So, for the audience who is listening — Adam, could you tell us about how you got into computer programming, and the world of AI?

Adam Cheyer: Sure. I guess it’s a little embarrassing, because it will date me. Back in the early 80s, the Rubik’s Cube had just come out, and it became very popular. And my brother and I were very interested in the Rubik’s Cube. We actually won the Northeast Championship two years in a row. But the Rubik’s Cube was really what got me into computers. A friend had learned just enough programming to teach me very simple things: input statements, print statements, go-to, if-then. And with that, my first program ever was one where you would type in the pattern of your Rubik’s Cube, and it would tell you how to solve it. So, that really got me started in high school. Once I had graduated, I decided to try to find something that I wanted to do in college. And for me, the human mind was the most interesting thing in the world. So, I went to a liberal arts undergraduate university, where I studied how we work from all different dimensions, including psychology, philosophy, linguistics, neuroscience, computer science, and mathematics. And it was just very, very interesting to see the magic that is the human mind.

Julien Blin: Well, that’s great. And I think I was reading an article about an interview, where you mentioned that your grandfather spoke eight languages or something? Is that right?

Adam Cheyer: Yes. Once I finished school, the next question was what do I want to do with the rest of my life, or my career. And I was inspired by my grandfather. He would walk down the street, he was learning his eighth language in his 90s by going to the library and reading a newspaper. And I decided that I wanted some of him, in me. So, I ended up going to work in Paris for four years, where I built-

Julien Blin: Yeah, I read about it.

Adam Cheyer: … a large AI system. It was actually deployed in 50 countries, and saved the company I was working for tens of millions of dollars every year. But yeah, that’s what led me to get away from the East Coast of the United States, where I had always grown up. I was trying to get a little perspective, like my grandfather had. He could talk to any person, in any language, it seemed to me. And he had seen so much of the world.

Julien Blin: That’s awesome. Actually, my great-grandfather spoke seven languages. He was a crew captain on a ship. So, I never met him, but it’s quite amazing when you-

Adam Cheyer: Yeah.

Julien Blin: … you want to speak so many languages, so that’s great. So, what was it like to work at SRI, with guys like Luc Julia, who I know, great guy, and Dave Martin? You guys were working on some of the coolest stuff, like TVs that had voice assistants and could let you control your house. What was it like to work there back then?

Adam Cheyer: Sure. So, I always say the first version of Siri that I created was at a company called SRI International. It used to be Stanford Research Institute, but then they just dropped the words and went with SRI. So, back in 1993, before I ever saw a web browser, I imagined this world where I said, “Someday, there will be computers around the world with content and services we want to access. And we need a way to discover all of those services, and to interact with them.” So, I started with a vision. My idea was that you would have an assistant that you could talk to, that you could interact with in various ways, including pen, and writing, and clicking on a computer screen. And you would tell the assistant what you wanted to know, or what you wanted to do. And the assistant would break your request down into sub-tasks, know where all the right content and machines were around the world to execute those tasks. And it would route the request, gather back information from those computers, interact with the user as needed, learn from those interactions, and help the user get their job done. And so, I built a system like this in 1993, before I ever saw a web browser.

Adam Cheyer: So, I never conceived of the internet as having hyperlinks and multimedia web pages. I thought you’d just have an assistant. And we worked on that with Luc Julia. So, in 1994 he came, and he was really the user interface guru. He has a PhD in multimodal systems, and had done a lot of work in speech recognition and handwriting recognition. So, he helped me add these really cool user interfaces on top of this distributed architecture, for this next-generation internet, before the internet as we know it existed. And I worked on that with David Martin, and also with Didier Guzzoni from EPFL in Switzerland.

Julien Blin: That’s great. So, in a way, you guys were pioneers, right? You were building the stuff that has become the norm today. Would you agree with that?

Adam Cheyer: I think so, it was certainly ahead of its time. Because the idea of an internet that you could talk to, to get everything done, is still, 25 years later, an interesting idea today. So, we do have things like Alexa, and Google Assistant, and Cortana, and Bixby, and Siri, that you can talk to, to do a few things. But the world that we imagined back at SRI in the 90s still has not come to pass. I think it’s getting closer and closer every day.

Julien Blin: Yeah.

Adam Cheyer: A few examples. So, some systems that we built, Luc, and David Martin, and Didier, and I, in the late 90s: we built two systems for the house, two for the office, and two for the car, that all interacted with each other in a completely seamless way, because it was all built within the same infrastructure.

Julien Blin: That’s amazing.

Adam Cheyer: Some examples: we had an automated refrigerator that knew every bit of its contents, because of RFID tags. So, it knew what you had, and how long it had been there. It would help you plan your week’s recipes, your meal planning for the week, by looking at recipe websites and suggesting things that you could make with the ingredients that you had. And if you were missing an ingredient, you could order online using voice, and Webvan (this was the 90s) would deliver it to you. You could then walk into your living room and turn on your TV, and you had complete control of your house. So, when the phone rang, the caller ID of who was calling would actually display on the TV that you were watching.

You could answer it from the TV. Or, let it go to voicemail, and then say, “Play my voicemail messages.” And it would send the audio over the TV while you continue to watch the sports game, for instance.

Julien Blin: Yeah.

Adam Cheyer: You could ask any question that you had. So, “Hey, what’s the record of this team that I’m playing?” Or, “How many points did this player score last night?” And it would display it on the TV in a voice interactive way, while you’re watching the game.

Julien Blin: And that was in the 90s?

Adam Cheyer: In the, yeah, late 90s.

Julien Blin: Wow.

Adam Cheyer: And then you could also have your office system. So, if you had a meeting that day, the whole meeting was recorded, and notes were automatically kept in various ways, including from smart whiteboards and things of that sort.

And you could say to your TV, “Pull up my meeting notes from my two o’clock meeting today.” And you could browse your notes, picture-in-picture, while you continued to watch the game. Then you’d head to your car, and you’d be driving around, and we had a 3D model of the world that we could overlay. So, you would be looking out your windshield, and as you pulled up the hill, you’d see this building appear before you.

But in your augmented reality experience, it would label it: “This is this building.” If the refrigerator back home said you needed milk, and it realized you were passing a grocery store, it would remind you. And then you could either get directions to the grocery store of your choice, or just pull into the store that you were passing. So, it was this really whole experience, whether you’re in the office, at home, or in the car: multiple systems, all controlled by a single assistant, really, in a very multimodal way. And so, I think we’re getting there today. Samsung has committed to getting all of its billion devices, including refrigerators, washing machines, smart speakers, phones, heads-up displays for AR and VR, and more, enabled with an intelligent assistant called Bixby. So, maybe we’ll actually get, in the commercial world, much of what Luc, and David, and I prototyped back in the 90s.

Julien Blin: Yeah, and I know that’s your vision. And we’ll get into that in a few. So obviously, people know you as the founder of Siri, right? That’s your baby. So, how did Siri come about? Was it when you came back to SRI and worked on CALO with Norman Winarsky, a great guy by the way, from SRI Ventures, and with Dag Kittlaus and Tom Gruber? Was that when you guys came up with Siri?

Adam Cheyer: Yes, yes.

So, as I mentioned, I started on this odyssey in 1993, more than 25 years ago. And I had been building through the 90s all these different versions of assistants: assistants for your home, assistants for the military, assistants for robots, all these different versions.

We probably did 50 or 60 versions of Siri when working at SRI. And one of the largest projects was a government-funded research grant, something like $250 million spread over five years, that I was leading technology for. And the goal was to build an assistant that could live with you, and help you get your work done more efficiently by learning “in the wild,” as it was called. So, it wasn’t coded to have knowledge in it. It would acquire knowledge just by interacting with you, by observing you. And based on that knowledge, it would help you do your job, your office work, much faster. So, that was called Project CALO, and it stood for Cognitive Assistant that Learns and Organizes. We had, at the peak, 27 or 28 universities all reporting into SRI to build this one assistant. So, that was a giant research project. But over at SRI Ventures, a guy named Norman Winarsky was looking at commercializing some of the technologies that SRI was developing. And he had the idea to lure in Dag Kittlaus, who was an executive at Motorola, to become an EIR, an entrepreneur-in-residence, at SRI. So, Dag would wander the halls talking to different researchers, seeing what they did, and trying to see if he could make a business out of an idea. And when we met, he said, “Wow, an assistant that you can talk to, to do everything on the internet. That’s something that I could put a business plan around.” So, working with Norman, and with me providing the technology, again working with Didier Guzzoni and others, we built a prototype, and we were starting to put together a pitch to go to venture capitalists, raise money, and actually start this as a spin-out company. As part of that process, Norman wanted outside due diligence. And he contacted a guy named Tom Gruber, who was a well-known expert not only in user experiences, but also in artificial intelligence. And he was brought in to ask us all the hard questions that we might get asked along the way.
And it was a really funny meeting, because he came in super skeptical. He’s like, “Yeah, I’ve been around AI and these types of systems for decades. How are you going to do this?” And then I would answer. And he’s like, “Well, yeah. But then, how could this work?” And then I would show him in our prototype how it worked. And by the end of this two-hour meeting, this guy who was supposed to ask us all the hard questions had heard us answer them so well that, walking out of the room, he goes, “Do you need another co-founder?” And so, Dag Kittlaus, Tom Gruber, and I were the co-founders of Siri. We started out as a company. We built, over two years, an app that we launched free in the App Store. We were about 20 people in the company total. And, yeah, it was a pretty exciting time.

Julien Blin: When did you actually know that you had something that could impact millions of people? What was the aha moment? When did you have that conversation all together?

Adam Cheyer: So, yeah, that’s a really good question. I’ll tell you almost the reverse of the aha moment. So, we had this prototype, and it was working pretty well for simple queries, like, “Find me a French restaurant in Palo Alto for two people tonight.” And it would go and understand that request, and contact OpenTable, and find the reservation, and actually make the reservation. But it was a prototype. And so, then, as we were building it for real, we needed to load in 20 million business names in the United States. And we realized every word in the English language is a business name, literally. And so, all of a sudden, our system, which worked so well before we loaded in this real-world, messy, huge data... we load in this data, and nothing works. I typed in the most basic command, which was, “Start over,” which would reset the system. And it said, “Looking for businesses named Start and Over in Louisiana.” And I’m like, “Wow, what just happened?” The ambiguities in language are incredibly complex. So, if I tell you, “Book a four star restaurant in Boston,” you instantly know what it means. But when you think about it, “book” can not only mean a physical book, or the verb to make a reservation; Book is also a city in the United States, and so is Star. And there are 13 different Bostons in the United States. So, which city am I talking about in that sentence? And “Star Restaurant” is the name of a restaurant, but I’m not talking about that restaurant.

Julien Blin: So how do you know the AI understands those nuances?

Adam Cheyer: Exactly. The ambiguities are exponential. They just combine in huge ways, and you’re trying to do this with all this massive data: all the places in the world, all the business names in the world, all the movie names in the world. There were actually two different movies that came out called Nine. One was the numeral 9, and one was written out as the word Nine. Those were the movie names. So, you could say, “I want two tickets for Nine at 7:00,” and you had to understand what that meant. So, the complexity of language in the real world was very difficult. Google and others could put words into a search engine, and it would return popular documents with those words on them. But to do this as a broad domain, meaning it can handle many things, from sports, and movies, and directions, and restaurants, and everything you would want it to do…

To handle the ambiguities implicit in the language, and to do it with conversation, and context, and follow-ups, was incredibly difficult. I think most people in the industry didn’t think it was possible. And so, when we managed to climb back out from having “start over” completely fall apart, and we were now able to understand the complexity of all of these different requests, from many different industries and many different styles, in a robust way, I think that was our aha moment, where we said, “You know, I think we’ve done something that no one thought was possible, and that’s going to change the world.”
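The combinatorial blow-up Adam describes can be sketched in a few lines. This is a toy illustration only, not Siri’s actual parser: the candidate word senses below are made up for the example, with the one concrete figure (13 US cities named Boston) taken from the conversation.

```python
from itertools import product

# Candidate interpretations for each ambiguous token in
# "Book a four star restaurant in Boston". Senses are illustrative.
senses = {
    "book":   ["verb: make a reservation", "noun: physical book", "city name"],
    "star":   ["rating modifier", "city name", "part of a restaurant name"],
    "boston": [f"Boston #{i}" for i in range(1, 14)],  # 13 Bostons in the US
}

def count_interpretations(senses):
    """Every combination of per-token senses is a candidate parse."""
    total = 1
    for options in senses.values():
        total *= len(options)
    return total

# Enumerate all candidate parses the system would have to disambiguate.
interpretations = list(product(*senses.values()))
print(count_interpretations(senses))  # 3 * 3 * 13 = 117 candidate parses
```

Even this tiny three-token example yields 117 readings; with 20 million business names loaded, pruning that space robustly is the hard part.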

Julien Blin: Yeah, I can’t imagine how complex it must’ve been to do that. So, several weeks after you guys launched Siri, what was it like to receive a phone call from someone like Steve Jobs, asking you to come to his house to talk about Siri? Was it a complete surprise to you?

Adam Cheyer: Yeah, that’s a funny story. So, right, we had launched a free app, it was in the App Store. It was very exciting, because we had this system that would show logs of what people were typing. So, late at night, we had just launched the app, and we were watching these requests stream in from people around the world. And there was this huge cheer, because we knew that the types of things people were asking were actually things that the system did really, really well, and it was going to be popular. And people started to tweet. One of my favorite tweets was, “It was like someone came out of the future, and said, ‘Here, you’ve got to try this app.’” So, that was one of my favorite quotes. So, we were pretty happy, we were excited to have launched this app. And then, two or three weeks later, we get this phone call on the phone of Dag Kittlaus, our CEO. So, he had an iPhone, obviously. I don’t know if you remember, but it used to be that when the phone would ring, you had to swipe to answer. But sometimes the phone wouldn’t pick up. You’d swipe, and it wouldn’t answer. Swipe, and swipe. And so, we get this call, and it says Apple on it. And Dag’s like, “Hey guys, come over.” And so, Tom and I come over, and he’s swiping and swiping, and it’s not answering. And finally, on the fifth or seventh swipe, it answers, and he puts it on speaker, and we hear this voice that says, “Hey, it’s Steve. What are you doing? Want to come over to my house tomorrow?” And we were like, “Wow.” It was just crazy.

Julien Blin: That’s awesome, yeah. That reminds me, you mentioned the iPhone. One of my friends, Andy Grignon, worked on the iPhone, right?

Adam Cheyer: Yes, I know Andy.

Julien Blin: The iPhone was not supposed to work, right? That’s awesome. So, then you went to his house. And then, what happened?

Adam Cheyer: Yeah, so we went over to his house the next day, and we talked for a few hours about technology, about the future. Right away, I got a sense of who Steve Jobs was. He had a fire burning in him, he wanted to win. And I was like, “Boy, he’s already changed so many fields.” He had not only reinvented mobile computing, but computing itself with the first multi-windowed systems, and then movies with Pixar, and music with the iPod, and app stores. It was just crazy. And you’d think he could just rest a little bit. But he had none of that in him. And he asked me a question. He goes, “Do you think Apple should buy …” And I’m not going to reveal the company. “Do you feel Apple should buy this company?” And I said, “No, I don’t think so.” And he’s like, “What? No, why?” And we started going at it on the very, very first day. And I’m like, “Well, here’s why: A, B, C, D, and E.” And he’s like, “Okay. Well, maybe. I’m going to think about that.”

Julien Blin: Did you try to give your opinion? Or, was he the kind of person who listened, or no?

Adam Cheyer: So, the thing I loved about Steve is that he had opinions, but he never felt that he was necessarily right. He was always open to hearing another perspective, and thinking about it. And if you couldn’t defend your position in a reasoned way, he’d be like, “I don’t have time,” and he would knock you aside. But I never had a problem with Steve. He would ask me things, or we would talk about things.

And as long as you knew what you were talking about, and had the data to back it up, he was always open to considering that new idea, that different perspective. And that was my favorite part of Steve Jobs. I always felt that I could disagree with him. Like, when we worked on Siri, a lot of it we agreed on, and a lot we disagreed on. And he would say, “You know, I’ve heard you. I don’t think that’ll be a problem. We’re going to do it this way instead, and here’s why.” And I was always cool with that. Looking back, there are still times when I say, “You know, I think I was still right.” And there are many times when I look back and say, “You know, he was actually right.” Looking back at that disagreement, history showed that he was right.

But his method was always listening, thinking about it, hearing you, and then saying, “No, we’re going to do it this way, for this reason.” I loved that about Steve Jobs.

Julien Blin: Yeah, I think a lot of people maybe had the misconception that he was not listening to people, right? But what you’re saying is that he would take feedback, when it came with some reasoning, right?

Adam Cheyer: Absolutely, he was a great listener. So many leaders feel they’re the smartest person in the room, that they’re right; they have their opinion, and it’s very hard to change their opinion. He was always looking to learn from someone else, someone smart. He wanted to be right. He wanted to win, and that meant he had to learn, and think, and listen.

And I thought that was a great attribute about Steve Jobs.

Julien Blin: That’s great. So, obviously millions of people are using Siri every day, all right. So, how does it feel to get up in the morning, knowing that so many people are using your product? It must be a pretty awesome feeling, right?

Adam Cheyer: Yeah, it’s super cool. I think, I counted that Siri has been on over a billion iPhones.

Julien Blin: Is it really?

Adam Cheyer: … already, over time, right. So, I summed up the published numbers. Because it first came out on October 4th, 2011, on the iPhone 4S. And Steve Jobs died the very next day, October 5th. So, you can definitely say that Siri was Steve Jobs’ last baby, his last big creation. And Tim Cook announced that, really, the iPhone 4S was just the iPhone 4 with Siri and a slightly better camera. The next two quarters were actually the greatest quarters in technology history, in terms of revenues and profit margins.

Apple’s stock went from $350 to $650 in six months, just selling the iPhone 4S. And Apple became the most valuable company in the world, surpassing Exxon at the time, in terms of market capitalization.

I think it’s clear that Siri had a big impact for Apple. It’s been on every phone since the iPhone 4S, and there have definitely been more than a billion, 1.2 billion, phones sold, just phones. Now, Apple has, of course, put Siri in many of their devices, including the Apple Watch, Apple TV, and the Mac itself. So, it really is getting out everywhere.

Julien Blin: Well, I tell you. Back then, I was working for Samsung. Well, no, let me go back here a bit. So, I was working as an analyst at IDC, and when Steve Jobs announced the iPhone, the CEO of Microsoft said, “It’s not even going to work. It’s not going to be a success.” And that wasn’t the case, right? It became a huge success. And then, when I joined Samsung back in 2010, we had such a hard time competing against Apple, because of the beauty of the iPhone, right? And they were trying.

Adam Cheyer: Yeah.

Julien Blin: It was so hard to try to replicate the UI, the perfect UX and UI integration. And, of course Siri was kind of tying everything together.

Adam Cheyer: I’ll tell you one funny story about how it feels to be an entrepreneur. So, when we had just started the company, we were still pretty small. I walked into an Apple Store, and there on the wall were icons of the biggest apps. So, there was Google, Skype, Pandora; all of these big, powerful companies had their apps on the Apple Store wall. And as an entrepreneur, we were a couple of guys just starting out. And I said, “Someday, Siri is going to be an app on the Apple wall, right next to Google, and Skype, and Pandora.” That was the biggest dream I could have, and it seemed so impossible, so crazy. Like, “Wow, we’re just a bunch of people. How are we going to do something as important as a Google, or a Skype?” And then, of course, when Siri came out on the iPhone, a bunch of people from my team said, “We have to go to an Apple Store, to just see if people are trying it, and liking it, and how it’s working.” And so, it was pretty exciting.

And we walked up to that same Apple Store, and now, next to the front door, they had this giant plasma display made up as an iPhone prototype. And above the plasma display, it said, “Introducing Siri.” And they had Siri use cases playing on a loop. And I got this chill, where I remembered so clearly wanting to be one of those app icons on the wall, and now I was the front door. The dream became a reality, but so much more than the biggest dream that I could imagine, of being one little icon on the wall with lots of others. It was like, “Wow, this not only happened, it happened times a thousand. I’m the front door, not just one little icon.” So, it was a very, very cool experience.

Julien Blin: That’s awesome. So, obviously … Let’s talk about sports a little bit, because our audience is mostly comprised of sports and tech executives. And I know you’re a huge Warriors fan, right? I love the Warriors too.

Adam Cheyer: Absolutely.

Julien Blin: Yeah, there are many professional teams that are using chatbots, right? The Warriors are very good at that. Siri is also being used to let fans check on live scores, and they’re also putting Google Home and Alexa-type devices in stadiums, to improve the fan experience. So, in your view, what do you think will be the killer AI sports conversation or experience?

Adam Cheyer: Well, so, I’ve started a few companies, two in the assistant space, namely Siri and then Viv Labs, which sold to Samsung.

I also started another company called Sentient, which is a machine learning AI company. And I think AI, in general, will transform sports in so many ways, AI being both about optimizing decisions and letting athletes be their best selves.

So today, everything is instrumented. We have wearable technologies that capture every movement, every heartbeat, every little aspect of an athlete. We have video cameras that can watch the interactions of teams, so they know every position of every player, and who’s passing the ball to whom. And then they can number-crunch, and use machine learning to do data analysis.

To understand when this team plays best against other teams, what defenses work best, what player combinations work best. So, I think AI, in general, will absolutely optimize both the individual, how they perform, and teams. And, of course, an assistant as an interface to that AI knowledge base will be incredibly powerful. A coach will be able to say, “Bring up recommended plays that I should run in this situation,” or ask questions like, “What is the success rate?” Players, when they’re working out, can have a little earbud in their ear that’s giving them feedback on how to shoot better, how to position better, what they’re doing wrong. And they can almost get this real-time adjustment, where they can talk to the assistant, the assistant can talk to them, and the assistant becomes an automated, data-driven coach.

Of course, for the fans, it’ll let them pull up not only the score of the current game, maybe while they’re watching that game, or something else. But they can also tap into this huge amount of data, and be able to get new insights, check out history, engage, pull up news stories and Twitter feeds, and really tap into the full 360-degree view around the game that they’re watching, or the team that they’re passionate about. And, of course, interfaces will continue to evolve. So today, you watch at home on a TV, but the TV experience will evolve tremendously. And I think language, and an assistant who knows your preferences and has access to immense amounts of data, will really transform the fan experience. The last thing I’ll mention is, I know that betting is a big topic in sports. And it looks like we’re moving more and more to legalized betting in more and more states.

And as betting progresses, you’re not just going to bet on the score of the game. But there will be bets of all different sorts, right? I’ll bet that the first person to score a point in the half will be this person. And I bet that this person outscores the guy he’s guarding by this many points.

And there’s such a wide range of things that could be bet on, and gambled on live, as you’re watching the game. Having an assistant that’s automated, and knows your preferences, can help place those bets much more easily, or can assist you and bring up information to help you make good decisions in that space. I think that may also be another aspect of the game of sports that’s going to grow, and I think a conversational assistant interface, and the AI and data analysis behind it, will play big roles in that.

Julien Blin: I think you’re spot on, and we’re starting to see some glimpses of that. There are some companies trying to build some of those experiences, but based on what you just described, I think we still have a way to go. So, how many years do you think … When will we start to see those types of experiences? Are we talking about 5 years from now, 10 years from now? What is your take on that?

Adam Cheyer: Well, I think some of it in some form, we’re starting to see it already, right. Even in 2011 or ’12, you could ask Siri, “When is the next Warriors game, and what are the odds on that game?” And things like that, through an assistant.

It’s amazing we carry around in our pocket a super computer that we can talk to, that has access to the internet, and this huge array of data. So, we’re already living in the future, and have been for quite a few years.

That said, it’s going to grow: the data that we have access to, the intelligence that’s applied across that data, and the ubiquity of the assistant. So, I say that every 10 years, the way people interact with computers changes.

So, think about it. In the mid 80s, the PC came out, and we all had to learn about windows, and a mouse, and keyboards, and how to run a PC. 10 years later, the web emerged in the mid 90s, with hyperlinks and multimedia documents, and URLs, and back buttons. Approximately 10 years after that, out came the iPhone and the App Store. And now, we had to learn about downloading apps, and pinch-and-zoom, and all of that experience. And last year was the 10th anniversary of the App Store.

So, I believe we’re sitting at a point where we have assistants like Bixby, and Alexa, and Cortana, and Google Assistant. They only do a handful of things, and most of those things are programmed by the big companies themselves: “What’s the weather?” “Set a timer.” “Play a song.” Third parties are beginning to be able to create skills or add-ons to an assistant, but almost none of them are used; it’s not scalable. So, everyone will say, “Get me the weather,” but very, very few people use third-party skills. And I make the analogy of, “What if we had a web that only had bookmarks and no search engine?” That’s kind of where we are today, right?

Adam Cheyer: You use the 10 things that are in your bookmarks, but you’re really not accessing the whole rest of the web, because it’s not easy to discover.

Julien Blin: That’s right.

Adam Cheyer: … to have a scalable view. But now, companies like Samsung, powered by Viv technology — I’ll put a little plug in — we’re going to be coming out with a new kind of, I’ll call it, app store for an assistant, that we believe will scale in a whole new way. Last November, we launched tools at bixbydevelopers.com that are unlike any programming environment ever created before. Literally, the AI writes most of the program for you, and with you. It’s a really crazy experience as a developer. What this means is any company, any service, any use case that has a website or an app will now be able to plug in what we call a capsule. You’ll be able to create a capsule that will encapsulate your experience. It’s like a knowledge pack that can be plugged in to your personal assistant. And users will be able to discover them and buy them — if it’s not a free capsule, you can make money off of it.

And soon, we believe every industry, and every company will be using an assistant as a major component of how they offer their services. Just like they do today, with the web, and with a mobile app. And every user will now start using the assistant as a major way for getting things done. Not only for sports, but for everything they do. Because, it’ll be a truly scalable experience.

How long will that take? The marketplace will come out this year. Probably, competitors will be looking at this, and coming out with their versions. Just like when we came out with Siri, six months to a year later, all the other big companies had their Siri clones.

So, I believe a ubiquitous assistant, one that can do everything that we do on the internet today, or most things on the internet, will be here in full force within two years from now.

Julien Blin: Two years from now?

Adam Cheyer: And I think that will change, as I said. People will still use a PC, people still use the web, people still use smartphones and apps. But this will be a new component, and I think it will be a significant one. Because it’s just so easy, so frictionless to have an assistant who can handle multiple complex tasks for you — knowing your preferences, and being able to automate getting those tasks done.

Julien Blin: And that makes sense. And also, when we met in San Jose the other day, you mentioned that the issue today, from what I understand, is that the way Siri was built on the iPhone is different from the way it was built on the iPad, right? From what I understood. And that, therefore, you cannot really have a true cross-platform AI conversational experience. If you start asking a question to Siri on the iPhone, and then you continue the conversation on the iPad, it will not be seamless. And what you guys are building is going to enable that seamless experience. Did I get that right?

Adam Cheyer: Yes, that’s right. So today, the Siri on your Apple TV, and the Siri on your Apple Watch, and the Siri on your iPhone, and your iPad, they’re not all the same Siri. They do different things, and they know different things, right. If you have contacts on your phone and you didn’t sync them to your iPad, for instance, the Siri on your iPad will only know the contacts that are on that iPad. And so, our vision is we want one assistant that you can access, where the device is really just the context.

So, whether I’m talking on a TV, or a refrigerator, or a smartwatch, or a phone, I want to know that Bixby knows me. If I told it something once on one device, it should know it over on a second device, it’s the same assistant. And so, we’ve designed it in that way, that a user can just think of it as, “It’s just my Bixby. And the fact that I’m talking to it over a phone or a TV doesn’t really matter.”

Additionally, for developers, when you create your capsule, you don’t create different capsules for each device — that would be really hard to maintain. You create one capsule. This is the interface for my service. And within that capsule, you can make small adaptations for, “Well, what does it look like? It’ll look a little different on a watch than it does on a TV.” Or, if there’s no screen, the dialogue might change a little bit. But it leverages and maximizes reuse. So, you build one capsule, most of it is reused, but you can tailor it slightly — it’s the same capsule, but with a slightly different experience, depending on the device that you’re accessing.

Julien Blin: Yeah, that makes sense. So, you talked a bit about AR and VR earlier. Obviously, there’s a lot of hype about AR and VR, with companies like Oculus, and Google, and Magic Leap. So, for example, Magic Leap is actually working on an assistant for the Magic Leap devices, right. So, what does the future of AR, and VR, and AI altogether look like? And what kind of experiences? You talked about it a little bit.

Adam Cheyer: Yeah. So, as I mentioned, I think a new interface paradigm emerges every 10 years.

For me, AR and VR are not ready to go mainstream right now. There’s too much still to be figured out. But 10 years from now, I could see AR being the next interface paradigm. That, as you walk around in the world, literally, we can read and write every pixel that we’re seeing, and really create the experience, the augmented experience that we want.

So, in order for AR and VR … well, let’s focus on AR — to work well in my life, it needs to really be able to understand the world, and understand the physics of the world. If I place an augmented reality object into my physical world, it needs to look like and work like a real object. And today, we’re not quite there yet.

We also need the equivalent of an OS. I need to be able to point to things in this world easily. I need to be able to maybe have haptic feedback to interact with them. When Windows came out, we had dialogue boxes, and menus, and a file system. There were all these core base elements that were standardized, that all the other programs could be built on top of. And I feel for AR, those core principles — how do you search for things, how do you grab things, how do you manipulate things, how do you discover things, how do you store them, how do you find them, how do you reference them — still haven’t been figured out enough, and standardized. Since we don’t have the OS completely done yet, it’s hard to build lots and lots of applications for it.

And then, the role of a conversational assistant: I think, as you’re walking around the world and there are now augmented objects in that world, voice, and speech, and language are a huge component of that OS. You can talk, and say, “Hey, bring up certain information, and put it over here,” and point. And that should help tell you where in your visual display you want this information. So, I think it makes sense.

But to me, it still feels like we’re in the research phase for all of this. It just hasn’t yet materialized, and you need it all to come together. The whole thing has to be together, to be ready for prime time.

And then, once it’s solid, we can build all the applications that we would want. And so, like I said, historically, you look every 10 years, things change. I’m kind of penciling in 10 years from now, that the next big user experience breakthrough will be based on AR.

Julien Blin: That makes sense. And obviously, you created so many great products, like Siri. But what do you hope to accomplish in the coming years? You talked about that a little bit. But is it to create the third wave of innovative digital assistants, or to become one of the most creative magicians out there? I know you like magic, right? Or, maybe both?

Adam Cheyer: I do like magic as a hobby, especially with my son. But I don’t think that’s going to be my next profession. So, I’ve really been pursuing two different threads in my career, over the last 25 years. One, is the assistant thread, and I keep trying over and over again, from 1993, through the 90s, through the 2000s. And hopefully, we’re getting very, very close to what I imagined 25 years ago.

The second thread in my career came from a mentor. He’s perhaps the greatest computer scientist ever, but quite unknown. His name was Doug Engelbart, and he actually was the inventor of the mouse, of windows, of hypertext. He did something called The Mother of All Demos in 1968, where he showed shared screens, teleconferencing, interactive computing, editors, email, windows. Pretty much everything that makes up our personal computing and internet experience today, he demoed in ’68.

And the reason he created all this technology was not for technology’s sake. He said, “The world will be faced with ever more complex, urgent global problems. And unless we get better as a species at thinking and working together to solve global problems, we’re not going to survive.” And so, he dedicated his life and his career towards trying to augment human intellect, by forming high performance teams at a global scale, to solve problems. So, I’ve tried to do things like that.

I was a founding member and the first developer at change.org, which is the world’s largest petition platform — more than a quarter of a billion members. You see anything wrong in the world, you just go say, “I want this organization to make this change, and here’s why.” And if people like your idea, they click on the petition, and now you get a couple hundred thousand votes, and that shines a spotlight on that organization or that person to make that change. And every day, victories happen.

So, that’s one simple attempt to solve real world problems by harnessing the collective intelligence of humanity. I think, if I were to do a next company, a next project, I would swing back to that side. I’ve done a lot in the assistant space; I think I have more to do in the question of how we can collectively solve global problems better. So, my next project or effort would probably deal more in that space.

Julien Blin: It’s an awesome goal, right? Well, listen, we’re at the end of the show. But I just want to thank you very much. That was a great conversation. Thank you-

Adam Cheyer: Yeah, thank you.

Julien Blin: Thank you for being part of it. And I hope you enjoy the rest of your day. Thank you, Adam.

Adam Cheyer: Okay, thanks so much Julien, appreciate it.

Julien Blin: Thank you.