Developing for Augmented Reality

Sponsors:

  • Mint Mobile: Voice, data, and text for less. Get free first-class shipping with code VTFREESHIP.
  • Thrifter.com: All the best deals from Amazon, Best Buy, and more, fussily curated and constantly updated.
  • Interested in sponsoring VECTOR? Contact sponsor@mobilenations.com

Transcript

[background music]

Rene Ritchie: I'm Rene Ritchie, and this is "Vector." Vector is brought to you today by Mint Mobile. Mint Mobile works just like your traditional US wireless service, but it is ridiculously inexpensive.

For example, you can get five gigabytes for three months for just $20 per month. Right now, you get three months free when you buy three months. You can even get free shipping on any Mint Mobile purchase. Just go to mintsim.com and use promo code VTFreeShip. Thanks, Mint Mobile.

This is our first developer roundtable. It's very much in the spirit of a podcast I used to do with Guy English before he was forbidden from doing tech podcasts anymore. The idea is we get a bunch of really smart developers together and talk about a topic that all of us love. Today's topic is AR.

First, I'm going to introduce my colleague, someone whose work I've admired a lot over the last few years. I always love the chance to podcast with him. He runs augmented reality and virtual reality coverage for iMore. I guess mixed reality now is a thing. Russell Holly, how are you?

Russell Holly: Hi there. I'm good.

Rene: Thrilled to talk with you today.

Russell: This is going to be a lot of fun. I'm excited to dive into this a little deeper.

Rene: We also have Michelle Hessel. How are you, Michelle?

Michelle Hessel: I'm good. How are you?

Rene: Good, thank you. What do you work on?

Michelle: I'm currently a research resident at ITP-NYU. For my research at NYU, I'm doing a lot of work in AR.

Rene: Perfect. James Thomson, I'm very carefully not pronouncing the p because you took the p and you put it in PCalc.

James Thomson: [laughs] I did indeed.

Rene: [laughs]

James: I've been working on PCalc for what's coming up on 25 years in a couple of weeks. To keep myself sane, I do silly things occasionally. In the most recent version, yes, there's some AR stuff tucked away in the About screen.

Rene: I think it's safe to say anybody who's followed your career knows that, one, anytime a new technology is introduced, you are one of the first people to adopt it, whether it's stuffing an entire calculator app into widget space or putting the Panic truck into an AR experience in a calculator app. Two, you do it all with cheeky Scottish aplomb.

James: [laughs] That's the only way I know.

What is augmented reality?

Rene: Russell, could you orient us, in case people aren't familiar with AR? Maybe the distinction between AR and VR, and the advent of mixed reality, what we're all referring to?

Russell: Virtual reality right now is probably a little more common from the techie perspective as a term goes. Basically, it's replacing your field of view with something else. Putting a headset on that replaces whatever it is that would normally be around you with another environment.

Augmented reality takes the environment that's around you and, as the name suggests, just messes with it a little bit. In its most recent iteration, through our phones, our tablets, and very specific headsets, it makes it so that you are almost looking through the thing that you are either holding or wearing, and are able to experience other things that are happening through a camera.

Rene: I had an experience that really made the distinction clear to me. It was a developer demo. I was holding up, I think, an iPad at the time. I was just looking at a library. It was a real library. It was giving me a live camera view, but there are a few extra things on it. I turned, and there was a door that didn't really exist.

When you literally punched your way through the door, it opened up into a second library that was completely not in the real world, but was marvelous and full of interesting things. I felt I could really just walk into it and cross over, like a portal, like "Stranger Things."

Russell: The portal examples, there are a ton of them if you go look through videos and stuff. Those are often the most visually jarring where you really get the feel for what's happening. They're definitely a lot of fun.

Getting augmented

Rene: Michelle, how did you get into AR?

Michelle: That's a funny question. I got into technology in general about two and a half years ago. Before that, I was working in marketing, but I decided to come to New York to do a master's in technology. Somehow, I found myself among programmers and started to get exposed to all these technologies.

I found myself very interested in the 3D universe. Of course, I did some VR. I guess I got more interested in AR recently when we had so many developments in the field. When Apple, Google, Facebook, and all these companies decided that they were really going to invest in it, and they released so many technologies.

Like many others, that was a big thing for me. I just wanted to experiment and see what I could do. Once I got going, I was just fascinated by the possibilities.

Rene: Was that true for you too, James? I know you like to adopt early technologies, but often it's in the context of PCalc, and it's places where PCalc makes sense. This was almost another world for you.

James: I first became interested when I tried VR, which was only last year, when I got a PlayStation VR.

Rene: Because you could be Batman. Same as me. [laughs]

James: Exactly, yes. That was, indeed, the high point. Once I tried that, it was clear to me that these kinds of interfaces are probably the future, be it 5 or 10 years away.

When Apple announced ARKit, I thought, "Well, this is something I'd like to play with." I had no experience whatsoever in doing any 3D graphics, or anything.

As an evening project, trying to not actually get in the way of real work, I started playing with SceneKit to do basic 3D graphics, and then I ramped up. I was really just playing and experimenting.

You could very easily say that what I did in PCalc is completely pointless because, amongst other things, you can create a virtual PCalc calculator in 3D space. The buttons still work, and things like that.

It's not really useful in any way, but the learning experience of how to get to that point was really what I was interested in.

Going Glass

Georgia Dow wearing Google Glass (Image credit: iMore)

Rene: That's the thing, Russell. You keep hearing everybody from Tim Cook, to Sundar Pichai, to Mark Zuckerberg talk about AR and VR as if it's the future.

Facebook famously invested all the money, [laughs] Mark Zuckerberg's reserve bank account, to buy Oculus. Apple and Google have either bought companies or built up the talent to make ARKit and ARCore.

They really do think that this is the future for a lot of people.

James: It's a fairly safe bet that Apple are working on a headset of some kind. The stuff that they're doing now with ARKit on the phones, while it's interesting and it has quite a wide degree of application, it's a testbed for the stuff that they're going to do for a future device.

Whether those future devices are going to replace our phones in 10 years, I don't know, but I could see that happening.

Michelle: Also, one thing that I read about two days ago that I thought was very interesting: Bloomberg reported that Apple is working on a depth-sensing technology, which should be introduced to iPhones in 2019 or so. In that sense, I have a feeling they are truly committed to augmented reality.

What they're doing now with the face tracking, the front camera in the iPhone, is a first step, like a test to see how people assimilate the technology. I really think in the near future we will see much more AR coming from these companies.

Rene: That's my question for you, Russell. We've seen this before. For example, Apple had Passbook long before they had Apple Pay.

If you looked at it, you could read ahead and say, "With Passbook it makes sense to have credit card integration, then it makes sense to have Apple Pay, then it makes sense to have person-to-person payments." By the time you get to the end result, you've built up all the infrastructure you need.

That's sort of what it felt like with this stuff for me. It's adding sensors, adding camera capabilities, and then adding ARKit. Even before you get any specific hardware, it lets all of us start playing early on.

Russell: What we're looking at here is Apple had some very clear questions that they wanted answered, and they used ARKit as a mechanism for doing that. Through that they have this entire app ecosystem that they can let developers loose on and figure out what else they don't have answers to yet.

The big thing that we're going to see initially is what problems can be solved with this. A lot of it goes way beyond facial recognition things, which are very important, and really moves into how we can improve things like turn-by-turn navigation using better sensors for motion tracking.

If someone's looking at their screen anyway when they're using turn-by-turn navigation in a city, for example, to give them more information, real world information, instead of giving them this flat map to look at.

A lot of those kinds of examples are things that immediately came from ARKit going out to the public. These are questions that I'm sure Apple is already working on, and so are several other companies, to build natively into their own platforms to create the kind of infrastructure that you're talking about.

Getting real

Rene: I want to dive deeper into the code-y, nerdy bits with James and Michelle in a second.

This raises two things to me. First, James, you were very self-deprecating when you said that it's just fun.

There's going to be a whole bunch of people who get educational benefits, especially people who are visual or kinesthetic learners. Having it in the environment is going to be a much better experience for them than having a representation of it on a phone, even if it is a really good one.

Russell, there were other attempts, too. Microsoft got HoloLens in prototype format on the market early, and Google had Project Tango. Were those different vectors to the same problem?

Russell: You can even take it a step back further with Google and look at Glass, which was never really aimed at being a consumer product, but they released it to this group of people to figure out what questions they didn't have answers to.

We wound up with two different ends of the same perspective from Microsoft and Google, in that Microsoft envisions a future without a computer. The computer is either a set of glasses, or something like that, that you wear. Google envisioned a world where the phone wasn't this thing that you used a hundred times a day to interact with things.

This prototype hardware was built to figure out how far into solving that problem we are right now. Apple's taking a much more cautious approach, and it makes them look much nicer right now.

Rene: [laughs] Like boiling the frog?

Russell: Right.

Rene: Michelle, how did you get started with ARKit, or ARCore, or the AR technologies in general?

Michelle: I do a lot of work in Unity. I love Unity for many reasons, but the main reason I love the platform is that it supports so many plugins and so many SDKs.

I was already working in Unity, and Unity has a very good partnership with a company called Vuforia, which is a marker-based type of AR. It's different from ARKit or ARCore in the sense that it needs a tag, an image, to trigger the digital content.

I've been experimenting with that technology for a while, and then when ARKit and ARCore were released a few months ago, they immediately had SDKs for Unity, which made the workflow, for me, much easier.

Rene: Was that the thing with you, too, James? It made it approachable?

James: I started from a very weird place, in that I was drawing alternate icons for PCalc. I had reached the limits of my skill in drawing, and I thought, "Well, I could probably do this in some 3D software."

I started playing around with things like Blender. By the end of that I had the PCalc icon as a 3D model, and I thought, "Well, I've got this 3D model. I could probably put this in the About screen as a little 3D thing that you could play around with and spin around."

I started that, and got into the 3D programming and the physics engine stuff of SceneKit. I was like, "Well, let's see what happens if I make some marbles and drop these on top of the icon."

I started each week building up a new part, or looking into a new part of the SceneKit APIs. They tie in very nicely into ARKit. There are some problems, which I'll come back to, but generally it was obviously built to work together.

Then I could start to do this sort of stuff in AR, as well, and play around with what it was like to interact with a 3D space like that.
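For the curious, here's a minimal SceneKit sketch in the spirit of the marbles experiment James describes: a static floor with physics-enabled spheres dropped onto it. The names and values are illustrative, not PCalc's actual code.

```swift
import SceneKit

// A static floor plus dynamic marbles. Attach the returned scene
// to an SCNView to watch the physics engine take over.
func makeMarbleScene() -> SCNScene {
    let scene = SCNScene()

    // A static floor so the marbles have something to land on.
    let floor = SCNNode(geometry: SCNFloor())
    floor.physicsBody = SCNPhysicsBody(type: .static, shape: nil)
    scene.rootNode.addChildNode(floor)

    // Drop a handful of dynamic spheres from above the origin;
    // SceneKit handles gravity and collisions.
    for i in 0..<10 {
        let marble = SCNNode(geometry: SCNSphere(radius: 0.02))
        marble.position = SCNVector3(Float(i) * 0.01 - 0.05, 0.5, 0)
        marble.physicsBody = SCNPhysicsBody(type: .dynamic, shape: nil)
        scene.rootNode.addChildNode(marble)
    }
    return scene
}
```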

Russell: Michelle, you brought it up, so I'm going to shift pace a little bit here. One of the things that makes ARKit, ARCore, and some of these others, the Facebook tech, so compelling is that it doesn't require a marker of any kind.

For those who are unaware, we had augmented reality before ARKit, ARCore, and Facebook. Nintendo had it built into the 3DS system. It required a QR code that acted as a marker to do things on their AR cards.

Can you speak a little bit about, in your experience, the upside and downside to using a lot of these markers?

It seems like a bunch of ARKit apps that I've used on the iPhone, they map an area. It seems like they lose it every once in a while. That didn't happen as much for me on marker-based systems.

Michelle: You mean you want me to talk about the downside of marker-based or marker-less?

Russell: Both, really. In your experience, having worked with both, what do you think is happening with both of them right now?

Michelle: Both have their own advantages. I've done projects with marker-based AR that are really exciting and engaging.

The thing is, like any technology, it needs to fit the project well. For some projects, it makes total sense for you to have a marker if you're doing something that it relates to.

What I mean by a marker, in case people are not so familiar with the term: pretty much, what you can do with Vuforia is take any image, a picture of anything that has some sort of pattern, and upload it to their database.

Then they do something in which they can calculate the distance between the points in the image. Based on that, you can link a digital object to it.

When you have a printed version of this image, or the object itself, in front of a camera, the camera is locating these points. You can move it around, and it's going to know the location and the angle it's positioned at relative to the camera, and so on.

For some cases that's a really good technology. For example, I did a project where my marker was a temporary tattoo. It made a lot of sense for you to have the marker on somebody's body, and then trigger things coming out of the body.

ARKit is more exciting, in some ways, because it's almost like the thing is part of the environment, and there's nothing visible that's triggering it. What's triggering it is the plane detection.

If you open the app, and you tap, then all of a sudden you can place something anywhere in the room. I feel like that becomes, depending on the content, more exciting for the user.
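A minimal sketch of the flow Michelle describes, using ARKit's Swift API: turn on horizontal plane detection, then place an object wherever a tap hits a detected plane. The scene view, the box, and the function names are illustrative assumptions.

```swift
import UIKit
import ARKit
import SceneKit

// Run a session that detects horizontal planes.
func startPlaneDetection(on sceneView: ARSCNView) {
    let configuration = ARWorldTrackingConfiguration()
    configuration.planeDetection = .horizontal
    sceneView.session.run(configuration)
}

// On tap, hit-test against detected planes and drop a box there.
func handleTap(_ gesture: UITapGestureRecognizer, in sceneView: ARSCNView) {
    let point = gesture.location(in: sceneView)
    guard let hit = sceneView.hitTest(point, types: .existingPlaneUsingExtent).first else { return }

    let box = SCNNode(geometry: SCNBox(width: 0.1, height: 0.1, length: 0.1, chamferRadius: 0))
    let position = hit.worldTransform.columns.3
    box.position = SCNVector3(position.x, position.y, position.z)
    sceneView.scene.rootNode.addChildNode(box)
}
```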

Again, I feel like both have their purpose. It's a matter of finding what's the best technology for the project.

Russell: My favorite -- not favorite because it was very good, just favorite because it was very silly -- example of marker-based AR in the real world: there was a company for a while that was using Vuforia to make these shirts that had a pattern on them.

When you pointed your phone at the pattern, one of the chestburster aliens came flying out of the guy's shirt. He walked around for, it must have been two years, wearing this shirt. It was so absurd. Every time I saw it, I loved it.

Michelle: Something like that is perfect for marker-based AR. You cannot have the same result using ARKit.

It's a matter of whether there's an image that should be triggering something. If so, yes, marker-based AR is your solution. If you want to put something on a flat surface, then you should go with ARKit or ARCore.
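The marker workflow Michelle describes is Vuforia's, inside Unity. As a rough native analog, ARKit later added its own image detection in ARKit 1.5: register a reference image, then attach content when the camera finds it. A hedged sketch, with illustrative names; Vuforia itself is a separate SDK and works differently.

```swift
import ARKit
import SceneKit

// Register a reference image as a "marker," then attach content to it
// when ARKit recognizes it in the camera feed.
final class MarkerDemo: NSObject, ARSCNViewDelegate {
    func run(on sceneView: ARSCNView, marker: CGImage) {
        // ARKit needs the image's real-world width, in meters.
        let reference = ARReferenceImage(marker, orientation: .up, physicalWidth: 0.1)
        let configuration = ARWorldTrackingConfiguration()
        configuration.detectionImages = [reference]
        sceneView.delegate = self
        sceneView.session.run(configuration)
    }

    // Called when ARKit finds the image; content added to `node` tracks it.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard anchor is ARImageAnchor else { return }
        node.addChildNode(SCNNode(geometry: SCNSphere(radius: 0.02)))
    }
}
```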

Layers upon layers

PCalc AR Mode Demonstration from James Thomson on Vimeo.

Russell: James, one of my favorite things about ARKit so far, having...I don't even know how many apps I've installed. Across a lot of them, one of the things ARKit does very well, and we don't talk about it a whole lot, is being able to tell the difference between stuff that's in the foreground and stuff that's in the background.

It has a name that for some reason has completely escaped me right now. This obfuscation detection, where if you place Apple's demo candle down on a table, and then you slide a coffee cup in front of it, the candle will stay behind the cup in a lot of cases.

Have you had the opportunity to experiment with that with your setup? Do you think that should be something that gets greater focus when building an augmented reality app?

James: I haven't actually seen that, myself, when I've been playing with ARKit. It's entirely possible that I've missed it.

The plane detection is quite slow. It's only horizontal planes currently. It would be nice in some future tech if the system could see walls, ceilings, and get a much better idea of the room it is looking at.

At the moment, if you have a floor plane, for example, and you drop a ball on it, the ball rolls away. When the ball gets to where the wall is, ARKit doesn't really know that it should stop at that point.

Russell: The ball keeps going as though there's no wall there.

James: Then it breaks the visual illusion to a certain point. You could see that with future sensors, if you had the front IR camera, depth camera, that kind of technology. If it was looking around and it had a much better idea of where it was, that's the goal, but I don't think they're quite there yet.
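A sketch of the setup James is describing: each plane ARKit detects gets an invisible static physics body, so a dropped ball lands on the floor. Since only horizontal planes are reported, nothing stops the ball where a real wall stands. Illustrative code, not PCalc's.

```swift
import ARKit
import SceneKit

// Give every detected plane an invisible static collider.
final class PlanePhysics: NSObject, ARSCNViewDelegate {
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let plane = anchor as? ARPlaneAnchor else { return }

        // A thin, invisible box matching the plane's detected extent.
        let collider = SCNBox(width: CGFloat(plane.extent.x), height: 0.001,
                              length: CGFloat(plane.extent.z), chamferRadius: 0)
        let colliderNode = SCNNode(geometry: collider)
        colliderNode.opacity = 0.0  // physics still applies when invisible
        colliderNode.position = SCNVector3(plane.center.x, 0, plane.center.z)
        colliderNode.physicsBody = SCNPhysicsBody(type: .static, shape: nil)
        node.addChildNode(colliderNode)
    }
}
```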

I'm now intrigued by this thing you've talked about. I'm going to have to look at the documentation again.

Russell: I looked as you were talking. It was occlusion detection. There are very few things that use it.

The explanation that I got when I asked someone else was that it was computationally very expensive because it has to run that depth map a lot more than just the one time when it's creating the image, which a lot of augmented reality apps do.

The closest thing to what you're talking about right now, as far as being able to tell what's going on in the room, is HoloLens, which does a version of that with the way it maps out a room before starting an app.

It actually has to physically map the room in order to do it. It doesn't do it on the fly. It's all early prototype stuff. It's not commercially available, or anything.

You basically walk around with this helmet on and get this 3D map of the space that you're trying to operate in before you even load an app in the first place. It ends up being a lot more complicated.

Being able to do this with the phone and have it immediately detect the wall would be amazing.
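For reference, the common SceneKit trick behind this kind of occlusion is geometry that writes only to the depth buffer, so virtual objects behind it are hidden while the camera feed shows through. A sketch of the technique, not Apple's implementation.

```swift
import SceneKit

// An "occluder": invisible geometry that still blocks anything rendered
// behind it, so real-world surfaces appear to hide virtual objects.
func makeOccluder(width: CGFloat, height: CGFloat) -> SCNNode {
    let geometry = SCNPlane(width: width, height: height)
    let material = SCNMaterial()
    material.colorBufferWriteMask = []   // write depth only, no color
    geometry.materials = [material]

    let node = SCNNode(geometry: geometry)
    node.renderingOrder = -1             // draw before regular content
    return node
}
```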

James: The other thing is occasionally it will lose the tracking and you'll get jarring jumps of stuff moving around. That's going to be a much bigger problem if, I'm assuming, we get to headsets.

Certainly, with VR, any time where you move your head and things don't move with it...

Russell: It's a bad time.

James: ...instant nausea, and it's bad. That's the gap that they need to get across before we can get to headsets, is the tracking needs to be perfect. Most of these systems, like the VR systems, and things, will do it by means of those external cameras, or pods, or something tracking your position in 3D space. Doing it entirely on the device is still quite tricky.

Michelle: In that sense, that's what I feel Google tried to do with Project Tango, too. I feel like Tango failed mostly because Google pulled the technology. Right now, there are two devices that support Tango, the Asus ZenFone AR and the Lenovo Phab 2 Pro, but they're devices that very, very few people have.

I've had the opportunity to try it. The phone's ability to track the environment around you is amazing. It's pretty incredible. It tracks walls. It tracks if you have a chair in the middle of the room. It tracks everything.

Unfortunately, very few apps were developed for Tango, and very few people have the phone. I feel like, in that sense, ARCore was a response to ARKit.

It was like, "OK, we're not quite there yet to release something like this, because not many people will have the opportunity to play with the app, so let's launch something that most people will be able to play with on the phones that they already have."

In a few years I feel like depth-sensing cameras will be almost like having the normal cameras that we have now in the phones. That kind of technology and that kind of interaction that you guys are talking about will be much more available.

Practical AR

Russell: Absolutely. Rene, you pointed this out in your review of the iPhone X, where Apple was already using ARKit in very subtle ways through the camera.

A large portion of how portrait lighting works, when it does, in fact, work, is using augmented reality. We saw another example of that in Clips, where it's built in now in this very subtle way.

You've been using a lot of these AR apps, as well. Do you find that you appreciate the more subtle approach to launching an app that is specifically for augmented reality, and doing the augmented reality thing, and then moving on to something else?

Rene: I have this pet theory, and I know I'm certainly not alone in it. AR, we talk a lot about it now because it's new and interesting, but it'll eventually become standard display technology, the way panels did for previous generations.

Everything had a panel -- your phones, your tablets, your computers -- and that was the way you perceived them and interfaced with them. AR is going to take a lot of that over.

Where AR becomes interesting, and maybe confusing for some people, is that you have ingestion, as well as expression. You can walk into an AR world, but the device can also suck in so much through computer vision, and through modeling, and all these things that it uses for things other than projecting you into that space.

Portrait lighting is a great example of it. It's doing all the depth detection, the mapping, and creating all these things, but instead of giving me deer antlers, whiskers, and stuff, it's projecting a lighting effect. People may not realize that it's the same technology, or they may not realize that things like autonomous driving are relying on all the same technology.

All of this stuff is powering the future. When you launch an app and it does these cool things, like portrait background, which is essentially what Clips is doing, it doesn't really even seem like AR to you.

All the computational [inaudible 28:31], it seems like next-generation apps. That is super cool, because it's becoming part of everything that we do.

James: Do you think there could be problems with Apple pushing AR very heavily now, when it's still in a very early stage? People will sort of say, "Oh, well, yeah, I tried that AR thing. It was..."

Rene: "It's got blurry edges," or "It makes you look like a paper cutout."

James: Or something like that. There was a big media splash over ARKit. There are a lot of apps that have appeared, but not many that you would use day to day. I have a concern over that, and over VR.

[crosstalk]

Rene: Correct me if I'm wrong, Russell, but in the beginning, it's always turbulence. Everybody rushes in, and they throw every bit of spaghetti they can at every wall they can, and then everyone starts to see what sticks.

Some of the stuff is remarkable tech demos, absolutely, but then you start to get the tools, the education, the imaging, and all the stuff that two or three years from now we'll look back and wonder how we lived without.

Russell: You've got this Wild West thing going on, especially in the App Store right now, with all of the things that can be done with ARKit.

Let me tell you, if you are listening to this podcast and think that what the App Store needs is another way to measure things in AR, please reconsider, because there are 45 apps that are digital rulers using ARKit.

They all run into the same kinds of pitfalls, unfortunately, which is that if the tracking fails, or something like that, then the data's not usable. There isn't a functional solution to that.

Apple ran into a similar "problem" when Force Touch came out. There were a couple of things where Force Touch got pushed into everything, and it was stuff that a lot of people wound up not ever using.

Now Force Touch is a part of the operating system. I found myself surprised at how frequently I use Force Touch to open the camera on my phone now, or to turn on the...

Rene: Can I confess that the flashlight button on the iPhone X is my new bubble wrap? I press it as stress relief. [laughs]

Russell: All the time. It's one of those things that I found myself genuinely surprised that I used it, especially because when Force Touch first came out I was definitely one of those people who was like, "Well, I'm never going to use this stuff," and then it got built into all of this weird stuff.

It was like every single thing that you could touch in Instagram for a while had a Force Touch variant, and it annoyed me.

Rene: Right now, I'm going to start scheduling the tactile interface podcast based on Force Touch and Nintendo Switch, because tactile interface is going to be a whole other dimension to all of this. [laughs]

Russell: It really is. I feel like it's the same kind of thing, where we've got this mad rush to see what can we make AR do. In about a year there will be a couple of really solid answers to that question.

Rene: Are you seeing that, Michelle, in your research? Are you seeing any trends yet, or is it still a lot of experimenting?

Michelle: This is actually a very exciting topic for me, because I feel like everything I've done until now was fun experiments, not anything useful. I feel like I'm at a moment where I'd like to see what I can do.

If we're hoping to create something of lasting value, we need to start thinking about what people want and what people need, not just what we technically can do, how far can we go with the technology.

I feel like, for example, AR has the potential not to make you more distracted and throw more things into the world, but to be, maybe, I'm just saying, a tool that will help you focus and pay more attention to something that matters to you.

We need to apply techniques of human-centered design and start rethinking the technology in a way that people will actually want to use on a daily basis, and not just go to the App Store, download this app, try it once, and delete it, or end up with 50 different AR apps on your phone that you opened once and then never opened again.

It should move towards the point where, like the Portrait mode, it becomes something that is part of your daily life.

Russell: I definitely agree with that. My favorite example right now, I mentioned it earlier, is the kind of turn-by-turn navigation using AR.

Google has a demo somewhere using ARCore where it's actually indoor turn-by-turn navigation. A regular office building, where you're able to hold up the phone, and there's this big green arrow that points you along the way.

There are a lot of places that Google and Apple both have indoor maps for: malls, libraries, and airports. Airports, in particular.

That's a huge quality of life improvement, to be able to pull up my phone and have a guiding arrow through an airport that I've never been in before, if I want to find a particular coffee shop, or a bookstore, or something like that.

If I have a need for emergency services, to be able to hit an emergency services button and have an ambulance service or a police service get an augmented map to where I am in that place, so that they can come directly to me, because my location is a known quantity and I'm attached to this network.

I feel like both of those things could be tremendously powerful when it comes to the day-to-day use.

There's also plenty of room for silly things. My absolute favorite thing in the last two weeks for ARKit, and it's actually your fault, Michelle.

[laughter]

Dancing Stormtroopers

Russell: There's this dancing Stormtrooper on Michelle's Twitter feed.

Michelle: The most useless AR app.

Russell: It's so great. The more I looked at it when I first found it, the more I was like, "The shadow in it is super cool, and the lighting in it is super cool." It made me happy.

There's still definitely room for silly things, as well.

Rene: Do you remember that early demo of ARKit with BB-8 running through the kitchen? That sold me, right there. [laughs]

Russell: It was great.

Michelle: In that sense, there is space for useful tools, like something you can put in an airport, and there's, of course, room for fun stuff.

Snapchat, for example, to me is one of the biggest AR companies that exists today, and we don't even associate it with AR. It's part of people's daily lives, like Instagram.

What you can do in Snapchat sometimes might not be like, "Oh my god, so useful. This is going to improve my day in so many ways. I'm going to learn from it," but it's a delightful moment for many people.

It's an app that does, in a very good way, what people want. It can be a purely fun experience, or an experience that will allow you to express yourself as a human being and connect with your friends.

In that sense, AR has a lot of potential, too. It doesn't have to be just a very useful, serious tool. It can be fun, but it should be something that puts users' needs and what people want first.

Rene: I get yelled at on Twitter a lot...I actually get yelled at on Twitter about a lot of things.

Sometimes when I talk about Google Glass people say, "That's not real AR. That's a personal screen," or I talk about Pokémon Go, and they say, "That's not AR. That's just a sprite on a live view."

All of these things, Russell, at least in the beginning they start the discussion and they get people more used to the idea of the physical and digital world coexisting.

Russell: Absolutely. And I say this as someone who was definitely one of those guys. I wore Google Glass for well over a year because it did things. Yes, it was more a personal screen than anything else.

The one thing that I absolutely loved doing with it was using it as my GPS. Instead of having my phone set up in a dock, I was able to use Google Maps and navigation through Glass.

I never had to take my hands off the steering wheel. I never had to look away from the windshield. It was an incredibly powerful experience for me, personally.

A lot of that comes from doing just that, taking me away from looking down at my phone while still giving me this information that's really useful. I feel like that's the core thing for augmented reality right now.

Even though right now I'm still looking at my phone, being able to look through my phone is an incredibly important thing.

James: Once we get to the ability to have a headset, that's where a lot of stuff becomes easier, because you're not going to be holding the device in your hands, which then restricts what you can be doing at that point.

Microsoft have had demos like this. You're trying to fix your sink, or you're trying to do some kind of...

Rene: [laughs] The server's down and you don't know how to fix it, but there's some guy at the beach who can.

James: Some kind of task that you're trying to achieve. Then to be able to have, within the world, some kind of annotation to say, "Switch this thing on first. Do this. Do this," and be able to use your hands to do it, then that's going to really get somewhere.

Because holding something in your hand when you're trying to navigate a city, walking around, yes, you're looking through your phone, but you're not 100 percent aware of what's around you.

I keep looking forward to the future. I don't know what it will be, whether we'll get something in five years, or what Apple's time scales are.

Rene: James, you know we'll have a mother box and a set of contact lenses. The mother box will handle local authentication and cloud connection, and then everything else will display in our contact lenses. I know you know this.

[laughter]

Michelle: That's some "Black Mirror" stuff.

[laughter]

James: I don't think the contact lenses are the way to go. The direct connection through the spine is probably...

Rene: That's five years later. We have to do this in stages.

James: Then you have to change the port in the back of your neck every five years when Apple changes the design.

Rene: Implant updates. I am not looking forward to that. [laughs]

Face to faceless

Russell: That actually does bring me to an interesting point. With the glasses, with a headset, one of the biggest criticisms of Google Glass was that it was not an attractive thing to have on your face.

I genuinely think the biggest problem that it had was that it was asymmetrical. We are, as human beings, very picky when it comes to that kind of thing. When something is on our face, there is a desire to have it be symmetrical in a lot of cases.

Rene: Also, as our colleague Georgia Dow kept bringing up, it got in between. We communicate with our face and with our eyes. It intermediated interpersonal relationships in the real world.

Russell: I'm curious to see how that challenge gets approached by other companies. We saw Snapchat with their glasses, which got weirdly compared to Google Glass all the time, even though it was just a camera.

Rene: I want to call them Snapticals, but I think they were Spectacles, right?

Russell: They were Spectacles. A big thing with those was that they were very symmetrical. They were very focused on style.

I'm curious to see what you collectively think, all of you, as to how that gets approached before we go to things like embedded chips, and things like that. They will eventually be glasses.

For starters, not everyone wears glasses already, so there's a level of discomfort there that doesn't exist with something like a phone. Also, how to make sure that those glasses are stylish enough that they're not immediately criticized for being a gadget on your face.

James: If the glasses are a display technology talking to your phone, or something like that, they might have some sensors in them, or whatever, but all the processing is done down on your phone. That way you wouldn't need quite as much weight up on your face.

Rene: I just imagined the Apple Watch 1.0 on my face, James. Thank you for that.

[laughter]

James: If you look at any of the headsets on the market, something like the PlayStation VR, which is a great, low cost, nice headset, you could either call it futuristic, or stupid, or both. If you had anything that's starting to look like that, nobody's going to wear it, at least not outside.

If you had something that looked like a normal pair of glasses, like, I'm looking down at Skype now and I can see a lovely picture of at least two of you wearing glasses. If it looks like that, nobody's going to have a problem, because lots of people wear glasses.

The more that gets added to that, the less likely it's going to be adopted. I don't know how you solve that problem with today's technology.

Michelle: I completely agree with you, James. I feel like that's one of the problems, even with doing VR.

When you put a headset on, and you know that there's a bunch of people around you, at least I feel so vulnerable and awkward, because I can't see people, where they are, whether they're looking at me. I know that I look awkward with the headset.

I feel like Google tried to do something when they came up with the Daydream. You can see that the design of the Daydream is much different, if you compare it to most headsets around. It has fabric texture. It's grey. It's nice.

It's super well designed, but even with that, it still feels weird to have a headset on and be the only person in the room. The same thing goes for HoloLens.

The HoloLens is a monstrosity that you need to put on your head. It's a gigantic headset. It has a computer embedded in the headset.

If we reach the point where we can put on normal glasses that are very similar to the ones we wear on a daily basis, I feel like that could potentially work. I don't want to reach the point where I have contact lenses or something that is permanently on my body. I hope I'm not here when that future comes.

[laughter]

Blacker Mirrors

Russell: I'm really curious as to how much damage to our collective psyche has been done by Black Mirror. As amazing a show as it is, there are definitely some hard lines that I have now when it comes to tech, where because of that show I've been like, "Oh, so now that I've seen this imagination of the worst-case scenario, I'm definitely never going to consider that."

Rene: "I'm wearing these contacts. Is that an alien, or is that my friend James whom I am going to kill? Is that an alien? Is that James?"

[laughter]

James: We will look back on Black Mirror in 10 years and think how naïve they were about how far it was going to go.

[laughter]

James: Somebody suggested a test. I can't remember who it was. It was, "Would you wear this device when going on a date?" If the answer is no, then it's not going to take off.

Russell: That's such a bad example, because I wore Google Glass everywhere. [laughs] I am the worst-case example for that.

Rene: You're adorable, Russell.

[laughter]

Rene: There's this other side to that, James. There's this theory that society keeps swinging backwards and forwards. We're like a pendulum.

One of the reactions to reality becoming less tolerable is that the Silicon Valley billionaires will eventually take over the government and give us all basic income and a phone and/or an AR/VR headset. We'll end up like WALL-E, just sitting there in the chairs, totally content no matter what's happening in the un-Matrix world.

James: There are some days that that seems quite appealing.

[laughter]

James: I don't know how we get from where we are to there. Probably, Apple has prototypes of stuff.

Tim Cook had made some comment fairly recently saying the technology isn't there yet to do this, but it's clear they're working on it. They're buying AR glass companies, and it's not just them.

I always get these companies confused. Is it Magic Leap, or is it Leap?

Russell: Magic Leap is the one with the headset and the really tragic set of stories, so far, about their project being largely vaporware.

James: That's the other thing I was just thinking of when I was talking about being able to have your hands free. Once your hands are free, you can start doing gestural interfaces and things to replace the touchscreen, or however you do it, as a way to input into these things.

At the moment, if it's on a phone you can touch things, and have buttons, or talk to Siri, or whatever. Once it's a thing on your face, then you need some way to talk to it that isn't just Siri.

The only thing worse than walking around with a giant headset on your face is walking around with a giant headset on your face, talking to it.

[laughter]

The problem is people

Taking a screenshot on iPhone X

Russell: It's definitely not a thing you're going to enjoy, especially walking around outdoors, when someone else leans in and shouts into your headset.

That was one of the things that made me take Glass off, was being on a train in New York. Someone leaned in close to me, and thought they were being very clever, and started shouting, "OK, Google, do this thing," half an inch from my ear.

Rene: That happened at CES. Someone burst into the room and started yelling things at the headsets.

Russell: I was like, "OK, I'm going to take this off now." That wasn't fun.

Rene: People.

James: I tried Glass. It was a Macworld party at WWDC about four or five years ago, something like that. I looked back at the photos recently and it doesn't age well, that look.

Russell: It doesn't. It really doesn't, but at the same time -- this is something that I come back to all the time -- none of the other companies that did stuff afterwards, including now, came close to even being as small as it was at the time, which was not very small.

It's really fascinating that of the companies that have continued to try headgear, that the Glass is still pretty high up there, as far as where the bar is, for how functional they have gotten to this point. It's really weird.

[background music]

Rene: We're going to take a quick break, so I can tell you about our sponsor that is thrifter.com.

Thrifter.com is an amazing team of people who scour the Internet day in, day out, looking for the absolute best deals. Mostly in technology, but also a bunch of other fun stuff like Lego stuff, Disney stuff, all sorts, the pillars of commercial goodness.

They compile them and explain them. They give you context. They give you details. They put them all up on thrifter.com all day, every day. Check it out. All their best stuff, Amazon stuff, Best Buy stuff, all of it without the fluff. Thrifter.com. Thanks, Thrifter.

[background music]

Augmented accessibility

Rene: One other question I had is accessibility with this stuff. You'd think that it was a huge accessibility win, but as I look more into it, and as we get between VR, AR, and mixed reality, there's so much you have to consider.

For example, some people don't have the ability to focus and converge on the same plane, which is why they can't watch 3D movies, but also why some VR experiences make them ill.

Especially things where the display is spinning, and your inner ear isn't. When they're not coordinated. They're fine with standard experiences, but when the things start to mismatch, they get ill.

Or the density of the display. Because you're so close, retina becomes a function of maybe 4K per lens to avoid the screen door effect.

Or, depending on the display technology, some people can apparently see the smearing. I can't, but some people can see the smearing on OLED, and that bothers them.

Russell, is there still a lot of technology and best practices that have to be overcome for that?

Russell: There's a ton of stuff that has to be done. This is actually one of the few things that Oculus did very, very well coming out of the gate with the Oculus Rift: making it clear that these were the challenges they were looking at, and that there were certain things that were immutable laws of making VR and AR work well. The first was a consistent frame rate.

Whatever you set your frame rate at, be it 30, 60, or 120 frames per second, you cannot, no matter what, deviate from that if it's something that someone has immediately in front of their eyes. Very much like the inner ear mismatch, as soon as the frame rate drops, your stomach is gone. [laughs]

There's no coming back from that. It will very quickly ruin experiences for a lot of people.

As far as the displays, we definitely still run into what's called "the screen door effect," where you're looking, and you can see the lines in between the pixels, because you're holding magnifying glasses up to these lenses in order to create these effects.

It runs into a problem, but with augmented reality, I look to the prototype Mira headset and the Lenovo Mirage headset, which is more commonly known as the Star Wars: Jedi Challenges headset.

The Lenovo Mirage headset takes your phone and bounces its image off of these reflective displays in a way that uses a really small amount of the phone's display to drive these larger-than-life experiences. There is no screen-door effect, or resolution issue, or anything like that.

I put an iPhone X in this headset. Then I put an iPhone 7 in. They both look the same. One is clearly brighter than the other, because Apple really nailed the brightness with the X display, but the resolution makes very little difference in the way it's designed, because of the way the image is being reflected and expanded based on your view.

It's a curious set of problems from a visual perspective, but I feel like AR is going to have a lot fewer of those problems to contend with than VR currently does.

Rene: Is that your experience too, Michelle?

Michelle: In terms of VR, I've had a lot of bad experiences with getting extremely nauseous. That has to do with the frame rate and with the locomotion in VR. AR has so many other problems that I don't feel like that's one of the main issues we have right now.

In the end, you're putting something, you're layering it on the existing world. You're facing other issues. I feel like the technical and the resolution and this thing about getting nauseous is very specific to VR.

AR is more about how do you truly merge the content to the world in a way that is more meaningful, in a way that it looks like it's truly there? That has to do with lighting and how the light hits your digital object, how it's reflected, and how it's hitting the walls.

There are many other technical aspects that to me are more important, and we should figure those things out before things like resolution and whether we're seeing pixels and so on.

James: The other point is that if we move to these kinds of interfaces as replacing the phone, there are going to be people who have accessibility issues dealing with that. Motion sickness with VR is something that affects quite a lot of people, but there are going to be people who are locked out of this experience.

That would be my concern if it becomes the mainstream. In some ways, it might open up other avenues for people, but I'm not sure. [laughs] From my perspective, I would like some glasses that would enable me to see colors better, because I have problems with color.

I can't remember the name of them. Jason Snell had a pair of these glasses which are the ones that supposedly help people with color blindness. I only tried them for a couple of hours, but it didn't do anything for me.

Rene: You thought you'd be able to experience the horror of theoutline.com the way the rest of us see it, James.

[laughter]

James: The obvious example might be people who have vision problems. How does AR help them? It might be the technology that's being developed for, say, recognizing objects. That might actually help if you can imagine a pair of glasses that could describe what's in front of you.

Rene: Computer vision becomes translation.

Russell: There are actually virtual reality apps that do this for the Samsung Gear VR, which has the camera on the outside.

There's actually an assistive app where you wear the headset and it points out things and describes them in the user's ear, things like colors and shapes. We're already starting to see the beginnings of those experiences.

The future of AR

Rene: I guess the last question I have for everybody, when ARKit was announced -- and, Russell, please forgive my ignorance on ARCore, that's why I work with you -- one of the things I really loved was that Apple was doing a lot of the heavy lifting themselves on the CPU, all things like scaling and lighting, and then leaving the GPU open for developers to create models, and do textures and everything that they needed to do.

I'm sort of curious what else would you like to see, what else would everybody like to see -- and maybe we'll start with you, James -- what would everyone like to see from the platforms next, from ARKit, from ARCore, from Facebook, from the big vendors? What could they do to make your jobs easier?

James: Given my job is writing calculators, I'm speculating a lot.

Rene: To make your About screen better, James. [laughs]

James: It is very good that they're doing a lot of the heavy lifting because I have no 3D graphics experience. I'm not a sort of John-Carmack-type figure who can write his own 3D engine. Ironically, the maths makes my head hurt.

The more that Apple can do for developers, the better. The obvious example is wall detection for planes and things like that. Their 3D engine has support for a bunch of things, and every time they add something to that, anyone can get access to it, like the lighting models and stuff that they've been adding.
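As one concrete example of that heavy lifting, ARKit's ambient light estimation is nearly free to adopt. A minimal sketch of using it with SceneKit, with illustrative function names, not code from PCalc:

```swift
import ARKit
import SceneKit

// Let SceneKit apply ARKit's ambient light estimate automatically.
func enableEstimatedLighting(on sceneView: ARSCNView) {
    let configuration = ARWorldTrackingConfiguration()
    configuration.isLightEstimationEnabled = true
    sceneView.automaticallyUpdatesLighting = true
    sceneView.session.run(configuration)
}

// Or read the per-frame estimate and drive a light by hand.
func apply(_ frame: ARFrame, to light: SCNLight) {
    guard let estimate = frame.lightEstimate else { return }
    light.intensity = estimate.ambientIntensity          // ~1000 is neutral
    light.temperature = estimate.ambientColorTemperature // in kelvin
}
```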

I don't know what the thing would be. Now that I've got caught up in the AR stuff, I'm trying to think 12 steps ahead, like, "Where is Apple going? What is the end goal of this?"

The whole ARKit on phones is purely, well, not purely, but it's, as you say, a training exercise. It is getting developers to understand how to do this stuff, getting Apple to understand how to do this stuff, and do it in such a way that we don't need hardware that doesn't exist yet.

Even if you took the current ARKit, and you translated that onto a headset, that would be amazing to play with.

Rene: They'd be Steve-Jobs/John-Lennon-style glasses, right? [laughs]

James: Yeah, they would have to be. It would be good if you could control the tint of them as well.

Russell: [laughs]

Rene: What about you, Michelle? What would you like to see next from all the platforms?

Michelle: What I expect from all these platforms, and I'm not speaking just for myself, is that they make the workflow of creating experiences simpler. By doing that, they will democratize the creation of these experiences.

If you can make the process of making it super simple -- even Amazon is launching a platform called Sumerian, or something like that, that allows you to create VR and AR experiences pretty much without programming -- and if you can push more in this direction, more people will be able to participate.

With more people participating, you will have more experiences created. It will spread the technology and help us better understand what the possibilities are. I hope it becomes easier for everybody.

In general, for AR, and that's a bigger thought, I hope that it becomes a tool that helps us better understand and connect with our existing physical reality. Adding this digital content, I hope it becomes a tool to make us see what we can't see. Whether it's something that happened in the past, it could be a portal that brings me back 200 years.

It could be something that helps me see what my eyes literally can't see, whether it's planets or stars that are far from me. A tool that will help me better understand my existing reality in general.

Rene: I just want that Keanu Reeves moment where I can say, "Wow. Now I know kung fu."

[laughter]

James: I want a future Pokémon Go where Pikachu can come and live in my house.

Michelle: I know. Aw.

[laughter]

Rene: They just announced more Gen 3, James. They have a few months on you. What about you, Russell? Where do you want to see all this going next?

Russell: The next step for me is a really short one. I want really solid surface detection, all the way around. Surface detection is the biggest challenge right now for Apple and Google both. We're seeing it in the portrait mode stuff that's coming from both platforms and with their respective augmented reality platforms that are available for people to work with.

The surface detection on both is a great first step, but I definitely want that to be the focus in the short term: making it better.

Rene: Why take a giant leap when there's so many little steps to be taken? [laughs]

Russell: That's right. It is not a low-hanging fruit by any stretch because it's a lot of work, but it is the thing that is immediately in front of me.

Rene: We have a taste for it now. I think that's always the thing, is that it's so hard to get that taste. Once you do, you want more and more of it.

Russell: Exactly.

Rene: Michelle, if people want to follow you on social or see more of your work, where can they go?

Michelle: They can follow me on Twitter. My Twitter handle is @michhessel. Or they can go to my website, michellehessel.com.

Rene: Awesome. What about you, James? I don't know if you actually do that sort of stuff or not.

James: Go to @jamesthomson on Twitter -- Thomson without a p, as you well know right now -- and pcalc.com, obviously, for PCalc. If you want to play around with the AR stuff in PCalc, you'll find it hidden away in the Help section, then the About PCalc section. You'll see a little logo; tap on that and find an AR button. Then you've gone down the rabbit hole.

Rene: I can't wait for some augmented reality app to sneak a calculator into their About screen in retaliation.

James: People have suggested that I put an option in the app, so it starts up in the About screen, but...

Rene: [laughs] Russell, where can people read all your great works?

Russell: I am @russellholly on pretty much everything. You can find all of my VR- and AR-related things either on imore.com or vrheads.com.

Rene: Awesome. You can find me at Rene Ritchie on all the social things. You can email me at rene@imore.com if you want to comment on the show or give suggestions for other shows. I'd love that. I want to thank all of you for joining me, everybody for listening. That's the show. We're out.

[music]

Rene Ritchie
Contributor

Rene Ritchie is one of the most respected Apple analysts in the business, reaching a combined audience of over 40 million readers a month. His YouTube channel, Vector, has over 90 thousand subscribers and 14 million views and his podcasts, including Debug, have been downloaded over 20 million times. He also regularly co-hosts MacBreak Weekly for the TWiT network and co-hosted CES Live! and Talk Mobile. Based in Montreal, Rene is a former director of product marketing, web developer, and graphic designer. He's authored several books and appeared on numerous television and radio segments to discuss Apple and the technology industry. When not working, he likes to cook, grapple, and spend time with his friends and family.