Transcript


Rene – Hi! Welcome to Meta Minutes, your bite-sized pieces of the metaverse. My name is Rene from Valorem Reply, and today we're going to talk about the real-world metaverse, and for this, I'm honored to have a special expert guest today, Tory Smith. Hi Tory, and welcome to the show. How are you today?

 

Tory - I'm very well, thank you so much for having me here. I'm super excited to be here.

 

Rene – Awesome. Well, thanks for being with us. Can you tell us a little bit about yourself and your background as it relates to, well, computer graphics, the metaverse, and all the related things?

 

Tory - Of course. Yeah, thanks. So, my name is Tory. I have two passions that I've been really excited about for a lot of my career: the first is maps and the second is games. So the role I have at Niantic is really a dream job for me. My background is in mechanical engineering. I spent the first five or six years of my career building maps and localization systems for self-driving cars, figuring out how to help cars know exactly where they are in the world and how those perception systems work. For the three years immediately prior to joining Niantic I was at Mapbox, working on a vision SDK for augmented reality navigation and driver assistance, and since the beginning of 2021 I've been the product manager at Niantic working on our Visual Positioning System, which just launched last month.

 

Rene – Alright, well that's exciting and we're definitely going to talk about the VPS system, but let's start with a simple but also complex question because it's the metaverse, right? So what is the metaverse for you and where do you see the potential?

 

Tory - Great question. So, I think unfortunately the metaverse is a bit of a buzzword, and it's been described in a lot of ways. My favorite definition I've come across is actually pretty short, which is nice: the metaverse is the Internet in 3D. That's a really nice way of thinking about it, and I think it also captures the fact that the metaverse can be virtual reality or augmented reality; the real difference is that instead of experiencing the content through a web browser, you are experiencing it in a much more immersive way. Talking about what that means for us at Niantic, we're focused on augmented reality. If you look at our company's mission, we're focused on inspiring people to explore the world together, which I think is a really inspiring mission. It's also both a blessing and a curse, because I do think augmented reality is a little bit more difficult than virtual reality: in augmented reality you don't have control over the outside world. You have to additively create experiences that live on top of the real world, and there are parts of the real world where you can set expectations but can't control them the way you can control a virtual reality experience.

 

Rene - Fully agree. I think the most exciting part is really when we build these metaverse experiences that connect the real world with its virtual or digital replica, when we make those bridges and connections. And, getting a little bit philosophical here, sorry folks, a lot of the people I talk to these days are very hesitant when they hear about the metaverse; they say that's never going to happen, because they have this perception in their mind that it's just pure virtual worlds, games, VR and all of this stuff. When we start explaining that it's much more than that, and that we connect the real world to the virtual world, then they realize there's actually more potential to this. And no one wants to replace our beautiful real world with virtual worlds, right? We want to think about it more as an augmentation, another mode to engage. I love what you guys are doing with the real-world metaverse, and I keep on using that term too, so this is fantastic. Niantic has been working on geolocation-based games for a very long time and, like you said, recently opened up the API for third-party developers with the Niantic Lightship platform. Can you tell us a bit about what it is and how developers can get started?

 

Tory - Yeah, absolutely. So one thing that's very true of virtual reality and also augmented reality is that it's gated behind a lot of really tough foundational problems, regardless of what you're building. This is especially true in AR. Understanding things like depth, semantics, and occlusion, which we sometimes just package up and call contextual awareness, so understanding the shape of the space you are in, and understanding things like physics: if you are going to render objects, being able to know where the walls are and how those objects should interact with the walls, the ceiling, the floor, other objects, or even animated people or pets that are in the scene, is really tough. There are also lots of problems around things like networking if you want to build experiences for multiple players or have shared experiences. And then, maybe not obviously, what I'm most excited about is actual location-based AR: the ability to create experiences at real-world locations where it actually matters where the experience is happening. This has a lot of potential for shared experiences, because a lot of our existing games, things like Pokémon Go, are based around real-world locations that are meaningful. So it creates a lot of opportunities for engagement and for sharing those experiences with others, which is really fundamental to our mission here. For us, the Lightship platform basically packages a lot of those fundamental technologies that make real-world AR possible into a Unity SDK that allows developers to start building new AR experiences faster than ever before; it takes you from your idea to being able to sandbox, prototype, and iterate really quickly. In terms of getting started, anyone can get started for free on lightship.dev: just download what we call the ARDK, our Augmented Reality Developer Kit, and start building immediately.
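To make the contextual-awareness part a bit more concrete for readers: here is a minimal conceptual sketch in Python of how a per-pixel depth estimate enables occlusion. It is not ARDK code, and every name in it is an illustrative assumption; the idea is simply that virtual content is only drawn where it is closer to the camera than the real surface the device perceives.

```python
# Conceptual sketch only, not ARDK code: how a per-pixel depth estimate
# enables occlusion. "real_depth" stands in for the depth map an AR framework
# estimates from the camera feed; "virtual_depth" is the renderer's depth
# buffer for the virtual content. All names here are illustrative.
import numpy as np

def occlusion_mask(real_depth: np.ndarray, virtual_depth: np.ndarray) -> np.ndarray:
    """True where virtual content should be drawn: it is closer than the real surface."""
    return virtual_depth < real_depth

# A virtual ball 2.0 m away is hidden wherever a real wall is only 1.5 m away.
real_depth = np.full((4, 4), 1.5)     # metres, estimated from a single RGB camera
virtual_depth = np.full((4, 4), 2.0)  # metres, from the renderer
print(occlusion_mask(real_depth, virtual_depth).any())  # False: the wall occludes the ball
```

Doing that comparison per pixel and per frame with the live depth estimate is what lets virtual objects appear to pass behind real ones.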

 

Rene - You should all do this, I can tell you. I did that: I worked with it at the beginning of last year, when you had an early access program I was lucky enough to get into, and I tried it out myself at the locations where I typically test augmented reality, or really computer vision, because what I was most interested in was spatial reconstruction. The spatial mapping: how well can it perceive the real world just by using a monocular RGB camera on a mobile phone? I did some experiments last year where, for example, I compared it on the same scene I had captured with the HoloLens five years ago, and the spatial reconstruction, the spatial mapping, is on par with what you got with the HoloLens a few years ago. But the HoloLens has many more sensors, it has a depth camera, a time-of-flight camera, so it has much more signal, much more data, whereas I was using a cheap Android phone with just one single RGB camera, and I got similar spatial mapping. So, like you said, once I have the mapping of the real world I can run physics with it, I can have virtual objects roll down a real-world hill and that kind of stuff. Long story short, what I was trying to say is that ARDK is amazing and a very good piece here, and there's this new feature you mentioned, VPS. Let's talk about that in a little more detail, because it's also very exciting: the geolocation and the Visual Positioning System, the VPS, which can enhance your geolocation with a more precise position based on computer vision and image analysis. How does it work, and what's so unique about Niantic's approach?
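Since Rene describes running physics against the reconstructed world, here is a tiny conceptual sketch in plain Python of why that works, with the real-world geometry reduced to a simple height field. It is not ARDK or Unity code, and all names and numbers are illustrative assumptions.

```python
# Conceptual sketch only: once the real world is meshed, virtual objects can
# collide with it. Here the reconstructed geometry is reduced to a height
# field h(x), and a virtual ball falls under gravity until it reaches it.

def terrain_height(x: float) -> float:
    """Stand-in for querying the reconstructed environment mesh at position x."""
    return 0.5 * x  # a gentle real-world slope, e.g. a hill

def drop_ball(x: float, y: float, dt: float = 0.02, g: float = 9.81) -> float:
    """Let a ball at horizontal position x fall from height y onto the terrain."""
    vy = 0.0
    while y > terrain_height(x):
        vy -= g * dt          # gravity
        y += vy * dt          # simple explicit integration step
    return terrain_height(x)  # it comes to rest on the real-world surface

print(drop_ball(x=2.0, y=5.0))  # 1.0: the ball settles on the hill at height h(2.0)
```

The real SDK does this against an actual triangle mesh with a proper physics engine, but the principle is the same: the reconstructed geometry becomes collision geometry for virtual content.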

 

Tory – It’s a great question. So, going back to that definition I love, the metaverse for AR being the Internet in 3D: effectively, what VPS, the Lightship Visual Positioning System, does is, instead of your web address for today's Internet, we are telling you your exact physical address in the metaverse, and we're doing that to within a few centimeters. One caveat here: yes, you can get your rough location with GPS, but one thing you cannot get with GPS today is an understanding of exactly what you are looking at in addition to where you are. So a term we sometimes use is six DoF, or six degrees of freedom. We actually tell you your exact location in terms of X, Y, and Z, but we also describe the vector of where your eye is and what exactly it is looking at, which is necessary if you are going to render things into the scene and have them show up and interact with the scene the way a human expects.

One other really important concept for VPS is persistence. With most AR experiences, if you don't have a way to have the content stay in place and be anchored to the real world, those experiences are going to be ephemeral. Even if you have a good understanding of depth and semantics, say you start an experience in a park and throw a bunch of footballs or soccer balls into the park, and they bounce on the ground and roll like they're supposed to; then you turn off your device, come back a few minutes later, and none of that stuff is going to be there anymore, because you're not storing a map of that place. What VPS allows you to do is, when you go to a location that is part of the map, every time you create an object or have an interaction, that interaction is actually anchored in place by your understanding of that environment. So you're effectively creating spatial bookmarks, often referred to as anchors, that tie that content to real-world locations. That's really powerful, because it not only allows you to have an experience that you can revisit at different times yourself, it also means that you can share that experience with other people: it allows different users, at different times, to have that same experience.

And specifically with VPS there are a couple of special ways that we do this at Niantic that make it especially powerful. One of the most important differentiators we have is the fact that we are working with crowdsourced data. One exciting thing for us is that we already have probably the largest AR game ever created, which is Pokémon Go, and we've been crowdsourcing data with the Pokémon Go community for years. This has given us a really rich corpus of user-generated content, or UGC, from which we've been able to activate locations on VPS in places that are naturally already very popular for real-world AR gameplay. And the quantity of that data means that we're not only able to build very accurate maps, we're also able to build very fresh maps, because there are a lot of folks who play this game every single day, which allows us to keep up with the real world. One of the challenges of map making, and it extends to all types of maps, is that the job is inherently never done: even if we had a perfect map of the entire world today, it would already be out of date tomorrow.
So the ecosystem that we create between the folks who are making use of the map and the folks who are contributing back to the map is really critical. The other nice thing, which you actually just alluded to, is that until quite recently a lot of these technologies were also gated behind very expensive hardware. Hardware is still a challenge for augmented reality, but one thing that's really exciting about the current state of smartphones is that they're quite ubiquitous. Our system today no longer requires lidar for scanning the world or for using VPS, which means there are already billions of devices in users' hands that can take advantage of this right away. One last thing to point out that's really helpful: as a developer using our VPS, you're not starting from scratch. You don't need to go out and map the whole world in order to build these experiences, because we already have millions and millions of scans of some of the most popular AR gaming places in the world, all part of one shared map. So you basically get access to the exact same mapped world that we have access to for our games, and you know that you're also benefiting from that ecosystem. Joining in, being part of that flywheel we're building, where more experiences draw in more users who contribute more data back into the map, is what makes this such a valuable platform to join.
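To make the six-degrees-of-freedom idea from Tory's answer a bit more tangible, here is a small, purely illustrative Python sketch, not the Lightship API, contrasting a GPS fix with a 6DoF pose: a precise position plus an orientation from which the view direction can be derived.

```python
# Illustrative sketch, not Lightship API: GPS gives a rough position with no
# orientation, while a visual positioning result is a full 6DoF pose, i.e.
# where you are *and* which way the camera is looking.
from dataclasses import dataclass
import math

@dataclass
class GpsFix:
    """Rough position only: typically metres of error and no orientation."""
    latitude: float
    longitude: float

@dataclass
class SixDofPose:
    x: float; y: float; z: float            # position in metres (centimetre-level)
    yaw: float; pitch: float; roll: float   # orientation in radians

    def view_direction(self) -> tuple:
        """Unit vector of where the camera is pointing (roll does not change it)."""
        c = math.cos(self.pitch)
        return (c * math.sin(self.yaw), math.sin(self.pitch), c * math.cos(self.yaw))

pose = SixDofPose(x=1.20, y=1.55, z=-0.30, yaw=math.pi / 2, pitch=0.0, roll=0.0)
print(pose.view_direction())  # approximately (1.0, 0.0, 0.0): looking along +x
```

That extra orientation is exactly what lets rendered content line up with what the camera actually sees, rather than just appearing somewhere near your rough location.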

 

Rene - This is great. So I think you can also create your own VPS maps for locations that don't yet have a VPS map available from the crowdsourcing, right? For example, I found the Niantic Wayfarer app, where I can use an iPhone to scan a certain location, and this basically creates my visual map of the surroundings, which I can then plug into Niantic's VPS.

 

Tory - Yes, that's correct. Yeah, so one challenge about building for the real-world metaverse is that you need to actually go to one of these locations in order to understand the experience and build it, and we realize it's not an ideal situation as a developer if you need to build your application and then drive or walk or take the bus to the nearest VPS-activated location. So we've created this sort of ad-hoc mapping system that allows developers to, say, just turn around and build a map of my kitchen; I could do that, then build an experience in my kitchen and rapidly iterate on it right from my desk instead of having to travel every time I want to see something. So that was one of the really important first features that we built.
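Connecting this back to the "spatial bookmarks" Tory mentioned earlier: conceptually, an anchor is just content stored relative to a mapped location, whether that's a shared VPS-activated place or a private scan of your own kitchen, so a later session (or another user) that localizes against the same map can restore it in the same physical spot. A minimal sketch of that idea follows, with entirely hypothetical names; it is not the Lightship anchor API.

```python
# Conceptual sketch only, not the Lightship anchor API: an "anchor" ties
# content to a mapped location, so a later session that localizes against
# the same map (shared or a private scan) restores it in the same spot.
import json
from dataclasses import dataclass, asdict

@dataclass
class Anchor:
    location_id: str              # which mapped place this belongs to, e.g. "my-kitchen"
    x: float; y: float; z: float  # offset from that location's origin, in metres
    payload: str                  # what to render there

def save(anchor: Anchor) -> str:
    """Serialize the anchor, e.g. to upload to a shared backend."""
    return json.dumps(asdict(anchor))

def restore(serialized: str) -> Anchor:
    """A different session, or a different user, loads the same anchor later."""
    return Anchor(**json.loads(serialized))

stored = save(Anchor("my-kitchen", x=0.4, y=0.0, z=1.2, payload="soccer_ball"))
later = restore(stored)
print(later.payload, "reappears at", (later.x, later.y, later.z))
```

During development you would point this at your private scan; for the real experience you would target the shared, crowdsourced locations.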

 

Rene - Makes a lot of sense, and that's a lot of developer-minded thinking to make the workflow a little bit easier, because it's always so time-consuming, especially when you're building AR apps; just the deployment and all the testing takes so much time. But yeah, great stuff. Hey Tory, thank you so much for all these insights. We're unfortunately already at the end of the show; we could talk for many more hours, and I'm super excited and stoked about what you're developing over there. Again, thank you so much for joining us today and sharing your insights, very much appreciated.

 

Tory - Yeah absolutely! Thank you so much for having me.

 

Rene – Well, and thanks everyone for joining us for yet another episode of Meta Minutes, your bite-sized pieces of the metaverse. Watch our blog and follow our social media channels to hear all about the next episodes, and of course subscribe to our channel, where you can also watch the previous episodes, and visit our website. Until then, take care and see you soon in the metaverse.