The Secret Prototypes of Valve’s VR Lab

Hi, this is Wayne again with the topic “The Secret Prototypes of Valve’s VR Lab”.
The HTC Vive VR headset was only recently brought to market, but it represents years of development and engineering. The prototyping work wasn’t done in some high-end factory; it was done at Valve Software, a game developer in Bellevue, Washington. Some of the humble prototypes might look familiar: a cannibalized LCD screen, a couple of chopped-up hard drives, some LEDs and 3D-printed parts. The hardware team behind SteamVR opened their doors and showed us the evolution of the HTC Vive hardware from its very beginning to the finished product.

I’m Alan Yates, a hardware engineer here at Valve. I’ve worked on VR projects pretty much since I started four years ago; I’ve worked on the tracking systems and a lot of the software side of things as well. Hi, I’m Monty Goodson, also a hardware engineer. I started off in some of the tracking system areas, but quickly moved on to displays and worked on getting the display panels that are now used in our VR system. So this guy was affectionately called “the Susan”, after the lazy Susan.

It was a system that we used to work out how your vestibulo-ocular reflexes worked. You would put this thing over your head and bite down on a bar, and the display here, in front, was taken from a gaming monitor that had been modified for low persistence, and you would rotate around. We had this high-resolution encoder all plugged into the PC, and from that we learned a lot about how your vestibular system, the balance organs in your ears, and your eyes work together.

Yeah, one of the most difficult parts of developing a VR system was: how do we get around all the problems that existed in previous attempts at making VR? We had to do a lot of learning about the prior research and figure out what it was that was making people sick, what kept people from having a good experience in VR, and that really drove a lot of the research and development we did on the optics, the lenses, the tracking system; it all came into the product that’s shipping now. That one we call, sorry, not the Lighthouse, but the Telescope, and it combines an optic stack that is essentially optimally designed to translate the laser-scanned display output into your eye with no distortion, so that it looks as perfect as we can get.

Behind that is a tracking camera that works with the fiducial markers you see behind me here, so that we could very accurately track the motion of the device and update the display, in this case, I think, on a line-by-line basis. It was chasing the beam, so the image was being rendered pretty much as rapidly as it was being read out, which meant it was always accurate.

Whatever light you saw coming out of the device was the correct light for that position in the world, and that really taught us just how important tracking was. This thing you could hold up to your eye, and it was like looking through a tube at another world; you could move it around however you liked, and no matter what you did, it was essentially perfect. It was really the first glimpse we had at what could be achieved if you had very low-persistence displays and very good tracking.

We also found with those experiments that standing up added a lot to the experience. Just sitting down was cool, but once you stood up, that little swaying you get from your balance really added the feeling that you’re actually there. So we quickly realized that not only do you need 360 degrees, you need people to at least stand, and eventually, as we went to kind of room scale with markers plastered all over the walls, that people really wanted to walk around. So we started playing around with camera systems like this guy here, which uses a camera, an IR emitter, and retroreflective dots on the headset. That allowed us to deploy it internally to more developers than just a single room could support, but it was very limited in tracking volume and in what people could do with it.

So we tried many alternative kinds of tracking, and one of them was Lighthouse, but there were others as well. We have some early examples here. This is a galvo system that Monty built. We weren’t really competing with each other, but we were building things in parallel and we had two approaches. This one uses a laser and galvos, and it tracks a point; that’s a QPD, a quad photodiode.

It’s a quick experiment to see if this would actually work. Essentially it’s just using analog feedback to the galvos, measuring the analog difference between the diagonals of the QPD and driving the galvos, so once the laser locks onto this, you can move it all around and the laser will track it. You can see how that could turn into a tracking system, except we realized that scaling it to something that could actually track enough points to get a complete pose, and then scaling it even further to track multiple objects like controllers, would be difficult. That’s where Alan, at the same time, was working on some of the first proofs of concept for Lighthouse, and it was proving to be much more interesting and much more scalable, so I quickly moved on from this to work on display stuff, because there was a lot of work that needed to be done there, and Alan took off on Lighthouse.

So my solution to the problem of essentially scaling this to multiple points was to continuously sweep something across the room. I actually made another tracking system before this, called Sparkle Tree, that used a projector to do this, but it had field-of-view limitations. So I had this idea based on, you know, those rotating beacons that you see on top of police cars and things like that. There used to be these sand skimmers back at the beach where I used to live in Australia, and I would watch the rotating beacons on them, and I kind of got the idea: well, if I knew exactly what angle they were pointing at, I could probably work out where the skimmer was. Many years passed before I actually built it. So I 3D-printed this simple little system; it’s a motor out of an old Xbox controller and a laser, and this was the first scanning assembly that we ever built. It just produces a sheet of light, and the sheet of light gets spun around. From this I then went on and built the real proof of concept that could actually give you angle data, so that original prototype still works today, and this was the proof of concept that said: yep, you’re onto something, this could actually work in reality.

From there it was a long journey to make it practical. So we built some of our first base stations. Ben Krasnow is a great mechanical guy, and he took a lot of interest in this project at that point. This is, literally, two sawn-up hard drives: Ben got a chop saw, cut the hard drives up, and put in these motor controllers; there’s off-the-shelf stuff and a bunch of electronics. Here we’ve got a carrier generator for the lasers, and this was really the first true two-axis Lighthouse that could really track things.
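To make that rotating-beacon intuition concrete, here is a minimal 2D sketch. This is not Valve’s code; the station positions, angles, and function names are made up for illustration. Two beacons at known positions each report only the bearing at which their sweep crossed the object, and intersecting the two bearing rays recovers where the object sits.

```cpp
// Minimal 2D illustration of the "rotating beacon" idea behind Lighthouse:
// two beacons at known positions each report only the angle at which their
// sweeping beam hit the tracked object; intersecting the two bearing rays
// recovers the object's position. (Illustrative sketch, not Valve firmware.)
#include <cmath>
#include <cstdio>

struct Vec2 { double x, y; };

// Intersect two rays: p = a + t * (cos(angleA), sin(angleA))
//                     p = b + s * (cos(angleB), sin(angleB))
// Returns false if the rays are (nearly) parallel.
bool intersectBearings(Vec2 a, double angleA, Vec2 b, double angleB, Vec2* out) {
    const double dax = std::cos(angleA), day = std::sin(angleA);
    const double dbx = std::cos(angleB), dby = std::sin(angleB);
    const double denom = dax * dby - day * dbx;          // 2D cross product
    if (std::fabs(denom) < 1e-9) return false;           // parallel sweeps, no fix
    const double t = ((b.x - a.x) * dby - (b.y - a.y) * dbx) / denom;
    out->x = a.x + t * dax;
    out->y = a.y + t * day;
    return true;
}

int main() {
    // Two hypothetical base stations along one wall of a room.
    const Vec2 stationA{0.0, 0.0};
    const Vec2 stationB{4.0, 0.0};

    // Bearing angles (radians) at which each station's sweep hit the sensor.
    // In the real system these come from sweep timing; here they are made up
    // for a sensor that actually sits at (1.5, 2.0).
    const double bearingA = std::atan2(2.0, 1.5);
    const double bearingB = std::atan2(2.0, 1.5 - 4.0);

    Vec2 p;
    if (intersectBearings(stationA, bearingA, stationB, bearingB, &p))
        std::printf("estimated position: (%.3f, %.3f)\n", p.x, p.y);
    return 0;
}
```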

Now, at this point everything was cable-synced, so the synchronization signals that told you exactly where the rotor was in time were delivered over a cable, and the sensor tech, which we’ll talk about in a minute, was also evolving at the same time. Some of the very first sensor prototypes were actually built using a kit that Ben found online for an AM radio. He took this AM radio project, hacked it up, and put a photodiode on the front end, and that was one of our very first sensors. So Alan was rapidly doing various schematics and sensor designs, and I was trying to keep up with layouts of them, eventually getting smaller and smaller, and this is essentially where we stopped and said: okay, we have a good enough sensor.
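As a rough illustration of what those synchronization signals buy you (the period and names below are assumed for illustration, not the real Lighthouse timing spec): if the sync pulse marks the rotor passing a reference angle and the rotor spins at a known rate, then the delay from the sync pulse to the moment the sweep crosses a photodiode converts directly into an angle.

```cpp
// Sketch of how a sweep angle falls out of timing: the sync pulse marks the
// rotor passing a reference angle, and the delay until the laser sweep hits a
// photodiode, divided by the rotation period, gives the bearing to that sensor.
// Numbers and names are illustrative, not the real Lighthouse timing spec.
#include <cstdio>

// Rotor period in microseconds (assumed: one full revolution per sweep cycle).
constexpr double kRotorPeriodUs = 16666.0;   // roughly 60 rotations per second

// Convert (sweep hit time - sync pulse time) into an angle in degrees.
double sweepAngleDegrees(double syncTimestampUs, double hitTimestampUs) {
    const double delayUs = hitTimestampUs - syncTimestampUs;
    const double fractionOfTurn = delayUs / kRotorPeriodUs;
    return fractionOfTurn * 360.0;
}

int main() {
    // A photodiode sees the sync flash at t = 0 and the sweep 4000 us later.
    const double angle = sweepAngleDegrees(0.0, 4000.0);
    std::printf("sensor sits %.2f degrees from the sync reference\n", angle);
    return 0;
}
```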

If you look very closely, there are some forty-odd discrete components on there. You see the photodiode here, and on the back side are all the components and a little connector at the bottom. This version of the sensor is actually what carried us through all of the dev kits, including the first HTC Vive, the V0s we call them, and all the dev kit controllers.

Some of our first prototypes were much more humble, like this guy here. This was sort of the first tracking front end; it’s designed to fit on one of those HMDs that we saw before. We just took one of the boards, and there’s an FPGA board in here that’s just hot-glued on, and there’s a bunch of wires connecting sensors, where we literally cut up several boards to make this. That’s where we gave this to the software guys, and they could first get their angle data and actually start really trying to make a tracking system out of it.

It didn’t take too long before we could actually track other objects. This guy here is one of ours; we call it the UFO, because it’s kind of lenticular, I guess, but it was the very minimum object at that point. We needed five points to guarantee that we could start tracking, and this thing has five points in a configuration that is ideal for tracking.
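For a rough sense of what the software side then does with that angle data (a simplified measurement model with assumed geometry and names, not Valve’s actual solver): given a candidate pose of the tracked object and the known positions of its sensors in the object’s own frame, you can predict the two sweep angles each sensor should produce and compare them with what was measured. Each visible sensor contributes two such residuals, which is why a handful of well-spread points comfortably pins down the six degrees of freedom of a pose; a nonlinear least-squares solver then adjusts the pose to drive the residuals toward zero.

```cpp
// Sketch of the measurement model a tracking solver has to invert: predict the
// sweep angles a base station at the origin would measure for each sensor of a
// rigid object, given a candidate pose, and compare against what was observed.
// The solver itself is omitted; geometry and names are simplified assumptions.
#include <array>
#include <cmath>
#include <cstddef>

struct Vec3 { double x, y, z; };
struct Mat3 { double m[3][3]; };   // rotation matrix, row-major

// Apply the candidate pose (rotation R, then translation t) to a body-frame point.
Vec3 transformPoint(const Mat3& R, const Vec3& t, const Vec3& p) {
    return { R.m[0][0]*p.x + R.m[0][1]*p.y + R.m[0][2]*p.z + t.x,
             R.m[1][0]*p.x + R.m[1][1]*p.y + R.m[1][2]*p.z + t.y,
             R.m[2][0]*p.x + R.m[2][1]*p.y + R.m[2][2]*p.z + t.z };
}

// Simplified base-station model: station at the origin looking down +z;
// one rotor measures azimuth, the other elevation.
struct SweepAngles { double azimuth, elevation; };

SweepAngles predictAngles(const Vec3& p) {
    return { std::atan2(p.x, p.z), std::atan2(p.y, p.z) };
}

// Residuals for N sensors: two numbers per visible sensor, so five sensors give
// ten constraints on the six degrees of freedom of a rigid pose.
template <std::size_t N>
std::array<double, 2 * N> poseResiduals(const Mat3& R, const Vec3& t,
                                        const std::array<Vec3, N>& bodyPoints,
                                        const std::array<SweepAngles, N>& measured) {
    std::array<double, 2 * N> r{};
    for (std::size_t i = 0; i < N; ++i) {
        const SweepAngles pred = predictAngles(transformPoint(R, t, bodyPoints[i]));
        r[2 * i]     = pred.azimuth   - measured[i].azimuth;
        r[2 * i + 1] = pred.elevation - measured[i].elevation;
    }
    return r;
}
```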

So we built this board, which is all Monty’s work; it has the FPGA and the associated electronics, exactly what you see here, just baked down into something that’s more convenient, along with all of these little gum-stick-form sensors that we then had in quite large quantities.

This all had very humble beginnings. I mean, look at all this stuff: it’s all 3D-printed and made out of junk, basically. These were literally hard drives that we took out of the bin here, ones that had been thrown out because they had failures. All of this is totally accessible, right? Turning it into a product is obviously this much longer thing, where you end up with something that’s this beautiful monolithic piece of technology, but none of this is something that you can’t do in your own garage.

You can start off with the very simplest of things. I mean, I still use this guy here, which is my long-suffering companion; it’s a single-sensor receiver, although it’s got various sensors on here from the different generations, and I still use it to debug every day. It’s built out of a circuit board from one of our super early Steam Controllers, actually, and I’ve just bolted stuff onto this piece of plastic that I laser-cut; there are some 3D-printed parts in here, and the rest of it is just all bodge wires. But that is an example of how it doesn’t have to be these beautiful, perfectly manufactured pieces of technology.

It can be something that anyone can put together. This was my prototype for the first optical sync system; it’s literally built on a breadboard using some breakouts for Cat5 cabling, and it would plug into one of these guys, way before they had these infrared emitters. That eventually became this, which is, I guess, about the same level of bodginess but electrically pretty much equivalent, and then we took camera illuminators and modified the board to fit on the back of them. All of this was done just with our local circuit mill, pretty much like you’d get at any hackerspace.

When we wanted to do full 360-degree tracking, we needed a way to synchronize multiple base stations together.
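As a rough sketch of what that synchronization involves (the class name, filter gain, and half-period offset are assumptions for illustration, not the shipped firmware): a follower base station can watch for the master’s sync flash, keep a running estimate of the flash period, and re-emit a flash of its own at a fixed offset so its sweeps interleave with the master’s.

```cpp
// Rough sketch of regenerating a sync signal optically instead of over a
// cable: a second base station watches for the first station's sync flash,
// keeps a running estimate of the flash period, and re-emits its own flash at
// a fixed offset so everything downstream still shares one time reference.
// Names, gains, and the offset are assumptions for illustration only.
#include <cstdint>
#include <cstdio>

class OpticalSyncFollower {
public:
    // Call whenever the photodiode sees the master station's sync flash.
    void onMasterFlash(uint64_t timestampUs) {
        if (lastFlashUs_ != 0) {
            const double measuredPeriod = double(timestampUs - lastFlashUs_);
            // Low-pass the period estimate so one noisy edge doesn't jerk it.
            periodUs_ += 0.1 * (measuredPeriod - periodUs_);
        }
        lastFlashUs_ = timestampUs;
    }

    // When should this follower station emit its own regenerated flash?
    // Offsetting by half a period keeps the two stations' sweeps interleaved.
    uint64_t nextOwnFlashUs() const {
        return lastFlashUs_ + uint64_t(periodUs_ / 2.0);
    }

private:
    uint64_t lastFlashUs_ = 0;
    double periodUs_ = 16666.0;  // initial guess: ~60 Hz sync rate (assumed)
};

int main() {
    OpticalSyncFollower follower;
    // Simulated master flashes arriving every ~16670 us.
    for (uint64_t t = 100000; t < 200000; t += 16670) follower.onMasterFlash(t);
    std::printf("follower schedules its flash at t=%llu us\n",
                (unsigned long long)follower.nextOwnFlashUs());
    return 0;
}
```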

So again I took one of the old circuit boards from one of our really early controller experiments, and onto this I’ve added, you can see here, just a connector and a sensor. I wrote a bunch of firmware that would enable a remote base station to see another base station’s signal and synchronize to it, and then regenerate the synchronization signal that used to come over a cable. So all modern base stations, like the HTC base stations, have this built in, and they no longer need a cable between them to synchronize with each other.

Obviously the product stuff, I mean, that’s one part of it, but R&D is still very much focused on hack-and-slash prototyping. We will take whatever’s off the shelf, you know, we buy a lot of stuff from Amazon or just the local store, and we tear it apart and build things out of it.

That way you can just do things so much more quickly, and it doesn’t have to be beautiful. I mean, you get things like this, where there are just wires everywhere, but you learn from it, and ultimately that goes into what’s going to be your product. Especially in VR, we found that so much of what works is difficult to predict; so many things that we think will work well don’t turn out to work well, so being able to rapidly prototype and try these things without spending a lot of effort on making a prototype has been very valuable.

We want to talk more about this in public, because Valve is very much all about opening up all this technology and letting people play with it, right? If you want to, you know, put something in the headset that tells where it’s touching on your face, I mean, that’s a fantastic thing that someone could do. You can get off-the-shelf pressure sensors and build them into the thing. That’s a simple kind of project that anyone can do with just an Arduino and a USB connection, and you can see if that’s interesting. We don’t know, and a lot of our intuitions about what will be interesting and what won’t be are generally wrong.

To that point, the Vive actually offers a lot of hackability; it’s quite modular. Most people haven’t worked this out yet, but you can snap the head strap off completely, so there’s this opportunity for people to mount it on their face in different ways. The foam itself is just velcroed on, so people are already starting to find new and different facial interfaces that work better for them. Even the nose gasket is clipped in there, so if you don’t like it you can just take it out or potentially do something different. Then on the top here, kind of hidden underneath this top strap cover, is an auxiliary USB port, so we’ll find people sticking various different things on the front or interfacing with different sensors, adding to the experience. We’ve really worked with HTC to make this as much an experimentation platform as it is a consumer product, because we know that in these early stages of VR there’s still a lot to figure out and a lot to explore, and we want to encourage that.
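As a concrete sketch of that Arduino idea (the pin, wiring, and sample rate below are assumptions, not a supported accessory design): a force-sensitive resistor in a simple voltage divider, read on an analog pin and streamed over USB serial, is enough to start logging where the headset presses on your face.

```cpp
// Hypothetical sketch of the "pressure sensor + Arduino + USB" idea mentioned
// above: read a force-sensitive resistor on an analog pin and stream the raw
// readings over USB serial, where a PC-side script could log or visualize
// where the headset is pressing. Pin choice and wiring are assumptions.
const int kPressurePin = A0;   // FSR voltage-divider output into analog pin 0

void setup() {
  Serial.begin(115200);        // USB serial back to the host PC
}

void loop() {
  int raw = analogRead(kPressurePin);   // 0..1023 on a stock 10-bit ADC
  Serial.println(raw);                  // one reading per line, ~100 Hz
  delay(10);
}
```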

Similarly, if you pull apart the base stations or controllers, you’ll find the headers in there for programming. We didn’t lock down any of the firmware; it’s all accessible.

You can download it from the device and reverse-engineer it, and you can upload your own custom firmware to it. Obviously HTC is not going to support your warranty if you do that, but you’re completely free to do it, and we encourage it.
