CES 2015: Introducing HP Sprout


Hi, this is Wayne again with the topic “CES 2015: Introducing HP Sprout”.
Sprout by HP is basically a two-screen, multi-touch computer system running Windows 8. It consists of a Touch Mat, a vertical screen, and an armature that we call the Illuminator. The armature houses a bunch of specialized sensors, all aiming down at the Touch Mat. The vertical screen you can think of as a standard high-end all-in-one PC: an i7 processor, NVIDIA discrete graphics. You can swap the hard drive and put more memory in it; it's really a no-compromise PC solution. So before I get into the scanning, let me just explain what this software is.

This is what we call the Sprout Workspace, which runs on top of Windows 8. You can always access Windows 8 easily by swiping from the side: I can run all the modern apps up here, and I can go to the desktop. If I toggle off the gallery, you see the normal desktop, so I can run all the existing applications I usually run on a PC; no compromise there. But when I toggle on the gallery, all of your content that's stored on the PC or in the cloud through HP Connected Drive is presented visually, because we believe it's very important to have a visual sense of your objects, art, and creations.

So I can see everything, and what that allows me to do is trigger and spark imagination. It's just like my desk here, which has a bunch of artwork and projects I'm working on; now the computer simulates that with a digital version. So if I want to work on something, I can, say, pull this ribbon down.

You can see that with just this flick motion it transitions that digital content right down onto the surface of the mat, as if I brought in a digital thing that became like a physical object, and things just move around. Now, those objects were already digital, but what if I have a physical object? Maybe I have a flower here, and let's say I'll also scan this shell, so I put them on this virtual piece of paper.


I click on the camera, a one-button press for the capture. Now the Touch Mat area turns into essentially a 20-inch flatbed scanner, and you can see how the shell and the flower are reprojected one to one, so we maintain the same scale and ratio. It digitally corrected everything and color-matched it, so you can see a preview up on the high-res vertical screen, and it automatically subtracted the background.
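To give a rough sense of the bookkeeping behind that one-to-one reprojection, here is a minimal Python sketch of how a captured crop could be rescaled so an object reappears at its physical size when projected back onto the mat. The resolution and mat-size constants are assumptions for illustration, not HP's calibration values.

```python
# Hypothetical calibration constants: the real values would come from factory
# calibration of the downward-facing camera and the projector.
MAT_WIDTH_MM = 432.0                        # assumed active width of the Touch Mat
CAMERA_PX_PER_MM = 4416 / MAT_WIDTH_MM      # assumed capture width (px) over mat width
PROJECTOR_PX_PER_MM = 1024 / MAT_WIDTH_MM   # assumed projection width (px) over mat width

def reprojection_scale() -> float:
    """Scale factor that makes a captured crop reappear at physical size
    when it is projected back onto the mat (the 'one to one' behaviour)."""
    return PROJECTOR_PX_PER_MM / CAMERA_PX_PER_MM

def projected_size(capture_px: float) -> float:
    """Width or height in projector pixels for a crop measured in camera pixels."""
    return capture_px * reprojection_scale()
```

The point is simply that as long as camera and projector are calibrated against the same mat, dividing out the camera's pixels per millimetre and multiplying by the projector's preserves physical scale.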


This is a key thing: we're using the RealSense camera's depth and IR to find the edges and remove the background. So now, when I accept it into my little layout here and give a preview up on the vertical screen, you can see that they are individual objects. I started with two physical objects and I ended with two digital objects; that's really important functionality for making the transition from physical to digital seamless.
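As a rough illustration of the idea described above, here is a minimal Python sketch of depth-thresholded background removal: anything at the level of the flat mat is treated as background, and only pixels standing above it are kept. The function name and the calibrated mat distance are assumptions; the actual Sprout pipeline presumably also uses the IR image and edge refinement, which are omitted here.

```python
import numpy as np

def remove_background(color, depth, mat_depth_mm, tol_mm=3.0):
    """Cut objects out of a downward-facing capture using depth alone.

    color        : HxWx3 uint8 image aligned with the depth map
    depth        : HxW depth map in millimetres (e.g. from a RealSense sensor)
    mat_depth_mm : assumed calibrated distance from the camera to the empty mat
    """
    # Pixels at (or near) the mat plane are background; anything closer to the
    # camera than the mat, beyond the tolerance, is a physical object.
    foreground = depth < (mat_depth_mm - tol_mm)

    # Build an RGBA cut-out in which background pixels are fully transparent.
    alpha = np.where(foreground, 255, 0).astype(np.uint8)
    return np.dstack([color, alpha]), foreground
```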


So, let's take a look at our 3D Snapshot application. When I run it, it brings up an interface that looks very similar to 2D capture, and this was important for us: we wanted to make 3D scanning as simple as 2D capture. So I'm just going to put a shell and a dollar bill here and start the scanning process.

Now, what's going to happen is we project vertical and horizontal stripes of different frequencies, and we use phase-shift technology to calculate the depth for every pixel of the 14.6-megapixel camera, so there's no mismatch error between the color texture map and the depth data for any pixel. After just a few seconds of that structured-light pattern, it shows you the result of the capture, and this is a 3D snapshot. You can see the quality of the color texture image pasted on top of the depth data for both the shell, where you can see the intricate 3D detail, and the folds and wrinkles in that dollar bill. Then we can do a simple process of selecting the surfaces we want to keep, and it gets rid of all the noise. In this case I'm showing the dollar bill just to demonstrate the resolution of the color map, but I can also use this functionality to get rid of the dollar bill.
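For readers curious about the phase-shift step, here is a minimal Python sketch of the standard N-step phase recovery that this kind of structured-light scanning builds on. It assumes the k-th projected fringe is A + B*cos(phase - 2*pi*k/N) with N >= 3 patterns, and it collapses the phase-to-depth conversion into a single assumed linear factor; a real scanner like Sprout's would unwrap the phase across the multiple fringe frequencies mentioned above and triangulate against a projector-camera calibration.

```python
import numpy as np

def wrapped_phase(images):
    """Standard N-step phase-shift recovery (N >= 3 patterns).

    images : list of HxW fringe captures, where the k-th projected pattern
             is assumed to be A + B*cos(phase - 2*pi*k/N).
    Returns the per-pixel phase, wrapped to (-pi, pi].
    """
    n = len(images)
    shifts = 2.0 * np.pi * np.arange(n) / n
    num = sum(img * np.sin(s) for img, s in zip(images, shifts))
    den = sum(img * np.cos(s) for img, s in zip(images, shifts))
    return np.arctan2(num, den)

# Assumed calibration constants (placeholders, not real Sprout values).
PHASE_TO_MM = 0.4   # millimetres of height per radian of phase deviation
MAT_PHASE = 0.0     # phase observed on the empty, flat mat

def depth_map(images):
    """Toy per-pixel height estimate: deviation from the flat-mat phase,
    scaled by an assumed linear factor. Phase unwrapping is omitted."""
    return (wrapped_phase(images) - MAT_PHASE) * PHASE_TO_MM
```

Because the depth is computed from the same camera pixels that capture the color fringes, the texture map and the depth data line up pixel for pixel, which is the "no mismatch error" point made above.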

In this case, I just want to keep the shell. So now I've created a 3D scan of the shell, and I can see what it looks like without the color texture map; let me just turn that off like this, and now you can see just the underlying geometry. Okay, but now let's say I have this collection of shells here and I want to share it with someone else. What we can do is click this button here, and it launches HP MyRoom, which is simultaneous remote-collaboration software. It took all the content that was on my workspace and sent it up to our servers, and now what I'm going to do is click someone in my contacts and call them. If we move to the Sprout next to me, both of these are connected to the Internet. This one got a call, so I'm going to accept the call, and you can see the shells are now appearing on this unit. As I move a shell, you can see it moving on the other side, so I could work with a co-worker.

You know, across town or across the world; someone can be on the East Coast and someone on the West Coast, and I could show them what I'm working on. I can even annotate, so everything is happening in real time, and what makes this very unique is that the remote participant could also annotate at the same time. Maybe they pick a red color here, so they can annotate, mark things up, and even move the same objects, like this. Okay, now let's say the remote participant in the collaboration session wants to add a flower to this project that we're both looking at together. They can add the flower and click this camera button, and it takes a capture and reprojects the flower on the participant's side here, one to one. It sends the flower up to the server and back down to the other side, so you can see the flower is now on both sides, and I can pick it up with my cursor here.

So it's as if we're having a face-to-face conversation with physical objects, even though they're digital, and you can imagine how efficient that's going to make the remote collaboration process. Pretty amazing technology: blended reality, going from physical to digital and back to physical, with seamless remote collaboration on a full-fledged, no-compromise PC. That's Sprout by HP, in a nutshell.
