New App Demonstrates Capturing 3D Video From Microsoft’s Kinect Through An iPad 2

Microsoft’s Kinect is an Xbox controller that captures your motion and responds accordingly. Now iOS developer Laan Labs has used the String Augmented Reality SDK to display real-time 3D video and audio recorded from the Kinect, and it makes you think of the sheer range of possibilities for augmenting people into a new reality.

iOS developer Laan Labs demonstrated a concept app (created with the String AR SDK) that can use video captured from Microsoft’s Kinect as augmented-reality content when viewed through an iPad 2 camera. This means that one day you could record a video of yourself with a Kinect camera, transfer the data to an AR card, and send someone a virtual 3D movie greeting of yourself. This could really unleash creative possibilities for your mother-in-law’s next birthday card. The small software company behind the project is run by two brothers, Christopher and Jason Laan, who have electrical and chemical engineering backgrounds, respectively. They describe the homebrew effort as a just-for-fun experiment, but even in their test video it’s easy to spot the potential for creating amusing personal 3D AR content. For example, the demo video shows a test filter that makes the person displayed on the AR card look like Princess Leia being projected from R2-D2 in Star Wars. String recognises framed images and understands where they are in 3D space.
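At heart, this kind of marker tracking is camera-pose estimation: once the SDK knows where the framed image sits relative to the camera, it can project 3D graphics so they appear anchored to it. Here is a minimal pinhole-camera sketch of that projection step; this is generic textbook math, not String’s actual API, and the camera intrinsics and marker size are assumed values for illustration:

```python
import numpy as np

# Assumed camera intrinsics (focal length ~800 px, principal point at
# the centre of a 640x480 frame) -- illustrative, not String's values.
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

# Corners of a hypothetical 10 cm square marker, in the marker's own
# coordinate frame (metres), lying flat in its z = 0 plane.
marker = np.array([[-0.05, -0.05, 0.0],
                   [ 0.05, -0.05, 0.0],
                   [ 0.05,  0.05, 0.0],
                   [-0.05,  0.05, 0.0]])

def project(points, R, t, K):
    """Project 3D points to pixels via the pinhole model: p ~ K (R P + t)."""
    cam = points @ R.T + t          # transform into the camera frame
    pix = cam @ K.T                 # apply the intrinsic matrix
    return pix[:, :2] / pix[:, 2:]  # perspective divide

# Suppose the marker is held facing the camera, half a metre away.
R = np.eye(3)                       # no rotation relative to the camera
t = np.array([0.0, 0.0, 0.5])      # 0.5 m straight ahead
corners = project(marker, R, t, K)
print(corners)  # a square of pixel coordinates centred at (320, 240)
```

Tracking runs this logic in reverse: given the detected pixel corners of the marker, the SDK solves for the pose `[R | t]`, which is then used to render virtual content from the matching viewpoint.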


String is a piece of software you can plug into any iOS project, letting you display rich 3D graphics on top of the camera view as if they existed in the real world. The developers have been adapting and fine-tuning the SDK to match the demands of real-life software development, and that same String SDK is now available to all. Specs and features are below:

  • Frame rate limited only by camera hardware. Typically runs internally at 70 FPS on a 3G, excluding video capture.
  • Track up to 10 images simultaneously.
  • Exceptionally robust tracking.
  • Tracks well even with varying light levels across a single marker.
  • Lets you mimic real-world lighting in your scene by measuring the relative colour and lightness around each marker.
  • Typically uses 120 kB of memory at runtime, excluding video capture.
  • Clean and simple tutorial project to get you started.
  • In-editor preview plug-in for instant testing.
  • A single static library, no header files needed.
  • Add maps, messaging, location tracking and other iOS goodness to your AR apps.


Thanks: (1),(2)
