So yesterday, I was chatting on a developer e-mail list about how you can use basic statistics to find the most prevalent color in a camera frame.
One thing led to another, which in my case meant sample code followed by -- hey, this is kind of cool. Maybe I should put it in the App Store. (Thanks go to Roland Gröpmair, who provided the on-screen buttons you see in the screenshot.) That led to a flurry of coding, a submission to the App Store, and a chat with some dev buddies on IRC afterwards about how I built the thing.
It works by sampling the live camera preview provided by the iPad 2's AV Foundation framework. The centermost portion of each frame provides the data, which the app runs through a simple histogram. It then picks the most statistically significant color in that region.
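If you're curious about that step, here is a minimal Swift sketch of the idea, not the shipping app's code: it coarsely buckets the pixels in the central region of a frame and picks the most frequent bucket. The 32BGRA pixel format, the 16-level buckets, and the dominantColor name are my assumptions for illustration.

```swift
import CoreVideo

// Coarsely bucket RGB values (16 levels per channel) over the central
// region of a BGRA pixel buffer and return the most frequent bucket.
// Assumes the buffer uses kCVPixelFormatType_32BGRA.
func dominantColor(in pixelBuffer: CVPixelBuffer) -> (r: UInt8, g: UInt8, b: UInt8)? {
    CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

    guard let base = CVPixelBufferGetBaseAddress(pixelBuffer) else { return nil }
    let width    = CVPixelBufferGetWidth(pixelBuffer)
    let height   = CVPixelBufferGetHeight(pixelBuffer)
    let rowBytes = CVPixelBufferGetBytesPerRow(pixelBuffer)
    let bytes    = base.assumingMemoryBound(to: UInt8.self)

    // Sample only the centermost quarter of the frame.
    var histogram = [Int: Int]()   // bucket key -> pixel count
    for y in (height / 4)..<(3 * height / 4) {
        for x in (width / 4)..<(3 * width / 4) {
            let p = y * rowBytes + x * 4          // BGRA layout
            let b = Int(bytes[p])     >> 4        // keep 4 bits per channel
            let g = Int(bytes[p + 1]) >> 4
            let r = Int(bytes[p + 2]) >> 4
            histogram[(r << 8) | (g << 4) | b, default: 0] += 1
        }
    }

    guard let (key, _) = histogram.max(by: { $0.value < $1.value }) else { return nil }
    // Re-expand the winning bucket to an approximate 8-bit color.
    return (UInt8((key >> 8) & 0xF) << 4,
            UInt8((key >> 4) & 0xF) << 4,
            UInt8(key & 0xF) << 4)
}
```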
Greg Hartstein looked at what I had built and basically said: "Hey, you know, I bet I could convert that code into a kind of Kinect." He demanded a dark room, a standard iPad 2, and a bit of time to massage the code.
His take on the problem works like this: instead of looking for color, Hartstein measures proximity by looking for pixels that reflect light back to the iPad. In a totally dark room, the iPad 2's screen provides the only light source. The brighter the image the camera sensor picks up, the closer the object is to the screen. An on-screen meter provides the "proximity" feedback, but this code could easily be adapted into a game where your nearness triggers responses without having to touch the screen.
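Here is a rough sketch of how that brightness-as-proximity idea could look. It is my own reconstruction, not Hartstein's code; the ProximityEstimator class, the front-camera setup, and the green-channel shortcut are all assumptions. An AVFoundation capture pipeline averages the center of each frame and reports it as a 0-to-1 reading:

```swift
import AVFoundation

// Estimates "proximity" as the mean brightness of the central region of each
// camera frame: in a dark room lit only by the screen, a closer object
// reflects more light back into the front camera.
final class ProximityEstimator: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()
    var onReading: ((Double) -> Void)?     // 0.0 (far/dark) ... 1.0 (close/bright)

    func start() {
        guard let camera = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                   for: .video, position: .front),
              let input = try? AVCaptureDeviceInput(device: camera) else { return }
        session.addInput(input)

        let output = AVCaptureVideoDataOutput()
        output.videoSettings =
            [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "proximity"))
        session.addOutput(output)
        session.startRunning()
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let buffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        CVPixelBufferLockBaseAddress(buffer, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(buffer, .readOnly) }

        guard let base = CVPixelBufferGetBaseAddress(buffer) else { return }
        let width    = CVPixelBufferGetWidth(buffer)
        let height   = CVPixelBufferGetHeight(buffer)
        let rowBytes = CVPixelBufferGetBytesPerRow(buffer)
        let bytes    = base.assumingMemoryBound(to: UInt8.self)

        // Average the green channel over the central quarter of the frame
        // as a cheap stand-in for luminance.
        var total = 0, count = 0
        for y in (height / 4)..<(3 * height / 4) {
            for x in (width / 4)..<(3 * width / 4) {
                total += Int(bytes[y * rowBytes + x * 4 + 1])   // G in BGRA
                count += 1
            }
        }
        onReading?(Double(total) / Double(count) / 255.0)
    }
}
```

A meter view, or a game, would then just observe onReading and treat higher values as "closer."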
Even the iOS devices that ship with a proximity sensor cannot report a variable reading from it -- the sensor is either triggered or it is not. So Hartstein's approach offers a fun take on proximity that's far more flexible.
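For comparison, the built-in sensor's public API exposes only a Boolean. A quick Swift snippet shows about all you can get from it:

```swift
import UIKit

// The built-in proximity sensor reports only on/off, not distance.
UIDevice.current.isProximityMonitoringEnabled = true
let token = NotificationCenter.default.addObserver(
    forName: UIDevice.proximityStateDidChangeNotification,
    object: nil, queue: .main) { _ in
    // proximityState is a Bool: true when something is near the sensor.
    print("near:", UIDevice.current.proximityState)
}
// Keep `token` alive for as long as you want the callback.
```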
Admittedly, you do end up having to game in a dark room in front of an iPad, but we all have to make sacrifices somewhere.
An early video demonstration of his work follows.
Source: TUAW