Interesting report from New Scientist: Researchers from Microsoft and UC Berkeley are developing a touch computing system that can digitize physical objects placed on its surface and, more impressively, do the reverse: project digital objects onto physical ones. It is similar to Microsoft Surface, but with visual recognition of, and interaction with, physical objects.
Andy Wilson and colleagues at Microsoft Research in Redmond, Washington, teamed up with Björn Hartmann at the University of California, Berkeley, to design Pictionaire, a touchscreen table 1.8 metres long. The device is positioned directly beneath a ceiling-mounted camera and projector, which can “read” and respond to items placed on the table.
For example, if you put a sketchbook face up on the surface, the overhead camera will recognize it, allowing the computer to follow its position and project a digital copy of the sketch if you “drag” it away from one of the sketchbook’s corners. More impressively, you can take that digital sketch and drag it onto a different, physical sketchbook, letting a user easily trace and copy it to paper.
Arguably, it would be a lot easier to just do all the sketching on screen, but Andy Wilson of Microsoft makes this point:
“Contrast this with a tablet PC,” says Wilson. “I don’t care how well made it is – it won’t replicate the true feeling of pencil on paper. And designers are very particular about the type of paper they use, the style of pens.”
True enough. I personally prefer the feel of pen on screen, which cannot be replicated by pencil on paper. In fact, as my handwriting samples show, I’m much better on screen than on paper, so I’m sure the reverse is true of many people. This is a big step toward making the two come together seamlessly.