More Info on InkSeine

On Wednesday, I posted this article about a new application, called InkSeine, being developed within the Microsoft Research Labs. Ken Hinckley, the developer of InkSeine, picked up on the post and left some great comments about InkSeine, as well as answering some users' questions. I'm posting the reader's question and Ken's response here, as it really helps to explain InkSeine. In addition, you can learn more about InkSeine at Ken's website and by watching a more current video here (WMV format).

Comment from Ken Hinckley:
For your information I have some new stuff about InkSeine posted on my home page (http://research.microsoft.com/Users/kenh/Default.htm) and you can find a newer video (.WMV format only right now, sorry) at https://research.microsoft.com/users/kenh/papers/InkSeine-release.wmv. The app is not yet available outside of Microsoft – but I hope that I’ll be able to make a download for people to use relatively soon – maybe later this year.

The video that CTitanic posted is an older prototype of the system that we published at an academic conference. So it had some more experimental stuff in it with quirky requirements (like having a button on your tablet to do mode-switching – something that I wish was on current tablets, but just isn’t widely available yet). The new app has a much more polished “notebook” functionality but does a little bit less with fancy gestures. Anyway, I hope this community finds it of interest.

Reader question from David:
Cool stuff, but wouldn't it make more sense to put the mode switcher on the pen instead of the tablet? It would be a lot more natural that way.

Answer from Ken Hinckley:
To respond to David Martin’s question re: mode switch button on pen vs. tablet, we’ve studied this very carefully.

Using the button on the pen is (1) slower and (2) more error prone (you tend to hit it by mistake when you don’t really want it) than hitting a button on the bezel of the tablet with your nonpreferred hand. The other difficulty is that (3) pressing the pen button interferes with drawing with the pen itself – and if your fingers aren’t already resting right on the button, you have to fumble with your pen to find it. So it can be fairly cumbersome.

Our experimental results have shown that when a suitable non-preferred-hand button is available, on average it takes less than 150 milliseconds to switch modes, with a 1% error rate. Granted, there are times when you’re holding your tablet and can’t be bothered with a button on the bezel with your other hand – but for fast and robust performance it’s very hard to beat.

The other thing that is a little more subtle is that, for multiple stroke gestures, it’s very easy to keep holding down a button with your off hand; then the system knows unambiguously that a series of strokes should all be interpreted as part of the same “gesture”. But if you hold down the pen button instead, the state of the pen button is not sensed as soon as the pen leaves the sensing range of the Tablet PC screen, so it cannot be used in this way. Another way of saying this is that the pen button is really only useful for mode-switching for single-stroke gestures.
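For readers curious how this plays out in practice, here is a minimal Python sketch of the idea Ken describes: strokes drawn while a non-preferred-hand (bezel or keyboard) button is held down are collected into a single gesture, something the pen's own barrel button cannot support because its state is unknown whenever the pen is out of the digitizer's sensing range. This is not InkSeine's actual implementation; the class and callback names are purely illustrative assumptions.

```python
# Minimal sketch (not InkSeine's code) of grouping multi-stroke gestures
# with a button held by the non-preferred hand. All names are hypothetical.

class GestureGrouper:
    """Collects ink strokes into one gesture while a bezel/keyboard
    button held by the non-preferred hand stays down."""

    def __init__(self):
        self.bezel_button_down = False
        self.current_gesture = []   # strokes collected while the button is held
        self.ink_strokes = []       # ordinary ink strokes

    def on_bezel_button(self, pressed: bool):
        # The bezel button is sensed continuously, even while the pen is
        # lifted out of range, so its state can span several strokes.
        if pressed:
            self.bezel_button_down = True
            self.current_gesture = []
        else:
            self.bezel_button_down = False
            if self.current_gesture:
                self.interpret_gesture(self.current_gesture)

    def on_stroke_completed(self, stroke):
        # Each finished stroke is either ordinary ink or part of the
        # gesture being built up while the button is held.
        if self.bezel_button_down:
            self.current_gesture.append(stroke)
        else:
            self.ink_strokes.append(stroke)

    def interpret_gesture(self, strokes):
        print(f"Interpreting {len(strokes)} strokes as one gesture")

# The pen barrel button cannot play this role: once the pen lifts away from
# the screen between strokes, its button state is no longer reported, so it
# can only mark a mode for the single stroke during which it is held.
```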

In the updated video I posted above, we use the button-on-pen solution despite its warts, since this prototype is for use on existing tablets, which do not have useful bezel buttons for mode switching. I use the pen button myself in daily use, but still end up hitting it by mistake way too often. By the way, the pens that ship with the Motion slate Tablet PCs are really nice and offer the best integrated pen button that I have found.

Mode switching is still a very tough problem for these kinds of systems, and one that we continue to research in the hope of finding elegant new solutions that are fast and reliable to use – but it’s really a tough nut to crack, and to be perfectly blunt, I’m not totally satisfied with any of the approaches we’ve come up with so far. So we continue to work on this problem.

Ken

PS: So-called “modeless” approaches where the system infers whether your strokes are meant to be ink annotations, or gestures that act on that ink, are rife with their own challenges and problems as well. So in my view, that’s not the way to go either. But there are other researchers out there who would disagree with me on this point.
