Android Getting Native Active Pen Support in Ice Cream Sandwich

Among the many new features sneaking into Android 4.0 Ice Cream Sandwich is built-in support for stylus input, specifically the kind needed for tablets with active pen digitizers.

Current Android devices with active pen input, such as the HTC Flyer and Samsung Galaxy Note, rely on custom software. HTC, for example, uses a system called Scribe for pen input, which it has opened to developers so they can make their own pen-aware apps. The new API in Ice Cream Sandwich, however, makes that custom software unnecessary (or perhaps it is that custom software incorporated into ICS).

Stylus input, button support, hover events

Android 4.0 includes full support for stylus input events, including tilt and distance axes, pressure, and related motion event properties. To help applications distinguish motion events from different sources, the platform adds distinct tool types for stylus, finger, mouse, and eraser. For improved input from multi-button pointing devices, the platform now provides distinct primary, secondary, and tertiary buttons, as well as back and forward buttons. Hover-enter and hover-exit events are also added, for improved navigation and accessibility. Developers can build on these new input features to add powerful interactions to their apps, such as precise drawing and gesturing, handwriting and shape recognition, improved mouse input, and others.

While identified as “stylus input,” what’s clearly described is active pen input. Tilt, pressure, pen buttons, eraser, hover: none of these are possible with finger-substitute styluses. They require an active digitizer, such as those from Wacom and N-Trig, that can detect proximity and input from electronic pens. This is true pen functionality, not “sausage” support.

But before you get too excited, keep in mind that this API is a tool for developers to implement pen input in their own apps. It’s not like Windows tablets, where pen input acts as a mouse substitute and therefore works everywhere. Pen input on Android tablets has been limited to pen-specific tasks, like writing notes; it doesn’t replace finger touch input. Basically, added software is still needed to take advantage of this pen support; the API just makes that software easier to build. Advanced features, such as handwriting recognition, are still not baked (or frozen) into Ice Cream Sandwich. And of course, active digitizer hardware is still required.
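To give a sense of what building on this looks like, the new tool types surface through Android’s existing MotionEvent class (via methods like getToolType() and getPressure()). Below is a minimal sketch of the kind of dispatch logic a pen-aware app might use; the constant values mirror the platform’s, but the StubEvent class is a hypothetical stand-in so the logic can be shown outside the Android SDK.

```java
// Sketch of tool-type dispatch, assuming Android 4.0's MotionEvent API.
// The constants mirror android.view.MotionEvent; StubEvent is a
// hypothetical stand-in for MotionEvent, for illustration only.
public class PenDispatchSketch {
    // Tool types introduced in API level 14 (Ice Cream Sandwich)
    static final int TOOL_TYPE_FINGER = 1;
    static final int TOOL_TYPE_STYLUS = 2;
    static final int TOOL_TYPE_MOUSE  = 3;
    static final int TOOL_TYPE_ERASER = 4;

    // Minimal stand-in for android.view.MotionEvent
    static class StubEvent {
        final int toolType;
        final float pressure;
        StubEvent(int toolType, float pressure) {
            this.toolType = toolType;
            this.pressure = pressure;
        }
        int getToolType(int pointerIndex) { return toolType; }
        float getPressure(int pointerIndex) { return pressure; }
    }

    // Decide how to treat an event based on what produced it:
    // stylus tip draws with pressure, the eraser end erases,
    // a mouse moves a pointer, and anything else is plain touch.
    static String classify(StubEvent e) {
        switch (e.getToolType(0)) {
            case TOOL_TYPE_STYLUS: return "draw, pressure=" + e.getPressure(0);
            case TOOL_TYPE_ERASER: return "erase";
            case TOOL_TYPE_MOUSE:  return "pointer";
            default:               return "touch";
        }
    }

    public static void main(String[] args) {
        System.out.println(classify(new StubEvent(TOOL_TYPE_STYLUS, 0.75f)));
        System.out.println(classify(new StubEvent(TOOL_TYPE_ERASER, 0f)));
        System.out.println(classify(new StubEvent(TOOL_TYPE_FINGER, 1f)));
    }
}
```

The point is that the branching, pressure handling, and everything downstream of it (stroke rendering, handwriting recognition, palm rejection) is still the app developer’s job; the platform only delivers cleanly labeled events.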

Also of interest is the mention of mouse support and navigation buttons, which indicates more functionality for tablets while desk-docked.

Via Reddit by way of Liliputing and my friends at