Here's another example of where HCI is headed: multi-touch sensing.
If you have seen an iPod touch or iPhone from Apple, or the Microsoft Surface coffee-table computer, you have seen early examples of what multi-touch computing is all about. Single-touch computing operates like a mouse: touching moves the pointer, tapping is like clicking, and tap-and-drag works like click-and-drag.
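To see how thin that single-touch-as-mouse mapping really is, here's a minimal sketch using the standard browser TouchEvent API. The element id and the handler names are hypothetical placeholders, not anything from Apple's or Microsoft's actual software:

```ts
// Sketch: treating a single touch point exactly like a mouse.
// "canvas" and the drag handlers below are made-up placeholders.
const surface = document.getElementById("canvas")!;

surface.addEventListener("touchstart", (e: TouchEvent) => {
  const t = e.touches[0];            // single-touch: only the first contact matters
  beginDrag(t.clientX, t.clientY);   // analogous to mousedown
});

surface.addEventListener("touchmove", (e: TouchEvent) => {
  const t = e.touches[0];
  moveTo(t.clientX, t.clientY);      // analogous to mousemove while dragging
  e.preventDefault();                // keep the browser from scrolling instead
}, { passive: false });              // preventDefault requires a non-passive listener

surface.addEventListener("touchend", () => {
  endDrag();                         // analogous to mouseup; a quick down-up is a "tap" (click)
});

// Placeholder handlers; a real app would move a pointer or drag an object here.
function beginDrag(x: number, y: number) { console.log("down at", x, y); }
function moveTo(x: number, y: number)    { console.log("move to", x, y); }
function endDrag()                       { console.log("up"); }
```

Nothing in that mapping needs more than one contact point, which is why a resistive single-touch screen can fake a mouse so convincingly.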
These early commercial products add a second recognized touch point, letting you resize and rotate on-screen objects with two separate fingers. Multi-touch doesn't have to stop at two touch points, though, nor is it limited to the two-dimensional operations we associate with mouse-as-pointer interaction. Check out Jeff Han's page at NYU on multi-touch interfaces, which describes the technology this way:
"While touch sensing is commonplace for single points of contact, multi-touch sensing enables a user to interact with a system with more than one finger at a time, as in chording and bi-manual operations. Such sensing devices are inherently also able to accommodate multiple users simultaneously, which is especially useful for larger interaction scenarios such as interactive walls and tabletops."
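To make the two-finger resize-and-rotate gesture mentioned above concrete, here's a short sketch of the geometry behind it: the ratio of the distances between the fingers gives the scale factor, and the change in the angle of the line between them gives the rotation. This is an illustrative reconstruction, not code from any of the products mentioned here, and the function names are invented:

```ts
// Sketch: deriving scale and rotation from two touch points.
type Point = { x: number; y: number };

function distance(a: Point, b: Point): number {
  return Math.hypot(b.x - a.x, b.y - a.y);
}

function angle(a: Point, b: Point): number {
  return Math.atan2(b.y - a.y, b.x - a.x);
}

// Given the two fingers' positions when the gesture began and now,
// return how much to scale and rotate the object under them.
function twoFingerTransform(
  start: [Point, Point],
  current: [Point, Point]
): { scale: number; rotation: number } {
  return {
    scale: distance(current[0], current[1]) / distance(start[0], start[1]),
    rotation: angle(current[0], current[1]) - angle(start[0], start[1]),
  };
}

// Example: the fingers move apart and the hand twists a quarter turn.
const t = twoFingerTransform(
  [{ x: 0, y: 0 }, { x: 100, y: 0 }],
  [{ x: 0, y: 0 }, { x: 0, y: 200 }]
);
console.log(t.scale);                    // 2  (object doubles in size)
console.log(t.rotation * 180 / Math.PI); // 90 (object rotates 90 degrees)
```

With more than two contact points the same idea generalizes, which is what makes gestures like chording and whole-hand manipulation possible.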
Perceptive Pixel is the company Han spun off from NYU to commercialize his inventions. You can search YouTube for more videos of Han demonstrating his work at various conference and media venues.