MPX already supports multiple input devices, which blows pretty much all assumptions about input in user interfaces out of the water. Now I’ve gone one step further and added support for multi-touch displays. Have a look at this video: http://www.youtube.com/watch?v=olWjnfBoY8E
I did not build some kind of touchscreen or tracking system.
I did not build some kind of gesture recognition system.
I built the stuff in between.

A while ago I started thinking about what multi-touch and gesture support could look like. Looking around on the web and in the research literature, I found that all the multi-touch systems out there are hacks (I’m talking about software integration here, not the hardware!). Multi-touch support needs to be in the windowing system. Any client-side approach is wrong. (Feel free to disagree with me on that.)
So how can we get multi-touch gestures into the windowing system?
We don’t need gesture support in X. Gestures depend a lot on the context. A gesture in one context can mean something different in a different context. And the only thing that knows the context is the application. This is very similar to a button press. Pressing a mouse button can mean a zillion different things, depending where and when it happens. That’s why all X does is relay the button press to a client application, which then does the right thing.
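For the plain-pointer case, this split already exists today: X delivers the raw ButtonPress and the client supplies the meaning. A minimal Xlib sketch of that division of labour (link with -lX11):

```c
/* Standard Xlib pattern: X only reports *that* a button was pressed
 * and where; the client decides what the press means in context. */
#include <X11/Xlib.h>
#include <stdio.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy)
        return 1;

    Window win = XCreateSimpleWindow(dpy, DefaultRootWindow(dpy),
                                     0, 0, 200, 200, 0, 0, 0xffffff);
    XSelectInput(dpy, win, ButtonPressMask);
    XMapWindow(dpy, win);

    for (;;) {
        XEvent ev;
        XNextEvent(dpy, &ev);
        if (ev.type == ButtonPress) {
            /* Same event, a zillion possible meanings: only the client
             * knows whether (x, y) is over a menu, a canvas, a button. */
            printf("button %u at %d/%d\n",
                   ev.xbutton.button, ev.xbutton.x, ev.xbutton.y);
        }
    }
}
```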
What we really need is a way to convey events from a touch device to a client application. MPX now has a new type of event: BlobEvents. Each event carries a bounding box, a hotspot and a bitmap describing the contents of the bounding box. The device driver generates these events and passes them to X. X relays them to the correct client.
A BlobEvent has a device-id and a field to specify a subdevice. The device-id should map to the user; each device represents one user. The subdevice is the body part the user is using. If your device is smart enough, it can specify that the blob was generated by user 1, middle finger, right hand.
The BlobEvent bitmap can have multiple formats: 1 bit per pixel, 8 bits per pixel, 32 bits per pixel, whatever your device can think up. Just remember: the client has to know how to read the data.
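The post doesn’t spell out the actual event layout, so here is a purely hypothetical C sketch of what a BlobEvent could carry, pulling together the fields described above. All names and types are illustrative, not MPX’s real protocol definition:

```c
/* Hypothetical sketch of the data a BlobEvent carries, assembled from
 * the description above. Field names and layout are illustrative only. */
typedef enum {
    BLOB_FORMAT_1BPP,   /* 1 bit per pixel   */
    BLOB_FORMAT_8BPP,   /* 8 bits per pixel  */
    BLOB_FORMAT_32BPP   /* 32 bits per pixel */
} BlobFormat;

typedef struct {
    int            device_id;            /* maps to the user; one device per user */
    int            subdevice;            /* body part, e.g. a particular finger   */
    short          bbox_x, bbox_y;       /* bounding box origin                   */
    unsigned short bbox_w, bbox_h;       /* bounding box size                     */
    short          hotspot_x, hotspot_y; /* the point the event "acts" at         */
    BlobFormat     format;               /* how to interpret the bitmap below     */
    unsigned char *bitmap;               /* contents of the bounding box; the     */
                                         /* client must know how to read it       */
} BlobEvent;
```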
Windowing system benefits
Now, all that could be seen as just another way of transporting touch events. Right. But remember, it’s in the windowing system. That means we know exactly which application is where on the screen, and thus which application to send the event to. So we can run two or more touch applications at the same time and it’ll just work.
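To make the routing idea concrete, here is a simplified, self-contained hit test in C. It is a stand-in for the server’s real picking code, not MPX source: walk the window stack from the top down and deliver the blob to the first window whose geometry contains the hotspot. The WinRect type and pick_window function are mine, purely for illustration:

```c
/* Illustrative hit test: the server knows each window's geometry, so
 * it can route a blob to whichever window contains the hotspot. */
typedef struct {
    int x, y, w, h;     /* window geometry in screen coordinates */
} WinRect;

/* Returns the index of the topmost window containing (x, y), or -1.
 * 'wins' is ordered bottom-to-top, like the server's window stack. */
static int pick_window(const WinRect *wins, int nwins, int x, int y)
{
    for (int i = nwins - 1; i >= 0; i--)
        if (x >= wins[i].x && x < wins[i].x + wins[i].w &&
            y >= wins[i].y && y < wins[i].y + wins[i].h)
            return i;
    return -1;
}
```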
The other thing about BlobEvents is that MPX can automatically emulate a core pointer event for each BlobEvent. And an X Input pointer event. This is where things start getting interesting. Using BlobEvents you can have your multi-touch photo-sorting-lava-lamp application running on the same screen as your standard GNOME, KDE, etc. applications and use them all at the same time.
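MPX does this emulation inside the server, but the effect is easy to demonstrate from the client side with the standard XTest extension: move the core pointer to the blob’s hotspot and fake a button click. A sketch, assuming a hotspot position (hx, hy); the emulate_core_tap helper is hypothetical, and XTest is only an analogy here, since MPX itself doesn’t need it (link with -lX11 -lXtst):

```c
/* Client-side analogy of what MPX does in the server: turn a blob's
 * hotspot into core pointer events via the XTest extension. */
#include <X11/Xlib.h>
#include <X11/extensions/XTest.h>

/* Emulate a tap at the blob hotspot (hx, hy) as a core-pointer click. */
static void emulate_core_tap(Display *dpy, int hx, int hy)
{
    XTestFakeMotionEvent(dpy, -1, hx, hy, 0); /* -1: current screen, 0 ms delay */
    XTestFakeButtonEvent(dpy, 1, True, 0);    /* button 1 press   */
    XTestFakeButtonEvent(dpy, 1, False, 0);   /* button 1 release */
    XFlush(dpy);
}
```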
None of this matters to your multi-touch driver. It sends blob events; the server takes care of the rest. You’re guaranteed to be able to interact with any X application. X doesn’t care about the hardware. You can use your DiamondTouch, your FTIR table or – if you can afford one – your MS Surface table.
Oh. And by the way. You can use a standard mouse and keyboard on the same box as you use the touchscreen. After all, everything is just a device.
This is the last big change. I’ll now focus on getting MPX stable enough to put it upstream.