Multi-touch on Mac

Not long after Brad blogged about multi-touch in May, I got my hands on a brand new MacBook Pro, with the excuse that I would implement multi-touch support for Qt on Mac. This machine has a big and sexy trackpad that goes beyond normal mouse control to also include multi-touch and gesture support.

After jumping into the implementation, my first thought was that this was going to be a fairly quick job. The API from Apple is rather straightforward. But after a short while, we realized that a touchpad (or trackpad, in the case of a Mac) differs more from a touch screen than we had originally anticipated. The main difference is that on a touch screen, you place your finger directly on the widget you want to interact with, and Qt then identifies that widget as the receiver of any following touch events. With a touchpad, you cannot place your finger directly on a widget in the same fashion; you need to move the mouse to the widget first using the touchpad. This means that your first touch (and all following touches) will go to the widget under the cursor when the move starts, and not to the widget you are navigating to. You would then need the extra step of lifting all your fingers and putting them back on the pad once the cursor is over the correct widget. In practice, this feels wrong and unintuitive.

After some hallway discussions and try-outs, we found that which widget to send touch events to should only be decided after the second finger has been pressed. This is almost the same as saying that you will only get touch events when you are not controlling the mouse. This will normally be enough for applications that use multi-touch for implementing gestures. If you are implementing single-touch gestures, or otherwise need to handle the first touch, we added a flag you can set to override this default behavior (see the sketch below). On a touch screen, you will always get all touch events.
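As a minimal sketch of how this might look from application code, the widget below opts in to touch events and, optionally, to single-finger delivery from the touchpad. The attribute names Qt::WA_AcceptTouchEvents and Qt::WA_TouchPadAcceptSingleTouchEvents are taken from the Qt 4.6 touch API as it was taking shape; treat the details as illustrative rather than final.

    #include <QtGui>

    class TouchWidget : public QWidget
    {
    public:
        TouchWidget(QWidget *parent = 0) : QWidget(parent)
        {
            // Opt in to touch events; otherwise only mouse events arrive.
            setAttribute(Qt::WA_AcceptTouchEvents);
            // Override the default: also deliver single-finger touches from
            // the touchpad, instead of waiting for the second finger.
            setAttribute(Qt::WA_TouchPadAcceptSingleTouchEvents);
        }

    protected:
        bool event(QEvent *event)
        {
            switch (event->type()) {
            case QEvent::TouchBegin:
            case QEvent::TouchUpdate:
            case QEvent::TouchEnd: {
                QTouchEvent *touch = static_cast<QTouchEvent *>(event);
                foreach (const QTouchEvent::TouchPoint &point, touch->touchPoints())
                    qDebug() << "touch" << point.id() << "at" << point.pos();
                return true;
            }
            default:
                return QWidget::event(event);
            }
        }
    };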

A second problem regarding single touch (on a touchpad only) is the process of mapping touchpad position to screen position. When the touch begins, it is easy: the screen position should be where the mouse is. But then it gets murkier. The mouse will now move on the screen with a user-defined speed and acceleration. So when your finger moves two centimeters on the touchpad, the mouse might move ten centimeters on the screen. In that case, should the screen position of the touch move by two centimeters? Or should the touch follow the mouse? The user would normally expect the latter (since there is a logical connection between the touchpad and the mouse), especially if there are no visual cues on screen to mark the touch. However, implementing a strategy like this grows in complexity (and inconsistency) once you start moving more fingers on the touchpad. And besides, recognizing gestures with mouse acceleration applied is not for the faint of heart. Conclusion: a touch will move on screen 1:1 with the touchpad. If you need to know where the mouse is, use mouse events.
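To illustrate the difference, a touch point's screen position (screenPos() in the Qt 4.6 touch API) tracks the pad 1:1, while the actual pointer position reflects the user's mouse settings. A small, hypothetical helper that prints both might look like this:

    #include <QtGui>

    // Compare the touch point's on-screen position (moves 1:1 with the pad)
    // with the real cursor position (subject to mouse speed and acceleration).
    static void reportPositions(const QTouchEvent *touch)
    {
        foreach (const QTouchEvent::TouchPoint &point, touch->touchPoints()) {
            QPointF touchOnScreen = point.screenPos(); // 1:1 with the touchpad
            QPoint cursorOnScreen = QCursor::pos();    // where the pointer really is
            qDebug() << "touch" << point.id() << "at" << touchOnScreen
                     << "cursor at" << cursorOnScreen;
        }
    }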

In some cases, it might be more useful to know where the fingers are on the touchpad rather than on the screen. To accommodate this, we added what is called a normalized position to the touch events. This position ranges from zero to one, with the upper left corner of the touchpad being (0, 0) and the lower right being (1, 1). With this information you can implement your own screen mapping, if you like.
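As a rough example, assuming the normalized position is exposed as normalizedPos() on a touch point (as in the Qt 4.6 API), mapping it onto a widget's own geometry could look like this:

    #include <QtGui>

    // Map a normalized pad position, (0, 0) upper left to (1, 1) lower right,
    // onto the widget's own geometry, bypassing the regular screen mapping.
    static QPointF mapPadToWidget(const QTouchEvent::TouchPoint &point,
                                  const QWidget *widget)
    {
        const QPointF n = point.normalizedPos();
        return QPointF(n.x() * widget->width(), n.y() * widget->height());
    }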

Finally, there are certain limitations to using multi-touch. The API available from Apple is Cocoa-only, and only for Mac OS X 10.6 (Snow Leopard). As a result, multi-touch for Qt/Mac will only be available for the Cocoa port running on OS X 10.6. On the upside, gestures are supported starting with OS X 10.5.2, in both Carbon and Cocoa, and Qt will of course mirror the same support.
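If you want to enable touch-specific behavior only where the platform supports it, a hypothetical runtime guard could combine the Cocoa define with a version check. The QT_MAC_USE_COCOA define and QSysInfo::MacintoshVersion are existing Qt facilities; the helper itself (and the assumption that an MV_10_6 enum value is available) is just an illustration:

    #include <QtGui>

    // Returns true only on the Cocoa port running on Mac OS X 10.6 or later,
    // which is where the multi-touch API is available.
    static bool multiTouchAvailable()
    {
    #if defined(Q_WS_MAC) && defined(QT_MAC_USE_COCOA)
        return QSysInfo::MacintoshVersion >= QSysInfo::MV_10_6;
    #else
        return false;
    #endif
    }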

