Gestures
The primary method of user interaction on Harmattan devices is the touch screen, which supports multipoint touch events. The device detects different types of touch events, called gestures, that you can use in your application.
In Harmattan applications, basic touch events such as tapping a Button are handled automatically. For example, you do not need to explicitly set the Button area to listen to tap events. However, for more advanced gestures, such as swipes, or multipoint touch gestures, such as pinches and rotations, you need to implement specific listeners or UI elements.
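As a minimal illustration of the automatic handling described above (the Page structure, text, and handler body are placeholders, assuming the Qt Quick Components for MeeGo Harmattan), a Button only needs an onClicked handler; no gesture area or event filtering is required:

```qml
import QtQuick 1.1
import com.nokia.meego 1.0

Page {
    // Tap detection on the Button is automatic: the platform
    // delivers the touch event and emits clicked() for you.
    Button {
        anchors.centerIn: parent
        text: "Tap me"
        onClicked: console.log("Button tapped")
    }
}
```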
Qt Quick applications can use the QML Flickable element to handle swipe gestures and the QML PinchArea element to handle multipoint touch gestures. For example, the following code snippet creates a full-screen Rectangle containing a PinchArea whose target, a text string, reacts to rotate and pinch (or spread) gestures. Each pinch update is logged to the console.
Rectangle {
    height: 854
    width: 480
    anchors.top: titleBar.bottom
    anchors.horizontalCenter: parent.horizontalCenter

    Label {
        id: pinchItem
        anchors.centerIn: parent
        text: "Hello World!"
        color: "red"
        font.bold: true
        font.pixelSize: 32
    }

    PinchArea {
        anchors.fill: parent
        pinch.target: pinchItem
        pinch.maximumRotation: 360
        pinch.minimumRotation: -360
        pinch.minimumScale: 0.1
        pinch.maximumScale: 50
        onPinchUpdated: {
            console.log("Pinch rotation:", pinch.rotation.toFixed(2),
                        ", scale: ", pinch.scale.toFixed(2))
        }
    }
}
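For the swipe case, a Flickable can be sketched in the same way (the image source and content dimensions below are placeholder assumptions): the element reacts to flick and drag gestures automatically, and contentWidth and contentHeight define the area the user can swipe across.

```qml
import QtQuick 1.1

// Flickable handles swipe (flick) gestures automatically.
Flickable {
    width: 480
    height: 854
    contentWidth: image.width
    contentHeight: image.height

    Image {
        id: image
        source: "large-image.png"  // placeholder asset
    }

    // movementEnded is emitted when the view stops moving
    // after a flick or drag.
    onMovementEnded: console.log("Swipe finished")
}
```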
For using gestures in plain Qt, see Gestures overview.
For using gestures in MeeGo Touch, see the libmeegotouch module in Harmattan Platform API reference.