Android SDK: Supporting Alternative Input Devices

This tutorial explores alternative mobile input methods such as trackballs, mice, and styluses. To do so, we'll focus on enhancing a drawing application to add support for these input devices.

Android devices are now many and varied, with differing hardware controls as well as software versions. In a recent series, we created a drawing app for Android using touchscreen interaction, but some users may be accessing your apps through different methods. For example, the Android platform runs on devices with trackballs, styluses, and even the traditional mouse and keyboard model. In this tutorial, we will run through the options you can explore when making apps accessible to a range of user hardware, as well as outlining additional capabilities these interaction models offer.

We'll be referring throughout this tutorial to options you can see in action if you check out the Android SDK API Demos app. You can find this in your SDK installation folder at samples/android-<version>/ApiDemos. The relevant classes to start looking at are TouchPaint and FingerPaint, although they both refer to other classes in the app. The FingerPaint app uses touchscreen interaction, while the TouchPaint app includes support for trackball, mouse, and stylus interaction. You can open, explore and experiment with the API Demos app in Eclipse, running it on an emulator or device to get acquainted with the functionality.

We won't be building an app in this tutorial, but we will indicate ways to enhance any drawing functions you use to suit your own projects. If you built the app from the drawing series, we will show how the code we cover fits into it, and you can use this tutorial as inspiration for extending its functionality.

The source code download contains the Java classes for the drawing app we created in the series (plus the pattern and opacity follow-ups) with the trackball functionality below added. We will focus primarily on trackball interaction, since it is the most complex and is fundamentally different from touch interaction. We will also use it to explore the details of touch interaction at a deeper level than we did in the series.


1. Preparation

Step 1

Although we won't actually build a new app in this tutorial, let's first run through a sample excerpt of drawing functionality you can use to try out the options we explore below. If you built the drawing series app, you can use it to try out the new options.

A typical Android drawing app will have a custom View representing the canvas area users can draw on. The following demonstrates a basic example of such a View class:
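Here is a minimal sketch of such a class; the field names (drawPath, drawPaint, and so on) follow the style of the series code, but your own implementation may differ:

```java
public class DrawingView extends View {

    private Path drawPath = new Path();    // path currently being drawn
    private Paint drawPaint = new Paint(); // paint for stroking the path
    private Paint canvasPaint;             // paint for drawing the bitmap
    private Canvas drawCanvas;             // canvas backed by the bitmap
    private Bitmap canvasBitmap;           // holds the drawing so far

    public DrawingView(Context context, AttributeSet attrs) {
        super(context, attrs);
        drawPaint.setColor(0xFF660000);
        drawPaint.setAntiAlias(true);
        drawPaint.setStrokeWidth(20);
        drawPaint.setStyle(Paint.Style.STROKE);
        drawPaint.setStrokeJoin(Paint.Join.ROUND);
        drawPaint.setStrokeCap(Paint.Cap.ROUND);
        canvasPaint = new Paint(Paint.DITHER_FLAG);
    }

    @Override
    protected void onSizeChanged(int w, int h, int oldw, int oldh) {
        super.onSizeChanged(w, h, oldw, oldh);
        canvasBitmap = Bitmap.createBitmap(w, h, Bitmap.Config.ARGB_8888);
        drawCanvas = new Canvas(canvasBitmap);
    }

    @Override
    protected void onDraw(Canvas canvas) {
        canvas.drawBitmap(canvasBitmap, 0, 0, canvasPaint);
        canvas.drawPath(drawPath, drawPaint);
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        float touchX = event.getX();
        float touchY = event.getY();
        switch (event.getAction()) {
            case MotionEvent.ACTION_DOWN:
                drawPath.moveTo(touchX, touchY);
                break;
            case MotionEvent.ACTION_MOVE:
                drawPath.lineTo(touchX, touchY);
                break;
            case MotionEvent.ACTION_UP:
                drawCanvas.drawPath(drawPath, drawPaint);
                drawPath.reset();
                break;
            default:
                return false;
        }
        invalidate();
        return true;
    }
}
```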

In this case, the View handles drawing through the onTouchEvent method for touchscreens, so it is ideal for finger-painting style interaction. When the user touches the screen, the path moves to their finger's start point. When they move their finger, still touching the screen, the path adds a line from the previous position to the new one. When the user lifts their finger off the screen, the path is drawn to the canvas and reset.

Notice that in the above class, the drawing is implemented by detecting the X and Y coordinates of where the user's finger has touched the screen, whether they have just touched it, are currently dragging across it, or have just lifted it off. This interaction model works in exactly the same way if the user is drawing with a mouse or stylus, although these offer additional options we will explore below. If the user chooses to draw using a trackball, we need a different implementation, which we will work through next. The trackball event listener also receives a MotionEvent parameter, but the information we can retrieve from it is different.


2. Trackball Interaction

Step 1

If the user is interacting using a trackball, you can handle this in a number of possible ways. In the TouchPaint sample code, moving the trackball on the drawing area is treated the same way as touching and moving the finger, mouse, or stylus over it, with a series of oval shapes painted along the path followed by the user. Since in the drawing app series we adopted a slightly different approach (indicated in the custom View class above) we will translate that to trackball interaction here. The result will be a simplistic one, but will illustrate the differences between trackball and touchscreen interaction.

If touching and moving across the canvas is treated as a drawing action, then we could treat pressing and rolling the trackball in the same way. This means that, just as we detected pressing, moving and lifting the finger, stylus or mouse, we will want to detect the trackball being pressed, rolled, and released. As with the touch interaction, we can start a path where the trackball is pressed, move it using a line when the trackball is rolled, and complete the drawing path operation when it is released.

We will assume that we are only going to handle trackball interaction for drawing on the canvas, so the user will not be able to use the trackball to interact with the other UI elements. Trackballs on Android devices are typically coupled with touch screens.

To detect trackball interaction, add the following method to your View class:
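As a sketch, the method starts out empty; we will fill in the body over the rest of this section:

```java
@Override
public boolean onTrackballEvent(MotionEvent event) {
    // we'll process the trackball movement here
    return true;
}
```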

The method is similar to the touch event listener, receiving a MotionEvent parameter and returning a boolean value indicating whether the event has been handled.

Step 2

When handling trackball actions, we first need to work out which user action triggered the method. We are interested in three actions: pressing the trackball, moving it while pressed, and releasing it after drawing. We only care about the move event while the trackball is pressed, so we need to keep track of presses, indicating that the user is currently drawing. To achieve this, add a boolean variable to your class to record whether or not the trackball is currently pressed:
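The variable name here is illustrative:

```java
// true while the user is pressing the trackball to draw
private boolean trackballDrawing = false;
```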

We can assume that the trackball will be up initially. We will also want to keep track of the user's position as logged via the trackball, so add another couple of instance variables:
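Again, the names are illustrative:

```java
// current drawing position, accumulated from relative trackball movement
private float trackX = 0;
private float trackY = 0;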

We will scale the trackball movement values up since the trackballs on Android devices are typically small and we don't want the user to have to roll theirs too much to draw across the canvas. Add another instance variable for the scaling value we will use:
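The value here is just a suggestion:

```java
// scale factor applied to relative trackball movement
private float trackScale = 5;
```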

You can adjust this if you like. Since the trackball coordinates are relative, we are going to use the width and height of the canvas area as part of our calculation. Add instance variables for these:
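For example:

```java
// dimensions of the drawing canvas, set in onSizeChanged
private int canvasWidth;
private int canvasHeight;
```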

You can initialize these after the canvas size has been set, so for the drawing series app, do it in the onSizeChanged method:
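A sketch, assuming your existing onSizeChanged already creates the bitmap and canvas:

```java
@Override
protected void onSizeChanged(int w, int h, int oldw, int oldh) {
    super.onSizeChanged(w, h, oldw, oldh);
    // ...existing canvas and bitmap setup...
    canvasWidth = w;
    canvasHeight = h;
}
```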

Step 3

Back in your onTrackballEvent method, you can now retrieve information about where the trackball is positioned. In the touch event listener, we simply retrieved the X and Y coordinates from the MotionEvent object using getX and getY. However, when the MotionEvent has been fired by a trackball, these actually indicate different information. With a trackball, the X and Y values are relative rather than absolute, so we must carry out some processing to turn this into a value we can pass to the Path and Canvas objects to draw in the right place.

To use the trackball interaction to draw a line as we did with touch, we can adopt the following technique, which uses a simplified and slightly altered version of the algorithm you will see in the API Demos app. In the onTrackballEvent method, first retrieve the type of event that has occurred:
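For example:

```java
int action = event.getAction();
```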

Let's now calculate the X and Y coordinates we want to pass to the drawing objects:
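A sketch, using the trackX, trackY, and trackScale variables assumed above:

```java
// trackball X/Y values are relative, so accumulate them,
// scaled by the event precision and our scaling factor
trackX += event.getX() * event.getXPrecision() * trackScale;
trackY += event.getY() * event.getYPrecision() * trackScale;
```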

Multiplying the X or Y value by the event's precision gives us a more accurate hardware reading. We also multiply by the scaling value we defined as a variable. Since the trackball coordinates are relative, we need to check that we haven't moved off the canvas area:
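A simple clamp, assuming the canvas dimension variables from above:

```java
// keep the drawing position within the canvas bounds
if (trackX < 0) trackX = 0;
if (trackX > canvasWidth) trackX = canvasWidth;
if (trackY < 0) trackY = 0;
if (trackY > canvasHeight) trackY = canvasHeight;
```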

Step 4

Now let's handle the event that occurs when the user presses the trackball, after calculating the X and Y values:
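For example:

```java
if (action == MotionEvent.ACTION_DOWN) {
    // the user has pressed the trackball
}
```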

Inside the conditional block, first set the boolean variable we created to true:
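For example:

```java
trackballDrawing = true;
```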

Begin the drawing operation using the same technique we used for the touchscreen model:
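Assuming your View's Path object is named drawPath:

```java
drawPath.moveTo(trackX, trackY);
```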

When the user first presses the trackball to draw, it will by default start at the top left corner. For subsequent drawing operations, the starting point will be relative to previous drawing actions. Now we are ready to draw when the user moves the trackball after pressing it - after the conditional block testing for the "down" action:
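A sketch of the move conditional:

```java
else if (action == MotionEvent.ACTION_MOVE) {
    // only draw if the trackball is also pressed
    if (trackballDrawing) {
    }
}
```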

We check for the move event, but only want to implement drawing when the trackball is pressed because the move event will also fire when the user moves the trackball without pressing it. Inside this block, we can draw again using the same technique.
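For example, inside the inner conditional:

```java
drawPath.lineTo(trackX, trackY);
```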

Finally, we can complete drawing when the trackball is released, after the conditional for the move event:
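For example:

```java
else if (action == MotionEvent.ACTION_UP) {
    // the user has released the trackball
}
```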

Now we can again use the same approach as when the user stops touching the screen. If the trackball is currently pressed, we'll know that drawing has been occurring:
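A sketch, assuming Canvas and Paint fields named drawCanvas and drawPaint as in the series code:

```java
if (trackballDrawing) {
    drawCanvas.drawPath(drawPath, drawPaint);
    drawPath.reset();
    trackballDrawing = false;
}
```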

After the "up" event conditional block, complete the method by invalidating the View and returning a true value so that future trackball events will be processed:
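For example:

```java
invalidate();
return true;
```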

Step 5

That completes the basic algorithm for trackball drawing. However, your custom drawing View class will only be able to handle the trackball events if it has focus. In the Activity hosting your custom View (the main Activity in the drawing series app), you can achieve this by dispatching the event as follows:
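A sketch of the override to add to your Activity:

```java
@Override
public boolean onTrackballEvent(MotionEvent event) {
    // pass trackball events on to the custom drawing View
}
```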

Inside this method, you will need a reference to the custom drawing View in your Activity. Give it focus and call the event handling method we added:
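A sketch, assuming drawView is the reference to your custom View, retrieved with findViewById as in the series app:

```java
drawView.requestFocus();
return drawView.onTrackballEvent(event);
```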

Step 6

Trackballs on Android devices are relatively rare now, but you can test your functionality on the emulator. Note, however, that the emulator results will be a little crude. When you create a virtual device, enable trackball support on it. You can then toggle trackball interaction on and off while interacting with your app on the virtual device by pressing F6. You may encounter issues getting the trackball to function on the emulator for certain API levels.

When you try the drawing function using the trackball, emulating trackball presses by clicking your mouse button, you will see instantly that this isn't the most sensitive drawing interaction model. This is why some apps use the approach demonstrated in the API Demos, which involves a slightly more complex algorithm. You will also see the inherent difficulty in handling drawing interaction from a trackball rather than touch, a stylus, or a mouse. The problem is that the user cannot know where their start point is. One possible way to address this would be emulating a cursor while the trackball is in use, moving it to reflect the drawing position before the user presses it so that they can start their drawing operations with more accuracy.

Drawing With Trackball
Pressing F6 while a virtual device is running toggles trackball mode on and off; you can then interact with the virtual trackball using your mouse.
Tip: If you want to learn more about trackball interaction, have a look at the MotionEvent getAxisValue() method, passing MotionEvent.AXIS_X or MotionEvent.AXIS_Y as a parameter. The class defines values for both axes within the range -1 to 1, representing the relative displacement of the trackball along each axis. Full horizontal displacement to the left would be an AXIS_X value of -1, and full displacement downwards an AXIS_Y value of 1, and so on. To map these to a UI element, you could add one, divide by two, and multiply by the View width or height. This would give, for example, an X value of 100 if the X axis value was 0 and the View had a width of 200, since 0 is the halfway point in the full trackball range.
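That mapping can be sketched in plain Java, independent of the Android classes (the class and method names here are purely illustrative):

```java
public class TrackballMapping {

    // Map a relative axis value in the range [-1, 1] to a position in
    // [0, size]: add one, divide by two, multiply by the View dimension.
    public static float axisToPixels(float axisValue, float size) {
        return ((axisValue + 1f) / 2f) * size;
    }
}
```

For example, axisToPixels(0f, 200f) gives 100, the halfway point across a View 200 pixels wide.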

Tip: For a more extensive approach to mapping trackball coordinates to painting operations, see the TouchPaint class in the API Demos. It uses the historical information provided by the MotionEvent to draw a series of points instead of a line, moving each one relative to the last by referring to the getHistoricalX and getHistoricalY methods within a loop.


3. Mouse and Stylus Interaction

Step 1

As mentioned above, your touch event-driven drawing functions will work for mouse and stylus interaction. However, there are other options you can explore to enhance drawing functionality for these devices. For example, for both mouse and stylus, you can detect pressure, tilt, and orientation data, allowing you to implement drawing enhancements such as varying opacity or brush size, depending on your existing algorithms. The tilt and orientation data can be useful in drawing apps where a non-round brush shape is used. In some cases a touchscreen device may also provide pressure and orientation information. All of the relevant methods are in the MotionEvent class.

Check out the following methods for this additional data:
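For example (these are standard MotionEvent methods; exact availability depends on the API level and the hardware):

```java
// inside a MotionEvent handler such as onTouchEvent
float pressure = event.getPressure();                   // contact pressure
float orientation = event.getOrientation();             // tool orientation, in radians
float tilt = event.getAxisValue(MotionEvent.AXIS_TILT); // stylus tilt (API 14+)
int toolType = event.getToolType(0);                    // e.g. TOOL_TYPE_STYLUS, TOOL_TYPE_MOUSE
```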

Step 2

Your apps can detect interaction through the buttons on a mouse or stylus as well as refinements on the drawing function itself. For example, in the TouchPaint class from the API Demos, the buttons are used to change colors and to switch drawing tools. You can also handle hover events for these interaction models, potentially letting you indicate the user's drawing position or implement certain drawing functions before the screen is actually touched.

See the following code excerpts for more on this:
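A sketch of both ideas in a custom View; the behavior in the comments is illustrative, not the exact API Demos code (the button and hover APIs require API level 14 or higher):

```java
// detecting mouse or stylus button presses
@Override
public boolean onTouchEvent(MotionEvent event) {
    if ((event.getButtonState() & MotionEvent.BUTTON_SECONDARY) != 0) {
        // e.g. switch colors or tools on a secondary button press
    }
    // ...existing drawing code...
    return true;
}

// handling hover, e.g. to indicate the drawing position before contact
@Override
public boolean onHoverEvent(MotionEvent event) {
    if (event.getAction() == MotionEvent.ACTION_HOVER_MOVE) {
        // e.g. draw a cursor indicator at event.getX(), event.getY()
    }
    return true;
}
```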

See the API Demos classes for a more detailed overview of how to use these.


Conclusion

We have now covered and indicated a range of user interaction options for drawing apps in Android. If you have been working on the app we initially created in the series, see if you can think of ways to use this new information to enhance the app further. There are lots of other options to consider too, such as different brush types, the ability to set a wider range of colors or patterns, loading images into the app, and sharing images from it. If you think of more and have any luck implementing them, let us know in the comments!
