This tutorial explores alternative mobile interaction methods such as trackballs, mice, and styluses. To put the concepts into practice, we'll focus on enhancing a drawing application to support these input devices.
Android devices are now many and varied, with differing hardware controls as well as software versions. In a recent series, we created a drawing app for Android using touchscreen interaction, but some users may be accessing your apps through different methods. For example, the Android platform runs on devices with trackballs, styluses, and even the traditional mouse and keyboard model. In this tutorial, we will run through the options you can explore when making apps accessible to a range of user hardware, as well as outlining additional capabilities these interaction models offer.
We'll be referring throughout this tutorial to options you can see in action if you check out the Android SDK API Demos app. You can find this in your SDK installation folder at samples/android-<version>/ApiDemos. The relevant classes to start looking at are TouchPaint and FingerPaint, although they both refer to other classes in the app. The FingerPaint app uses touchscreen interaction, while the TouchPaint app includes support for trackball, mouse, and stylus interaction. You can open, explore and experiment with the API Demos app in Eclipse, running it on an emulator or device to get acquainted with the functionality.
We won't be building an app in this tutorial, but we will point out ways you can enhance whatever drawing functions you use to suit the aims of your own projects. We will also indicate how to apply the code we cover if you built the app from the drawing series, so you can use this tutorial as inspiration for extending that app's functionality.
The source code download contains the Java classes for the drawing app we created in the series (plus pattern and opacity follow-ups) with the trackball functionality below added. We will be focusing primarily on trackball interaction since it is the most complex and is fundamentally different to touch interaction. We will also use this to explore the details of touch interaction at a deeper level than we did in the series.
1. Preparation
Step 1
Although we won't actually build a new app in this tutorial, let's first run through a sample excerpt of drawing functionality you can use to try out the options we explore below. If you built the drawing series app, you can use it to try out the new options.
A typical Android drawing app will have a custom View representing the canvas area users can draw on. The following demonstrates a basic example of such a View class:
public class DrawingView extends View {

    //drawing path
    private Path drawPath;
    //drawing and canvas paint
    private Paint drawPaint, canvasPaint;
    //initial color
    private int paintColor = 0xFFFF0000;
    //canvas
    private Canvas drawCanvas;
    //canvas bitmap
    private Bitmap canvasBitmap;

    //constructor
    public DrawingView(Context context, AttributeSet attrs){
        super(context, attrs);
        setupDrawing();
    }

    //prepare drawing
    private void setupDrawing(){
        drawPath = new Path();
        drawPaint = new Paint();
        drawPaint.setColor(paintColor);
        drawPaint.setAntiAlias(true);
        drawPaint.setStrokeWidth(50);
        drawPaint.setStyle(Paint.Style.STROKE);
        drawPaint.setStrokeJoin(Paint.Join.ROUND);
        drawPaint.setStrokeCap(Paint.Cap.ROUND);
        canvasPaint = new Paint(Paint.DITHER_FLAG);
    }

    //view assigned size
    @Override
    protected void onSizeChanged(int w, int h, int oldw, int oldh) {
        super.onSizeChanged(w, h, oldw, oldh);
        canvasBitmap = Bitmap.createBitmap(w, h, Bitmap.Config.ARGB_8888);
        drawCanvas = new Canvas(canvasBitmap);
    }

    //draw view
    @Override
    protected void onDraw(Canvas canvas) {
        canvas.drawBitmap(canvasBitmap, 0, 0, canvasPaint);
        canvas.drawPath(drawPath, drawPaint);
    }

    //respond to touch interaction
    @Override
    public boolean onTouchEvent(MotionEvent event) {
        float touchX = event.getX();
        float touchY = event.getY();
        //respond to down, move and up events
        switch (event.getAction()) {
            case MotionEvent.ACTION_DOWN:
                drawPath.moveTo(touchX, touchY);
                break;
            case MotionEvent.ACTION_MOVE:
                drawPath.lineTo(touchX, touchY);
                break;
            case MotionEvent.ACTION_UP:
                drawPath.lineTo(touchX, touchY);
                drawCanvas.drawPath(drawPath, drawPaint);
                drawPath.reset();
                break;
            default:
                return false;
        }
        //redraw
        invalidate();
        return true;
    }
}
In this case, the View facilitates drawing using the onTouchEvent listener for touch screens, so this is ideal for finger-painting style interaction. When the user touches the screen, the path moves to their finger's start point. When they move their finger, still touching the screen, the path takes a line from the previous position to the new one. When the user takes their finger off the screen, the path is submitted and drawn.
Notice that in the above class, the drawing is implemented by detecting the X and Y co-ordinates of where the user's finger has touched the screen, whether they have just touched it, are currently dragging it across, or have just lifted it off. This interaction model works in exactly the same way if the user is drawing with a mouse or stylus, although these offer additional options we will explore below. If the user chooses to draw using a trackball, we need a different implementation we will work through next. The trackball event listener also receives a MotionEvent parameter, but the information we can retrieve from it is different.
2. Trackball Interaction
Step 1
If the user is interacting using a trackball, you can handle this in a number of possible ways. In the TouchPaint sample code, moving the trackball on the drawing area is treated the same way as touching and moving the finger, mouse, or stylus over it, with a series of oval shapes painted along the path followed by the user. Since in the drawing app series we adopted a slightly different approach (indicated in the custom View class above) we will translate that to trackball interaction here. The result will be a simplistic one, but will illustrate the differences between trackball and touchscreen interaction.
If touching and moving across the canvas is treated as a drawing action, then we could treat pressing and rolling the trackball in the same way. This means that, just as we detected pressing, moving and lifting the finger, stylus or mouse, we will want to detect the trackball being pressed, rolled, and released. As with the touch interaction, we can start a path where the trackball is pressed, move it using a line when the trackball is rolled, and complete the drawing path operation when it is released.
We will assume that we are only going to handle trackball interaction for drawing on the canvas, so the user will not be able to use the trackball to interact with the other UI elements. This is usually acceptable, since trackballs on Android devices are typically paired with a touchscreen.
To detect trackball interaction, add the following method to your View class:
@Override
public boolean onTrackballEvent(MotionEvent event) {
    //respond to trackball interaction
}
The method is similar to the touch event listener: it receives a MotionEvent parameter and returns a boolean value indicating whether you have handled the event.
Step 2
When handling trackball actions, we first need to work out which user action triggered the method. We are interested in three actions: pressing the trackball, moving it while it is pressed, and releasing it after drawing. We only care about the move event while the trackball is pressed, so we need to keep track of whether a press is in progress, which tells us the user is currently drawing. To achieve this, add a boolean variable to your class to record whether or not the trackball is currently pressed:
private boolean trackOn=false;
We can assume that the trackball will be up initially. We will also want to keep track of the user's position as logged via the trackball, so add another couple of instance variables:
private float trackX, trackY;
We will scale the trackball movement values up, since the trackballs on Android devices are typically small and we don't want the user to have to roll the ball too far to draw across the canvas. Add another instance variable for the scaling value we will use:
private int scale=5;
You can adjust this if you like. Since the trackball coordinates are relative, we are going to use the width and height of the canvas area as part of our calculation. Add instance variables for these:
private int canvasWidth, canvasHeight;
You can initialize these after the canvas size has been set, so for the drawing series app, do it in the onSizeChanged method:
canvasWidth = canvasBitmap.getWidth();
canvasHeight = canvasBitmap.getHeight();
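For reference, the onSizeChanged method from the View class above then looks like this:

//view assigned size
@Override
protected void onSizeChanged(int w, int h, int oldw, int oldh) {
    super.onSizeChanged(w, h, oldw, oldh);
    canvasBitmap = Bitmap.createBitmap(w, h, Bitmap.Config.ARGB_8888);
    drawCanvas = new Canvas(canvasBitmap);
    //store the canvas dimensions for the trackball calculations
    canvasWidth = canvasBitmap.getWidth();
    canvasHeight = canvasBitmap.getHeight();
}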
Step 3
Back in your onTrackballEvent method, you can now retrieve information about where the trackball is positioned. In the touch event listener, we simply retrieved the X and Y coordinates from the MotionEvent object using getX and getY. However, when the MotionEvent has been fired by a trackball, these actually indicate different information. With a trackball, the X and Y values are relative rather than absolute, so we must carry out some processing to turn this into a value we can pass to the Path and Canvas objects to draw in the right place.
To use the trackball interaction to draw a line as we did with touch, we can adopt the following technique, which uses a simplified and slightly altered version of the algorithm you will see in the API Demos app. In the onTrackballEvent method, first retrieve the type of event that has occurred:
int action = event.getActionMasked();
Let's now calculate the X and Y co-ordinates we want to pass to the drawing objects:
trackX += event.getX() * event.getXPrecision() * scale;
trackY += event.getY() * event.getYPrecision() * scale;
Multiplying the X and Y offsets by the event's precision gives us a more accurate value for the hardware, and we also multiply by the scaling number we defined as a variable. Since the trackball coordinates are relative, we need to clamp the accumulated position so that it cannot move off the canvas area:
if(trackX >= canvasWidth) trackX = canvasWidth-1;
if(trackY >= canvasHeight) trackY = canvasHeight-1;
if(trackX < 0) trackX = 0;
if(trackY < 0) trackY = 0;
Step 4
Now let's handle the event that occurs when the user presses the trackball, after calculating the X and Y values:
if (action==MotionEvent.ACTION_DOWN) {
    //trackball pressed
}
Inside the conditional block, first set the boolean variable we created to true:
trackOn=true;
Begin the drawing operation using the same technique we used for the touchscreen model:
drawPath.moveTo(trackX, trackY);
When the user first presses the trackball to draw, it will by default start at the top left corner. For subsequent drawing operations, the starting point will be relative to previous drawing actions. Now we are ready to draw when the user moves the trackball after pressing it - after the conditional block testing for the "down" action:
if (action==MotionEvent.ACTION_MOVE && trackOn) {
    //trackball pressed and moving
}
We check for the move event, but only want to implement drawing when the trackball is pressed, because the move event will also fire when the user rolls the trackball without pressing it. Inside this block, we can draw again using the same technique:
drawPath.lineTo(trackX, trackY);
Finally, we can complete drawing when the trackball is released, after the conditional for the move event:
if(action==MotionEvent.ACTION_UP) {
    //trackball released
}
Now we can again use the same approach as when the user stops touching the screen. If the trackball is currently pressed, we'll know that drawing has been occurring:
if(trackOn){
    drawCanvas.drawPath(drawPath, drawPaint);
    drawPath.reset();
    trackOn=false;
}
After the "up" event conditional block, complete the method by invalidating the View and returning a true value so that future trackball events will be processed:
invalidate();
return true;
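Pulling these steps together, the complete handler looks something like the following sketch, which assumes the instance variables defined above and includes the clamping from Step 3:

//respond to trackball interaction
@Override
public boolean onTrackballEvent(MotionEvent event) {
    int action = event.getActionMasked();
    //trackball values are relative, so accumulate and scale them
    trackX += event.getX() * event.getXPrecision() * scale;
    trackY += event.getY() * event.getYPrecision() * scale;
    //keep the position on the canvas
    if(trackX >= canvasWidth) trackX = canvasWidth-1;
    if(trackY >= canvasHeight) trackY = canvasHeight-1;
    if(trackX < 0) trackX = 0;
    if(trackY < 0) trackY = 0;
    if (action==MotionEvent.ACTION_DOWN) {
        //trackball pressed - start a new path section
        trackOn=true;
        drawPath.moveTo(trackX, trackY);
    }
    if (action==MotionEvent.ACTION_MOVE && trackOn) {
        //trackball pressed and moving - extend the path
        drawPath.lineTo(trackX, trackY);
    }
    if(action==MotionEvent.ACTION_UP) {
        //trackball released - commit the path if we were drawing
        if(trackOn){
            drawCanvas.drawPath(drawPath, drawPaint);
            drawPath.reset();
            trackOn=false;
        }
    }
    //redraw
    invalidate();
    return true;
}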
Step 5
That completes the basic algorithm for trackball drawing. However, your custom drawing View class will only receive trackball events if it has focus. In the Activity hosting your custom View (the main Activity in the drawing series app), you can make sure the events reach it by overriding the Activity's dispatchTrackballEvent method:
@Override
public boolean dispatchTrackballEvent(MotionEvent ev) {
    //trackball event
}
Inside this method, you will need a reference to the custom drawing View in your Activity. Request focus on the View, then pass the event to the handler method we added:
drawView.requestFocus();
return drawView.onTrackballEvent(ev);
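For reference, the complete override is short. This sketch assumes the custom View instance is stored in a field named drawView, as in the drawing series app:

//route trackball events to the drawing View
@Override
public boolean dispatchTrackballEvent(MotionEvent ev) {
    drawView.requestFocus();
    return drawView.onTrackballEvent(ev);
}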
Step 6
Trackballs on Android devices are relatively rare now, but you can test your functionality on the emulator. Note, however, that the emulator results will be a little crude. When you create a virtual device, enable trackball support on it. You can then toggle trackball interaction on and off while interacting with your app on the virtual device by pressing F6. You may encounter issues getting the trackball to function on the emulator for certain API levels.
When you try the drawing function using the trackball, emulating trackball presses by clicking your mouse button, you will see instantly that this isn't the most sensitive drawing interaction model. This is why some apps use the approach demonstrated in the API Demos, which involves a slightly more complex algorithm. You will also see the inherent difficulty in handling drawing interaction from a trackball rather than touch, a stylus, or a mouse. The problem is that the user cannot know where their start point is. One possible way to address this would be emulating a cursor while the trackball is in use, moving it to reflect the drawing position before the user presses it so that they can start their drawing operations with more accuracy.
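As a rough illustration of that idea (not something the drawing series app includes), you could keep a hypothetical boolean field such as showCursor, set it from onTrackballEvent, and paint a small marker at the current trackball position from onDraw, using a separate cursorPaint so the marker doesn't inherit the brush settings:

//inside onDraw, after drawing the bitmap and the current path
if(showCursor){
    //hypothetical flag set from onTrackballEvent; marks the current trackball position
    canvas.drawCircle(trackX, trackY, 10, cursorPaint);
}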
Tip: For a more extensive approach to mapping trackball coordinates to painting operations, see the TouchPaint class in the API Demos. It uses the historical information provided by the MotionEvent to draw a series of points instead of a line, moving each one relative to the last by referring to the getHistoricalX and getHistoricalY methods within a loop.
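The pattern that class uses looks roughly like the following simplified sketch (not the exact API Demos code), which steps through the batched movements recorded in the MotionEvent and paints a dot for each one instead of drawing a single line segment; clamping is omitted for brevity:

//inside onTrackballEvent - step through the batched trackball movements
int historySize = event.getHistorySize();
for (int i = 0; i < historySize; i++) {
    //each historical value is a relative offset, scaled as before
    trackX += event.getHistoricalX(i) * event.getXPrecision() * scale;
    trackY += event.getHistoricalY(i) * event.getYPrecision() * scale;
    //paint a dot at each intermediate position
    drawCanvas.drawPoint(trackX, trackY, drawPaint);
}
//then handle the current event.getX() and event.getY() values as usual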
3. Mouse and Stylus Interaction
Step 1
As mentioned above, your touch event-driven drawing functions will work for mouse and stylus interaction. However, there are other options you can explore to enhance drawing functionality for these hardware items. For example, for both mouse and stylus, you can detect pressure, tilt, and orientation data, allowing you to implement drawing improvements such as increased opacity or perhaps brush size, depending on your existing algorithms. The tilt and orientation data can be used in drawing apps where a non-round brush shape is being used. In some cases a touchscreen device may also provide pressure and orientation information. All of the relevant methods are in the MotionEvent class.
Check out the following methods for this additional data:
//used in event handlers
motionEvent.getPressure();
motionEvent.getTouchMajor();
motionEvent.getTouchMinor();
motionEvent.getOrientation();
MotionEvent.AXIS_DISTANCE;
MotionEvent.AXIS_TILT;
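For example, pressure could be mapped to stroke opacity inside the touch event handler. Here is a minimal sketch that simply scales the paint's alpha by the reported pressure before the path is drawn:

//inside onTouchEvent, before drawing with drawPaint
float pressure = event.getPressure();
//some devices can report values above 1.0, so clamp the reading
if (pressure > 1.0f) pressure = 1.0f;
//harder presses produce a more opaque stroke
drawPaint.setAlpha((int) (pressure * 255));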
Step 2
Your apps can detect interaction through the buttons on a mouse or stylus as well as refinements on the drawing function itself. For example, in the TouchPaint class from the API Demos, the buttons are used to change colors and to switch drawing tools. You can also handle hover events for these interaction models, potentially letting you indicate the user's drawing position or implement certain drawing functions before the screen is actually touched.
See the following code excerpts for more on this:
//used in event handlers tailored to specific input tools
motionEvent.getToolType(pointerIndex);
motionEvent.getButtonState();
MotionEvent.BUTTON_PRIMARY;
MotionEvent.BUTTON_SECONDARY;
MotionEvent.BUTTON_TERTIARY;
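As a rough sketch of how these might be combined, the following checks the tool type and button state inside the touch handler, and responds to hover movement in a separate handler; the eraser and cursor behaviors are hypothetical placeholders:

//inside onTouchEvent - check what the user is drawing with
int toolType = event.getToolType(0);
if (toolType == MotionEvent.TOOL_TYPE_STYLUS
        && (event.getButtonState() & MotionEvent.BUTTON_SECONDARY) != 0) {
    //stylus button held down - for example, switch to an eraser tool
}

//in the custom View - respond to a stylus or mouse hovering over the canvas
@Override
public boolean onHoverEvent(MotionEvent event) {
    if (event.getActionMasked() == MotionEvent.ACTION_HOVER_MOVE) {
        //for example, show a cursor at event.getX(), event.getY()
    }
    return true;
}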
See the API Demos classes for a more detailed overview of how to use these.
Conclusion
We have now covered and indicated a range of user interaction options for drawing apps in Android. If you have been working on the app we initially created in the series, see if you can think of ways to use this new information to enhance the app further. There are lots of other options to consider too, such as different brush types, the ability to set a wider range of colors or patterns, loading images into the app, and sharing images from it. If you think of more and have any luck implementing them, let us know in the comments!