Touch screens are flooding our homes and schools, making technology more accessible than ever. But as the industry continues to develop at ever-increasing speed, can the touch screen outlive the mouse?
The majority of ClassThink articles are related to the iPad, so you might think that I use Apple’s tablet regularly. But the truth is that I rarely use an iPad at all. A good portion of my workday is spent in front of a touch screen, but personally I’ve just never found it that compelling a device.
Once the luster of the hardware has faded, I’m left at a loss as to what I’m supposed to be doing, how I’m supposed to be working. I find that my workflow is broken, and simple things, like being able to pass data between apps, just don’t work. Many network managers and technicians feel the same, which is why a lot of schools are struggling to convince IT staff that iPads are essential for teaching.
I’ve peeled the cellophane and unboxed so many iPads that they lost their wonder a long time ago. I no longer look at them lustfully as the pinnacle of modern technological design — they are but peas to be shelled, again, and again, and again. I have the same feeling towards the iPad that a Tesco shelf-stacker likely has towards Heinz beans.
[pullquote]With the iPad I feel restricted, like I’m working with one arm tied behind my back.[/pullquote]
A mouse and keyboard is where I’m most comfortable. I’m a desktop savant. I can Jackie Chan around Windows like nobody’s business. With the iPad I feel restricted, like I’m working with one arm tied behind my back. But I’ve grown up with the keyboard and mouse. I’ve had the best part of twenty years to become one with the desktop — the majority of people haven’t. It’s because of this that I can appreciate why many non-tech people find the iPad the greatest invention since divided wheat products.
Apple’s vision of the touch screen, which has taken the industry by storm, works so well because it removes the physical hurdle that a mouse and keyboard put between the user and the software. Yes, equally “good” touch screens are available on other tablets, but Apple’s understanding of user interface — which many in IT consider limiting — feels accessible and friendly to the average person. You only have to watch a two-year-old navigating an iPad to understand why the device is so compelling for many.
[pullquote]Most people used to using a mouse really don’t appreciate what a strange thing it is.[/pullquote]
Most people used to using a mouse really don’t appreciate what a strange thing it is. You grip onto a plastic nodule and manipulate it around on a horizontal flat surface while a tiny on-screen pointer translates that horizontal movement into vertical movement on the screen. A mouse is not a natural metaphor for translating human analog input into digital.
The result is that you have to learn to use a mouse, in the same way you have to learn to drive, or use a pair of chopsticks. If you put the effort in, the payoff is great, but for many the effort simply isn’t worth the end result, and so a barrier exists which filters out huge groups of people from ever touching a computer. That’s sad.
Developing the motor skills required to use a mouse is an immense task. It’s for this reason that my three-year-old son can navigate Netflix on an iPad but can’t on my laptop. The same is true of older people who have never used a desktop. The touch screen removes the barrier between the user and the software. With a touch screen, when you press an on-screen button you’re actually pressing a button; when you slide to the next photo, that’s exactly what you’re doing. A touch screen allows for 1:1 interaction.
Despite this, I predict that the touch screen will have a shorter lifespan than the mouse. The ultimate user interface is one that is invisible to the user, anticipating needs and providing services without direct interaction.
Touch screens are the most accessible form of digital input we have, but they’re not the ultimate input method. In fact, when we look back in ten years’ time we’ll consider touch as antiquated as we now consider the trackball or the ball mouse.
We’re already starting to see glimpses of this in consumer products like Microsoft’s Kinect, Google Now, and Google Glass. These technologies use visual and verbal input, as well as data stored about the user, to provide output. Our technology has started to become good at predicting what we will want before we ask for it. At the moment this is limited to trivial functions, such as watching films or guessing where you will likely travel next, but in the future the majority of device input will be voice or environmental, and the touch screen will be limited to very specific applications.