Intuitive tech: the future of human-computer interaction

Thorsten Stremlau

Monday 4 April 2016

Want to know how we’ll interact with our PCs in the future? Thorsten Stremlau, Lenovo’s Worldwide Principal IT Architect, says it’ll be more responsive than ever.

Technology lets us interact with our computers in more ways than ever before. But – as history shows us – only the most intuitive methods will catch on.

We’ve basically been stuck with the keyboard and mouse for the last 30 years. Alternatives have come and gone. Voice interaction emerged in the late ’90s – I did technical support for IBM’s ViaVoice, which let you dictate to your computer. But it struggled with background noise, and the sound cards of the day didn’t work properly with it, so it never gained widespread adoption.

Around the same time, the first wave of touchscreens appeared on CRT monitors. The technology was much more primitive in those days – it had none of the pinch-to-zoom or swipe-to-navigate functions of today’s touchscreens. It wasn’t innovative enough, and didn’t capture people’s attention in the same way.

Lenovo’s heritage

Lenovo has a great track record of innovation in this area. All of our ThinkPad laptops have the TrackPoint – the pencil-eraser-sized red dot on the keyboard between the G, H and B keys. It’s used to move the cursor, and it’s pressure-sensitive: the firmer you push, the faster the cursor moves.
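To make that idea concrete – this is just an illustrative sketch, not Lenovo’s actual TrackPoint firmware – a pointing-stick driver can map the pressure reading to cursor speed through a non-linear transfer curve, so a light touch gives precision and a firm push gives speed:

```python
# Illustrative sketch only – not Lenovo's TrackPoint driver. It shows the
# general idea of a pressure-to-velocity transfer function: light pressure
# nudges the cursor precisely, firm pressure ramps the speed up quickly.

def cursor_velocity(pressure: float, max_speed: float = 800.0,
                    exponent: float = 2.0, dead_zone: float = 0.05) -> float:
    """Map a normalised pressure reading (0.0-1.0) to cursor speed in px/s."""
    if pressure < dead_zone:          # ignore accidental grazes of the stick
        return 0.0
    # Rescale past the dead zone, then apply a power curve: small values stay
    # small (precision), values near 1.0 ramp up fast (speed).
    p = (pressure - dead_zone) / (1.0 - dead_zone)
    return max_speed * (p ** exponent)

if __name__ == "__main__":
    for p in (0.1, 0.3, 0.6, 1.0):
        print(f"pressure {p:.1f} -> {cursor_velocity(p):6.1f} px/s")
```

The exponent is the tuning knob: a value above 1 keeps slow movements slow for pixel-accurate pointing, while still letting a hard push fling the cursor across the screen.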

IBM – whose Personal Computing Division was bought by Lenovo in 2005 – introduced the ThinkPad TransNote, one of the first laptops to support a pen tool. But, because so few applications supported pen input, it didn’t take off. It was a big success with the police, though, as it was very practical for issuing tickets.

We’re still leading the way when it comes to pen input. Every device in our touch portfolio is pen-enabled, and we have this awesome technology called AnyPen, which lets you draw or write on-screen using almost any object – a carrot, a screwdriver, even a nail. The pen used to be a primitive way of inputting data into a device, but now it’s a primary one, especially on tablets. Hence the huge growth in apps equipped for pen input (Microsoft estimates the number of Windows Active Pen and Touch devices sold has grown 50 per cent year-on-year). That’s why we’re investing in this technology.

Mightier than the pen?

But pen is just the tip of the input iceberg. With Siri, Google Now and Cortana, speech input is becoming more viable – though for now only in personal environments, where talking to your device doesn’t disturb everyone around you.

Gesture control is also becoming more popular. This is like Minority Report made real: using 3D cameras, you can move the cursor, and left- and right-click, just by waving your hands in front of the screen. However, for everyday PC use it’s not all that intuitive – it’s more effort than using a keyboard, for one thing. But it is gaining traction with things like supermarket displays and other digital signage.
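As a rough illustration of what such a system does under the hood – a toy sketch with made-up coordinates, not any vendor’s real gesture SDK – the driver typically maps a small “interaction box” of air in front of the camera onto the screen:

```python
# Toy sketch, not a real 3D-camera SDK. Assumed input: the hand's position in
# camera space (x = left/right, y = height, both in metres). It shows how a
# region of air in front of the sensor can be mapped linearly onto the screen.
from typing import Tuple

SCREEN_W, SCREEN_H = 1920, 1080
BOX_X = (-0.25, 0.25)   # interaction box, metres left/right of the sensor
BOX_Y = (0.10, 0.45)    # interaction box, metres above the sensor

def hand_to_screen(hx: float, hy: float) -> Tuple[int, int]:
    """Linearly map a hand position inside the interaction box to pixels."""
    nx = (hx - BOX_X[0]) / (BOX_X[1] - BOX_X[0])
    ny = (hy - BOX_Y[0]) / (BOX_Y[1] - BOX_Y[0])
    nx = min(max(nx, 0.0), 1.0)   # clamp so the cursor can't leave the screen
    ny = min(max(ny, 0.0), 1.0)
    return int(nx * (SCREEN_W - 1)), int((1.0 - ny) * (SCREEN_H - 1))

if __name__ == "__main__":
    print(hand_to_screen(0.0, 0.275))   # hand centred -> middle of the screen
```

A click then has to be a separate gesture the camera recognises – a pinch, say, or a push towards the screen – which is exactly why it feels like more effort than a mouse button.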

In some cases, you don’t need to use your hands at all. Lenovo is working with Tobii, a Swedish company whose technology tracks your eye movements and uses them to move the cursor. Stephen Hawking uses a system called ACAT – the Assistive Context Aware Toolkit – that lets him input text just by using his eyes; it’s about twice as fast as typing on a phone. This isn’t just for people who have a disability, but also for people whose jobs keep their hands occupied, letting them use a PC at the same time. It could also be used for augmented and virtual reality, letting users type text by looking at a menu in their smartglasses.
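For a sense of what an eye-tracking driver has to cope with – a minimal sketch, not Tobii’s or ACAT’s actual software – raw gaze data is jittery, because the eye makes constant micro-movements, so the samples are usually smoothed before they drive the cursor:

```python
# Minimal sketch of gaze smoothing; the input stream here is hard-coded test
# data, not Tobii's real SDK. Raw gaze samples jitter because the eye makes
# constant micro-movements, so we low-pass filter them before moving the cursor.
from typing import Iterable, Iterator, Tuple

Point = Tuple[float, float]

def smooth_gaze(samples: Iterable[Point], alpha: float = 0.2) -> Iterator[Point]:
    """Exponential moving average: smaller alpha = steadier cursor, more lag."""
    sx = sy = None
    for x, y in samples:
        if sx is None:
            sx, sy = x, y                     # seed with the first sample
        else:
            sx = alpha * x + (1.0 - alpha) * sx
            sy = alpha * y + (1.0 - alpha) * sy
        yield (sx, sy)

if __name__ == "__main__":
    jittery = [(100, 100), (104, 97), (99, 103), (101, 100), (300, 300)]
    for cx, cy in smooth_gaze(jittery):
        print(f"cursor at ({cx:6.1f}, {cy:6.1f})")
```

The trade-off is stability versus responsiveness: the heavier the smoothing, the steadier the cursor, but the more it lags behind where you’re actually looking.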

Looking further into the future, we have mind control. A company called Emotiv makes a headset that lets you move the cursor and click just by thinking – though it requires a lot of concentration. It’s early days for this technology: the sensors are very expensive, and you have to smear saline solution into your hair to stick them on, which isn’t very pleasant. But in the future it could be used for controlling lamps, or even steering cars.

PCs can even read the user’s mood. A company called Sightcorp can monitor the emotions of an audience using tablets placed in front of each person, feeding the presenter real-time feedback on how satisfied the audience is. It can even break the results down by gender, so you can see whether you’re reaching your male or female audience. Though sometimes you might not want to know!

Intuitiveness is key

The potential success of any interaction method comes down to one factor: is it intuitive enough? Touch and sight feel natural because they’re how we interact with the world. Thinking is more abstract, so it’s a lot trickier to gauge.

Human-computer interaction is gaining more and more capabilities, but it isn’t necessarily becoming more intuitive. The coming years will bring a whole host of new ways of interacting with PCs. But we shouldn’t kiss our keyboards goodbye just yet.
