Counterclockwise: the touchscreen evolution

The human-computer interface has been a challenge since the dawn of computing. As far as smartphones are concerned, the clear winner is the touchscreen. But how did we get here?

The proto-touchscreen was the light pen, which was first developed in the 50s and 60s and was available to consumers as an accessory for 8-bit computers in the 80s. The light pen is a simple but clever solution that works only with CRT monitors - it senses the electron beam as it scans across the phosphor, drawing the image line by line, pixel by pixel. When the pen senses the beam, the computer simply notes which pixel was being drawn at that moment - that's how it knows where the pen is pointing on the screen.
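Since the beam's position is a pure function of time, decoding the pen's location comes down to arithmetic. Here is a minimal sketch, assuming a simplified CRT with a fixed pixel clock and known blanking - all the timing values are illustrative, not taken from any real video standard:

```python
# Minimal sketch of light-pen position decoding, assuming a simplified CRT
# with a fixed pixel clock and known horizontal blanking (all timing values
# here are illustrative, not from any real video standard).

PIXEL_CLOCK_HZ = 8_000_000     # time per pixel = 1 / PIXEL_CLOCK_HZ
PIXELS_PER_LINE = 320          # visible pixels per scanline
H_BLANK_PIXELS = 80            # horizontal blanking, in pixel periods
LINES_PER_FRAME = 200          # visible scanlines

def pen_position(seconds_since_vsync: float) -> tuple[int, int]:
    """Convert the time at which the pen saw the beam into (x, y)."""
    pixels_elapsed = int(seconds_since_vsync * PIXEL_CLOCK_HZ)
    line_length = PIXELS_PER_LINE + H_BLANK_PIXELS
    y, x = divmod(pixels_elapsed, line_length)
    if x >= PIXELS_PER_LINE or y >= LINES_PER_FRAME:
        raise ValueError("beam was in a blanking interval")
    return x, y

# If the pen fired 1.25 ms after vertical sync, which pixel was lit?
print(pen_position(0.00125))   # -> (0, 25)
```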


A light pen from 1969

In the 80s a different solution was developed - infrared beams criss-crossed the screen. Touching the screen (with a finger or stylus) blocked some of the beams, and the computer registered a press. This kind of tech was used by Neonode in one of the earliest all-touch phones (the N1, which came out in 2003). The company left the phone business quickly, but it still makes touchscreen kits for computers.
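Locating the touch is straightforward: the blocked vertical beams give the X coordinate and the blocked horizontal beams give the Y coordinate. A minimal sketch, assuming one emitter/detector pair per row and per column (the grid size is illustrative):

```python
# Minimal sketch of an infrared-grid touchscreen, assuming one emitter/detector
# pair per row and per column (layout and resolution are illustrative).

def locate_touch(columns_blocked: list[bool], rows_blocked: list[bool]):
    """A touch blocks column and row beams; their intersection gives the
    touch point. Averaging handles a finger wide enough to block several
    adjacent beams. Returns None if nothing interrupts the grid."""
    xs = [i for i, blocked in enumerate(columns_blocked) if blocked]
    ys = [i for i, blocked in enumerate(rows_blocked) if blocked]
    if not xs or not ys:
        return None
    return sum(xs) / len(xs), sum(ys) / len(ys)

# A finger blocking columns 3-4 and row 2 reads as a touch at (3.5, 2.0).
cols = [False] * 8
cols[3] = cols[4] = True
rows = [False] * 6
rows[2] = True
print(locate_touch(cols, rows))
```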

Neonode N1

The PDA and early touchscreen smartphone eras were defined by the resistive touchscreen. It featured two thinly-separated conductive layers, which made an electrical connection when you pressed down on them. Styluses were typically used - their thin tips reduced the force required to press down and allowed more accurate input on the fairly small screens of the day.
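Reading such a screen amounts to two voltage measurements: drive one layer with a voltage gradient, and the other layer, contacting it at the press point, acts as a probe. A minimal sketch - drive_layer() and read_adc() are hypothetical stand-ins for real microcontroller GPIO/ADC calls, and the 10-bit ADC range is an assumption:

```python
# Minimal sketch of reading a 4-wire resistive touchscreen. drive_layer() and
# read_adc() are hypothetical stand-ins for real microcontroller GPIO/ADC
# calls; the 10-bit ADC range is an assumption.

ADC_MAX = 1023  # 10-bit ADC

def read_touch(drive_layer, read_adc):
    """Pressing the screen connects the two resistive layers. Driving one
    layer with a voltage gradient turns the other into a probe: the voltage
    it picks up is proportional to the touch position along that axis."""
    drive_layer("X")                      # voltage gradient across the X layer
    x = read_adc("Y") / ADC_MAX           # Y layer samples the X position
    drive_layer("Y")                      # then swap the roles of the layers
    y = read_adc("X") / ADC_MAX
    return x, y                           # both normalized to 0.0-1.0

# Example with stub hardware: a touch at roughly (0.3, 0.7).
pos = read_touch(lambda axis: None,
                 lambda axis: 716 if axis == "X" else 307)
print(pos)
```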

O2 XDA Atom Exec

The Sony Ericsson P800 put an interesting twist on this. It was an all-touch phone (running Symbian UIQ), but it hedged its bets and offered a flip-out keypad (this was in 2002, when all-touch devices were rare). Each key simply had a stylus-like tip on its back, so pushing a key really pressed the touchscreen behind it.

Sony Ericsson P800

Capacitive touchscreens work another way - your finger changes the capacitance of the screen, which is picked up by a sensor grid. The tech is designed to work specifically with fingers, so most styluses (and even gloved fingers) don't register. Apple didn't invent capacitive touchscreens, but the original iPhone was certainly the biggest contributor to their rise to stardom.
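The controller scans the grid and compares each cell against a no-touch baseline; a weighted centroid over the cells the finger disturbs yields a position finer than the grid itself. A minimal sketch with illustrative readings:

```python
# Minimal sketch of locating a finger on a capacitive sensor grid: the finger
# adds capacitance to nearby electrodes, and a weighted centroid over the
# per-cell deltas gives a position finer than the grid itself. The readings
# below are illustrative.

def centroid(deltas: list[list[float]], threshold: float = 5.0):
    """deltas[row][col] = measured capacitance minus the no-touch baseline."""
    total = wx = wy = 0.0
    for y, row in enumerate(deltas):
        for x, d in enumerate(row):
            if d > threshold:          # ignore noise below the threshold
                total += d
                wx += d * x
                wy += d * y
    if total == 0:
        return None                    # no touch detected
    return wx / total, wy / total

readings = [
    [0, 1, 0, 0],
    [1, 9, 12, 1],
    [0, 7, 10, 0],
    [0, 1, 1, 0],
]
print(centroid(readings))   # finger sits between cells, near (1.6, 1.5)
```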

Apple iPhone

The early phones with capacitive screens had a separate touch-sensitive layer. Later on, "in-cell" tech allowed this layer to be embedded into the display itself. The best-known example of that tech is Samsung's Super AMOLED. The advantage is that without the extra layer, the image appears closer to your finger and glare is reduced too.

Samsung I9000 Galaxy S

With the Xperia sola, Sony removed the "touch" from touchscreen. The Floating Touch sensor could track your finger at a distance, allowing you to "hover" over elements without pressing them (an interaction typically reserved for computer mice). A few other phones tried this, but the tech didn't catch on.

Sony Xperia sola

Apple tried something else - Force Touch can sense how hard you're pressing down, which enabled additional interactions. It was introduced with the original Apple Watch, but iPhones got it too, starting with the 6s (where, confusingly, it's known as 3D Touch). Apple has been neglecting the feature in recent versions of iOS, though.
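Pressure effectively adds a third dimension to a tap. A minimal sketch of the idea, loosely modeled on 3D Touch's two-stage press - the thresholds and action names are illustrative, not Apple's actual values:

```python
# Minimal sketch of how pressure adds an interaction dimension, loosely
# modeled on 3D Touch's two-stage press; the thresholds and action names
# are illustrative, not Apple's actual values.

def classify_press(force: float, max_force: float) -> str:
    """force is the raw sensor reading; normalize it so the thresholds
    stay meaningful across devices with different sensor ranges."""
    level = force / max_force
    if level < 0.3:
        return "tap"            # ordinary touch
    elif level < 0.7:
        return "peek"           # light press: preview content
    return "pop"                # deep press: commit to the action

for f in (0.5, 1.8, 3.4):
    print(f, "->", classify_press(f, max_force=4.0))
```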

Apple iPhone 6s

Samsung brought back the stylus with the Galaxy Note. Besides the capacitive touchscreen, the Notes feature a Wacom digitizer, which allows precise, pressure-sensitive tracking of the stylus.

Samsung Galaxy Note N7000

One weakness of touchscreens is that they lack tactile feedback. BlackBerry infamously tried to fix that with the SurePress screen of the Storm. The tech allowed the screen to be physically pressed down like a button. People hated it and the much-ridiculed tech was quickly abandoned.


BlackBerry Storm 9500

A company called Tactus tried to take it a step further - its Tactile Layer technology put inflatable buttons on the screen, which could be raised or lowered to give a physical shape to the on-screen controls. It didn't even make it past the prototype stage.

The Tactus display had physical buttons that can be raised or lowered

An honorable mention goes to the Sony Xperia Projector. It runs Android and not only projects the screen, but also uses IR sensors to turn the projected image into an actual touchscreen. This is something of a holy grail for mobile tech - you can have a screen tens of inches in size (user adjustable) while keeping the physical dimensions of the device small.

Sony Xperia Projector • Microsoft PixelSense, formerly Surface (<a href="https://commons.wikimedia.org/wiki/File:Surface_table.JPG" target="_blank" rel="noopener noreferrer">image credit</a>)

The Microsoft Surface - no, not the tablet, the table computer - used several infrared cameras to see you touching the screen. But they could see more than that: they also spotted objects you placed on the table (e.g. a digital camera) and offered relevant options (e.g. downloading the photos off that camera).

Where will touchscreen technology go next? Right now it seems that makers are more concerned with the shape of the screen than with how it works, so it could be a while before the next major change.
