Glazed eyes, cross-legged, staring at a screen, whilst images flicker across pupils.
Clunky multi-press, swipe and tap motions with one hand as I pull up the calculator – type, correct, type, undo.
Faux wooden bookshelves and dials, containing a mock toolkit which crudely beeps and swooshes, as if I’m an inhabitant of a Jeff Koons artwork.
Heads of digital departments at large electronics corporations give keynotes at CES in Vegas, declaring ‘soon you’ll have a screen in your smart fridge where you can get the latest weather report’.
It all seems so dated, so Microserfs, so CD-ROM – not the world of the Internet of Things, the Quantified Self or wearable tech.
With Amazon Fresh’s announcement of the screen-free Dash device, which works only via scanning or speaking, we see the beginning of the end for practical screen-based services. Just as the mouse feels like a superfluous disconnection in personal computing now that we have touchscreens, so the screen will come to seem naively disjointed for achieving our computer-assisted practical tasks.
As user-experience veteran Don Norman describes in his 1998 book The Invisible Computer, computer design has long been counterintuitive. The role of technology is to assist and extend human existence seamlessly, yet the PC that developed out of the 1980s computer boom never quite left its old paradigms behind and didn’t blend with intuitive human behaviour. It became a new language to learn. Technology over usability.
However, the paradigm has noticeably shifted over the last three years.
Tablets have been adopted on a massive scale across the age and social spectrums. Why? Because there’s no detachment; it is the most basic human behaviour: see, want, touch. I’ve witnessed two-year-olds and eighty-year-olds proficiently using iPads for tasks they wouldn’t have handled on PCs.
We’re also seeing the ubiquity of signal, data capture, miniaturised components, malleable materials and low-cost 3D printing. This takes us further from the disconnected interaction of the screen by opening up the possibility of aware, connected everyday devices that are cheap to manufacture in small runs. Offerings like the Jawbone, Nest home services and Android Wear are moving us off screens and into the Internet of Things. Combined with interoperable services brokered by consumer offerings like If This Then That or Zapier, we can bring the smart automation of web services into our natural physical domains.
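The “if this, then that” automation these brokering services offer boils down to a trigger-action pattern: a device emits an event, and any rule whose trigger matches fires its action. A minimal sketch of that pattern, with entirely hypothetical device names rather than any real IFTTT or Zapier API:

```python
# Minimal sketch of the trigger-action ("if this, then that") pattern
# behind services like IFTTT or Zapier. Device names and events here
# are illustrative assumptions, not a real API.

from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    """Pairs a trigger predicate with the action to run when it matches."""
    trigger: Callable[[dict], bool]
    action: Callable[[dict], str]

def run_rules(rules: list[Rule], event: dict) -> list[str]:
    """Apply every rule whose trigger matches the incoming event."""
    return [rule.action(event) for rule in rules if rule.trigger(event)]

# Example: a hypothetical thermostat reading switches a hypothetical fan on.
rules = [
    Rule(
        trigger=lambda e: e.get("device") == "thermostat" and e.get("temp", 0) > 25,
        action=lambda e: f"fan: on at {e['temp']}C",
    ),
]

print(run_rules(rules, {"device": "thermostat", "temp": 28}))
```

Real services add authentication, channels and scheduling on top, but the core remains this simple brokering of events between otherwise unconnected devices and web services.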
It will be fascinating to see how this movement of practical utility apps away from the screen to wearables and embeddables will alter how we spend our time and change the way we interact. This change will challenge many sectors, such as advertisers, who currently invade our visual fields whilst we undertake essential tasks. It also potentially gives additional focus to sound design and oratory skills, as audio becomes a more important feedback loop.
Gesture controls within post-screen devices could alter our physicality too. In just a few years after text messaging became prevalent amongst teenagers, the thumb overtook the index finger as the most dexterous digit for that generation – a change to thousands of years of evolution. Could sign language or a greater range of physical expressions emerge amongst the post-screen generation?
There will still be a place for screens. Visual storytelling will be hard to replace without screens (or projections). Whilst telepresence (being physically present in another location via robots) may take off for some, a face-to-face conversation over Skype or FaceTime is still an engagingly simple experience. Books and the printed word may claim less of people’s time in long form, but the power and efficiency of written language should keep it alive.
The vision of invisible technology and an augmented existence feels much closer in 2014, so, as product developers, we need to move our focus from visual design to embodied experience design.