We bring ourselves into the present and contemplate ethics questions after watching clips from contemporary science fiction.
This week we will look at the rise and spread of mobile interaction design around the world; the beginning of the Internet of Things, which ushered in voice user experience; and, briefly, design systems, which evolved out of the design pattern libraries we learned about a few weeks ago.
Shortly after the worst of the dot-com bust, mobile phones, still a rarity owned mostly by the wealthiest, were finally able to access the internet. Prices started to come down, and interaction designers began to think about and design mobile experiences. It was like going back to the early nineties: UIs were rudimentary, heavily text-based, and had minimal features compared to their browser-based counterparts. Mostly, people used their phones to make calls and to text.
During this time we also saw the rise of PDAs like the Palm Pilot and the Handspring Visor, which attempted to bring more functionality and more applications to people on the go. They were internet-enabled but slow, rendered in black and white, and required a stylus; still, they made core applications like email, texting, and games portable.
Then the first iPod came out. It was slightly smaller than the PDAs, closer in size to the cellphones of the time. The interplay between its hardware and software was simple and elegant. The evolution from the iPod to the iPod nano and then the iPod touch laid the groundwork for the iPhone.
It wasn’t until 2007, when the first iPhone was released, that we started to see a richer opportunity for interaction design on mobile platforms. The iPhone brought a new level of ease to the functionality of applications. The platform was more graphical than its predecessors, and the creation of the App Store ecosystem launched a whole new market for designers to become involved in. Speaking from my own experience, at the beginning it was hard to get mobile work if you hadn’t already done mobile work. But because iPhone apps were more graphical, more like miniature web applications, it became easier to break into designing for this mode. Now richer features and more robust applications on the phone are commonplace, driven by the global spread of smartphones and the expectation of full experiences on them. In many countries people access the internet only through their phones, not computers, and as companies have realized this and gone global, the need for interaction designers, researchers, and UI designers has only increased. The iPhone also brought us gestural interfaces, which mimic real-world actions, and more commonplace use of voice UX.
Voice UX is familiar today from Siri, Google Home, and Alexa. Although we have had some voice UIs since the 1980s, in telephone systems, the first voice UI for personal computers was voice recognition software from Dragon Systems: NaturallySpeaking, an early voice dictation application. The second phase of voice UIs is happening now and is still in its infancy. It has required the development of natural language processing in computing and more sophisticated methods for research and design.
Current voice UX is akin to designing stories and conversations. It requires a clear understanding of language, semantics, and context, and the design of multiple possible outcomes. Voice UI shows up in our smart home devices as well as in chatbots, although most of our interactions with those bots are textual.
All of this leads to the rise of the Internet of Things. I am not going to go into much detail here; instead, I recommend watching the video “What is the Internet of Things” and a video interview with David Rose, an early designer of smart objects, who describes them as Enchanted Objects. The readings this week also cover the UX of voice, the Internet of Things, and the best practices developed to encourage calm technology as we fill our lives and houses with devices that clamor for our attention.
The other part of this week looks at the current state of design systems. We read about pattern libraries a few weeks ago. Over the years these have become more commonplace within companies, many of which have entire design teams dedicated to creating, testing, and implementing design systems (starting with patterns, but expanding to include UI components and relevant code). Read the article On the Current State of Design Systems, and also read my counter article A History of Patterns in User Experience to fill in the gaps left by the first article.
The video homework I gave my students was to watch a set of clips from several sci-fi TV shows (Black Mirror, The Feed, and Altered Carbon) and then write about the ethics of technology as we move closer and closer to these kinds of realities. How far in the future are the ideas from these shows? Is this how you would like to live? What aspects seem interesting and worth pursuing? What aspects don’t, and why? After watching the short video on the Internet of Things, think about the amount of data we currently send to servers about every aspect of our lives. How much data is already swirling in the cloud about us? How much more would we send into the ether if we reached the point shown in these shows? How close are we to the scenarios in the Black Mirror “Nosedive” clips versus The Feed versus Altered Carbon?
Note: All these lectures were delivered via video with related slide decks of images. Following the intro, students had a series of readings and videos related to the topics covered in the lecture or the overall time frame. They were then given a set of prompts to stimulate their thinking and writing, which ended up on a class blog.