A Recap of Sam Otis’ “Click to Continue”



Sam Otis and Corey Vilhauer
Oct. 06, 2017

Without directions, we are lost.

We cannot pick up a tool — a brand new tool, one that we’ve never seen before — and understand how to use it. At least, not without a bit of a hint. A nudge in the right direction. A purposeful explanation.

Explanation and direction come from a lot of areas. There's explicit direction — something telling us what to do, whether through non-verbal cues or written documentation. But there’s also something called convention: agreements made between what we’ve always seen and what we have come to expect.

Using conventions, we can understand a hammer the moment we see it — not because there’s a set of directions, but because we understand its design: where to hold it, what side to use for hitting, the pain it will bring upon our thumb at least once or twice a week.

On Wednesday, we were led down a path of conventions and interfaces — from the first computers, to the future of how we interact with technology — as the Sioux Falls Content + Design Group participated in Sioux Falls Design Week with a presentation by Blend’s senior designer, Sam Otis, on the past and future of graphical user interfaces (GUIs).

[Image: hammer, book, phone, warning button]

How We Got Here

“Why do we use digital devices?” Sam asked. To enhance our abilities, and to hopefully make life better. We can see our family, who may live thousands of miles away. We can make complex tasks easier, and we can spend less time fighting with (and more time doing) the things we love. Which gives those of us who work in the technology design industry a pretty specific goal: make sure these digital devices are easy to navigate and understand.

This is a relatively new field. Before computers, things were designed in a way that helped us know how they worked or what they did. You knew how to use an alarm clock. The wind-up in the back matched the movement of the clock face. The button on top clearly stopped the bells from ringing.

These things — the wind-up on an alarm clock, the handle of a hammer, the steering wheel inside of a vehicle — are made clear through what we call “affordances,” the properties within an object that clearly show its potential interaction points. Sam gave us a quick example: you know how to use a light switch because there is an up and down switch; however, if we ignore this affordance and force the user to turn the whole switch to the left, we are going against the natural expectations of the user.

The thing is: computers don’t have these affordances. We have to program them in, and we have to do it in a way that seems natural. “The more naturally we can interact with digital information, the greater the potential,” and as Sam walked us through a history of GUIs, we saw what this meant. We invented ways to manipulate digital design that we couldn't manage with our hands alone — the mouse, touch screens, VR headsets. We updated and tweaked and changed until roughly 1981, when the Xerox Star was launched, offering the basic foundation of what our operating systems still look like today.

[Image: click ... i mean tap to continue, what is the difference?]

From Desktop to Mobile

With the larger conventions locked in, we’ve spent the last several decades working on the details within our operating systems. We relied on — and still rely on — real world metaphors to help illustrate computer functions. Deleting a file — represented by a piece of paper, organized into folders — places it into a trash can. This type of metaphor — “skeuomorphism” — leads to a calendar with a faux-leather background, and mobile icons that look like actual buttons.

(NOTE: even as we assign real-world doppelgangers to our technology to breed familiarity, those metaphors aren’t perfect, often breaking down and limiting the native capabilities of what digital can do. An example is Google Drive, where individual files can be placed within more than one folder, skewing the actual properties of a folder and confusing those who were raised in the real world, where one paper could go in one folder and that was the limit. Because physics.)

Over the years, we have developed digital affordances — which we call conventions — that are now ubiquitous in how we navigate the web, both through a browser and through applications. How an active text field looks. Where top-level navigation lives. How to exit a browser pop-up. These are all inventions that we have folded into our web expectations, and as we get used to them we can better navigate websites. The conventions evolve, and so do we.

But are we evolving fast enough? It seems obvious, but the mouse is not the computer, yet we still get stuck designing our interfaces to follow that model. Nowhere is that more prominent than on a mobile device, where “Click to Continue” is no longer relevant and touching and tapping have taken over. Instead of interacting with interfaces via an added accessory, we’re reaching right through to the site itself; the only thing separating us from our content is a bit of glass.

For the first time, digital elements can be accessed directly. It's freeing — after all, it’s released us from the ties that bind (and by ties, I mean mice and keyboards and dongles). But, it also has forced us in the web industry to rethink how we’re allowing people to interact. We're bringing back buttons instead of underlined words. We’re moving from icons to switches. We’re allowing things to slide and twist, pinch and zoom, where before we were one-dimensional. Click. Click.

Apple saw this with the first iPhone, and while they got a lot of flak for being cute with their skeuomorphism, they understood the need to make things metaphorical again. They saw what happened with the desktop operating system, so they raised those icons like buttons. They made the notes app feel more like a legal pad. They understood that the iPhone wasn't just reaching out to existing desktop users, but to a whole new generation of people who were just now learning to interact with technology.

In this, we see GUI design following a cycle. New ways of interaction require new metaphors, until a critical mass of understanding allows for subtle shifts to become new conventions. We see this over and over again: the RSS symbol, the “X” to close a modal, the hamburger icon. They shift from skeuomorphic to flat to abstract as our understanding of the space becomes more mature. Flat design doesn't work on its own — at least, not until we’ve created the conventions to make it work. There are no affordances in a blank box, but there are in a box with a blinking cursor, or in an underlined text link.

[Image: rollovers in Google Cardboard]

GUIs, Moving Forward

So what of the future? Beyond click. Beyond touch. “Blink to continue?”

We’re still trying to figure out how to use virtual reality and augmented reality to make our lives better. Easier. More efficient. For every silly augmented reality demo, there are massively popular games like Pokemon Go, or worthy frameworks like Apple’s ARKit. For every goofy scene in a ’90s movie imagining all-encompassing virtual reality, there’s a breakthrough in VR gaming systems, and the accessibility of Google Cardboard. And with all of these, we’re looking at new GUIs.

Sometimes, it means taking a step back. Virtual reality takes us from in front of the screen to within the screen, but in doing so we lose our ability to interact with the content itself. That piece of glass might have separated us from the content, but it also provided a way to manipulate the content. We have nothing to touch, nothing to click; so we revert to using accessories again, like handheld controllers, or we see our interfaces walk back into dropdowns and rollovers — things we have seen nearly disappear in today's touch-centric landscape.

We’re still figuring that all out — I mean, we haven’t even figured out all of the conventions still required for a good desktop site, let alone the varying widths required for mobile and touch. And that’s the biggest challenge — and the biggest thrill — in moving forward on new web projects, on new web platforms, using new technology. That spirit of taking our already accepted conventions and adjusting them for something new.

As Sam detailed the world of GUIs, he took us back to a time when we used metaphors from the physical world to help us make sense of digital information. We still do that today, but now digital is reaching out to enhance the physical world. It’s exciting. And we’re all looking forward to where it leads.

All we need to do is click to continue.

Huge thanks to Sam Otis for speaking at this month's Content+Design meetup. Sam's slides for the presentation, “Click to Continue,” can be found on Slideshare. Or, they are embedded below.