Back in the 1970s, if you wanted to use a computer, you had to use a command line interface. The graphical interfaces we use today didn’t yet exist commercially. To get a computer to do anything, users had to communicate with it through typed commands or code, and even a simple task could require what felt like endless lines of it.
In the 1970s, computer scientists at Xerox PARC developed the first graphical user interface (GUI), which began reaching the market in the early 1980s. With this groundbreaking innovation, users could interact with their personal computers by issuing commands visually through icons, buttons, menus, and checkboxes.
This shift in technology meant that anyone could use a computer, no coding required, and the personal computer revolution began.
In 1984, Apple Computer released the Macintosh, a personal computer that shipped with a point-and-click mouse. The Macintosh was the first commercially successful home computer to use this type of interface.
The growing accessibility and prevalence of personal and office computers meant that interfaces had to be designed with users in mind: if people couldn’t easily interact with a computer, it wouldn’t sell. As a result, the UI designer was born.
As with any growing technology, the UI designer’s role has evolved as systems, preferences, expectations, and accessibility needs have demanded more and more from devices. UI designers now work not just on computer interfaces but also on mobile phones, augmented and virtual reality, and even “invisible” or screenless interfaces (also referred to as zero UI) such as voice, gesture, and light.
Today’s UI designer has nearly limitless opportunities to work on websites, mobile apps, wearable technology, and smart home devices, to name just a few. As long as computers remain a part of daily life, there will be a need to design interfaces that users of all ages, backgrounds, and levels of technical experience can use effectively.