Last Friday I attended the User Experience conference UX Brighton 2012. This was the third year the conference had run, and it lived up to the high standards set by the previous two. Set up and curated by Brighton-based UX designer Danny Hope, this year’s conference had a theme of how knowledge of the past can inform the products we create in the future. Eight high-quality speakers looked back at the history of the web and forward to how we might be interacting with technology in the future.
Alex Wright – NY Times
Alex gave a history of the web that might have been. He spoke about scientists and innovators who explored alternative systems that often bore little resemblance to the web as we know it today. He reminded us that history is not always a straight line, and that the best ideas and the best technology don’t always win.
Early ideas along the lines of the web we use today were around as early as the 1800s, with visionaries like mathematician Ada Lovelace coming up with concepts that share a lot with the way the web currently works. Other innovators, like Paul Otlet, had their work affected by war, with most of Otlet’s work being destroyed during World War II. Even Tim Berners-Lee lost a lot of his initial ideas on the world wide web to a hard disk failure.
Mark Backler – Lionhead Studios
Mark talked about NUIs: Natural User Interfaces. These are control systems, such as those used by the Xbox Kinect, where the user effectively becomes the controller. NUIs use speech, gesture and touch as input methods rather than traditional keyboards, mice or video game controllers.
NUIs make it easier for anyone to use games consoles and other devices. Traditional controllers often present a barrier to entry, as they can be overwhelming for less experienced users. Mark spoke about how Lionhead Studios user tested their Kinect games, and some of the common issues they found. One of the main challenges is finding unique gestures that feel comfortable and natural for people playing the games. Another issue Mark touched on was speech input: words used should be at least two syllables long to make sure they are not confused with other commands. They should also sound distinct from all other commands to avoid confusion. This can cause issues when games are translated into other languages, making the process of finding the right words even more difficult.
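Those two voice-command guidelines could, in principle, be roughed out as an automated sanity check. The sketch below is purely illustrative and not anything Lionhead described: the vowel-group syllable counter and the string-similarity threshold are my own crude assumptions, standing in for the proper phonetic analysis a real pipeline would need.

```python
import difflib
import re


def syllable_count(word):
    """Very rough syllable estimate: count groups of consecutive vowels."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))


def vet_commands(commands, similarity_threshold=0.75):
    """Flag candidate voice commands that are too short or sound too alike.

    Returns a list of human-readable issue descriptions.
    """
    issues = []
    # Guideline 1: commands should be at least two syllables long.
    for cmd in commands:
        if syllable_count(cmd) < 2:
            issues.append(f"'{cmd}' has fewer than two syllables")
    # Guideline 2: commands should sound distinct from one another.
    # (Spelling similarity is a weak proxy for how words sound.)
    for i, a in enumerate(commands):
        for b in commands[i + 1:]:
            ratio = difflib.SequenceMatcher(None, a.lower(), b.lower()).ratio()
            if ratio >= similarity_threshold:
                issues.append(f"'{a}' and '{b}' are too similar ({ratio:.2f})")
    return issues
```

For example, vetting `["start", "restart"]` would flag both problems at once: "start" is a single syllable, and the two words overlap heavily. A localised version of the game would simply re-run the check over the translated command list.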
He underlined the difficulty of finding the right time for testing, saying there were “two times to do user testing; too early and too late”. Testing early versions of a game may give inaccurate results, as users often dislike the unpolished version they are playing and get frustrated with the inevitable bugs and gameplay issues. Testing later in the development process means better feedback, but it often leaves little time to act on it, particularly if major issues are uncovered.
Guy Smith-Ferrier – Author & Developer
This was one of the talks I had been looking forward to seeing, as Guy promised to control a computer live on stage using just his mind! His futuristic demo saw him changing his facial expression, which in turn changed the expression on a ‘mannequin’ face on the screen. This was based not on any sort of face recognition, but on Guy’s thoughts, which were read by a headset. He then went on to move an onscreen cube around using just his mind. While this was impressive, he shared how difficult it is to achieve and how he had to ‘train’ his computer and headset for two years to get to where he is today. Despite the difficulty of using this kind of system, there are ambitious projects underway to make better use of the technology, notably the Brain Driver project in Germany, where users can control a car using just their minds!
The short-term uses for this technology are helping severely disabled people communicate and, on a more trivial level, creating computer games controlled by the mind. The technology could also give better insight into the thought processes of users during user testing, where the tools could be used to observe a user’s subconscious reactions.
Once the software and hardware are perfected, though, the potential for this technology could be huge.
That completed the morning session of the event. Read about the afternoon session.