Google just had their conference and presented Material Design. It's a solid design approach and bumps up their OS design game. However, after learning more about it, and thinking about Apple's approach to its operating systems, I'm wondering if hardware and software are more united than we really thought.
What made the iPhone innovative was its unique approach to combining a camera, email, a book reader, text messaging, and more into a phone, creating a mini computer. Rather than a separate keyboard that takes up space, there was a flat screen, redefining how someone interacts and communicates with the device. The icons may change, the animations may change - but the core innovation was a new approach to interacting with a device, using hardware and software together to incorporate touch technology.
The Android OS was pretty much a copy of that touch interaction - which is why Steve Jobs was so annoyed at Google. And Google was more open to installing its OS on other companies' devices (also why Jobs was upset at Samsung).
The Android OS is being perfected. Material Design is beautiful and really raises the game for devices. However, I wonder if the device OS trend is just to perfect what is available rather than to think about new ways humans can interact with devices.
What I often think about:
- Better voice command tools. Why do we need to press a button for them to work? Why can't we call the device's name and just have it go? If this exists - let me know. It would help me be more hands-free while I drive.
- Do we really need a keyboard? There weren't a lot of keyboards in Star Trek or Star Wars. Typing should be going away - especially if we have more voice command capabilities or gestures. Is a keyboard the way of the future?
- Do we really need a mouse? Couldn't we draw with a stylus on the screen or point to what we want? This brings me to my next point...
- When will laptop/desktop screens have touch capabilities? Or could tablets become powerful enough to replace the laptop/desktop and create a real notebook? HP and some other manufacturers already have this. It's great!
- Why do we keep relying on metaphors from daily life that still require a manual to operate? If we are truly evolving usability, shouldn't we rethink how we use our devices? Maybe a knob isn't the right way to operate a stove. Maybe a lever isn't the right way to flush a toilet - there are button mechanisms in place now, which are more intuitive. Just something to think about.
- Why are technology implants considered an option at all? Do we really need technology in our bodies to get results? Isn't the brain a master biological computer?
How we get there is by letting users, designers, technologists, and researchers collaborate on creating a product. Users are quickly getting to know what they want in a mobile product. They use them more than we think - and they have very clear thoughts about how we sometimes over-complicate a simple device. For example, some think a phone should be just a phone, while others (like me) can see the need for a phone on a tablet (video conferencing ability). If there is more dialogue between everyone, we will create more interesting devices and interactions.
And this brings me back to combining software and hardware to create better experiences with devices. Now that we are perfecting what we have, maybe it is time for a disruption that will make us rethink technology and what it can do in our lives.