Thursday, April 3, 2014

BBC News - Oculus Rift provides a VR trip inside Game Of Thrones

I spent a lot of time in the 90s working on virtual reality. I even set up a master's course on it at UCL in the School of Architecture (now rebranded as adaptive architecture). It was fun being in a virtual world, but no one ever managed to come up with the killer app for virtual reality. VR died, but the work wasn't wasted: most of the algorithmic improvements made over those years resurfaced in the high-end 3D gaming world.



It's kind of strange seeing the whole VR hype cycle returning with the Oculus Rift. What impresses me most about the BBC item is that people are working on this as a new medium. It could be an amazing way to explore stories.



So I am trying to shake the feeling that this is Groundhog Day. Maybe this time around, with better graphics, faster responses and simpler tools, we can begin to deliver on some of the excitement that the first generation of VR generated.

Wednesday, April 2, 2014

My article for The Conversation gets 1,967 readers.

You'll hardly notice the next generation of wearable tech




By Nick Dalton, The Open University

On a trip to Germany, David Cameron has announced £45 million to prompt a “new industrial revolution” based on the internet of things. This marks the beginning of a new era for computing and for wearable technologies. We are entering the age of calm computing and devices that truly help us live. And they’ll be a lot more subtle than a computer stuck to your glasses.

Wearable technology has been the main feature of every technology trade show this year. Google has led the pack with Google Glass, and Samsung is already on its second-generation smartwatch, in competition with smaller startups such as Pebble.

All of these technologies were predicted back in the 1990s by a researcher called Mark Weiser, then chief technology officer at Xerox’s Palo Alto Research Center (PARC). And it’s to Weiser we can look when we think about what is going to happen next.



Something’s familiar here. Alan Kay and the Dynabook.
Marcin Wichary, CC BY


Xerox PARC originally came up with the interface we use on most computers, combining mouse, windows and menus, as well as Ethernet, the thick cable your desktop computer might still be attached to today. In the late 1970s and early 1980s, under then-director Alan Kay, PARC developed the vision of the Dynabook, a good blueprint for today’s iPad and Android tablets.

It was up to Mark Weiser to develop a vision of computing which was as bold for the 1990s as Alan Kay’s vision of a tablet computer was for the 1970s.

Weiser followed the trends for computing and observed that the number of computers per person kept increasing. It began with one when people just used a personal computer. Then, in the 1990s, he noticed that when you surf the web, you are using three or four computers at once – one to look at the web page, one to serve the web page to you and normally a couple or more passing information over the internet. He predicted that the number of computers per person would keep increasing and he was right. Today, with cloud storage and smartphones and intelligent car management, you’re probably using up to 20 computers at any one time.

Weiser even foresaw the progressive miniaturisation and cost reductions that would enable you to use first hundreds, then thousands of computers simultaneously. He called this ubiquitous computing, but many people in the tech industry have rebranded it the internet of things.

While in many ways this seems like a recipe for pandemonium, Weiser saw it as an opportunity to rethink how we interact with computers, making them recede into the background, something he called calm computing. By putting computers into things we can bring human values back to technology.

Not just gadgets

While Google Glass and intelligent sound systems for the home are the gadgets that most readily spring to mind when we talk about the internet of things, the future may well be more subtle and more functional.

Take for example the problem of living with dementia. Many retired people want to live independent lives for as long as they can, but their adult offspring constantly worry about their parents getting lost outside their own doorstep.

The current solution is a box worn on a belt, which monitors the wearer’s location and can alert a friend or helper when the box moves outside a predefined area or when its panic button is pressed.

The downside is that wearers can feel the box acts as a visible mark of disability. They also need to remember to attach it to their belt before going out. The ubiquitous solution is to hide the computer within a walking stick that would be used anyway.

As Weiser predicted, the computer merges into the background of our lives. The user of this subtly hi-tech device will remember to take it with them when leaving the house out of habit, so no extra learning is required.

A computer within the wooden stick might cause it to vibrate when the user walks beyond the limits of their habitual neighbourhood. It could vibrate again when pointed in the direction of home to help the user find their own way out of the predicament. A hidden button could be used to summon help, all without drawing attention to the user. Then, once the user has returned home, the stick can be charged in an umbrella stand, again negating the need to learn the new habit of plugging in a dedicated device.
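
For the technically minded, here is a minimal sketch of the kind of geofence logic such a stick might run, written in Python. Everything in it is a hypothetical stand-in rather than a real embedded API: the HOME coordinates, the SAFE_RADIUS_M threshold and the vibrate callback are all assumptions made for illustration.

```python
# Minimal sketch of the walking stick's geofence logic (hypothetical device,
# not a real API). Pure standard library, so it runs anywhere Python does.
import math

HOME = (51.5246, -0.1340)  # assumed home location (lat, lon)
SAFE_RADIUS_M = 800        # assumed radius of the habitual neighbourhood

def haversine_m(a, b):
    """Great-circle distance in metres between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371000 * math.asin(math.sqrt(h))

def bearing_deg(a, b):
    """Initial compass bearing in degrees from point a towards point b."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    dlon = lon2 - lon1
    x = math.sin(dlon) * math.cos(lat2)
    y = (math.cos(lat1) * math.sin(lat2)
         - math.sin(lat1) * math.cos(lat2) * math.cos(dlon))
    return math.degrees(math.atan2(x, y)) % 360

def tick(position, heading_deg, vibrate):
    """One pass of the stick's control loop.

    position    -- current (lat, lon) fix from the GPS
    heading_deg -- direction the stick is pointing, from the compass
    vibrate     -- callback that drives the vibration motor
    """
    if haversine_m(position, HOME) > SAFE_RADIUS_M:
        vibrate("long")  # gentle warning: beyond the habitual area
        # Short confirming pulse when the stick points roughly towards home
        off_home = abs((bearing_deg(position, HOME) - heading_deg + 180) % 360 - 180)
        if off_home < 20:
            vibrate("short")
```

A real device would smooth the GPS fixes and debounce the vibrations, but the core behaviour really is that small, which is exactly why it can disappear into a stick.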

The need for ubiquitous computing like this will spur the next generation of startups and innovation. University courses are emerging to support the next generation of developers who can make this happen, and David Cameron’s £45-million funding boost will go a long way too. But strangely, if Weiser’s vision of calm computing is delivered as promised, the only way you will know it has worked is if you don’t see it.

Nick Dalton received funding from The Engineering and Physical Sciences Research Council (EPSRC).

This article was originally published on The Conversation.
Read the original article.