Monday, March 30, 2015

Tech wave continues

Tiree Tech Wave

I popped to the shop by bike (lovely and sunny), but the weather changed so rapidly that on the way back I had to get off and push through the 30mph wind.

Saturday, March 28, 2015

Techwave

It's all go

Mencap reports that 60% of people with learning disabilities did not register to vote in the last election because they found it too difficult.

From an HCI point of view it sounds like a design failure here:

"It was very complicated to fill in. I did not know what I had to do. The information that explained the forms was not clear. The form had too many boxes and difficult words. There was not enough room in the form to write information. It made me feel excluded " is quoted on website.

I'm not sure whether Mencap is including dyslexia in this, but I wonder whether it has also had an influence.

and, more generally, a social justice failure here:
The BBC reports: "After the 2010 election, one in five people with learning disabilities told Mencap in a survey that they had been turned away at the polling station because of their disability."



https://www.mencap.org.uk/about-learning-disability/about-learning-disability
http://www.bbc.co.uk/news/blogs-ouch-30920778

Wednesday, March 25, 2015

On the way to Tiree tech wave

On my way to Tech Wave. I get to stop at one of my favourite stations: Carlisle.

Monday, March 23, 2015

David Eagleman: Can we create new senses for humans? | Talk Video | TED.com






I'm fairly familiar with this kind of work. The TED talk was kicked off by this paper (https://noisebridge.net/images/9/91/Jne5_4_r02.pdf), and we did some related work in our lab a few years ago (http://oro.open.ac.uk/17968/1/dap2008final.pdf).

I originally suggested using distance sensors in the back of a car to give you a feeling for your blind spot (hard to do experimentally). I quite liked the idea of building a chair that took readings from the CPU to give a programmer a sense of how well their software was doing (pretty hardcore programming).
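Something like the sketch below could give a feel for the chair idea. It is only a sketch: psutil (a real Python library) samples the CPU load, while send_to_chair is entirely made up and stands in for whatever actuators an actual chair would have.

import time
import psutil  # third-party library for system stats (pip install psutil)

def load_to_intensity(cpu_percent):
    # Map 0-100% CPU load onto a 0.0-1.0 vibration intensity.
    return min(max(cpu_percent / 100.0, 0.0), 1.0)

def send_to_chair(intensity):
    # Hypothetical actuator interface -- a real chair would drive its motors here.
    print("vibration intensity: %.2f" % intensity)

while True:
    cpu = psutil.cpu_percent(interval=1.0)  # average load over the last second
    send_to_chair(load_to_intensity(cpu))
    time.sleep(0.5)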



Good to know it's still going.

Saturday, March 14, 2015

Repertory grid - new analytic methodology of the day.

Repertory grid - Wikipedia, the free encyclopedia



"The client is asked to consider the elements three at a time, and to identify a way in which two of the elements might be seen as alike, but distinct from, contrasted to, the third. For example, in considering a set of people as part of a topic dealing with personal relationships, a client might say that the element 'my father' and the element 'my boss' are similar because they are both fairly tense individuals, whereas the element 'my wife' is different because she is 'relaxed'. And so we identify one construct that the individual uses when thinking about people: whether they are 'Tense as distinct from Relaxed'. In practice, good grid interview technique would delve a little deeper and identify some more behaviorally explicit description of 'Tense versus Relaxed'. All the elements are rated on the construct, further triads of elements compared and further constructs elicited, and the interview would continue until no further constructs are obtained."


Friday, March 6, 2015

Mummy, that man with the glasses is back again.





Just after Google Glass has been discontinued, Sony have released their own development version. On the plus side, this is aimed at businesses that might find a use for the layered screen: for example a surgeon while engaged in an operation, but more likely a worker in a warehouse.



I guess the basic problem is that they cannot build a model of the world to overlay with three-dimensional information quickly enough. We cannot yet track body/head movements accurately enough, so nothing lines up. Hollywood films like Kingsman show what potential these kinds of systems have, but the technology is not close enough yet.

Monday, March 2, 2015

MOVE: Movement in Embodied Adaptive Architecture





"MOVE is an architectural prototype and research platform to explore the relationship of body movements and movements in adaptive architecture. Using a Kinect motion sensor, MOVE tracks the gross body movements of a person and allows the flexible mapping of those to the movement of building components. In this way, a person inside MOVE can immediately explore the creation of spatial configurations around them as they are created through the body. 
This can be done live, by recording body movements and replaying them and through manual choreography of building elements. Trial feedback has shaped our four-stage iterative design and development process. The video shows Tetsudo performers Hamish Elliott and Natalie Heaton exploring interaction with MOVE. 

MOVE was created at the Mixed Reality Lab, School of Computer Science, The University of Nottingham by Holger Schnädelbach and Hendro Arieyanto in 2014." 
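Conceptually the mapping is something like the sketch below. The joint reader, panel driver and ranges are all invented placeholders, not anything from the actual MOVE system, which uses the Kinect SDK and its own actuators.

def get_joint_height(joint):
    # Stub for a Kinect-style skeleton reader: height of a joint above the
    # floor in metres. A real system would read this from the sensor SDK.
    return 1.2

def move_panel(panel_id, position):
    # Stub for the building-side actuator: drive a panel to a normalised
    # position between 0.0 and 1.0.
    print("panel %d -> %.2f" % (panel_id, position))

def map_body_to_building(joint, panel_id, min_h=0.5, max_h=2.0):
    # Linearly map the joint height in [min_h, max_h] onto the panel's range.
    h = get_joint_height(joint)
    t = (h - min_h) / (max_h - min_h)
    move_panel(panel_id, min(max(t, 0.0), 1.0))

map_body_to_building("right_hand", panel_id=3)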



Like Exobuilding, I love these kinds of transitions between software and buildings. Back to Ubicomp submissions...