Monday, April 29, 2013

Chi2013 Monday

What an extremely busy day. CHI opened with the announcement that this is the biggest CHI ever. You could fill a football stadium with the 3,500+ human-computer interaction people who turned up.

It's been a whirlwind of meetings and interesting papers. I had a lovely meeting with Christoph H., which allowed us to talk about the new user interface for the analytic portions of the new People Watcher app.

I also managed to speak about the synergy between interaction and software engineering at the software engineering special interest group meeting.

I had lunch with Paul Marshall and friends and it was lovely to sit in a French restaurant discussing heavy scientific issues. One thing we discussed was the new replication CHI effort. I'm not sure exactly what it is about, but if it is about replicating papers that have already been published, that cannot be a bad thing.

After a day full of papers I was able to go along to the interaction event, which kept me moving around for two hours.

This is from the University of Nottingham. All the plate patterns you see are in fact cleverly composed binary codes which a computer vision system can use to identify each plate. Nottingham show what you can do when computer scientists collaborate with artists, ceramic artists in this case.


This was very interesting work from Berkeley about bringing tactile interaction and sound interaction together to make something closer to a real musical instrument.

This was also from the Berkeley people and it just looked wonderful. Another tactile interface.


This was an interface designed to sit on a dancer's back as a kind of external spine which, I think, also controlled sound or music.

This was a nice interface designed to allow you to create multiple devices by turning the slab into different shapes.

Could your next computer be pen and paper? Possibly, with this interface. Here Anoto pens are used to draw music which you can then hear played back from the computer.

During the interaction event I bumped into a Ph.D. student from London who turned out to have ADHD. I managed to pontificate about Neurodiversity; having these chats is allowing me to clarify a lot of issues in my head. I hope my 10-minute talk can do Neurodiversity justice.


After the interaction event I had dinner with Eva H. and some German colleagues. I get the feeling that Eva is very happy in her new job as Prof of interaction at the Bauhaus.


Monday, April 22, 2013

What We Know About Spreadsheet Errors

If you ever needed to know why we need human-computer interaction as a subject, this is it.
I'd heard rumours about this paper but had never seen it before, and finally I have the link to the paper. This is a damning bit of information for the digital world.

Close to 90% of spreadsheet documents contain errors, a 2008 analysis of multiple studies suggests. “Spreadsheets, even after careful development, contain errors in 1% or more of all formula cells,” writes Ray Panko, a professor of IT management at the University of Hawaii and an authority on bad spreadsheet practices. “In large spreadsheets with thousands of formulas, there will be dozens of undetected errors.”
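
I have no way of auditing anyone else's spreadsheets, but as a rough illustration of what "errors in formula cells" means in practice, here is a minimal sketch, assuming the openpyxl library and a hypothetical file name, that counts the formula cells in a workbook and flags the ones whose cached result is an Excel error code. It only catches the visible errors, which Panko's numbers suggest are just the tip of the iceberg.

```python
# Minimal sketch: spot-check a workbook for formula cells showing error values.
# Assumes openpyxl is installed; "budget.xlsx" is a hypothetical file name.
from openpyxl import load_workbook

ERROR_CODES = {"#DIV/0!", "#N/A", "#NAME?", "#NULL!", "#NUM!", "#REF!", "#VALUE!"}

def audit(path):
    formulas = load_workbook(path, data_only=False)  # formula text
    cached = load_workbook(path, data_only=True)     # values Excel last calculated
    formula_cells, flagged = 0, []
    for ws in formulas.worksheets:
        ws_cached = cached[ws.title]
        for row in ws.iter_rows():
            for cell in row:
                if cell.data_type == "f":            # this cell holds a formula
                    formula_cells += 1
                    value = ws_cached[cell.coordinate].value
                    if isinstance(value, str) and value in ERROR_CODES:
                        flagged.append((ws.title, cell.coordinate, cell.value, value))
    return formula_cells, flagged

if __name__ == "__main__":
    total, bad = audit("budget.xlsx")                # hypothetical workbook
    print(f"{total} formula cells, {len(bad)} showing error values")
    for sheet, coord, formula, err in bad:
        print(f"  {sheet}!{coord}: {formula} -> {err}")
```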


Making things harder by simplifying them. CDSA form

I'm currently struggling with this CDSA form. (I'm not sure what CDSA stands for, possibly something career-related, but we have to fill it in.)

The old forms were pretty bad but these new ones are much harder. Partly it's the fact that entering text into Word forms keeps making the text jump around.

Secondly, and more importantly for HCI, breaking the form down into smaller chunks of more specific elements makes it much harder for me to work out what to write. I am having great difficulty focusing on the micro details while ignoring the overall flow and narrative.

Oddly, this is a problem my wife found with the kinds of 'help' she got writing her course essay. Basically you had to write small, isolated chunks of text in response to vague elements like narrative, action, milestone and timescale. By separating them out like this you lose all sense of coherence.

This seems like another way in which apparently 'helpful' attempts are actually making it harder for dyslexics to compete in the workplace.


Thursday, April 11, 2013

You are our 10,099th visitor

I don't often talk about this blog itself; I see it as a notebook for things ubicomp. I'm happy to report (surprised, even) that some time yesterday I got my 10,000th 'hit'. Having a look, I can see that I am getting more than 600 hits on a busy day.

Perhaps I should take this moment to say thank you for all your interest.

Please feel free to leave a comment or subscribe; I am curious to know more about you.


Screens in the Wild: Introduction - YouTube

Screens in the Wild: Introduction - YouTube: Nottingham Mixed Reality Lab and UCL/Bartlett are producing some very curious urban interfaces.

Your Mobile Experience Is Not Theirs

An interesting and quick presentation on the notion of neurological differences between the West (read: US) and the East (read: China/Taiwan) and how this has an impact on the HCI of mobile devices.


Wednesday, April 10, 2013

Accurate eye center localisation for low-cost eye tracking - YouTube

Accurate eye center localisation for low-cost eye tracking - YouTube


My guess is someone is doing eigenfaces for eyes. I'm still not sure what you would use this for if you had the use of your hands. I'm sure Stephen Hawking could get some use out of something like this. I know there is some use for this in affective tech, understanding people's emotions, perhaps.

I'm not sure about having it as a new modality. Perhaps dialog boxes which disappear when you have read them, autoscrolling as you read, or adding lip reading to voice recognition to improve accuracy.
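
Just to pin down what I mean by autoscrolling as you read, here is a toy sketch with a faked gaze stream standing in for a real tracker: scroll the page once the gaze has dwelt near the bottom of the viewport for a moment. All the numbers, names and thresholds are made up for illustration.

```python
# Toy sketch of gaze-driven autoscroll. A real eye tracker would supply (x, y)
# samples each frame; here a fake generator drifts the gaze down the page.
import random

VIEWPORT_HEIGHT = 800        # pixels
BOTTOM_ZONE = 0.85           # gaze below 85% of the viewport counts as "near the end"
DWELL_FRAMES = 30            # roughly half a second at 60 Hz before we trust it
SCROLL_STEP = 200            # pixels to scroll once triggered

def fake_gaze_stream(n=300):
    """Stand-in for an eye tracker: y drifts downward as the reader reads."""
    y = 100.0
    for _ in range(n):
        y = min(VIEWPORT_HEIGHT - 1, y + random.uniform(0, 6))
        yield random.uniform(0, 1024), y

def autoscroll(gaze_stream):
    scroll_offset, dwell = 0, 0
    for x, y in gaze_stream:
        dwell = dwell + 1 if y > VIEWPORT_HEIGHT * BOTTOM_ZONE else 0
        if dwell >= DWELL_FRAMES:
            scroll_offset += SCROLL_STEP   # in a real UI: move the document view
            dwell = 0
            print(f"scrolled to offset {scroll_offset}")
    return scroll_offset

if __name__ == "__main__":
    autoscroll(fake_gaze_stream())
```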

But this would have to be balanced against having a machine which 'knew' when you were working (good for Yahoo home workers!) and against the security problems when it is used in an office. People get funny about cameras in offices.

Thursday, April 4, 2013

RFduino

Basically a mini, low-power Arduino which talks using Bluetooth Low Energy. It could talk to your iPhone...
Nice.

Eye tracking for the masses

NUIA eyeCharm: Kinect to eye tracking by 4tiitoo — Kickstarter

This is another set of corrective glasses for the Kinect which makes it able to track your eyes, with more software to handle the recognition process.

Tobii have been trying to get into the end-user market but the problem is still a lack of apps. The best user interfaces I have seen are incidental ones: for example, something which notices what you have been looking at and then, when you come back, highlights the last thing you looked at as a way of getting you back in the zone quicker. Something you might do in a more systems-level way.
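
To make that incidental idea a bit more concrete, here is a rough sketch; the element names and the gaze event format are invented. It just remembers the last thing you fixated and highlights it when your gaze returns after an absence.

```python
# Rough sketch of a "gaze return" highlighter: remember the last element the user
# was looking at and re-highlight it when their gaze comes back after an absence.
import time

ABSENCE_SECONDS = 5.0   # gaze off-screen for this long counts as "looked away"

class GazeReturnHighlighter:
    def __init__(self):
        self.last_element = None      # last on-screen element the user fixated
        self.last_seen = time.monotonic()
        self.away = False

    def on_gaze(self, element):
        """Call with the UI element under the gaze point, or None if gaze is off-screen."""
        now = time.monotonic()
        if element is None:
            if now - self.last_seen > ABSENCE_SECONDS:
                self.away = True
            return
        if self.away and self.last_element is not None:
            self.highlight(self.last_element)   # welcome back: show where they left off
            self.away = False
        self.last_element = element
        self.last_seen = now

    def highlight(self, element):
        print(f"highlighting {element!r}")      # a real toolkit would flash or outline it

# usage: feed gaze samples mapped onto UI elements, None when the tracker loses the gaze
hl = GazeReturnHighlighter()
hl.on_gaze("paragraph-12")
# ... the user looks away for a while; the tracker keeps reporting None ...
```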

Anyway, with this Kickstarter project you can try this out more cheaply than before; good for a master's project.