Monday, May 4, 2009

Playing the game. EyeSpy: supporting navigation through play

This paper was written by Marek Bell¹, Stuart Reeves¹, Barry Brown², Scott Sherwood¹, Donny MacMillan¹, John Ferguson¹ and Matthew Chalmers¹ (1. University of Glasgow, 2. University of California).

One of the central ideas behind this paper is getting useful information out of a game while keeping the game fun, an approach I strongly support. One of the cool things about EyeSpy is that it takes place in a real-world environment. I think alternate reality games and augmented reality will soon offer more and more interesting things to do while you simply go about your everyday activities.
But now on to what they actually did.

EyeSpy is a game designed to extract useful navigation aids, in the form of recognizable landmark photos and text tags. Though the text tags did not really work, the photos were able to help quite a bit.

Here is how the game is played: it runs over a week's time, integrated into your daily life. Players take photos of places they think other people will recognize, and other players confirm each photo by visiting the location themselves. (The software tracks location by detecting nearby wireless networks.) Over time, players got better at finding spots that others would easily recognize. After the game finished, people were asked to navigate using the game's photos versus random photos of the same area taken from Flickr; the photos from EyeSpy were significantly better for this task.
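The paper's software uses visible wireless networks to decide whether a confirming player is at the same spot as the photographer. As a minimal sketch of how such a check might work (the function names, threshold, and use of Jaccard overlap are my own illustration, not the authors' actual method), one could compare the sets of Wi-Fi access points observed in each scan:

```python
def jaccard(scan_a, scan_b):
    """Overlap between two sets of observed Wi-Fi access-point IDs."""
    a, b = set(scan_a), set(scan_b)
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)

def same_place(scan_a, scan_b, threshold=0.4):
    """Treat two scans as co-located if their access-point sets
    overlap enough (the threshold here is an arbitrary example)."""
    return jaccard(scan_a, scan_b) >= threshold

# Hypothetical example: the photographer's scan vs. a confirmer's scan.
photographer = ["ap:11", "ap:22", "ap:33"]
confirmer = ["ap:11", "ap:22", "ap:44"]
print(same_place(photographer, confirmer))  # two of four APs shared -> True
```

Because Wi-Fi scans are noisy (access points come and go between scans), a set-overlap test like this tolerates partial matches rather than requiring identical readings.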

I am looking forward to seeing more articles on extracting useful interaction from games.
