PITTSBURGH—Luis von Ahn, a Carnegie Mellon University computer scientist who has been named a 2006 recipient of a John D. and Catherine T. MacArthur Foundation "genius grant," has invented an online multiplayer game that could help make the Internet more accessible to the visually impaired.
The game, called Phetch, is an Internet scavenger hunt available at www.peekaboom.org/phetch in which players use a search engine to look for images that fit certain descriptions. In the process, the players produce and verify captions for unlabeled images from the Web. These captions could be used to enhance the Web-browsing experience of blind people.
This innovative use of online games was one of the reasons cited by the MacArthur Foundation for naming von Ahn one of 25 new MacArthur fellows. Each fellow receives $500,000 in "no strings attached" support over the next five years.
Millions of blind people surf the Web every day with the help of text-to-speech translation programs. But these translation programs are of no help when Web sites feature unlabeled images. Only a small fraction of major corporate Web sites are fully accessible to disabled people; personal and small-business sites are even less accessible. Phetch is designed to eliminate this obstacle.
The game is the latest in a series of "Games with a Purpose" that have been developed by von Ahn, assistant professor of computer science, and University Professor of Computer Science Manuel Blum. The first such game, The ESP Game (espgame.org/), produced key words for images that could be used to aid image searches. Another game, Peekaboom (www.peekaboom.org/), produced images with objects labeled and highlighted in a way that could be used to train computer vision systems.
These games, like Phetch, employ a technique called "human computation" — harnessing the human brain to collectively perform tasks that digital computers have yet to master.
"By making these games enjoyable, we can tap into the millions of people who play online games every day, worldwide," said von Ahn, named one of the "Brilliant 10" young scientists in the October issue of Popular Science magazine. It's the same trick the fictional Tom Sawyer famously used to get his friends to whitewash a fence for him, only multiplied millions of times.
Phetch, which von Ahn and Blum developed with students Shiry Ginosar, Mihir Kedia and Ruoran Liu, is designed for three to five players. One serves as the narrator, writing a description of an image that has been randomly retrieved from a set of one million images gleaned from the Web. Only the narrator can see the image. The other players, the searchers, then use a special browser program to search for it within that same set.
Each round lasts five minutes. The narrator earns points for each successful search and loses points for passing on images deemed too difficult to describe. The first searcher to find each image earns points and becomes the narrator for the next round.
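The round structure described above can be sketched in code. This is an illustrative model only: the article does not specify actual point values, so the scores below are hypothetical placeholders.

```python
class PhetchRound:
    """Minimal sketch of one Phetch round, following the rules in the
    article. Point values are hypothetical; the article gives none."""

    ROUND_SECONDS = 300   # each round lasts five minutes
    FIND_POINTS = 100     # hypothetical reward for a successful search
    PASS_PENALTY = 50     # hypothetical cost when the narrator passes

    def __init__(self, narrator, searchers):
        self.narrator = narrator
        self.searchers = list(searchers)
        self.scores = {p: 0 for p in [narrator] + self.searchers}

    def narrator_passes(self):
        # The narrator may skip an image deemed too hard, at a cost.
        self.scores[self.narrator] -= self.PASS_PENALTY

    def image_found(self, searcher):
        # The narrator scores for the successful search; the first
        # searcher to find the image scores and narrates the next round.
        self.scores[self.narrator] += self.FIND_POINTS
        self.scores[searcher] += self.FIND_POINTS
        return searcher  # narrator for the next round
```

In a full implementation, the five-minute timer would end the round and rotate roles even if no image is found.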
Pilot testing has shown Phetch to be an engaging game. Players spend an average of 32 minutes with the game and some have played for 10 hours or more in a single session. The researchers calculate that 5,000 people — a modest number compared to the number of players at popular game sites — could produce explanatory descriptions of all of the images indexed by Google in just 10 months.
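The 10-month projection implies a per-player labeling rate that can be back-calculated. The sketch below does so under an assumed index size; the article does not say how many images Google indexed, so the one-billion figure is purely illustrative.

```python
# Back-of-the-envelope check of the throughput claim above.
# ASSUMPTION: the 1e9 index size is illustrative only; the article
# does not state the actual number of images Google indexed.
INDEX_SIZE = 1_000_000_000
PLAYERS = 5_000
MONTHS = 10
HOURS_PER_MONTH = 30 * 24  # players treated as a continuously active pool

images_per_player = INDEX_SIZE / PLAYERS
images_per_player_hour = images_per_player / (MONTHS * HOURS_PER_MONTH)
print(f"{images_per_player:.0f} images per player, "
      f"{images_per_player_hour:.1f} per player-hour")
```

Under these assumptions, each player would need to help describe a few dozen images per hour of play, which is the kind of rate the researchers' pilot data would need to support.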
Phetch might also be used to locate hard-to-find images on the Web, von Ahn said. This might be useful for people who don't have the time or skills to find such an image themselves. In that case, the game could be played without a narrator; the description of the desired image could simply be plugged into the game and the searchers would use a browser that could access the entire Web.
Also, by recording the description of each image and the search terms that proved successful in finding it, researchers might use Phetch to develop automated methods for converting natural language descriptions into keyword-based queries.
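One simple baseline for the description-to-query conversion the researchers envision is stopword removal: keep the content words of a description and discard function words. The sketch below is illustrative only and is not the CMU team's method; the stopword list and sample description are invented for the example.

```python
# Naive baseline for turning a natural-language image description into
# a keyword query. Purely illustrative; not the researchers' method.
STOPWORDS = {"a", "an", "the", "of", "in", "on", "with", "and",
             "is", "are", "to", "at", "over"}

def description_to_query(description: str) -> str:
    """Drop stopwords and punctuation, keeping content words in order."""
    words = [w.strip(".,!?").lower() for w in description.split()]
    return " ".join(w for w in words if w and w not in STOPWORDS)

print(description_to_query("A man in a red hat standing on the beach"))
# -> "man red hat standing beach"
```

Phetch's logs would let researchers learn such mappings from data instead, by pairing each recorded description with the search terms that actually located the image.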
Byron Spice | 412-268-9068 | bspice@cs.cmu.edu