Head of the Class

It's a Wednesday morning, and in Christina Levkus' classroom at Steel Valley High School outside Pittsburgh, a group of 10th graders is learning how to derive the measures of the angles in a triangle. They're working on geometry problems using a computerized tutor. Green bars, or "skillometers" (think "thermometers for skill"), show how much a student has learned in a given lesson. When an indicator turns gold, the student has mastered that skill.

Suddenly, one boy cries out: "This is crazy! Why can't I get any gold bars?"

The stricken sophomore glances at the computer next to his. "No fair," he says to the classmate working on that machine. "How come you've got gold bars?"

"Hey!" she says, pointing at the screen. "I can't get this one stupid bar to move at all." As they commiserate, Levkus walks over to see why they're struggling.

It's a typical scene in any of the 2,600 schools that use Cognitive Tutors, the interactive, intelligent tutoring programs developed at Carnegie Mellon in the 1990s and marketed by Pittsburgh's Carnegie Learning Inc. What sets this classroom apart is that Levkus' students are using a course that's part of LearnLab, a nationwide resource for education research created by the Pittsburgh Science of Learning Center, or PSLC, a joint venture between Carnegie Mellon and the University of Pittsburgh sponsored by the National Science Foundation.
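The green-to-gold mastery estimate behind those skillometers is, in published accounts of Cognitive Tutors, a form of Bayesian knowledge tracing: after every attempt, the tutor revises its estimate that the student knows the skill. Here is a minimal sketch in Python; the parameter values and mastery threshold are invented for illustration, not Carnegie Learning's actual settings.

    # Sketch of Bayesian knowledge tracing, the kind of estimate behind a
    # "skillometer." All parameter values are made up for this example.
    P_INIT = 0.25   # prior probability the student already knows the skill
    P_LEARN = 0.2   # probability of learning the skill at each opportunity
    P_GUESS = 0.2   # probability of answering correctly without knowing
    P_SLIP = 0.1    # probability of an error despite knowing
    MASTERY = 0.95  # threshold at which the bar "turns gold"

    def update_mastery(p_known, correct):
        """Revise P(skill known) after observing one attempt."""
        if correct:
            evidence = p_known * (1 - P_SLIP)
            posterior = evidence / (evidence + (1 - p_known) * P_GUESS)
        else:
            evidence = p_known * P_SLIP
            posterior = evidence / (evidence + (1 - p_known) * (1 - P_GUESS))
        # Allow for the chance the student learned the skill on this step.
        return posterior + (1 - posterior) * P_LEARN

    p = P_INIT
    for outcome in [True, False, True, True, True]:
        p = update_mastery(p, outcome)
        print(f"P(known) = {p:.2f}" + ("  gold!" if p >= MASTERY else ""))

A string of correct answers pushes the estimate toward the threshold; errors pull it back down, which is why the boy's bars refuse to turn gold.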

The course Levkus' students are using can measure many variables. Data are being collected for a study by Vincent Aleven, assistant professor in the Human-Computer Interaction Institute, and post-doctoral fellows Ryan Baker (CS'05) and Ron Salden. As Levkus' students work through the lessons, their mistakes and correct answers are logged on a server. The software also counts the minutes it takes students to work through problem sets and the number of times they ask for online help.
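A rough sketch of what one logged record and its summary might look like; the field names here are hypothetical, not Carnegie Learning's or DataShop's actual schema.

    from dataclasses import dataclass

    @dataclass
    class TutorTransaction:
        """One logged student action. Field names are illustrative only."""
        student_id: str
        problem: str          # e.g., "triangle-angle-sum-03"
        step: str             # the step within the problem being attempted
        outcome: str          # "CORRECT", "INCORRECT", or "HINT"
        duration_secs: float  # seconds spent before this action

    def summarize(transactions):
        """Total the measures described above: time on task, help requests."""
        minutes = sum(t.duration_secs for t in transactions) / 60
        hints = sum(1 for t in transactions if t.outcome == "HINT")
        errors = sum(1 for t in transactions if t.outcome == "INCORRECT")
        return {"minutes": round(minutes, 1), "hints": hints, "errors": errors}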

Each data point collected is stored in the "DataShop," a massive repository of educational data created and maintained by the PSLC. Every day, DataShop receives similar data from schools and universities using six course-length tutoring systems in algebra, geometry, physics, chemistry, Chinese and English, as well as from dozens of other learning environments, some created specifically for certain experiments. So far this year, more than 14,000 students have taken part in LearnLab experiments, at sites ranging from elementary, middle and high schools in Pennsylvania, New Jersey and Florida to university campuses in Hawaii, British Columbia, Denmark and Germany.

As a result, DataShop is now arguably the world's largest public repository of empirical educational data gathered in the field, with a corpus of approximately 100,000 hours of student instruction comprising 22 million individual transactions between students and tutoring programs.

"We really have established that intelligent tutoring systems can be a great way not just of delivering instruction," says Ken Koedinger (HS'88, '90), a professor of human-computer interaction and psychology at Carnegie Mellon, "but that they also provide the technical infrastructure for helping us understand human learning."

. . .

Koedinger is co-director of PSLC with Charles Perfetti, research director and senior scientist at the University of Pittsburgh's Learning Research and Development Center, or LRDC. Members of the executive committee include Aleven; Maxine Eskenazi (HS'73), associate teaching professor in the Language Technologies Institute; David Klahr (TPR'65, '68) of Carnegie Mellon's psychology department; Marsha Lovett (HS'91, '94) of Carnegie Mellon's Eberly Center for Teaching Excellence; and Julie Fiez, Tim Nokes and Lauren Resnick of the Pitt psychology department. Michael Bett (S'86, TPR'00) is managing director.

Created five years ago with $25 million from the National Science Foundation, PSLC learned in February that its work would be funded for another five years with a similar grant.

Perfetti, University Professor of Psychology at Pitt, serves as chief scientist of PSLC. "It's really hard to over-emphasize the value" of the LearnLab infrastructure, he says. Classroom research before the creation of LearnLab required high levels of effort and repeated trips into the field, Perfetti says, likening them to visits to the dentist. "Every time you did one, you wondered if it was the last study," he says, adding that the more data the researcher collected, the harder they were to analyze.

But converting a classroom that's already using Cognitive Tutors into a LearnLab is a relatively simple process. And because so many schools are already using Carnegie Learning products, they're familiar with Cognitive Tutors and comfortable with Carnegie Mellon staff. That's a built-in network PSLC research manager Gail Kusbit uses whenever a researcher needs to conduct an experiment. "We're very fortunate in that so many schools say yes," she says.

As a result, LearnLab and DataShop are the foundation of a network for continuous data collection that Perfetti envisions becoming an educational research resource "for the world." With LearnLab, he says, education and cognitive psychology researchers "never have to think twice" before committing to a classroom study. "You just do it."

. . .

LearnLab is only one important initiative of PSLC's researchers. Others include the Cognitive Tutor Authoring Tools, or CTAT, a software suite that allows non-programmers to design their own intelligent tutors; TuTalk, which lets tutor developers build less obtrusive, more intuitive online help programs; and TagHelper, a suite of programs that partially automates the annotation of documents in Chinese, English, German and Spanish, making it easier to classify texts gathered in protocol analysis ("think-aloud" or dialogue data).

PSLC's express purpose is to encourage experimental research into "robust learning"--giving students a deeper understanding of the material and the ability to transfer their knowledge to new situations. Studies indicate that students who learn "robustly" in a subject area find it easier to master new skills in the future.

But identifying "robust" learning strategies after a class ends is difficult. Assessment tests provide a crude measurement of how much a student learned, but don't offer any insight into why a particular student did (or didn't) retain knowledge, and don't document struggles and milestones in the learning process.

Conventional wisdom, for instance, holds that "practice makes perfect," and that students learn more if they have lots of homework. Research through LearnLab indicates that isn't necessarily so.

"Problem sets are good, and you need practice, but especially for students are just beginning a topic, it's not enough," Koedinger says. Instead of drilling students on homework problems, researchers have had them use intelligent tutors that alternated solved examples with new problems. As a result, the students retained more information and did better on tests. "They spent more of their brain power on understanding rather than just getting the problems done," Koedinger says.

PSLC unites Carnegie Mellon's long heritage of research into cognitive psychology and computer science--a tradition reaching back to Herbert Simon and Allen Newell--with the University of Pittsburgh's School of Education and Learning Research and Development Center. Founded in 1963, the LRDC combines research in cognitive science, developmental and social psychology, organizational behavior and education policy to understand the ways children and adults learn.

"It's the kind of thing that shows off these intellectual resources that we have in Pittsburgh that make it such an exciting place," says Carolyn Penstein Rosé (HS'94, CS'97), assistant professor in Carnegie Mellon's Language Technologies Institute and Human-Computer Interaction Institute. Rosé is collaborating with Resnick, former director of the LRDC and Distinguished University Professor of Psychology and Cognitive Science at Pitt, on research into social and communicative factors in learning. They're examining how classroom conversation and interaction--including online chats and collaboration over the Internet--enhance the ways students master new material. Their goal is to teach students how to articulate their reasoning, and recognize when other students are also reasoning through a problem.

Although she'd long admired Resnick's research, Rosé hadn't found a way to work with her until PSLC provided a network for collaboration. "PSLC encourages an active exchange between different research communities that might not have gotten involved together otherwise," Rosé says. "It's really evolved organically."

. . .

Through face-to-face meetings and a lively wiki where users propose, document and debate theories, PSLC unites researchers in education, psychology, machine learning, human-computer interaction, language technologies and other fields. Many PSLC activities center on experiments conducted via LearnLab courses or on mining data from DataShop.

Probing data logged by geometry Cognitive Tutors, for instance, Aleven and other PSLC investigators noticed that a higher-than-expected percentage of students preferred making repeated errors to asking for online help. Some students worried (incorrectly) that their "skillometers" would go down. Others weren't sure when to seek assistance. Aleven and his colleagues built a tutor that worked alongside the geometry program to teach students when to ask for help. The results were mixed, he says--students got better at identifying hypothetical situations in which they should seek assistance, but weren't putting that knowledge into practice.
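Aleven's help tutor rested on a detailed model of good help-seeking; the fragment below is only a caricature of the kind of rule involved, with thresholds and messages invented for the example.

    ERROR_LIMIT = 2  # invented threshold: repeated errors before prompting

    def metacognitive_advice(consecutive_errors, hints_requested, estimated_mastery):
        """Return advice when a student's help-seeking looks unproductive,
        or None. Thresholds and messages are invented for illustration."""
        if consecutive_errors >= ERROR_LIMIT and hints_requested == 0:
            return "You've missed this step a few times -- try asking for a hint."
        if hints_requested > 0 and estimated_mastery > 0.9:
            return "You probably know this step; try it before asking for help."
        return None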

Still, the experience suggested important new areas for further research into motivational techniques and the role of positive and negative reinforcement, Aleven says. And a recent analysis suggests that students exposed to the "help tutor" used smarter strategies on subsequent exercises after that agent was removed, he says. "We're very excited about this result," Aleven says. "To the best of our knowledge, it's the first time that an intelligent software tutor has been shown to have a lasting effect on student learning at the meta-cognitive level."

The ultimate goal of the PSLC's research isn't just to improve student skills in math or science, Aleven says: "We're trying to help students become better learners."

Humans have been learning new skills since before the dawn of recorded history. But convincing educators to examine the ways students learn--and the ways teachers should teach--is surprisingly difficult, says Klahr, Walter van Dyke Bingham Professor of Cognitive Development and Education Sciences at Carnegie Mellon, who has been studying elementary and middle school science education for more than a decade.

Many science teachers abandon the scientific method when they're evaluating the success of their own lessons, Klahr says. "They love their subject matter and believe they can teach it, but they fail to apply the same rigorous procedures to evaluating the impact of their teaching methods as they would apply to their own scientific domains," he says.

The problem is that educators spend years in the classroom as students, Klahr says, and come out with strong beliefs about what works and what doesn't. Changing their minds requires rigorous, verifiable data obtained through controlled experiments, and gathering that data isn't easy. A researcher can observe a classroom of 30 students at a time, but can't easily study entire school districts.

And a researcher designing an experimental lesson can't ensure that every teacher will deliver that lesson in exactly the same way. Cognitive Tutors don't have that problem. "They're infinitely patient, and you can test things out for as long as you want," says William Cohen, associate research professor in the Machine Learning Department, who worked on development of SimStudent, part of the CTAT suite. Cognitive Tutors provide researchers with a degree of control over an experiment's parameters that human teachers can't duplicate. Because they can be deployed at many sites simultaneously, the amount of data collected is orders of magnitude larger than anything individual researchers can gather independently.

"When you have a weak signal and a lot of noise, you need a lot of data," Klahr says. "Now you can do a very sophisticated analysis because the data sets are so large."

. . .

Even relatively small data sets, however, are providing useful information. Tim Nokes is an assistant professor of psychology at Pitt who studies physics education. Traditional physics courses, he says, rely heavily on showing students examples of different problems, then expecting students to solve new problems by applying the same formulas. Unfortunately, Nokes says, some of the best students finish a course with a good understanding of formulas and a weak understanding of physical concepts.

"There's a real disconnect between the description of the concepts and the examples," he says. "They tend to see a problem as a 'pulley problem' or a 'lever problem,' and not a 'Newton's Second Law of Motion problem.'"

Through LearnLab, Nokes conducted an experiment with 78 midshipmen taking a physics course at the U.S. Naval Academy in Annapolis, Md. One group received a standard lesson with a completed example problem and was asked to solve a similar problem. A second group was asked to examine and explain each step in the sample problem. The third group was given two examples and asked to compare and contrast them. All of the students were then tested using an intelligent tutor instrumented as a LearnLab course. The students in the second and third groups scored higher, solved the problems more quickly and required less online help, Nokes says, demonstrating that they had mastered the concepts more thoroughly.

Without PSLC and LearnLab, Nokes could have conducted a similar study in a controlled setting like a laboratory, but such experiments lack verisimilitude. They don't take into account variables like a student's motivation to do well in a certain course. "It's not always clear to me as a researcher that something that works in a lab will work in a classroom," he says.

Experiences like that prove the value of the LearnLab concept, says Marsha Lovett, associate director of faculty development in Carnegie Mellon's Eberly Center for Teaching Excellence and associate research professor of psychology. "This is intensive, moment-to-moment, click-stream data, not just 10 minutes in a lab," says Lovett, co-principal investigator on a project to create an intelligent statistics tutor. "And you can collect from thousands of students a semester's worth of data each."

. . .

Not all of the studies being run through the PSLC require collection of new data, says Ryan Baker, a post-doctoral fellow in the HCII. Many experiments can be performed using DataShop's existing corpus--some of which was gathered back in 1996, before the creation of PSLC, but which has been transferred into the system.

"If you're doing educational data mining, (the PSLC) is the place to be," says Baker, technical director of DataShop and associate editor of the Journal of Educational Data Mining. "Educational data mining is a great new frontier," he says, because it's forcing computer scientists to develop entirely new strategies and algorithms. Many existing data-mining tools look for patterns, word associations or clusters but don't account for hierarchy and variables that depend on each other, Baker says. For instance, he says, if a particular student asks for help, researchers need to be able to interpret that request in light of the same student's previous requests for help.

PSLC has developed applications to plot learning curves, error rates and other measurements for DataShop users, and raw data can also be exported in tab-delimited format. But the amount of data has doubled in the past year and will probably increase a hundred-fold in the next two years, Baker says. As a result, scalability is becoming a concern; one fast-moving research area in the PSLC is development of more powerful data-mining tools.
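A learning curve of the kind DataShop plots charts error rate against the number of opportunities a student has had to practice a skill. Assuming a tab-delimited export with columns named student, skill and outcome (DataShop's real headers differ), a minimal version might read:

    import csv
    from collections import defaultdict

    def error_rate_curve(path):
        """Error rate by practice opportunity from a tab-delimited export.
        Column names are assumptions, not DataShop's actual headers."""
        seen = defaultdict(int)    # (student, skill) -> opportunities so far
        errors = defaultdict(int)  # opportunity number -> error count
        totals = defaultdict(int)  # opportunity number -> attempt count
        with open(path, newline="") as f:
            for row in csv.DictReader(f, delimiter="\t"):
                key = (row["student"], row["skill"])
                opportunity = seen[key]
                seen[key] += 1
                totals[opportunity] += 1
                if row["outcome"] == "INCORRECT":
                    errors[opportunity] += 1
        return {n: errors[n] / totals[n] for n in sorted(totals)}

A downward-sloping curve--fewer errors with each additional opportunity--is the signature of learning; a flat one suggests the skill model or the instruction needs work.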

If there's a downside to the LearnLab infrastructure and DataShop, it's that not enough researchers outside of Pitt and Carnegie Mellon are using it. Klahr (quoting a favorite Koedinger analogy) says it's a classic "toothbrush problem"--researchers treat other people's theories like toothbrushes, and they don't want to use someone else's.

If the same "not-invented-here" bias also applies to educational data, it may finally be weakening, Baker says. DataShop currently has more than 450 users in a dozen countries, and at the first annual International Conference on Educational Data Mining, held last June in Montreal, Quebec, nine of 17 papers included material from DataShop. "The educational data-mining community does see us as a resource, and they're getting excited about it," he says.

Baker is developing software that spots when students using a Cognitive Tutor are getting bored and "gaming the system"--solving problems by guessing or abusing online help instead of reasoning through the steps. "I believe that the PSLC, five years from now, will have made enormous progress toward predicting how an individual student will respond to certain methods of instruction," he says.
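Baker's published detectors are machine-learned models; the heuristic below is only a caricature of the behaviors they target--rapid guessing and clicking through hints--with made-up thresholds.

    FAST_SECS = 3    # made-up threshold: a suspiciously quick action
    BURST_LIMIT = 3  # made-up threshold: quick actions that suggest gaming

    def looks_like_gaming(recent):
        """Flag a window of recent transactions resembling gaming the
        system: rapid-fire guesses or hints requested too fast to read."""
        fast_errors = sum(1 for t in recent
                          if t.outcome == "INCORRECT" and t.duration_secs < FAST_SECS)
        fast_hints = sum(1 for t in recent
                         if t.outcome == "HINT" and t.duration_secs < FAST_SECS)
        return fast_errors >= BURST_LIMIT or fast_hints >= BURST_LIMIT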

Lovett looks forward to the day when intelligent tutors are able to provide smarter feedback to students by analyzing a semester's worth of data. A future intelligent tutor might spot students who cram the night before tests and do poorly, she says. It could then suggest smarter strategies for learning. "If you have all of this learning data coming in moment-by-moment, then why not feed it back to the instructors and the students themselves in a way that's meaningful?" Lovett says.

There are many innovations to come, Koedinger says, and much to be discovered about the way human beings learn. So little experimental data has been gathered from classrooms that it's a vast territory waiting for exploration, he says.

 "It's also a particularly challenging one, because there are so many political, ethical and social issues that float around," Koedinger says, "but in the science of learning, there are really still surprises to be had."
For More Information: 
Jason Togyer | 412-268-8721 | jt3y@cs.cmu.edu