Human Development Lab

@ Carnegie Mellon University

About

Impact evaluation is considered harmful.*

Ok, now that we have your attention, let us clarify what we mean. :) The lab's central focus is to make learning opportunities more equitable around the world through affordable, culturally appropriate technologies. When designing any intervention, it is of course important to evaluate how effective that intervention has been. In our mixed-methods studies, quantitative evaluations of the interventions we designed have shown nearly one standard deviation of improvement in post-test gains, and we continue to seek funds to replicate these results in large-scale, randomized experiments. Using qualitative methods in the spirit of ethnography, we have studied how gender and caste act as social fault lines that affect the use of learning technologies in everyday rural settings.

That said, an excessive focus on intervention evaluation has the unintended consequence of treating technology as a black box. When the crucial work that goes into intervention design is neglected, it is not surprising that pilots fail due to poor designs. Worse, we fail to understand how various design features account for the outcomes eventually observed in an evaluation, and hence lack the fine-grained knowledge that is essential for scaling up any pilot. We take a humanistic view of technology: one that upholds human agency, recognizes that humans use technology to bring about non-deterministic outcomes, and sees technology as part of the repertoire of tools that mutually constitute the broader cultural environment.

Intervention Design 

There is a growing community of designers who are passionate about applying their skills to improve the lives of the underserved. This is great! Rather than simply designing yet another system (though we do that too), our ultimate goal is to systematize a body of knowledge, techniques and tools that other intervention designers can use. When we design educational interventions to draw design lessons for the larger community, we follow these principles:

Healthy skepticism about technology. It may sound ironic coming from technology innovators, but it was our skepticism about adopting technology blindly that led us into human-computer interaction in the first place! Technology does not exist in a vacuum; it exists within a broader social, cultural and political context. As such, we experiment with numerous ideas in the early stages of the design lifecycle to understand which ideas are not feasible, and why, with the specific goal of rejecting most design ideas so as to narrow down on the more viable ones for further probing.

Culturally appropriate design. Our designs are informed by our qualitative studies of everyday sociocultural practices. For instance, our initial designs for educational games that target literacy among rural children in India were not intuitive to them. We took a step back to study the traditional, physical games that rural children play every day. We then analyzed the game mechanics in these games to understand the rules that match the expectations rural children have about games. Since the best way to understand traditional games is to understand what they are not, we went a step further and examined how elements in village games differ systematically from those in contemporary Western videogames, finding over 30 non-trivial differences. This cross-cultural analysis has informed our subsequent educational game designs for rural Indian children. Our collaborators in China have replicated the technique to improve Mandarin literacy in non-coastal regions of China, with promising early results.

Sound pedagogy. Instead of relying on non-scientific intuitions about how humans learn, such as those derived from one's prior experiences as a K-16 student, we draw on the research literature in the learning sciences, the psychology of reading processes and second language acquisition to inform our designs. Technological artifacts are not black boxes; they are highly malleable objects whose designs can be shaped by evidence-based pedagogy.

State curriculum standards. Technology is not culturally neutral and needs to exist within broader institutional contexts. As such, we align our technology designs with curricula that conform to the official standards mandated by local governments, so as to encourage adoption by parents, educators and other stakeholders. At the same time, in practice, we have needed to add "remedial" content to existing curricula, since the latter often target middle-income, urban learners rather than their less well-off counterparts from the slums or rural areas. We aim to measure learning gains against conventional tests, in addition to other outcomes.

Reuse, don't reinvent the wheel. In our work, we attempt to capture and extend design knowledge about current best practices. For example, before we designed our literacy learning games, we studied more than 30 commercial language learning applications and distilled their best practices into over 50 design patterns. Design patterns are a formalism for capturing design knowledge about widely accepted solutions to recurring problems, so that this knowledge can be communicated and reused. We leverage current best practices as a starting point and incorporate them into our instructional designs, rather than reinventing the wheel.

In drawing on existing knowledge to inform intervention design, our task is complicated by the fact that much research on reading acquisition and literacy, for example, is based on learners in industrialized countries with reasonably good access to schooling. Our target learners, by contrast, live under vastly different conditions with respect to assumptions such as "concept of print" and school-based social practices, both of which are important theoretical constructs in existing literacy theories. We have drawn from the existing research base as best as we could. We aim to use our design artifacts as a research infrastructure for operationalizing and testing the extent to which existing conceptual frameworks apply in culturally divergent environments, and how they must be extended to be more insightful there. In this way, human-computer interaction research in the "developing" world can help us attain a more globally complete understanding of what design, cognition, literacy and learning truly mean.

Local Capacity Development

We feel lucky to work alongside numerous undergraduate students from the countries where our research is based, in addition to college students from North America. We have mentored over 50 undergraduates, whether in the field during pilot testing, through summer internships hosted at Carnegie Mellon University, or by co-advising undergraduate theses. Local undergraduate research assistants are an asset: their knowledge of local languages, culture and engineering is invaluable when we design, build and pilot systems with local communities.

We are gratified to have in turn contributed to their professional development through this close mentor-apprentice relationship. Building on their experiences in our lab, our strongest local undergraduates have gained admission to competitive graduate programs in North American universities such as Berkeley, Carnegie Mellon, Georgia Tech, Stanford and Toronto.

Here is what some of our former undergraduate research assistants have said about their experiences:

“The biggest concept that I realized about performing human-computer interaction research is that the technological side is critical but makes up less than half of the research process.”

“MILLEE … has to work for the chosen user group. This requires … studying the way of life of urban slums and rural Indian children through field research.”

“MILLEE … involves people working all across the globe, so I've gained insight into what it takes to coordinate large-scale, international projects.”

“I learned how much work goes into planning and successfully executing a user study in a rural area. Matt has really helped me to see what it takes to run a project.”

“MILLEE has ... given me a really interesting look into the world of academic research, which is completely different from … undergraduate classes.”

“I learned to detach myself from a volunteer’s role to a researcher’s role involving monitoring and evaluation.”

“I have gotten lots of great advice about the research process, graduate student life, and the human-computer interaction field.”

“If an undergraduate researcher shows that he or she can handle the work … then their role in the group will increase to challenge and grow them as human-computer interaction researchers.”

“We presented our work in weekly team meetings, where even as an undergraduate researcher I felt my opinions were listened to and valued.”

“My favorite part was how diverse we all were. There were students from a wide variety of fields creating a truly interdisciplinary environment.”

“My experience in the field drove home the fact that children, despite their socio-economic class, are incredibly adept at absorbing and helping others absorb information, as well as learning new technologies, no matter what village, slum, or ghetto they are raised in.”


Geeta Shroff interviewing low-income women in the slums to learn why some of our initial design ideas ought to be discarded


* The computer scientist Edsger Dijkstra wrote an article in 1968 titled Go To Statement Considered Harmful. This critique of prevailing computer programming practices subsequently spurred the adoption of structured programming in the computer science community. In the same spirit of fostering change, we share our views and approach about how educational technologies should be designed with a humanistic philosophy.