Travis D. Breaux
Assistant Professor of Computer Science
Institute for Software Research
School of Computer Science
Carnegie Mellon University
5000 Forbes Avenue, Pittsburgh, PA 15213
Office: 5103 Wean Hall
Tel: 412-268-7334
Fax: 412-268-3455
E-mail:


08-605: Engineering Privacy in Software
08-605 for graduate students

This graduate-level course covers the methods and tools needed to design systems for privacy, with a specific focus on the requirements, design, and testing stages of the software development lifecycle.

Privacy harms involving personal data can often be traced back to software failures, which sound engineering practices can prevent. In this course, students will learn how to engineer privacy using modern methods and tools for software requirements, design, and testing. Students will learn how to collect and analyze software and privacy requirements; how to reconcile ambiguous, inconsistent, and conflicting requirements; and how to develop and evaluate software designs based on established privacy principles, including how to analyze design alternatives to reduce threats to personal privacy. After completing this course, students will know how to integrate privacy into the software development lifecycle and how, and when, to interface with relevant stakeholders, including legal, marketing, and development staff, in order to align software designs with relevant privacy laws and business practices.

Prerequisites: 15-313 or equivalent, or permission of the instructor.

Learning Objectives

  • Integrate privacy into the software engineering lifecycle phases
  • Collect, analyze and reconcile system requirements in a privacy-sensitive ecosystem
  • Evaluate software designs based on privacy principles and privacy requirements
  • Interface with software developers on critical privacy issues
Textbook: Axel van Lamsweerde (2009). Requirements Engineering: From System Goals to UML Models to Software Specifications, New Jersey: John Wiley & Sons, Inc.

Activities and Evaluation

Student performance in this 12-unit course will be evaluated based on class participation, individual assignments, and group projects; individual and group work will contribute equally to the final grade. There will be no midterm or final exams.
  • (3 hours each week) Readings – students will be assigned readings from the course textbook to learn established methods based on a strong engineering foundation. Additional readings selected and developed by the course instructor will introduce privacy theories to be implemented using these methods.
  • (3 hours) Class Participation (10%) – students reflect on readings in class and contribute to in-class assignments based on readings.
  • (2 hours) Assignments (40%) – students complete 6-8 take-home assignments during the semester, in which they apply methods taught in class to sample problems.
  • (4 hours) Project (50%) – students work in teams of 3-4 students on projects pre-selected by the course instructor. The aim of these projects is to highlight privacy across different types of systems, e.g., personal vs. multi-stakeholder systems, or mobile vs. online systems, and across different domains, e.g., social networking, retail, health, and travel. The team environment will aid students in considering a more diverse set of trade-offs when designing software to preserve privacy.

Course Modules

  1. Requirements - students will learn how to express and analyze system and privacy requirements using natural language use cases and semi-formal models. This includes reconciling conflicts between system requirements that deliver business value and privacy requirements that preserve, enhance, or otherwise protect privacy. Topics covered include:
    1. Sources of requirements and how to use trace matrices to manage compliance and preserve rationale. This includes legal or regulatory requirements, privacy principles, privacy patterns and privacy controls as a source of requirements knowledge.
    2. Goal-based analysis to refine privacy goals into functional, privacy-enhancing system specifications.
    3. Privacy threat and risk analysis to apply different risk models to explore privacy threats, vulnerabilities and mitigations, including: a legal compliance model, a model based on the Fair Information Practices (FIPs), Calo's subjective/objective harms model, Solove's privacy harms taxonomy, and Nissenbaum's Contextual Integrity.
  2. Design - students identify and evaluate alternative design strategies to implement requirements developed in the first half of the course. Evaluation criteria will consider trade-offs in system and privacy goals. Both data and software lifecycle concerns will be addressed, as follows:
    1. Architecture vs. Policy - Students will explore the boundary between engineering automation, as described in architecture, and human reliance, as covered by policy. The translation of policy into system specifications is a special focus under this topic.
    2. Data Lifecycle - Students will explore design "in the life of data" as it moves through the runtime system from collection, use, and retention to transfer. These lifecycle activities include designing for various privacy qualities, including collection and use limitation, data minimization, anonymization or de-identification, destruction, and individual participation, among others.
    3. Evolution & Adaptability - Students will explore software evolution and adaptability as it affects privacy, including deployment, maintenance, and upgrades that risk violating privacy requirements.
  3. Testing and Validation - students will learn how to test privacy requirements and how to accommodate requirements that are not easily tested, e.g., requirements concerning behavior outside the scope of the system specification, which rely on environmental assumptions such as the presence of certain privacy-protective activities or the absence of other threatening activities. Testing also includes code reviews, code audits, and auditing runtime behavior.
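As a hypothetical illustration of the trace matrices covered in the Requirements module, the sketch below maps legal source requirements to the system requirements that implement them, preserving the rationale for each trace link; the requirement identifiers (HIPAA-164.502(b), SR-1, etc.) are invented for this example and are not part of the course materials.

```python
# Minimal sketch of a requirements trace matrix for compliance.
# Each legal/regulatory source requirement maps to the system
# requirements that implement it, along with a recorded rationale.
# All identifiers here are illustrative, not from the course.

trace_matrix = {
    "HIPAA-164.502(b)": {  # hypothetical "minimum necessary" provision
        "system_reqs": ["SR-1", "SR-4"],
        "rationale": "Limit disclosures to the minimum necessary data.",
    },
    "COPPA-312.5": {       # hypothetical parental-consent provision
        "system_reqs": [],
        "rationale": "Obtain verifiable parental consent for children.",
    },
}

def uncovered(matrix):
    """Return source requirements with no implementing system
    requirement, i.e., potential compliance gaps to resolve."""
    return [src for src, entry in matrix.items()
            if not entry["system_reqs"]]

print(uncovered(trace_matrix))
```

Running the sketch reports the one source requirement with no trace link, showing how a simple matrix can surface compliance gaps early in requirements analysis.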