CS 230: Software Engineering

Lecture: Mon/Wed 12:00 PM to 1:50 PM, Royce 156
Instructor: Dr. Miryung Kim (ENG VI, 474)
Office Hours: By appointment only
Official Final Exam Time will be used for final project presentations.
  
General Description
As software systems become increasingly large and complex, automated software engineering analysis and development tools play an important role in various software engineering tasks: the design, construction, evolution, testing, and debugging of software systems. This course will introduce students to the foundations, techniques, tools, and applications of automated software engineering technology. Students will develop, extend, and evaluate a mini automated software engineering analysis tool and assess how the tool fits into the software development process. This class is intended to introduce current research topics in automated software engineering.

Undergraduate-level knowledge of data structures and object-oriented programming languages is required. A prior undergraduate course in software engineering, such as CS 130, is recommended. Please feel free to take a look at the CS 130 syllabus here. You are also welcome to sit in for a few days to see how this class feels. By understanding the fundamentals behind automated software engineering analysis algorithms, techniques, and tools, students will develop a keen eye for how to systematically design, analyze, and extend large software systems. This will prepare students to become principal software architects and development leads in the future. Topics include:
  • Software design patterns
  • Automated software testing
  • Bug finding and advanced debugging techniques 
  • Program understanding and software visualization
  • Automated refactoring

Audience and Prerequisites

Undergraduate-level knowledge of data structures, object-oriented programming languages, and software engineering is required. Knowledge of compilers, program analysis, and program representations such as control flow graphs and abstract syntax trees is required. Proficient Java programming experience is also required.

Enrollment in this class is by PTE only. Please contact the instructor with your undergraduate transcript, your UCLA graduate transcript, and your CV. Unofficial transcripts are fine.

Grading

Reading Assignment Instructions

Please download each assigned paper from the ACM Digital Library. Access to the ACM Digital Library is free if you are using a computer on campus with a valid UCLA IP address. Please submit short questions or comments in class for each paper discussion. The grading of reading questions will depend on the overall quantity and quality of your questions.
  •     Cool or significant ideas. What is new here? What are the main contributions of the paper? What did you find most interesting? Is this whole paper just a one-off clever trick or are there fundamental ideas here which could be reused in other contexts?
  •     Fallacies and blind spots. Did the authors make any assumptions or disregard any issues that make their approach less appealing? Are there any theoretical problems, practical difficulties, implementation complexities, overlooked influences of evolving technology, and so on? Do you expect the technique to be more or less useful in the future? What kind of code or situation would defeat this approach, and are those programs or scenarios important in practice? Note: we are not interested in flaws in presentation, such as trivial examples, confusing notation, or spelling errors. However, if you have a great idea on how some concept could be presented or formalized better, mention it.
  •     New ideas and connections to other work. How could the paper be extended? How could some of the flaws of the paper be corrected or avoided? Also, how does this paper relate to others we have read, or even any other research you are familiar with? Are there similarities between this approach and other work, or differences that highlight important facets of both?

Mini Exams

We will have quizzes on four pre-defined dates. The instructor cannot accommodate taking a quiz on a different date; however, only your best 3 of the 4 quiz scores will count. Each quiz should take about 20 minutes.

Mini Projects

  • Proposal of a new research project, for example:
    • developing and assessing new algorithms to replace currently used ones
    • translating a methodology to a new problem domain
    • applying known techniques to new problem domains, such as operating systems, networks, embedded systems, security, biology, aerospace, etc.
    • porting an existing tool to a new domain (e.g., a new programming language)

Each team should consist of 3 to 4 people and submit a written report (max 10 pages) to the CCLE system. You may include an appendix beyond 10 pages, but your paper must be understandable without it. Submissions should be in the ACM format. In Week 5, I will release a few sample project ideas to guide you in choosing a project topic. A short proposal is due in Week 6, and the final project report is due during final exam week.

Your report should be structured like a conference paper, meaning that your report should contain:

  • Abstract
  • A well-motivated introduction
  • Related work with proper citations
  • Description of your methodology
  • Evaluation results 
  • Discussion of your approach, threats to validity, and additional experiments
  • Conclusions and future work 
  • Appendix: Describe how to run and test your implementation.
If you are doing a project that involves implementation, please submit your source code by sharing an online repository. Please describe how to run and test your code in your report.

Here are the grading guidelines for your project report.

Motivation & Problem Definition

  • Does the report sufficiently describe the motivation of this project?
  • Does the report describe when, how, and by whom this research can be used, illustrated with examples and scenarios?
  • Does the report clearly define a research problem?

Related Work

  • Does the report adequately describe related work?
  • Does the report cite and use appropriate references?

Approach

  • Does the report clearly & adequately present your research approach (algorithm description, pseudo code, etc.)?
  • Does the report include justifications for your approach?

Evaluation

  • Does the report clarify your evaluation's objectives (the research questions you raise)?
  • Does the report justify why it is worthwhile to answer these research questions?
  • Does the report concretely describe what can be measured and compared against existing approaches (if any exist) to answer these research questions?
  • Is the evaluation study design (experiments, case studies, and user studies) sound?

Results

  • Does the report include empirical results that support the authors' claims and research goals?
  • Does the report provide any interpretation of the results?
  • Is the information in the report sound, factual, and accurate?

Discussions & Future Work

  • Does the report suggest future research directions or make suggestions to improve or augment the current research?
  • Does the report demonstrate consideration of alternative approaches? Does the report discuss threats to the validity of the evaluation?

Clarity and Writing

  • Is the treatment of the subject of reasonable scope for a class project?
  • How well are the ideas presented? (very difficult to understand =1, very easy to understand =5)
  • Overall quality of writing and readability (very poor =1, excellent =5)

Class Discussion: Think-Pair-Share

How Does It Work?
1) Think. The teacher provokes students' thinking with a question, prompt, or observation. The students should take a few moments (probably not minutes) just to THINK about the question.

2) Pair. Using designated partners (such as with Clock Buddies), nearby neighbors, or a deskmate, students PAIR up to talk about the answers each came up with. They compare their mental or written notes and identify the answers they think are best, most convincing, or most unique.

3) Share. After students talk in pairs for a few moments (again, usually not minutes), the teacher calls for pairs to SHARE their thinking with the rest of the class. She can do this by going around in round-robin fashion, calling on each pair, or she can take answers as they are called out (or as hands are raised). Often, the teacher or a designated helper will record these responses on the board or on the overhead.

Class Schedule, Reading List, and Project Milestones


Week 1: 4/2 (Mon), 4/4 (Wed)
Lectures:
  • Introduction to Software Engineering; Syllabus; Background Survey
  • Software Design and Software Architecture; Ease of Change; Software Architecture; Architecture Description Support
Reading:
  • OPTIONAL: On the criteria to be used in decomposing systems into modules
  • READ: An Introduction to Software Architecture
  • OPTIONAL: ArchJava: Connecting Software Architecture to Implementation

Week 2: 4/9 (Mon), 4/11 (Wed)
Lectures:
  • Software Design Patterns
  • Design Patterns
Reading:
  • Design Patterns: Factory Method, Singleton, Adapter, Bridge, Flyweight, Strategy, Mediator, Observer

Week 3: 4/16 (Mon), 4/18 (Wed)
Quiz 1 (4/16 Monday)
Lectures:
  • Hoare Logic and Weakest Precondition
  • Software Verification

Week 4: 4/23 (Mon), 4/25 (Wed)
Lectures:
  • Empirical Studies of Software Evolution; Code Decay
  • Reverse Engineering and Knowledge Discovery; Reflexion Model; Software Visualization
Reading:
  • READ: Does code decay? Assessing the evidence from change management data
  • READ: Software reflexion models: bridging the gap between design and implementation
  • OPTIONAL: Polymetric views: a lightweight visual approach to reverse engineering

Week 5: 4/30 (Mon), 5/2 (Wed)
Quiz 2 (4/30 Monday)
Lectures:
  • Interactive Code Review
  • API Usage Mining
Reading:
  • READ: Interactive Code Review for Systematic Changes (local pdf)
  • READ: Are Code Examples on an Online Q&A Forum Reliable? A Study of API Misuse on Stack Overflow (local pdf)
  • OPTIONAL: Visualizing API Usage Examples at Scale (local pdf)

Week 6: 5/7 (Mon), 5/9 (Wed)
Lectures:
  • Program Differencing and Merging; the Longest Common Subsequence Algorithm; Abstract Syntax Tree Based Program Differencing; Control Flow Graph Based Program Differencing
  • Refactoring; Refactoring Practices; Refactoring Reconstruction
Reading:
  • READ: A differencing algorithm for object-oriented programs
  • OPTIONAL: Identifying syntactic differences between two programs
  • OPTIONAL: Identifying and summarizing systematic code changes via rule inference
  • OPTIONAL: Interactive Code Review for Systematic Changes
  • Basic background: Abstract Syntax Tree; Control Flow Graph
  • READ: A Field Study of Refactoring Benefits and Challenges
  • OPTIONAL: Template-based Reconstruction of Complex Refactorings

Week 7: 5/14 (Mon), 5/16 (Wed)
Quiz 3 (5/14 Monday)
Lectures:
  • Refactoring; Automated Refactoring
  • Debugging and Fault Localization; Delta Debugging; Spectra-Based Fault Localization; FindBugs
Reading:
  • READ: Does Automated Refactoring Obviate Systematic Editing?
  • OPTIONAL: LASE: Locating and Applying Systematic Edits by Learning from Examples
  • READ: Yesterday, my program worked. Today, it does not. Why?
  • OPTIONAL: Simplifying and Isolating Failure-Inducing Input
  • OPTIONAL: Locating Causes of Program Failures
  • OPTIONAL: Isolating Cause-Effect Chains from Computer Programs

Week 8: 5/21 (Mon), 5/23 (Wed)
Lectures:
  • Fault Localization
  • Regression Testing
Reading:
  • READ: Visualizing information to assist fault localization
  • OPTIONAL: Finding bugs is easy
  • READ: Regression test selection for Java software
  • OPTIONAL: Scaling regression testing to large software systems

Week 9: 5/28 (Mon) and 5/31 (Wed) -- No Class (Memorial Day and ICSE Conference)

Week 10: 6/5 (Mon), 6/7 (Wed)
Quiz 4 (6/7 Wednesday)
Lectures:
  • Change Impact Analysis
  • Code Clones
Reading:
  • READ: Chianti: a tool for change impact analysis of Java programs
  • OPTIONAL: FaultTracer: a spectrum-based approach to localizing failure-inducing program edits
  • READ: CCFinder: a multilinguistic token-based code clone detection system for large scale source code
  • OPTIONAL: An empirical study of code clone genealogies

Final Week
  • Final Project Presentations (held during the official final exam time)


Feedback Statement

During this course, I will ask you to give me feedback on your learning in both informal and formal ways, for example, through an anonymous midpoint survey about how my teaching strategies are helping or hindering your learning. It is very important for me to know your reaction to what we are doing in class, so I encourage you to respond to these surveys so that we can create an environment that is effective for teaching and learning. Occasionally, at the end of a lecture, I will hand out index cards asking "What is the most important thing you have learned in this class session?" and "What questions do you still have?" This feedback will be anonymous; its purpose is to check your understanding and promote Q&A. Please also take the time to write comments when submitting your course instructor survey; it is important to me and to future students. To reward your participation, I will add 3% to the total grade for completing the course instructor survey.

Academic Integrity

Each member of the university is expected to uphold the values of integrity, honesty, trust, fairness, and respect toward peers and community. In your first week, you must read and sign UCLA's Academic Integrity Statement.