CS 230: Software Engineering

Lecture: Mon/Wed 10:00 AM to 11:50 AM, BOELTER 5273
Instructor: Dr. Miryung Kim (ENG VI, 474)
Office Hours: By appointment only
Final Exam:  Monday, June 6, 2022, 11:30 AM - 2:30 PM (Official University Designated Time)
  
General Description
As software systems become increasingly large and complex, automated software engineering analysis and development tools play an important role in various software engineering tasks: design, construction, evolution, and testing and debugging of software systems. This course will introduce students to the foundations, techniques, tools, and applications of automated software engineering technology. Students will develop, extend, and evaluate a mini automated software engineering analysis tool and assess how the tool fits into the software development process. This class is intended to introduce current research topics in automated software engineering.

Undergraduate-level knowledge of data structures and object-oriented programming languages is required. A prior undergraduate course in software engineering, such as CS 130, is recommended; please feel free to take a look at the CS 130 syllabus here. You are welcome to just sit in for a few days and see how this class feels. By understanding the fundamentals behind automated software engineering analysis algorithms, techniques, and tools, students will develop a keen eye for how to systematically design, analyze, and extend large software systems. This will prepare students to become principal software architects and development leads in the future. Topics include:
  • Software design patterns
  • Automated software testing
  • Bug finding and advanced debugging techniques 
  • Program understanding and software visualization
  • Automated refactoring

Audience and Prerequisites

Undergraduate-level knowledge of data structures, object-oriented programming languages, and software engineering is required, as is knowledge of compilers, program analysis, and the control flow graph and abstract syntax tree program representations. Proficient Java programming experience is required as well.

Enrollment in this class is by PTE only. If you are interested in enrolling, you must come to class and fill out a background survey. I will issue PTEs based on your attendance and background survey.

Grading

Reading Assignment Instruction

Please download the assigned papers from the ACM Digital Library. Access to the ACM Digital Library is free if you are using a computer on campus with a valid UCLA IP address. Please submit short questions or comments for each paper discussion via BruinLearn before class. The grading of reading questions will depend on the overall quantity and quality of your questions. Consider the following angles:
  •     Cool or significant ideas. What is new here? What are the main contributions of the paper? What did you find most interesting? Is this whole paper just a one-off clever trick or are there fundamental ideas here which could be reused in other contexts?
  •     Fallacies and blind spots. Did the authors make any assumptions or disregard any issues that make their approach less appealing? Are there any theoretical problems, practical difficulties, implementation complexities, overlooked influences of evolving technology, and so on? Do you expect the technique to be more or less useful in the future? What kind of code or situation would defeat this approach, and are those programs or scenarios important in practice? Note: we are not interested in flaws in presentation, such as trivial examples, confusing notation, or spelling errors. However, if you have a great idea on how some concept could be presented or formalized better, mention it.
  •     New ideas and connections to other work. How could the paper be extended? How could some of the flaws of the paper be corrected or avoided? Also, how does this paper relate to others we have read, or even any other research you are familiar with? Are there similarities between this approach and other work, or differences that highlight important facets of both?

Exam

There will be three exams. All exams are in-person and closed book. The instructor is unable to accommodate alternate dates for the mini exams; no exceptions will be made. However, your lowest exam score will be dropped, and your best two exams will count toward the final grade.

Mini Project

  • a proposal for a new research project, for example:
    • develop and assess new algorithms to replace currently used ones
    • translate a methodology to a new problem domain
    • apply known techniques to new problem domains
    • port an existing tool to a new domain (e.g., a new programming language)

Each team should consist of 5 to 6 people and submit a written report (max 10 pages) to the BruinLearn system. You may include an appendix beyond 10 pages, but your paper must be understandable without it. Submissions should be in the ACM format. In Week 2, I will release a few sample project ideas to guide you through the process of choosing a project topic. A short proposal is due in Week 3, and the final project report is due during final exam week. Part of the instruction time will be reserved for project team meetings with the instructor; sign-ups will be via Piazza. Although this is a team project, team members are not automatically entitled to the same grade; individual scores may be adjusted based on each member's contribution, if necessary. Multiple sources of evidence, including a peer evaluation, will be collected.

Your report should be structured like a conference paper, meaning that your report should contain:

  • Abstract
  • A well-motivated introduction
  • Related work with proper citations
  • Description of your methodology
  • Evaluation results 
  • Discussion of your approach, threats to validity, and additional experiments
  • Conclusions and future work 
  • Appendix: Describe how to run and test your implementation.
If your project involves implementation, please submit your source code by sharing an online repository, and describe how to run and test your code in your report.

Here are the grading guidelines for your project report.

Motivation & Problem Definition

  • Does the report sufficiently describe the motivation of this project?
  • Does the report describe, with examples and scenarios, when, how, and by whom this research can be used?
  • Does the report clearly define a research problem?

Related Work

  • Does the report adequately describe related work?
  • Does the report cite and use appropriate references?

Approach

  • Does the report clearly & adequately present your research approach (algorithm description, pseudo code, etc.)?
  • Does the report include justifications for your approach?

Evaluation

  • Does this report clarify your evaluation’s objectives (research questions raised by you)?
  • Does this report justify why it is worthwhile to answer such research questions?
  • Does this report concretely describe what can be measured and compared to existing approaches (if any exist) to answer such research questions?
  • Is the evaluation study design (experiments, case studies, and user studies) sound?

Results

  • Does the report include empirical results that support the author’s claims/ research goals?
  • Does the report provide any interpretation on results?
  • Is the information in the report sound, factual, and accurate?

Discussions & Future Work

  • Does the report suggest future research directions or make suggestions to improve or augment the current research?
  • Does the report demonstrate consideration of alternative approaches? Does it discuss threats to the validity of the evaluation?

Clarity and Writing

  • Is the scope of the project reasonable for a class project?
  • How well are the ideas presented? (very difficult to understand =1, very easy to understand =5)
  • Overall quality of writing and readability (very poor =1, excellent =5)

Class Discussion: Think-Pair-Share

How Does It Work?
1) Think. The teacher provokes students' thinking with a question, prompt, or observation. The students should take a few moments (probably not minutes) just to THINK about the question.

2) Pair. Using designated partners (such as with Clock Buddies), nearby neighbors, or a deskmate, students PAIR up to talk about the answer each came up with. They compare their mental or written notes and identify the answers they think are best, most convincing, or most unique.

3) Share. After students talk in pairs for a few moments (again, usually not minutes), the teacher calls for pairs to SHARE their thinking with the rest of the class. The teacher can do this by going around in round-robin fashion, calling on each pair, or by taking answers as they are called out (or as hands are raised). Often, the teacher or a designated helper will record these responses on the board or on the overhead projector.

Class Schedule, Reading List, and Project Milestones


Lectures
Reading
Week 1
3/28 (Mon)
3/30 (Wed)
Introduction to Software Engineering
Syllabus 
Background Survey

Software Design and Software Architecture
Ease of Change
Software Architecture
Architecture Description Support
OPTIONAL: On the criteria to be used in decomposing systems into modules

READ: An Introduction to Software Architecture
OPTIONAL:
ArchJava: Connecting Software Architecture to Implementation

Week 2
4/4 (Mon)
4/6 (Wed)

Software Architecture Styles and Constraints
Software Design Patterns

Design Patterns

Design Patterns: Factory Method, Singleton, Adapter, Bridge, Flyweight, Strategy, Mediator, Observer
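Several of the Week 2 patterns decouple a notifier from its listeners. A minimal Java sketch of the Observer pattern, one of the patterns covered that week (class and method names here are illustrative, not taken from the lecture materials):

```java
import java.util.ArrayList;
import java.util.List;

// Observer pattern sketch: a Subject notifies all registered
// Observers whenever its state changes, so observers stay in
// sync without the Subject knowing their concrete types.
interface Observer {
    void update(int newValue);
}

class Subject {
    private final List<Observer> observers = new ArrayList<>();
    private int state;

    void attach(Observer o) { observers.add(o); }

    void setState(int value) {
        state = value;
        for (Observer o : observers) {
            o.update(state);  // push the new state to every observer
        }
    }

    int getState() { return state; }
}
```

Because Observer has a single abstract method, a listener can be registered as a lambda, e.g. `subject.attach(v -> System.out.println(v));`.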
Week 3
4/11 (Mon)
4/13 (Wed)
Developer Tools for Heterogeneous Computing

Software Engineering for Data Analytics
READ: HeteroGen: Transpiling C to Heterogeneous HLS Code with Automated Test Generation and Program Repair (local_pdf)

READ: Software Engineering for Data Analytics (local_pdf)
Week 4
4/18 (Mon)
4/20 (Wed)

Mini Exam 1 on 4/18 Monday
Interactive Code Review

API Usage Mining

Book Chapter on Software Evolution (Section 1, Section 2, and Section 4.1)
READ: Interactive Code Review for Systematic Changes (local_pdf)

READ: Are Code Examples on an Online Q&A Forum Reliable? A Study of API Misuse on Stack Overflow (local_pdf)
OPTIONAL:
Visualizing API Usage Examples at Scale (local_pdf)
Week 5
4/25 (Mon)
4/27 (Wed)
Empirical Studies of Software Evolution
Code Decay

Program Differencing and Merging

The Longest Common Subsequence Algorithm
Abstract syntax tree-based program differencing
Control flow graph-based program differencing
READ: Does code decay? Assessing the evidence from change management data

Book Chapter on Software Evolution (Section 4.2)
READ: A differencing algorithm for object-oriented programs.
OPTIONAL:
Identifying syntactic differences between two programs
Identifying and summarizing systematic code changes via rule inference
Interactive Code Review for Systematic Changes
Basic Background:
Abstract Syntax Tree
Control Flow Graph
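The program differencing material in Week 5 builds on the longest common subsequence algorithm, the dynamic-programming core of line-based diff tools. A minimal Java sketch (the class and method names are my own):

```java
// Longest common subsequence (LCS) length via the classic
// O(n*m) dynamic-programming table: dp[i][j] holds the LCS
// length of the first i characters of a and first j of b.
class Lcs {
    static int length(String a, String b) {
        int n = a.length(), m = b.length();
        int[][] dp = new int[n + 1][m + 1];
        for (int i = 1; i <= n; i++) {
            for (int j = 1; j <= m; j++) {
                if (a.charAt(i - 1) == b.charAt(j - 1)) {
                    dp[i][j] = dp[i - 1][j - 1] + 1;       // match: extend the LCS
                } else {
                    dp[i][j] = Math.max(dp[i - 1][j],      // skip a character of a
                                        dp[i][j - 1]);     // or of b
                }
            }
        }
        return dp[n][m];
    }
}
```

A diff tool treats unmatched characters (or, at line granularity, unmatched lines) as insertions and deletions; the LCS is exactly the material the two versions share.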
Week 6
5/2 (Mon)
5/4 (Wed)



Refactoring

Refactoring Practices
Refactoring Reconstruction
Automated Refactoring


Book Chapter on Software Evolution (Section 3.4)
READ: A Field Study of Refactoring Benefits and Challenges
OPTIONAL: Template-based Reconstruction of Complex Refactorings

READ: Does Automated Refactoring Obviate Systematic Editing?
OPTIONAL: LASE: Locating and Applying Systematic Edits by Learning from Examples

Week 7
5/9 (Mon)
5/11 (Wed)


Mini Exam 2 on 5/9 Monday
Debugging and Fault Localization

Delta Debugging
Spectra-based fault localization
FindBugs


READ: Yesterday, my program worked. Today, it does not. Why?
OPTIONAL: Simplifying and Isolating Failure-Inducing Input
Locating Causes of Program Failures
Isolating Cause Effect Chains from Computer Programs

READ: Visualizing information to assist fault localization
OPTIONAL: Finding bugs is easy
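Spectrum-based fault localization, listed for Week 7, ranks program statements by how strongly their coverage correlates with failing tests. A minimal Java sketch of one well-known scoring formula (the Tarantula score from the visualization line of work); the class and parameter names are illustrative:

```java
// Tarantula suspiciousness: statements executed mostly by
// failing tests score near 1.0; statements executed mostly
// by passing tests score near 0.0.
class Tarantula {
    static double suspiciousness(int failedCovering, int totalFailed,
                                 int passedCovering, int totalPassed) {
        double failRatio = totalFailed == 0 ? 0.0
                : (double) failedCovering / totalFailed;
        double passRatio = totalPassed == 0 ? 0.0
                : (double) passedCovering / totalPassed;
        if (failRatio + passRatio == 0.0) return 0.0;  // never executed
        return failRatio / (failRatio + passRatio);
    }
}
```

For example, a statement covered by 2 of 2 failing tests and 1 of 3 passing tests scores (2/2) / (2/2 + 1/3) = 0.75, ranking it as fairly suspicious.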
Week 8
5/16 (Mon)
5/18 (Wed)

Change Impact Analysis


Book Chapter on Software Evolution (Section 5)

READ: Chianti: a tool for change impact analysis of java programs
OPTIONAL: FaultTracer: a spectrum-based approach to localizing failure-inducing program edits
OPTIONAL: Regression Test selection for Java software
OPTIONAL: Scaling regression testing to large software systems


Week 9
5/23 (Mon)
5/25 (Wed)
ICSE Travel

Video Lectures:
Hoare Logic and Weakest Precondition

Hoare Logic Part 1
Hoare Logic Part 2
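As a primer for the Hoare logic video lectures: the weakest precondition of an assignment is obtained by substituting the assigned expression for the variable in the postcondition, and sequences compose from back to front. A short worked derivation:

```latex
wp(v := e,\; Q) = Q[e/v]
\qquad
wp(S_1;\, S_2,\; Q) = wp(S_1,\; wp(S_2,\; Q))

\begin{aligned}
wp(x := x + 1;\; y := 2x,\;\; y > 2)
  &= wp(x := x + 1,\; wp(y := 2x,\; y > 2)) \\
  &= wp(x := x + 1,\; 2x > 2) \\
  &= 2(x + 1) > 2 \\
  &\equiv x > 0
\end{aligned}
```

So any state satisfying x > 0 before the two assignments is guaranteed to satisfy y > 2 afterward.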
Week 10
5/30 (Mon)--No Class
6/1 (Wed) Class
Memorial Day

Automated Repair
Mini Exam 3 on 6/1 Wednesday
READ: GenProg: A Generic Method for Automatic Software Repair
Final Week
6/6 Monday Final Presentation 11:30 AM to 2:30 PM


Feedback Statement

During this course, I will be asking you for feedback on your learning in both informal and formal ways, including an anonymous midpoint survey about how my teaching strategies are helping or hindering your learning. It is very important for me to know your reaction to what we are doing in class, so I encourage you to respond to these surveys so that we can create an environment effective for teaching and learning. Occasionally, at the end of a lecture, I will hand out index cards asking "What is the most important thing you learned in this class session?" and "What questions do you still have?" This feedback will be anonymous; it is meant to check your understanding and promote Q&A. Please also take the time to write comments when submitting your course instructor survey, as it is important to me and to future students. To reward your participation, I will add 3% to your final exam grade for completing the course instructor survey.

Academic Integrity

Each member of the university is expected to uphold the values of integrity, honesty, trust, fairness, and respect toward peers and community. In your first week, you must read and sign UCLA's Academic Integrity Statement.