How Does It Work?
1) Think. The teacher provokes students' thinking with a
question or prompt or observation. The students should take a few
moments (probably not minutes) just to THINK about the question.
2) Pair. Using designated partners (such as with Clock Buddies),
nearby neighbors, or a deskmate, students PAIR up to talk about the
answer each came up with. They compare their mental or written notes
and identify the answers they think are best, most convincing, or most
unique.
3) Share. After students talk in pairs for a few moments (again,
usually not minutes), the teacher calls for pairs to SHARE their
thinking with the rest of the class. She can do this by going around
in round-robin fashion, calling on each pair; or she can take answers
as they are called out (or as hands are raised). Often, the teacher or
a designated helper will record these responses on the board or on the
overhead.
Lectures | Reading | Project |
|
Week 1 (1/13, 1/15) |
Introduction to Software Evolution: Syllabus; Background Survey; Graduate Project Ideas; Lecture 1: Introduction; Lecture 2: Silver Bullet |
(HW#1) No Silver Bullet: Essence and Accidents of Software Engineering (Due: Wed) |
|
Week 2 (1/22) |
Program Differencing and Merging: The Longest Common Subsequence Algorithm; Abstract Syntax Tree based Program Differencing; Control Flow Graph based Program Differencing; Lecture 3: Program Differencing |
A differencing algorithm for object-oriented programs: If you have already read this JDiff paper, you may submit your review of JDiff instead of the LSdiff paper below. | |
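As background for this week's differencing lectures: line-based diff tools compute a longest common subsequence (LCS) between two sequences, and everything outside the LCS is reported as an insertion or deletion. A minimal dynamic-programming sketch in Java (illustrative only, not taken from the assigned papers; the class and method names are made up):

```java
// Dynamic-programming longest common subsequence (LCS) length, the core
// computation behind line-based differencing. (Names are illustrative;
// real diff tools add many optimizations.)
public class Lcs {
    static int lcsLength(String a, String b) {
        int[][] dp = new int[a.length() + 1][b.length() + 1];
        for (int i = 1; i <= a.length(); i++)
            for (int j = 1; j <= b.length(); j++)
                dp[i][j] = (a.charAt(i - 1) == b.charAt(j - 1))
                        ? dp[i - 1][j - 1] + 1                  // match: extend LCS
                        : Math.max(dp[i - 1][j], dp[i][j - 1]); // else: drop one side
        return dp[a.length()][b.length()];
    }

    public static void main(String[] args) {
        // "BCBA" (or "BDAB") is a longest common subsequence, length 4.
        System.out.println(lcsLength("ABCBDAB", "BDCABA")); // prints 4
    }
}
```

In a diff tool the same recurrence runs over lines instead of characters; lines outside the LCS become the reported additions and deletions.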
Week 3 (1/27, 1/29) |
Program Differencing and Merging: Logical Structural Diff; Merging Conflicts |
(HW#2) Identifying and summarizing systematic code changes via rule inference (Due: Mon); Proactive detection of collaboration conflicts (Due: Wed) |
|
Week 4 (2/3, 2/5) |
Recommendation Systems: Recommending Related Changes; Recommending Relevant Artifacts; Class Activity on Recommendation Systems; Lecture 4: Recommendation Systems Part 1; Lecture 5: Recommendation Systems Part 2; Quiz 1 (Wed) |
(HW#3) Mining version histories to guide software changes (Due: Mon); (HW#4) Hipikat: recommending pertinent software development artifacts (Due: Wed) |
Form Project Groups |
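For a concrete sense of what the Week 4 readings mine, a co-change recommender in the spirit of "Mining version histories to guide software changes" can be sketched by counting how often each file was committed together with the file the developer just edited. This toy Java version is an illustrative assumption on my part (plain support-count ranking, made-up names), not the paper's full association-rule mining:

```java
import java.util.*;

// Toy co-change recommender: given a commit history (each commit = the set
// of files changed together) and the file a developer just edited, rank
// other files by how often they co-changed with it.
public class CoChange {
    static List<String> recommend(List<Set<String>> commits, String changed) {
        Map<String, Integer> support = new HashMap<>();
        for (Set<String> commit : commits)
            if (commit.contains(changed))
                for (String f : commit)
                    if (!f.equals(changed))
                        support.merge(f, 1, Integer::sum); // count co-occurrences
        List<String> ranked = new ArrayList<>(support.keySet());
        ranked.sort((x, y) -> {
            int d = support.get(y) - support.get(x); // higher support first
            return d != 0 ? d : x.compareTo(y);      // break ties alphabetically
        });
        return ranked;
    }

    public static void main(String[] args) {
        List<Set<String>> history = List.of(
                Set.of("A.java", "B.java"),
                Set.of("A.java", "B.java", "C.java"),
                Set.of("A.java", "C.java"),
                Set.of("B.java"));
        System.out.println(recommend(history, "A.java")); // prints [B.java, C.java]
    }
}
```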
Week 5 (2/10, 2/12) |
Refactoring; Lectures 6 and 7: Refactoring |
A Field Study of Refactoring Benefits and Challenges (Due: Mon); LASE: Locating and Applying Systematic Edits by Learning from Examples (Due: Wed) |
Undergraduate Project Topic Selection: Start thinking about the type of tools that you want to survey, such as code review, program differencing, recommendation systems, refactoring, testing, change impact analysis, debugging, repair, comprehension, visualization, and clone detection. Grad Project Topic Selection: Start thinking about what kinds of projects interest you and try to form a group. A few recommended options are listed below.
|
Week 6 (2/17, 2/19) |
Regression Testing (Lecture 8 slides: RTS); Change Impact Analysis (Lecture 9 slides: Chianti) |
(HW#5) Regression test selection for Java software (Due: Mon); (HW#6) Chianti: a tool for change impact analysis of Java programs (Due: Wed) |
Grad Team Status Meeting - Proposal. Please email a one- or two-page summary of the following, along with your team formation.
|
Week 7 (2/24, 2/26) |
Debugging: Applications of the Delta Debugging Algorithm and Automated Repair; Lecture 10 slides: Delta Debugging; Lecture 11 slides: Repair; Quiz 2 (Wed) |
Yesterday, my program worked. Today, it does not. Why? (Due: Mon); (HW#7) A systematic study of automated program repair: fixing 55 out of 105 bugs for $8 each (Due: Wed) |
|
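The delta debugging algorithm covered in Lecture 10 isolates a minimal failure-inducing change set by repeatedly re-running the test on subsets of the changes. Below is a simplified Java sketch of Zeller's ddmin; it tests only complements of chunks (the full algorithm also tests the chunks themselves), and all class and method names are illustrative:

```java
import java.util.*;
import java.util.function.Predicate;

// Simplified ddmin (delta debugging): shrink a failure-inducing set of
// changes to a 1-minimal subset by re-testing complements of ever-smaller
// chunks. `fails` returns true if the test still fails when only the given
// changes are applied.
public class DeltaDebug {
    static <T> List<T> ddmin(List<T> input, Predicate<List<T>> fails) {
        List<T> current = new ArrayList<>(input);
        int n = 2; // granularity: how many chunks to split into
        while (current.size() >= 2) {
            boolean reduced = false;
            for (List<T> chunk : split(current, n)) {
                List<T> complement = new ArrayList<>(current);
                complement.removeAll(chunk);
                if (fails.test(complement)) {   // still fails without this chunk,
                    current = complement;       // so the chunk was irrelevant
                    n = Math.max(n - 1, 2);
                    reduced = true;
                    break;
                }
            }
            if (!reduced) {
                if (n >= current.size()) break;      // singletons tried: 1-minimal
                n = Math.min(n * 2, current.size()); // refine granularity
            }
        }
        return current;
    }

    static <T> List<List<T>> split(List<T> list, int n) {
        List<List<T>> chunks = new ArrayList<>();
        int start = 0;
        for (int i = 0; i < n; i++) {
            int end = start + (list.size() - start) / (n - i);
            chunks.add(new ArrayList<>(list.subList(start, end)));
            start = end;
        }
        return chunks;
    }

    public static void main(String[] args) {
        // Hypothetical scenario: the build breaks only when changes 3 and 6
        // are both applied.
        Predicate<List<Integer>> fails = s -> s.contains(3) && s.contains(6);
        System.out.println(ddmin(Arrays.asList(1, 2, 3, 4, 5, 6, 7, 8), fails)); // prints [3, 6]
    }
}
```

In practice `fails` would apply the selected changes and run the failing test, so each call can be expensive; ddmin's chunking keeps the number of test runs low.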
Week 8 (3/3, 3/5) |
Design Patterns. Wed: We will hand out papers for mock PC meetings. Overview: Design and Software Architecture; HeadFirst-Adapter; HeadFirst-Factory; HeadFirst-Observer; HeadFirst-Strategy; HeadFirst-TemplateMethod |
FactoryMethod, Singleton, Adapter, Bridge, Flyweight, Strategy, Mediator, Observer |
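As a warm-up for the pattern list above, the Observer pattern fits in a few lines: a subject keeps a list of listeners and notifies each one when an event occurs, so the subject never depends on concrete listener types. A minimal Java sketch (the `EventSource`/`Observer` names are illustrative, not from the HeadFirst handouts):

```java
import java.util.*;

// Observer pattern: listeners register with a subject and are called back
// on each event; the subject knows only the Observer interface.
interface Observer { void update(String event); }

class EventSource {
    private final List<Observer> observers = new ArrayList<>();
    void attach(Observer o) { observers.add(o); }
    void publish(String event) {
        for (Observer o : observers) o.update(event); // notify every listener
    }
}

public class ObserverDemo {
    public static void main(String[] args) {
        EventSource source = new EventSource();
        source.attach(e -> System.out.println("logger saw: " + e));
        source.attach(e -> System.out.println("ui saw: " + e));
        source.publish("file-saved"); // both observers react; source stays decoupled
    }
}
```

The design payoff is that new listeners can be added without touching `EventSource`, which is the loose coupling the lecture patterns aim for.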
|
(3/10, 3/12) |
Spring Break Enjoy! |
||
Week 9 (3/17, 3/19) |
Monday: How to create a clickable demo in Web apps; how to create a clickable demo in Swing. Wednesday: Design Patterns. Quiz 3 (Wed) |
Undergrad Project Part A (Due March 17th, beginning of class): In Part A, students will survey a state-of-the-practice SE tool used in industry in a particular topic area of this class, such as code review, program differencing, recommendation systems, testing, software comprehension and visualization, etc. Students will download and install the tool, use it as a user, write a report on its pros and cons, and demonstrate the tool and their assessment to the other students in class. During the week of March 24th and 26th, each team will give a 15 to 20 minute presentation covering the tool demo and their evaluation results. The group should also submit the slide deck for their presentation and live demo. Each group should submit a tool survey report (max 8 pages) to the Blackboard system.
|
|
Week 10 (3/24, 3/26) |
Midpoint Presentation: Tool Survey Demo Presentations |
Grad Project Midpoint (Due March 24th, beginning of class): Each team should submit a written report (max 5 pages) to the Blackboard system. Please submit the report and the slide deck as separate PDF files. Submissions should follow the ACM format. Each team will have a 15 to 20 minute presentation. Your report should be structured like a conference paper, meaning that your report should contain:
|
|
Week 11 (3/31, 4/2) |
Mini PC Meeting I (Mon: 5 Papers); Mini PC Meeting II (Wed: 5 Papers) |
|
|
Week 12 (4/7, 4/9) |
Comprehension and Code Clones: Lecture 12 slides: Clone Detection; Lecture 13 slides: Clone Genealogy |
(HW#8) CCFinder: A multilinguistic token-based code clone detection system for large scale source code (Due: Mon); An Empirical Study of Code Clone Genealogies (Due: Wed) |
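To make the Week 12 reading concrete: token-based clone detectors such as CCFinder normalize identifiers and literals to placeholders so that structurally identical code compares equal. The toy Java sketch below groups lines by their normalized form; it is a drastic simplification of CCFinder's token-sequence matching, and all names are illustrative:

```java
import java.util.*;

// Toy line-level clone finder: normalize identifiers and numbers to
// placeholders, then group line indices whose normalized forms match.
public class CloneSketch {
    static String normalize(String line) {
        // "sum += a[i];" and "total += b[j];" both become "ID+=ID[ID];"
        return line.replaceAll("[A-Za-z_][A-Za-z0-9_]*", "ID")
                   .replaceAll("\\d+", "N")
                   .replaceAll("\\s+", "");
    }

    static Map<String, List<Integer>> findClones(List<String> lines) {
        Map<String, List<Integer>> groups = new HashMap<>();
        for (int i = 0; i < lines.size(); i++)
            groups.computeIfAbsent(normalize(lines.get(i)), k -> new ArrayList<>()).add(i);
        groups.values().removeIf(g -> g.size() < 2); // keep only repeated shapes
        return groups;
    }

    public static void main(String[] args) {
        // Lines 0 and 1 are type-2 clones of each other; "return;" is unique.
        System.out.println(findClones(List.of("x = y + 1;", "a = b + 2;", "return;")));
    }
}
```

Real detectors match arbitrary-length token subsequences (CCFinder uses suffix-tree matching) rather than whole lines, but the normalization step shown here is the key idea that makes renamed copies detectable.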
|
Week 13 (4/14, 4/16) |
Empirical Studies: Threats to Validity; Lecture 14 slides: Code Decay and Threats to Validity; Lecture 15 slides: Program Comprehension; Quiz 4 (Wed) |
(HW#9) Does code decay? Assessing the evidence from change management data (Due: Mon); (HW#10) How do professional developers comprehend software? (Due: Wed) |
|
Week 14 (4/21, 4/23) |
Course Instructor Survey (Monday); Design Patterns: HeadFirst-Proxy (Monday), ClassActivity-ProxyCode; HeadFirst-State (Wednesday), ClassActivity-StateCode; Quiz 4 (Wed) |
Undergraduate Project Part B: Clickable Demo (Due: April 28th). Based on the team's Part A survey, students will propose a clickable prototype that overcomes the limitations of the surveyed tool, then design and implement it. They will then demonstrate the prototype to the other students in the class. Your report must describe
Here is a grading guideline for the final project report (pdf). Each team should submit a written report (max 10 pages), as well as a slide deck, to the Blackboard system. Please submit the report and the slide deck as separate PDF files. You may include an appendix beyond 10 pages, but your paper must be understandable without it. Submissions not in the ACM format will not be reviewed (this is to model program committees for conferences and workshops, which have the option to automatically reject papers if they do not comply with the submission guidelines). Each team will have a 15 to 20 minute presentation. Your report should be structured like a conference paper, meaning that your report should contain:
|
|
Week 15 (4/28, 4/30) |
Final Project Presentation and Demo |
The mini PC meeting emulates an actual program committee meeting,
where program committee members select peer-reviewed research articles
for technical conferences. Each committee member will review the papers
assigned to them and submit a critical assessment of each assigned
paper, either from the perspective of an advocate or from that of a
skeptic. In this exercise, each graduate student will review
4 assigned papers and each undergraduate student will
read 1 paper. Every student must submit a review
report for each assigned paper.
A review template is provided below; you will not only write a review
but also provide scores to indicate your assessment of the paper's
originality, comprehensiveness, and new research contributions. Only
after you submit your reviews will you be able to see the other committee
members' reviews. On the days of the PC meetings, Dr. Kim will take the role
of PC chair, and we will have a roundtable PC meeting where each paper
is discussed by the committee members who reviewed it. To emulate
a blind peer review process, the titles and author names are masked on
purpose. To provide some degree of anonymity, we use only initials
for the reviewer assignment.