
EE382V Software Evolution (Graduate) and EE 379K Software Evolution (Advanced Elective), Spring 2014


Instructor: Dr. Miryung Kim (ACES 5.118)
    Office Hours: Monday and Wednesday, 2:00 PM to 3:00 PM
   
Lectures: Monday and Wednesday, 3:00 PM to 4:30 PM, ENS 126

General Description

Software evolution plays an ever-increasing role in software development. Programmers rarely build software from scratch; instead, they spend much of their time modifying existing software to provide new features to customers and to fix defects. Evolving software systems is often a time-consuming and error-prone process. This course focuses on understanding the fundamentals of state-of-the-art methods, tools, and techniques for evolving software, based on the current software engineering research literature. This is a reading-based, discussion-oriented class. Students are expected to read the assigned materials in advance and to actively participate in class discussions.

Intended Benefits for Students

By understanding the fundamentals behind automated software engineering analysis algorithms, techniques, and tools, students will develop a keen eye for how to systematically design, analyze, and extend large software systems. This will prepare students to become principal software architects and development leads in the future.

Audience and Prerequisites

This class introduces students to current research topics in software engineering, with a focus on software evolution. Undergraduate-level knowledge of data structures and object-oriented programming languages is required. Knowledge of compilers, program analysis, and program representations is encouraged. If you are unsure of your qualifications, please contact the instructor, who will be happy to help you decide if this course is right for you. You are welcome to just sit in for a few days and see how this class feels.

Grading for 379K (undergraduate)

Reading Assignments (25%, 8*3% each)
Quiz (15%)
Class Participation (10%)
Class Project
   Part A: Tool Survey Presentation and Demo (20%)
   Part B: Clickable Demo Presentation (30%)

Grading for 382V (graduate)

Reading Assignments (25%, 8*3% each)
Mini PC Reviews & Meetings (20%)
Project
   Part A: Midpoint Presentation (20%)
   Part B: Final Presentation, Demo, and Report (30%)
Peer Evaluation and Class Participation (6%)

Reading Assignment

Instructions. Please download the assigned paper from the ACM Digital Library. Access to the ACM Digital Library is free if you are using a computer on campus with a valid UT Austin IP address. You are welcome to meet in small groups to discuss papers, but each student must submit his or her own review. You can follow this format if you like. You will write four short paragraphs addressing the following points. Long reviews are not necessarily good reviews; please limit your review to 500 words at most.
  •     Stated goals and solution. What problem are the authors trying to solve? What are the bounds on this problem, i.e., what are they not trying to solve? What techniques or tools do the authors offer to solve the problem at hand? How do the authors know they have solved the problem? Do the authors test or validate their approach experimentally? Does the solution meet the stated goals, or does it fall short in some way? Avoid simply quoting the authors’ own abstract. Restating in your own words demonstrates your understanding.
  •     Cool or significant ideas. What is new here? What are the main contributions of the paper? What did you find most interesting? Is this whole paper just a one-off clever trick or are there fundamental ideas here which could be reused in other contexts?
  •     Fallacies and blind spots. Did the authors make any assumptions or disregard any issues that make their approach less appealing? Are there any theoretical problems, practical difficulties, implementation complexities, overlooked influences of evolving technology, and so on? Do you expect the technique to be more or less useful in the future? What kind of code or situation would defeat this approach, and are those programs or scenarios important in practice? Note: we are not interested in flaws in presentation, such as trivial examples, confusing notation, or spelling errors. However, if you have a great idea on how some concept could be presented or formalized better, mention it.
  •     New ideas and connections to other work. How could the paper be extended? How could some of the flaws of the paper be corrected or avoided? Also, how does this paper relate to others we have read, or even any other research you are familiar with? Are there similarities between this approach and other work, or differences that highlight important facets of both?
Please take the time to edit your reviews. Unclear or unnecessarily long prose will be graded accordingly.

Please submit your reading assignment using Blackboard before the due date. Out of 10 assignments, we will grade the best 8. In other words, you can drop 2 reading assignments during the semester; therefore, no late submissions are allowed. Some papers are listed in the schedule as optional readings without the (HW#) prefix.

Class Discussion: Think-Pair-Share

How Does It Work?
1) Think. The teacher provokes students' thinking with a question or prompt or observation. The students should take a few moments (probably not minutes) just to THINK about the question.

2) Pair. Using designated partners (such as with Clock Buddies), nearby neighbors, or a deskmate, students PAIR up to talk about the answer each came up with. They compare their mental or written notes and identify the answers they think are best, most convincing, or most unique.

3) Share. After students talk in pairs for a few moments (again, usually not minutes), the teacher calls for pairs to SHARE their thinking with the rest of the class. She can do this by going around in round-robin fashion, calling on each pair, or she can take answers as they are called out (or as hands are raised). Often, the teacher or a designated helper will record these responses on the board or on the overhead.

Class Schedule, Reading List, and Project Milestones 


Lectures | Reading | Project
Week 1
(1/13, 1/15)

Introduction to Software Evolution
Syllabus 
Background Survey
Graduate Project Ideas
Lecture 1 Introduction
Lecture 2 Silver Bullet


(HW#1) No Silver Bullet: Essence and Accidents of Software Engineering
(Due: Wed)

Week 2
(1/22)
Program Differencing and Merging
The Longest Common Subsequence Algorithm
Abstract syntactic tree based program differencing
Control Flow Graph based Program Differencing
Lecture 3 Program Differencing
A differencing algorithm for object-oriented programs: if you have already read this JDiff paper, you can submit a review of JDiff instead of the LSdiff paper below.
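The longest common subsequence (LCS) algorithm listed above is the core of line-based diff tools. As a minimal illustration only (a sketch, not taken from the lecture materials), the following Java program computes the LCS length of two line sequences with the standard dynamic-programming recurrence; the lines that fall outside the LCS are exactly the ones a diff would report as added or deleted.

import java.util.*;

public class LcsDiff {
    // Length of the longest common subsequence of two line sequences.
    static int lcsLength(List<String> a, List<String> b) {
        int[][] dp = new int[a.size() + 1][b.size() + 1];
        for (int i = 1; i <= a.size(); i++) {
            for (int j = 1; j <= b.size(); j++) {
                dp[i][j] = a.get(i - 1).equals(b.get(j - 1))
                        ? dp[i - 1][j - 1] + 1
                        : Math.max(dp[i - 1][j], dp[i][j - 1]);
            }
        }
        return dp[a.size()][b.size()];
    }

    public static void main(String[] args) {
        List<String> oldVersion = Arrays.asList("int x = 0;", "x++;", "return x;");
        List<String> newVersion = Arrays.asList("int x = 0;", "x += 2;", "return x;");
        // Lines not in the LCS are the ones a diff reports as deleted or added.
        System.out.println("Common lines: " + lcsLength(oldVersion, newVersion));
    }
}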

Week 3
(1/27, 1/29)
Program Differencing and Merging
Logical Structural Diff
Merging Conflicts

(HW#2) Identifying and summarizing systematic code changes via rule inference (Due: Mon)
Proactive detection of collaboration conflicts (Wed)

Week 4
(2/3, 2/5)
Recommendation Systems
Recommending Related Changes
Recommending Relevant Artifacts
Class Activity on Recommendation Systems 
Lecture 4 Recommendation Systems Part 1
Lecture 5 Recommendation Systems Part 2
Quiz 1 (Wed)
(HW#3) Mining version histories to guide software changes (Due: Mon)
(HW#4) Hipikat: recommending pertinent software development artifacts (Due: Wed)
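The HW#3 paper above mines version histories for files that frequently changed together and recommends them when one of those files is edited again. The following toy Java sketch (hypothetical data, not the tool from the paper) shows the underlying co-change counting and confidence computation.

import java.util.*;

public class CoChangeMiner {
    public static void main(String[] args) {
        // Each inner list is the set of files touched by one past commit (made-up data).
        List<List<String>> commits = Arrays.asList(
                Arrays.asList("Parser.java", "Lexer.java"),
                Arrays.asList("Parser.java", "Lexer.java", "Token.java"),
                Arrays.asList("Parser.java", "Printer.java"),
                Arrays.asList("Lexer.java", "Token.java"));

        String changed = "Parser.java";           // the file the developer just edited
        Map<String, Integer> coChange = new HashMap<>();
        int support = 0;                          // number of past commits touching 'changed'

        for (List<String> commit : commits) {
            if (!commit.contains(changed)) continue;
            support++;
            for (String f : commit) {
                if (!f.equals(changed)) coChange.merge(f, 1, Integer::sum);
            }
        }
        // confidence(changed -> f) = co-change count / support of 'changed'
        for (Map.Entry<String, Integer> e : coChange.entrySet()) {
            System.out.printf("%s -> %s  confidence %.2f%n",
                    changed, e.getKey(), (double) e.getValue() / support);
        }
    }
}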
Form Project Groups 
Week 5 (2/10, 2/12)
Refactoring
Lectures 6 and 7 Refactoring
A Field Study of Refactoring Benefits and Challenges  (Mon)
LASE: Locating and Applying Systematic Edits by Learning from Examples (Wed)
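This week's topic is refactoring: behavior-preserving program transformations. As a toy before/after illustration (hypothetical code, not drawn from the assigned papers), the Extract Method refactoring below pulls a computation out of a printing method into a named, reusable method while keeping the program's behavior the same.

public class Receipt {
    double price = 10.0;
    double taxRate = 0.0825;

    // Before: the total computation is tangled inside the printing method.
    void printBefore() {
        double total = price + price * taxRate;
        System.out.println("Total: " + total);
    }

    // After Extract Method: the computation has a name and can be reused and tested.
    void printAfter() {
        System.out.println("Total: " + total());
    }

    double total() {
        return price + price * taxRate;
    }

    public static void main(String[] args) {
        Receipt r = new Receipt();
        r.printBefore();   // prints the same total before and after: behavior is preserved
        r.printAfter();
    }
}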

Undergraduate Project Topic Selection
Start thinking about the types of tools that you want to survey, such as tools for code review, program differencing, recommendation systems, refactoring, testing, change impact analysis, debugging, repair, comprehension, visualization, and clone detection.
 
Grad Project Topic Selection

Start thinking about what kinds of projects interest you and try to form a group. Here are a few recommended options below.
Week 6
(2/17, 2/19)
Regression Testing
Lecture 8 slides: RTS
Change Impact Analysis
Lecture 9 slides: Chianti
(HW#5) Regression test selection for Java software (Due: Mon)
(HW#6) Chianti: a tool for change impact analysis of Java programs (Due: Wed)
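Both readings this week reason about which tests or program entities are affected by a change. As a deliberately simplified Java sketch (not the algorithm from either paper), the idea behind regression test selection can be illustrated with a hypothetical coverage map: rerun only the tests whose covered methods intersect the set of changed methods.

import java.util.*;

public class SelectTests {
    public static void main(String[] args) {
        // Hypothetical map from each test to the methods it covered in the old version.
        Map<String, Set<String>> coverage = new LinkedHashMap<>();
        coverage.put("testParse",  new HashSet<>(Arrays.asList("Parser.parse", "Lexer.next")));
        coverage.put("testPrint",  new HashSet<>(Arrays.asList("Printer.print")));
        coverage.put("testTokens", new HashSet<>(Arrays.asList("Lexer.next", "Token.kind")));

        // Methods whose bodies changed in the new version (e.g., reported by a diff tool).
        Set<String> changed = new HashSet<>(Arrays.asList("Lexer.next"));

        for (Map.Entry<String, Set<String>> e : coverage.entrySet()) {
            boolean affected = !Collections.disjoint(e.getValue(), changed);
            System.out.println(e.getKey() + (affected ? "  -> rerun" : "  -> can be skipped"));
        }
    }
}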
Grad Team Status Meeting: Proposal.
Please email a one- or two-page summary covering the following points, along with your team formation.
  • Problem definition and motivation. What are the goals of your project and why are these goals important?
  • Your approach for addressing the problem that you defined above.
  • How you plan to evaluate your approach.
  • A more in-depth discussion of related work. If appropriate, also explain how your approach is different from existing approaches.
  • A list of milestones and dates.
Week 7
(2/24, 2/26)
Debugging
Applications of the Delta Debugging Algorithm and Automated Repair
Lecture 10 Slides: Delta Debugging
Lecture 11 Slides: Repair
Quiz 2 (Wed)
Yesterday, my program worked. Today, it does not. Why? (Mon)
(HW#7) A systematic study of automated program repair: fixing 55 out of 105 bugs for $8 each (Due: Wed)
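The Monday paper introduces delta debugging, which minimizes a failure-inducing input or change set by repeatedly testing smaller candidates. Below is a simplified Java sketch of that idea; it only tests complements, so it is not the full ddmin algorithm from the paper, and the 'fails' predicate is a stand-in for actually rerunning the program on a candidate input.

import java.util.*;
import java.util.function.Predicate;

public class DeltaDebugSketch {
    // 'fails' returns true when the given input still triggers the bug.
    static List<Integer> minimize(List<Integer> input, Predicate<List<Integer>> fails) {
        int chunks = 2;
        while (input.size() >= 2) {
            int size = (int) Math.ceil(input.size() / (double) chunks);
            boolean reduced = false;
            for (int start = 0; start < input.size(); start += size) {
                // Candidate = current input with one chunk removed (the "complement").
                List<Integer> candidate = new ArrayList<>(input);
                candidate.subList(start, Math.min(start + size, input.size())).clear();
                if (!candidate.isEmpty() && fails.test(candidate)) {
                    input = candidate;                  // keep the smaller failing input
                    chunks = Math.max(chunks - 1, 2);
                    reduced = true;
                    break;
                }
            }
            if (!reduced) {
                if (chunks >= input.size()) break;      // cannot split any finer
                chunks = Math.min(chunks * 2, input.size());
            }
        }
        return input;
    }

    public static void main(String[] args) {
        // Toy failure condition: the program "fails" whenever 7 is part of the input.
        List<Integer> input = new ArrayList<>(Arrays.asList(1, 2, 3, 4, 5, 6, 7, 8));
        System.out.println(minimize(input, in -> in.contains(7)));   // prints [7]
    }
}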

Week 8
(3/3, 3/5)
Design Patterns

Wed: We will hand out papers for the mini PC meetings.
Overview-Design and Software Architecture
HeadFirst-Adapter
HeadFirst-Factory
HeadFirst-Observer
HeadFirst-Strategy
HeadFirst-TemplateMethod

FactoryMethod, Singleton, Adapter, Bridge, Flyweight, Strategy, Mediator, Observer
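As a bare-bones illustration of one of the patterns listed above (hypothetical code, not the HeadFirst example), the Observer pattern lets a subject notify any number of registered observers whenever its state changes, without the subject knowing anything about what the observers do.

import java.util.ArrayList;
import java.util.List;

interface Observer {
    void update(int newValue);
}

class Sensor {                                   // the subject being observed
    private final List<Observer> observers = new ArrayList<>();
    private int value;

    void addObserver(Observer o) { observers.add(o); }

    void setValue(int v) {
        value = v;
        for (Observer o : observers) o.update(value);   // push the change to every observer
    }
}

public class ObserverDemo {
    public static void main(String[] args) {
        Sensor sensor = new Sensor();
        sensor.addObserver(v -> System.out.println("Display shows: " + v));
        sensor.addObserver(v -> { if (v > 100) System.out.println("Alarm!"); });
        sensor.setValue(42);     // only the display reacts
        sensor.setValue(120);    // the display updates and the alarm fires
    }
}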

(3/10, 3/12)

Spring Break. Enjoy!

Week 9
(3/17, 3/19)


Monday: How to create a clickable demo in Web apps.
How to create a clickable demo in Swing

Wednesday: Design Patterns
Quiz 3 (Wed)


Undergrad Project Part A (Due March 17th, beginning of class): In project part (A), students will survey a state-of-the-practice SE tool used in industry in a particular topic area of this class, such as code reviews, program differencing, recommendation systems, testing, or software comprehension and visualization. Students will download and install the tool, use it as a user would, write a report on the pros and cons of the tool, and demonstrate the tool and their assessment in class to other students. During the week of March 24th and 26th, each team will give a 15 to 20 minute presentation on the tool demo and their evaluation results. The group should also submit a slide deck for their presentation and live demo.

Each group should submit a tool survey report (max 8 pages) to the Blackboard system:
  • Background
    • What is the goal of the tool? Who can use the tool for doing what, when, and how? Describe a scenario of how the tool can be used in practice.
  • Evaluation plan
    • The objectives of the evaluation, the set of experiments or studies you designed to evaluate the tool and why, and the set of subject programs (data sets) you used and why.
  • Evaluation results
    • A comparison between what you expected the tool to do and what the tool actually did.
    • A discussion of the tool's strengths, with rationales for why it worked well on certain cases.
    • A discussion of the tool's weaknesses, with rationales for why it did not work well on certain cases.
  • Future innovation / research directions
Week 10
(3/24, 3/26)
Midpoint Presentation
Tool Survey Demo Presentations

Grad Project Midpoint (Due March 24th, beginning of the class)
Each team should submit a written report (max 5 pages) to the Blackboard system. Please submit the report and the slide deck as separate PDF files. Submissions should follow the ACM format. Each team will have a 15 to 20 minute presentation.

Your report should be structured like a conference paper, meaning that your report should contain:
  • Abstract
  • A well-motivated introduction
  • Related work with proper citations 
  • Description of your methodology
  • Preliminary results
You must demonstrate your project output to the class. If you are doing a project that involves implementation, please submit your source code by sharing an Assembla repository. This repository must include test cases and a manual in addition to source code.
Week 11
(3/31, 4/2)
Mini PC Meeting I (Mon: 5 Papers)
Mini PC Meeting II (Wed: 5 Papers)



 
Week 12
(4/7, 4/9)
Comprehension and Code Clone
Lecture 12 slides: Clone Detection
Lecture 13 slides: Clone Genealogy
(HW#8) CCFinder: A multilinguistic token-based code clone detection system for large scale source code (Due: Mon)
An Empirical Study of Code Clone Genealogies (Wed)
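CCFinder (HW#8) detects clones by turning source code into a token sequence and matching common subsequences. The following toy Java sketch (not CCFinder's algorithm) captures the basic token-window idea: record every window of k consecutive tokens and report windows that occur more than once as clone candidates.

import java.util.*;

public class CloneSketch {
    public static void main(String[] args) {
        // Whitespace-separated tokens of a tiny, made-up code fragment.
        String source = "a = b + c ; d = b + c ; e = b + c ;";
        String[] tokens = source.split("\\s+");
        int k = 4;                                      // window size in tokens

        // Map each k-token sequence to the starting positions where it occurs.
        Map<String, List<Integer>> windows = new LinkedHashMap<>();
        for (int i = 0; i + k <= tokens.length; i++) {
            String key = String.join(" ", Arrays.copyOfRange(tokens, i, i + k));
            windows.computeIfAbsent(key, x -> new ArrayList<>()).add(i);
        }
        // Any token sequence occurring more than once is a clone candidate.
        for (Map.Entry<String, List<Integer>> e : windows.entrySet()) {
            if (e.getValue().size() > 1) {
                System.out.println("Clone \"" + e.getKey() + "\" at token positions " + e.getValue());
            }
        }
    }
}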

Week 13
(4/14, 4/16)
Empirical Studies
Threats to Validity
Lecture 14 slides: Code Decay and Threats to Validity
Lecture 15 slides: Program Comprehension
Quiz 4 (Wed)
(HW#9) Does code decay? Assessing the evidence from change management data (Due: Mon)
(HW#10) How do professional developers comprehend software? (Due: Wed)

Week 14
(4/21, 4/23)
Course Instructor Survey (Monday)
Design Patterns
HeadFirst-Proxy (Monday) 
ClassActivity-ProxyCode
HeadFirst-State (Wednesday)
ClassActivity-StateCode
Quiz 5 (Wed)
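As a bare-bones illustration of the State pattern covered on Wednesday (hypothetical code, not the HeadFirst or class-activity material), an object delegates its behavior to a state object and swaps that object as its state changes; each state decides both what to do and which state comes next.

interface TurnstileState {
    TurnstileState insertCoin();
    TurnstileState push();
}

class Locked implements TurnstileState {
    public TurnstileState insertCoin() { System.out.println("unlocked"); return new Unlocked(); }
    public TurnstileState push()       { System.out.println("still locked"); return this; }
}

class Unlocked implements TurnstileState {
    public TurnstileState insertCoin() { System.out.println("already unlocked"); return this; }
    public TurnstileState push()       { System.out.println("pass through, locking again"); return new Locked(); }
}

public class StateDemo {
    public static void main(String[] args) {
        TurnstileState state = new Locked();
        state = state.insertCoin();   // locked -> unlocked
        state = state.push();         // unlocked -> locked again
        state = state.push();         // still locked
    }
}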

Undergraduate Project Part B: Clickable Demo (Due: April 28th)
Based on the team's Part A survey, students will propose a clickable prototype that overcomes the limitations of the surveyed tool, then design and implement the clickable prototype. They will then demonstrate the prototype to other students in the class. Your report must include:
  • A well-motivated introduction
  • Description of your methodology and functionality
  • User interfaces / screen snapshots
  • Evaluation data set and how the prototype works on the data set
Grad Project Final Presentation and Demo (Due: April 28th) 
Here is a grading guideline for a final project report (pdf).
Each team should submit a written report (max 10 pages), as well as a slide deck, to the blackboard system. Please submit the report and the slide deck as separate PDF files. You may include an appendix beyond 10 pages, but your paper must be understandable without it. Submissions not in the ACM format will not be reviewed (this is to model program committees for conferences and workshops, which have the option to automatically reject papers if they do not comply with the submission guidelines). Each team will have a 15 to 20 minute presentation.

Your report should be structured like a conference paper, meaning that your report should contain:
  • Abstract
  • A well-motivated introduction
  • Related work with proper citations 
  • Description of your methodology
  • Evaluation results 
  • Discussion of your approach, threats to validity, and additional experiments
  • Conclusions and future work 
  • Appendix: Describe how to run and test your implementation (See below).
You must demonstrate your project output to the class. If you are doing a project that involves implementation, please submit your source code by sharing an Assembla repository. This repository must include test cases and a manual in addition to source code. Your manual must describe how to run and test your code.
Week 15
(4/28, 4/30)


Final Project Presentation and Demo


Homework, project report, and presentation grading scheme

Projects

Undergraduate students who are registered for 379K will carry out two mini projects. In project part (A), students will survey a state-of-the-practice SE tool used in industry in a particular topic area of this class (such as code reviews, program differencing, recommendation systems, testing, or software comprehension and visualization). Students will download and install the tool, use it as a user would, write a report on the pros and cons of the tool, and demonstrate the tool and their assessment in class to other students. In project part (B), students will then propose a clickable prototype that overcomes the limitations of the existing tools in their team's survey area, and design and implement the clickable prototype. They will then demonstrate the prototype to other students in the class. This class project will be done in teams of 5 to 6 students.

Part A: Tool Survey Project Details

Part B: Grading Rubric for Clickable Demo

Final Report: Grading Rubric for Graduate Student Research Projects

Swing tutorial for creating a clickable demo prototype.
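For teams new to Swing, the following toy sketch (not the linked tutorial) shows what a minimal clickable prototype can look like: a button is wired to a placeholder action so that reviewers can click through the envisioned workflow before any real analysis is implemented.

import javax.swing.*;
import java.awt.BorderLayout;

public class ClickableDemo {
    public static void main(String[] args) {
        SwingUtilities.invokeLater(() -> {
            JFrame frame = new JFrame("Tool Prototype");
            JTextArea output = new JTextArea(8, 40);
            JButton analyze = new JButton("Analyze Changes");
            // Placeholder behavior: show canned results instead of running a real analysis.
            analyze.addActionListener(e -> output.setText("3 systematic edits detected (mock data)"));

            frame.setLayout(new BorderLayout());
            frame.add(analyze, BorderLayout.NORTH);
            frame.add(new JScrollPane(output), BorderLayout.CENTER);
            frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
            frame.pack();
            frame.setVisible(true);
        });
    }
}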

Graduate students who are registered for 382V will carry out a semester-long team research project to design, implement, and evaluate a novel software engineering tool to improve developer productivity. Students will demonstrate the course project at the end of the class. Students may propose their own idea or choose from the suggested topics (see the Graduate Project Ideas link above).

Mini Program Committee Meetings Instruction

The mini PC meeting emulates an actual program committee meeting, in which program committee members select peer-reviewed research articles for technical conferences. Each committee member will review the papers assigned to them and submit a critical assessment of each assigned paper, either from the perspective of an advocate or from the perspective of a skeptic. In this exercise, each graduate student will review 4 assigned papers and each undergraduate student will read 1 paper. Every student must submit a review report for the papers assigned.
A review template is provided below; you will not only write a review but also provide scores to indicate your assessment of the paper's originality, comprehensiveness, and new research contributions. Only after you submit your reviews will you be able to see other committee members' reviews. On the days of the PC meetings, Dr. Kim will take the role of PC chair, and we will have a roundtable PC meeting in which each paper is discussed by the committee members who reviewed it. To emulate a blind peer-review process, the titles and author names are masked on purpose. To provide some degree of anonymity, only initials are used in the reviewer assignment.

Mini PC Meeting Instruction and Details

Feedback Statement

During this course, I will be asking you to give me feedback on your learning in both informal and formal ways, including an anonymous midpoint survey about how my teaching strategies are helping or hindering your learning. It is very important for me to know your reaction to what we are doing in class, so I encourage you to respond to these surveys so that we can create an environment that is effective for teaching and learning. Occasionally, at the end of a lecture, I will hand out index cards and ask "What is the most important thing you have learned in this class session?" and "What questions do you still have?" This feedback will be anonymous; it is meant to check your understanding and promote Q&A.

Students with Disabilities

The University of Texas at Austin provides upon request appropriate academic accommodations for qualified students with disabilities. For more information, contact the Office of the Dean of Students at 471-6259, 471-4641 TTY or the College of Engineering Director of Students with Disabilities at 471-4382.

Class Policy

  • There are two exemptions for missed reading assignments. In other words, I will use the best 8 scores out of 10 for the final grade calculation. Therefore, there is no credit for late submissions of reading assignments. If you have extremely exceptional circumstances, please contact me before the penalty is incurred.
  • You can use your laptop to take notes (no smartphones, please). If you plan to use your laptop during the lectures, please send me an email at the beginning of the semester.
  • I will do my best to return graded quizzes and assignments within one week to provide timely feedback to you.
  • Class announcements will be made through Piazza. Not every class announcement will be made available via electronic means.
  • Please use Assembla for version control and project management. Please provide a read permission to us, so that we can access and grade your project. 
  • Quizzes start at the beginning of class.
  • Review questions are to help you review concepts and prepare for the quizzes and exams. If you do not know answers to the questions, I'd be happy to go through them with you during my office hours.