Tianyi Zhang

Ph.D. Candidate

Department of Computer Science

University of California, Los Angeles

Office: Engineering VI, Room 486

E-mail: tianyi.zhang@cs.ucla.edu

I am a Ph.D. candidate in Computer Science at the University of California, Los Angeles, advised by Professor Miryung Kim. Previously, I was a Longhorn at The University of Texas at Austin. I received my bachelor's degree in Information Security from Huazhong University of Science and Technology (HUST).


My research interests lie primarily in software engineering, with a particular focus on code reuse and software evolution. These interests stem from the strong belief that software can evolve efficiently with guidance from both automated tools and human expertise.

The goal of my research is to give software a self-learning capability by leveraging the inherent similarity and repetitiveness in large software systems, much as machine learning leverages the repetitiveness in massive data to guide complex tasks. My work on interactive code review summarizes similar program edits and detects anomalous edits when updating similar code fragments, so-called clones. My work on automated test reuse and differential testing transplants tests between similar code fragments and examines their behavioral consistency via fine-grained differential testing. I am now working on techniques for visualizing and assessing code examples on GitHub and Stack Overflow.

News


  • [Dec. 2018] Our paper about common adaptation patterns of online code examples was accepted to ICSE 2019!
  • [Dec. 2018] Our paper about interactive code search via active learning was accepted to ICSE 2019. Congratulations to Aish!
  • [Nov. 2018] I have released a command-line API misuse detector based on common API usage patterns mined from 380K Java projects on GitHub. The tool is now available on the ExampleCheck website.
  • [Jul. 2018] Our demo paper on detecting API usage violations in Stack Overflow was accepted to FSE 2018 Demonstrations Track.
  • [Jul. 2018] I will serve on the Artifact Evaluation Committee of ICSE 2019.
  • [Jun. 2018] Both the dataset and the tool from our API misuse study of Stack Overflow are publicly available.
  • [Jun. 2018] Presented "Are Code Examples on an Online Q&A Forum Reliable? A Study of API Misuse on Stack Overflow" at ICSE 2018.
  • [Apr. 2018] Co-presented "Visualizing API Usage Examples at Scale" with Elena Glassman at CHI 2018.
  • [Apr. 2018] Examplore, an interactive system for visualizing and exploring hundreds of API usage examples, is now publicly available!
  • [Mar. 2018] Our poster about automated transplantation and differential testing for code clones was accepted to ICSE 2018!
  • [Dec. 2017] Our paper on visualizing API usage examples at scale was accepted to CHI 2018!
  • [Dec. 2017] Our paper on the reliability of Stack Overflow examples was accepted to ICSE 2018!
  • [Dec. 2017] Critics, an interactive code review technique for searching similar program edits, is now open-sourced!
  • [Dec. 2017] We have completed the tech transfer of Critics to Huawei.
  • [Jul. 2017] I built a command-line tool, BibMerge, to remove duplicates in .bib files and update the corresponding references in .tex files. Feel free to grab it if you also have trouble merging bib files.
  • [Jul. 2017] I received the 2017-2018 UCLA Dissertation Year Fellowship.
  • [Apr. 2017] I received the 2017-2018 Google Outstanding Graduate Student Research Award.
  • [Jan. 2017] Our test reuse tool and dataset are now publicly available here.
  • [Jan. 2017] Our work on test reuse was presented at the Dagstuhl Seminar!
  • [Dec. 2016] Our paper on test reuse and differential testing was accepted to ICSE 2017!
  • [Sept. 2016] I passed the Oral Qualifying Exam (OQE) and have now advanced to candidacy!

Publications


Analyzing and Supporting Adaptation of Online Code Examples (Acceptance Rate: 20.6%)
Tianyi Zhang, Di Yang, Cristina Lopes, Miryung Kim [PDF]
Active Inductive Logic Programming for Code Search (Acceptance Rate: 20.6%)
Aishwarya Sivaraman, Tianyi Zhang, Guy Van den Broeck, Miryung Kim [PDF]
Are Code Examples on an Online Q&A Forum Reliable? A Study of API Misuse on Stack Overflow (Acceptance Rate: 20.9%)
Tianyi Zhang, Ganesha Upadhyaya, Anastasia Reinhardt, Hridesh Rajan, Miryung Kim [PDF][Dataset and Tool]
Visualizing API Usage Examples at Scale (Acceptance Rate: 25.7%)
Elena L. Glassman*, Tianyi Zhang*, Björn Hartmann, Miryung Kim [PDF][Tool]
* The two lead authors contributed equally to this work as part of an equal collaboration between both institutions.
Augmenting Stack Overflow with API Usage Patterns Mined from GitHub
Anastasia Reinhardt, Tianyi Zhang, Mihir Mathur, Miryung Kim [PDF][Demo][Tool]
Poster: Grafter: Transplantation and Differential Testing for Clones
Tianyi Zhang, Miryung Kim [Abstract][Poster]
Automated Transplantation and Differential Testing for Clones (Acceptance Rate: 16.4%)
Tianyi Zhang, Miryung Kim [PDF][Demo][Tool]
Interactive Code Review for Systematic Changes (Acceptance Rate: 18.5%)
Tianyi Zhang, Myoungkyu Song, Joseph Pinedo, Miryung Kim [PDF][Source Code]
Critics: An Interactive Code Review Tool for Searching and Inspecting Systematic Changes
Tianyi Zhang, Myoungkyu Song, Miryung Kim [PDF][Demo]

Service



  • Program Committee: ICSE 2019 AEC
  • Journal Reviewer: TSE, EMSE
  • Student Volunteer: ICSE 2016

Advice/Useful Links

Students to Conference by David Notkin

Patterns for writing good rebuttals by Andreas Zeller

7 Tips for Attending a Conference Alone (And Having a Good Time) by Yuanyuan Zhou

Things I Keep Repeating About Writing by Claire Le Goues