Verification and Validation course offered at Kansas State University in Fall'15, '16, and '17

Readings in Verification and Validation (CIS841) Course

These are the contents of the Verification and Validation (CIS841) course offered at Kansas State University in Fall'15-'17. This is an upper-level graduate course. It requires students to have taken graduate-level courses in software engineering and software specification and to have some experience developing software.


At the beginning of the semester, a subset of papers from the following list is selected for discussion. We consider the interests and strengths of the students enrolled in the course while selecting the papers. For example, if most of the students are employed in industry, then we pick more industry papers (i.e., tools, tech transfer, studies). Each student is then assigned a set of papers from the selection.

In each class, one student presents an assigned paper while the rest of the class participates in discussing it. The presenter is expected to know the entire content of the paper and to try to answer any questions raised during the discussion. At times, the presenter may not understand some parts of the paper. In such situations, the presenter can either consult the instructor about those parts before presenting the paper, or state their understanding in class and seek clarification from the instructor while presenting. The instructor will steer the in-class discussion (e.g., by asking questions or requesting clarifications about specific parts of the paper) and clarify any doubts about the paper that were left unanswered by the presenter.

When there are more than a few students in a class, students are required to summarize the paper before the class, update the summary after discussing the paper, and then submit the updated summary to the instructor for evaluation.

Discussed Papers

  1. A Decade of Software Model Checking with SLAM. Thomas Ball, Vladimir Levin, and Sriram Rajamani. CACM'11. Model Checking
  2. A Randomized Scheduler with Probabilistic Guarantees of Finding Bugs. Sebastian Burckhardt, Pravesh Kothari, Madanlal Musuvathi, and Santosh Nagarakatte. ASPLOS'10. Bug Finding
  3. A Survey of Symbolic Execution Techniques. Roberto Baldoni, Emilio Coppa, Daniele Cono D'Elia, Camil Demetrescu, and Irene Finocchi. Arxiv'16. Symbolic Execution
  4. Achievements, open problems and challenges for search based software testing. Mark Harman, Yue Jia and Yuanyuan Zhang. ICST'15. Testing
  5. Addressing Dynamic Issues of Program Model Checking. Flavio Lerda, and Willem Visser. SPIN'01. Model Checking
  6. An Analysis and Survey of the Development of Mutation Testing. Yue Jia and Mark Harman. TSE'11. Testing
  7. An introduction to Property-based Testing. Testing
  8. Astree: From Research to Industry. David Delmas and Jean Souyris. SAS'07. Static Analysis
  9. Automated Whitebox Fuzz Testing. Patrice Godefroid, Michael Levin, and David Molnar. NDSS'08. Fuzzing, Testing
  10. Bandera: a source-level interface for model checking Java programs. James C. Corbett, Matthew B. Dwyer, John Hatcliff, and Robby. ICSE'00. Model Checking, Tooling
  11. Basic Concepts and Taxonomy of Dependable and Secure Computing. Algirdas Avizienis and Brian Randell. IEEE Transactions on Dependable and Secure Computing 2004. Concepts
  12. Boeing Flies on 99% Ada.
  13. CUTE: a concolic unit testing engine for C. Koushik Sen, Darko Marinov, and Gul Agha. FSE'05. Symbolic Execution, Testing
  14. Cadena: An Integrated Development, Analysis, and Verification Environment for Component-based Systems. John Hatcliff, William Deng, Matthew B. Dwyer, Georg Jung, and Venkatesh Prasad Ranganath. ICSE'03. MDD, Tooling
  15. Choosing Properties for Property-based Testing. Testing
  16. Compatibility Testing via Patterns-based Trace Comparison. Venkatesh-Prasad Ranganath, Pradip Vallathol, and Pankaj Gupta. ASE'14. Tech Transfer, Tooling
  17. DART: Directed Automated Random Testing. Patrice Godefroid, Nils Klarlund, and Koushik Sen. PLDI'05. Symbolic Execution, Testing
  18. DARWIN: An Approach for Debugging Evolving Programs. Dawei Qi, Abhik Roychoudhury, Zhenkai Liang, and Kapil Vaswani. FSE'09. Testing
  19. Differential Testing for Software. William M. McKeeman. DTJ'98. Testing
  20. Enabling Efficient Partial Order Reductions for Model Checking Object-Oriented Programs Using Static Calculation of Program Dependences. Venkatesh Prasad Ranganath, John Hatcliff, and Robby. Santos-TR2007-2. Model Checking, POR
  21. Evaluating Static Analysis Defect Warnings On Production Software. Nathaniel Ayewah, William Pugh, J. David Morgenthaler, John Penix, and YuQian Zhou. PASTE'07. Evaluation, Static Analysis, Tooling
  22. Evaluating the Effectiveness of Slicing for Model Reduction of Concurrent Object-Oriented Programs. Matthew B. Dwyer, John Hatcliff, Matthew Hoosier, Venkatesh Prasad Ranganath, Robby, and Todd Wallentine. TACAS'06. Evaluation, Program Slicing, Model Checking
  23. Feedback-directed random test generation. Carlos Pacheco, Shuvendu K. Lahiri, Michael D. Ernst, and Thomas Ball. ICSE'07. Testing
  24. How AWS uses formal methods. Newcombe, Rath, Zhang, Munteanu, Brooker, and Dearduff. CACM'15. Specification
  25. How Complex Systems Fail. Richard I. Cook. Systems and Failures
  26. How Microsoft built, and is still building, Windows 10. Testing
  27. Isolating Cause-Effect Chains from Computer Programs. Andreas Zeller. FSE'02. Debugging
  28. Locating Causes of Program Failures. Holger Cleve and Andreas Zeller. ICSE'05. Debugging
  29. Model-Based Quality Assurance of Protocol Documentation: Tools and Methodology. Wolfgang Grieskamp, Nicolas Kicillof, Keith Stobie, and Victor Braberman. STVR (ICST'08). Specification
  30. Model-Based Testing: Where does it stand? Robert V. Binder, Bruno Legeard, and Anne Kramer. CACM'15. Evaluation, Testing
  31. Model Checking Programs. Willem Visser, Klaus Havelund, Guillaume Brat, SeungJoon Park, and Flavio Lerda. ASE'03. Model Checking
  32. Moving Fast with Software Verification. Cristiano Calcagno, Dino Distefano, Jeremy Dubreil, Dominik Gabi, Pieter Hooimeijer, Martino Luca, Peter O'Hearn, Irene Papakonstantinou, Jim Purbrick, and Dulma Rodriguez. NFM'15. Static Analysis, Tech Transfer
  33. Pex - White Box Test Generation for .NET. Nikolai Tillmann and Peli de Halleux. TAP'08. Symbolic Execution, Testing
  34. Race Conditions, Distribution, Interactions -- Testing the Hard Stuff and Staying Sane. Testing
  35. Realizing quality improvement through test driven development: results and experiences of four industrial teams. Nachiappan Nagappan, E. Michael Maximillien, Thirumalesh Bhat, and Laurie Williams. ESE'08. TDD
  36. Regression Testing Minimisation, Selection and Prioritisation: A Survey. Yoo and Harman. STVR'12. Concepts, Testing
  37. SLAM and Static Driver Verifier: Technology Transfer for Formal Methods inside Microsoft. Thomas Ball, Byron Cook, Vladimir Levin, and Sriram Rajamani. IFM'04. Tech Transfer
  38. Satisfiability Modulo Theories: Introduction and Applications. Leonardo De Moura and Nikolaj Bjorner. CACM'11. SMT
  39. Simple Testing Can Prevent Most Critical Failures: An Analysis of Production Failures in Distributed Data-Intensive Systems. Ding Yuan, Yu Luo, Xin Zhuang, Guilherme Renna Rodrigues, Xu Zhao, Yongle Zhang, Pranay U. Jain, and Michael Stumm. OSDI'14. Testing
  40. Software Model Checking. Ranjit Jhala and Rupak Majumdar. ACM CS'09. Model Checking
  41. Symbolic Execution for Software Testing: Three Decades Later. Cristian Cadar and Koushik Sen. CACM'13. Symbolic Execution, Testing
  42. The Model Checker SPIN. Gerard J. Holzmann. TSE'97. Model Checking
  43. Transferring an Automated Test Generation Tool to Practice: From Pex to Fakes and Code Digger. Nikolai Tillmann, Peli De Halleux, and Tao Xie. ASE'14. Tech Transfer
  44. Who Builds a House without Drawing Blueprints? Leslie Lamport. CACM'15. Specification
  45. Why Amazon Chose TLA+. Chris Newcombe. ABZ'14. Specification


Copyright (c) 2015, Venkatesh-Prasad Ranganath

Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.

Compiler: Venkatesh-Prasad Ranganath