Accountability: A Cross-disciplinary View

Type:

Master Seminar

Semester:

Summer Semester 2017

Language:

English

Preliminary Meeting:

27.01.2017 at 11:00,

Room: 01.09.14 (Alonzo Church)

Lecturer:

Prof. Dr. Alexander Pretschner

Amjad Ibrahim

Ehsan Zibaei

SWS:

2

ECTS:

4

LvNr:

1123 (IN2107)

Max. number of participants:

10

Rules for participation and registration

  1. Plagiarism of any form (blatant copy-paste, summarizing someone else's ideas/results without reference, etc.) will result in immediate expulsion from the course.
  2. All submissions are mandatory, and each submission must meet a minimum level of quality. Submissions that are mere collections of buzzwords/keywords or coarse document structures will not be accepted and will be graded 5.0.
  3. Late submissions will incur penalties.
  4. Non-adherence to the submission guidelines will incur penalties.
  5. Slides must be discussed with the supervisor at least one week before the presentation. Presentations must be held in English.
  6. Participation and attendance in all seminar presentations are mandatory. Students must read the final submissions of their colleagues and participate in the discussions.
  7. Registration for the seminar takes place via the TUM Online Matching System.
  8. Once successfully registered for the seminar:
    1. Students select at most 3 available individual seminar topics of their choice.
    2. Students send the selected topics via email (subject: "Accountability seminar") to Amjad Ibrahim, ranked from 1 (most preferred topic) to 3.
  9. Once assigned a topic, you will receive a confirmation email.
  10. Students must acknowledge their acceptance of the topic and their participation in the seminar by TBA at the latest.
  11. Students wishing to withdraw from the seminar must send a cancellation email by TBA; failing that, they will be graded 5.0.

Content

Although the term accountability is often used in scientific research [Accountability 1-5], we still do not have a clear, unified, and comprehensive definition of it. For the scope of this seminar, we refer to accountability as a property of a system that enables linking unwanted behavior observed at run time to its possible cause post mortem. Specifically, we will use the definition by Pretschner [Accountability 4]: accountability is “the capability of a system to answer questions concerning the why, how, by whom, where and when specific events have happened or have failed to happen.”

In practice, mechanisms that enable accountability are used in different application domains [Accountability 6]. They are utilized in the automotive industry when building systems according to industry-specific standards such as ISO 26262 for road vehicles. Accountability is also implemented in airplanes and cars by means of voice and data recorders ("black boxes"); for aircraft, such recorders are mandated by JAR-OPS. It is a common concern for all systems in which logging takes place: financial transactions, surgical and invasive procedure protocols, or the debugging of computer programs [Accountability 4]. Put simply, recording any form of data about the events that take place during the runtime of a system enables accountability.
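
To make this concrete, consider the minimal sketch below of an append-only event log that can answer post-mortem questions such as "who did what, where, and when before a failure?". This is an illustration only, not taken from the cited literature; the names (Event, EventLog, record, events_before_failure) and the fixed time window are hypothetical choices.

    import json
    import time
    from dataclasses import dataclass, asdict
    from typing import List

    @dataclass
    class Event:
        """One log entry capturing the who/what/when/where of an action."""
        timestamp: float  # when the event happened
        actor: str        # by whom it was triggered
        action: str       # what happened
        location: str     # where in the system it happened
        outcome: str      # "ok" or "failure"

    class EventLog:
        """Append-only event log; querying it post mortem enables accountability."""

        def __init__(self) -> None:
            self._events: List[Event] = []

        def record(self, actor: str, action: str, location: str,
                   outcome: str = "ok") -> None:
            self._events.append(Event(time.time(), actor, action, location, outcome))

        def events_before_failure(self, window: float = 60.0) -> List[Event]:
            """Return events preceding the first failure by at most `window`
            seconds: the candidate causes for a post-mortem analysis."""
            failure = next((e for e in self._events if e.outcome == "failure"), None)
            if failure is None:
                return []
            return [e for e in self._events
                    if e is not failure
                    and 0 <= failure.timestamp - e.timestamp <= window]

    # Record events at run time, then ask what happened before the failure.
    log = EventLog()
    log.record("alice", "update-config", "node-1")
    log.record("bob", "deploy-service", "node-2")
    log.record("service", "crash", "node-2", outcome="failure")
    for event in log.events_before_failure():
        print(json.dumps(asdict(event)))

The append-only discipline is the essential design choice here: evidence that can be silently rewritten after the fact cannot support accountability.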

In computer science research, many fields and communities pursue goals or sub-goals similar to those of accountability. For example, digital forensics [Digital Forensics 1-4] aims to analyze and preserve computer evidence of cybercrimes; hence, this field of research is an enabler of accountability. Another example, from the software testing domain, is fault localization, which aims to find the location and causes of failures.

The aim of this seminar is to study the concept of accountability as an umbrella over different disciplines. Based on the definition of accountability presented above, this seminar will draw a general picture of accountability and its relation to other concepts in the computer science literature. In particular, we ask: What enables accountability? How does accountability relate to other concepts in computer science? Which of the related disciplines are specializations or generalizations of accountability?

How does it work?

  1. Each student will study and analyze the literature around accountability.
  2. Each student or group of students will study and analyze the literature around one related field.
  3. The literature should provide a general understanding, definition, and components of a specific field, e.g., model checking. Taxonomies or meta-models are a good starting point.
  4. This understanding is directed towards a comparison with our definition of accountability.
  5. Each student will produce a clear comparison (differences and commonalities) between the related field and accountability. The comparison should contribute to classifying the field as an enabler of, a synonym for, related to, or unrelated to accountability.

Prerequisites

A background in security, system modeling, or any field specified in the list of topics is desirable but not required. A basic understanding of algorithms and data structures is expected.

Objective

In addition to the scientific contribution of exploring the links between accountability and related concepts, this seminar aims to introduce students to the scientific method of critically reading, understanding, analyzing, explaining, and presenting existing scientific literature.

Students will read one or more papers assigned to them by their supervisors. They are required to find further relevant research on the topic. Understanding the central statements of a paper includes highlighting, complementing, and explaining its assumptions, as well as its deliberately or accidentally incomplete chains of argumentation, typically followed by examples. This understanding should be reflected in the written exposé. The exposé must state the problem tackled by the selected papers, as well as their respective central assumptions, arguments, and results. A highly motivated student is expected to come up with a classification scheme within which all selected publications can be neatly organized and their core contributions discussed in the context of the corresponding problem.

Possible topics

The list of related fields includes (but is not limited to):

  1. Blockchain
  2. Model checking
  3. Runtime verification (offline mode)
  4. Digital Forensics
  5. Log auditing
  6. Model based testing/analysis
  7. Intrusion detection systems
  8. Causality
  9. Fault localization and Delta debugging
  10. Accountability in cyber-physical systems
  11. Anomaly detection

Organization

Students will survey the literature of one of the research topics assigned to them by their supervisors; they are encouraged to find and read further relevant articles on the topic. At the end of the seminar, students are to submit an exposé that incorporates the knowledge they acquired and the findings of any experiments they conducted while researching the topic. The exposé takes the form of a scientific paper with its own succinct chain of argumentation; merely paraphrasing and augmenting the contents of the original papers is not sufficient. We expect the paper to be at most 15 pages in Springer LNCS style; we will not accept any other format. All submissions must be PDF files: no other file formats are acceptable. The presentation will be 30 minutes, followed by 15 minutes of discussion.

Seminar Literature

Accountability

1- Papanikolaou, Nick, and Siani Pearson. "A Cross-Disciplinary Review of the Concept of Accountability: A Survey of the Literature." (2013).

2- Weitzner, D., Abelson, H., Berners-Lee, T., Feigenbaum, J., Hendler, J., Sussman, G.: Information accountability, Communications of the ACM 51(6):82-87, 2008

3- Beckers, Kristian, Jörg Landthaler, Florian Matthes, Alexander Pretschner, and Bernhard Waltl. "Data Accountability in Socio-Technical Systems." International Workshop on Business Process Modeling, Development and Support. Springer International Publishing, 2016.

4- Pretschner, Alexander. "Achieving accountability with distributed data usage control technology." The 2nd International Workshop on Accountability: Science, Technology and Policy at MIT. 2014.

5- Feigenbaum, Joan, Aaron D. Jaggard, Rebecca N. Wright, and Hongda Xiao. Systematizing “Accountability” in Computer Science (Version of Feb. 17, 2012). YALEU/DCS/TR-1452, Yale University, New Haven, CT, 2012.

6- Kacianka, Severin, Florian Kelbert, and Alexander Pretschner. "Towards a Unified Model of Accountability Infrastructures." arXiv preprint arXiv:1608.07882 (2016).

Causality

1- Gössler, G., Le Métayer, D.: A General Trace-Based Framework of Logical Causality. [Research Report] RR-8378, 2013.

2- Halpern, J., Pearl, J.: Causes and Explanations: A Structural-Model Approach. Part I: Causes. arXiv:cs/0011012v3 [cs.AI] 7, 2005

Model checking

1- Clarke, Edmund M., Orna Grumberg, and Doron Peled. Model Checking. MIT Press, 1999.

2- Baier, Christel, Joost-Pieter Katoen, and Kim Guldstrand Larsen. Principles of Model Checking. MIT Press, 2008.

3- Visser, Willem, Klaus Havelund, Guillaume Brat, SeungJoon Park, and Flavio Lerda. "Model checking programs." Automated Software Engineering 10, no. 2 (2003): 203-232.

4- (Survey) Ranjit Jhala and Rupak Majumdar. 2009. Software model checking. ACM Comput. Surv. 41, 4, Article 21 (October 2009), 54 pages. DOI=http://dx.doi.org/10.1145/1592434.1592438

Runtime verification

1- Leucker, Martin, and Christian Schallhart. "A brief account of runtime verification." The Journal of Logic and Algebraic Programming 78, no. 5 (2009): 293-303.

2- Andreas Bauer, Martin Leucker, and Christian Schallhart. 2011. Runtime Verification for LTL and TLTL. ACM Trans. Softw. Eng. Methodol. 20, 4, Article 14 (September 2011), 64 pages. DOI=http://dx.doi.org/10.1145/2000799.2000800

3- Falcone, Ylies, Jean-Claude Fernandez, and Laurent Mounier. "Runtime verification of safety-progress properties." In International Workshop on Runtime Verification, pp. 40-59. Springer Berlin Heidelberg, 2009.

Digital Forensics

1- Garfinkel, Simson L. "Digital forensics research: The next 10 years." Digital Investigation 7 (2010): S64-S73.

2- Casey, Eoghan. Digital Evidence and Computer Crime: Forensic Science, Computers, and the Internet. Academic Press, 2011.

3- Reith, Mark, Clint Carr, and Gregg Gunsch. "An examination of digital forensic models." International Journal of Digital Evidence 1, no. 3 (2002): 1-12.

4- Freiling, Felix C., and Bastian Schwittay. "A Common Process Model for Incident Response and Computer Forensics." citeseerx.ist.psu.edu/viewdoc/download

Log auditing/mining

1- Roger, Muriel, and Jean Goubault-Larrecq. "Log auditing through model-checking." In Proceedings of the 14th IEEE workshop on Computer Security Foundations, p. 220. IEEE Computer Society, 2001.

2- Iváncsy, Renáta, and István Vajk. "Frequent pattern mining in web log data." Acta Polytechnica Hungarica 3, no. 1 (2006): 77-90.

3- van der Aalst, Wil M. P., Kees M. van Hee, Jan Martijn van der Werf, and Marc Verdonk. "Auditing 2.0: Using process mining to support tomorrow's auditor." Computer 43, no. 3 (2010): 90-93.

Delta debugging and Fault localization

1- Ghassan Misherghi and Zhendong Su. 2006. HDD: hierarchical delta debugging. In Proceedings of the 28th international conference on Software engineering (ICSE '06). ACM, New York, NY, USA, 142-151. DOI=http://dx.doi.org/10.1145/1134285.1134307

2- Andreas Zeller. 2002. Isolating cause-effect chains from computer programs. In Proceedings of the 10th ACM SIGSOFT symposium on Foundations of software engineering (SIGSOFT '02/FSE-10). ACM, New York, NY, USA, 1-10. DOI=http://dx.doi.org/10.1145/587051.587053

Model based testing/analysis

1- Apfelbaum, Larry, and John Doyle. "Model based testing." In Software Quality Week Conference, pp. 296-300. 1997.

2- Pretschner, Alexander. "Model-based testing." In Proceedings. 27th International Conference on Software Engineering, 2005. ICSE 2005., pp. 722-723. IEEE, 2005.

3- Arilo C. Dias Neto, Rajesh Subramanyan, Marlon Vieira, and Guilherme H. Travassos. 2007. A survey on model-based testing approaches: a systematic review. In Proceedings of the 1st ACM international workshop on Empirical assessment of software engineering languages and technologies: held in conjunction with the 22nd IEEE/ACM International Conference on Automated Software Engineering (ASE) 2007 (WEASELTech '07). ACM, New York, NY, USA, 31-36. DOI=http://dx.doi.org/10.1145/1353673.1353681

Intrusion detection systems

1- Hervé Debar, Marc Dacier, Andreas Wespi, Towards a taxonomy of intrusion-detection systems, Computer Networks, Volume 31, Issue 8, 23 April 1999, Pages 805-822, ISSN 1389-1286, dx.doi.org/10.1016/S1389-1286(98)00017-6.

2- Debar, Hervé, Marc Dacier, and Andreas Wespi. "A revised taxonomy for intrusion-detection systems." In Annales des télécommunications, vol. 55, no. 7-8, pp. 361-378. Springer-Verlag, 2000.

Accountability in cyber-physical systems 

1- Datta, Anupam, et al. "Accountability in cyber-physical systems." Cyber-Physical Systems Workshop (SOSCYPS), Science of Security for. IEEE, 2016.

2- M. C. Tschantz, A. Datta, J. M. Wing, "Information flow investigations", Tech. Rep. CMU-CS-13-118, 2013.

3- M. C. Tschantz, A. Datta, A. Datta, J. M. Wing, "A methodology for information flow experiments", IEEE 28th Computer Security Foundations Symposium CSF 2015, pp. 554-568, 13–17 July, 2015.

Anomaly detection

1- Chandola, Varun, Arindam Banerjee, and Vipin Kumar. "Anomaly detection: A survey." ACM computing surveys (CSUR) 41, no. 3 (2009): 15.

2- Jones, Austin, Zhaodan Kong, and Calin Belta. "Anomaly detection in cyber-physical systems: A formal methods approach." Decision and Control (CDC), 2014 IEEE 53rd Annual Conference on. IEEE, 2014.

3- Emmott, Andrew F., et al. "Systematic construction of anomaly detection benchmarks from real data." Proceedings of the ACM SIGKDD workshop on outlier detection and description. ACM, 2013.

4- Gavrilovski, Alek, et al. "Challenges and Opportunities in Flight Data Mining: A Review of the State of the Art." AIAA Infotech@Aerospace, 2016. 0923.