January 15, 2018

It is no secret that many things people used to do in real life can now also be done virtually: shopping and paying bills through Internet banking, making friends and staying in contact through social media, or following courses and taking exams online. When we experience these situations in real life, we behave in ways that fit social norms and principles. For example, when people meet for the first time, it is quite normal to keep some distance from each other and to limit the conversation to a socially accepted range of topics. It takes time to get to know each other, and every gesture, word and movement provides valuable detail that helps us understand who is in front of us. The virtual world is different from the real world and full of risks. Risks concerning our privacy are hard to understand and to control, especially when we often do not even know how the technologies we use actually function. Achieving the same level of privacy in the virtual world as we expect in real life is quite a challenge. In this respect Nissenbaum (2004) speaks of “contextual integrity”, which links adequate privacy protection to norms formulated for specific contexts, “demanding that information gathering and dissemination be appropriate to that context and obey the governing norms of distribution within it” (Nissenbaum, 2004, p. 101).

This question also concerns e-assessment technologies like TeSLA, which collect personal data from students for identification and authorship verification purposes. A dominant approach to addressing this question is the combination of transparency and choice (Nissenbaum, 2011). The gist of this approach is to inform users about data collection and to give them a choice whether or not to provide their data. Even those who decide to provide their data should have the opportunity to access, change or delete those data, or to withdraw consent. In practice, however, technology providers tend to offer privacy on a “take it or leave it” basis (Nissenbaum, 2011), turning the choice into a dilemma. The same can be said of privacy policies: almost all of them are very long and overloaded with legal terms, making them practically impossible for ordinary users to read and understand. As a result, 85.5% of Internet users simply agree to privacy policies without reading them (Elsen, Elshout, Kieruj, & Benning, 2014).

There is a need to understand the context in which a particular technology is used and the expectations users have within that context, and then to make sure that the technology does what is expected. Technology providers should think in advance not only about secure data collection and processing, but also about ethical issues that can arise during use. For e-assessment this means providing students with clear information about the instruments used for identification and authorship verification, and giving them the opportunity to choose instruments according to their preferences. Depending on which data are required, students’ preferences may differ. From the TeSLA pilots we know, for example, that students at the Open University in the Netherlands are much less willing to share video recordings of their face than recordings of their voice or their keystroke dynamics. Some of them are not willing to share any personal data for identification purposes at all.

Although much work has been done already, many questions still need to be answered: how well do students understand these technologies; how much effort do they, or must they, make to sufficiently understand them; what is the best way to communicate this information; and what influence do these technologies have on the relationship between students and teachers? Answering these questions is of great importance for all parties in order to bring the experience of e-assessment a bit closer to a real-life setting, and to create a safe and trusting environment for all students.


Elsen, M., Elshout, S., Kieruj, N., & Benning, T. (2014). Onderzoek naar privacyafwegingen [Study of privacy trade-offs]. Retrieved from

Nissenbaum, H. (2004). Privacy as contextual integrity. Washington Law Review, 79, 101–139.

Nissenbaum, H. (2011). A contextual approach to privacy online. Journal of the American Academy of Arts and Sciences, 140(4), 32–48.


Ekaterina Mulder, PhD Candidate at the Open University

José Janssen, Associate Professor
