In IHC 2021, the Evaluation Competition aims to explore the evaluation of interactive computational artefacts and their relationship with History, considering their effects from past, present and future perspectives. As such, the competition encompasses the evaluation of technologies related to the historical legacy of the past, as well as technologies involved in “making and telling history” or “retelling history” in today’s striking episodes. In addition, the competition encourages a more critical approach to the construction of history, opening up the evaluation to applications and technologies that help in the understanding of History and its reflections. The theme of the Evaluation Competition – IHC 2021 is in line with the call for the Evaluation Competition – IHC 2020, which was suspended, so works initiated under the previous call may be submitted.
Among the possible artefacts that could be the focus of the evaluation, from the point of view of technologies that help to “make, tell and retell history” nowadays, are social networks involving situations and discussions around ‘fake news’ (in political, economic, social, cultural and legal scenarios, among others) and collaborative technologies that support professional, educational and daily activities in situations of isolation (e.g. during the COVID-19 pandemic). From the point of view of technologies that help to understand the historical legacy of the past, evaluation teams could approach interactive technologies that gained great prominence during the social isolation imposed by the COVID-19 crisis and the closure of museums and cultural spaces to the public. Possibilities include applications, websites, and virtual or augmented reality environments of museums and cultural spaces in Brazil and around the world (e.g. Museu Afro Brasil and Museu do Amanhã in Brazil, the AR RATP Museum in France, the VR British Museum in the UK, the Museu Nacional do Azulejo in Portugal, among others), as well as educational games and serious games related to the theme.
From a methodological point of view, in this evaluation competition we would like to challenge students to go beyond the evaluation of classic aspects of quality in use of systems (e.g. usability, accessibility and communicability) and include aspects related to user experience (UX). According to ISO 9241-210, user experience addresses “a person’s perceptions and reactions that result from the intended use or use of a product, system or service” and “includes all emotions, beliefs, preferences, perceptions, physical and psychological responses, user behaviors and achievements that occur before, during and after use”. In many interactive technologies related to the theme of this competition, user experience plays an important role, encompassing users’ emotions, sensations and other UX-related aspects, and not only aspects related to operating the system and performing tasks with it. Students are free to choose the methods they consider appropriate to analyze UX aspects.
We consider it important to disseminate the use of UX evaluation methods that actually cover aspects of UX beyond classic aspects such as usability, accessibility and communicability. Despite the wide dissemination of the term “UX” and its adoption in many companies, many evaluations carried out in commercial environments under the label of “UX evaluations” do not incorporate broader aspects such as perceptions, reactions, emotions and beliefs, among others. Therefore, the competition puts this challenge forward to demonstrate the effective use of these methods, providing examples that can be replicated in commercial environments.
Within the scope of the competition, evaluation teams can choose to evaluate different potentially relevant aspects of UX. For example, teams could examine reactions with respect to beliefs and perceptions about interactive systems involving news; emotions that emerge from interaction with content with real and ‘fake’ historical bases; feelings arising from the use of technology in situations related to historical facts; or the perceptions and reactions that come from interaction with artefacts linked to the cultural legacy of the past.
For graduate students, we would like to launch an “extra” challenge: to explore UX evaluation methods from the recent research literature in their evaluation venture. In the academic field, since the publication of the paper “User experience – a research agenda” by Hassenzahl and Tractinsky in 2006, cited more than 3,000 times as of March 2021, several research endeavours have been dedicated to investigating methods to evaluate aspects of user experience. Despite the advances, this field still presents more open research challenges than the evaluation of other aspects of quality in use, such as usability, whose techniques were consolidated decades ago. Graduate students are invited to review the recent HCI literature in search of methods applicable to the chosen type of application. The literature contains specific proposals for evaluating user experience with general-purpose applications, with games, and with cultural heritage interactive applications. Teams have complete freedom to carry out their own literature research and choose the best method(s) to use. We suggest below some examples of articles with such proposals, but teams need not adopt them in the evaluations to be performed:
– IJSSELSTEIJN, W. A.; DE KORT, Y. A. W.; POELS, K. The Game Experience Questionnaire. Eindhoven: Technische Universiteit Eindhoven, p. 3-9, 2013.
– PETRIE, H.; OTHMAN, M. K.; POWER, C. Smartphone Guide Technology in Cultural Spaces: Measuring Visitor Experience with an iPhone Multimedia Guide in Shakespeare’s Church. International Journal of Human–Computer Interaction, v. 33, n. 12, p. 973-983, 2017.
– BRÜHLMANN, F.; VOLLENWYDER, B.; OPWIS, K.; MEKLER, E. D. Measuring the “Why” of Interaction: Development and Validation of the User Motivation Inventory (UMI). In: Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems. 2018. p. 1-13.