EVALUATION COMPETITION

Accepted Papers

  • Avaliação da Usabilidade e Experiência do Usuário na Plantinha IoT - Carlos Feitosa (Universidade Federal do Ceará), Ana Beatriz Vasconcelos Martins (Universidade Federal do Ceará), Francisco Gaspar (Universidade Federal do Ceará), Naily Santos (Universidade Federal do Ceará), Petrucio Filho (Universidade Federal do Ceará - UFC), Ingrid Monteiro (Universidade Federal do Ceará), Marcelo Martins da Silva (Pontifícia Universidade Católica do Rio de Janeiro)
  • Hidrate Spark TAP: Avaliação de uma Garrafa Inteligente para Motivação e Acompanhamento da Ingestão de Água - Pedro Silva (Universidade Federal do Ceará), Ruan Gabriel Lopes e Souza (Universidade Federal do Ceará), Larah Virgínia Pedrosa Lima Cruz (Universidade Federal do Ceará), Anna Julia Abreu Lima de Souza (Universidade Federal do Ceará), Andréia Libório (UFC), Marcelo Martins da Silva (Pontifícia Universidade Católica do Rio de Janeiro)
  • Da Curiosidade ao Conhecimento: Comunicabilidade e Experiência do Usuário Iniciante e Habitual da Amazon Alexa - Mariana Castro (Universidade Federal do Ceará), Alairton Sousa Junior (Universidade Federal do Ceará), Jamyle Teles (Universidade Federal do Ceará), Isabelle Reinbold (Universidade Federal do Ceará), Luiz Gonzaga dos Santos Filho (Universidade Federal do Ceará), Georgia Pereira (Universidade Federal do Ceará), Ticianne Darin (Universidade Federal do Ceará)

The HCI Evaluation Competition aims to motivate undergraduate and graduate students, supervised by HCI professionals and educators, thereby contributing to their education. The competition focuses on systems: participants evaluate a computer system and apply their theoretical knowledge of HCI evaluation methodologies.

COMPETITION THEME

The theme of IHC'23 is "Weaving Interfaces between the Virtual and the Physical". With the widespread use of digital technologies in different spheres of people's lives, the virtual and physical worlds are becoming increasingly connected, highlighting applications that integrate humans with the various objects in their daily lives.

Thus, aligned with the theme of IHC'23, the Evaluation Competition will address people's interaction with ecosystems that interconnect physical objects through the internet to provide resources and services. These applications, known as the Internet of Things (IoT), are increasingly common in people's lives, whether inside or outside their homes, bringing convenience and often helping them achieve a better quality of life. However, despite its many advantages, IoT demands significant attention from designers regarding its quality of use: it must provide useful, understandable, and manipulable resources and functionalities through interfaces that motivate users to adopt the technology in their daily lives. The focus of the Evaluation Competition will therefore be to analyze how people interact with IoT applications. There are several systems of this type, such as smart homes, which automate and interconnect residential equipment; physical-activity monitoring systems, which produce data on user performance; vehicles with embedded technology, which provide greater safety and performance to drivers; systems for the remote control and manipulation of a variety of objects; and many other examples.

From a methodological point of view, in this Evaluation Competition we would like to challenge students to go beyond evaluating only classical aspects of quality in use (e.g., usability, accessibility, and communicability) and also address aspects related to the user experience (User eXperience - UX). According to ISO 9241-210, user experience addresses "the perceptions and reactions of a person resulting from the use or intended use of a product, system, or service" and "includes all emotions, beliefs, preferences, perceptions, physical and psychological responses, behaviors, and achievements of the user that occur before, during, and after use". In interactive technologies related to the theme of this competition, evaluators should consider not only aspects related to the use of the computer system that controls the IoT application and the execution of tasks through it, but also the user experience regarding the impact of the application on the physical environment external to the software. This experience includes emotions, sensations, and other aspects related to UX. Students are free to choose the methods they consider appropriate to analyze aspects of UX.

EVALUATION

Object and Focus: To evaluate the quality (usability, communicability, accessibility, etc.) and user experience of an interactive computational artifact for IoT, taking into account the user experience related to the impact of using such an artifact in the physical environment. The evaluated computational systems can be for individual (single-user) or collaborative use, and can be implemented in different forms that offer interaction, such as software, hardware, environments, applications, games, or other computational systems. The interaction of the computational artifact with external equipment, such as an air conditioning unit or a smartwatch, should be considered. Proposals with evaluations of IoT applications that intensify social collaboration are also welcome.

Methods: Teams are free to choose the method(s) they will use to evaluate the computational artifact. Graduate student teams are encouraged to apply recent UX evaluation methods proposed in the HCI literature.

Product: After evaluation, each team must generate a report containing:

  • Name of the evaluation method(s) used;

  • Justification for the choice of method(s);

  • Description of the chosen platform(s): brand, model, operating system;

  • Description of the evaluation process (portion of the interface analyzed, evaluated functionalities, procedures involved);

  • Evaluation results;

  • Problems found: description, location, occurrence context, and justification;

  • Closure: conclusion and other observations;

  • Team's opinion on the ability of the method(s) to reveal (or not) problems and aspects of user experience in the application(s).

CODE OF ETHICS

It is expected that all submissions, underlying research, and behavior during the review and conference process comply with the principles and responsibilities outlined in the ACM Code of Ethics.

SUBMISSIONS

Report submissions must be anonymized and have up to 10 pages, following the SBC template for article publication. Authors must submit their reports electronically, in PDF format, through the JEMS system (link coming soon).

TEAM FORMATION

Team formation can be of two types:

  • Undergraduate: at least 3 and at most 5 undergraduate students

  • Graduate: at least 2 and at most 4 graduate students

All teams must have a supervising faculty member affiliated with an educational institution. Teams may also have a second supervisor, who can be a graduate student (only in the case of undergraduate teams) or a person affiliated with a research institute or a private company with activities related to the HCI field.

Undergraduate and graduate students may not be part of the same team.

Failure to comply with the team formation requirements will result in disqualification from the selection process.

PARTICIPATION INSTRUCTIONS

To participate in the Competition, each team must complete three steps:

  • Perform the evaluation;

  • Submit the evaluation report;

  • If selected as a finalist, present the evaluation report during the IHC'23 Evaluation Competition session in Maceió - AL.

EVALUATION OF SUBMITTED REPORTS

Each report will be evaluated by reviewers with proven experience in HCI or interactive systems evaluation. The following judging criteria will be taken into account:

  • Adequacy of the system/application to the Evaluation Competition theme;

  • Readability, organization, and presentation of the evaluation report text;

  • Clear definition of the scope and objective of the evaluation;

  • Suitability of the chosen method(s) and the described evaluation process for the intended objective;

  • Quality of the results found during the evaluations for the established scope and objective;

  • Consideration of the ethical aspects involved in conducting the evaluation (in case the team has used evaluation method(s) that involve the participation of users or other people outside the team);

  • Quality of the critical analysis of the method's ability to reveal potential usage problems of the system.

SELECTION AND PRESENTATION AT IHC'23

Three finalists from each category (undergraduate and graduate) will be selected for a short oral presentation, followed by questions from a group of evaluators during the event, to be held in Maceió - AL - Brazil, between October 16 and 20, 2023. Although desirable, the presence of all members of the finalist teams on the day of the presentation at IHC'23 is not mandatory. However, the presence of at least one team member is required.

AWARD

Each selected finalist team must present orally at IHC'23 and will receive a Certificate of Recognition. The winning team will also receive a prize (to be defined by the Organization).

SUPPORT FOR TEAM PARTICIPATION

The event organization will provide exemption from the registration fee for one member of each selected team for the oral presentation.

PUBLICATION

The evaluation reports of the finalist teams will be published in the extended proceedings of IHC'23, in SBC-OpenLib (SOL).

SUMMARY

  • System to be evaluated: Interactive computational artifacts for IoT;

  • Criteria to be evaluated: Quality (e.g., usability, accessibility, and communicability) and User Experience (the latter must be related to both the computational artifact and its interaction with external equipment);

  • Methods to be used: According to the teams' choice;

Team sizes:

  • 3 to 5 undergraduate students + 1 or 2 supervisors (teachers, researchers, postgraduates, or professionals);

  • 2 to 4 graduate students + 1 or 2 supervisors (teachers or researchers).

Program Committee

  • Ana Carolina De Marchi - Universidade de Passo Fundo
  • André Luís Menolli - Universidade Estadual do Norte do Paraná - UENP
  • Andre Freire - Universidade Federal de Lavras
  • Andrey Rodrigues - Universidade Federal do Amazonas
  • Angela Peres - Universidade de Ciências Da Saúde de Alagoas
  • Daniela Trindade - Universidade Estadual do Norte do Paraná - UENP
  • Eliana Moreira - Instituto Federal de São Paulo
  • Fernanda Lima - Universidade de Brasilia
  • Marcelle Mota - Universidade Federal do Pará
  • Marcelo Morandini - Universidade de São Paulo
  • Marcos Alexandre Rose Silva - Universidade Federal de Santa Maria
  • Mônica Paz - Universidade Federal da Bahia
  • Pedro Henrique Dias Valle - Universidade Federal de Juiz de Fora
  • Renato Balancieri - Universidade Estadual do Paraná
  • Williamson Silva - Universidade Federal do Pampa

COORDINATION

Maria Lúcia Bento Villela (UFV) -  maria.villela@ufv.br
Thiago Adriano Coleti (UENP) -  thiago.coleti@uenp.edu.br

IMPORTANT DATES

Deadline for registration of reports in JEMS: 08/24/2023 (was 08/21/2023)

Deadline for submission of reports in JEMS: 08/24/2023 (was 08/23/2023, was 08/21/2023)

Notification of finalists for presentation at IHC'23: 09/11/2023

Final report submission in JEMS: 09/20/2023

Presentation at IHC'23: 10/16 to 10/20/2023