Video games could be a tool for teaching ethical competencies in fields where people may encounter moral dilemmas. This is one of the principles behind the research being carried out at the Universitat Oberta de Catalunya (UOC) by Joan Casas-Roma, Jordi Conesa and Professor Santi Caballé, researchers belonging to the SmartLearn group in the Faculty of Computer Science, Multimedia and Telecommunications. The researchers are working on the use of digital games for improving the learning of ethical competencies in teaching, specifically in the field of technology, but their research can also be extrapolated to other areas.
In the digital world, there is frequent debate about data analysts and their ability to manipulate data. Casas-Roma remarked that those designing technology might be reproducing “a specific way of understanding the world” by, for example, permitting certain interactions but not others. This is not a limitation imposed by the technology itself: rather, it reflects a specific decision made by its designers. These decisions often end up having an impact on society, whether positive or negative.
The effects of technology
Casas-Roma’s research examines the extent to which data engineers and analysts take into account “the good or bad effects that technology” can have. His work is aimed at “foreseeing” these effects and at ensuring that technologists have the tools to consider the impact that the technology they are working on will have on the world.
To give an example, Casas-Roma spoke of social media, which were originally created for a specific purpose, but have ended up being transformed by the use people put them to. Instagram, Facebook and Twitter first saw the light of day with a particular use in mind, but users have sometimes appropriated them in a way that might end up causing harm. This is the case with Instagram and its effects on adolescents’ mental health and self-image, as some reports have indicated. “This has led to the redesign of some of the social media platform’s features”, noted Casas-Roma.
The UOC researcher’s work is based on the idea that, although it is right that there are regulations on the use of artificial intelligence, which is particularly useful in the field of data, “there is another part that has to come from the design of technology”. Technologists need to have tools to help them “understand how the technology [they are working on] might change the world”. This is by no means an easy task: “it’s not that easy to boil all this down into a set of rules”, because “we’re often dealing with technologies that have not existed before, meaning we don’t know how they’ll affect the world, thus calling for an exercise in foresight”.
Learning ethics through codes or play?
The researcher believes that teaching ethical competencies through codes of ethics alone “wouldn’t work”. The design phase of technology needs “a holistic vision” both of the technology itself and “of the users that will be utilising it”. This, he noted, could be done through examples that turn learners into spectators, able to observe the complexities of the cases presented to them and to empathise with the decisions made.
However, Casas-Roma wants to go a step further and suggests using interactivity to teach ethical competencies to students, presenting them with a virtual adventure that makes them the protagonists. Interactivity, he said, “opens up the door to moral emotions, to subjective elements, to genuine engagement, to feeling responsible”. Indeed, he has already designed a prototype game of this kind.
The project is based on the creation of a virtual narrative designed for engineering students, placing them in situations involving moral dilemmas. For example, they may be invited to take on the role of a developer who at some point discovers that a program just launched by their company “might have some undesirable ethical consequences”. “The great thing is that the narrative is not designed to teach what is or isn’t right, as the real world often doesn’t give us enough information to make a decision. The game constantly makes players choose between decisions that do or do not support certain ethical principles, perhaps to the detriment of other requirements raised by the given situation”, explained Casas-Roma.
Video games can be a safe environment for dealing with dilemmas
The researcher is thus exploring how video games could become “a safe environment for dealing with ethical dilemmas and decisions that could be difficult” to resolve. Video games become a “platform for each player/student to have a space for reflection, creating an environment for considering what the ethical consequences of a decision might be”, he added.
The research team has designed a prototype that is adaptable to different professional fields in which ethical dilemmas may arise. Casas-Roma gave the example of the world of biomedicine. Using this prototype, the participant has to make a series of decisions that affect the storyline and its characters in different ways. These decisions may affect the player’s relations with other story characters, have an impact on the goals set for the story, and be linked to principles of professional ethics taken from codes of conduct.
The video game shows that there is not always a “right” answer, and that sometimes there is not enough information to indicate which decision is best. “We’re not looking for participants to learn specific codes of conduct, but rather to understand and try to foresee the possible consequences of their decisions, getting them used to considering the ethical side of their actions in different professional contexts”, said Casas-Roma.
The UOC researchers’ prototype offers a space for reflection in which participants can develop their ethical competencies and transfer them to the decision-making processes of their everyday professional work.
The articles we publish on Psychreg are here to educate and inform. They’re not meant to take the place of expert advice. So if you’re looking for professional help, don’t delay or ignore it because of what you’ve read here. Check our full disclaimer.