Illustration of artificial intelligence. Image: Getty Images

Behavioural Artificial Intelligence: An Agenda for Systematic Empirical Studies of Artificial Inference

In an article published in AI & Society: Journal of Knowledge, Culture, and Communication, Tore Pedersen (Oslo Nye Høyskole) and Christian Johansen (NTNU) explain how biases in Artificial Intelligence (AI) may be reduced by employing methodologies from cognitive and behavioural science to disclose the inference processes that may lead to such biases.

Published in Research Publications, Monday 4 October 2021, 14:38 | last updated Monday 4 October 2021, 15:33

Researchers: Tore Pedersen and Christian Johansen.

Abstract

In the media, Artificial Intelligence (AI) is often described in contrasting terms – either as the ultimate solution to all human problems or as the most serious threat to all human existence. In academia, computer scientists are focused on developing AI systems that function, whereas philosophers are concerned about the implications these systems may have for our lives. Of growing concern is the adverse effect of biased AI decisions.

But why do intelligent systems make biased decisions?

To better understand the process that leads to a biased AI decision, we introduce the term Behavioural Artificial Intelligence and propose an agenda for studying Artificial Inference – the process that leads AI to arrive at its decision. By employing methodologies from cognitive and behavioural science to study AI decisions, one may disclose AI inference processes in the same way as studies of human judgment and decision making have disclosed human inference processes. 

Such disclosure may lead to increased algorithmic transparency and give us a better understanding of the AI inference processes. This may help us decide whether or not we can trust AI decisions to be unbiased.

Read the full article here:

'Behavioural artificial intelligence: an agenda for systematic empirical studies of artificial inference', AI & Soc 35, 519–532 (2020)