Annika Andersson

Annika Andersson in Novahuset.

Annika Andersson is professor of informatics, and her research focuses on how power and politics influence the design and use of information systems.
“An increasing number of decisions in the public sector are made with the help of AI. My goal is for these to be understandable and transparent to citizens,” she says.

1968  Born in Stockholm

2010 Obtained her PhD in informatics at Örebro University with her thesis Learning to Learn in e-Learning – Constructive Practices for Development

2014 Docent in informatics, Örebro University

2023 Professor in informatics, Örebro University

Today, Annika Andersson conducts research primarily on automated decision-making in the public sector. Her research has always been critical, examining how information systems can transform organisations and practices – or how power and politics govern information systems.

“For me, it’s about exploring how technology can include or exclude vulnerable groups, such as women, the poor, and ‘weaker’ students. I always strive to expose structures of oppression.”

Married or unmarried – that is the question

Annika Andersson also explores the values manifested in the design and use of information systems. One example is a form for establishing paternity – how many lines does it provide for potential fathers? Put differently, how many men is the mother assumed to have had sex with? If you tick “married”, however, these alternatives do not appear: the husband is always assumed to be the father.

Annika Andersson provides another example of underlying structures:

“Consider a webpage promoting civic participation. Your only options might be to agree with the politician’s proposal – or not. Alternatively, there might be a box where you, as a citizen, can express your own opinion. The difference reflects how democracy is perceived.”

Technology as a means of empowerment

As a doctoral student, Annika Andersson studied how information systems can be used to empower users in developing countries. More specifically, she examined how technology can support pedagogy in Bangladesh and Sri Lanka to give individuals the strength and ability to reason independently. These countries have a colonial legacy where one is not expected to change existing structures.

Since the public defence of her doctoral thesis, Annika Andersson has researched how Swedish schools can best benefit from the technology already in use. Her research includes studies of how decision support systems can be used in politics. She has also studied the power structures behind the design of information security policy.

Who is responsible when computers make the decisions?

In recent years, her research has focused primarily on automated decision-making and AI.

“I want to understand what happens to responsibility and legitimacy when a computer takes over decision-making. One project at a government agency revealed considerable confusion about who is responsible when a computer makes the decisions at the agency.”

She has observed that the more AI is introduced into our society, the more relevant the question becomes.

“When discussing the connection between AI and responsibility, we focus almost exclusively on self-driving cars – and much less often on what happens when we get self-driving caseworkers in our agencies and public organisations.”

Annika Andersson leads a project on how automated decision-making in social services affects sensitive decisions, such as income support and taking children into care.

“We really want to help prevent caseworkers in social services from being overrun by technology as case processing becomes automated,” she says.

Safeguarding law and legal certainty

The project aims to ensure that the values of social services and society are not lost as AI makes more and more decisions. What is clear is that the decisions must follow the law and be legally certain.

“Social services in Sweden are also governed by the client’s right to co-determination – the right to be involved in the planning. We are investigating what happens to this right as AI bots make the decisions.”