When Digital Access Becomes Unequal
Digital platforms, from healthcare to social support, increasingly mediate how we access essential services. While these systems promise efficiency and personalization, they also quietly shape who finds what, and how easily. This raises an important question: are digital services helping everyone equally, or are they unintentionally reinforcing existing inequalities?
This question lies at the heart of the FAIR-ACCESS project, supported by DigiSus seed funding in 2026. The project explores how search and recommendation technologies, which are commonly used to guide users through complex information, can be redesigned to ensure fair and equitable access to health and social services.
Why Fairness Matters in Digital Discovery
Imagine a citizen trying to find mental health support or social assistance online. The results they see depend on how algorithms rank and recommend services. These algorithms often rely on historical data, user interactions, and available metadata. But what if this data is incomplete, biased, or skewed toward certain groups?
In such cases, digital systems may (i) prioritize well-known services over equally relevant but less visible ones, (ii) favor users who are more digitally literate or active online, and (iii) struggle with fragmented service descriptions or multilingual contexts. The result is not just a technical issue but a societal one: unequal visibility can translate into unequal access, particularly for vulnerable or underserved populations.
From Personalization to Responsibility
Personalization has long been a key goal in digital systems: show users what is most relevant to them. However, relevance alone is not enough in socially sensitive domains like healthcare and social services. FAIR-ACCESS challenges the traditional design of these systems by asking: What if we explicitly account for fairness, transparency, and inclusion alongside relevance?
This shift requires rethinking how digital discovery systems are built. Instead of optimizing only for clicks or engagement, we must also consider:
- Equitable exposure of services
- Transparency in how recommendations are generated
- Trustworthiness of the system from a citizen’s perspective
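To make the first of these objectives concrete, here is a minimal sketch of what fairness-aware re-ranking can look like. This is an illustration under simple assumptions, not the FAIR-ACCESS system: it greedily builds a result list by trading off each service's relevance against how much exposure its provider group (e.g., well-known vs. less visible services) has already received, with a hypothetical knob `lam` controlling the balance.

```python
# Illustrative sketch of fairness-aware re-ranking (not the project's method).
# Each candidate service is a (service_id, group, relevance) tuple.
from collections import defaultdict

def fair_rerank(candidates, k, lam=0.5):
    """Greedily pick k services, balancing relevance against the exposure
    each group has already received in the ranking built so far.

    lam is an assumed trade-off knob: 0 = pure relevance ranking,
    1 = pure exposure balancing.
    """
    remaining = list(candidates)
    exposure = defaultdict(int)  # ranking positions granted per group
    ranking = []
    for _ in range(min(k, len(remaining))):
        def utility(item):
            _, group, rel = item
            # Groups that already hold many positions are penalized,
            # so under-exposed groups get a boost.
            penalty = exposure[group] / (len(ranking) + 1)
            return (1 - lam) * rel - lam * penalty
        best = max(remaining, key=utility)
        remaining.remove(best)
        exposure[best[1]] += 1
        ranking.append(best)
    return ranking

services = [
    ("clinic_a", "well_known", 0.95),
    ("clinic_b", "well_known", 0.90),
    ("clinic_c", "less_visible", 0.85),
    ("clinic_d", "less_visible", 0.80),
]
print([s[0] for s in fair_rerank(services, k=3)])
# → ['clinic_a', 'clinic_c', 'clinic_b']
```

With `lam=0` the list would simply follow relevance (`clinic_a`, `clinic_b`, `clinic_c`); the exposure term promotes the slightly less relevant but under-represented `clinic_c` into second place, which is exactly the kind of trade-off between personalization and equitable exposure the project investigates.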
The FAIR-ACCESS project lays the groundwork for a new generation of fairness-aware digital services. Specifically, the project will focus on:
- Defining Fairness: We begin by identifying the challenges people face when accessing services online. These include issues such as fragmented information, language barriers, and differences in digital skills. What does fairness mean in this context? The project will define concrete fairness and transparency requirements tailored to digital service discovery, considering trade-offs between personalization, diversity, and equity.
- Building a Prototype: A prototype system will demonstrate how fairness-aware search and recommendation can work in practice. This includes balancing relevance with fairness constraints and providing explanations for recommendations. Traditional metrics like accuracy are not enough, so the project will also develop evaluation approaches that measure equity in access, transparency, and user trust.
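As a flavor of what evaluating "equity in access" can mean, the following sketch computes a simple exposure-gap metric. This is an assumption for illustration, not the project's evaluation protocol: it weights each ranking position by a standard logarithmic discount (higher positions get more exposure) and reports the largest difference in total exposure between any two provider groups.

```python
# Illustrative exposure-equity metric (an assumption for this post,
# not the FAIR-ACCESS evaluation protocol).
import math

def group_exposure(ranking):
    """ranking: list of (service_id, group) in displayed order.
    Returns total position-discounted exposure per group,
    using DCG-style weights 1 / log2(position + 1)."""
    exposure = {}
    for pos, (_, group) in enumerate(ranking, start=1):
        exposure[group] = exposure.get(group, 0.0) + 1.0 / math.log2(pos + 1)
    return exposure

def exposure_gap(ranking):
    """Largest exposure difference between any two groups (0 = balanced)."""
    exp = group_exposure(ranking)
    return max(exp.values()) - min(exp.values())

ranking = [("a", "well_known"), ("b", "less_visible"),
           ("c", "well_known"), ("d", "less_visible")]
print(round(exposure_gap(ranking), 3))
```

A metric like this can sit alongside accuracy: a ranking may score well on relevance yet show a large exposure gap, signaling that less visible services are being systematically pushed out of sight.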
Why This Matters for Sustainable Societies
Access to health and social services is a cornerstone of inclusive and sustainable societies. As these services become increasingly digital, ensuring fair access is not optional; it is essential. Without careful design, digitalization risks amplifying existing inequalities. With the right approach, however, it can do the opposite: improve inclusion, strengthen trust, and support social sustainability.
FAIR-ACCESS contributes to this vision by aligning technological innovation with societal values. It connects data science with broader questions of equity, responsibility, and sustainability that are core themes of the DigiSus platform.
Looking Ahead
Although this is an early-stage project, the ambition is clear. FAIR-ACCESS aims to provide practical guidelines for fair and transparent digital services and establish a research agenda on equitable digital systems. Moreover, we aim to foster interdisciplinary collaboration across data science, health, and sustainability. This is the first step toward rethinking how digital systems can serve everyone both efficiently and fairly.
About the Author
Kostas Stefanidis is a Full Professor of Data Science at Tampere University, where he directs the Data Science Research Centre and leads the Recommender Systems group. His research focuses on personalization, fairness, and responsible data management, with applications in recommender systems and data integration. He currently leads the Research Council of Finland project ALTER, which explores fairness, explainability, and user interaction in automated decision systems.
Homepage: https://homepages.tuni.fi/konstantinos.stefanidis/
LinkedIn: https://www.linkedin.com/in/kostas-stefanidis-9556486/
***
Photo: Jonne Renvall / Tampere University