Transparency before Algorithms
The principle of Transparency before Algorithms is part of the Manifesto for Sustainable Software Development, created by software developers, designers, and entrepreneurs to call on our peers to prioritize sustainability, not only in terms of ecological impact but also in inclusivity, ethics, and the long-term resilience of the digital systems we create. By prioritizing Transparency before Algorithms when designing our software, we can contribute to a shift toward a better digital future in which technology serves the needs of society, humanity, and the planet rather than exploiting human psychology for the benefit of a few.
This manifesto is a work in progress, and we are actively seeking feedback, ideas, and support. We invite you to join the conversation and contribute in whatever way resonates with you; whether your contribution is big or small, every voice matters.
Algorithms play a pivotal role in shaping digital experiences, from recommending content to curating search results and tailoring ads. While these systems can enhance efficiency and personalization, hidden algorithms operate without users' visibility or understanding, which fosters mistrust, enables manipulation, and entrenches bias. In systems that influence user experiences, transparency must take precedence to ensure fairness, accountability, and ethical alignment.
Hidden algorithms often function as black boxes, making decisions that significantly affect users without clear explanations. This opacity allows systems to perpetuate biases, prioritize engagement over well-being, and manipulate user behavior. For instance, content curation algorithms designed to maximize clicks may amplify sensational or polarizing material, harming societal discourse. The Nordisk Tænketank for Tech og Demokrati (2023) emphasizes algorithmic transparency as a democratic safeguard that keeps platforms accountable to users and the broader public.
Transparency empowers users by making decision-making processes visible and understandable. Platforms should provide clear explanations of how their algorithms work, allowing users to understand why certain content is shown or prioritized. For example, a news platform could include annotations explaining why an article was recommended, such as its alignment with the user's reading history or geographic location; a sketch of what such an annotation could look like follows below. Open access to algorithmic criteria, without exposing proprietary secrets, enables users and researchers to audit systems for fairness and bias.
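To make this concrete, the sketch below shows one way a platform could attach an explanation to each recommendation, so that the "why" travels alongside the "what". It is a minimal illustration rather than a prescription; the names (Recommendation, ExplanationSignal, explainRecommendation) and the list of factors are hypothetical.

```typescript
// A minimal, hypothetical sketch of recommendation explanations for a news
// platform. The point is that the explanation is part of the data the user
// sees, not something hidden in internal logs.

interface ExplanationSignal {
  factor: "reading_history" | "geographic_location" | "followed_topic" | "trending";
  weight: number;          // relative contribution of this factor, 0..1
  description: string;     // human-readable reason shown to the user
}

interface Recommendation {
  articleId: string;
  score: number;
  explanation: ExplanationSignal[]; // surfaced in the UI as an annotation
}

// Render the annotation a user would see next to a recommended article.
function explainRecommendation(rec: Recommendation): string {
  const top = [...rec.explanation].sort((a, b) => b.weight - a.weight)[0];
  return top
    ? `Recommended because: ${top.description}`
    : "Recommended without personalization.";
}

const example: Recommendation = {
  articleId: "article-42",
  score: 0.87,
  explanation: [
    { factor: "reading_history", weight: 0.6, description: "you often read climate coverage" },
    { factor: "geographic_location", weight: 0.3, description: "it covers your region" },
  ],
};

console.log(explainRecommendation(example));
// -> "Recommended because: you often read climate coverage"
```

Exposing this kind of structure in the interface is what turns an internal ranking signal into an explanation the user can actually see, question, and, where appropriate, audit.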
Customization options further enhance transparency and user control. Allowing users to adjust algorithmic settings, for example by prioritizing a chronological feed over engagement-based curation (see the sketch below), gives them greater control over their digital environments. Platforms that embrace transparency can align with user expectations while promoting ethical design principles.
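The same idea can be illustrated for customization. The sketch below assumes a hypothetical FeedSettings value chosen by the user, making the ranking mode an explicit, inspectable choice rather than an opaque default.

```typescript
// A minimal, hypothetical sketch of user-adjustable feed settings.
// The ranking mode is a value the user picks and can change at any time.

type FeedMode = "chronological" | "engagement";

interface Post {
  id: string;
  publishedAt: Date;
  engagementScore: number; // e.g. normalized clicks, likes, shares
}

interface FeedSettings {
  mode: FeedMode; // chosen by the user, persisted with their profile
}

function rankFeed(posts: Post[], settings: FeedSettings): Post[] {
  const ranked = [...posts];
  if (settings.mode === "chronological") {
    // Newest first: no engagement signal involved at all.
    ranked.sort((a, b) => b.publishedAt.getTime() - a.publishedAt.getTime());
  } else {
    // Engagement-based curation: the implicit default on many platforms.
    ranked.sort((a, b) => b.engagementScore - a.engagementScore);
  }
  return ranked;
}
```

The design choice worth noticing is that both modes are named, documented options: a user who prefers a chronological feed is not fighting the system but simply selecting a supported setting.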
The consequences of ignoring transparency in algorithm design are significant. Hidden algorithms can manipulate user behavior, prioritize profit over ethics, and perpetuate systemic biases, ultimately eroding trust in digital systems. Users who feel manipulated or excluded by opaque decision-making processes may disengage, leaving platforms to face reputational damage and regulatory scrutiny. Moreover, the societal impacts of algorithmic opacity—such as the spread of misinformation or the reinforcement of discriminatory practices—are far-reaching and difficult to address retroactively.
By prioritizing transparency, platforms can create systems that are not only more ethical but also more robust and trustworthy. Clear explanations, open standards, and customizable settings empower users to engage actively with technology, fostering a sense of agency and accountability. Transparent algorithms encourage fairness and inclusivity, aligning with societal values and building long-term user loyalty.
Designers, developers, and policymakers must work together to make transparency a foundational principle of algorithmic systems. This requires adopting standards for explainability, creating interfaces that demystify algorithmic processes, and engaging users in meaningful ways. Transparency before hidden algorithms ensures that technology serves society responsibly, supporting fairness, trust, and the broader public good.