Unconference Critique Digitale

Panel 1, Session 1- Lausanne

Do., 21.10.21, 13.30 - 14.15

Title: Algorithms as normative tools


Laetitia Gern ([email protected]): Online political discourse, polemics, YouTube

Moritz Mähr ([email protected]) Digital historian

André Cardozo Sarli ([email protected]): Children's Rights and Technology

Wendelin Brühwiler ([email protected]): Historian, Media Historian, 19th century, History of Trademarks, Temporality

Pascal Föhr ([email protected]) Digital Historian: Digital Source Criticism (hsc.hypotheses.org)

Amsler, Claudia ([email protected]): Queering Digital Humanities, Critical Algorithm Studies, Gendered History of Computing, Instagram

Discussion Leader:

André Sarli has been a PhD candidate in Sociology and Children's Rights at the Centre for Children's Rights Studies since January 2019. He is working on the SNF project "The participatory capability of children in street situations in Brazil and China" led by Prof. Daniel Stoecklin. He holds a Master in International Law from the Graduate Institute of International and Development Studies (2017), as well as a Specialisation in International Law from the School of Judges of São Paulo (2015) and a Bachelor in Law from the University of São Paulo (2012). His doctoral research focuses on the interplay of vulnerable childhoods in Brazil, technology appropriation, and the rights of the child in the digital environment.

André said he is at an early stage of his career in technology studies and came mostly to hear what others would share.

Goal of the discussion

The main goal of the discussion was to exchange literature, perspectives, and experiences with normative algorithms. The question was not whether algorithms are normative; that was the assumption of the discussion. Instead, the focus was on explaining examples and sharing current research.

Main challenges by the participants

The ambiguity towards algorithms is challenging: algorithms can be helpful for different research interests, but they can also perpetuate prevailing power mechanisms. It is important to make these normative mechanisms visible. The problem of the "algorithmic bubble" was also discussed.

Why is this an issue in DH?

In DH, research is conducted with algorithms and new algorithms are also programmed, so it is central to be sensitive to their normative power and to avoid possible bias in programming.

Shared Literature

On Google's algorithm: https://backlinko.com/google-ranking-factors ; Vaidhyanathan, Siva: The Googlization of Everything (and Why We Should Worry), Berkeley 2011

Safiya Umoja Noble (2018). Algorithms of Oppression.

On gaming algorithms: Petre, C., Duffy, B. E., & Hund, E. (2019). "Gaming the System": Platform Paternalism and the Politics of Algorithmic Visibility. Social Media + Society. https://doi.org/10.1177/2056305119879995

Tool for analysing ad targeting: https://eyeballs.hestialabs.org/en/

Bechmann, Anja, and Geoffrey C. Bowker. "Unsupervised by Any Other Name: Hidden Layers of Knowledge Production in Artificial Intelligence on Social Media." Big Data & Society 6, no. 1 (January 1, 2019): 1–11. https://doi.org/10.1177/2053951718819569

Dating Privacy collective for understanding dating-app algorithms; this wiki page has extensive information on dating apps, and you can also join the collective: https://wiki.personaldata.io/wiki/Project:Dating_Privacy


The discussion started with short pitches of food for thought.

What is the normative aspect of algorithms? As we know today, algorithms are used in smart contracts, in the judicial system, and in the allocation of social assistance. But how can algorithms influence our lives beyond those situations? Consider, for instance, how businesses and influencers, among others, are regarded as good as dead if they appear on the second page of Google Search results. An entire business sector has grown up around this: Search Engine Optimisation. Anyone who wants to be noticed by Google in their field of action needs to adapt. In its early years, YouTube's algorithm made many people millionaires with certain types of content; now they are struggling to keep their visibility. Once or twice a year the algorithm changes, and so does the users' need to adapt.

The session circled around the very insightful work of Jessica Pidoux, one of the participants, who investigated how dating apps create norms and expectations. She interviewed developers and users and tried to understand how the interface and its variables affect users, and how users interact with the recommender system.

One takeaway from her work is the bidirectional construction of norms. The conventions are built jointly and remain very fluid. The design (what can be traced by an algorithm, the spaces of the app) creates a frame that users can fill in; on the other side, their profiles, contributions, and actions influence how the app will work for the whole community of users.

One participant asked about the specific digital properties of dating apps in comparison with traditional forms of dating. The way people dress and portray themselves in "real life" is also shaped by social norms, and as people engage in dating more and more, they also become better at it, much like users learning and improving their chances of finding a partner in these apps.

This was addressed by pointing to the ways the apps make users portray themselves. There is a declarative aspect to who they are, what their profiles contain, their history, which cannot be seen in everyday dating.

How have people changed their behaviour offline? Users compare the profile to the face-to-face encounter.

One person said they were grateful that algorithms learn our preferences so well and work so effectively, but asked: what about the content that is not being shown?

For Google's algorithms, this does not hold up well, as 80% of the search results are the same for all users. Personalisation is a very expensive and, so far, impractical task. What apps nowadays try instead is to improve their results for all users. Advertising mediated by algorithms, however, relies heavily on personalised targeted ads.

It was remarked that these personalisation algorithms should be placed in the wider context of the structure of internet business, where we are treated as consumers and our data is also a product. The grain of salt comes from the pushes and nudges to maximise engagement and thereby monetise our presence in the application.

WB: Just a quick follow-up: if we take norms to be about desirable behaviour (established a priori, in penal law for instance), many of the phenomena discussed could be qualified as non-normative, for they are processual and lack a normative telos. There are other ways of enforcing conformity, or specific structures of behaviour, than norms: knowledge deprivation, incentives, etc.

(@Laetitia: on the notion of norms in the above-mentioned sense, see Christoph Möllers: Die Möglichkeit der Normen. Über eine Praxis jenseits von Moralität und Kausalität, Berlin: Suhrkamp 2015.)