This Recommendation System is Broken

"This Recommendation System is Broken" is part of Curatorial A(i)gents, a series of machine-learning-based experiments with museum collections and data developed by members and affiliates of metaLAB (at) Harvard, a creative research group working in the networked arts and humanities.

Online recommendation systems are information filtering systems that provide users with streams of prioritized content based on expected individual preferences. While they come in different types (collaborative, content-based, or hybrid filtering), they typically share the use of machine learning technologies, forms of artificial intelligence able to predict and profile personal taste. Drawing upon previous research in critical algorithm studies, "This Recommendation System is Broken" is a computational art project that tackles the limitations of predictive content personalization and automated sorting.
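
To make the mechanism concrete, a minimal content-based filtering sketch in Python follows. The artwork identifiers and feature vectors are invented for illustration (this is not the project's code): the system profiles a user from past likes and ranks everything unseen by similarity, which is what "predicting personal taste" amounts to in practice.

```python
# Minimal content-based filtering sketch. The artwork ids and feature
# vectors below are invented for illustration; this is not the project's
# code, only a demonstration of how such systems profile taste.
import numpy as np

# Hypothetical feature vectors (e.g., medium, period, palette).
items = {
    "artwork_a": np.array([1.0, 0.2, 0.0]),
    "artwork_b": np.array([0.9, 0.1, 0.1]),
    "artwork_c": np.array([0.0, 0.8, 0.9]),
}

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def recommend(liked_ids, k=1):
    # Profile the user as the centroid of what they already liked...
    profile = np.mean([items[i] for i in liked_ids], axis=0)
    candidates = [i for i in items if i not in liked_ids]
    # ...then rank unseen items by similarity to that profile: more of the same.
    ranked = sorted(candidates, key=lambda i: cosine(items[i], profile), reverse=True)
    return ranked[:k]

print(recommend(["artwork_a"]))  # -> ['artwork_b'], the most similar item
```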

By using creative coding and guided randomization, this project upends the expectations of carefully automated choice, surfacing suggestions from the vast body of undervalued, hidden, unseen artworks in the museum collection. "This Recommendation System is Broken" ultimately challenges the public to rethink museum collections in terms of visible and in-visible objects, inviting us to explore what we might call "brokenness" in recommendation systems and to reconsider our understanding of marginalized art history.
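
One way to read "guided randomization" in code, assuming a hypothetical collection where each record carries a popularity signal such as a view count (the field names are invented, and this sketch is illustrative only, not the project's implementation): invert the usual ranking logic and sample at random with weights that favor the least-seen objects.

```python
# Sketch of "guided randomization": sample objects at random, weighting
# the least-viewed most heavily. The records and field names below are
# hypothetical; this is an illustration, not the project's actual code.
import random

collection = [
    {"id": "obj_001", "views": 120_000},
    {"id": "obj_002", "views": 35},
    {"id": "obj_003", "views": 2},
    {"id": "obj_004", "views": 9_400},
]

def broken_recommend(collection, k=2):
    # Invert the usual logic: the less an object has been seen,
    # the more likely it is to be surfaced.
    weights = [1.0 / (1 + obj["views"]) for obj in collection]
    # random.choices samples with replacement, which is fine for a sketch.
    return random.choices(collection, weights=weights, k=k)

for obj in broken_recommend(collection):
    print(obj["id"])  # overwhelmingly surfaces the least-viewed objects
```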



Aim and Scope

Beyond more scientific definitions, in popular belief a broken algorithm is essentially one that fails to make accurate predictions. And yet embracing precisely this form of brokenness can lead to new explorations and discoveries, overcoming the limitations of biased machine learning training sets. Drawing upon a critical and ethical approach to algorithms and new media practices, this project starts from the premise that art should not be handled in the same way as finance or marketing. The creation of algorithms for cultural institutions or media services should not be guided by prediction, popularity, or money-making. Finance works at scale; culture doesn't.
Admitting a framework where art and cultural institutions do not need to function at scale is a fundamental step toward decolonizing museums and acknowledging that art collections have very specific identities, histories, and traditions that cannot be treated in a uniform, one-size-fits-all way. We need diversity in algorithms, meaning algorithms tailored to each cultural institution. [...]
“Brokenness” is a no less relative concept than wholeness. But here it is featured rather than concealed. The term becomes a window into the instrumental ethics of algorithms, one that observes how information filtering systems shape culture by reinscribing what is always already preferred.
Giulia Taurino (forthcoming). "The Brokenness in Our Recommendation Systems: Computational Art for an Ethical Use of A.I.," in Alexander Pfeiffer and Alexiei Dingli (eds.), Advances in Intelligent Systems and Computing [Media, Arts and Design | A.I. Conference Proceedings], Springer.
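
The "reinscribing" dynamic named in the passage above can be made concrete with a toy simulation (all values invented): a system that recommends whatever is already most viewed turns a small head start into permanent visibility, while everything else stays unseen.

```python
# Toy simulation of a popularity feedback loop. All numbers are invented.
views = {"obj_a": 10, "obj_b": 9, "obj_c": 8}

for _ in range(100):
    # "Always already preferred": recommend the current leader...
    top = max(views, key=views.get)
    # ...and each recommendation earns it another view.
    views[top] += 1

print(views)  # {'obj_a': 110, 'obj_b': 9, 'obj_c': 8}
```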


Project developed in collaboration with metaLAB (at) Harvard and Harvard Art Museums.
Copyright © 2020 Giulia Taurino. All rights reserved.