People
Beyond bias: Investigating the reproduction of discriminatory language in machine translations
8.30104
Lecturers
Description
“The teacher talked about race in class.” – Would you translate this into German as “Der Lehrer sprach über Rasse im Unterricht”? Prominent AI translation tools do (e.g. Google Translate, DeepL, ChatGPT). In doing so they, firstly, ascribe the (grammatical) male gender to “teacher”. Secondly, the translation is not sensitive to the connotations and discourse associated with the history of “biological races” reflected in the German word “Rasse”, which differs from the use of “race” as an identity category in English.
It is well known that algorithms trained on large amounts of data are prone to reproducing biases. In machine translations, one can observe the reproduction of stereotypes, biases, and discriminatory language across multiple domains. In this course we will examine the literature on machine translation tools and try to find out where the issue originates. Perhaps this will bring us closer to understanding how to resolve it.
Fields of study
- Cognitive Science > Bachelor's program
- Cognitive Science > Master's program
- Human Sciences (e.g. Cognitive Science, Psychology)