Unexpected Gender Stereotypes in AI-generated Stories: Hairdressers Are Female, but so Are Doctors
Publication date
2024
Authors
Abstract
We investigated gender bias in short stories generated by ChatGPT by generating stories about characters with specified occupations and analyzing the gender assigned to these characters. On the one hand, stereotypes about professions typically associated with women are strongly reinforced, with almost all of the characters in these stories being female, well beyond what would be expected based on human biases. On the other hand, among occupations that humans typically associate with men, the generated stories reinforce these stereotypes in some cases (particularly blue-collar occupations), while reversing them to be strongly stereotypically female in other cases (notably highly regarded professions such as doctors, scientists, attorneys, or astronauts).
Keywords
Generative AI; Large Language Models; ChatGPT; Story Generation; Gender Bias
Publisher
RWTH Aachen
Institution
Department
Institute
Document type
Conference paper
Journal/Collection
Text2Story 2024: Narrative Extraction From Texts 2024 = CEUR Workshop Proceedings, Volume 3671
First page
115
Last page
128
Secondary publication
Yes
Document version
Published Version
Language
English
Files
Name
Spillner_Unexpected Gender Stereotypes in AI-generated Stories_2024_published-version.pdf
Size
10.66 MB
Format
Adobe PDF
Checksum
(MD5):d511af8a7da7209fe17bdbff7a89f1dc
