Hicks, Michael Townsen and Humphries, James and Slater, Joe (2024) ChatGPT is bullshit. Ethics and Information Technology, 26: 38. ISSN 1388-1957
Available under License Creative Commons Attribution.
Recently, there has been considerable interest in large language models: machine learning systems which produce human-like text and dialogue. Applications of these systems have been plagued by persistent inaccuracies in their output; these are often called “AI hallucinations”. We argue that these falsehoods, and the overall activity of large language models, are better understood as bullshit in the sense explored by Frankfurt (On Bullshit, Princeton, 2005): the models are in an important way indifferent to the truth of their outputs. We distinguish two ways in which the models can be said to be bullshitters, and argue that they clearly meet at least one of these definitions. We further argue that describing AI misrepresentations as bullshit is both a more useful and a more accurate way of predicting and discussing the behaviour of these systems.
| Title | ChatGPT is bullshit |
| --- | --- |
| Creators | Hicks, Michael Townsen; Humphries, James; Slater, Joe |
| Identification Number | 10.1007/s10676-024-09775-5 |
| Date | 8 June 2024 |
| Divisions | College of Arts & Humanities > School of Humanities > Philosophy; College of Social Sciences > School of Social and Political Sciences > Politics |
| Publisher | Springer |
| Additional Information | A correction to this article was published on 11 July 2024. https://doi.org/10.1007/s10676-024-09775-5 |
| URI | https://pub.demo35.eprints-hosting.org/id/eprint/236 |
| Item Type | Article |
| Depositing User | Unnamed user with email ejo1f20@soton.ac.uk |
| Date Deposited | 11 Jun 2025 16:36 |
| Revision | 17 |
| Last Modified | 12 Jun 2025 10:41 |