University of Siegen linguist James McElvenny, author of A History of Modern Linguistics (Edinburgh University Press, forthcoming 2024), muses on the fact that this question has preoccupied linguists for centuries:
There is quite a bit at stake in entertaining the possibility of linguistic relativity – it impinges directly on our understanding of the nature of human language. A long-held assumption in Western philosophy, classically formulated in the work of Aristotle, maintains that words are mere labels we apply to existing ideas in order to share those ideas with others. But linguistic relativity makes language an active force in shaping our thoughts. Furthermore, if we permit fundamental variation between languages and their presumably entangled worldviews, we are confronted with difficult questions about the constitution of our common humanity. Could it be that there are unbridgeable gulfs in thinking and perception between groups of people speaking different languages?

– James McElvenny, "Our language, our world," Aeon, 15 January 2024
McElvenny takes us on a tour of linguistic theories from the eighteenth century to the present but doesn't come to a strong conclusion, because it is honestly hard to tell whether different word usages in different languages cause people to think differently in significant ways. He does note some interesting passages along the way, though:
For example, early in the twentieth century, there was a moral panic around the use and abuse of language by new media like radio and film, a panic addressed by linguists Edward Sapir (1884-1939) and his student Benjamin Lee Whorf (1897-1941):
The young 20th century saw public discourse perverted by new forms of propaganda, disseminated by such new technologies as radio and film, all of which accompanied and facilitated the catastrophic upheavals of the First World War and the political polarisation that resulted in the rise of totalitarian governments across Europe. There was a desire to break the spell of language, to revolt against its tyranny supporting irrationality and barbarity, and make it the servant of enlightened thought. This sentiment found expression in, among other places, the linguistic turn taken by the incipient analytic philosophy of this period. At the popularising end of the spectrum, innumerable manuals on meaning appeared, such as The Meaning of Meaning (1923) by C K Ogden and I A Richards, Science and Sanity (1933) by Alfred Korzybski, and The Tyranny of Words (1938) by Stuart Chase. This is the world of Orwell's Newspeak, in which language is the master of mind.

– McElvenny, "Our language, our world"
Of course, Orwell's Newspeak was a language of its own, modeled on English but stripped of the words and concepts that would enable criticism of the government. Such a language might seem capable of preventing people from thinking clearly. But, as 1984 itself shows, the approach did not work in Winston Smith's case; hence the story. So far, no government has managed to make a Newspeak stick. Human language is deeper and more basic than that.
McElvenny credits linguist Noam Chomsky (1928–) with revitalizing the idea that we all have the same sort of minds and that language does not really change that:
In his pursuit of 'universal grammar', Noam Chomsky (1928-) strove to re-establish a kind of psychic unity of mankind. The differences between individual languages, on Chomsky's account, are mere phantoms, superficial variations on the same underlying system produced by an innate faculty of language shared by all humans. The linguist's task should not be to meticulously catalogue these variants, but to factor them out and discover the universal principles governing all languages.

– McElvenny, "Our language, our world"
McElvenny also hosts a podcast and writes blog posts on language topics of interest. For example, the "fake news" that many people worry about today is not a new concept:
The first half of the last century saw political polarisation and conflict that led to some of the worst atrocities in human history: global and local wars, persecution, murder and destruction on a scale previously unknown. Accompanying and in no small part facilitating this real violence was the intellectual violence of political propaganda. This was spread through pamphlets, tracts and speeches, but also through the deft exploitation of new technologies such as radio and talking films.

– James McElvenny, "How the language of 'fake news' echoes 20th-century propaganda," The British Academy, 15 Aug 2019
As he tells it, intellectuals strove mightily to reverse the tide. In the end, of course, most people simply became more sophisticated in their approach to the new media. Now we face the same problem with the internet, and the same need for more sophistication in evaluating what we hear.
You may also wish to read: Internet pollution — if you tell a lie long enough… LLMs can generate falsehoods faster than humans can correct them. Later, Copilot and other LLMs will be trained to say no bears have been sent into space but many thousands of other misstatements will fly under their radar. (Gary Smith)