A COLLISION OF OPINIONS

John Stuart Mill said that truth is discovered by the “collision of adverse opinions,” and that “he who knows only his own side of the case, knows little of that.” When people listen only to one viewpoint, he explained, “errors harden into prejudices, and truth itself ceases to have the effect of truth, by being exaggerated into falsehood.”

Ian Holmes' essay, recently posted in the Journal, echoed the disengagement thesis we wrote and talked about when we excused ourselves from social media for a few weeks in 2020. I think the hiatus helped us learn to use those platforms for a purpose rather than simply feeding an addiction. Personally, I had to wean myself off the doom scroll and the constant checking; it took several tries, but finally I rejected "the Gram" altogether, though without deleting my account. Perhaps I lack commitment. Regardless, the addiction remains. I simply replaced social media with media, mostly news, which means I now know more about shit that doesn't immediately affect me than I did before, and it isn't making my life any better.

I have learned from reading others' observations and long-form essays, and sometimes this makes me curious enough to surf related essays or buy a book, which is more than social media ever did for me. I like trying to understand things, and people, and behavior ... it helps me make some sort of doom-laden prediction about the future of the species. ;)

In his essay, Ian wrote, "If a person didn’t know what a smartphone was, it would look like the aliens/AI had won." I actually believe that the AI, which I think should be considered the enemy, has already won. We developed technology to assist us, to speed learning, to make work more efficient, to reduce economic costs and environmental impact, to strip the drudgery from daily life and allow immediate flights of fancy toward delicious treats we imagine being entitled to. The technology quickly learned, though, that if engagement was the highest imperative, the prime directive, then tech must address and feed the meatsacks' worst, most venal appetites and behaviors.

Slowly, patiently, the AI replaced ambition, aka the desire to do or become something greater, with indolence and entitlement, creativity with mimicry, real life and face-to-face communication with a distant, poor facsimile, calculated risk with demands for absolute safety, choice with obedience, and personal autonomy with dependence (moral and otherwise). The AI — itself being a binary construct — guided the clumsy, inexact, subtle, sometimes disagreeable but generally warm and open human nature into an either/or condition where what was not right was most certainly wrong (that being the only other option), where what was not good was evil, where my way became the highway, the only way. The AI has rewired much of human consciousness and stripped many of what little sense of personal agency they once had, pacifying them through a little handheld dopamine box. Imagine that a collective shift in consciousness occurred, wherein the majority, who once felt that they could steer the direction and outcome of their lives, suddenly believed they had no control, that circumstances were beyond it, and that someone or some many — those with power, money, influence and so-called expertise — did have dominion, so the majority gave in to it, gave up. This shift from internal to external locus of control, from "I can do it" to "someone or something is stopping me from doing it," spells disaster. Once the AI convinces people that they “can’t do anything right,” or subtly yet relentlessly suggests it, well, the AI and its masters have indeed taken control. And anyone with a little power is tempted to take advantage of those whose locus of control leans toward external.

There is an obvious and constant tension, or antagonism, between liberty and authority, but if governing authority is amorphous — a little nudge here, a small opinion there, with a morass of consensus all around — the tension wanes because there is nothing for the free man to push against or resist. So we acquiesce. We accept. We return to the screen, geofenced and surveilled, and helpless.

Once we give up agency, our personal autonomy, we become much easier to control, er, I mean govern. And using algorithms to suppress that agency by showing us all of the looks and things and status we don't have is a very efficient means of derogating our belief in Self. Once upon a time I had a girlfriend who understood this thesis very well; the less I believed in myself, the more easily she could steer and control me. My climbing partner nicknamed her The Diminisher, and his insight eventually put one more nail in the coffin of that relationship. Young people today need such a close friend and advisor, someone or something to point out how AI-driven social forces clip their wings, and "stop them from doing what they might" by making them believe they can't. In the social cesspool created by these algorithms no one will ever be enough, everyone will always be less-than, and, stripped of all belief in their potential and the ability to achieve it, will become unhappy cogs in the commerce machine.

I happen to think that the commercial aspect of "attention-capturing algorithms" is the least dangerous effect. The more I read things like "preventing free speech is free speech," or "...you have the right to speak but you also have the right to be shut up," i.e. to be compelled into silence in the name of free speech, as quoted in Mark Bray's book on Antifa, the more I see certain fundamental human rights disappearing from practice and losing their importance in the public consciousness. Sadly, the silencer never understands the value of actual free speech — it is the silenced who understand its value best. And if freedom of speech is suppressed by threat of violence, silence compelled by force, it is no longer freedom, and the more accustomed to losing freedom we become, the more authoritarian those with power — of whatever kind — will become. And the more we hand our locus of control to external factors, whether we do it consciously or are seduced into doing so, the less valuable or necessary actual freedom appears to be.

The expression of personal freedom derives from agency over one's own life. Without such agency there can be no freedom, and no sense of it, only the directions given by whoever or whatever holds power. I used to think that only a small number of people would willingly admit, "I just want to be told what to do." Apparently, I used to be an optimist, because these days it appears to me that many more want to be led and told than want to lean into the concept of figuring it out for themselves, and to be free to do so. Of course, freedom does imply responsibility, and maybe that's what people resist; if they take responsibility, they can no longer blame an external someone or something when they fail to do or achieve what they wanted.

The screens may well be at fault here, but I think the over-nerfing of society in general, especially for the young, is equally influential. We went from "safety third" to being so risk-averse that we cannot handle reality without it being interpreted and presented on a screen. It turns out that we can fall out of a tree, break an arm, and survive, having learned the valuable lessons of gravity, branch thickness, tree type and health, etc., along the way. It turns out that we can try and fail, or compete and lose, and not suffer irreparable psychological damage; in fact, these experiences are beneficial. Physical and emotional risk help us develop resilience, and when shielded from such risk we become weak and helpless. Understanding and accepting risk, and taking responsibility for the outcome, helps support long-term survival in ways that a fully nerfed experience can't. I won't say that safety isn't worth pursuing, but I do believe that consequences teach better lessons (in all things) than insincere, positive affirmations recited within an overly protective bubble ever could.

I came across an old adage that suggested it is more useful to "prepare your child for the path" than it is to "prepare the path for your child". It does seem that we should look ahead, assess all of the things and people and behavior our children may eventually experience as adolescents, young adults and proper adults, and do our best to prepare them for this reality. Because if we lie too much in the name of safety, our reward will be to watch them fail, sometimes disastrously so, and witness society continue its decline.

I never attended college, but I always thought it was the place where adolescent ideas were challenged, where students — whose hypotheses and opinions were not yet fully formed — engaged in discourse with other students and more experienced adults as part of their education in a topic relevant to their career and life choices. Some relatively recent college campus behavior made me reconsider that conclusion. I don't see the point of these institutions if students may tell the faculty what they can and can't teach, what words or ideas might harm their fragile sensibilities, and then compel said faculty into acquiescence with threats of violence, social shaming and condemnation. If 66% of students say it is acceptable to shout down a speaker to prevent them from speaking on campus, and 23% say it is acceptable to use violence to stop a campus speech, and these students threaten faculty for even exposing them to ideas inconsistent with their notion of reality, it can't be called education, nor can it prepare students for life in the real world. I think it imperative that we listen to others’ viewpoints, and challenge what we consider to be bad ideas with our better ideas. Obviously, this would require us to listen graciously rather than shouting loudly enough to drown out opposing opinions, but it seems such politesse is hard to come by in the hallowed halls of higher learning.

It appears to me that we have created an environment in which recent generations cannot learn to handle reality, or develop the physical or emotional resilience necessary to survive it. It's not their fault, of course; we older adults are responsible, having decided what they could and couldn't handle during some very important formative years and shielded them from the latter to the greatest possible degree. If simple and minimal exposure to contrary ideas can do them irreparable harm, they certainly won't fare well once they step outside the echo chamber of conditioned irreality. In this sense, the consequences of being "raised like veal" are beginning to show. As a high school senior, I wrote a one-act play about just such a student who, having been taught a "more hopeful reality of human nature", returns to confront his mentor after being brutalized and broken by difficult experience in the real world. After a stilted conversation (writing fictional dialogue is hard) and a 13-minute sword fight, the mentor was less optimistic about human nature, and also no longer breathing.

But perhaps this is what the AI wants: generations of frightened adult-children who can be easily steered toward particular social and political views, and depended upon to throw tantrums when actual adults don't share and follow their “wisdom”. There it is, my doom-laden prediction about the future of the species. It's just, like, my opinion, man, and also a bit tongue-in-cheek, because I don’t believe this is inevitable. Perhaps my commentary is not as well-reasoned as I may eventually make it, but for now I need to move on to a less depressing topic, and perhaps enjoy a little clarifying risk by going climbing.

p.s. some threads to pull:

Isaiah Berlin, "Two Concepts of Liberty"

Wikipedia, "Locus of control"

Greg Lukianoff and Jonathan Haidt, The Coddling of the American Mind

John Stuart Mill, concepts of liberty

Jean M. Twenge, Generations (due 4/25/23)

Rutherford Institute v. Santa Clara County

College Free Speech Rankings

https://www.cagle.com/author/angel-boligan/
