To what extent does maintaining a stable planet and stable society depend on our finding the right balance between tech innovation and tech risk? Artificial intelligence, the new holder of the acronym AI, certainly brings existential risks for humanity, as noted with huge concern by both thinktanks and new films. Do you use AI? How do you feel about it? What role(s) do you feel tech solutions generally have in navigating our future, and under what mindsets?
This short and emphatic piece by our valued board and exco member, Prof Eileen Crist, was first published on 7 October 2023 in Earth Tongues, a blog of The Ecological Citizen.
"AI can help writers save time, reduce costs, overcome creative obstacles, and produce diverse and original content for different channels and audiences." - LinkedIn advice regarding the usefulness of AI for writers
No thanks. In the words of Bartleby, the scrivener of Herman Melville's story (written with neither computer nor AI): I would prefer not to.
I don’t have an opinion about AI, especially an opinion about whether AI forecasts the end of the world literally or the end of the world as we know it. Even if I had an opinion on the matter, it would make no difference. Something I’ve noticed, however, is this: Just contemplating AI—pondering its existential nature and the terminus it’s paving—scrambles the mind. I would rather not have my mind scrambled trying to figure out AI, so perhaps better not exert myself to form an opinion.
Not having an opinion, however, does not mean not taking action with respect to AI. This thing—said now to be as smart as Albert Einstein and soon to become a billion times smarter than the average person (whatever that means)—does demand a response. AI’s rash unleashing into the public domain throws down a gauntlet: We can all perceive the provocation if we allow ourselves to pause before AI, and choose an examined stance toward what’s on offer instead of compulsively joining the fray.
If AI is a prodigious crutch for creativity and for “creating content,” we do not want to place ourselves in a position, by dallying with it, to come under its dominion in the blink of an eye. Curiosity killed the cat is often valid counsel. Taking a breather and deciding deliberately vis-à-vis AI is sounder strategy than getting caught in the stampede to adopt every next technology that is hurled into our faces.
Decades back I read Wendell Berry’s essay (freshly republished) “Why I Am Not Going to Buy a Computer,” and I admired his gumption. Despite the fact that I’ve always owned a computer, I have safeguarded one of the blessings that Berry celebrates of not having one: the art of handwriting. Handwriting, among other crafts, is one of nature’s gifts to the human. There’s something exquisite in the eye-hand coordination, the just-right pressing of the ballpoint or fountain pen onto paper, and the consecutive unique inked letters arising on the page. It is sweetly satisfying in a simple way. The art of handwriting comes under the rubric of what philosophers mean with the term Da-sein (Heidegger 2012). Certain things do indeed belong singularly to the human form of life, albeit making us neither special nor superior to those without them.
In the consumer society of mass-produced, uniform (often superfluous and throwaway) commodities, made every step of the way by fossil-fuel powered technologies, human crafts are newly revealed in what has always been their autochthonous aura. The term “artisanal” may be bandied about somewhat indiscriminately, but there’s priceless beauty in what the human eye-hand-mind ensemble (often anchored in tradition and mentorship) is capable of creating.
The technosphere, defined as the total mass of all things manmade, now weighs more than all living things (Stokstad 2020). It has taken over the face of the Earth and remains tenacious in its colonizing march. The technosphere has subjugated land, seas, and animals. It has smashed the atom, disassembled life, and projected itself into outer space. Now, the technosphere wants to take over, to replace, our thinking and our creative expressions; it so innocently offers to “assist in the content creation process.”
Methinks, NO. I do not want to know what AI “thinks.” I especially do not want AI to think or write for me. Additionally, I decide not to consider its input. This position is not motivated by prejudice against machines, nor by attachment to my cherished human distinction from them. Rather, in a world so slavish and reckless in every regard toward technology, with no evidenced capacity for either restraint or free choice, it behooves us to draw personal boundaries mindfully decided.
Setting firm boundaries with respect to technology is eminently defensible. For example, not looking at our devices first thing in the morning and last thing at night is without question the wise choice. “The morning,” wrote Henry David Thoreau in 1854, “is the most memorable season of the day.” It is “the awakening hour.”
“Little is to be expected of that day,” he explained, “if it can be called a day, to which we are not awakened by our Genius, but by the mechanical nudgings of some servitor, are not awakened by our own newly acquired force and aspirations from within, accompanied by the undulations of celestial music, instead of factory bells, and a fragrance filling the air—to a higher life…
All poets and heroes, like Memnon, are children of Aurora, and emit their music at sunrise. To him whose elastic and vigorous thought keeps pace with the sun, the day is a perpetual morning.”
Evening time, too, is holy, an unwinding and preparing toward day’s daily death: to sleep, perchance to dream.
The human nervous system can handle the occasional electrocution—that shrill email, social-media post, or news report that makes you go haywire first thing in the morning or buys you insomnia for most of the night. Frazzle me once, and I can blow it off. Frazzle me twice, and I’ve been warned. Frazzle me thrice, and I either lay down my law or live forevermore under the law of frazzle. Briefly put, better not to look at the devices too early in the morning or past the late afternoon. Better to take some days off. The benefits of intermittent fasting from technology are as wholesome as those that apply to food.
AI is the newfangled technological curveball that invites us to stop—and to resolve if we want to draw a line or to obey its summons. Yet it seems to me that if the studied choice with respect to AI is No, then it is not about moderating its usage but about decisively cutting cords with it. The reason for such intransigent deciding is two-fold: Either the intelligence of AI is purely machinic—i.e. the modality of a complicated albeit deterministic algorithm following an extrinsically governed path (code). Or, the intelligence of AI harbors some form of “silicon consciousness”—i.e. evinces (seemingly) emergent properties of decision-making, judgment, creativity, emotion, and the like, thereby making AI intrinsically unpredictable.
Either way—whether AI is an Automaton or AI is a Golem—it is not something desirable to “think with,” but, on the contrary, something hazardous to be eschewed. I cannot imagine which possibility is worse: to risk being possessed by a machinic intelligence or to risk being possessed by an alien intelligence. Confronted with such a choice, all I can say is, I would prefer not to. I will try to explain.
I begin with the caveat that I recognize it is not kosher to tell (or even advise) other people what to do. But here’s one way to grasp how one’s stance toward AI is fateful: It involves the distinction, skillfully drawn by Martin Heidegger, between mastery and power. Mastery is what Heidegger called “originary.” It supervenes from self-generated predilection, effort, determination, and perseverance with respect to an endeavor. Otherwise put, mastery ensues from cultivated, sustained, and iterative human will power. Mastery demands that you put in “10,000+ hours,” indefatigably, year in and year out, decade in and decade out—not for the sake of mastery as such but for the sake of the endeavor. The rewards of such unremitting endurance arrive long before mastery—if indeed mastery ever arrives. On the other hand, power (Heidegger noted) is never originary. It is defined externally, and its sway is adjudicated within contextual relationship with something or someone other. Instead of being the outcome of a pursuit undertaken for its intrinsic value, power is the purpose of the undertaking pursued. If power is your purpose, be careful what you ask for: for while mastery is always a blessing, the attainment of power tends to be a curse.
Thus, if you are in “the game” for such things as money, fame, office, recognition, security, legacy, and the like, AI might well help you “save time, reduce costs, overcome creative obstacles, and produce diverse and original content for different channels and audiences.” But if mastery is the star on your horizon, your path must be forged from within.
Something remains to be addressed. Namely, a potential rebuttal: Are not machinic technologies, the technosphere, AI, etc., also part of human Da-sein? Some insist the answer is yes, and even call technology “human destiny” and sing its paeans and praises. I concede that all technology is part of our Da-sein, for it originates from nature’s gift of the eye-hand-mind ensemble peculiar to the human. (Other animal species have eye-hand/paw-mind ensembles peculiar to them.) Nevertheless, the conundrum regarding technology lies in its capacity to escape us, and to take on a life of its own. Suddenly and (nearly) without fail, it slips through our fingers and is released into the world. Let’s consider one instance: How bitterly did philosopher Lewis Mumford decry the invasion of the motorcar into urban centers! Was he not right?
Imagine today’s cities car-free, with efficient public transportation for all, bicycle lanes, urban gardens, green spaces, and broad walkways. Wouldn’t such cities (in a hypothetical world where humanity actually possessed free will vis-à-vis technology) have been superior to the ones we have? By far—but the motorcar won.
The broader point is the following. Technology is either something we are in charge of, or it is in charge of us. It just tends to work that way. So what do we decide?
Berry W (2018). Why I Am Not Going to Buy a Computer. Penguin.
Heidegger M (2012). Contributions to Philosophy (of the Event). Indiana University Press.
Mumford L (1968/2010). “The Highway and the City.” In: Hanks C, ed. Technology and Values: Essential Readings. Blackwell: 361–368.
Stokstad E (2020). Human “Stuff” Now Outweighs All Life on Earth. Science December 9. https://www.sciencemag.org/news/2020/12/human-stuff-now-outweighs-all-life-earth
Thoreau H (1854/1991). Walden, or, Life in the Woods. New York: Vintage Books.
Eileen Crist is recently retired from Virginia Tech where she taught for 22 years as a sociology professor specializing in the ecological crises and the rise of an ecological civilization. She is an advocate of plant-based eating and a yoga teacher. Her work can be found on her website: http://www.eileencrist.com/.