An Open Letter to the Knowledge Workers of the World
On Industrial Agriculture, the Mississippi Delta, AI, and the Future of Knowledge Production
Dear knowledge workers of the world,
I write you today as one of your ranks, a graduate of a succession of universities, working my way up to an ominously named terminal degree, and now busy at knowledge work in the academic humanities. But I’m not here to talk about me. I’m here to talk about us, and what’s going to happen to us, or, at least, what is probably going to happen. I know ‘knowledge worker’ is rather vague, but I suspect if you’re part of the group I’m addressing you’ll know more or less what I mean by it. I’d like to follow that with “and here’s what we can do about it,” but I’m not as naive as I used to be. Maybe a gesture in that direction will suffice for now, and anyway it’s all I can summon. But before I get to your situation and particularly the question of AI, we’re going to talk about the Delta.
Unless you’re from Mississippi, and even if you are, odds are good you’ve not spent much time in the Delta, that massive alluvial plain (only small portions of which are bona fide delta, and then of tributary rivers to the Mississippi) that arcs along the northwest corner of the state, a vast expanse of rich soil laid down by the river over tens of thousands of years. The Delta was briefly in the news this last week thanks to devastating tornadoes that ripped across the perfectly flat landscape, completely leveling the town of Rolling Fork. Or, at least, what was left of Rolling Fork—as some news reports described it, there really was only so much there to be destroyed, like so many other small Delta towns and communities. The human poverty in the remaining sad little towns, coupled with the sheer absence of humans across most of the rest of one of the most intensively managed and controlled landscapes in America, is truly striking; it still comes as a shock to me despite my broad familiarity with the region. The agricultural lands remain highly productive, albeit dependent now on fossil fuel inputs for sustained fertility, but that productivity does not translate to economic gains for most of the region’s inhabitants, the overwhelming majority of whom are black; it is also, outside of the scattering of wildlife preserves that retain the pre-settlement bottomland forests, an ecological desert. There are many reasons for this immense poverty in such a rich topography, starting with the initial settlement and cultivation of the Delta, which took place using huge inputs of ‘human resources’ (and animals, but especially humans), initially in the form of black slaves, later in the form of black and white tenant farmers and other landless people. Their labor—labor is too soft and bloodless a word for what hoeing and picking massive fields of cotton under the Mississippi sun entailed—and their disenfranchisement and dispossession cleared the way for the spectacular degree to which Delta agriculture has been consolidated, corporatized, and mechanized.
Today, one of the biggest factors in the continued poverty of Delta communities is the ‘progress’ of modern agriculture there, rooted in the fact that modern agriculture largely does not need humans, nearly all of its functions having been replaced by enormous, often automated, fossil fuel-powered machines. The workers rendered redundant by the post-war dominance of the machine did not have places to go, and the continued stream of wealth from the soil they had worked did not translate down to them. Instead, the march of agriculture in the Delta and around the world has been towards greater and greater reduction of the human element, or of anything not narrowly productive and efficient. The dream of corporate agriculture, in fact, is quite literally to have robots raise and harvest everything, and, probably, to eventually have the robots repair the other robots, human workers rendered almost completely redundant. In other words, far from being a backwards place, the Delta is a vision of the future as dreamed by the technologists and capitalists and the like, built upon the hard labor of men and women whose historical traces have all but disappeared, both from the literal landscape and from wider memory and consciousness.
The Delta is but one example of many we could bring up: here in the greater Chattanooga area, as in so many American cities outside of a handful of thriving centers, the city itself and its hinterland are littered with the ruined hulks of industry, of factories small and great that once hummed with activity; while hardly paragons of virtue (the heavy metals and other elements they churned out are still leaching out of the soil and into the water in many places), they were local economic drivers, and supplied good jobs with stability and predictability. Many of the displaced black farmers fleeing the Delta of Jim Crow found industrial jobs in the booming cities of the North; the work was hard but the pay was good, and while racism existed it was possible to work and dream beyond the narrow range then current in most of the South. Today—well, things are different. Globalization, neoliberalism, automation, lots of culprits and dynamics, take your pick. ‘Capitalism,’ again, works pretty well as an analytical term: the demand for efficiency and profit, consequences to humans and communities and ecosystems be damned. It’s not the only factor, but it’s pretty important too that the industrial capitalism of the Delta and of the post-industrial wasteland is a system fueled, thus far anyway, by cheap and abundant fossil fuels and by the drawing down and spitting out of other ‘resource bases,’ indeed by the reduction of humans and soil and forests and cultures and knowledge itself to mere ‘resources’ to be strip-mined and processed, all with the seemingly unlimited power unleashed by the derivatives of long-dead plants mined or pumped or fracked out of the earth itself.
What does this have to do with knowledge workers and with AI? Very simply that at root what AI might well make possible is the all but total mechanization and automation of areas of work and life that we hitherto thought immune, the sorts of things we spend years of our adult lives in school training for and picking up credentials in; and that the economic, ecological, and other buildups and paths necessary for such a takeover have been long in the making. I want to make the very simple suggestion that we in our various fields and ways of life are just as likely as not to end up like agriculture or American manufacturing, and that most of the discourse around AI seems to avoid the basic fact of what kind of economic system it exists within. It is more exciting, I suppose, to imagine some sentient AI emerging from the ether and ‘deciding’ to wipe us all out a la Skynet than to confront the idea of AI simply becoming a new tool for displacement, replacement, and further economic consolidation of power and wealth.
AI is genuinely complex and not fully understood by anyone, and means different things in different situations—the machine learning work I am a part of, for instance, has much more limited goals and a far more restricted range of use than what, say, Google or Microsoft or whoever is working on, to put it mildly (we have the advantage of being somewhat removed from the direct stream of capitalist control, though it’s a relative removal, not an absolute one, and is certainly far from ideal in many ways given our dependence on external funding and the like). That said, at its root, there are some disquieting aspects of how AI—meaning here and in the following primarily large language models—works that not coincidentally echo back to how industrial capitalism as a whole has long worked. One of the most striking is the general dependence of the large language model on large bodies of ‘training data,’ supervised or unsupervised: for something like ChatGPT to produce comprehensible essays or idiomatic translations, it needs a huge trove of human-produced text, which it can reduce and examine and render into a body of quantified relations and so spit out new text, fueled, as it were, by the old. The human cultural production of years past, insofar as it has been made digitally legible, becomes a vast resource base, not very different from alluvial soils or shale sands or coal deposits. It’s not a perfect analogy, of course, and it should be remembered that AI still needs good old-fashioned power sources, and in non-trivial amounts (advanced computing is not cheap or light on energy usage), but I think it’s a potentially useful way of thinking.
And while a large language model, unlike fossil fuels, which once consumed can’t be restored, doesn’t destroy the cultural resources upon which it draws, it can potentially undermine the further generation of those resources. Robust AI is likely to create strong path dependencies within our actual existing situations, as people and organizations and larger entities become both dependent upon its services and accustomed to its shortcomings, accepting its peculiarities and failures as the cost of speed, efficiency, and the like. Over time this is likely to lead to the atrophying of the very skills and abilities that went into creating the ‘fossil’ reserves of cultural production upon which AI feeds: why learn to write or code or read a book when a machine can do it passably well for you? Without going too far down this rabbit trail, we need only look at the effects that digital technology generally has had on us as writers and thinkers and the like; while it’s not all bad, we’d be lying to ourselves if we omitted what are increasingly quantifiable instances of loss of function and ability thanks to how we use digital tools and devices (probably especially devices…). The products of AI seem easily poised to do similar work, and to be used towards ends we would not ourselves actually prefer—and that’s even with the real shortcomings and limitations of the technologies themselves.
See, here’s the thing that I have yet to see raised in the conversation around machine learning, AI, large language models, and the rest of that complex: it’s true that what a scholar trained deeply in a field of humanistic inquiry can do in terms of research, critical analysis, and communication is and will almost certainly always be qualitatively different from what an AI model can generate. But at a certain level, that truth does not matter. The agricultural model that prevails in the Mississippi Delta and all over the world now is not fundamentally better than other models on all relevant points; in reality it is fundamentally unsustainable, ecologically destructive, often downright economically—in conventional terms even—irrational, dependent on state subsidies, and utterly reliant above all on massive, massive inputs from fossil fuels at every single level. It is very good at producing a narrow range of agricultural commodities for sale on a particular sort of capitalist market, and within our capitalist system as currently configured that is really all that matters.
Likewise, I’ve little doubt that the products once produced in the ruins of industry here in and around Chattanooga—and all over ‘post-industrial’ America—were objectively higher quality and more durable than those which replaced them in the era of outsourcing and globalization. But again it did not and does not matter. Cheapness and efficiency trump all, and just as over time people have become fully accustomed to the inferior but cheap and abundant products of industrialized agriculture, they have become accustomed to the cheap and abundant if inferior and short-lived products of globalized manufacturing. Likewise, agricultural industrialization and globalization did not just emerge out of nowhere, but followed long-building trajectories, political systems, public myths, and the like. Perhaps a bit closer to home, we all know, I think, that doing anything on a smartphone is objectively and quantifiably less enjoyable and less productive than using an actual computer, or a book, or what have you, yet… we prefer the smartphone, again and again.
The economic wasteland (for the majority of its inhabitants anyway) of the Delta took its particular shape because of decades of undermining and displacement of black farmers well before mechanization, with most actual farmers already robbed of land and political power and often driven from the region even before its full transformation through mechanization and automation. The technology built upon and cemented existing situations, ultimately becoming the central driver of things, but not at first. As I like to stress, and will stress again, technology can only become hegemonic and transformative when the conditions are right, when the ground is already prepared. These things are not automatic, until, of course, they are, and for reasons that can be discerned and analyzed.1
There is no reason to think that the world of knowledge production and services will be any different from that of agriculture or manufacturing or what have you, and indeed in the very language I’ve just used—‘production and services’—we can see that the game is already basically lost, we’ve all but forfeited the field. We’ve already accepted the terms of the arrangement, we’ve reduced writing and cultural production to just that, products to be churned out and consumed, with the university as the final stage in the preparation of ‘marketable’ skills and consumer formation. Makers, buyers, consumers, everyone, we’re accustomed already to demand and to consume and to plan stuff that sounds like it could have been generated by AI, and now the AI is here ready for its niche. Perhaps we could imagine a world in which that didn’t mean displacement and atrophying of skills and virtues and abilities, but we still live in the same world in which agriculture was industrialized and automated, in which manufacturing was automated and outsourced and deunionized. What happened to the family farm, what happened to the union factory job, can, and probably will, happen to the university, and to every field of knowledge work and production possible (and probably some in which even by capitalist logic it doesn’t actually make sense).
I can’t predict the future, obviously, in that I cannot say whether AI will become the knowledge-worker, white-collar equivalent of industrialization and automation in other fields. Maybe it won’t—though most people who have argued it won’t have predicated their arguments upon the shortcomings of AI, which, as I’ve argued above, may not really matter all that much. Most likely, AI will simply be one tool in a larger expanding tool kit that drives the further downgrading of the university and other centers of scholarly thought and work, and similar trends in other areas of ‘knowledge work,’ from entertainment to advertising to grant writing. If you thought you were safe, that your job and your skill set were irreplaceable, think again—the odds are not on your side, and that’s even apart from what the tools and processes that emerge might end up doing to us ourselves (if we let them, and if we permit the necessary socio-cultural frameworks).
Dear knowledge workers, I’m not going to end on a happy note or a revolutionary one. It would be nice to be able to now offer some political program, to make a play on the ‘workers of the world unite’ bit, or otherwise rehash some old leftist or straight-up Marxist stuff and get your blood stirring. But while my analysis has pretty obviously benefited from those old traditions, I don’t think they offer us very much today, and anyway the global historical track record of existing alternatives to capitalism isn’t very illustrious (in part—the reasons are many, obviously—because most twentieth-century leftists bought into the same basic myths as the capitalists). We’re not going to get rid of AI, one way or another—and it certainly can have its place as a powerful and useful tool within functioning and healthy knowledge and market ecosystems, as it were. Like so many of the tools and bodies of knowledge of industrial modernity, we can in fact imagine—and realize, to an albeit currently limited degree—far different worlds built with and using these tools, and for as long as we have access to such things we should make responsible, critical usage of them. AI can be a very potent tool for making sense of the sheer scale of production and accumulation and feedback effects of industrial modernity itself, and could in the right circumstances actually reduce unnecessary human labor without also reducing other things (such as decent wages and living conditions, say). There are plenty of David Graeber’s ‘bullshit jobs’ whose replacement by AI would not cause any net loss to humanity—but that assumes the displaced workers have something better to do and somewhere better to go, that somehow the arc of capitalism will automatically be nicer to them than it was to farmers and factory workers.
We’re not going to get rid of capitalism any time soon, and anyway we’d likely be no better off with whatever would immediately emerge to replace it. Our first task must be clarity of understanding, a certain cold-eyed realism about what is happening, about the actual contours of our world, and what is possible in the moment. We can then take the failure of the great totalizing alternative systems of previous generations as an opportunity, not a cause for despair: we have numerous advantages in terms of knowledge, accumulated experience, and the very real largess of the capitalist economy to which those of us not living in places like Rolling Fork still have access. Individual and collective experimentation and building of alternatives can happen now, and, I’m happy to say, is happening now. Odds are good that some people in or near your physical neighborhood are already trying something different, exploring other ways of living and being in the world—if nothing else, seek them out, try something yourself!
The capitalist use of AI will almost certainly be unsustainable in the long term, dependent as it is on the continued creativity and vitality of the very areas of human work and thought which it works to undermine. It is even possible that the capabilities of AI will undermine the structure of the internet itself (imagine weaponized AI turned, deliberately or inadvertently, against the architecture of the digital world), and so generate massive crisis and perhaps fundamental transformation. Perhaps—but I don’t suggest waiting around to find out.
1. It’s really important to stress this point: social systems and norms, political processes and structures, even climatic and ecological factors, are all crucial in whether a given technology or economic configuration will emerge and whether it will become dominant and causative in its own right. Attributing initial causation to the technology is often fallacious, but the technology can transition into an actively causative role: for instance, we might argue that the rise of social media did not cause the crisis of loneliness and depauperate sociability, that it was instead the pre-existence of such conditions that gave social media its huge boost, and yet that social media has now become a factor perpetuating and giving particular contours to that crisis and to the further atrophying of sociability.