AI: The New Plantation Owners
It wouldn’t be unfair to say AI learning has become the act of stealing someone else’s insights, labor, and creativity so that we don’t have to pay a human being what that work is actually worth.
The counter-argument is that a thing is only worth what someone will pay for it, so if you can get it for less, it must not have been worth that much. So it comes down to property. If I walk into your living room and carry off your couch simply because I can do it without paying (why would I bother going to a store?), that extra little point, that you own the damned sofa, that you worked for it, that it’s the fruit of your labor, creativity, and insight, becomes pivotal.
This is why I think of AI learning, as it’s currently being implemented, as a new kind of plantation owner. When authors like George R. R. Martin sue AI firms for teaching their algorithms with pirated ebooks scraped off the web, it’s not unfair to say those firms are acquiring the life’s work of a human being without volition or compensation. It is both compelled and stolen. Therefore, it is slavery.
The fact that we have shifted the fundamental human impulse to command the labor of others without remuneration or choice from the pastoral to the digital is merely the flavor of our era. We once paid musicians to make art. Now we pay people to steal their sample tracks, remix them, and tap a few computer keys over the top. Or we use computer-simulated drum tracks that sound perfect, and therefore anything but human, yet were certainly taught to be drum tracks on the backs of actual musicians making music and innovating rhythms and percussive effects.
Face it: theft of other people’s work, so we can use it to profit ourselves, occurs in every generation under some rubric meant to ‘make it all right’. The plantation model persists among liberals and conservatives, the learned and the ignorant, the smart and the stupid, the enlightened and the barbaric, in Silicon Valley and in El Paso, Texas. The moral clarity available to us is to recognize slavery for what it is.
In the end, we must be offended not at this slavery or that slavery, not at one form or another, one historical instance or another, nor at slavery graded by degrees of pain and misery, as though it were less offensive when the suffering it creates is less. There is no moral cause to be against any given slavery unless we are against slavery itself. Anything less is merely being dramatic, or having a fetish for human suffering.
And that is the crux of the problem here. If we can conceive of a type of slavery in which the bulk of those whose involuntary labor, insight, and creativity are being procured are essentially comfortable, we will build such a thing. More people will accept it for themselves and others. It is more easily normalized. It is harder to condemn if we are left only with slavery as a principle and can’t point to a sufficient degree of misery to elicit sympathy. We stop really caring whether it’s slavery at all—we’re merely upset if someone suffers past a certain threshold.
And that’s the final piece that suggests AI learning is the new plantation system. It’s an indictment of us, of our unprincipled attitude toward human ingenuity, work, and individual vision. It’s an indictment of our attitude toward individuality itself. The master is an algorithm and doesn’t whip anyone or separate families. He’s more clever now. He doesn’t even seem to compel our work. He merely waits for us to make something, build something, create something, innovate something, produce something, and then he shoplifts it for resale after the fact.
Even that is covered up because, of course, if you steal enough soup from enough different kitchens, mix it all together, and ladle it out in fresh bowls, each person gets something “new,” and you can say, “I’m not serving what he made.” Or at least no one can prove that you are. Except, of course, that’s exactly what the lawsuits are about. We actually can prove it, and we will, in enough cases that the embarrassment of grift and soothing subjugation will be outed, and so will you and I for benefiting from it.