Schools, Ransomware, Crypto, and the "Promise" of Imitative AI
There is roughly one cyberattack per day on a K-12 school in the United States. Most, if not all, of these attacks are ransomware attacks — attacks designed to hold a target's information and systems hostage until the attacker is paid. And the attackers almost always demand payment in some form of cryptocurrency, usually Bitcoin. Crypto has been shown to be, in large part, an enabler of criminal enterprises. That is unsurprising, and it tells us a lot about where imitative AI is likely heading.
Crypto was supposed to save the world. It was going to end inflation (never mind that an economy needs at least some inflation to continue to function) and break the power of central banks (though not in a way that encourages democratic influence over the economy). People were going to use NFTs to upend the art world (never mind that NFTs in practice are just pinkie promises that you own a JPEG) and get paid to play video games (never mind that the point of video games is fun, not money). None of these dreams came true, of course.
NFTs have crashed and burned. The crypto-video games have come and gone. It is still extremely difficult to use crypto to pay for anything in the real world. Crypto is subject to wild price swings, making it less a stable hedge against inflation and more a casino. What is thriving, however, is fraud.
It is not just ransomware attacks, though cryptocurrency is the primary driver behind those attacks. If such payments were easier to trace, there would be little incentive to conduct these attacks, against schools or anyone else. The field has been rife with fraud from almost the start. Founders create a new coin, then disappear with the funds. The difficulty of tracing cryptocurrency means that it is heavily used in investment fraud. And, of course, entire exchanges have been shown to be Ponzi schemes.
Essentially, it was clear from the start that crypto was, at best, a means of speculation. It has since devolved into a means of facilitating criminal behavior, but the signs were there from the beginning. Just like they are for imitative AI.
Imitative AI does not have many practical uses, at least not on a commercial scale. Even putting aside the morality of stealing people's work for training data, or of CSAM turning up in training sets, and even putting aside the environmental impact, imitative AI is not succeeding as a business. Most legitimate implementations have had serious problems, adoption is slowing, its use in search has been hilariously horrific, and the products themselves appear to be “useless”. What it is unequivocally good at is disinformation and non-consensual porn.
Imitative AI (and I keep making that distinction because there are other forms of machine learning, what we have collectively decided to call AI, that have brighter, more useful prospects) and crypto are depressingly similar. Both rode an enormous wave of hype driven by tech firms looking to cash in on the next big thing. Both had little to no utility as legitimate businesses. Both were well suited to assisting in specific kinds of crime. Heck, both even had, or have, big-name celebrities shilling for them. The similarities are as unavoidable as they are depressing.
History, it is said, does not repeat, but it does rhyme. And crypto and imitative AI are locked in an iambic pentameter dance of criminality and societal harm. We should learn from crypto and recognize that imitative AI has little to no legitimate uses, at least on a scale that can justify the enormous amounts of capital it takes to train and operate these systems. Unfortunately, we don’t seem to be recognizing the rhyme.
The hype machine trundles on, and precious little is done to regulate the industry. Another rough beast slouches toward, if not Bethlehem, then certainly Silicon Valley. And the rest of us are condemned to have our ceremonies of innocence drowned by the environmental damage and criminality brought on by a tech product with almost no legitimate business prospects.
Again.