Every child, at least of my generation in the US, learned the story of the Emperor’s New Clothes. A clever tailor sells invisible clothes to the emperor on the fanciful premise that only smart or brave or good people (pick whichever virtue the teller thinks is most important) can see them. The emperor, being functionally stupid because he is the kind of emperor who doesn’t listen to anyone who disagrees with him, of course claims to see the clothes. And since he’s the emperor, everyone else claims to see them too. So he walks around naked until one day a small child or fool or hero (pick whichever figure the teller finds most ironic or important) points at him and goes “Hey, dude’s naked!”
The dialogue varies from version to version.
Three years ago, an AI guru said that we should stop training radiologists, since machine learning (what we used to call AI before AI got all buzz-y) was clearly going to be so much better at radiology than humans. Hasn’t happened yet; hasn’t come close. In fact, we have far too few radiologists. Why? Because radiology is not just about looking at pictures. The context of the imaging matters a great deal to the ability to properly read an image. There are well-known issues of spurious associations: for example, certain kinds of patients only get imaged on portable machines because they are so sick, so the AI learns that images from those machines are more likely to represent a condition. The algorithm devolves into a portable-imaging detector, not an image reader. And even putting those aside, a single image can represent multiple conditions. Doctors and radiologists need to draw on their experience and the context of the patient’s full medical history to determine the true meaning of an image.
Radiological AI, it seems, is not fully dressed.
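The portable-machine failure mode above is a textbook case of shortcut learning, and it is easy to see in miniature. The sketch below is purely illustrative: the probabilities, the `shortcut_model` function, and the data are all invented for this example, not drawn from any real radiology system.

```python
# Hypothetical sketch of the "portable imaging detector" shortcut.
# All numbers here are invented for illustration.
import random

random.seed(0)

def make_cases(n, p_portable_if_sick=0.9, p_portable_if_well=0.1):
    """Each case is (is_portable, is_sick). In the confounded data,
    sick patients are far more likely to be imaged on a portable machine."""
    cases = []
    for _ in range(n):
        sick = random.random() < 0.5
        p = p_portable_if_sick if sick else p_portable_if_well
        cases.append((random.random() < p, sick))
    return cases

def shortcut_model(is_portable):
    # Learns nothing about the image itself: predicts "sick" iff the
    # scan came from a portable machine.
    return is_portable

def accuracy(cases):
    return sum(shortcut_model(p) == s for p, s in cases) / len(cases)

train = make_cases(10_000)              # confounded, like the hospital data
shifted = make_cases(10_000, 0.5, 0.5)  # same disease rate, confound removed

print(f"confounded accuracy:   {accuracy(train):.2f}")    # looks impressive
print(f"deconfounded accuracy: {accuracy(shifted):.2f}")  # back to a coin flip
```

The point of the toy: a model that keys on the confound scores around 90% on data where the confound holds, and collapses to chance the moment it doesn’t, which is exactly the kind of failure an outside researcher has to probe for.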
This lack of clothing was largely discovered by researchers outside the control of the AI companies themselves. That is a growing problem, however, because AI is beginning to price researchers out of the field. In some cases, AI companies simply offer too much money for researchers to turn down, and they join the companies themselves. Those people are effectively removed from the pool of people who can do meaningful contra-research to AI companies’ claims. The cost of compute and data storage is also a factor. Academic and government institutions cannot pay for the compute power and storage that private firms can, putting them at a disadvantage.
Much of this article talks about how this puts public-good AI at a disadvantage, which is true. If we are going to have AI research, we should have it at least theoretically focused on public needs rather than private wealth concentrations. But I am more concerned with how it diminishes our ability to actually understand what is going on in the field. Since AI is clothed in math and algorithms, it is a bit harder to tell when AI is wearing clothes or not. We need outside researchers, people whose salary does not depend on the success of a given company, to be there to point out when AI is undressed.
Because it seems to be naked a lot. The aforementioned radiology failure is but one example. Even the most hyped AI companies aren’t making a profit today, and Google and Amazon have dialed back expectations with their sales teams. The latter might be because enterprise customers aren’t finding these tools very useful. Even in software, a place where generating boilerplate code seems right up imitative AI’s alley, studies have shown that code quality is declining as AI’s reach increases. If AI is wearing clothes, they aren’t very impressive outfits.
We need independent research to sanity-check any technology claims made by businesses. We need independent researchers to ensure that the research is focused on the public good, not just private wealth. All of this applies just as much to AI as it does to any other technology. Regulation is good and needed, but any proper response to the growth of a new technology must include proper funding of adversarial academic institutions to say, in effect, “Hey, dude’s naked.”
Otherwise, we will be subjected to Sam Altman walking down the street, butt naked, with no one to tell him to put some pants on. And that’s not good for anyone.