I feel like the general populace might not realize the importance of this idea that @pluralistic shares:
what Dan Davies calls an "accountability sink." The radiologist's job isn't really to oversee the AI's work, it's to take the blame for the AI's mistakes.
this human (sometimes only nominally) in the loop is central to keeping the law as a whole from being broken
it's also of utmost importance for weapons of war, where AI is actually having life-and-death impact right now, with non-hypothetical questions like "how do we make sure the system doesn't kill innocents without repercussions?"
because if there are no repercussions, the system will end up externalizing harm on the way to maximizing other metrics

Hendrik Pfaff 🇪🇺
in reply to datum (n=1): Pharmaceutical companies in the #EU are required by law to have a "Qualified Person" (QP). Their sole purpose is to sign off on the entire (incredibly complex) manufacturing process of the drugs the company produces every day, and to take the blame (i.e., get fired) in case of errors or costly product recalls.
Overseeing and signing off on #ai processes, however, sounds even worse, as it is literally impossible for a person to understand the inner workings of the black box...
Subjacent Banana
in reply to datum (n=1): It's like "self-driving" cars and "computer controlled aircraft" - they always kill the autopilot the minute there's a problem. Boom. Pilot Error.
That Air France flight was a nice case study. The airplane was on autopilot, the pitot tubes froze and confused the autopilot, the autopilot dumped off, and a sleepy crew had to make a quick decision about what the data meant - and took the plane into a death stall. Pilot Error. 🤡
en.wikipedia.org/wiki/Air_Fran…
2009 plane crash of an Air France Airbus A330 in the Atlantic Ocean