Meredith Broussard, author of Artificial Unintelligence
AI is not Hollywood-style robots. What we have is “narrow AI.” It’s just math.
AI is a branch of computer science. Machine learning is a subfield of AI: computational statistics, making predictions based on past data. When people talk about systems completely driven by AI, with no human intuition involved, that isn't happening.
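A toy illustration of that point (my sketch, not Broussard's): the "model" below is nothing more than a least-squares line fit to past data, and every prediction is just an extrapolation of the historical pattern. The numbers are hypothetical.

```python
# Toy sketch: machine learning as computational statistics.
# A linear model "learns" only the pattern present in past data.
# (Hypothetical numbers, for illustration only.)

past_x = [1.0, 2.0, 3.0, 4.0, 5.0]       # e.g., years of experience
past_y = [30.0, 35.0, 40.0, 45.0, 50.0]  # e.g., observed outcomes

n = len(past_x)
mean_x = sum(past_x) / n
mean_y = sum(past_y) / n

# Ordinary least squares: slope and intercept come from past data alone.
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(past_x, past_y)) \
        / sum((x - mean_x) ** 2 for x in past_x)
intercept = mean_y - slope * mean_x

def predict(x):
    """A 'prediction' is pure extrapolation from historical data."""
    return intercept + slope * x

print(predict(6.0))  # extrapolates the historical trend: 55.0
```

There is no intuition or understanding anywhere in this process; that is the "it's just math" point.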
Techno-chauvinism is the belief that technology is superior to humans; we should instead use the right tool for the job. When we treat tech as superior, we are saying math is better than humans. The people who came up with this idea were white, male mathematicians from elite schools.
People embed their own biases in technology.
Female members of the American Mathematical Society: under 20%.
Word embedding: computer makes associations based on a corpus of text (e.g., occupations associated with “she” or “he”).
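A stripped-down sketch of how those associations arise (mine, not from the talk): real word embeddings like word2vec learn dense vectors, but the underlying signal is co-occurrence in the corpus, which this toy version counts directly. The corpus sentences are made up.

```python
from collections import Counter

# Toy sketch: associations a computer derives from a corpus of text.
# Real embeddings learn vectors; here we just count which occupation
# words co-occur with "she" vs. "he". (Hypothetical corpus.)

corpus = [
    "she is a nurse",
    "she is a teacher",
    "he is a doctor",
    "he is an engineer",
    "she is a nurse",
]

occupations = {"nurse", "teacher", "doctor", "engineer"}
assoc = {"she": Counter(), "he": Counter()}

for sentence in corpus:
    words = sentence.split()
    for pronoun in ("she", "he"):
        if pronoun in words:
            for w in words:
                if w in occupations:
                    assoc[pronoun][w] += 1

print(assoc["she"].most_common())  # occupations the corpus ties to "she"
print(assoc["he"].most_common())   # occupations the corpus ties to "he"
```

Whatever bias is in the text, the counts (and therefore the embeddings) faithfully reproduce.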
Funding fantasies: telling the military we’re going to make computers smarter than humans in order to get funding. The “space elevator” idea.
Positive asymmetry: Nobody wants to be the naysayer. Nobody wants to bring up racism, sexism, privacy.
Video of soap dispenser that doesn’t work with dark-skinned hands. It’s a racist soap dispenser. The creators had a blind spot. We need more diverse teams.
Using technology is not inherently liberating. In fact, sometimes the opposite is true. People who don’t have computers/internet at home are disadvantaged when services go online (especially municipal services, education).
Self-driving cars are a terrible idea. She’s read the code the tech is based on.
Video games have the same problem as the racist soap dispenser. Facial recognition systems do not recognize people with dark skin. They are better with men than women. Self-driving cars are based on the same technology.
What to do? Buy her book.
Understand AI reality.
Differentiate between AI and automation.
Assume discrimination is the default in all automated systems (e.g., AI that decides who gets welfare benefits or a mortgage). If it’s based on past practice, past discrimination is built into the system.
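A minimal sketch of why past practice becomes the decision rule (my hypothetical data and groups, not an example from the talk): a system "trained" on historical approvals simply replays the historical approval rates.

```python
# Toy sketch: a system that learns from past decisions inherits past bias.
# (Entirely hypothetical data, for illustration only.)

# Historical decisions: (group, approved). Group "A" was favored.
history = [
    ("A", True), ("A", True), ("A", True), ("A", False),
    ("B", False), ("B", False), ("B", False), ("B", True),
]

# "Training": approval rate per group, taken straight from past practice.
rates = {}
for group in {g for g, _ in history}:
    decisions = [approved for g, approved in history if g == group]
    rates[group] = sum(decisions) / len(decisions)

def automated_decision(group):
    """Approve when the group's historical approval rate exceeds 50%.
    Past discrimination becomes the system's decision rule."""
    return rates[group] > 0.5

print(automated_decision("A"))  # True  - the favored group stays favored
print(automated_decision("B"))  # False - the disfavored group stays shut out
```

Nothing in the pipeline is malicious; the bias is simply baked into the training data, which is the point of assuming discrimination as the default.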
Recognize ghost work. When you flag something on Facebook, it’s mostly real humans evaluating problematic content.
Avoid tech Columbus-ing. The study of AI is really just cybernetics, which has been around since the 1940s. AI people need to talk to social scientists.
Read up on AI’s social aspects.
Weapons of Math Destruction
Race After Technology
Behind the Screen
Algorithms of Oppression
Twitter and Tear Gas
Talk to people:
Center for Critical Race and Digital Studies
Data for Black Lives
Black in AI
Make sure we can read today’s news on tomorrow’s computers: Turns out the Internet is not forever. Librarian-types need to preserve content.
“The irony of writing online about digital preservation” by Meredith Broussard, Atlantic, 2015. Legacy media organizations fired their news librarians. They thought their CMS would save everything, but the pipes were not hooked up correctly, and no human beings were checking that they were.
Future-Proofing the News by Hansen and Paul.
Cutting-edge digital news is disappearing. Bailiwick, her project to preserve data journalism about the 2016 election, no longer works and had to be taken down.
(Yes, she knows about the Internet Archive. They are good at preserving static web pages, but not anything interactive or streaming.)
Archiving digital content is a human-in-the-loop process. The fantasy is that these things can be made completely autonomous.