The neglect of AI ethics extends from universities to industry
"At least we can rely on universities to teach the next generation of computer scientists to build AI responsibly. Right? Apparently not, according to a new survey of 2,360 data science students, academics, and professionals by software firm Anaconda.
Only 15% of instructors and professors said they’re teaching AI ethics, and just 18% of students indicated they’re learning about the subject.
Notably, the worryingly low figures aren’t due to a lack of interest. Nearly half of respondents said the social impacts of bias or privacy were the “biggest problem to tackle in the AI/ML arena today.” But those concerns clearly aren’t reflected in their curricula."
Tom Vander Ark, Forbes; How To Teach Artificial Intelligence
"Artificial intelligence—code that learns—is likely to be humankind’s most important invention. It’s a 60-year-old idea that took off five years ago when fast chips enabled massive computing and sensors, cameras, and robots fed data-hungry algorithms...
A World Economic Forum report indicated that 89% of U.S.-based companies are planning to adopt user and entity big data analytics by 2022, while more than 70% want to integrate the Internet of Things, explore web and app-enabled markets, and take advantage of machine learning and cloud computing.
Given these important and rapid shifts, it’s a good time to consider what young people need to know about AI and information technology. First, everyone needs to be able to recognize AI and its influence on people and systems, and be proactive as a user and citizen. Second, everyone should have the opportunity to use AI and big data to solve problems. And third, young people interested in computer science as a career should have a pathway for building AI...
The MIT Media Lab developed a middle school AI+Ethics course that hits many of these learning objectives. It was piloted by Montour Public Schools outside of Pittsburgh, Pennsylvania, which has incorporated the three-day course in its media arts class."
David Weinberger, Harvard Business Review; How Machine Learning Pushes Us to Define Fairness
"Even with the greatest of care, an ML system might find biased patterns so subtle and complex that they hide from the best-intentioned human attention. Hence the necessary current focus among computer scientists, policy makers, and anyone concerned with social justice on how to keep bias out of AI.
Yet machine learning’s very nature may also be bringing us to think about fairness in new and productive ways. Our encounters with machine learning (ML) are beginning to give us concepts, a vocabulary, and tools that enable us to address questions of bias and fairness more directly and precisely than before."
Greg Epstein, TechCrunch; Teaching ethics in computer science the right way with Georgia Tech's Charles Isbell
"The new fall semester is upon us, and at elite private colleges and universities, it’s hard to find a trendier major than Computer Science. It’s also becoming more common for such institutions to prioritize integrating ethics into their CS studies, so students don’t just learn how to build software, but whether or not they should build it in the first place.
Of course, this begs questions about how much the ethics lessons such prestigious schools are teaching are actually making a positive impression on students.
But at a time when demand for qualified computer scientists is skyrocketing around the world and far exceeds supply, another kind of question might be even more important: Can computer science be transformed from a field largely led by elites into a profession that empowers vastly more working people, and one that trains them in a way that promotes ethics and an awareness of their impact on the world around them?
Enter Charles Isbell of Georgia Tech, a humble and unassuming star of inclusive and ethical computer science. Isbell, a longtime CS professor at Georgia Tech, enters this fall as the new Dean and John P. Imlay Chair of Georgia Tech’s rapidly expanding College of Computing."