Computer science is often thought of as purely objective, devoid of ethics. But, as computer science professor Dee Weikle explains, every computer program embodies its writer’s conscious and unconscious values.
The field as a whole is becoming more attuned to ethics in our computer culture, and that discussion also shapes a new course on artificial intelligence taught at Eastern Mennonite University (EMU) by Greg Keim. “The course introduces the core algorithms of AI as well as exploring the human impact and ethics of AI,” said Keim.
“For artificial intelligence, it’s worth thinking about … how can we direct this to be used for the most positive benefit?” says Keim, an adjunct faculty member.
A human-impact approach permeates EMU’s computer science department as a whole. Weikle, for example, is a member of the Association for Computing Machinery’s Special Interest Group on Computers and Society, which addresses matters such as these: the command “control + alt + delete” on a PC requires hand mobility that disabled users may not have, and an ATM screen that asks in English whether the user would like options in Spanish does nothing to bridge a language barrier.
Weikle, who has a PhD in computer science from the University of Virginia and a BS in electrical engineering from Rice, says EMU’s computer science program strives to educate young computer scientists to be aware of such issues.
Professor Dee Weikle says EMU’s computer science program strives to educate young computer scientists to be aware of such issues. (Photo by Jon Styer)
“A much more complete computer scientist” is likely to emerge from EMU’s liberal arts framework – with its variety of classes and cross-cultural study opportunities – than from undergraduate programs that exclusively focus on the technical aspects of information technology, she says.
Modeled on UC Berkeley toolkit
Many of the exercises in Keim’s class use a toolkit from the University of California-Berkeley, which allows students to implement key AI algorithms in the context of the PacMan video game.
First-year Joel Christophel says the course has taught him to employ “ideas about real, hands-on application of more theoretical concepts in artificial intelligence.”
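The Berkeley toolkit’s early exercises center on classic search algorithms that guide Pac-Man through a maze. As an illustration of the kind of algorithm students implement, here is a minimal breadth-first search over a hypothetical grid maze (this is a sketch, not code from the actual toolkit):

```python
from collections import deque

def bfs_path(grid, start, goal):
    """Breadth-first search on a grid of walls ('#') and open cells ('.').

    Returns the shortest list of (row, col) steps from start to goal,
    or None if the goal is unreachable.
    """
    rows, cols = len(grid), len(grid[0])
    frontier = deque([(start, [start])])  # queue of (cell, path-so-far)
    visited = {start}
    while frontier:
        (r, c), path = frontier.popleft()
        if (r, c) == goal:
            return path
        # Explore the four neighboring cells.
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] != '#' and (nr, nc) not in visited):
                visited.add((nr, nc))
                frontier.append(((nr, nc), path + [(nr, nc)]))
    return None  # goal unreachable

# Example: a 3x3 maze with a wall blocking the right column's top rows.
maze = ["..#",
        "..#",
        "..."]
path = bfs_path(maze, (0, 0), (2, 2))
```

Because BFS expands cells in order of distance from the start, the first path it finds to the goal is guaranteed to be a shortest one.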
Keim, who hails from the Harrisonburg area, earned his bachelor’s degree at Swarthmore and soon after joined the growing language-learning software company Rosetta Stone. He was the “stay-up-all-night coder for the first version,” working there until he left to attend Duke University in 1994. Five years later, Rosetta Stone called again, and Keim returned to the Harrisonburg-based company in lieu of finishing his doctorate. Rosetta Stone “was too much work, too much fun.”
As Keim was exiting Duke University’s graduate program in artificial intelligence, the editor of the New York Times crossword puzzles, Will Shortz, declared that no computer could beat even an average human at a Thursday (medium-hard difficulty) crossword puzzle. In response, Keim helped run a seminar at Duke in 1999 to do exactly that – and succeeded.
Achievements such as this tap into the discussion of ethics. How good is technology at performing humanesque tasks? How good can it be? How does that affect people?
Deep learning
These questions are particularly pertinent to current discussions around the field of “deep learning.” Deep learning refers to newer techniques, built on neural networks and machine learning (sub-disciplines within artificial intelligence), that can automatically learn features at different levels of abstraction from data. “It’s become the state-of-the-art technique,” says Keim. Techniques like deep learning offer massive potential to the artificial intelligence field, but as with any technological forefront, tapping it for the good of all requires conscientious foresight.
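At its core, a neural network stacks simple layers, each transforming its input into features the next layer builds on. The sketch below shows a schematic forward pass through a tiny two-layer network in plain Python; the weights here are hand-picked for illustration, whereas deep learning stacks many such layers and learns the weights from data:

```python
def relu(values):
    """Rectified linear activation: negative values become zero."""
    return [max(0.0, v) for v in values]

def dense(inputs, weights, biases):
    """One fully connected layer: out[j] = sum_i inputs[i] * weights[i][j] + biases[j]."""
    return [
        sum(inputs[i] * weights[i][j] for i in range(len(inputs))) + biases[j]
        for j in range(len(biases))
    ]

def forward(x, layers):
    """Pass input x through a stack of (weights, biases) layers, ReLU between them."""
    for idx, (w, b) in enumerate(layers):
        x = dense(x, w, b)
        if idx < len(layers) - 1:  # no activation after the final layer
            x = relu(x)
    return x

# Hypothetical hand-picked weights: 2 inputs -> 2 hidden units -> 1 output.
layers = [
    ([[1.0, -1.0], [0.5, 2.0]], [0.0, 0.0]),  # hidden layer
    ([[1.0], [1.0]], [0.0]),                  # output layer
]
output = forward([1.0, 2.0], layers)  # -> [5.0]
```

In a real deep-learning system the same structure is repeated over dozens of layers and millions of weights, and the weights are adjusted automatically by training on data rather than chosen by hand.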
Keim cites the example of an Amazon.com shipping warehouse. An employee stands at a table while robots bring entire shelves of products to that person, who retrieves an item and packs it while the robot returns and re-sorts the shelf. Humans are currently better at the complex vision and dexterity tasks associated with picking, inspecting, and packing the items. However, ongoing advances in AI prompt the question: what about the day when they are not? What will the economic effects be, and what will our workforce look like?
To be well-rounded, Weikle encourages EMU’s students to double major or minor with computer science. While digital media and business are the most common combinations, she challenges those interested to “pick something that doesn’t use a computer!”
Article by: Randi B. Hagi
Photo by: Jon Styer