Uncommon Descent: Serving The Intelligent Design Community

Bob Marks: Bias is inevitable in AI; time to admit it

He is talking about software engineer Gregory Coppola’s recent revelations about Google and political tampering with search engine results:

Some biases are unintentional. In 2015 Google’s image search software identified a black software developer and a friend as gorillas. Google immediately apologized and did a quick fix on the problem by blocking gorillas and chimpanzees from its image recognition algorithm. Unintentional bias can be fixed when it is identified. But those who have an intentional bias — think of CEOs of cigarette manufacturers testifying at a congressional hearing — can sneakily try to avoid detection and scrutiny.

All computer algorithms are biased by design. The programs are biased to perform whatever tasks programmers tell them to do. The need for bias was first explicitly noted by Tom Mitchell about forty years ago in “The need for biases in learning generalizations.”1 Twenty-five years ago computer scientist Cullen Schaffer noted, in reference to machine learning, “a learner… that achieves at least mildly better-than-chance performance [without bias]… is like a perpetual motion machine.”2 In the case of learning in machine intelligence, the amount of infused bias can be measured in bits.3 Any attempt at machine learning or search engine data mining4 without bias is “futile.”5

Robert J. Marks, “Can computer algorithms be free of bias?” at Mind Matters News

Marks’s point is that such biases are not a matter of villains taking over. They are a normal feature of the way people think. And people program computers. Doubtless, bias also finds its way into the computer simulations that people cite in debates over evolution.
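As an aside, Marks’s claim that infused bias “can be measured in bits” presumably refers to the active information measure developed in the evolutionary informatics literature he cites. Here is a minimal sketch of that idea, assuming the standard definition I+ = log2(q/p); the probabilities below are invented purely for illustration:

```python
import math

def active_information(p_blind: float, p_biased: float) -> float:
    """Bias infused into a search, measured in bits:
    I+ = log2(q / p), where p is the probability that a blind
    (uniform) search hits the target and q is the probability
    that the biased search hits the same target."""
    return math.log2(p_biased / p_blind)

# Hypothetical numbers: blind search succeeds 1 time in 1,024 trials;
# a search biased by its programmers succeeds 1 time in 4.
p = 1 / 1024
q = 1 / 4
print(f"Infused bias: {active_information(p, q):.1f} bits")  # -> 8.0 bits
```

On this view, any learner that beats chance has necessarily been handed a nonzero number of bits of bias by its designers, which is Schaffer’s “perpetual motion machine” point restated.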

The lack of political diversity in Silicon Valley is bound to result in biased searches. Users should take that into account when relying on the Valley’s search products and look for alternative sources of information as well.

Robert J. Marks is an author, with William Dembski and Winston Ewert, of Introduction to Evolutionary Informatics.

See also: Google engineer reveals search engine bias He found Google pretty neutral in 2014; the bias started with the US 2016 election. The algorithms—the series of commands to computers—“don’t write themselves,” Coppola says. People who have their own opinions may write them into an algorithm, knowingly or otherwise.

and

Evolutionary Informatics Has Come A Long Way Since A Baylor Dean Tried To Shut Down The Lab

Follow UD News at Twitter!

Comments
This Google software engineer made a very interesting comment on Tucker Carlson's show the other night about the inherent biases of people being reflected in any software that they may develop.
- Greg Coppola to Tucker: algorithms don't "write themselves." Excerpt: "Basically, any software launch reflects the outcome of thousands of human decisions. If you made different human decisions you would get a different result. And so, if you see a resulting end product that seems to encode a bias of one sort or another, there must have been that bias in the process that produced the end result. Because, like I say, different human decisions that went into the process would produce a completely different result.… In my experience, as algorithms get more complicated and more advanced, that only means that they have more human decisions going into them. So there are actually more opportunities for human beings to influence the final product.… If people aren't able to think critically about all the information that they are being given, especially if there is this kind of illusion that maybe somehow technology exists in a world that is completely apart from humans, that somehow you can create a computer that will think for itself and be free of any human biases, then people can be easily misled or manipulated." - Google Insider, Greg Coppola, Talks Political Bias at Google On Tucker Carlson https://www.youtube.com/watch?v=uu5-VQuFU_g
bornagain77
July 31, 2019 at 04:23 AM PDT
