Uncommon Descent Serving The Intelligent Design Community

Total surveillance should worry us more than an AI news writing machine


This question is especially relevant in an age when the battle for intellectual freedom on campus must be fought anew because the internet can so easily be used to identify, freeze, and punish dissidents.

While it’s unclear that automated news output is a serious threat (relative to a “handmade” one), there are real AI dangers we should be aware of, especially constant surveillance and data gathering:

Because so much surveillance is now possible, the new AI technologies that promised freedom are becoming a threat to it. It’s well known that the Chinese government is using AI to monitor and police the daily lives of citizens. What’s less well-known is that China is exporting the technology for mass surveillance to developing countries as foreign aid. Or that Canada recently demanded intimate banking data from half a million citizens. It’s probably quite easy for governments to convince themselves that they could solve a lot more of everyone’s problems if they just knew what we are all doing all the time. “AI dangers that are not just fake news” at Mind Matters

The most dangerous tyrannies are the ones intended to bring about some public good. It’s the difference between someone who just wants to rob you and someone who wants to fix you and run your life.

See also: US prez Trump vows to tie federal U funding to campus free speech “In an interview after Trump’s speech, Terry Hartle, senior vice president for the American Council on Education, called the executive order ‘a solution in search of a problem,’ because ‘free speech and academic freedom are core values of research universities.’”

Maybe dissent from Darwin can’t kill a career anymore?

See also: Who’s afraid of AI that can write the news? AI now automates formula news in business and sports. How far can it go?

and

Top Ten AI hypes of 2018 (Robert Marks)
