Index Investing News
Friday, May 8, 2026
Before worrying about a killer bot, regulators must take on human abuse of AI

by Index Investing News
November 10, 2023
in Opinion
Reading Time: 3 mins read


Every week brings a slew of launch announcements in artificial intelligence (AI). Last week, however, was marked by a rush of declarations on how to regulate it. It began with the US springing a surprise: Joe Biden's executive order requiring AI majors to be more transparent and careful in their development. This was followed by the AI Safety Summit convened by Rishi Sunak; attended by 28 countries (China included) and boasting the star power of Elon Musk, Demis Hassabis and Sam Altman, it produced a joint communiqué on regulating Frontier AI. The EU is racing to be next, China has issued rules of its own, and India is making the right noises. OpenAI, meanwhile, has announced a team to tackle superalignment, declaring: "We need scientific and technical breakthroughs to steer and control AI systems much smarter than us."

The race to develop AI has turned into a race to regulate it. There is certainly cause for optimism here: governments and tech companies are awake to the dangers this remarkable technology can pose to humankind, and one cannot help but applaud the fact that they are being proactive about managing the risks. Perhaps they have learnt their lessons from the ills that social media begat and want to do better this time. Hopefully, we will not need an AI Hiroshima before people wake up to its dangers.

However, I am not so sanguine. On closer look, most of this concern and regulation is directed at what is loosely called Frontier AI: a point in the future when AI becomes more powerful than humans and perhaps escapes our control. The Bletchley Park summit was explicit on this; it focused on Frontier AI. The OpenAI initiative is likewise about aligning superintelligent AI with human values, hence the term 'superalignment.' Most of the narrative around regulating AI is fixed on this future worry. My belief, however, is that we need to worry far more about the here and now, and the problems AI already has. Today's large language models (LLMs) often hallucinate, playing fast and loose with the truth. AI-powered 'driverless' cars cause accidents, killing people. Most GenAI models are riddled with racial and gender biases, having been trained on biased datasets. Copyright and plagiarism problems abound, with disgruntled human creators seeking redress in the courts. And the training of these humongous LLMs spews out CO2 and degrades the environment (bit.ly/3QsM2Wx).

Gary Marcus, the noted AI scientist and author, echoes this sentiment: "…the (UK AI) summit appears to be focusing primarily on long-term AI risk—the risk of future machines that might in some way go beyond our control…. We need to stop worrying (just) about Skynet and robots taking over the world, and think a lot more about what criminals, including terrorists, might do with LLMs, and what, if anything, we might do to stop them." (bit.ly/3tQXVy5). A Politico article (politi.co/3tUSzli) has an interesting take: it describes a deliberate lobbying effort by Silicon Valley AI billionaires to get the US government to focus on just 'one slice of the overall AI problem'—"the long-term threats that future AI systems might pose to human survival." Critics say that focusing on this 'science fiction' shifts the policy narrative away from pressing here-and-now issues, ones that leading AI firms might want kept off the policy agenda. "There's a push being made that the only thing we should care about is long-term risk because 'It's going to take over the world, Terminator, blah blah blah,'" says AI professor Suresh Venkatasubramanian in Politico. "I think it's important to ask, what is the basis for these claims? What is the likelihood of these claims coming to pass? And how certain are we about all this?"

This is exactly my point. Instead of doomsday scenarios caused by superintelligence, which have a comparatively tiny probability, we need to focus on the many immediate threats AI poses.

It is not a Terminator robot rising from a data centre that will destroy humanity. More likely, it will be a malevolent state actor using deepfakes and false content at scale to subvert democracy, or a cornered dictator turning to AI-based lethal autonomous weapons to win a war he is losing. Meanwhile, an unbridled race to build ever-larger LLMs will further accelerate global warming. And then there is the danger of a deluge of fake, provocative news turning communities against each other.

AI might not harm us, but a human using AI could. We need to regulate humans using AI, not AI itself.
