Index Investing News
Deepfake menace: We should aim to curb the Liar’s Dividend

by Index Investing News
January 30, 2024
in Opinion


A couple of weeks ago, a video featuring cricket legend Sachin Tendulkar started circulating on social media. In it, he talks up a mobile game called Skyward Aviator Quest, marvelling at how his daughter is able to make ₹180,000 on it every day and at how an essentially free app makes this possible. While it soon becomes obvious that the video is fake—the words he says don’t always match the movement of his lips, and, given Sachin’s carefully curated persona, these are not the sort of things anyone would expect him to talk about—I can see how a casual viewer might be taken in.

But Tendulkar is only the latest celebrity to have unwittingly starred in a video he did not consent to make. A similar fate has befallen others—cricketer Virat Kohli, actor Shah Rukh Khan, journalist Ravish Kumar and Infosys founder N.R. Narayana Murthy. Last year, south Indian actor Rashmika Mandanna and British-Indian influencer Zara Patel had to suffer the ignominy of having their faces swapped in viral video clips that had clocked over 20 million views. Even Prime Minister Narendra Modi spoke of a video that, he said, featured what seemed like him dancing the Garba.

Deepfakes are not new. But thanks to rapid advances in generative artificial intelligence (AI), they have become much easier to create over the past year or so. What was once a niche capability available only to teams with massive training data sets and advanced programming skills can now be produced by anyone using any of a number of off-the-shelf AI services. What just a year ago was an expensive exercise requiring specialized hardware and considerable technical expertise can now be executed in an hour after watching a few simple YouTube tutorials.

The real worry, of course, is the effect all this will have on society. Given how easy it is to generate videos that portray political candidates in an unflattering light, it seems inevitable that we will see them deployed at scale during elections—both by political opponents and by unfriendly countries that will have no qualms about deploying teams of hackers to destabilize their enemies. With over half the world voting in an election this year, there is serious concern about the effect that deepfake proliferation could have on democracy.

In anticipation, various countries around the world have already begun developing legislative counter-measures. In India, the ministry of electronics and information technology has said it will soon release new regulations aimed at ensuring that the social media platforms through which these videos are disseminated implement appropriate measures to proactively identify and take them down before they spread. But just getting platforms to combat the spread of fake videos more effectively amounts to shooting the messenger. If we want a truly effective solution, we have to get to the heart of the problem—we must find a way to strike at the source from which these videos are generated.

This is easier said than done. With every passing day, it is becoming easier to create believable videos using highly accessible technology. We have already reached a point where the only thing stopping you from creating a deepfake indistinguishable from the real thing is your imagination. And perhaps your conscience.

So what is the solution? When they were first invented, photographs were believed to be incontrovertible. They were mechanical representations of reality and as such trusted to be irrefutable evidence. But, in time, darkroom technicians realized that it was possible to manipulate photographs so the truth could be creatively distorted. Using processing techniques like dodging and burning and elaborate workflows such as double exposures, they were able to create photographs that deviated from reality. And then, once image manipulation software like Photoshop and GIMP became available, nothing was sacred any more.

Today, we no longer trust photographs the way we used to. We have learnt to identify tell-tale signs of manipulation, such as artefacts in the image and barely perceptible ghosts surrounding objects that have been cut and pasted into the frame. So we have something to go by while checking if an image has been tampered with. As a result, when presented with an image that portrays someone in an unusual light, our instinct is to question its veracity because we know how easy it is to manipulate.

I believe that we will inevitably extend the same level of mistrust to the videos we are shown. When presented with a clip of someone saying or doing something out of character, rather than blindly believing the evidence of our eyes, we will wonder whether it is fake. This to me is the only way we can even hope to combat the avalanche of deepfakes that is coming our way. In situations like this, our only inoculation against believable falsehood is healthy scepticism.

But my real worry is what happens after we reach this point. When we doubt all the video evidence we are presented with, anyone who is caught on tape doing something wrong will be able to dismiss the evidence of wrongdoing by claiming it is just another deepfake. This is what law professors Bobby Chesney and Danielle Citron call the Liar’s Dividend. It marks a point in time when evidence can be so easily falsified that nothing can be relied upon to serve as legitimate evidence of wrongdoing. And this is when our real problems will start.



Copyright © 2022 - Index Investing News.
Index Investing News is not responsible for the content of external sites.
