Deepfake menace: We should aim to curb the Liar’s Dividend

by Index Investing News
January 30, 2024
in Opinion


A couple of weeks ago, a video featuring cricket legend Sachin Tendulkar started circulating on social media. In it, he talks up a mobile game called Skyward Aviator Quest, marvelling at how his daughter is able to make ₹180,000 on it every day and at how all this is possible using an app that is essentially free. While it soon becomes obvious that the video is fake—the words he says don’t always match the movement of his lips, and, given Sachin’s carefully curated persona, these are not the sort of things anyone would expect him to talk about—I can see how a casual viewer might get taken in.

But Tendulkar is only the latest celebrity to have unwittingly starred in a video he did not consent to make. A similar fate has befallen others—cricketer Virat Kohli, actor Shah Rukh Khan, journalist Ravish Kumar and Infosys founder N.R. Narayana Murthy. Last year, south Indian actor Rashmika Mandanna and British-Indian influencer Zara Patel had to suffer the ignominy of having their faces swapped in viral video clips that had clocked over 20 million views. Even Prime Minister Narendra Modi spoke of a video that, he said, featured what seemed like him dancing the Garba.

Deepfakes are not new. But thanks to rapid advances in generative artificial intelligence (AI), they have become much easier to create over the past year or so. What was once a niche capability, available only to teams with access to massive training data sets and advanced programming skills, is now within the reach of you and me through any one of a number of off-the-shelf AI services. What even a year ago was an expensive exercise calling for specialized hardware and considerable technical expertise can now be executed in an hour after looking up a few simple YouTube tutorials.

The real worry, of course, is the effect that all this will have on society. Given how easy it is to generate videos that portray political candidates in an unflattering light, it seems inevitable that we will see them deployed at scale during elections—both by political opponents and by unfriendly countries that will have no qualms about fielding teams of hackers to destabilize their enemies. With over half the world voting in an election this year, there is serious concern about the effect that deepfake proliferation could have on democracy.

In anticipation, various countries around the world have already begun developing legislative counter-measures. In India, the ministry of electronics and information technology has said it will soon release new regulations aimed at ensuring that the social media platforms through which these videos are disseminated implement appropriate measures to proactively identify and take them down before they spread. But just getting platforms to combat the spread of fake videos more effectively amounts to shooting the messenger. If we want a truly effective solution, we have to get to the heart of the problem—we must find a way to strike at the source from which these videos are generated.

This is easier said than done. With every passing day, it is becoming easier to create believable videos using highly accessible technology. We have already reached a point where the only thing stopping you from creating a deepfake that is indistinguishable from the real thing is your imagination. And perhaps your conscience.

So what is the solution? When they were first invented, photographs were believed to be incontrovertible. They were mechanical representations of reality and as such trusted to be irrefutable evidence. But, in time, darkroom technicians realized that it was possible to manipulate photographs so the truth could be creatively distorted. Using processing techniques like dodging and burning and elaborate workflows such as double exposures, they were able to create photographs that deviated from reality. And then, once image manipulation software like Photoshop and GIMP became available, nothing was sacred any more.

Today, we no longer trust photographs the way we used to. We have learnt to identify tell-tale signs of manipulation, such as artefacts in the image and barely perceptible ghosts surrounding objects that have been cut and pasted into the frame. So we have something to go by while checking if an image has been tampered with. As a result, when presented with an image that portrays someone in an unusual light, our instinct is to question its veracity because we know how easy it is to manipulate.

I believe that we will inevitably extend the same level of mistrust to the videos we are shown. When presented with a clip of someone saying or doing something out of character, rather than blindly believing the evidence of our eyes, we will wonder whether it is fake. This to me is the only way we can even hope to combat the avalanche of deepfakes that is coming our way. In situations like this, our only inoculation against believable falsehood is healthy scepticism.

But my real worry is what happens after we reach this point. When we doubt all the video evidence we are presented with, anyone who is caught on tape doing something wrong will be able to dismiss the evidence of wrongdoing by claiming it is just another deepfake. This is what law professors Bobby Chesney and Danielle Citron call the Liar’s Dividend. It marks a point in time when evidence can be so easily falsified that nothing can be relied upon to serve as legitimate evidence of wrongdoing. And this is when our real problems will start.


