US Lawmakers Call for Stringent Regulation Following Taylor Swift Deepfake Case

X (formerly Twitter) posted a statement indicating it is taking down the images and taking action against the accounts spreading the deepfakes.

US legislators are calling for legislation that outlaws the generation of deepfake images after the widespread circulation of explicit fake photos of Taylor Swift. The photos appeared on social media platforms such as Telegram and X (formerly Twitter).

US Lawmakers Push to Criminalize Non-Consensual Deepfakes

In an X post, Joe Morelle (D-NY) strongly condemned the circulation of the images, calling it dreadful. He pointed to the Preventing Deepfakes of Intimate Images Act, a bill he introduced to criminalize the making of non-consensual deepfakes.

He also urged quick action on the matter. Deepfakes use artificial intelligence (AI) to create manipulated images and videos by altering an individual's face or body.


Currently, no federal regulations address the circulation or generation of deepfake images. However, some legislators intend to address the problem.

On X, Representative Yvette Clarke (D-NY) said that Taylor Swift's situation is not new. She noted that the technology has chiefly targeted women for years, and that advancements in AI have made deepfakes easier to generate.

In a statement, the social media platform said it is taking down the images and taking action against the accounts behind their circulation. In the UK, sharing deepfake pornography became illegal under the Online Safety Act.

The move to formulate guardrails tops the recent agenda of US policymakers, given the ethical concerns and potential for misuse when the evolving technology is left unchecked. Beyond deepfakes, legislators are weighing protections against AI-powered cybercrime, hacking, and scams.

Reports Warn of Risks From AI-Generated Content

According to last year's State of Deepfakes report, most online deepfakes involved pornography, and an estimated 99% of the individuals targeted by this kind of content are women.

Concerns regarding content generated using artificial intelligence have risen. In its 19th Global Risks Report, the World Economic Forum (WEF) outlined the negative impacts of AI technologies, describing the intended or unintended adverse effects of developments in artificial intelligence.

Additionally, it examined the consequences of these technologies for people, organizations, economies, and ecosystems. The Canadian Security Intelligence Service, Canada's primary national intelligence agency, has also raised concerns about disinformation campaigns that use AI-generated deepfakes on the internet.

In a June 12 report, the United Nations (UN) warned that AI-generated media poses a significant and persistent threat to information integrity, particularly on social media. According to the organization, rapid technological advancements, especially in generative AI, have increased the risk of online disinformation.



