
How AI could take over elections and undermine democracy



Cambridge (USA), June 5 (The Conversation) Could organizations use artificial intelligence language models such as ChatGPT to induce voters to behave in specific ways?

Senator Josh Hawley put that question to OpenAI CEO Sam Altman at a May 16, 2023, US Senate hearing on artificial intelligence. Altman replied that he was indeed concerned that some people might use language models to manipulate, persuade and engage in one-on-one interactions with voters.


Altman did not elaborate, but he might have had a scenario like this in mind: imagine that political technologists soon develop a machine called Clogger, a political campaign in a black box.

Clogger would relentlessly pursue just one objective: to maximize the chances that its candidate (the campaign that buys the services of Clogger Inc.) prevails in an election.


While platforms like Facebook, Twitter, and YouTube use forms of AI to get users to spend more time on their sites, Clogger’s AI has a different goal: to change how people vote.

How Clogger Would Work

As political scientists and legal scholars who study the intersection of technology and democracy, we believe that something like Clogger could use automation to significantly increase the scale and potential effectiveness of behavioral manipulation and microtargeting techniques used by political campaigns since the early 2000s.

Just as advertisers now use your browsing and social media history to individually target commercial and political ads, Clogger would pay attention to you individually, and to hundreds of millions of other voters.

It would offer three advances over the current state of the art in algorithmic behavior manipulation.

First, its language model would generate messages tailored to you personally: texts, social media posts and emails, perhaps including images and videos. Whereas advertisers strategically place a relatively small number of ads, language models such as ChatGPT can generate countless unique messages for you personally, and millions more for other voters, over the course of a campaign.
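To make the scale point concrete, here is a minimal, purely illustrative sketch of per-voter message generation. The voter profiles, the build_prompt helper and the generate function are hypothetical stand-ins invented for this example; no real model API or campaign system is assumed.

```python
# Illustrative only: hypothetical per-voter prompt templating.
# generate() is a stand-in for a call to a text-generation model.
from dataclasses import dataclass

@dataclass
class VoterProfile:
    name: str
    interests: list[str]
    channel: str  # e.g. "email", "sms", "social"

def build_prompt(voter: VoterProfile) -> str:
    # One unique prompt per voter is what lets message volume scale
    # far beyond hand-written ad copy.
    return (f"Write a short {voter.channel} message for {voter.name}, "
            f"who cares about {', '.join(voter.interests)}.")

def generate(prompt: str) -> str:
    # Placeholder for a language model call.
    return f"[model output for: {prompt}]"

voters = [
    VoterProfile("A. Smith", ["local schools", "youth sports"], "email"),
    VoterProfile("B. Lee", ["small business", "traffic"], "sms"),
]
for v in voters:
    print(generate(build_prompt(v)))
```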

Second, Clogger would use a technique called reinforcement learning to generate a succession of messages that become increasingly likely to change your vote.

Reinforcement learning is a trial-and-error machine-learning method in which a computer takes actions and gets feedback about which ones work better, in order to learn how to accomplish a goal. Machines that can play Go, chess and many video games better than any human have used reinforcement learning.
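As a rough illustration of that trial-and-error idea, the sketch below frames it as a simple multi-armed bandit, one of the most basic reinforcement-learning setups: an agent repeatedly picks one of a few message variants, observes a simulated response, and gradually favors whichever variant has worked best. The variants and response rates are invented for the example; nothing here describes any real system.

```python
import random

# Hypothetical message variants and their (invented) true response rates.
MESSAGES = ["variant_a", "variant_b", "variant_c"]
TRUE_RESPONSE_RATES = {"variant_a": 0.02, "variant_b": 0.05, "variant_c": 0.03}

def get_feedback(message: str) -> int:
    """Simulated environment: 1 if the fictional recipient responds, else 0."""
    return 1 if random.random() < TRUE_RESPONSE_RATES[message] else 0

def run_bandit(steps: int = 10_000, epsilon: float = 0.1) -> dict:
    counts = {m: 0 for m in MESSAGES}
    values = {m: 0.0 for m in MESSAGES}  # running estimate of each variant's reward
    for _ in range(steps):
        # Explore occasionally; otherwise exploit the best-looking variant.
        if random.random() < epsilon:
            choice = random.choice(MESSAGES)
        else:
            choice = max(values, key=values.get)
        reward = get_feedback(choice)
        counts[choice] += 1
        # Incremental average of observed rewards for the chosen variant.
        values[choice] += (reward - values[choice]) / counts[choice]
    return values

if __name__ == "__main__":
    # Estimates converge toward the true response rates, so the agent
    # ends up sending the most effective variant most of the time.
    print(run_bandit())
```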

Third, over the course of a campaign, Clogger's messages would evolve to take into account your responses to its previous dispatches and what it has learned about changing other people's minds.

Over time, Clogger would be able to carry on dynamic "conversations" with you, and with millions of other people in parallel. Its messages would be similar to ads that follow you across websites and social media.

The nature of artificial intelligence

Three other features (or bugs) are worth noting.

First, the messages that Clogger sends may or may not be political in content. The machine's sole objective is to maximize vote share, and it would likely devise strategies to achieve that goal that no human campaigner would have thought of.

One possibility is sending likely opposition voters messages about their nonpolitical passions, such as sports or entertainment, to bury the political messages they receive.

Another is sending off-putting messages, such as incontinence advertisements, timed to appear alongside the opponent's messaging. And another is manipulating voters' social media friend groups to give the sense that their social circles support its candidate.

Second, Clogger has no regard for the truth. Indeed, it has no way of knowing what is true or false. Language model "hallucinations" are not a problem for this machine, because its objective is to change your vote, not to provide accurate information.

Third, because it is a black-box AI, people would have no way of knowing what strategies it is using.

Clogocracy

If the Republican presidential campaign were to deploy Clogger in 2024, the Democratic campaign would likely be compelled to respond in kind, perhaps with a similar machine. Call it Dogger.

If the campaign managers believed these machines to be effective, the presidential contest might well come down to Clogger vs. Dogger, with the winner being the client of the more effective machine.

Political scientists and pundits would have much to say about why one or the other AI prevailed, but likely no one would really know. The president would have been elected not because his or her policy proposals or political ideas persuaded more Americans, but because he or she had the more effective AI.

The content that won the day would have come from an AI focused entirely on victory, with no political ideology of its own, rather than from candidates or parties.

In this very important sense, a machine would have won the election rather than a person. The election would no longer be democratic, even though all the ordinary activities of a democracy (the speeches, the ads, the messages, the voting and the counting of votes) would have taken place.

The AI-elected president could then go one of two ways. He or she could use the mantle of election to push Republican or Democratic policies. But because party ideals may have had little to do with why people voted the way they did (Clogger and Dogger don't care about policy views), the president's actions would not necessarily reflect what voters want.

Voters would have been manipulated by the AI rather than freely choosing their political leaders and policies.

Another path would be for the president to pursue the messages, behaviors and policies that the machine predicts will maximize his or her chances of re-election.

On this path, the president would have no particular platform or agenda beyond maintaining power. Guided by Clogger, the president's actions would be those most likely to manipulate voters rather than serve their genuine interests, or even the president's own ideology.

Avoid Clogocracy

It might be possible to avoid AI manipulation of elections if candidates, campaigns and consultants all forswore the use of such political AI. We believe that is unlikely.

If politically effective black boxes were developed, the temptation to use them would be almost irresistible. Indeed, political consultants might well see using these tools as required by their professional responsibility to help their candidates win.

And once a candidate has used such an effective tool, it’s hard to expect an opponent to unilaterally disarm.

Stronger privacy protections would help. Clogger would depend on access to vast amounts of personal data to target individuals, craft messages tailored to persuade or manipulate them, and track and retarget them over the course of a campaign.

Every bit of that information that companies or policymakers deny the machine would make it less effective.

Another solution lies with election commissions, which could try to ban or strictly regulate these machines. There is a fierce debate about whether such "replicant" speech, even if it is political in nature, can be regulated.

America's extreme free speech tradition leads many prominent scholars to say that it cannot.

But there is no reason to automatically extend the First Amendment's protection to the products of these machines. The nation might well choose to extend such protections to machines, but that should be a decision grounded in the challenges of today, not the misplaced assumption that James Madison's views in 1789 were intended to apply to AI.

European Union regulators are moving in this direction. Policymakers have revised the European Parliament's draft of its artificial intelligence law to designate "artificial intelligence systems that influence voters in election campaigns" as "high risk" and subject to regulatory scrutiny.

One constitutionally safer, if smaller, step, already adopted in part by European internet regulators and in California, is to prohibit bots from passing themselves off as people.

For example, regulation could require that campaign messages carry disclaimers when the content they contain is generated by a machine rather than a human.

This would be like the existing advertising disclaimer requirement, "Paid for by the Sam Jones for Congress Committee," but modified to reflect its AI origin: "This AI-generated ad was paid for by the Sam Jones for Congress Committee."

A stronger version could require: "This AI-generated message is being sent to you by the Sam Jones for Congress Committee because Clogger has predicted that doing so will increase your chances of voting for Sam Jones by 0.0002%."

At the very least, we believe voters should know when a bot is speaking to them, and they should know why.

The possibility of a system like Clogger suggests that the path toward collective human disempowerment might not require some superhuman artificial general intelligence.

It might just take overeager campaigners and consultants armed with powerful new tools that can effectively push the many buttons of millions of people. (The Conversation)

