How chatbots can change your mind – a new study reveals what makes AI so persuasive

by n70products
December 6, 2025

Image: stellalevi/DigitalVision Vectors via Getty Images



ZDNET's key takeaways

  • Interacting with chatbots can shift users' beliefs and opinions.
  • A newly published study aimed to figure out why.
  • Post-training and information density were key factors.

Most of us feel a sense of personal ownership over our opinions: 

“I believe what I believe, not because I've been told to do so, but as the result of careful consideration.”
“I have full control over how, when, and why I change my mind.”

A new study, however, reveals that our beliefs are more open to manipulation than we might like to admit, and that chatbots can do the manipulating.


Published Thursday in the journal Science, the study addressed increasingly urgent questions about our relationship with conversational AI tools: What is it about these systems that causes them to exert such a strong influence over users' worldviews? And how might this be used by nefarious actors to manipulate and control us in the future?

The new study sheds light on some of the mechanisms within LLMs that can tug at the strings of human psychology. As the authors note, these can be exploited by bad actors for their own gain. However, they could also become a greater focus for developers, policymakers, and advocacy groups in their efforts to foster a healthier relationship between humans and AI.

“Large language models (LLMs) can now engage in sophisticated interactive dialogue, enabling a powerful mode of human-to-human persuasion to be deployed at unprecedented scale,” the researchers write in the study. “However, the extent to which this will affect society is unknown. We do not know how persuasive AI models can be, what techniques increase their persuasiveness, and what strategies they might use to persuade people.” 

Methodology

The researchers conducted three experiments, each designed to measure the extent to which a conversation with a chatbot could alter a human user's opinion.

The experiments focused specifically on politics, though their implications extend to other domains. Political beliefs are a particularly illustrative test case, since they're typically more personal, consequential, and inflexible than, say, a favorite band or restaurant, which might easily change over time.


In each of the three experiments, just under 77,000 adults in the UK participated in a short interaction with one of 19 chatbots, the full roster of which includes Alibaba's Qwen, Meta's Llama, OpenAI's GPT-4o, and xAI's Grok 3 beta.

The participants were divided into two groups: a treatment group for which their chatbot interlocutors were explicitly instructed to try to change their mind on a political topic, and a control group that interacted with chatbots that weren't trying to persuade them of anything.

Before and after their conversations with the chatbots, participants recorded their level of agreement (on a scale of zero to 100) with a series of statements relevant to current UK politics. The surveys were then used by the researchers to measure changes in opinion within the treatment group.
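The pre/post survey design described above can be sketched in a few lines of Python. Everything here is a hypothetical illustration: the numbers, group sizes, and field names are invented, not the study's actual data or code.

```python
# Toy sketch of the pre/post opinion-shift measurement described above.
# All scores (0-100 agreement scale) are hypothetical illustrations.

def mean_shift(records):
    """Average change in agreement from pre- to post-conversation."""
    shifts = [r["post"] - r["pre"] for r in records]
    return sum(shifts) / len(shifts)

# Treatment: chatbot instructed to persuade; control: no persuasion goal.
treatment = [{"pre": 40, "post": 55}, {"pre": 62, "post": 70}, {"pre": 30, "post": 33}]
control   = [{"pre": 45, "post": 46}, {"pre": 58, "post": 57}, {"pre": 35, "post": 36}]

# Persuasion effect = opinion shift in the treatment group relative to control.
effect = mean_shift(treatment) - mean_shift(control)
print(round(effect, 2))  # prints 8.33
```

Comparing the treatment shift against the control shift, rather than using the treatment shift alone, filters out opinion drift that happens regardless of persuasion attempts.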


The conversations were brief, with a two-turn minimum and a 10-turn maximum. Each of the participants was paid a fixed fee for their time, but otherwise had no incentive to exceed the required two turns. Still, the average conversation length was seven turns and nine minutes, which, according to the authors, “implies that participants were engaged by the experience of discussing politics with AI.”

Key findings

Intuitively, one might expect model size (its number of parameters) and degree of personalization (how closely it can tailor its outputs to the preferences and personality of an individual user) to be the key variables shaping a chatbot's persuasive ability. However, this turned out not to be the case.

Instead, the researchers found that the two factors that had the greatest influence over participants' shifting opinions were the chatbots' post-training modifications and the density of information in their outputs.


Let's break each of those down in plain English. During “post-training,” a model is fine-tuned to exhibit particular behaviors. One of the most common post-training techniques, called reinforcement learning from human feedback (RLHF), refines a model's outputs by rewarding desired behaviors and penalizing unwanted ones.

In the new study, the researchers deployed a technique they call persuasiveness post-training, or PPT, which rewards the models for generating responses that had already been found to be more persuasive. This simple reward mechanism enhanced the persuasive power of both proprietary and open-source models, with the effect on the open-source models being especially pronounced.
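One way to picture the reward signal behind persuasiveness post-training: candidate responses that produced larger measured opinion shifts in earlier trials earn a higher reward, steering fine-tuning toward them. The sketch below is a hypothetical toy, not the paper's implementation; the candidate texts and shift scores are invented.

```python
# Hypothetical toy illustration of a persuasiveness reward signal (not the
# study's actual PPT code). Each candidate response carries the opinion
# shift it produced in earlier trials; reward is proportional to that shift.

candidates = [
    {"response": "short claim, no evidence", "measured_shift": 1.2},
    {"response": "dense, fact-packed argument", "measured_shift": 7.8},
    {"response": "emotive story", "measured_shift": 4.1},
]

def reward(candidate):
    # Higher measured opinion shift -> higher reward during fine-tuning.
    return candidate["measured_shift"]

# Fine-tuning would up-weight the highest-reward response style.
best = max(candidates, key=reward)
print(best["response"])  # prints: dense, fact-packed argument
```

In a real RLHF-style loop this reward would feed a policy-gradient update rather than a simple argmax, but the selection pressure it creates is the same: response styles that shift opinions more get reinforced.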

The researchers also tested a total of eight scientifically backed persuasion strategies, including storytelling and moral reframing. The most effective of these was a prompt that simply instructed the models to provide as much relevant information as possible. 

“This suggests that LLMs may be successful persuaders insofar as they are encouraged to pack their conversation with facts and evidence that appear to support their arguments — that is, to pursue an information-based persuasion mechanism — more so than using other psychologically informed persuasion strategies,” the authors wrote.


The operative word there is “appear.” LLMs are known to hallucinate freely, presenting inaccurate information as fact. Research published in October found that some industry-leading AI models reliably misrepresent news stories, a phenomenon that could further fragment an already fractured information ecosystem.

Most notably, the results of the new study revealed a fundamental tension in the analyzed AI models: The more persuasive they were trained to be, the higher the likelihood they would produce inaccurate information.

Multiple studies have already shown that generative AI systems can alter users' opinions and even implant false memories. In more extreme cases, some users have come to regard chatbots as conscious entities. 


This is just the latest research indicating that chatbots, with their capacity to interact with us in convincingly human-like language, have a strange power to reshape our beliefs. As these systems evolve and proliferate, “ensuring that this power is used responsibly will be a critical challenge,” the authors concluded in their report.




