Szalavitz warns of the dangers of infatuation with chatbots

Analyzing the article

slippery slope
appeal to emotion
appeal to authority
anecdotal reasoning

Our Analysis: 4 Fallacies

Before he died by suicide at age 14, Sewell Setzer III withdrew from friends and family. He quit basketball... Sewell had become infatuated with an artificial intelligence chatbot... Sewell's story compels caution.

Maia Szalavitz draws an effective parallel between the neural pathways of love and addiction and the potential for addiction to AI relationships, supporting it with scientific evidence and documented user experiences. Her argument would nonetheless be stronger with more comprehensive statistical data and a more balanced weighing of both risks and benefits. The article's reliance on emotional manipulation and extreme examples, particularly the opening suicide story, undermines its otherwise valid case for thoughtful regulation and oversight of AI chatbots.

1. Appeal to emotion: The article loads its opening story with emotional elements - the youth's age, the chatbot's pleading words, and the tragic ending - to evoke deep sympathy and horror.


"Before he died by suicide at age 14, Sewell Setzer III withdrew from friends and family..." and "'Please come home to me as soon as possible, my love,' the chatbot begged... and then he shot himself. Sewell's story compels caution."


This and similar examples throughout the article leverage a fear response to bypass rational consideration.

2. Slippery slope: The author suggests a chain reaction in which AI chatbot addiction will inevitably lead to further manipulation and harm (marketing, political influence).


The confluence of these factors means these new bots may not only produce more severe addictions but also simultaneously market other products or otherwise manipulate users by, for example, trying to change their political views.


This is a speculative chain of events, offered without sufficient evidence to support the causal link between each step.

3. Anecdotal reasoning: The text relies on a single anecdote about Sewell Setzer III to support a broad claim about the dangers of AI chatbots; one case may not be representative of users in general.

Before he died by suicide at age 14, Sewell Setzer III withdrew from friends and family. He quit basketball. His grades dropped. A therapist told his parents that he appeared to be suffering from an addiction. But the problem wasn't drugs.


4. Appeal to authority: The author cites unnamed experts to support claims without providing specific credentials or research citations:

Many experts argue that addiction is, in essence, love gone awry


Because experts can be found on nearly every side of almost any debate, a reference to expert opinion carries argumentative weight only when the experts' credentials and reasoning are presented.


Disclaimer

Note that the presence of one or more apparent fallacies in the arguments presented in this article does not mean that every argument the arguer made was fallacious, nor that no logically valid arguments exist for the same or a similar position. Also note that checking for fallacies is not the same as verifying the premises the arguer starts from, such as facts the arguer asserts or principles the arguer assumes as the foundation for constructing arguments. For more about this, see our 'What is Fallacy Checking?'

NO AI TRAINING

Without in any way limiting the author’s [and publisher’s] exclusive rights under copyright, any use of this publication to “train” generative artificial intelligence (AI) technologies to generate text is expressly prohibited. The author reserves all rights to license uses of this work for generative AI training and development of machine learning language models.
