The New York Times
All those private details collected by the many apps on your phone, not to mention the smart home devices, doorbell cameras and, of course, your car wind up in the hands of salespeople known as data brokers. The data brokers, in turn, frequently sell to government agencies, especially law enforcement... it happens every day.
Megan K. Stack presents a compelling case about the risks of surveillance technologies, highlighting the importance of privacy and the potential for abuse. However, her argument is weakened by logical fallacies, notably a slippery slope that oversimplifies the complex political and cultural differences between the U.S. and China and assumes an inevitable outcome without considering possible legal, technological, and social countermeasures.
1. loaded language • The article contains numerous instances of loaded language, used to evoke strong emotional responses and shape the reader's perception independently of any particular facts or reasoning.
2. slippery slope • The author implies that a chain of events will lead to an unspecified but drastic and negative change.
If we keep sleepwalking into a surveillance state, we may eventually wake up in a place we hardly recognize as our own.
The progression is presented as a slippery slope without demonstrating the causal links between the stages. For this assertion to be more than a slippery slope fallacy, the author would need to address the factors that could mitigate the risk of the United States descending into a surveillance state mirroring China's, and explain why none of these measures would prevail. Such factors include:
These safeguards are not absolute guarantees; the risk of a more intrusive surveillance state remains. But they offer significant potential to prevent the United States from reaching the level of pervasive surveillance seen in China, and the author fails to account for any of them in asserting a slippery slope.
3. cherry picking • The article's discussion of Palantir's government contracts exhibits cherry-picking.
Meanwhile, the data analysis and technology firm Palantir, which was co-founded by Alex Karp and Peter Thiel (another Trump acolyte), has already received more than $113 million from the federal government since President Trump took office again.
While highlighting the $113 million received since President Trump's return to office and Peter Thiel's association with Trump, the author omits crucial context. Palantir has a long history of receiving government contracts, extending back through the Obama and Biden administrations, and a substantial $2.5 billion backlog of contract obligations (according to USASpending.gov) remained at the end of the Biden presidency.
This omission creates a misleading impression that Palantir's relationship with the government is primarily a "Trump thing," strengthening the author's narrative of increased authoritarianism under Trump while neglecting a broader, more nuanced picture of the company's government involvement across multiple administrations. This selective presentation of facts undermines the objectivity and credibility of the author's argument.
Note that the presence of one or more apparent fallacies in this article does not mean that every argument the author made was fallacious, nor that there are no other, logically valid arguments for the same or a similar position. Also note that checking for fallacies is not the same as verifying the premises the author starts from, such as the facts the author asserts or the principles the author assumes as the foundation for constructing arguments. For more about this, see our 'What is Fallacy Checking?'