Number 39 (December 2017)

A Not-So-Brief Account of Current Information Ethics: The Ethics of Ignorance, Missing Information, Misinformation, Disinformation and Other Forms of Deception or Incompetence

 



Thomas J. Froehlich

Ph.D., Professor Emeritus
School of Information
Kent State University

 

Abstract

Objectives: The article examines how the new technologies and the internet have given society greater access to information and knowledge but have also led to a major increase in false information and lies, which constitute a serious threat to information ethics.

Methodology: The author offers a taxonomy to describe the most common types of false information (misinformation, disinformation, missing information and self-deception) and information calumny, using examples in contemporary North American politics and information media and focusing on the figures of Donald Trump and Hillary Clinton. The article analyses the role public institutions and information professionals should adopt to face the situation.

Results: Information professionals cannot themselves possess the truth, but to combat false information and ignorance they must remain alert to the dangers present, keep abreast of the demands of their profession, be competent and informed, and promote society's information literacy at both individual and collective levels.

Resum

Objectiu: amb l'arribada de les noves tecnologies no només s'ha ampliat enormement l'accés a la informació i el coneixement, també les informacions falses i l'engany han augmentat exponencialment. L'ètica de la informació es veu perillosament així amenaçada per la mentida amb la potència multiplicadora d'Internet.

Metodologia: l'article descriu taxonòmicament les formes més habituals de la mentida informativa (la informació falsa, falsejada o incompleta i l'autoengany) i la difamació, tot il·lustrant-los amb exemples actuals de la política i els mitjans d'informació dels Estats Units i en especial del seu president, Donald Trump, i la candidata Hillary Clinton. S'analitza el paper de les institucions públiques i dels professionals de la informació davant d'aquesta realitat.

Resultats: per tal de lluitar contra els perills de la mentida i la ignorància, els professionals de la informació no posseeixen la veritat però han d'estar alerta i vigilants, mantenir-se actualitzats, competents i informats, i han de promoure l'alfabetització informacional, individual i col·lectiva.

Resumen

Objetivo: con la llegada de las nuevas tecnologías no sólo se ha ampliado enormemente el acceso a la información y el conocimiento, también las informaciones falsas y el engaño han aumentado exponencialmente. La ética de la información se ve así peligrosamente amenazada por la mentira a través de la potencia multiplicadora de Internet.

Metodología: el artículo describe taxonómicamente las formas más habituales de la mentira informativa (la información falsa, falseada o incompleta y el autoengaño) y la difamación, mediante ejemplos actuales de la política y los medios de información de los Estados Unidos y en especial de su presidente, Donald Trump, y la candidata Hillary Clinton. Se analiza el papel de las instituciones públicas y de los profesionales de la información frente a esta realidad.

Resultados: con el fin de luchar contra los peligros de la mentira y la ignorancia, los profesionales de la información no poseen la verdad pero deben permanecer en estado de alerta y vigilantes, mantenerse actualizados, competentes e informados, y deben de promover la alfabetización informacional, individual y colectiva.

 

The Age of the Anti-Enlightenment

One of the consequences of the age of information is that the advent and growth of the internet, particularly of communication and social media, have promoted not only the growth of information and potential knowledge but also the growth of ignorance in its various forms and guises: misinformation, disinformation, fake news, and attacks on credible news sources. Access to the internet is now, more often than not, access to resources that reinforce biases, ignorance, prejudgments, and stupidity. Parallel to a right to information, we have created in practice a right to ignorance. Not only that: we, whether as individuals, groups or institutions like the government, have the legal right in the United States to disseminate ignorance, to block venues of facts and truth, and to smugly claim to present "alternative facts." We have entered an age of the Anti-Enlightenment, in which knowledge gained systematically and through careful observation of the environment is rejected and replaced by arrogant anti-science, anti-humanitarian propaganda whose misinformation or disinformation is transmitted through cable broadcasting and social media. This paper explores the dark side of the age of information, a side that has been exploited by the media mavens, political hacks, and ideological propagandists who promote lies, disenchantment, disillusion, illusion, fantasy, confusion, and other forms of demented or manipulated imagination.

Any talk of information ethics demands that we address the ethics of ignorance, disinformation, misinformation, missing information, lies and deceit. Not because these issues can be ethical per se (their existence lies in opposition to ethics), but because their challenge to an information ethics must be confronted and discussed. The issue, of course, is not new, but information technologies, such as the internet, its ease of access and rapid global growth, have considerably magnified the problem.

 

Information Professionals and Orthodoxy

As librarians and information professionals, we are concerned with the access to or provision of 'knowledge' or 'wisdom' (as we currently understand them) to patrons, though it is not clear that we ever fully achieve those cognitive states or can provide sources guaranteed to supply them. In other words, we do not have THE TRUTH, but truths, evolving truths. What we hope to make available for our patrons are wisdom (productive or practical, perhaps in an Aristotelian sense) or knowledge (though these cannot, à la Plato, be directly communicated; they must eventually be self-realized) and beliefs and opinions (δόξα, doxa) as information. Within these, we hope at least to provide "right opinion" or orthodoxy (the Greek prefix ὀρθο-, ortho-, means 'right' or 'proper'; hence, from the Greek ὀρθοδοξία, orthodoxia, "right opinion") that we hope will lead to some version of truth(s).

Opinions come in at least two varieties: those that cannot be converted into knowledge and those that can, in the sense that they conform to the prevailing orthodoxy. An example of opinions that can never be truly converted into knowledge is a belief about the "best tasting food" or the "best political viewpoint." Each of these is based on personal or personally affiliated group tastes. However, in a case where opinion can be converted into knowledge, when an information professional supplies information that conforms to the current orthodox view of a particular subject, it means that we are supplying information in accord with current scientific, social or moral beliefs. The orthodoxy is the dominant view or "the right answers" according to the prevailing paradigm, e.g., there are 8 planets in our solar system (Pluto having been demoted from the previous count of 9). For example, when asserting that 100 degrees Celsius is the boiling point of water, "reality" is constructed on common observations (the freezing and boiling points of water at sea level), from which were created an arbitrary unit (i.e., the degree) and measures or yardsticks of those units. For the interval between the freezing and boiling of water, some influential scientists created one form of the degree unit, the Celsius, which sets the freezing of water at sea level at 0 degrees and the boiling of water at 100 degrees, there being 100 units constructed between them. Daniel Fahrenheit set 0 degrees as the temperature of a mixture of brine (salt and water), 32 degrees as the point at which water freezes and 212 degrees as the point at which water boils, there being 180 units between the two. Of course, scientists could have decided on a different number of units, but both systems are based on fundamental observations.
Freezing and boiling are perceptions, but the structure and articulation of the meaning of those observations were invented and imported into a consensual framework, which in turn formed the current orthodoxy. So if a patron wants to know the freezing point of water, we promptly reply that it is 0 degrees Celsius or 32 degrees Fahrenheit, depending on what we anticipate the patron's comprehending framework to be, e.g., whether they are asking in a country that uses one system rather than the other, or whether the context is scientific, where Celsius would be preferred.
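The relationship between the two conventions described above can be stated exactly: the same physical interval (water's freezing point to its boiling point at sea level) is divided into 100 Celsius units or 180 Fahrenheit units, giving a ratio of 9/5 and an offset of 32. A minimal sketch of the conversion between the two consensual frameworks:

```python
# The same two fixed points (water freezes, water boils, at sea level)
# are divided into 100 units (Celsius) or 180 units (Fahrenheit),
# so one Celsius degree equals 9/5 of a Fahrenheit degree, offset by 32.

def celsius_to_fahrenheit(c: float) -> float:
    return c * 9 / 5 + 32

def fahrenheit_to_celsius(f: float) -> float:
    return (f - 32) * 5 / 9

print(celsius_to_fahrenheit(0))    # 32.0  (freezing point of water)
print(celsius_to_fahrenheit(100))  # 212.0 (boiling point of water)
```

Either scale is an equally valid construction on the same observations; the conversion merely maps one framework onto the other.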

 

Imaginings in Plato's Cave

Understanding opinion in the world of misinformation and disinformation, however, might be better achieved by using Plato's Allegory of the Cave. In it, Socrates describes a situation that takes place in a dark cave. A number of prisoners have lived in this deep cave since birth, never seeing the light of day, and are physically constrained in such a way that they cannot look to either side or behind them. Behind them is a fire, and behind the fire is a low wall, from behind which various objects are lifted into the air and manipulated by another group of people, who are out of sight behind the wall. The fire casts shadows of the objects across the wall facing the prisoners. The prisoners watch the sequences that the shadows play out and make games of predicting those sequences and the sounds that reverberate in the cave. When they refer to one of the shadows as a "book," for example, they are not actually seeing a book, but rather the shadow of a book, confusing its shadowy appearance with actual reality. Because of their condition and constraints, they believe their perceptions are the most real things in the world. They are so convinced of the reality of their context that they mock anyone who would believe otherwise. As the allegory continues, they are led to see their actual condition in two stages: first by being shocked into an awareness of that condition, becoming aware of the real source of the light and seeing things as they are when forced to move out of the cave; and second, through an interrogation, by coming to understand for themselves, in a form of self-realization, their actual condition.
In the Platonic/Socratic view of true education, there are two aspects of the Socratic method: (1) Socrates as a stingray, electric eel or gadfly (as he is referred to in various Platonic writings), shocking or benumbing his interlocutors into an awareness of their ignorance as they are temporarily blinded by the light. The purpose of this shock is to clear away what one unidentified commentator referred to as "the conceit of false knowledge," a brilliantly succinct description of the intent of this first aspect of the Socratic method. (2) Socrates as a midwife, using questions skillfully to have his interlocutors come to a self-realization of their true condition. This conversion process does not always succeed, as many are secure in their state of ignorance, or they lack the wit to follow the logical conclusion of Socrates' questions. The Socratic method is prefaced, if you recall many of Plato's dialogs, with a profession of ignorance. His interlocutor in a dialog, e.g., Meno in the Meno, brings up a topic to be discussed, such as virtue. Socrates' response is an enthusiastic willingness to learn, because he professes that he has little or no knowledge of the topic at hand. His profession of ignorance has been called ironic because, in the end, his knowledge of the topic, as 'limited' as it is professed to be, turns out to be the most informed.

What is fascinating is the condition out of which education takes place from a Socratic perspective: the rise of "imaginings" (in Plato) or ignorance. This condition can help us understand contemporary world politics, especially presidential politics. This paper will frequently use Donald Trump, his apparent cognitive state and his words to illustrate problems of the forms of ignorance, disinformation and misinformation because of his notoriety and easy access to his comments in social media.

 

The Varieties of Ignorance and False Information

We can, in fact, create a taxonomy for varieties of ignorance – ignorance per se, and the modes of information that produce or facilitate it: misinformation, disinformation, missing information or self-deception. The taxonomy includes the following:

  • Ignorance per se: lacking knowledge or awareness, being uninformed about a specific subject or fact, e.g., Donald Trump's lack of knowledge of the Constitution and how it forms the nature of our democracy, how government works, the balance of powers, etc. He is ignorant of his ignorance. Like the prisoners in the Cave, he seems incapable of seeing the difference between reality and his "reality."

  • Misinformation: offering information that is incorrect or inaccurate. Don Fallis correctly observes in "The Varieties of Disinformation" (2014, p. 136) that "Inaccurate information (or misinformation) can mislead people whether it results from an honest mistake, negligence, unconscious bias, or (as in the case of disinformation) intentional deception." In other words, the difference between misinformation and disinformation is the intent to deceive. For example, assuming that I am not a Trump supporter or a Clinton hater, if I innocently albeit incorrectly tell a colleague that Hillary Clinton wanted to abolish gun ownership, I am only misinforming them.

  • Disinformation: supplying misinformation with the deliberate aim to mislead. The suppliers of such untruths can include foreign countries, government agencies, corporations, political parties, especially Super PACs,2 and political candidates. The intent to deceive is important because, in general terms, untruths need not be pronounced by someone whose intention is to deceive. But as Don Fallis (2009) notes in "A Conceptual Analysis of Disinformation," "while disinforming may not require that the source of the misleading information intend to deceive people, it does at least require that the source of the information foresee that people will be deceived."

    Don Fallis's article mentioned above distinguishes four major types of misinformation: lies, visual disinformation, true disinformation, and side effect disinformation (2014, p. 137). Trump has told so many lies, so frequently and spontaneously, that it is difficult to determine which assertions might be true. A sampling of his lies includes: that Barack Obama tapped his phones prior to the election, that millions of undocumented people voted illegally during the election (so as to cost him the popular vote), that thousands of people in Jersey City cheered on 9/11, and that he inherited "a mess with jobs" from Barack Obama in terms of the unemployment rate (Carroll; Jacobson, 2017). During the election, Trump also excelled at the misuse of visual graphics, something quite common in political campaigns. For a wonderful illustration, with demonstrations of how charts lie, see John Muyskens (2016), "Most of Trump's charts skew the data. And not always in his favor," in The Washington Post. True disinformation, according to Fallis, is the use of accurate information to intentionally mislead. While it is true that Hillary Clinton used a personal email server for official communications of classified information, this does not in itself mean that she is "crooked" or that she could not be trusted with classified government material. Fallis's last form of misinformation is called side effect disinformation, which he explains with the example of researchers inserting inaccurate information into Wikipedia to see how it might be detected and corrected. In the current context, we would be unlikely to find an example where inaccurate political information was inserted into a campaign speech or press release in order to be found and corrected, given that such information is usually put there to be believed.

  • Missing information: the non-inclusion of information that should be known or present in order to understand facts and make decisions. Its absence is due to negligence, incompetence or the desire to mislead. For example, accusing Hillary Clinton of being responsible for the Benghazi attack omits the fact that after 11 hours of her testimony before the House Select Committee on Benghazi, and many other investigations, no substantial evidence was uncovered to indicate that she was the source of any wrongdoing. In short, the Committee's preordained conclusion was that she was responsible for the event, in the absence of any evidence to establish its case. One can conclude nothing from no information or evidence.

  • Self-deception or bad faith: Sartre observed presciently that bad faith is believing what you don't believe: holding or living a contradiction at one and the same time. Sartre's notion is central to his philosophy as a mode of living inauthentically, in which people may deceive themselves into thinking that they do not have the freedom to make choices for fear of its potential consequences, i.e., that they would have to be responsible for themselves. We might file forms of "willful ignorance" under this category: knowing something to be true but consciously or unconsciously choosing ignorance, e.g., choosing to believe that the Confederate flag is not a symbol of racism.

    We can distinguish two types of self-deception: motivated and unmotivated.
     
    • Motivated self-deception: pushing a form of self-deception for conscious political, social, ethical or personal gain (e.g., proposing that all Muslims should be quarantined or deported because all of them believe in Sharia Law and support jihad). Stephen Colbert's notion of 'truthiness' is probably the best contemporary expression of motivated self-deception, described by Wikipedia as a "belief or assertion that a particular statement is true based on the intuition or perceptions of some individual or individuals, without regard to evidence, logic, intellectual examination, or actual facts" (Wikipedia, Truthiness, 2017). We practice truthiness when there is something we want to be true despite clear evidence to the contrary. Types of motivations for motivated self-deception or 'truthiness' include political, personal, social, and ethical.

    • Unmotivated self-deception: succumbing to one's biases without conscious aim, accepting information only to the degree that it accords with one's a priori biases; in short, confirmation bias. In fact, most people are inclined to information avoidance as a technique of confirmation bias. Drawing on research in economics, psychology and sociology, George Loewenstein, Russell Golman and David Hagmann at Carnegie Mellon University "illustrate how people select their own reality by deliberately avoiding information that threatens their happiness and well-being. In a paper published in the Journal of Economic Literature, they show that while a simple failure to obtain information is the most clear-cut case of 'information avoidance,' people have a wide range of other information-avoidance strategies at their disposal. They also are remarkably adept at selectively directing their attention to information that affirms what they believe or that reflects favorably upon them, and at forgetting information they wish were not true" (Rea, 2017). As opposed to the notion of the so-called rational decision maker who seeks to find all information and options for a particular decision, the researchers found that people "often avoid information that could help them to make better decisions if they think the information might be painful to receive. Bad teachers, for example, could benefit from feedback from students, but are much less likely to pore over teaching ratings than skilled teachers" (Loewenstein et al., quoted by Rea, 2017). And furthermore, "even when people cannot outright ignore information, they often have substantial latitude in how to interpret it. Questionable evidence is often treated as credible when it confirms what someone wants to believe" (Rea, 2017). We have wordsmiths like Frank Luntz, a GOP propagandist and spin doctor, creating language to promote confirmation bias for Republican Party positions.
As PBS put it, Luntz's specialty is "testing language and finding words that will help his clients sell their products, or turn public opinion on an issue or a candidate" (The Persuaders, 2004). In other words, he uses language for propaganda. Instead of using a phrase like "oil drilling" in a political advertisement, he would urge the use of "energy exploration;" rather than "inheritance tax," he would urge "death tax;" instead of "global warming," "climate change;" instead of "healthcare reform," "government takeover;" instead of "capitalism," "free market economy." According to Alan Grayson, Luntz is like "a serial killer of the English language" (Grayson, 2011; Frank Luntz Republican Playbook, s.d.).

      Such mental sleights of hand have been used to promote legislative agendas. Right-to-work laws, pushed by business interests and the institutions that support them, e.g., the U.S. Chamber of Commerce or the Republican Party, may appear to represent the interests of workers in joining labor unions, negotiating union contracts and facilitating the conditions of work. In fact, whether enacted by law or constitution, such provisions do not strive to guarantee an employee's right to work; rather, they prohibit agreements between employers and labor unions on such matters as whether employees can be required to take membership in the union and whether non-union members can be made to pay union dues. Unless legislators and their constituents are completely savvy about the content of such laws, legislators lobbied by the legislation's promoters are likely to pass them as laws that appear to be in workers' favor when in reality they are not. Similarly, laws promoting "religious freedom" would seem to defend the free exercise of one's religion with minimal or no constraints but are in fact used to discriminate against those whose views are at odds with one's religion. Examples of this are the cases Masterpiece Cakeshop v. Colorado Civil Rights Commission, in which a Christian baker in the open marketplace was allowed to refuse to make a wedding cake for a gay couple because his religion does not recognize gay unions or accept homosexuality, and Burwell v. Hobby Lobby, in which the for-profit company Hobby Lobby was allowed by the U.S. Supreme Court, on the basis of its owners' religious beliefs, to be exempt from providing certain employees with the contraceptive coverage mandated under the Affordable Care Act.

      Such deceptive wordsmithing enables unmotivated self-deception with potentially serious political and social consequences.
 

Two Dominant Forms of Information Calumny

In addition, there are two particularly noxious forms of information calumny, doxing and fake news, that have come into play in the age of misinformation and disinformation, forms that wreaked havoc in the 2016 presidential election. Both were techniques employed by national right-wing agents and Russian agents to manipulate the results of the election, and they proved particularly effective in swaying voters who were susceptible to motivated self-deception.

  • Doxing: searching for and publishing private or identifying information about a particular private or public individual or group on the internet, typically with malicious intent. "Doxing" is a neologism that has evolved over its brief history. It comes from a variation in spelling of the abbreviation "docs" (for "documents") and, according to Wikipedia, refers to "compiling and releasing a dossier of personal information on someone" (Wikipedia, Doxing, 2017). Essentially, doxing is openly revealing and publicizing records of an individual or group that were previously private or difficult to find, often for nefarious purposes such as extortion, shaming, coercion or legal, political, or moral harassment. During the presidential election, Russian hackers targeted Democratic candidates and the Democratic National Committee headquarters by doxing those candidates and the Party. As a candidate, Mrs. Clinton may already have had weaknesses that were compounded by recurrent issues with her private email server and the statements by James Comey, but most Clinton supporters believed that the Russian assault played a fundamental role in her electoral defeat. Its impact was huge: hacking and doxing were used on a monumental scale, with Russian state actors brought in to perpetuate and propagate the method. In general, there may be some truth in a doxing claim, but doxing tends to distort reality by amplifying a portion of a person's or an organization's history, and so at best it is misleading and misinforming.

  • Fake news: a form of yellow journalism (news stories with catchy headlines but with little or no factual basis) that consists of deliberate misinformation, hoaxes or fraudulent stories, spread in traditional media or online social media. It is published with the intent to distort or "mislead in order to damage an agency, entity, or person, and/or gain financially or politically" (Wikipedia, Fake news, 2017). Those who produce fake news hope to exploit the motivated self-deception of the consumers of such 'information.' The U.S. Intelligence Community has said that it is "confident" that Russia sought to influence the election in Trump's favor during the 2016 U.S. presidential election (Carroll, 2016). One of the most serious examples of the consequences of publishing fake news, dubbed "Pizzagate," came with posts to social media sites such as Twitter and Facebook that falsely claimed that the Washington, D.C. pizzeria Comet Ping Pong was the center of a child-sex ring run by Hillary Clinton and her campaign chairman, John Podesta. Edgar Welch, 28, of Salisbury, North Carolina, who fired at least three shots from an AR-15 rifle inside the pizzeria in December 2016, explained his attack by saying that he was investigating the veracity of the conspiracy theory this news story had generated. Fortunately, no one was hurt (Simpson, 2017). One might argue that fake news is a species of truthiness or a form of bad faith, in that one assumes that the source of the story on the web, usually found on social media, is real. Returning to the example of the presidential campaign, it is clear that Russian troll farms and other Russian agencies used Instagram, bought ads on Facebook and produced millions of tweets to favor the election of Donald Trump (Kosoff, 2017).

    A particularly pernicious form of fake news employs "bots." A 'bot' is a software application that automates tasks such as repetitive responses to tweets related to the topic being slandered. Russian agents have been especially effective in the use of bots to produce disinformation in stories on Twitter, which is currently one of the most influential forms of social media. They create bogus Twitter accounts by using common key words and hashtags that would typically accompany a pro-Trump or anti-democracy tweet. A site called 'Hamilton 68' (http://dashboard.securingdemocracy.org/) monitors Twitter traffic to trace the origin of these kinds of fake news stories to some 600 Russian accounts. The site was named after Alexander Hamilton's Federalist Paper 68, in which he anonymously defended the U.S. Constitution to the public, and its function is to alert Twitter and other social media users to the ongoing attempt of the Russians to destabilize Western governments and influence elections. In September 2017, the Russians were pushing stories urging the Trump administration to fire General H. R. McMaster, one of the voices in that administration that believes in NATO and the stability of the European Union (Hamilton 68 Website, 2017).

    Producing fake news stories can be financially rewarding, and not just for Russians. NBC News reported the case of a Macedonian teenager who was one of the many Macedonians producing fake news stories, writing under the pseudonym Dimitri. Dimitri wrote and distributed articles criticizing Hillary Clinton and praising Donald Trump which looked real and appeared to be properly documented, but were not. His reward for these efforts, based on the penny-per-click advertising scheme, was $60,000 over six months, garnered from clicks from Trump supporters (Smith; Banic, 2017).3 Such enticements not only increase the flood of fake news stories but also make it much more difficult for people to be discerning, particularly for those predisposed to accept these stories uncritically.

What is particularly troublesome is that such techniques are likely to continue to be used by the Russians and others to influence future elections, and stopping them is difficult, if not impossible, because it is difficult to control their source (Russia or other international agents) through the internet. Those motivated to embrace fake news deem any attack on fake news as itself fake news perpetuated by the liberal media, a perfect catch-22 lesson in bad faith that Trump has also perpetuated: making claims based on lies whose credibility is supposedly enhanced by attacking those who claim that such stories are fake news.

 

The Goals of Deception

What are the goals of doxing or fake news? Don Fallis's "The Varieties of Disinformation" (2014) cites Chisholm and Feehan's "The Intent to Deceive" (1977, pp. 143–159) to articulate four of them. The first two, which are achieved by positive deception (causing a false belief), are (1) creating a new false belief (e.g., that Trump had already spoken to the families of the soldiers who fell in Niger before being reminded to do so by the press) and (2) maintaining an existing false belief (e.g., that if Hillary Clinton became president, she would shut down the National Rifle Association). The other two, which use negative deception, are (1) causing the loss of a true belief (Hillary Clinton was an acceptable presidential candidate before her image was damaged by the stream of fake news stories about her) and (2) preventing the acquisition of a true belief (Donald Trump was an unacceptable presidential candidate whose 'acceptability' was made possible by focusing on his achievements as a reality TV star, his success with The Art of the Deal and his being a Washington outsider) (Fallis, 2014, p. 140). For the Russian troll farms in the pre-election strategies, where trollers were handsomely paid relative to their environment, it did not matter which of these occurred, only that those that were pro-Trump or anti-Clinton succeeded. And they did. Researchers at Oxford University studied the pre-election tweets in Michigan because Trump won Michigan by a mere 10,704 votes. They came away convinced that fake news tweets, possibly linked to Russian troll farms, outperformed tweets that were based on sources of real news (Oosting, 2017). Such social media influences were likely sufficient to swing the election results in Trump's favor both in Michigan and nationally.

Fallis (2014, p. 142) also provides another useful classification of the goals of deception: (1) mislead about the accuracy of the content; (2) mislead about the source believing the content; (3) mislead about the identity of the source; and (4) mislead about an implication of the content being accurate. Most of Trump's lies are of the first category, like his claims that the United States is the most heavily taxed country in the world, that nobody but reporters cares about his tax returns or that he signed more legislation in his early months in office than any previous president. All of these are purported to be true and all of them can easily be shown to be false. In the second case, misleading about the source believing the content, a good example might be when Donald Trump claimed that he fired Mr. Comey in part because agents had lost confidence in him (whereas internal survey data contradicted that claim) (Apuzzo, 2017). An example of misleading about the identity of the source would be the Russian trollers pretending to be reliable sources of information in order to trick those who fell for them. For example, they created a fictional American by the name of Melvin Redick of Harrisburg, Pa., an easy-going American with appealing attributes, e.g., wearing a backward baseball cap and having a young daughter, whose profile they posted on Facebook. This fictional character's Facebook page linked to a brand-new website about which he claimed, in a message written on June 8, 2016, "These guys show hidden truth about Hillary Clinton, George Soros and other leaders of the US." "Visit #DCLeaks website," Redick concluded, "it's really interesting!" It had just enough seeming innocence to attract the visitor to an anti-Hillary Clinton site. Russian fingerprints were on a multitude of fake Twitter and Facebook accounts.
As noted earlier, many were automated Twitter bots that would display identical content seconds apart "in the exact alphabetical order of their made-up names," as the cybersecurity company FireEye observed (Shane, 2017). On Election Day, for instance, the same company confirmed that "one group of Twitter bots sent out the hashtag #WarAgainstDemocrats more than 1,700 times" (Ibid.). Fallis's final category is misleading about an implication of the accuracy of the content, or what he calls "false implicature." If Trump's claim that millions of undocumented people voted illegally during the election were true, the implication would be that he actually won the popular vote (though the reasoning is confused: he assumed that they all voted for Hillary Clinton, whereas many may have voted for him, which would make his claim at best ambiguous).

 

Psychological Considerations for the Success of Fake News or Doxing

Why is fake news or doxing so successful? Is there something going on below the level of conscious thought? Psychological studies can help us understand the subconscious context that impedes our willingness to reject disinformation or misinformation:

  • Repetition of fake news stories or 'facts' increases their plausibility (Stafford, 2016). Trump's strategies on the campaign trail were reminiscent of Luntz's but had a perverse twist, using such repeated phrases as "crooked Hillary" or "lying Ted" not only as ad hominems against Hillary Clinton or Ted Cruz but as attempts at serial character assassination. Indirectly, through signs and repeated chants, his supporters repeated the messages and dutifully absorbed them.

  • People who are ignorant or unskilled in a given domain tend to believe they are much more competent than they are. In psychology this is referred to as the Dunning–Kruger effect. It suggests that people are uncritical about their own abilities and uncritical of their lack of critical thinking. To put it bluntly, the stupid do not know they are stupid and are not likely to have or acquire the skills to recognize their lack of critical thinking, e.g., the Cave prisoners. The first publication on the effect, a 1999 paper by David Dunning and his then-graduate student Justin Kruger, bore the title "Unskilled and Unaware of It: How Difficulties in Recognizing One's Own Incompetence Lead to Inflated Self-Assessments." The authors asserted that one needs competence and knowledge to judge the extent to which one is skilled and knowledgeable (Poundstone, 2017). Ironically, on the other hand, competent persons tend to underestimate their own competence and "erroneously presume that tasks that are easy for them to perform are also easy for other people to perform" (Wikipedia, Dunning–Kruger effect, 2017).

  • Once learned, false information is hard to dispel. David Rapp's research on memory and learning reveals that our brains quickly memorize the information we learn, independently of its validity or source. If we later discover that it is false, that discovery does not necessarily override the initial impression. He suggests that we can guard against this proclivity if our thinking is proactively critical (Waters; Hargadon, 2017).

  • Culturally induced ignorance stimulates doubts about correct views already held by information seekers. Robert N. Proctor, a Stanford University professor, coined a word for the study of culturally induced ignorance or doubt: agnotology. Such ignorance is spread by specialized techniques of misinformation that make information seekers more doubtful of views or information they already hold. In particular, Proctor was thinking of what happens in the media, where, whether through "neglect" or as a result of "deliberate misrepresentation and manipulation," corporations and governmental agencies can contribute to ignorance through "secrecy and suppression of information, document destruction, and myriad forms of inherent or avoidable culturopolitical selectivity, inattention, and forgetfulness" (Wikipedia, Agnotology, 2016). A clear example for Proctor was the fostering of ignorance through the tobacco industry's use of advertising to generate doubt about cancer and other illnesses being the result of smoking. A similar approach is espoused by climate deniers, proponents of fracking and pesticide manufacturers.

  • There are specific psychological or personality traits that make individuals vulnerable to political misinformation. In "Social Psychological Perspectives on Trump Supporters" Thomas F. Pettigrew (2017) offers some valuable insights from social psychology. He argues that no one factor describes Trump's supporters, but rather that there is an array of factors reflecting five major social psychological phenomena to account for their devotion, though he hastens to add that these five interrelated factors in themselves do not provide a complete explanation because political factors are also involved. These five phenomena are authoritarianism, social dominance orientation, prejudice, low intergroup contact and relative deprivation.
     
    • Authoritarianism is characterized by such traits as "deference to authority, aggression toward outgroups, a rigidly hierarchical view of the world, and resistance to new experience" (Pettigrew, 2017, p. 108). In social psychology, an outgroup is a social group with which the individual does not identify. Authoritarians see the world as dangerous, and their response is triggered by threat and fear. While there is a debate among social psychologists about whether authoritarianism is a personality construct or a political ideology, Pettigrew argues that "there is no necessary conflict between these two perspectives" and suggests that authoritarianism usually starts as a personality orientation which then leads to an engagement with right-wing political ideology. Pettigrew (Ibid.) indicates how political scientists measure authoritarianism: by asking which qualities are more important for children, "respect for elders more important than independence; obedience more important than self-reliance; being well-behaved more important than being creative; and having good manners more important than being curious." Other interesting aspects of authoritarians, according to the findings of Pettigrew's research, are that they are more numerous among right-wingers than left-wingers and that they are attracted to the absolutist language favored by Trump ("losers," "complete disasters," etc.). Before the rise of Trump, Republicans averaged higher than Democrats on authoritarianism scales, and they appealed to these voters through their "opposition to virtually everything proposed by the African-American President Obama" (Pettigrew, 2017, p. 109).

    • Social dominance orientation (SDO) is related to authoritarianism but is distinct from it. It is characterized as "an individual's preference for the societal hierarchy of groups and domination over lower-status groups" (Pettigrew, 2017, p. 108). It espouses a preference for anti-egalitarianism within and between groups. Those who score high with SDO are "typically dominant, driven, tough-minded, disagreeable, and relatively uncaring seekers of power" (Ibid.). Trump's assertions about those at the top of society (white people like him) and those "losers" and "bad hombres" at the bottom are SDO kinds of statements. Trump supporters score high both in authoritarianism and in SDO. Trump broke the unwritten rules of American politics by appealing directly and openly to both traits in his supporters.

    • Prejudice. What is interesting here is that devoted followers of Trump are not just anti-immigrant, but anti-outgroup in general. Pettigrew (2017, p. 110) observes that the Republicans' racially neutral facade began to crumble in the 2008 election, when one Republican club published the image of a fake ten-dollar food stamp bearing a picture of Obama surrounded by stereotypical African-American food. In the 2016 election, Trump launched full-scale attacks on immigrants, Mexicans and Muslims. Again, Trump supporters correlate highly with a standard scale of modern racism.

    • Low intergroup contact. Pettigrew (Ibid.) observes that there is growing evidence that Trump's white supporters have low intergroup contact (i.e., little contact with groups other than their own). For example, they have less experience with minorities than other Americans. Ironically, "Trump support increased [Pettigrew's italics] as an area's distance from the Mexican border increased."

    • Relative deprivation. The media's main explanation for the basic rationale of Trump supporters was economic in nature, arguing that they were economically deprived and often unemployed. Pettigrew (2017, p. 111) argues that while it may be true of some of Trump's supporters, where the majority are concerned it is not only an oversimplification but also untrue: in reality, Trump supporters were less likely than others to be unemployed, employed part-time or looking for work. And those voters living in districts with more manufacturing were actually less inclined to vote for Trump. The issue is not really reality, but what voters think is true. As Pettigrew observed, "Trump adherents feel deprived relative to what they expected to possess at this point in their lives and relative to what they erroneously perceive other 'less deserving' groups have acquired. Rapidly rising costs of housing and prescription drugs have aggravated their financial concerns. Their savings may not allow the type of ideal retirements they had long envisioned. And hopes for their children advancing beyond their status and going to college are being dashed by rising tuitions."

Pettigrew's focus is on Trump and his supporters, but he suggests throughout that the analysis can be applied to many right-wing political groups and figures, e.g., Marine Le Pen, president of the National Front, a far-right political party in France. Finally, he also observes that authoritarianism and SDO clearly correlate with extreme right-wing voting patterns across Europe.

 

Authorities and Fake News or Reliable Truths

Of course, one important way to ameliorate the dangers of misinformation and fake news is to rely on authorities. The role of authorities is an important aspect of the embrace of different cognitive states. We tend to seek information (or misinformation) from our authorities, persons or institutions we believe to be reliable for the kind of information we are seeking. These authorities can supply genuine information, founded in evidence or fact, or sham information, in the forms described above. We often seek sources that find or confirm information with which we are likely to agree. It is no surprise that fundamentalists seek websites that confirm their religious position, that conservatives seek websites that confirm their version of conservatism, that liberals or leftists seek websites that confirm their version of liberalism or leftism, and that white supremacists seek websites that confirm their version of white nationalism. Does this mean that there is no such thing as the truth, or, more precisely, truths? No; it means that some, if not many, individuals acquiesce to their own ignorance, or seek comfort or justification in it. In fact, there are approaches that can demonstrate the shallowness of certain viewpoints: that they lack factual support or evidence, or that they are logically inconsistent or incoherent. What is disheartening is that the validity of such approaches is often discounted by the ignorant, particularly those who share the psychological conditions noted above.

Of course, information professionals can take on the role of an authority, often not as a subject authority (unless, for example, they have training in the field in which they are making recommendations, as many special librarians do) but as an authority about authorities (see especially the excellent treatment of this issue in Patrick Wilson's Second-Hand Knowledge: An Inquiry into Cognitive Authority, 1983, whose themes have been taken up by Soo Young Rieh in such articles as "Credibility and cognitive authority of information" in the third edition of the Encyclopedia of Library and Information Sciences) (Rieh, 2010, pp. 1337–1344). While we do not necessarily have subject expertise, we can point to authorities who do: we can recommend a particular text based on peer evaluation, subject evaluators or consensus. Another possibility, of course, is that an information professional working for an institution with a set of biases or frameworks can make sure that information supplied to a patron conforms to the biases of the institution. Such professionals could become authorities based on a singular authority, e.g., a fundamentalist university. Unfortunately, many of Trump's supporters see Fox or Fox News as their authority in political matters. PunditFact, a project of the Poynter Institute's Tampa Bay Times published on Politifact, analyzed a set of statements by Fox, Fox News and Fox Business personalities and their pundits over a period of time, with the following results: 10% of the statements were true, 12% were mostly true, 18% were half-true, 21% were mostly false, 30% were false, and 9% were what Politifact calls "pants-on-fire" false (FOX's File, n.d.). Some authorities are not very authoritative when gauged by experience and observation, though this does not deter their adherents, including Trump, from regarding contrary information as fake news.
Thus, the notion of authority, particularly of a cognitive authority, can be fraught with ambiguity. We are constantly challenged to question the reliability of our authorities, their foundation in our personal and educational history, and the persistence of their reliability, even more so as information professionals. We will return to the role of information professionals in the age of ignorance, misinformation and its various guises shortly. We will look at some of the ways to deal with the production of ignorance, doubt, misinformation, disinformation or fake news.

 

The Role of Information Professionals

What can individuals or information professionals do about fake news and other forms of misinformation or disinformation? First of all, it would help to acquire some training in identifying logical fallacies, that is, mistakes in reasoning dressed up as rational arguments. The University of Texas at El Paso has published a "Master List of Logical Fallacies" (Williamson, n.d.), which is available as open courseware. One of Trump's favorite weapons is the argumentum ad hominem, typically described as an attack on a person's character or physical appearance. For example, Trump's own ad hominem salvos against Bill Clinton include such phrases as "He doesn't know much," "Wild Bill," "There's never been anyone more abusive to women in politics," and "Highly overrated!" The New York Times has kept track of Trump's ad hominem tweets, which are mostly examples of name-calling, and it publishes these in an ongoing list ("The 359 People, Places and Things Donald Trump Has Insulted on Twitter: A Complete List").4

 

The point is that even if the phrase "crooked Hillary" were describing something true, Clinton's assertions that the Russians interfered in the election could only be refuted by specific evidence to the contrary and not by calling her names.

One of the most egregious examples of a logical fallacy was contained in Trump's remarks of August 15, 2017, on the tragedy in Charlottesville, VA, in which dozens of people were injured and three died (the civilian Heather Heyer and two state troopers whose helicopter crashed). Days earlier, a group of neo-Nazis, members of the Ku Klux Klan (KKK) and white supremacists had marched on the campus of the University of Virginia to protest the removal of a statue of Robert E. Lee and were met by counter-protesters. Trump claimed that there were "two sides to every story" and put the counter-protesters on the same moral plane as the neo-Nazis, KKK members and white supremacists (even while he said he was not doing this). He argued that on both sides there were "people that were very fine people"; and in an assertion contrary to the facts, he claimed that "not all of those people were neo-Nazis [...] Not all of those people were white supremacists." One might ask why anyone would join a march espousing the beliefs of any one of these three groups, even in an effort to protest the removal of a statue of Robert E. Lee, unless they were committed to those beliefs. Supposedly, there are "very fine" racists who do not believe what they believe. The logical fallacy, apart from arguing against the facts (see the Fallacy of Alternative Truth or Alt Facts), is a "False Analogy," in which two things are incorrectly compared in order to draw a false conclusion, here the moral equivalence of the two groups, the racists and the counter-protesters, the latter of whom he mislabeled the "alt-left" in order to push the false analogy. Similarly, when he complained about "changing history," "changing culture" and the loss of "a very, very important statue" with the removal of the statue of Robert E. Lee, arguing that Washington and Jefferson were also slave owners and yet none of their monuments would be demolished, he engaged in the fallacy of "Half-Truth."
While it is true that Washington and Jefferson owned slaves, there are considerable differences between the founding fathers, such as George Washington and Thomas Jefferson, and the heroes of the Confederacy, such as Robert E. Lee and Stonewall Jackson (Wang; Breuninger, 2017). The former helped create the United States of America; the latter tried to undermine it. Trump, in his ignorance of American history, fails to grasp this basic distinction. According to him, his knowledge of American history is the truth, whereas all other traditional newspaper or historian accounts are "fake news".

For a good analysis of the variety of logical fallacies that Trump employed on the campaign trail, see the YouTube video "Analyzing Trump: 15 Logical Fallacies in 3 Minutes," 2016.

When a patron comes to the reference desk at the library looking for information to substantiate a fake news story, how is the librarian to react? On the one hand, librarians are supposed to act as impartial information providers. On the other hand, social responsibility demands that stories such as fake news be confronted. Toni Samek (2007, p. 8) notes, in reference to her work Librarianship and Human Rights: A twenty-first century guide, that the book "will help break down the constraints imposed by the myth of library neutrality that divorces library and information work from participation in social struggle, and makes the profession vulnerable to control networks such as economic or political regimes."5 Clearly, the librarian cannot tell patrons that they are wrong to believe in their fake news story and refuse to help. Such a posture would not only be inappropriate professionally but would be unlikely to achieve the desired effect; it would most likely increase the intensity of their false beliefs or opinions. However, information specialists may promote a course of benumbing/stinging and self-realization using the Socratic method, as mentioned earlier. They may lead patrons to sources that question their assumptions (the benumbing or stinging aspect of the Socratic method) and, if successful (bearing in mind the many psychological impediments mentioned above), lead them on to critical thinking and to methods of critical self-awareness, e.g., to learn something of information literacy. In fact, the promotion of information literacy is probably the most important activity that an information specialist can undertake, both with individual patrons and through library programs.

The International Federation of Library Associations offers the following general principles for spotting fake news (International Federation of Library Associations, 2017):

  • Consider the source: investigate the site, its mission and contact information.

  • Read beyond the given site or source, especially if the content is outrageous or intended to inflame.

  • Check the author to see what credentials they have, or whether they are in fact real.

  • Check the supporting resources that are provided: follow the links and see where they lead, to assess the credibility of the supporting resources.

  • Check the date of the story: old news may in fact be old and not currently relevant.

  • Determine whether the site is a spoof or satire, such as many stories that appear in The Onion (http://www.theonion.com/).

  • Check your own biases: no one is unbiased; make sure that you are not prey to your own biases, liberal or conservative.

  • Finally, ask the experts: consult a librarian or subject expert, or check a fact-checking site like Politifact (http://www.politifact.com/).

 

Information Professionals and Their Potential Weaknesses

Information professionals are not immune to some of the varieties of ignorance or misinformation mentioned above. Many information specialists may be prone to missing information or misinformation, due to a lack of knowledge or competence. Is it acceptable for a librarian to assert that Google is always a reliable source of information and urge library patrons to use it? Are librarians behaving unethically by recommending a source of information whose algorithm often produces acceptable results but may not produce the best (or even better) results? Are ethics at stake when information seekers (and librarians) are ignorant of the algorithms that drive Google in such a way that the most popular hits in one search become more popular in the next? The Google ranking algorithm, in general, treats the most popular source as the best source. The powerful influence exerted by Google's page ranking algorithm means that the result of using Google is rather like "the rich getting richer": the sites at the top of a ranked list may be used by others to create links in their own sites, and so their link popularity grows. Thus, there is a bias towards existing popular pages, and new pages or sites tend to be ignored or undervalued. Junghoo Cho and Sourashis Roy (2004) estimated "how much longer it takes for a new page to attract a large number of Web users when search engines return only popular pages at the top of search results." Their work has disturbing implications for the slow impact, and therefore the accessibility, of new web pages. While we cannot deny the utility of search engines like Google or Bing in that they often produce reasonable results, if one is looking for high precision or high recall (in the sense of broad retrieval of related intellectual works in the same subject domain), these engines may not be the best sources to use.
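The "rich get richer" dynamic that Cho and Roy describe can be illustrated with a toy simulation. This is only a sketch under invented assumptions: real search ranking uses far more signals than raw link counts, and all the parameters below (click probabilities, number of pages) are made up for illustration. The model simply assumes that most users click a top-ranked result, which earns that page another link and cements its position, while a page that arrives late starts at the bottom and rarely gets discovered.

```python
import random

def simulate(rounds=1000, n_pages=20, top_k=3, seed=42):
    """Toy model of popularity-biased ranking (not Google's algorithm)."""
    rng = random.Random(seed)
    links = [1] * n_pages          # every page starts with one inbound link
    new_page = n_pages             # index of a page that will arrive late
    for t in range(rounds):
        if t == rounds // 2:
            links.append(1)        # the "new" page enters halfway through
        # Rank pages by current link count; most users only follow
        # the top results, so those pages gain links preferentially.
        ranking = sorted(range(len(links)), key=lambda p: -links[p])
        if rng.random() < 0.9:                  # 90%: click a top result
            links[rng.choice(ranking[:top_k])] += 1
        else:                                   # 10%: find a random page
            links[rng.randrange(len(links))] += 1
    return links, new_page

links, new_page = simulate()
ranking = sorted(range(len(links)), key=lambda p: -links[p])
print("top pages:", ranking[:3])
print("rank of the late-arriving page:", ranking.index(new_page))
```

Even though the late-arriving page may be just as good as any other, it ends up far down the ranking, because the feedback loop keeps rewarding whichever pages happened to be popular first.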

Turning to the subject of databases, information professionals can choose from a wide variety of tools, but they must be cautious in their use or their recommendation, or be guilty of unwitting incompetence or ignorance. For example, what ethical issues are at stake when professional librarians are ignorant of the lack of authority control in commercial, professional and academic databases? As anyone who has worked in the field of database searching knows, databases vary extensively in authority control: from consistent forms of author/journal name/company name entry in high-quality databases (e.g., MEDLINE) to multiple forms of each of these, and from rigorously controlled vocabularies (e.g., EMBASE) to no controlled vocabulary (e.g., citation databases such as the Science Citation Index). In one loosely controlled database, Gale Group Magazine Database, one search produced eight variants of a particular author's name and more than 30 variants of a journal name. Even a quality database like MEDLINE produced 23 variants of one author's name. The problem is not always the indexer's fault. Journals have differing requirements for the approved form of author entry, and the indexer is forced to use the data supplied by the publication: authors' last names followed by the initials of their first and second names, or authors with first names included. The problem is that end-users and some librarians believe that if they enter one variant of an author's name, the computer will automatically map it to all the variants, and the same with journal names, subject headings and the like. The search system does no such mapping, so a search one believes to be precise and complete is not. When the information seeker, either end-user or librarian, searches using one particular form of an author's name, the results are only for that one particular form and no other.
Clearly, while librarians and information professionals act with good intent in advising patrons or information seekers, such good intent does not offset the unwitting reality of missing information, or of a lack of competence through failing to understand the limits of computers and the variability of indexing practices among databases, even databases produced by the same producer. That said, patrons are not often seeking high recall and/or high precision, the situations in which these potential deficiencies are most accentuated. The magnitude of the problem is unknown, but one suspects that it is more common than it should be.
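The variant problem described above can be sketched in a few lines of Python. The sample records and helper names here are hypothetical, invented purely for illustration: an exact-match search, which is what most systems actually perform, retrieves only the one form of the name the user typed, while even a crude normalization to surname plus first initial recovers the other variants.

```python
# Hypothetical bibliographic records showing three forms of one author's name.
records = [
    {"author": "Smith, J. A.",  "journal": "J Am Med Inform Assoc"},
    {"author": "Smith, John A.", "journal": "Journal of the American Medical Informatics Association"},
    {"author": "Smith JA",      "journal": "JAMIA"},
    {"author": "Jones, Mary",   "journal": "Library Trends"},
]

def exact_search(author):
    """What most search systems do: a literal string match, no mapping."""
    return [r for r in records if r["author"] == author]

def normalized(name):
    """Crude normalization: (surname, first initial), lowercased."""
    parts = name.replace(",", " ").replace(".", " ").split()
    return (parts[0].lower(), parts[1][0].lower()) if len(parts) > 1 else (parts[0].lower(), "")

def normalized_search(author):
    key = normalized(author)
    return [r for r in records if normalized(r["author"]) == key]

print(len(exact_search("Smith, J. A.")))       # 1: two variants silently missed
print(len(normalized_search("Smith, J. A.")))  # 3: all variants recovered
```

Real authority control is far more involved than this surname-plus-initial heuristic (which would, for instance, conflate two different J. Smiths), but the sketch shows why a search on one name form is not the "complete" search many end-users assume it to be.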

With regard to citations, the internet also offers librarians (and information seekers) many tools, as do libraries and information centers. Again, however, librarians should exercise caution when recommending a particular tool. For example, high citation counts for a particular journal article or author may indeed successfully lead to related work. But citation analysis rests on assumptions that are sometimes false, for example, the assumption that citing a particular work indicates that the author actually used that work. There are many reasons why authors may include a reference in their paper that in fact they did not use. For example, they may be hoping to benefit from the "halo effect": by citing more prominent authors in the field, they may hope that their work will gain prestige from the citation. For an excellent analysis of the problems of interpreting citations, see Linda Smith's "Citation Analysis" (1981, pp. 83–106), in which she delineates five major points about the problems with citation work. What is the point of this? These tools are useful, but an information professional should know the merits and defects of a particular tool and make sure that, if consulted, they explain these merits and defects to end-users; otherwise, they mislead information seekers and exhibit a level of incompetence. This is a concern for all library or information center tools, including collection development and reference services, both in person and online. The patron or customer can be left unwittingly with missing information or misinformation. It is a noble thing to become an information professional, but one's skills and competencies must be ever honed, especially in an age of missing information, misinformation, disinformation, self-deception and the like.

 

Plato Casts His Shadow to Enlighten Us

We live in a society awash with forms of ignorance, where few reliable signs point to truth. Not only that: the normal signs pointing toward truth have been sullied with and by propaganda. To borrow a phrase from Alice in Wonderland, the truth has become "whatever I want it to be." We have a president who has smashed political orthodoxy through ignorance, incompetence, lack of interest and, according to psychologist John Gartner, at least three personality disorders (narcissistic personality disorder, antisocial personality disorder and paranoid personality disorder) (Psychology Today Editorial Staff, 2017). Gartner is not alone in so thinking: Bandy X. Lee (2017) authored a work entitled The Dangerous Case of Donald Trump: 27 Psychiatrists and Mental Health Experts Assess a President. The conclusion of these professionals is that Trump is unfit to serve as president. As such, he has managed to turn orthodox government into a kakistocracy, a word derived from the ancient Greek kakistos (κάκιστος), meaning "worst": government by the worst, by those least qualified, competent or principled. It is a kakistocracy abetted by a Republican Party that has lost any semblance of moral principle or of its historical foundations. The result is a veritable tidal wave of deception.

Plato still casts his long shadow some 2,400 years after his lifetime. It is a shadow that does not promote ignorance but enables us to see the light. The Socratic profession of ignorance enlightens us by challenging our own ignorance, individual or professional, and society's. It declares that truth matters, that truths matter, in a society that hopes to remain democratic and just. Every librarian and information specialist has a responsibility to promote the truth(s) in the communities they serve, both individually and collectively. Individually, this must be done in a Socratic fashion, challenging the ignorance of information seekers and patrons and cultivating one's own competence; collectively, it must be done through programs that promote information literacy, challenge easy answers to complex questions and make available resources that inspire insight in information seekers, ourselves and the world. In their finest hour, information professionals are signs pointing toward truth(s). There are, as we have shown, many psychological and personal pitfalls that demand proactive critical thinking and rational discourse. Moreover, in another invocation of the Socratic method, librarians must be enabled to cope with their own ignorance and biases (Socratic stinging) and to foster competence in themselves and their colleagues (Socratic midwifery). We face many new challenges in the library and information professions as we become ever more wired and digitized. But to avoid violating information ethics or practicing the ethics of ignorance, we must remain current, competent and knowledgeable (including knowing when our skill set has been exceeded), and we must seek out programs that promote individual and collective information literacy. Only in this way can we move out of the cave and into the light.

 

Bibliography

"The 359 People, Places and Things Donald Trump Has Insulted on Twitter: A Complete List" (2017), New York Times, August 15, 2017. <https://www.nytimes.com/interactive/2016/01/28/upshot/donald-trump-twitter-insults.html?_r=0>. [Retrieved: August 18, 2017].

"Agnotology" (2016). In: Wikipedia. <https://en.wikipedia.org/wiki/Agnotology>. [Retrieved: September 9, 2016].

"Analyzing Trump: 15 Logical fallacies in 3 Minutes" (2016), March 6, 2016. <https://www.youtube.com/watch?v=w2CxDu7jiyE>. [Retrieved: August 18, 2017].

Apuzzo, Matt (2017). "F.B.I. Agents Supported Comey, Surveys Show, Weakening Trump's Claim of Turmoil," The New York Times, August 16, 2017. <https://www.nytimes.com/2017/08/16/us/politics/comey-fbi-agents-confidence-survey.html>. [Retrieved: October 26, 2017].

Carroll, Lauren (2016). "Hillary Clinton Blames High-up Russians for Wikileaks Releases," Politifact, October 19, 2016. <http://www.politifact.com/truth-o-meter/statements/2016/oct/19/hillary-clinton/hillary-clinton-blames-russia-putin-wikileaks-rele/>. [Retrieved: August 18, 2017].

Carroll, Lauren; Jacobson, Louis (2017). "Fact-checking Trump's TIME Interview on Truths and Falsehoods," Politifact, March 23, 2017. <http://www.politifact.com/truth-o-meter/article/2017/mar/23/fact-checki…>. [Retrieved: October 25, 2017].

Chisholm, Roderick M.; Feehan, Thomas D. (1977). "The Intent to Deceive," Journal of Philosophy, 74, 143–159.

Cho, Junghoo; Roy, Sourashis (2004). "Impact of Search Engines on Page Popularity," WWW2004, May 17–22, 2004, New York, NY. <http://oak.cs.ucla.edu/~cho/papers/cho-bias.pdf>. [Retrieved: August 18, 2017].

"Doxing" (2017). In: Wikipedia. <https://en.wikipedia.org/wiki/Doxing>. [Retrieved: November 24, 2017].

"Dunning–Kruger effect" (2017). In: Wikipedia.<https://en.wikipedia.org/wiki/Dunning%E2%80%93Kruger_effect>. [Retrieved: November 24, 2017].

"Fake news"(2017). In: Wikipedia. <https://en.wikipedia.org/wiki/Fake_news>. [Retrieved: November 24, 2017].

Fallis, Don (2009). "A Conceptual Analysis of Disinformation," in iConference Proceedings. <http://hdl.handle.net/2142/15205>. [Retrieved: June 6, 2016].

Fallis, Don (2014). "The Varieties of Disinformation" In: L. Floridi and P. Illari (eds.), The Philosophy of Information Quality, Synthese Library 358, DOI 10.1007/978-3-319-07121-3_8, Springer International Publishing Switzerland.

"FOX's File" PunditFact, (n.d.). <http://www.politifact.com/punditfact/tv/fox/>. [Retrieved: September 15, 2017].

Frank Luntz Republican Playbook: Part X, "Appendix: The 14 Words Never To Use" (n.d.). <http://journalism.uoregon.edu/~tbivins/J496/readings/LANGUAGE/wordsnevertosay.pdf>. [Retrieved: August 18, 2017].

Grayson, Alan (2011). "Alan Grayson exposes Frank Luntz!," email sent to Veritas/Aequitas, posted approximately 2011. <http://veritas-aequitas-zeke.tumblr.com/post/10862766787/alan-grayson-exposes-frank-luntz>. [Retrieved: July 6, 2016].

"Hamilton 68 Website Tracks Russian-backed Propaganda on Twitter" (2017), Reuters, August 3, 2017. <https://venturebeat.com/2017/08/03/hamilton-68-website-tracks-russian-backed-propaganda-on-twitter/>. [Retrieved: August 20, 2017].

International Federation of Library Associations (2017). "How to Spot Fake News," International Federation of Library Associations publications, updated August 13, 2017. <https://www.ifla.org/publications/node/11174>. [Retrieved: August 18, 2017].

Kosoff, Maya (2017). "The Russian Troll Farm That Weaponized Facebook Had American Boots on the Ground," Vanity Fair Hive: Technology, October 18, 2017. <https://www.vanityfair.com/news/2017/10/the-russian-troll-farm-that-weaponized-facebook-had-american-boots-on-the-ground>. [Retrieved: October 26, 2017].

Lee, Bandy X. (2017). The Dangerous Case of Donald Trump: 27 Psychiatrists and Mental Health Experts Assess a President. New York: Thomas Dunne Books.

Loewenstein, George; Golman, Russell; Hagmann, David (2017). "Information Avoidance: How People Select Their Own Reality," Journal of Economic Literature, vol. 55, no. 1, pp. 96–135.

Muyskens, John (2016). "Most of Trump's Charts Skew the Data. And Not Always in His Favor," The Washington Post, October 31, 2016. <https://www.washingtonpost.com/graphics/politics/2016-election/trump-charts/>. [Retrieved: October 25, 2017].

Oosting, Jonathan (2017). "Study: Fake Election News Flooded Mich. Twitter Feeds," Detroit News, April 3, 2017. <http://www.detroitnews.com/story/news/politics/2017/04/03/study-fake-election-news-flooded-mich-twitter-feeds/99997558/>. [Retrieved: October 26, 2017].

"The Persuaders" (2004). Interview Frank Luntz, Frontline, Public Broadcasting System (PBS), November 9, 2004. <http://www.pbs.org/wgbh/pages/frontline/shows/persuaders/interviews/luntz.html>. [Retrieved: May 30, 2016].

Pettigrew, Thomas F. (2017). "Social Psychological Perspectives on Trump Supporters," Journal of Social and Political Psychology, 2017, vol. 5. <https://jspp.psychopen.eu/index.php/jspp/article/view/750/html>. [Retrieved: August 18, 2017].

Plato, Republic (514a–520a).

Poundstone, William (2017). "The Dunning-Kruger President," Psychology Today, January 21, 2017. <https://www.psychologytoday.com/blog/head-in-the-cloud/201701/the-dunning-kruger-president>. [Retrieved: August 18, 2017].

Psychology Today Editorial Staff (2017). "Shrinks Battle Over Diagnosing Donald Trump," Psychology Today, January 31, 2017. <https://www.psychologytoday.com/blog/brainstorm/201701/shrinks-battle-over-diagnosing-donald-trump>. [Retrieved: August 10, 2017].

Rea, Shilo (2017). "Information Avoidance: How People Select Their Own Reality," Carnegie Mellon University News, March 13, 2017. <http://www.cmu.edu/news/stories/archives/2017/march/information-avoidance.html?utm_source=pocket&utm_medium=email&utm_campaign=pockethits>. [Retrieved: July 7, 2017].

Rieh, Soo Young (2010). "Credibility and Cognitive Authority of Information" In: M. Bates & M. N. Maack (Eds.) Encyclopedia of Library and Information Sciences, 3rd Ed. (pp. 1337–1344), New York: Taylor and Francis Group, LLC. <http://hdl.handle.net/2027.42/106416>. [Retrieved: August 18, 2017].

Rieh, Soo Young (n.d.), "Cognitive Authority," <http://rieh.people.si.umich.edu/papers/rieh_IBTheory.pdf>. [Retrieved: August 18, 2017].

Samek, Toni (2007). "An Urgent Context for Twenty-first Century Librarianship," in Librarianship and Human Rights: A Twenty-first Century Guide. Oxford: Chandos.

Shane, Scott (2017). "The Fake Americans Russia Created to Influence the Election," New York Times, September 7, 2017. <https://www.nytimes.com/2017/09/07/us/politics/russia-facebook-twitter-election.html>. [Retrieved: October 26, 2017].

Simpson, Ian (2017). "Man Pleads Guilty in Washington Pizzeria Shooting over Fake News," Reuters, March 24, 2017. <http://www.reuters.com/article/us-washingtondc-gunman-idUSKBN16V1XC>. [Retrieved: August 18, 2017].

Smith, Alexander; Banic, Vladimir (2017). "Fake News: How a Partying Macedonian Teen Earns Thousands Publishing Lies," NBC News, December 6, 2016. <http://www.nbcnews.com/news/world/fake-news-how-partying-macedonian-teen-earns-thousands-publishing-lies-n692451>. [Retrieved: August 13, 2017].

Smith, Linda (1981). "Citation Analysis," Library Trends, vol. 30 (1), June 1981, pp. 83–106.

Stafford, Tom (2016). "How Liars Create the 'Illusion of Truth'," BBC Future, October 26, 2016. <http://www.bbc.com/future/story/20161026-how-liars-create-the-illusion-of-truth>. [Retrieved: August 18, 2017].

"Truthiness" (2017). In: Wikipedia. <http://en.wikipedia.org/wiki/Truthiness>. [Retrieved: November 24, 2016].

Wang, Christine; Breuninger, Kevin (2017). "Read the Transcript of Donald Trump's Jaw-dropping Press Conference," CNBC News, published August 15, 2017, updated August 16, 2017. <https://www.cnbc.com/2017/08/15/read-the-transcript-of-donald-trumps-jaw-dropping-press-conference.html>. [Retrieved: August 18, 2017].

Waters, Anna; Hargadon, Sean (2017). "Mind the Misinformation," Northwestern Campus Life, Spring 2017. <http://www.northwestern.edu/magazine/spring2017/campuslife/mind-the-misinformation-david-rapp-explains-appeal-of-fake-news.html>. [Retrieved August 18, 2017].

Williamson, Owen (n.d.). "A Short Course in Intellectual Self-Defense," A Master List of Logical Fallacies. <http://utminers.utep.edu/omwilliamson/ENGL1311/fallacies.htm>. [Retrieved: August 18, 2017].

Wilson, Patrick (1983). Second Hand Knowledge: An Inquiry into Cognitive Authority. Westport, Conn.: Greenwood Press.

 

Notes

1 The author would like to thank Richard Rubin, Ph.D., Professor Emeritus, School of Information, Kent State University, for his immense help in editing the manuscript; other colleagues of the School of Information at Kent State University for edits and suggestions; Cristóbal Urbano, Faculty of Library and Information Science, Universitat de Barcelona, for his friendship, encouragement and patience in the development of this paper; and the Faculty of Library and Information Science, Universitat de Barcelona, for inviting me to provide lectures for students and the public in 2014. This paper is a follow-up article to my article published in 2004, "A Brief History of Information Ethics," BiD: textos universitaris de biblioteconomia i documentació, issue 13, December 2004, ISSN 1575-5886, DL B-19.675-1998 available at: <https://bid.ub.edu/13froel2.htm>. That article inspired the somewhat convoluted title of the current paper.

2 Super PACs are political committees that nominally support a candidate with unlimited donations from companies, unions or individuals, while operating independently and even anonymously. Although they cannot directly support a candidate, they can run ads favorable to that person or unfavorable to that person's opponents.

3 Natalie Kiriazis, a student in my summer 2017 information ethics workshop, referred to this news story in her essay on fake news.

4 The source continuously updates, so that if one clicks on the link, one will be given an updated page.

5 Katy Tribuzzo, a student in my summer 2017 information ethics workshop, offered this reference in her essay on fake news, reminding me of the devotion of Toni Samek to promoting the social responsibilities of librarians.

 
