Encyclopedia of Deception


Edited by: Timothy R. Levine


      About the Editor

      Timothy R. Levine, Ph.D., is professor of communication and media at Korea University, South Korea, where he teaches and conducts research on topics related to interpersonal communication, persuasion, social influence, and social-scientific research methods. In addition to teaching graduate and undergraduate classes, Levine conducts training for police, attorneys, and people in the intelligence and counterintelligence communities. Before Korea University, he held faculty appointments at Michigan State University, Indiana University, and the University of Hawai'i. He has published more than 100 refereed journal articles reporting original research related to communication.

      Levine is an internationally recognized leader in deception research. He is the author or coauthor of Information Manipulation Theory, Truth Default Theory, The Veracity Effect, The Probing Effect, and The Park-Levine Probability Model. His research on deception has been funded by the National Science Foundation, the U.S. Department of Defense, and the Federal Bureau of Investigation. His current research focuses on what makes some people more believable than others and on effective interrogation strategies.

      List of Contributors

      • Owen Anderson

        Arizona State University

      • Michael A. Applegate

        Westminster College

      • Jacqueline Arnold

        Texas Tech University

      • Michael A. Arntfield

        University of Western Ontario

      • Elliot Aronson

        University of California, Santa Cruz

      • Majed Ashy

        Bay State College

      • Randy Azzato

        Bay State College

      • Therese K. Bailey

        Cornell University

      • Alyssa N. Baker

        Westminster College

      • Webster B. Baker

        Saint Leo University

      • Emily Balcetis

        New York University

      • Andrew Bass

        Cornell University

      • Michael J. Beatty

        University of Miami

      • Valarie Bell

        University of Nevada, Reno

      • Morgan Beller

        Cornell University

      • Kevin R. Binning

        University of California, Santa Barbara

      • Geoff Bird

        Birkbeck, University of London

      • Jordan Blackhurst

        Southern Illinois University, Edwardsville

      • Rachel Blady

        Cornell University

      • Amanda Blinebry

        University of Missouri, St. Louis

      • Charles Bond

        Texas Christian University

      • Michiel Bot

        New York University

      • S. Renee Botton

        University of Nevada, Reno

      • William Brady

        New York University

      • Marco Briziarelli

        University of New Mexico

      • Megan C. Brown

        Drake University

      • Judee K. (Kathelene) Burgoon

        University of Arizona

      • Jason A. Cantone

        Federal Judicial Center

      • Elisabeth Carter

        Buckinghamshire New University

      • Stacy L. Carter

        Texas Tech University

      • Sean D. Carter-Hopkins

        Knox College

      • Rod Carveth

        Morgan State University

      • Joseph R. Castro

        Syracuse University

      • Zoë Chance

        Yale University

      • Felix O. Chima

        Prairie View A & M University

      • James J. Chriss

        Cleveland State University

      • David E. Clementson

        Ohio State University

      • Kevin Cochran

        University of California, Irvine

      • Shana Cole

        New York University

      • Tim Cole

        DePaul University

      • Shirley M. Crawley

        Western Connecticut State University

      • Erin Crecelius

        California State University, Fullerton

      • Nicholas J. Crowe

        Centre For Medieval & Renaissance Studies, Oxford

      • Madeleine T. D'Agata

        Queen's University

      • Colin Dailey

        Cornell University

      • Rick Dale

        University of California, Merced

      • Douglas J. Dallier

        University of Maryland

      • Phillip J. Dalton

        Cornell University

      • Kimberly M. Danner

        Oakland University

      • Daniel Cochece Davis

        Illinois State University

      • Deborah Davis

        University of Nevada, Reno

      • Martin V. Day

        Princeton University

      • Evelyne Debey

        Ghent University

      • James I. Deutsch

        Smithsonian Institution

      • Patrick J. Dillon

        University of Memphis

      • Holly Domke

        Cornell University

      • Shane Dunau

        Cornell University

      • Adam Dunbar

        University of California, Irvine

      • Norah E. Dunbar

        University of Oklahoma

      • Nicholas D. Duran

        University of California, Merced

      • Sonja Edwards

        University of Southampton

      • Tiffany Edwards

        Southern Illinois University, Edwardsville

      • Gaven A. Ehrlich

        Syracuse University

      • Aaron Elkins

        University of Arizona

      • Sarah Elliot

        Cornell University

      • Don Fallis

        University of Arizona

      • Chantelle Farmer

        Cornell University

      • Hannah Elizabeth Fawcett

        Manchester Metropolitan University

      • Carey Fitzgerald

        Oakland University

      • Brianna Fowler

        Cornell University

      • Steven Frenda

        University of California, Irvine

      • James Geistman

        Ohio Northern University

      • Ashley George

        University of Alabama

      • James I. Gerhart

        Rush University Medical Center

      • Matthew Ghiglieri

        University of Nevada, Reno

      • Tobias T. Gibson

        Westminster College

      • Christopher Givan

        Westminster College

      • Trangdai Glassey-Tranguyen

        University of California, San Diego

      • Anne K. Gordon

        Bowling Green State University

      • Joseph R. Goulet

        University of Maryland

      • Jamie Graham

        California State University, Fullerton

      • Richard H. Gramzow

        Syracuse University

      • Pär Anders Granhag

        University of Gothenburg

      • Yael Granot

        New York University

      • Danica Gredoña

        Cornell University

      • Aiden P. Gregg

        University of Southampton

      • Alyssa Gretak

        Southern Illinois University, Edwardsville

      • Rosanna E. Guadagno

        University of Alabama

      • Corey Guenther

        Creighton University

      • Swati Gupta

        Institute of High Performance Computing

      • Jennifer A. Guthrie

        University of Nevada, Las Vegas

      • Lauren M. Hamel

        Wayne State University

      • Jeffrey T. Hancock

        Cornell University

      • Dennis J. Hand

        University of Vermont

      • Geraldine Hannon

        Southern Illinois University, Edwardsville

      • Ryan D. Harrison

        Westminster College

      • Joseph Hartgerink

        Tilburg University

      • Maria Hartwig

        John Jay College

      • Frederick Hawley

        Western Carolina University

      • Spencer Haze

        Trent University Oshawa

      • Theresa Storey Hefner-Babb

        Lamar University

      • Heather Heimbaugh

        University of Missouri, St. Louis

      • Jason A. Helfer

        Knox College

      • Catherine Ho

        Cornell University

      • Michael Hoerger

        Tulane University

      • Caitlin Homan

        University of Portsmouth

      • Anne Hubbell

        New Mexico State University

      • Stefan Huynh

        New York University

      • Jill A. Jacobson

        Queen's University

      • J. Jacob Jenkins

        California State University, Channel Islands

      • Jason R. Jolicoeur

        Ivy Tech Community College

      • Aleiah Jones

        University of Toledo

      • James Kang

        Cornell University

      • Thomas Kelley

        Wayne State University

      • Robert Kennedy

        University of California, Santa Barbara

      • Robert Wade Kenny

        Mount Saint Vincent University

      • Holly L. Ketterer

        Central Michigan University

      • Jeffrey Kraus

        Wagner College

      • Bill Kte'pi

        Independent Scholar

      • Adrianne D. (Dennis) Kunkel

        University of Kansas

      • Anastacia Kurylo

        Fortified Communication Consulting

      • Brandon Kuss

        Cerner Corporation

      • Patrice Lawless

        Cornell University

      • Justin J. Lehmiller

        Harvard University

      • Timothy Levine

        Korea University

      • Amanda Levis

        Yale University

      • Helen Lee Lin

        University of Houston

      • Kevin Lin

        Cornell University

      • John T. Llewellyn

        Wake Forest University

      • Elizabeth F. Loftus

        University of California, Irvine

      • Meghan R. Lowery

        Psychological Associates

      • Arthur J. Lurigio

        Loyola University Chicago

      • Joan Luxenburg

        University of Central Oklahoma

      • Stacey L. MacKinnon

        University of Prince Edward Island

      • Nikhila Mahadevan

        University of Southampton

      • James Edwin Mahon

        Washington & Lee University

      • Jaibel Makiyil

        Nova Southeastern University

      • Kathleen Malley-Morrison

        Boston University

      • William David Marelich

        California State University, Fullerton

      • Kent Marett

        Mississippi State University

      • David Markowitz

        Cornell University

      • Tamara Marksteiner

        University of Mannheim

      • Jaume Masip

        University of Salamanca

      • Brian Mayer

        Cornell University

      • Steven A. McCornack

        Michigan State University

      • Matthew S. McGlone

        University of Texas, Austin

      • Joseph McGlynn

        University of Texas, Austin

      • Nicholas McLean

        Yale University

      • Rachel Meisinger

        Creighton University

      • Nicholas Merola

        University of Texas, Austin

      • Robert J. Meuller III

        Knox College

      • Michael Milburn

        University of Massachusetts, Boston

      • Lauren S. Miller

        Syracuse University

      • William J. Miller

        Flagler College

      • Carol Bishop Mills

        University of Alabama

      • Michael Minaudo

        University of Nevada, Reno

      • Tayopa Mogilner

        University of California, Irvine

      • Diane M. Monahan

        Saint Leo University

      • Christopher Monteiro

        University of Massachusetts, Boston

      • Rosey Morr

        Southern Illinois University, Edwardsville

      • Wendy L. Morris

        McDaniel College

      • Kelly Morrison

        Michigan State University

      • Federika Garcia Muchacho

        Bay State College

      • Lauren Murphy

        Southern Illinois University, Edwardsville

      • Tony Murphy

        Sheffield Hallam University

      • Joel T. Nadler

        Southern Illinois University, Edwardsville

      • Patrick J. Nebl

        Bowling Green State University

      • Ololade Ogundimu

        University of Nevada, Reno

      • Luke A. Oosterbaan

        Knox College

      • Yok-Fong Paat

        University of Texas at El Paso

      • Ruwan Pallegedara

        Cornell University

      • Lawrence Patihis

        University of California, Irvine

      • Marcus Patterson

        University of Massachusetts, Boston

      • Melissa Pauasci

        Cornell University

      • Lindsay Perez

        University of Nevada, Reno

      • Narissra Maria Punyanunt-Carter

        Texas Tech University

      • Elizabeth Rholetter Purdy

        Independent Scholar

      • Marc-André Reinhard

        University of Mannheim

      • Rachel Malis Reznik

        Elmhurst College

      • Matthew Riccio

        Columbia University

      • Timothy D. Ritchie

        University of Limerick

      • Beatriz Rivera-Barnes

        Pennsylvania State University

      • Sungjong Roh

        Cornell University

      • Rachel Romero

        Texas State University

      • Paul Rose

        Southern Illinois University, Edwardsville

      • Kevin Rounding

        Queen's University

      • Kayo Sakamoto

        Institute of High Performance Computing

      • Ilana Sandler

        Cornell University

      • Juan Pablo Sarmiento

        Cornell University

      • Huda Sarraj

        Texas Tech University

      • Antoinette W. Satterfield

        U.S. Naval Academy

      • Juliann Scholl

        Texas Tech University

      • Stephen T. Schroth

        Knox College

      • Kim B. Serota

        Oakland University

      • Sana Sherali

        New York University

      • Jeanetta Sims

        University of Central Oklahoma

      • Kirsten Smith

        Southern Illinois University, Edwardsville

      • Evan Mitchell Stark

        Message Science, Inc.

      • Susan Stearns

        Eastern Washington University

      • Mariëlle Stel

        Tilburg University

      • R. Weylin Sternglanz

        Nova Southeastern University

      • Victor B. Stolberg

        Essex County College

      • Chris Street

        University College London

      • Jennifer Summary

        Southeast Missouri State University

      • Barry L. Swanson

        Knox College

      • Ian W. Tingen

        University of California, Irvine

      • Tracy Faye Tolbert

        California State University, Long Beach

      • Marcella Bush Trevino

        Barry University

      • Anna Elisabeth van 't Veer

        Tilburg University

      • Lyn Van Swol

        University of Wisconsin, Madison

      • Karen Vanderzanden

        Creighton University

      • Bruno Verschuere

        University of Amsterdam

      • J. Guillermo Villalobos

        University of Nevada, Reno

      • Beth A. Visser

        Trent University

      • Kimberly Voss

        University of Central Florida

      • Aldert Vrij

        University of Portsmouth

      • Courtney A. Waid-Lindberg

        Bemidji State University

      • Andrew Jackson Waskey

        Dalton State College

      • Susan E. Waters

        Auburn University

      • Stan Weeber

        McNeese State University

      • Robert Westerfelhaus

        College of Charleston

      • Michael J. Williams

        University of Nevada, Reno

      • V. Skye Wingate

        Morehead State University

      • Eilene Wollslager

        University of Texas at San Antonio

      • Sarah E. Wood

        University of Wisconsin, Stout

      • Colin Wright

        University of Nottingham

      • Gordon R. T. Wright

        University of London

      • Iris Shuang Xia

        Texas Tech University

      • Michael Zeigler

        University of Nevada, Reno

      • Julia Zhu

        Cornell University

      • Jordan Zurowski

        Southern Illinois University, Edwardsville


      Few topics capture people's curiosity and interest as much as deception. Deception is the stuff of soap operas, spy thrillers, multimillion-dollar frauds, and poker championships. There are the deadly serious deceptions during wartime, as well as harmless pranks such as the jackalope and April Fool's Day. (If you have not seen a jackalope, it is a jackrabbit with antelope horns.) The famous deceptions throughout history are too numerous to mention. Examples include the biblical serpent in the Garden of Eden and the scientific hoax of Piltdown Man.

      Deception is not an exclusively human activity; examples of deception are also common in biology. Primates are capable of quite sophisticated deception. However, humans do not develop the cognitive capacity to fully engage in true deception until somewhere between the ages of 3 and 5.

      Deception is most often defined as intentionally or knowingly misleading another individual or group, although sometimes self-deception is included as well. Deception comes in many shades and flavors such as outright lies, white lies, omission, evasion, equivocation, puffery, half truths, and so forth. People lie for all kinds of reasons. All of these variations are covered in the two volumes of this encyclopedia, which survey the wide variety of lies and deceptions in their many manifestations. Deception is examined from the vantage points of social science, history, philosophy, and pop culture, to name a few. The Encyclopedia of Deception is the authoritative source on the topic.

      Popular Draw of Deception

      Perhaps the best recent example of how deception captures public interest was the initial popularity of the crime drama television series Lie to Me on Fox Network. The main character, Dr. Cal Lightman (played by Tim Roth), is a psychologist with expertise in reading body language and facial expressions, especially brief expressions called microexpressions. Lightman and his team solve crimes through detecting lies and reading human behavior. With approximately 11 million viewers in the first season, the show initially had strong ratings and ran for three seasons. Besides the United States, the show was aired in Canada, Australia, South Africa, Latin America, and several countries in Europe.

      The premise of Lie to Me is based on the ideas of real-life psychologist Paul Ekman. However, Ekman's ideas are controversial, and the show may not depict science as much as suggested by its marketing. As a skeptic, I did a little experiment that was published in the academic, peer-reviewed journal Communication Research. We showed research participants either an episode of Lie to Me or a different crime drama, Numb3rs. A third group viewed no show at all. Then, we had our participants try a deception detection task. Watching Lie to Me did not make people any better at distinguishing truths and lies, but it did make subjects more cynical than either of the two control conditions. Lie to Me was more about entertaining fiction than solid science, but it exemplifies the draw of the topic.

      Deception Detection and Deception Research

      Deception detection is also big business. Web sites such as http://www.liespotting.com promise “proven techniques to detect deception.” Paul Ekman's http://EkmanInternational.com offers “cutting edge behavioral science for real-world applications.” There are also http://EyesforLies.com and http://Humintell.com. John E. Reid and Associates train thousands of people, especially law enforcement professionals, each year in the Reid Technique. The Encyclopedia of Deception covers the gamut of deception detection methods, from the scientifically discredited to the most promising.

      In the realm of deception, things are often not what they seem. It is a very common belief, for example, that liars will not look one in the eye while lying. One might be surprised at just how widespread that belief is. Research by psychologist Charles Bond has found that the gaze aversion belief is widespread around the world. He and his team surveyed people in more than 70 different countries and found that the liars-won't-look-you-in-the-eye belief was nearly universal. Yet much experimental work reveals that belief to be objectively false. Gaze is unrelated to actual honesty and has zero validity as a lie-detection cue.

      There has been much research on nonverbal cues to deception dating back to the original work of Ekman and his idea of leakage. It is well documented that people use others' nonverbal behaviors as a way to detect lies. My research and that of many others have strongly supported people's reliance on observations of others' nonverbal behaviors when assessing honesty. Many psychological theories also specify a link between lying and nonverbal behaviors, as portrayed on Lie to Me. However, social scientific research on the link between various nonverbal behaviors and the act of lying suggests that the link is typically not very strong or consistent. In my research, I have observed that the nonverbal signals that seem to give one liar away are different from those given by a second liar. Further, people do not give away their lies the same way every time, and honest people often enact those supposed lie-revealing behaviors. What's more, the scientific evidence linking nonverbal behaviors and deception has grown weaker over time. People infer honesty based on how others nonverbally present themselves, but such inferences have very limited utility and validity.

      Research on lie detection suggests that without the aid of technology such as the polygraph, people are often poor lie detectors. In most deception detection experiments, people typically do statistically better than chance, but usually not by much. However, very recent research suggests promising new approaches, such as the strategic use of evidence and content in context. Strategic use of evidence involves withholding what one knows to see if the person contradicts that knowledge. Even with the aid of technology such as the polygraph, lie detection is not perfect. Research is progressing on other technologies such as fMRI and thermal imaging, but the polygraph in conjunction with a skilled examiner may be the best approach for now. All of these methods and many more are covered in the Encyclopedia of Deception.

      Deception is not limited to humans; in fact, deception in nature is commonplace and varied. Perhaps you and your dog have engaged in a “fake out” while playing fetch. Examples of deception in nature include camouflage and mimicry. Primatologists have also observed gorillas engaged in deception. For example, a band of gorillas is seen walking along a jungle path in single file. One gorilla, spying a desired food in a nearby tree, stops by the side of the trail and grooms until the others are out of sight. Then, the gorilla grabs the food, quickly eats it, and hurries to join the band.

      Historical Deceptions

      There are many famous lies and deceptions throughout history. There were P. T. Barnum, Niccolò Machiavelli, Sun Tzu, Operation Mincemeat, and the Trojan horse. As long as people have communicated, there have been honesty and deception. As long as history has been recorded, there has been a record of deception in human affairs.

      Biblical examples of deception are numerous. Besides the serpent lying to Eve about the forbidden fruit, Abraham's wife told the Egyptians that she was Abraham's sister. Jacob was deceived by his sons about the death of his favorite son, Joseph. The entry on deception in ancient civilizations describes at least eight stories of lies that are told in the book of Genesis alone.

      World War II is another source of well-known deceit. Nazi propaganda was rife with deceit targeted toward internal and external audiences. However, deception was not exclusively practiced by Germany during the war. For example, the Allies caught Germany largely by surprise with the invasion of Normandy on June 6, 1944. The successful Allied plan was to fool Germany into believing that the real invasion at Normandy was merely a diversion, and that the actual invasion would happen elsewhere. False orders and fake troop movements were created to mask actual troop buildups.

      Lies happen around the world. The concepts of lies and deception are pancultural. Although every major world religion frowns on deception, people everywhere engage in it. However, culture also shapes how deception is enacted and understood. These volumes cover deception around the world, such as how the Spanish conquistador Francisco Pizarro deceived the Inca Empire, and how deception is approached in Arab cultures of the Middle East.

      Politics, Commerce, and Personal Life

      No encyclopedia of lies and deception could be complete without covering lies and deception in politics. When this introduction was written, it seemed that the last major national U.S. election (Obama versus Romney) was marked by an especially high prevalence of deception. However, there is never a shortage of spin on the national news networks of FOX, MSNBC, or CNN. In the past, there was Nixon's Watergate and Bill Clinton's denial of sex with “that woman.” The Encyclopedia of Deception covers government propaganda as well as the decline in public trust in government—a trust that is essential for a functioning democracy—and the scandals, corruption, influence peddling, and lies by public officials that undercut this trust.

      Lies abound in business and commerce. There is puffery in advertising and false claims on resumes. There are Ponzi schemes and defective product cover-ups.

      Finally, there are lies in our personal lives. For example, Notre Dame linebacker Manti Te'o's fictitious girlfriend was a recent case that captured wide media attention. One study found that as much as 61.5 percent of statements in important relationships were less than fully honest. Another study found that only about one-quarter of people thought that complete honesty was important in maintaining a romantic relationship, and that most people think that being honest depends on the situation. However, discovered lies can also harm relationships, especially when the lies are about important issues.

      While one might think that one is better able to detect lies from people close to one as opposed to strangers, the opposite seems to be the case. Steven McCornack's well-known research finding was that as one develops close relationships, trust and truth bias increase, and truth bias blinds one to a partner's lies.

      This two-volume Encyclopedia of Deception provides nearly 350 entries examining all facets of lying and deception. Philosophical and historical perspectives are offered. Examples of deception from around the world and throughout history are recounted. The new social science of deception receives comprehensive examination, and deception in relationships and popular culture is also covered. On behalf of the excellent contributors, we hope you find this both a valuable reference set and an entertaining and engaging reading experience.

      Timothy R. Levine, Editor


      3rd century B.C.E.: Archimedes (ca. 287–212 B.C.E.) discovers that he can determine the purity of gold in a crown by measuring how much water it displaces. If it is made of solid gold, it will displace the same amount of water as an equal weight of pure gold, whereas if it is made of gold combined with a lighter metal, it will displace more water.
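      The arithmetic behind the displacement test can be sketched briefly. The densities below are the standard values for gold and silver; the crown's mass and its alloy mix are hypothetical numbers chosen only for illustration.

```python
# Sketch of Archimedes' displacement test with hypothetical numbers.
# Densities in g/cm^3 (standard values).
DENSITY_GOLD = 19.3
DENSITY_SILVER = 10.5

def displaced_volume(mass_gold_g, mass_silver_g):
    """Water displaced (cm^3) by an object made of gold and silver."""
    return mass_gold_g / DENSITY_GOLD + mass_silver_g / DENSITY_SILVER

crown_mass = 1000.0  # hypothetical 1 kg crown

pure = displaced_volume(crown_mass, 0.0)          # solid gold
alloyed = displaced_volume(0.7 * crown_mass,      # 70% gold,
                           0.3 * crown_mass)      # 30% silver by mass

print(f"pure gold: {pure:.1f} cm^3")    # ~51.8 cm^3
print(f"alloyed:   {alloyed:.1f} cm^3") # ~64.8 cm^3
# Equal weights, but the alloyed crown displaces more water,
# exposing the substitution of a lighter metal.
```

      Because silver is roughly half as dense as gold, diluting the gold raises the volume per unit weight, which the water level reveals directly.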

      ca. 8th century C.E.: A forged document, the Donation of Constantine, is used to justify papal supremacy.

      ca. 13th century: Christian crusaders in the Holy Land create forged versions of the coins used by the local population.

      1699: The English con artist and counterfeiter William Chaloner is executed; Sir Isaac Newton, master of the Royal Mint, plays a key role in his conviction.

      1702: The British collector William Charlton claims to have found a rare yellow butterfly with an unusual pattern of black spots on its wings. It is included in the 12th edition of Carl Linnaeus's Systema Naturae, but is later found to be a hoax, a common Brimstone butterfly with painted spots.

      1720: In England, the value of a share of stock in the South Sea Company rises from 128 pounds in January to over 1,000 pounds by August, only to fall to 100 pounds by the end of the year; this is known as the “South Sea bubble.”

      1760: Fragments of Ancient Poetry Collected in the Highlands, marketed as an English translation of work by the 3rd-century Gaelic poet Ossian, is published. It is later revealed to be the work of James Macpherson, a contemporary Scotsman.

      1784: The French Academy of Sciences investigates the work of Anton Mesmer, who claims that he can heal illness through the use of magnets. The academy concludes that Mesmer's results stem from the power of suggestion.

      1836: The European naturalist Constantine S. Rafinesque claims to have discovered the Walam Olum, a document written in the Lenape language (used by the Delaware Indians) on bark, describing how the Indians populated North America. It is believed for decades to be genuine; not until 1996 was it demonstrated to be a hoax, as an examination of Rafinesque's papers proved that he first wrote it in English, and then translated it into Lenape.

      1840: England issues the first postage stamp, the “penny black.” Forged versions begin appearing within a year.

      1841: The Scottish author Charles Mackay publishes Extraordinary Popular Delusions and the Madness of Crowds, reporting on a number of historical phenomena including economic bubbles, prophecies, fortune telling, and witch hunts.

      1848: American sisters Katherine and Margaret Fox report hearing strange rappings in their home, reportedly the result of spirit communications, and later put on public exhibitions of similar rappings. The scientist Sir William Crookes, among others, is taken in by their performance, which, the sisters reveal in 1888, they produced by cracking their toe joints.

      1857: The Boston Courier offers a $500 prize for anyone who can produce a spiritualistic phenomenon in the presence of Benjamin Peirce, E. N. Horsford, and Louis S. Agassiz, professors at Harvard. Several well-known mediums try and fail to produce evidence sufficient to claim the prize.

      1861: William H. Mumler, an engraver in Boston, begins selling “spirit photographs,” portraits in which a mysterious second image appears alongside that of the sitter. Mumler is exposed as a fraud in 1863, when someone recognizes that the “spirits” bear a remarkable resemblance to living persons in Boston.

      1863: The U.S. Congress passes the False Claims Act, also known as the Lincoln Law because it was passed during Abraham Lincoln's presidency, in response to high levels of fraud perpetrated during the Civil War. The act allows people to report cases of suspected fraud against the government and collect a portion of the damages recovered.

      1864: Samples from a meteor shower in southern France are collected and sent to various museums around Europe. In the 1960s, several researchers examined the samples in the Musée d'Histoire Naturelle in Montauban, France, and found that they had been tampered with. Fragments of plants and coal had been inserted into the meteorite fragments, and the results were coated with glue to recreate an apparent fusion layer on the outside of the fragments.

      1866: A human skull is discovered in a mine in Calaveras County, California. It is originally accepted as authentic, and is judged to be from the Pliocene age, making it the oldest human skull discovered in America. However, in the early 20th century, it was determined to have been planted at the site as a practical joke.

      1879: Sarah Howe establishes the Ladies' Deposit Bank in Boston, offering returns of 8 percent monthly, and only accepting deposits from women. In 1880, a run on the bank demonstrated that the operation was a scam, and an estimated 800 women lost more than $250,000 in the process. Howe was sentenced to three years for obtaining money on false pretenses.

      1882: The Society for Psychical Research authenticates Douglas Blackburn and G. A. Smith as having genuine telepathic powers. In 1908, Blackburn revealed the methods they used to fool the examiners.

      1891: In Germany, Wilhelm von Osten claims that his horse, Clever Hans, can do mathematical calculations. This skill was later shown to be the result of unconscious signaling by the owner to his horse, a type of cuing now known in psychology as the ideomotor effect, or the Clever Hans phenomenon.

      1892: The Ouija Board, a purported method of contacting spirits, is patented in the United States.

      1893: German psychologist Max Dessoir publishes an article, “The Psychology of Legerdemain,” which claims that the ability of magicians to fool their audience depends on a partially inherited ability to guide the thoughts of others to a desired conclusion.

      1900: American psychologist Joseph Jastrow publishes Fact and Fable in Psychology, including a chapter on “The Psychology of Deception,” which discusses the uncertainty of knowledge from sensory information and a doctrine of unconscious inference.

      1900: American psychologist Norman Triplett writes his Ph.D. dissertation on the psychology of conjuring deceptions, in which he argues that primitive men developed deception to gain control over their peers. Triplett also believed that deception was common among young children, and as part of his work collected over 300 examples of spontaneous deception by children aged 3 and younger.

      1908: German American psychologist Hugo Münsterberg publishes On the Witness Stand, which includes descriptions of experiments suggesting that eyewitness memories are not infallible.

      1912: Charles Dawson launches the Piltdown Man hoax, displaying a fossilized skull reportedly discovered in a gravel pit in East Sussex, England. The skull was accepted for several decades as the remains of an early human, a “missing link” in the evolution from apes to modern humans. In 1953, the Piltdown Man was determined to be a forgery, made up of a modern human skull and the jawbone of an orangutan.

      1917: U.S. President Woodrow Wilson creates the Committee on Public Information when the United States enters World War I; this government office is the first in modern history to engage in large-scale propaganda dissemination.

      1917: Creation of the first of several Cottingley fairy photographs, purporting to capture “real” fairies on photographic film. Sir Arthur Conan Doyle was among the many people fooled by these photographs; the girls who created them, Elsie Wright and Frances Griffiths, later admitted that they faked the pictures using paper cutouts from a children's book.

      1920: Italian immigrant Charles Ponzi is arrested in the United States for running the investment scam that gave the Ponzi scheme its name. Ponzi claimed to produce a 50 percent return in 90 days, a promise temporarily supported by money from new investors (rather than by any actual investment strategy). Ponzi was deported to Italy in 1934 after his involvement in another scam involving Florida real estate.
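
      The arithmetic behind the promise explains why such schemes must collapse: with no real earnings, a 50 percent return every 90 days can be paid only out of fresh deposits, so the new money required grows geometrically until recruitment inevitably falls short. A minimal sketch (hypothetical figures; `required_new_deposits` is an illustrative helper, not drawn from Ponzi's actual books):

```python
# Sketch of Ponzi-scheme arithmetic (hypothetical figures).
# Promising a 50% return every 90 days with no real investment means
# each round's payouts must be covered entirely by new deposits,
# so the required inflow grows by 50% every round.

def required_new_deposits(initial=1_000.0, rate=0.5, rounds=6):
    """New money needed in each 90-day round to pay the previous
    round's investors their promised principal plus return."""
    owed = initial * (1 + rate)   # what the first investors are owed
    needed = []
    for _ in range(rounds):
        needed.append(owed)       # fresh deposits must cover this payout
        owed *= (1 + rate)        # those depositors are owed 50% more
    return needed

# After six rounds (about 18 months), paying off a single $1,000
# deposit already requires over $11,000 in new money in that round alone.
```

      Each round needs half again as much new money as the one before, which is why such schemes survive only as long as the inflow of new investors keeps accelerating.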

      1921: John August Larson develops an early version of the lie detector, or polygraph, to aid in determining the truth of answers given by those suspected of crimes. The device automatically records a subject's blood pressure and breathing depth as he or she is asked a series of routine questions and questions related to a specific crime.

      1923: The International Criminal Police Organization (INTERPOL) is founded in Vienna as the International Criminal Police Commission. The purpose of INTERPOL is to facilitate cooperation among police forces in different countries.

      1925–26: Adolf Hitler publishes Mein Kampf, a combination of autobiography and political statement; it includes comments on how to successfully use propaganda.

      1926: Austrian scientist Paul Kammerer's experiments on the midwife toad, conducted to demonstrate that acquired characteristics can be inherited (Lamarckian genetics), are exposed as a hoax. His apparent success in showing that these toads, if forced to mate in water, would develop the black scaly bumps typical of toads that naturally mate in water was found to have been faked by injecting ink under the toads' skin.

      1931: Lloyd's of London begins offering a discount on its insurance rates to banks that require their employees to take lie detector tests. An astonishing number (10 to 25 percent) of employees admit to stealing, usually from petty cash.

      1937: Edward Filene establishes the Institute for Propaganda Analysis to help educate Americans to spot techniques commonly used in propaganda and to reduce the effects of such techniques.

      1937: Frank J. Wilson, newly appointed head of the U.S. Secret Service, begins a strenuous campaign against counterfeit currency that is credited with reducing losses from counterfeiting by 93 percent.

      1940: American linguist David W. Maurer, a specialist on the language used by members of marginal American subcultures, publishes The Big Con, based on interviews with hundreds of con artists and other criminals.

      1945: Leonarde Keeler is brought in to conduct polygraph tests on German prisoners in an Arizona prisoner-of-war camp to determine which of them harbors Nazi sympathies or otherwise poses a threat to the United States.

      1946: English author George Orwell, in his essay, “Politics and the English Language,” criticizes the way that politicians use language to mislead people and hide the truth, and charges that bad writing habits can interfere with clear thinking.

      1947: The Dutch painter Han van Meegeren is convicted of forgery. His most notable works were a number of paintings that he claimed were painted by Vermeer and that had been accepted by experts as genuine.

      1947–53: Workers at the nuclear facility in Oak Ridge, Tennessee, are subjected to numerous polygraph tests, partly in response to the belief that Soviet spies had infiltrated the facility.

      1953: Flying Saucers Have Landed, the first of several books by George Adamski that purport to describe his experiences traveling in outer space and communicating with extraterrestrial beings, is published.

      1955: The London Society for Psychical Research publishes a report on the Borley Rectory, a purportedly haunted house built in 1863 in England, concluding that all reported phenomena could be explained by ordinary, natural causes.

      1959: In the United States, the National Labor Relations Board rules that polygraph tests can be required as a condition of employment.

      1968: American magician and skeptic James Randi offers a prize of $1,000 for anyone who can produce evidence of paranormal activity before a group of witnesses, on terms agreed upon by both parties. The value of the prize has since been increased to $1 million, but no one has successfully claimed it.

      1968: Erich von Däniken publishes Chariots of the Gods? Unsolved Mysteries of the Past, arguing that the knowledge and technology required to produce various aspects of ancient religion and culture (e.g., Stonehenge and the pyramids of Egypt) were brought to Earth by visitors from outer space.

      1968: The discovery of a pattern on the ocean floor near Bimini, an island in the Caribbean Sea, leads to claims that it was once a road and that the area is the site of the “lost continent” of Atlantis. In fact, the road-like features are part of a now-submerged coastline of the island.

      1970s: Israeli performer Uri Geller frequently appears on television, demonstrating his apparent ability to exert physical effects (e.g., bending spoons) using only the power of his mind. However, skeptic James Randi and others have since demonstrated that all of Geller's performances could be produced using tricks known to many magicians.

      1970–75: The Amazing World of Kreskin, a television program starring George Joseph Kresge (a.k.a. the stage magician The Amazing Kreskin), is produced in Canada and broadcast in Canada and the United States. A key feature of the program is Kreskin's ability to “read” the audience to discover where his check for the evening's performance has been hidden; he does not claim to have paranormal powers, but instead relies on interpreting voices and body language.

      1972: Psychologist Elizabeth Loftus publishes “Reconstructing Memory: The Incredible Eyewitness” in Psychology Today, detailing how it is possible to alter a person's memory of an event by asking leading questions or introducing other information.

      1972: John D. Bransford and Marcia K. Johnson publish research showing that contextual information, such as a title given to a brief paragraph to establish context, affects how the information in the paragraph is encoded. Providing such information before subjects read the paragraph produces greater comprehension and recall than providing it afterward.

      1974: Charles Berlitz publishes The Bermuda Triangle, claiming that mysterious forces within an area defined by Florida, Bermuda, and Puerto Rico pose grave dangers to ships and airplanes. The book is partially based on a 1945 incident in which five U.S. Navy planes were lost in that region.

      1974: Dermatologist William T. Summerlin, working at the Memorial Sloan-Kettering Cancer Center in New York City, claims to have successfully grafted skin from a black mouse to a genetically unrelated white mouse. However, this claim is later determined to be fraudulent when the “black” skin patches are determined to have been colored by a felt marker, rather than originating from a black mouse.

      1975: John Nance publishes The Gentle Tasaday: A Stone Age People in the Philippine Rain Forest, describing an isolated tribe living a primitive and peaceful lifestyle in the rain forest, without corruption or conflict. The story, originally perpetrated in 1971 by Manuel Elizalde Jr., the adviser to Ferdinand Marcos on Filipino national minorities, is exposed as a hoax in the 1980s, when tribal members said they were poor and had been pressured by Elizalde to live like a Stone Age tribe in order to receive assistance and serve Elizalde's political agenda.

      1976: American Morris Lamar Keene publishes The Psychic Mafia, describing how he previously posed as a psychic, tricking thousands of people into believing that he had powers to contact spirits.

      1976: The Committee for the Scientific Investigation of Claims of the Paranormal (CSICOP) is founded in Buffalo, New York, to investigate claims of paranormal phenomena. CSICOP publishes the Skeptical Inquirer, a journal carrying news of paranormal investigations.

      1977: British painter Tom Keating is charged with forgery; although the charges are later dropped, he admits that he produced paintings attributed to a number of noted artists, including John Constable, J. M. W. Turner, Claude Monet, and Vincent van Gogh.

      1977: American psychologist Philip Zimbardo and colleagues coin the term illusion of personal invulnerability to refer to people's reaction when they hear of someone else victimized by a hoax; it rests on the probably unwarranted assumption that the listeners themselves would have seen through the hoax.

      1978–83: German forger Konrad Kujau creates the “Hitler Diaries,” which are later published in the German magazine Der Stern. Although some experts initially pronounce them genuine, they are later revealed to be fakes.

      1979: Crop circles, complex geometric patterns created by flattening parts of grain fields, begin to appear in the United Kingdom. Although some believe that they are evidence of visitors from outer space, in 1991, two British men admitted to creating the first crop circles.

      1979: Psychologist Elizabeth Loftus becomes the first person in Washington State to provide expert testimony about eyewitness identification. She testifies on the behalf of the defense in a murder trial, explaining how memory distortion can influence eyewitness testimony.

      1984: Poltergeist phenomena are reported in the Columbus Dispatch, a newspaper in Ohio. The story received widespread coverage, but it was later shown to be the result of tricks by a 14-year-old girl, Tina Resch.

      1985: Psychologist Bella M. DePaulo and colleagues establish that while young children frequently attempt to lie to adults, they are seldom successful; however, by the time they are fifth graders, children can often construct lies that fool adults, including their parents.

      1986: Dutch primatologist Frans de Waal reports that chimpanzees engage in many apparently deceptive behaviors, such as pretending to ignore the presence of hidden food when another chimp is present. However, de Waal also acknowledges that it is impossible to establish intentionality in such cases.

      1987: D. L. Delanoy and colleagues report on the results of a teenager who claimed to have psychic abilities and took part in over 20 testing sessions at the University of Edinburgh. The teenager was able to bend metal objects under informal conditions, but never when strict controls were in place. In the final testing sessions, a hidden camera captured evidence that the teenager was performing tricks to produce his results. Delanoy and colleagues note that they were easily fooled by the teenager, and found him convincing until the hidden camera revealed how he was achieving his results.

      1987–89: John Jacob Cannell investigates score inflation in standardized educational tests. One result of his study is the finding that in 48 of 50 states, students are reported to be performing above the national average, a result christened the “Lake Wobegon effect,” in reference to the fictional location in which all the children are above average.

      1992: The Innocence Project is founded at Yeshiva University with the purpose of using DNA evidence to exonerate individuals who have been wrongfully convicted. As of 2012, over 290 people have been released as a result of the Innocence Project's work and, because the trials of many of the wrongfully convicted included eyewitness testimony, the Project also played a key role in questioning eyewitness accuracy, reinforcing similar evidence found in scientific research.

      1994: In Ramona v. Isabella, Gary Ramona successfully establishes that his daughter's psychotherapists implanted false memories in his daughter's mind, leading to her accusations of his sexual abuse of her as a child. He is awarded $500,000 in damages for lost wages, and the case sets a precedent because third-party negligence suits were previously rare and seldom victorious.

      1994: Professor Linda M. Williams publishes a study supporting the contention that memories of sexual abuse are sometimes repressed. For the study, she interviewed 129 women who were known to have been sexually abused—based on hospital records of their treatment for the abuse—and found that 38 percent of them failed to mention the abuse, even when specifically questioned about it.

      1995: Nicholas Leeson, a British broker for Barings Bank in the United Kingdom, is convicted in Singapore of forgery and other acts of fraud. Leeson had been engaged in speculative trading since 1992, using a Barings “error account” to hide his losses, which reached over $1 billion by 1995, and caused Barings Bank to collapse.

      1996: Alan Sokal, a physics professor, publishes the article, “Transgressing the Boundaries: Towards a Transformative Hermeneutics of Quantum Gravity” in the journal Social Text. He later reveals that the article was a hoax, submitted to demonstrate the lack of intellectual rigor in the journal and in the field of cultural studies more generally.

      1998: British physician Andrew Wakefield publishes case studies claiming to support a link between autism and the childhood measles, mumps, and rubella vaccine. This report is later determined to be based on faked data, and in 2010, Wakefield's name is struck off the Medical Register and he is barred from practicing medicine in the United Kingdom. Many of his supporters still believe that vaccines can cause autism and refuse to have their children vaccinated.

      1998: The journalist Stephen Glass is discovered to have falsified all or part of at least 27 stories he wrote for the New Republic magazine from 1995 to 1998.

      1999: The Canadian theatrical producers Garth Drabinsky and Myron Gottlieb are indicted in New York on charges of fraud and embezzlement in connection with the Canadian production company Live Entertainment Corporation of Canada Inc. In 2009, both were convicted of fraud and forgery in Canada; they remain under fugitive arrest warrants in the United States.

      1999: The National Geographic Society announces that a fossil discovered in China establishes the validity of the hypothesized link between dinosaurs and modern birds; it has the body of a bird and the tail of a dinosaur. However, this discovery is quickly recognized as a fraud, made up of two separate fossils. In March 2000, National Geographic published an admission of the error.

      1999: In the United States, the Department of Energy resumes the use of mandatory polygraph tests for people working in nuclear weapons labs.

      2000: The film Boiler Room, distributed by New Line Cinema, dramatizes a “pump and dump” operation in which a crew of young salesmen sell overvalued or worthless stock shares over the phone to customers by misleading them about the shares' value, causing the shares to rise temporarily in price.

      2000: Eastgate Elementary School in Columbus, Ohio, is praised by President Clinton for the improved standardized test scores of its students. However, several students from the school later come forward and say that they were given correct answers on the test by a teacher's aide.

      2000: The Japanese archaeologist Shinichi Fujimura admits that many of his “discoveries” were fakes, after a newspaper produces photographs showing him burying artifacts on a site in Miyagi Prefecture, where he would later “discover” them.

      2000: Professors Barbara Tversky and Elizabeth J. Marsh publish research indicating that memories are altered by retelling events and changing the context in which they are retold because retelling involves selectively retrieving and using information.

      2000: Jonathan Lebed, a New Jersey high school student, becomes the youngest person to face charges of stock fraud. He bought stocks and promoted them on Internet message boards using multiple aliases, then sold them after the price rose (a “pump and dump” scheme). The case was settled in 2001, when Lebed agreed to forfeit a share of his profits.

      2001: On December 2, American energy company Enron files for bankruptcy amid widespread indications of fraud. It is the biggest bankruptcy in U.S. history to that date, and Enron shares are worth less than $1 on the day of the bankruptcy filing, as compared to a high of over $90 per share in August 2000.

      2001: Martha Stewart, an American publisher and media personality, is accused of engaging in insider trading by ordering her broker to sell shares of ImClone Systems immediately before news became public that the U.S. Food and Drug Administration would not approve a new drug created by ImClone. Stewart is later convicted of obstruction of justice, and is sentenced to six months in jail. ImClone CEO Samuel Waksal, who provided her with the information, is sentenced to seven years.

      2002: Accounting firm Arthur Andersen is found guilty of obstruction of justice over its part in destroying documents related to the Enron scandal. Although this decision was overturned by the U.S. Supreme Court in 2005, the company did not resume operations because its reputation was destroyed.

      2002: Economists Stephen Levitt and Brian Jacob examine standardized test results from the Chicago Public Schools from 1993 to 2000 and charge that cheating occurs in 4 to 5 percent of classrooms each year. Their results, which are based on examining patterns of answers on test sheets, are supported when students in classes suspected of cheating retake the test under close supervision and score substantially lower.

      2003: Bernie Ebbers, cofounder and chief executive officer of WorldCom, is charged with fraud; he is convicted in 2005. WorldCom's bankruptcy in 2002 was the largest in history until Lehman Brothers went bankrupt in 2008.

      2003: Jayson Blair, a 27-year-old reporter for the New York Times, is dismissed after it is revealed that he plagiarized and/or fabricated information in numerous stories written for the Times. Executive Editor Howell Raines and Managing Editor Gerald Boyd subsequently resign from the Times over the scandal.

      2004: In the United States, the Federal Bureau of Investigation announces that, since 2000, its investigations into financial fraud have resulted in over 11,000 convictions and over $8.1 billion in restitution orders.

      2004: In Birmingham, Alabama, the director of a general educational development (GED) program finds that 5.6 percent of the student population of one high school, Woodlawn, was forced to withdraw before the state standardized tests were administered because they were projected to receive low scores, thus lowering the school average.

      2006: The Brennan Center for Justice, part of the New York University School of Law, reports that voter fraud is extremely rare; for instance, it occurred about 0.0009 percent of the time in the 2004 Washington gubernatorial election.

      2006: John Paul Lewis, Jr., is convicted of fraud for a long-running Ponzi scheme in California, and sentenced to 30 years in prison.

      2008: American broker Bernard Madoff is revealed to be running a Ponzi scheme involving thousands of investors and over $65 billion in investments. In 2009, he was sentenced to 150 years in federal prison.

      2008: The U.S. Securities and Exchange Commission (SEC) announces that it has halted a Ponzi scheme targeting the Haitian American community through a series of investment clubs.

      2008: Lou Pearlman, well known for his work creating and promoting boy bands such as 'N Sync and the Backstreet Boys, is convicted of conspiracy, money laundering, and other charges as part of a Ponzi scheme he operated for over 20 years.

      2009: Marcus J. Schrenker, an investment adviser facing charges of defrauding investors, attempts to fake his own death in January through a plane crash.

      2009: The SEC obtains a court order to halt a Ponzi scheme targeting members of the deaf community in Japan and the United States; the scheme is run by Billion Coupons Inc., a company based in Hawaii.

      2009: Marc S. Dreier pleads guilty to investment fraud over a Ponzi scheme that stole a reported $380 million. He also faces charges in Canada for impersonating a lawyer in connection with the sale of financial instruments to the Ontario Teachers' Pension Plan.

      2010: Harry Markopolos publishes No One Would Listen: A True Financial Thriller, detailing how he uncovered Bernard Madoff's financial deceptions years before the scandal became public, and his difficulties in having the SEC take note of his investigations.

      2010: Inside Job, a documentary directed by Charles Ferguson, examines the causes behind the global financial crisis of 2008; it wins the Oscar for Best Documentary.

      2011: In the United States, the Centers for Medicare and Medicaid Services begins screening Medicare fee-for-service claims through its Fraud Prevention System, a process similar to the screening technology used by credit card companies.

      2011: In Los Angeles, the director of Crescendo charter schools is found to have ordered the principals in its six schools to literally teach to the test to raise student scores on the state standardized tests. The principals were ordered to open the seal on the state exams and teach students using the actual test questions.

      2011: Raj Rajaratnam is convicted on fraud charges in connection with his activities with Galleon Group, one of the world's largest hedge funds.

      2011: Dutch social psychologist Diederik Stapel is suspended from his post at Tilburg University on the basis of alleged scientific misconduct, including fabricating data.

      2011: The U.S. government announces that it has recovered over $10.7 billion in fraudulent health care claims over the past three years, with $4.1 billion in fiscal year 2011 alone.

      2011: The Atlanta Journal-Constitution reports on a widespread cheating scandal in the Atlanta public schools, in which cheating was so commonplace in some schools that teachers organized “erasure parties” to change numerous student answers on test forms from incorrect to correct.

      2011: The scientific journal Nature issues a report stating that published retractions of scientific papers have increased 10-fold over the past decade, while the number of published papers has increased only 44 percent in the same period. The most common reason cited for retraction is misconduct, including falsified data and plagiarism.

      2012: Production of the Broadway musical Rebecca, based on the Daphne du Maurier novel, is delayed when about one-third of the show's financing evaporates, reportedly due to the death of an investor who was later determined to be fictional. Several individuals are under criminal investigation in the matter, and a civil lawsuit is filed against Mark Hotton, who acted as a middleman between the show's producer and the alleged investor.

      2012: Citigroup agrees to pay over $158 million to settle multiple cases of mortgage fraud, including its applications for FHA mortgage insurance for about 30,000 mortgages using certifications that were known to be false. About one-third of the mortgage loans in question went into default.

      2012: About 70 students at New York City's prestigious Stuyvesant High School are placed under investigation in a cheating scandal. The students are accused of electronically sending or receiving answers (including by smartphone) to Advanced Placement exams given at the school in June.

      2012: Rajat K. Gupta, a former director of Goldman Sachs, is convicted of passing insider information to Raj Rajaratnam, the former head of the hedge fund Galleon Group.

      2012: As of October, 19 U.S. states have regulations (executive actions or laws) aimed at preventing voter fraud, such as requiring prospective voters to present an approved photo ID at the polling place.

      2012: The U.S. Medicare Fraud Strike Task Force files charges in October against 91 individuals for participating in various fraudulent activities, resulting in $432 million in false billing, including over $100 million in community health care fraud, $49 million in ambulance fraud, and over $230 million in home health care fraud.

      2012: Lorraine O. Brown pleads guilty in November to charges of criminal fraud. Brown, the former president of DocX, one of the largest foreclosure-processing companies in the United States, admits to participating in the falsification of over 1 million mortgage documents, many of which were used in foreclosure proceedings.

      2012: A cheating scandal at Harvard implicates about half the students in a 279-person undergraduate class, including a number of prominent athletes. The basis for the allegations is identical or nearly identical responses submitted as part of the class's take-home final.

      2012: Falko Bindrich, a German chess grandmaster, is accused of cheating during a November match by using his smartphone to access a chess program during breaks. He refuses to let officials inspect his phone, and forfeits the match.

      2013: Cycling champion Lance Armstrong admits that he engaged in an organized and long-term program of blood-doping and the use of performance-enhancing drugs, despite previously having denied the use of artificial enhancements while he was competing.

      2013: Europol, the law enforcement agency of the European Union, announces that an 18-month investigation yielded evidence suggesting match-fixing in 680 soccer games played in 15 countries, including World Cup qualifiers and Champions League games.

      Sarah Boslaugh, Kennesaw State University
    • Glossary

      • Absence of information: A flaw in human reasoning in which people asked to make judgments based on specified information seldom think about the fact that other, important information might not be supplied to them.
      • Ad hominem attack: A logical fallacy in which a person makes a personal attack on an opponent rather than addressing the opponent's argument.
      • Advance fee scheme: A type of fraud in which a person is induced to pay for access to an opportunity that is promised to provide even greater rewards, for example, paying for information to set up a work-from-home business, only to find that the real probability of earning money in the proposed manner is nonexistent or minimal.
      • Affinity group fraud: Fraud that targets members of a group that share characteristics with the person perpetrating the fraud, such as ethnicity or religion, with the implication that the perpetrator was a particularly trusted member of the group. Bernard Madoff's Ponzi scheme is often cited as an example of affinity fraud because he is Jewish, as are most of his victims.
      • Anchoring: A cognitive strategy that can lead to incorrect judgments: a person makes an estimate by beginning with some starting point, then adjusting that estimate upward or downward. The original starting point may exert a strong influence on the final estimate or judgment because the person is reluctant to abandon it, even when evidence suggests that he or she should.
      • Argument from ignorance: A logical fallacy in which something is asserted to be true because it has not been proven false.
      • Art of War, The: A Chinese treatise, probably written between 500 and 300 b.c.e., attributed to the military official Sun Tzu; the treatise also includes commentary by 11 other writers, added over a period of centuries after the original writing. It includes one section on deception in military operations, which Sun Tzu says is a requirement of military strategy, and also discusses the usefulness of military deception in many other sections.
      • Automatic writing: A purported psychic phenomenon in which an individual touches a sheet of paper with a writing implement and the implement is said to write out messages of its own accord.
      • Availability rule: A type of bias in estimating the probability of some event, in which people judge as more common the events they can easily imagine or for which they already have numerous memories. For instance, someone whose mother died of lung cancer is likely to judge the risk of smoking as higher than someone who does not know anyone with lung cancer, although the personal knowledge has no bearing on the actual risk.
      • Base rate fallacy: A fallacy in which a person ignores information about the base rate, or prior probability, of an event in making a judgment about a single event.
      • Begging the question: A type of fallacy in which the premise assumes the conclusion rather than proving it.
      • Bias for causality: The tendency of people to create causal explanations for random situations as a way of making sense out of their lives, a fact that makes it difficult to discard beliefs that are incorporated into such explanations, even if those beliefs are based on evidence that has been discredited.
      • Big Lie: A propaganda technique used in Germany in the Nazi era, traced back to 1925 when Adolf Hitler described a form of this technique in Mein Kampf. The concept of the Big Lie was popularized by Joseph Goebbels, the Nazi minister of propaganda, who said that it was possible to convince people of a lie if it was big enough and repeated often enough.
      • Bilocation: The ability of a person or object to be in two places at once, a power claimed by various magicians and religious figures.
      • Boy-who-cried-wolf syndrome: The tendency for someone who is known to lie or exaggerate to not be believed when he or she is telling the truth. The reference is to one of Aesop's fables, in which the people of a village ignore a shepherd boy's cries for help because in the past he repeatedly made false claims that his flock was being attacked by a wolf.
      • Cambridge investigation: An 1857 event in which several well-known mediums attempted to claim a $500 prize offered by a newspaper, the Boston Courier, if they could produce evidence of spiritualistic phenomena to the satisfaction of three Harvard professors: Louis Agassiz, E. N. Horsford, and Benjamin Peirce. No satisfactory evidence was produced, and the prize remained unclaimed. The event is named after the location of Harvard University, which is in Cambridge, Massachusetts.
      • Cardiff giant: A stone sculpture created by George Hull and buried in the ground near Cardiff, New York, in 1869. The “giant” was discovered by workmen and became a popular attraction, believed by some to be the petrified remains of a human giant. Upon closer examination, it was declared a fake, but the public continued to pay to see it.
      • Chain referral scheme: See pyramid scheme.
      • Circular reasoning: A type of logical fallacy in which the evidence for the validity of an assertion assumes that validity; in effect, it assumes what it claims to prove.
      • Cognitive dissonance: A psychological concept explaining the discomfort that a person experiences when one of their beliefs is contradicted by their experiences or other evidence. In order to reduce this discomfort, the person may change their beliefs to reflect reality, or may reject the evidence of reality in order to hold on to their beliefs.
      • Concealed information test: A method of questioning introduced in the 1950s to assess the truthfulness of someone's denial of involvement in a crime; the test determines whether the person being questioned possesses information that only someone involved in the crime would have.
      • Cottingley fairies: A famous hoax that fooled, among others, Sir Arthur Conan Doyle, who wrote an article about them in 1920. The basis of the hoax was a series of photographs of fairies, which were later shown to be paper cutouts from a children's book.
      • Cramming: A type of fraud in which small charges are added to a legitimate bill, such as for phone service, usually by a third party.
      • Crop circles: Geometric patterns created in grain fields that began appearing in the United Kingdom in 1979 and have appeared in many other countries since. Initially claimed by some as evidence of visitors from outer space, they were later revealed to be the work of two Englishmen and other people who imitated them.
      • CSICOP: The Committee for the Scientific Investigation of Claims of the Paranormal, an organization founded in Buffalo, New York, in 1976.
      • Eyewitness: Someone who has direct knowledge of an event, such as by hearing or seeing it. In the law, eyewitness testimony has traditionally been considered strong evidence, although psychological experiments have demonstrated that eyewitnesses are far from infallible and the memory of events can be manipulated by questioning or by the later introduction of other information about the events.
      • Fakelore: Manufactured folklore presented as if it were traditional. The term was coined by Richard M. Dorson to describe American legends such as Pecos Bill and Paul Bunyan, which were known to have been created by particular writers rather than being characters in authentic traditional tales.
      • False memory syndrome: A condition in which a person strongly believes in memories that are in fact untrue; the term was popularized in the 1990s by Pamela and Peter Freyd, who were accused by an adult daughter of childhood sexual abuse. The Freyds' contention, shared by many psychologists, is that false memories can be created during the course of memory therapy, and the patient has no way to distinguish between false and true memories.
      • Federal Trade Commission: A U.S. regulatory agency established in 1914 by the Federal Trade Commission Act and charged with protecting consumers from deceptive advertising and ensuring fair and ethical competition among businesses.
      • Flashbulb memory: A detailed, vivid memory of a moment in a person's life, seemingly as complete and accurate as a photograph of the scene. Despite many claims of flashbulb memories, generally in connection with some major, emotionally arousing event (e.g., receiving news of the assassination of President John F. Kennedy), not all psychologists believe that they differ from other memories or that they are particularly accurate or stable.
      • FOAF: An acronym for “friend of a friend,” a source often cited in urban legends. The veracity of FOAF stories is automatically suspect, although the teller may believe them, because the source of the information is so distant from the teller: the event happened not to the teller or to someone the teller knows, but to someone two connections removed.
      • Forced feedback: A technique for planting false memories by presenting a person with a fictitious account of an event as though it had actually occurred. Psychologist Elizabeth Loftus developed this technique to demonstrate how false memories could be implanted in individuals; her subjects often accepted her false version of events as true and as part of their actual experience, and some recalled the false events even more vividly than they had originally been presented.
      • 419 fraud: Also known as the Nigerian fraud, a scam operated by mail or e-mail in which an individual receives a message purportedly from a government official, asking them to send money or reveal confidential information (e.g., bank account number) in order to receive a percentage of a sum the author is trying to transfer out of their country. The name refers to section 419 of the Nigerian criminal code, which is violated by this type of fraud.
      • Gambler's fallacy: A mistaken belief that the laws of probability will hold true in the short run as well as the long run and/or the related belief that independent events are affected by previous events. For instance, if a fair coin is flipped five times and heads result each time, one may believe that the next flip is more likely to be tails, when in fact heads and tails are equally likely on each flip.
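The independence of successive flips can be checked directly by simulation. The sketch below is an illustrative assumption of this entry's coin example, not material from the encyclopedia:

```python
import random

# Simulate a fair coin and record the flip that follows every run of
# five (or more) consecutive heads. The gambler's fallacy predicts those
# follow-up flips should favor tails; in fact they come up heads about
# half the time, because each flip is independent of the ones before it.
random.seed(1)  # fixed seed so the simulation is reproducible
follow_ups = []
run = 0
for _ in range(1_000_000):
    heads = random.random() < 0.5
    if run >= 5:                 # the previous five flips were all heads
        follow_ups.append(heads)
    run = run + 1 if heads else 0

print(round(sum(follow_ups) / len(follow_ups), 1))  # close to 0.5
```

Roughly one flip in thirty-two follows such a streak, giving tens of thousands of follow-up observations, and their average stays at one half.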
      • Hindsight biases: Biases that frequently emerge when evaluating intelligence reports, including overestimation of the accuracy of past judgments, underestimation of how much has been learned from intelligence reports, and the tendency to believe that events were more foreseeable than they really were.
      • Identity theft: A criminal act in which a thief obtains information (e.g., social security numbers or credit card numbers) that allows them to assume another person's identity; the information is then used for other criminal activities, such as stealing from the individual's bank account.
      • Ignoratio elenchi: A logical fallacy in which an argument is not relevant to the issue under discussion; the argument itself may or may not be valid.
      • Illusory correlation: Assigning causal explanations to observed covariation that may in fact be due to chance or other factors not considered.
      • Lake Wobegon effect: The result found in many standardized testing situations in schools, in which most or all the children are found to be performing above average. Since in the long run about half should be above average and about half below, if most or all are found to be above average, this suggests that the results are invalid due to teaching to the test, score inflation, teacher cheating, etc. The name refers to the fictional Lake Wobegon featured on Garrison Keillor's radio program A Prairie Home Companion, where “all the children are above average.”
      • Law of small numbers: A humorous label applied to the common tendency to overvalue information gained from small samples of data without taking into account the high degree of variability in data points. The name reverses the law of large numbers, a true principle in statistics that a sufficiently large sample can provide a good estimate of the characteristics of a population.
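The variability that small samples hide can be shown numerically. The following sketch is illustrative; the population of test-score-like values and the sample sizes are assumptions of the example:

```python
import random
import statistics

# Draw repeated samples from one population and compare how much the
# sample mean scatters for small versus large samples. Small samples
# produce far more variable (and thus less trustworthy) estimates.
random.seed(7)
population = [random.gauss(100, 15) for _ in range(100_000)]

def spread_of_sample_means(n, trials=2000):
    """Standard deviation of the sample mean across repeated samples of size n."""
    means = [statistics.fmean(random.sample(population, n)) for _ in range(trials)]
    return statistics.stdev(means)

print(spread_of_sample_means(5))    # roughly 15 / sqrt(5),  about 6.7
print(spread_of_sample_means(500))  # roughly 15 / sqrt(500), about 0.67
```

The spread shrinks in proportion to the square root of the sample size, which is exactly the point the law of large numbers makes and the "law of small numbers" ignores.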
      • Letter of credit fraud: A type of fraud in which an individual is offered the opportunity to invest in a letter of credit and is promised large investment returns. However, it is impossible to invest in a letter of credit because it is simply a guarantee issued by a bank in connection with goods shipped internationally.
      • Liar paradox: A sentence of this type: “This sentence is a lie.” Philosophers have discussed this paradox since ancient times because if the sentence is true, it must be false, and if it is false, it must be true.
      • Malingering: Feigning or exaggerating physical or psychological symptoms for personal gain, such as obtaining compensation, avoiding work or other obligations, or drawing attention to oneself.
      • “Man-who” syndrome: An example of the tendency for information with a personal connection to be overvalued in comparison with abstract information. For instance, statistical evidence of the dangers of tobacco consumption may be ignored because a person knows a man who smoked for years and never got sick.
      • Medicare fraud: Fraud that makes improper claims to be paid by the Medicare system, the U.S. system of health insurance for people over age 65 or those with disabilities. Examples include billing for medical equipment that is not needed, billing for unneeded tests, or billing for services never performed.
      • Misinformation effect: A concept developed by Elizabeth Loftus in 1978 and later, in which recall of an event may be adversely affected by other information presented between the time an event is encoded in memory and the time it is recalled. Hence, eyewitness recall of an event may be affected by irrelevant or false information received by the witness after the fact, such as hearing descriptions of the event by others.
      • Nigerian fraud: See 419 fraud.
      • On the Witness Stand: A book published in 1908 by the German American psychologist Hugo Münsterberg, detailing the results of psychological studies suggesting that eyewitness testimony was not as infallible as commonly believed.
      • Online auction scam: A financial fraud committed over the Internet, in which someone runs an auction on a Web site, such as eBay, and accepts payment but never delivers the advertised goods or delivers goods less valuable than those that were advertised.
      • Operation Dirty Play: An investigation carried out in 2011 and 2012 by the Broward County, Florida, Sheriff's Department into betting on youth football games. The operation ended with the arrest of nine men and the allegation that young players had been bribed to affect the outcomes of games.
      • Othello error: Falsely accusing a truthful person of lying, based on a predetermined view that the person is lying and by discounting other explanations (e.g., stress) for their apparently suspicious behavior. The reference is to Shakespeare's play, Othello, in which the title character misjudges his wife's reaction to the death of another character and believes her to be unfaithful to him.
      • Overestimation of predictability: A cognitive bias in which people are likely to believe that the actions of other people or groups are intentional and result from centralized planning, failing to realize that chance and coincidence often play a large role in the outcome of real-life events.
      • Persistence of discredited evidence: The principle that impressions based on experiences or other evidence are difficult to discard, even if the evidence has been discredited. This is true even in experimental situations in which a test subject is informed that they took part in a manipulated situation, and it is even more persistent in the real world, in which truth and falsehood may be much more ambiguous.
      • Phishing: A type of fraud committed over the Internet, in which an e-mail appears to come from a legitimate company or financial institution and tells the recipient that he or she needs to supply confidential information in order to straighten out a problem with their bank account, credit card account, or other financial institution.
      • Polygraph: A machine that is used to detect whether a person is telling the truth or lying, hence the popular name “lie detector.” Attached to the person answering the questions, the device detects physiological changes such as pulse and blood pressure, which are believed to be beyond conscious control; however, scientific research does not support the accuracy of polygraph testing.
      • Ponzi scheme: A type of fraud in which high financial returns are promised for investment in a business, whereas in fact the only income is provided by new investors. A Ponzi scheme can work for some time, but generally falls apart when it runs out of new investors or the amount of money owed to investors becomes too great. One of the most famous Ponzi schemes of the 20th century was perpetrated by Bernard Madoff, a New York investment advisor who was convicted and sentenced to 150 years in prison in 2009.
      • The Prince: A treatise by Italian political theorist Niccolò Machiavelli, first published in 1532, that discusses how to achieve and maintain authority. Rather than endorsing an absolute set of ethical principles, Machiavelli advises that leaders suit their actions to the situation, giving rise to the term Machiavellian to describe ruthless, deceptive behavior.
      • Psychic surgery: A fraudulent practice in which a performer claims to perform surgery on a patient with his or her bare hands and without breaking the patient's skin, while in fact doing sleight-of-hand tricks and presenting blood or tissue from an animal. One well-known practitioner was Tony Agpaoa, who began his practice in the Philippines and was arrested in 1968 in the United States on charges of medical fraud.
      • Pump and dump: A type of financial fraud in which the price of a security is artificially inflated (the “pump”), typically through false or misleading statements and increased trading volume (often by inducing investors who are not part of the scheme to buy the security), after which the fraudsters sell their holdings before the price falls (the “dump”).
      • Pyramid scheme: Also known as chain referral schemes, a type of fraud in which a person pays to distribute or franchise a product, but the only real money comes from the sale of the franchises or distributorships.
      • Radiocarbon dating: A method used to establish the approximate age of objects, often used when investigating suspected art or antiquities fraud. The technique is based on the fact that organic materials contain carbon-14, which decays at a steady rate; the amount of carbon-14 remaining in an object can therefore give an approximation of how old it is.
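The arithmetic behind the method is a simple application of exponential decay. The sketch below is illustrative; the 5,730-year half-life of carbon-14 is a standard figure not stated in the entry:

```python
import math

# The fraction of carbon-14 remaining in an object satisfies
#   N / N0 = (1/2) ** (t / half_life),
# so solving for t gives an approximate age.
HALF_LIFE_YEARS = 5730  # standard carbon-14 half-life (assumed here)

def estimated_age(fraction_c14_remaining):
    """Years elapsed implied by the surviving fraction of carbon-14."""
    return -HALF_LIFE_YEARS * math.log(fraction_c14_remaining) / math.log(2)

print(round(estimated_age(0.5)))   # 5730: half remaining = one half-life
print(round(estimated_age(0.25)))  # 11460: a quarter = two half-lives
```

In practice, measured ages are calibrated against other records, but the basic inference follows this formula.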
      • Realism: A philosophical point of view that states that the world exists independently of human descriptions of or thoughts about it and that people's thoughts refer to this actual world, so that they may be objectively judged true or false.
      • Recovered memory therapy: A type of psychotherapy aimed at helping a patient recover memories of abuse that were not previously accessible to their conscious mind. Recovered memory therapy is based in part on the finding that about 10 percent of abuse victims forget the abuse, but it is controversial because scientific studies have shown that it is possible to “plant” false memories that an individual cannot distinguish from memories of their actual experiences.
      • Red herring: A device used in speech and writing to distract people from the true subject of an argument.
      • Scudder's American Museum: An entertainment hall in New York City, purchased by P. T. Barnum in 1841 and reopened in 1842 as Barnum's American Museum. The museum included a variety of exhibits, from live animals to freak shows; among the more famous attractions were the Fiji Mermaid and the Siamese twins Chang and Eng.
      • Shattered Glass: A 2003 film written and directed by Billy Ray, based on a Vanity Fair article by Buzz Bissinger, chronicling the career of New Republic journalist Stephen Glass, who was found to have fabricated many of the stories that he wrote while working for the magazine. The film details the lengths to which Glass—who had formerly worked as a fact checker—went to create a false trail of evidence that would make his stories appear genuine and the personal dynamics within the newsroom that facilitated his deceptions.
      • Shifters: A short-lived Ponzi scheme seen in the United States in 1922, in which college students became members of an organization called the Shifters, paid an initiation fee, and then sought other people to become members in order to make their money back. Various merchandise, such as pins and hats, was also marketed to members.
      • Snopes.com: A Web site run by Barbara and David Mikkelson that collects urban legends and rates them as true, false, or undetermined, citing the available evidence in support of each rating.
      • Spectrographic analysis: A method used to investigate suspected cases of art fraud. The technique involves removing a tiny sample of paint, burning it in a flame, and analyzing the resulting spectrum to determine if the paint includes materials not available at the time that the painting was claimed to be created or materials not known to be used by the supposed painter of the work.
      • Strawman fraud: A type of fraud based on the incorrect belief that the U.S. Treasury Department has bank accounts in the name of every U.S. citizen that can be accessed through a procedure known to the person perpetrating the scam. One feature of such scams is that the victim is told to specify their name in all capital letters.
      • Telemarketing fraud: A type of fraud in which someone is contacted by telephone and induced to send money or release confidential information, such as their bank account or credit card number, in order to collect a “free prize” or some other inducement.
      • Thermoluminescence: A method of dating pottery, often used to examine suspected forgeries. The technique is based on the principle that pottery loses its radioactivity when fired but gradually reabsorbs radioactivity from its surroundings. When heated to over 640 degrees Fahrenheit, pottery emits a glow that is brighter in older samples; hence, the glow emitted by a suspected fraud can be compared with that from another of known age.
      • To Tell the Truth: An American television program aired from 1956 to 1968 on CBS and revived in 1991 by NBC. The concept of the show is that a celebrity panel tries to determine which of three contestants is telling the truth about their identity by asking them questions. The true contestant must give true answers, whereas the other two are allowed to lie to try to convince the panel of their identity.
      • United Nations Convention against Corruption: An international legal instrument adopted by the United Nations General Assembly in 2003 and entered into force on December 14, 2005. Among other things, the convention requires countries to make corrupt acts illegal and to cooperate with other countries in the prevention, investigation, and prosecution of corruption.
      • Urban legend: A type of modern folklore common in industrialized societies in which stories are passed from person to person (often through e-mail). The original source may or may not be known, and the sender may or may not believe the story to be true. The term was popularized by Jan Harold Brunvand in a series of books, beginning with The Vanishing Hitchhiker: American Urban Legends and Their Meanings, in 1981.
      • Vividness criterion: A bias in the evaluation of evidence, in which human thinking and memory is more affected by information that is vivid, concrete, and directly perceived, than by abstract information experienced secondhand (e.g., by reading about it) that may actually have more value and hence should be given greater consideration.
      • War of the Worlds: A radio drama adapted from H. G. Wells's novel of the same name, performed by Orson Welles and his Mercury Theatre company and aired on CBS radio in 1938. The program, which tells of a Martian invasion, imitates a series of news bulletins. Many listeners did not realize that they were listening to a fictional program and panicked, believing the invasion to be real.
      • When Prophecy Fails: A book published in 1956 by American psychologists Leon Festinger, Henry W. Riecken, and Stanley Schachter, demonstrating cases of cognitive dissonance in members of a cult who believed that the world was about to end. When the world did not end on schedule, rather than discard their beliefs, the cult members were reinforced in them.
      • White lie: A lie that is unimportant and told with good intentions, for instance to spare someone's feelings or for the sake of politeness, rather than to maliciously deceive a person.
      Sarah Boslaugh, Kennesaw State University

      Resource Guide

      Alterman, Eric. When Presidents Lie: A History of Official Deception and Its Consequences. New York: Penguin, 2004.
      Andrews, Edmund L. Busted: Life Inside the Great Mortgage Meltdown. New York: W. W. Norton, 2009.
      Ariely, Dan. The (Honest) Truth About Dishonesty: How We Lie to Everyone—Especially Ourselves. New York: Harper, 2012.
      Ayres, Ian and Gregory Klass. Insincere Promises: The Law of Misrepresented Intent. New Haven, CT: Yale University Press, 2005.
      Bierman, Harold, Jr. Accounting/Finance Lessons of Enron: A Case Study. Hackensack, NJ: World Scientific Press, 2008.
      Boldt-Irons, Leslie, Corrado Federici, and Ernesto Virgulti, eds. Disguise, Deception, Trompe-l'oeil: Interdisciplinary Perspectives. New York: Peter Lang, 2009.
      Booth, Robin, Simon Farrell, Guy Bastable, and Nicholas Yeo. Money Laundering Law and Regulation: A Practical Guide. New York: Oxford University Press, 2011.
      Bruhns, Karen O. and Nancy L. Kelker. Faking the Ancient Andes. Walnut Creek, CA: Left Coast Press, 2010.
      Busch, Rebecca S. Healthcare Fraud Auditing and Detection Guide. Hoboken, NJ: John Wiley & Sons, 2008.
      Carson, Thomas L. Lying and Deception: Theory and Practice. New York: Oxford University Press, 2010.
      Carter, Carolyn L., Jonathan A. Sheldon, and John W. Van Alst. Automobile Fraud: Odometer, Salvage, and Lemon Laundering Fraud Abuses and Yo-Yo Sales. Boston: National Consumer Law Center, 2011.
      Clikeman, Paul M. Called to Account: Fourteen Financial Frauds That Shaped the American Accounting Profession. New York: Routledge, 2009.
      Cohan, William D. House of Cards: A Tale of Hubris and Wretched Excess on Wall Street. New York: Doubleday, 2009.
      Cowan, Rick and Douglas Century. Takedown: The Fall of the Last Mafia Empire. New York: G. P. Putnam's Sons, 2002.
      Crivelli, Paolo. Plato's Account of Falsehood: A Study of the Sophist. New York: Cambridge University Press, 2012.
      Deneault, Alain and George Holoch, trans. Offshore: Tax Havens and the Rule of Global Crime. New York: Perseus, 2011.
      Domscheit-Berg, Daniel, Tina Klopp, and Jefferson Chase, trans. Inside WikiLeaks: My Time With Julian Assange at the World's Most Dangerous Website. New York: Crown Publishers, 2011.
      Dover, Robert and Michael S. Goodman, eds. Spinning Intelligence: Why Intelligence Needs the Media, Why the Media Needs Intelligence. New York: Columbia University Press, 2009.
      Edelman, Murray J. The Politics of Misinformation. New York: Cambridge University Press, 2001.
      Eichenwald, Kurt. Conspiracy of Fools: A True Story. New York: Broadway Books, 2005.
      European Commission Anti-Fraud Office. The Fight Against Fraud and Transnational Crime: OLAF and International Cooperation. Luxembourg: Office for Official Publications of the European Communities, 2003.
      Flaxman, Gregory. Gilles Deleuze and the Fabulation of Philosophy. Minneapolis: University of Minnesota Press, 2012.
      Foucault, Michel and Graham Burchell, trans., and Frédéric Gros, ed. The Courage of the Truth (The Government of Self and Others II): Lectures at the Collège de France, 1983–1984. New York: Palgrave Macmillan, 2011.
      Frankel, Tamar. The Ponzi Scheme Puzzle: A History and Analysis of Con Artists and Victims. New York: Oxford University Press, 2012.
      Gannon, James. Stealing Secrets, Telling Lies: How Spies and Codebreakers Helped Shape the Twentieth Century. Washington, DC: Brassey's, 2001.
      Gerard, Philip. Secret Soldiers: The Story of World War II's Heroic Army of Deception. New York: Dutton, 2002.
      Gragido, Will, John Pirc, and Russ Rogers. Cybercrime and Espionage: An Analysis of Subversive Multivector Threats. Oxford: Elsevier Science, 2011.
      Halligan, Peter W., Christopher Bass, and David A. Oakley. Malingering and Illness Deception. New York: Oxford University Press, 2003.
      Harrington, Brooke, ed. Deception: From Ancient Empires to Internet Dating. Palo Alto, CA: Stanford University Press, 2009.
      Harris, Kaitlyn O., ed. Seafood Fraud and Safety: Background and Issues. New York: Nova Science Publishers, 2010.
      Hawken, Angela, Stephen J. Carroll, and Allan F. Abrahamse. The Effects of Third-Party, Bad Faith Doctrine on Automobile Insurance Costs and Compensation. Santa Monica, CA: Rand, 2001.
      Heffernan, William C. and John Kleinig, eds. Private and Public Corruption. Lanham, MD: Rowman & Littlefield, 2004.
      Henriques, Diana B. The Wizard of Lies: Bernie Madoff and the Death of Trust. New York: Times Books/Henry Holt, 2011.
      Hirstein, William. Brain Fiction: Self-Deception and the Riddle of Confabulation. Cambridge, MA: MIT Press, 2005.
      Hirstein, William, ed. Confabulation: Views From Neuroscience, Psychiatry, Psychology, and Philosophy. New York: Oxford University Press, 2009.
      Hitz, Frederick Porter. The Great Game: The Myth and Reality of Espionage. New York: Alfred A. Knopf, 2004.
      Hitz, Frederick Porter. Why Spy? Espionage in an Age of Uncertainty. New York: St. Martin's Press, 2008.
      Hopwood, William S., Jay J. Leiner, and George R. Young. Forensic Accounting and Fraud Examination. 2nd ed. New York: McGraw-Hill Irwin, 2012.
      Hoving, Thomas. False Impressions: The Hunt for Big-Time Art Fakes. New York: Simon & Schuster, 1996.
      Javers, Eamon. Broker, Trader, Lawyer, Spy: Inside the Secret World of Corporate Espionage. New York: Harper, 2010.
      Jay, Martin. The Virtues of Mendacity: On Lying in Politics. Charlottesville: University of Virginia Press, 2010.
      Jay, Ricky. Jay's Journal of Anomalies: Conjurers, Cheats, Hustlers, Hoaxsters, Pranksters, Jokesters, Impostors, Pretenders, Sideshow Showmen, Armless Calligraphers, Mechanical Marvels, Popular Entertainments. New York: Farrar, Straus, and Giroux, 2001.
      Johnson, Anthony L., ed. Reducing Medicare Fraud, Waste, and Abuse. New York: Nova Biomedical Books, 2011.
      Jowett, Garth S. and Victoria O'Donnell. Propaganda and Persuasion. 5th ed. Thousand Oaks, CA: Sage, 2012.
      Katz, Norman A. Detecting and Reducing Supply Chain Fraud. Burlington, VT: Gower, 2012.
      Kelly, Jan Seaman and Brian S. Lindblom, eds. Scientific Examination of Questioned Documents. 2nd ed. Boca Raton, FL: CRC/Taylor & Francis, 2006.
      Keyes, Ralph. The Post-Truth Era: Dishonesty and Deception in Contemporary Life. New York: St. Martin's Press, 2004.
      Koller, Synthia. White Collar Crime in Housing: Mortgage Fraud in the United States. El Paso, TX: LFB Scholarly Publishers, 2012.
      Kroger, John. Convictions: A Prosecutor's Battles Against Mafia Killers, Drug Kingpins, and Enron Thieves. New York: Farrar, Straus, and Giroux, 2008.
      Lewis, Lionel S. Con Game: Bernard Madoff and His Victims. New Brunswick, NJ: Transaction Publishers, 2012.
      Lingua Franca editors. The Sokal Hoax: The Sham That Shook the Academy. Lincoln: University of Nebraska Press, 2000.
      Macintyre, Ben. Double Cross: The True Story of the D-Day Spies. New York: Crown, 2012.
      Markham, Jerry W. A Financial History of Modern U.S. Corporate Scandals: From Enron to Reform. Armonk, NY: M. E. Sharpe, 2006.
      Martin, Clancy, ed. The Philosophy of Deception. New York: Oxford University Press, 2009.
      May, Gary. The Informant: The FBI, the Ku Klux Klan, and the Murder of Viola Liuzzo. New Haven, CT: Yale University Press, 2005.
      McGlone, Matthew S. and Mark L. Knapp, eds. The Interplay of Truth and Deception: New Agendas in Communication. New York: Routledge, 2010.
      Mendilow, Jonathan, ed. Money, Corruption, and Political Competition in Established and Emerging Democracies. Lanham, MD: Lexington Books, 2012.
      Metzger, Miriam J. and Andrew J. Flanagin. Digital Media, Youth, and Credibility. Cambridge, MA: MIT Press, 2008.
      Mihm, Stephen. A Nation of Counterfeiters: Capitalists, Con Men, and the Making of the United States. Cambridge, MA: Harvard University Press, 2007.
      Montague, David A. Essentials of Online Payment Security and Fraud Prevention. Hoboken, NJ: Wiley, 2011.
      Munoz, Arturo. U.S. Military Information Operations in Afghanistan: Effectiveness of Psychological Operations 2001–2010. Santa Monica, CA: Rand, 2012.
      Nasiri, Omar. Inside the Jihad: My Life With Al Qaeda, A Spy's Story. New York: Basic Books, 2006.
      Neuwirth, Robert. Stealth of Nations: The Global Rise of the Informal Economy. New York: Pantheon Books, 2011.
      Nguyen, Tomson H. Fraud and the Subprime Mortgage Crisis. El Paso, TX: LFB Scholarly Publishers, 2011.
      Olsen, William P. The Anti-Corruption Handbook: How to Protect Your Business in the Global Marketplace. Hoboken, NJ: John Wiley & Sons, 2010.
      O'Shaughnessy, Nicholas J. Politics and Propaganda: Weapons of Mass Seduction. Ann Arbor: University of Michigan Press, 2004.
      Palan, Ronen, Richard Murphy, and Christian Chavagneux. Tax Havens: How Globalization Really Works. Ithaca, NY: Cornell University Press, 2010.
      Palda, K. Filip. Tax Evasion and Firm Survival in Competitive Markets. Northampton, MA: Edward Elgar Publishing, 2001.
      Partnoy, Frank. Infectious Greed: How Deceit and Risk Corrupted the Financial Markets. New York: Times Books/Henry Holt, 2003.
      Payne, Brian K. Crime in the Home Health Care Field: Workplace Violence, Fraud, and Abuse. Springfield, IL: Charles C. Thomas, 2003.
      Randi, James. An Encyclopedia of Claims, Frauds, and Hoaxes of the Occult and the Supernatural. New York: St. Martin's Press, 1995.
      Rapoport, Nancy B., Jeffrey D. Van Niel, and Bala G. Dharan, eds. Enron and Other Corporate Fiascos: The Corporate Scandal Reader. Upper Saddle River, NJ: Thomson Reuters/Foundation Press, 2009.
      Robin, Ron. Scandals and Scoundrels: Seven Cases That Shook the Academy. Berkeley: University of California Press, 2004.
      Ronson, Jon. The Psychopath Test: A Journey Through the Madness Industry. New York: Riverhead Books, 2011.
      Rothstein, Bo. The Quality of Government: Corruption, Social Trust, and Inequality in International Perspective. Chicago: University of Chicago Press, 2011.
      Salter, M. S. Innovation Corrupted: The Origins and Legacy of Enron's Collapse. Cambridge, MA: Harvard University Press, 2008.
      Saul, Jennifer Mather. Lying, Misleading, and What Is Said: An Exploration in Philosophy of Language and in Ethics. Oxford: Oxford University Press, 2012.
      Schilit, Howard M. and Jeremy Perler. Financial Shenanigans. 3rd ed. New York: McGraw-Hill, 2010.
      Schneider, Friedrich and Dominik H. Enste. The Shadow Economy: An International Survey. New York: Cambridge University Press, 2002.
      Shulman, David. From Hire to Liar: The Role of Deception in the Workplace. Ithaca, NY: ILR Press, 2007.
      Smith, David Livingstone. Why We Lie: The Evolutionary Roots of Deception and the Unconscious Mind. New York: St. Martin's Press, 2004.
      Sokal, Alan D. Beyond the Hoax: Science, Philosophy, and Culture. New York: Oxford University Press, 2008.
      Spencer, Ronald D., ed. The Expert Versus the Object: Judging Fakes and False Attributions in the Visual Arts. New York: Oxford University Press, 2004.
      Steinmeyer, Jim. The Glorious Deception: The Double Life of William Robinson, aka Chung Ling Soo, the “Marvelous Chinese Conjurer.” New York: Carroll & Graf, 2005.
      Stern, Rebecca. Home Economics: Domestic Fraud in Victorian England. Columbus: Ohio State University Press, 2008.
      Sussman, Gerald, ed. The Propaganda Society: Promotional Culture and Politics in Global Context. New York: Peter Lang, 2011.
      Swartz, Mimi and Sherron Watkins. Power Failure: The Inside Story of the Collapse of Enron. New York: Currency Doubleday, 2004.
      Tillman, Robert and Michael L. Indergaard. Pump and Dump: The Rancid Rules of the New Economy. New Brunswick, NJ: Rutgers University Press, 2005.
      Transparency International. Global Corruption Report: Climate Change. Washington, DC: Earthscan, 2011.
      Tribble, Scott. A Colossal Hoax: The Giant From Cardiff That Fooled America. Lanham, MD: Rowman & Littlefield, 2009.
      Turner, Andrew, James H. Kim On Chong-Gossard, and Frederik Juliaan Vervaet, eds. Private and Public Lies: The Discourse of Despotism and Deceit in the Graeco-Roman World. Boston: Brill, 2010.
      van der Does de Willebois, Emile, et al. The Puppet Masters: How the Corrupt Use Legal Structures to Hide Stolen Assets and What to Do About It. Washington, DC: World Bank, 2011.
      Vian, Taryn, William D. Savedoff, and Harald Mathisen. Anticorruption in the Health Sector: Strategies for Transparency and Accountability. Sterling, VA: Kumarian Press, 2010.
      Wallace, Robert and H. Keith Melton. Spycraft: The Secret History of the CIA's Spytechs, From Communism to Al-Qaeda. New York: Dutton, 2008.
      Weisman, Stewart L. Need and Greed: The Story of the Largest Ponzi Scheme in American History. Syracuse, NY: Syracuse University Press, 1999.
      Wiener, Jon. Historians in Trouble: Plagiarism, Fraud, and Politics in the Ivory Tower. New York: New Press, 2005.
      Wolff, Robert V. A Full Response to an Empty House: Public Safety Strategies for Addressing Mortgage Fraud and the Foreclosure Crisis. New York: Center for Court Innovation, 2010.
      Zack, Gerard M. Financial Statement Fraud: Strategies for Detection and Investigation. Hoboken, NJ: John Wiley & Sons, 2013.
      Zuckoff, Mitchell. Ponzi's Scheme: The True Story of a Financial Legend. New York: Random House, 2005.
      Accounting, Accountability & Performance
      Accounting and the Public Interest
      Acta Analytica: Philosophy and Psychology
      American Journal of Forensic Psychiatry
      American Journal of Sociology
      American Psychologist
      Behavioral Sciences & the Law
      British Journal of Social Psychology
      Business Ethics Quarterly
      Computer Fraud & Security
      Contemporary Accounting Research
      European Journal of Psychology Applied to Legal Context
      European Journal of Social Psychology
      European Psychiatry
      International Journal of Business and Social Science
      International Sociology
      Journal of Business Ethics
      Journal of Economic Issues
      Journal of Economic Psychology
      Journal of Forensic Psychiatry & Psychology
      Journal of Individual Differences
      Journal of Law, Medicine & Ethics
      Law and Human Behavior
      MIS Quarterly
      Open Access Journal of Forensic Psychology
      Philosophy & Public Affairs
      Psychology, Crime & Law
      Psychology, Public Policy, and Law
      Social Behavior & Personality
      Social Issues and Policy Review
      Social Psychology
      Social Sciences Research
      African Development Bank Group: Integrity and Anti-Corruption
      American Association for Social Psychiatry
      American Psychiatric Association
      American Psychoanalytic Association
      American Psychological Association
      American Sociological Association
      Asian Development Bank: Anticorruption and
      British Psychological Society
      British Sociological Association
      FBI: Common Fraud Schemes
      Financial Fraud Enforcement Task Force
      George Mason University: Propaganda
      INTERPOL: Cybercrime
      INTERPOL: Financial Crime
      James Randi Educational Foundation
      OLAF: European Commission European Anti-Fraud Office
      Retraction Watch (scientific fraud)
      Snopes.com: Urban Legend Reference Pages
      Transparency International
      United Nations Office on Drugs and Crime
      United Nations Office on Drugs and Crime: Convention Against Corruption
      U.S. Department of Health and Human Services,
      U.S. Department of Justice: Mortgage Fraud
      U.S. Department of Justice: Stop Medicare Fraud
      U.S. Department of State: International Financial Scams
      U.S. Holocaust Museum: Propaganda Exhibit
      U.S. Securities and Exchange Commission: Division of Enforcement
      World Bank: Financial Market Integrity
      Sarah Boslaugh, Kennesaw State University

      Appendix: The Prevalence of Lying in America: Three Studies of Self-Reported Lies

      Kim B. Serota, Timothy R. Levine, and Franklin J. Boster, Department of Communication, Michigan State University

      International Communication Association

      Human Communication Research

      Volume 36, Issue 1 (January 2010)

      The Prevalence of Lying in America: Three Studies of Self-Reported Lies
      Kim B. Serota, Timothy R. Levine, & Franklin J. Boster, Department of Communication, Michigan State University, East Lansing, MI 48823, USA

      This study addresses the frequency and the distribution of reported lying in the adult population. A national survey asked 1,000 U.S. adults to report the number of lies told in a 24-hour period. Sixty percent of subjects report telling no lies at all, and almost half of all lies are told by only 5% of subjects; thus, prevalence varies widely and most reported lies are told by a few prolific liars. The pattern is replicated in a reanalysis of previously published research and with a student sample. Substantial individual differences in lying behavior have implications for the generality of truth-lie base rates in deception detection experiments. Explanations concerning the nature of lying and methods for detecting lies need to account for this variation.


      Humans are ambivalent about deception. On one hand, virtually all human cultures have some prohibition against lying. On the other hand, the ability to deceive well may be essential for polite interaction and, at times, self-preservation. Considerable research exists on the topic of deception, yet surprisingly little is known about the base prevalence of deception. Instead, much of this research has relied on untested assumptions and anecdotal evidence or on a few studies with small and nonrepresentative samples.

      The dearth of deception prevalence research is a symptom of a broader systemic concern regarding research in the social sciences. Asch (1952, reprinted 1987) observed that “before we inquire into origins and functional relations, it is necessary to know the thing we are trying to explain.” Influenced by Asch, Rozin (2001) argued that social scientific research often emphasizes experimental studies and formal hypothesis testing to the exclusion of more basic descriptive work. In line with Rozin's critique, more than 30 years of experimental detection research has proceeded without much attention to the basic nature of the phenomenon itself. We believe that inquiry into deception and related behaviors associated with deception detection requires basic descriptive research examining the extent and distribution of deceptive communication in the population. In the extensive literature on deception, the question of prevalence remains without a clear, well-documented answer. Thus, our research investigates reports of how often people lie.

      In order to study the prevalence of lying, it is necessary to consider what constitutes a lie. Simply and broadly put, lying occurs when a communicator seeks knowingly and intentionally to mislead others. Ford, King, and Hollender (1988) suggest the “consciousness of falsity” to distinguish “normal” lies from pathological ones. Thus, it is not sufficient that something is false for it to be a lie; it is the intent that distinguishes the lie. As Bok (1999) observes: “The moral question of whether you are lying or not is not settled by establishing the truth or falsity of what you say. In order to settle this question, we must know whether you intend your statement to mislead” (p. 6).

      Bok (1999) argues for the principle of veracity that involves a moral asymmetry between honesty and lies. Lying requires justification, whereas truth telling does not. Given prohibitions against deceit, people may try to avoid situations in which there is pressure to lie. Research finds that this principle of veracity guides everyday communication and people consider using deception only when the truth is problematic (Levine, Kim, & Hamel, 2009). But this tells us about the situations in which people lie and not how often people lie.

      Despite this moral asymmetry, most deception research has presumed the ubiquity of lying and moved past the question of frequency to focus on the behavioral correlates of lying or lie detection. The frequency question remains mostly unanswered. A notable exception, and the best and most cited prevalence research, however, is the 1996 diary study of lying in everyday life by DePaulo, Kashy, Kirkendol, Wyer, and Epstein (1996). Using two small samples, students and recruited members of the local community, DePaulo et al. reported the mean number of lies per day as 1.96 (SD = 1.63, N = 77) for the students and 0.97 (SD = 0.98, N = 70) for the subsequent nonstudent sample. Importantly, the aim of the second sample was to replicate findings regarding the nature and reasons for lying with a different but not necessarily representative sample of the population. Nonetheless, a brief synopsis of this study in Psychology Today (“The Real Truth About Lying,” 1996) reported that DePaulo conducted research to answer the question “how often do people lie … ?” and in many subsequent research reports the finding that people tell one to two lies per day has been reified.

      More recently, two smaller and lesser-known studies have sought to replicate and extend elements of the DePaulo et al. (1996) diary study. Hancock, Thom-Santelli, and Ritchie (2004) examined differences between reports of face-to-face lies and reports of lying through computer-mediated communication. Results from a student sample yielded an average of 1.58 lies per day (SD = 1.02, N = 28) and a significant difference for the rates of lying (lies as a proportion of interactions) between face-to-face, telephone, instant message, and e-mail interactions; the highest rates occurred during phone conversations and the lowest rates with e-mail. George and Robb (2008) replaced the pencil-and-paper diary with a personal digital assistant (PDA). They report fewer lies per day; M = 0.59 (SD = 0.37, N = 25) with the 10-minute definition of interaction used by DePaulo et al. and M = 0.90 (SD = 0.54, N = 25) in a second study shortening the interaction definition to 5 minutes (increasing the number of reporting opportunities). Thus, the current literature provides estimates ranging from 0.59 to 1.96 lies per day. Variation in estimates from study to study is expected due to small sample sizes and large standard deviations.

      Other broad estimates of prevarication cited by deception researchers have come from nonacademic sources. In a poll conducted for the book The Day America Told the Truth (Patterson & Kim, 1991), 90% of the subjects admitted being deceitful about a list of subjects, the most common being true feelings, income, accomplishments, sex life, and age. In a Reader's Digest poll (Kalish, 2004) of 2,861 of the magazine's readers, 93% reported one or more kinds of dishonesty at work or school, 93% reported one or more dishonest acts in the marketplace, and 96% reported lying to or committing other dishonest acts toward family and friends.

      Some studies report on various facets of prevalence but provide limited insight into the overall extent of lying because they deal with specific situations such as lying by job applicants, students lying to parents, or lying about spousal infidelity. Levashina and Campion (2007) found that 90% of undergraduate job candidates used some form of deception during job interviews; however, the distinction between impression management and outright lies is often blurred and their report notes that behaviors that are semantically close to lies are difficult to confirm. They estimate actual lies occur in the wide range of 28–75% of job interviews. Jensen, Arnett, Feldman, and Cauffman (2004) examined lying to parents by adolescents and young adults, and quantified the extent of lying by topic over the course of a year. This study found that 82% of all students reported lying to their parents on at least one of six topical issues (money, alcohol/drugs, friends, dating, parties, and sex) with the mean incidence of lying ranging from 0.6 to 2.4 lies depending on the issue. Much of the research that seeks to quantify lying behavior is concentrated in the area of relational communication. Cochran and Mays (1990) studied dating dishonesty among college students and found that 60% of women claimed to have been lied to in order to obtain sex, whereas 34% of the men in the study admitted lying to obtain sex. Knox, Schacht, Holt, and Turner (1993) found that 92% of students (when given the opportunity to report anonymously) admitted to lying to a current or potential sexual partner.

      It is not difficult to understand why many scholars believe lying is a frequent event. Life experiences and anecdotal evidence encourage acceptance of the proposition. A typical research report discussion statement illustrates this view: “Lying is ubiquitous and comes in many forms, from cherished beliefs about Santa Claus to the self-deception commonly encountered in the treatment situation” (Tosone, 2006).

      General acceptance of the ubiquity assumption has implications for studies on lying and deception detection. If everyone lies and lying is an everyday occurrence for most people, this would suggest that individual differences should not have much influence on the identification of lying behaviors. If this is the case, it should be possible to understand the nature of lying and deceptive behavior and find ways to detect it by studying anyone telling lies. For example, this presumption is recurrent in studies that look for regularities in nonverbal cues, microexpressions, and the leakage of emotions. Individual differences are typically considered of less relevance than situational considerations such as whether the lie is a minor everyday lie or if the lie is a consequential, high-stakes lie. If, on the other hand, base rates for lying (the frequency with which one lies or the ratio of truths to lies) vary across groups of individuals or if the ubiquitous average masks variation that is not normally distributed in the population, researchers looking into the nature of the phenomenon need to take into account this variation. Research designs and sample selection procedures ought to control for this variation, or they should examine the nature of the variation itself. Consistent with this second possibility, recent meta-analysis suggests substantial individual differences in people's ability to lie convincingly (Bond & DePaulo, 2008).

      Study 1

      Examination of the literature reveals few attempts to document the extent to which everyday lies occur. The few studies that offer behavioral rates have done so as an adjunct to the main objectives of the research, and the rates obtained are restricted to the specific conditions of the studies. This situation is exemplified by the DePaulo et al. (1996) studies. Although lie frequency is among the interests that prompted their research, most of the design and analysis is devoted to the topics of what people lie about, to whom they lie, and what motivates them to lie. DePaulo et al. noted that their observations of lying frequency are based on students and an adult sample that was chosen not for representativeness but, instead, to provide a dissimilar sample in order to determine whether or not the results from the student sample could be replicated. Still, DePaulo et al. reported: “Participants in the community study, on the average, told a lie every day; participants in the college student study told two.” So despite being based on data that were (correctly) noted by its authors as lacking generalizability, this research report has become the standard reference for the prevalence of everyday lies in the deception literature. The goal of Study 1 is to test this claim by obtaining data from a large cross-section of the adult population.

      Participants and Design

      In order to examine the proposition that most people tell one to two lies per day with projectable data, an Internet survey of 1,000 American adults (18 years of age or older) was conducted using the Synovate eNation omnibus panel.1 The omnibus panel is a commercial survey research tool used for daily, multiclient studies and approximates a nationally representative sample with some limitations. Panelists are recruited into a pool of more than 1 million subject households using banner advertisements, mailing lists, and related procedures to promote participation; subjects must formally opt in to confirm awareness that they are participating in research and consent to participation. Each survey day, a new sample (N = 5,500) is drawn from the pool. Subjects are randomly selected within strata defined by population characteristics. The response rate is typically 19–20%, accounting for both nonresponse and incomplete surveys. Responses exceeding the 1,000 daily quota are deleted using systematic (random start, nth selection) sampling. The panel is matched on age, gender, income, and region to the U.S. Census Bureau's monthly Current Population Survey (CPS; U.S. Census Bureau, 2008). Results are poststratification weighted (Kish, 1965) to these CPS criteria. In addition, Synovate uses weighting in order to adjust partially for underrepresentation of Hispanics and ethnic minorities in the sample. Subjects were included in a prize drawing as the incentive to participate. Table 1 provides the unweighted and weighted sample demographics and compares them to recent U.S. Census data and to eNation weighting targets, which are based on U.S. Census data.

      Table 1 Comparison of the Unweighted and Weighted Demographics of the Study Sample to U.S. Census Data and eNation Demographic Targets

      This study used a nonexperimental survey design in order to obtain descriptive measures for the incidence of lying in the population. Results from this survey are compared with the popular standard established by the DePaulo et al. (1996) studies.

      Procedure and Measures

      Subjects received an invitation e-mail asking them to participate in an omnibus survey on the date of the study. The invitation was directed to a specific member of the household identified by age and gender. The invitation instructed that individual to click on a link to the survey website. When subjects accessed the site, they were provided with instructions, asked questions confirming participant identification, asked the omnibus survey questions for several unrelated topics, and asked a series of demographic questions. On the day of the lying study, subjects were asked about four topics (in order of presentation): packaged meals, cat litter products, lying behavior, and water softeners.

      The DePaulo et al. (1996) diary study (and subsequent diary studies) used subject training to make the topic less sensitive and provide the subjects with a common definition of lying. Training of survey respondents was not possible, so to encourage accurate reporting, the self-report lying question was preceded by a definition of lying that incorporated the elements of foreknowledge and intent described at the beginning of this article. A brief description of types of lies was also included. Both were presented in a nonpejorative manner:

      We are interested in truth and lies in people's everyday communication. Most people think a lie occurs any time you intentionally try to mislead someone. Some lies are big while others are small; some are completely false statements and others are truths with a few essential details made up or left out. Some lies are obvious, and some are very subtle. Some lies are told for a good reason. Some lies are selfish; other lies protect others. We are interested in all these different types of lies. To help us understand lying, we are asking many people to tell us how often they lie.

      Subjects were then asked, using an open-ended format, how many times they had lied in the past 24 hours. They responded separately for lies to family members, friends, business contacts, acquaintances (“people you do not know but might see occasionally”), and total strangers; for each type of receiver, they were asked about lies in both face-to-face and mediated situations. Response categories were used as a mnemonic device and to provide additional detail for the analysis. The results of the 10 receiver-mode combinations were aggregated. Specifically, the question was worded:

      Think about where you were and what you were doing during the past 24 hours, from this time yesterday until right now. Listed below are the kinds of people you might have lied to and how you might have talked to them, either face-to-face or some other way such as in writing or by phone or over the Internet. In each of the boxes below, please write in the number of times you have lied in this type of situation. If you have not told any lies of a particular type, write in “0.” In the past 24 hours, how many times have you lied?

      The subjects were presented with a response grid showing the five types of people and two modes of communication. Subjects were instructed to enter a number in each box. The Internet questionnaire required a response in each of the 10 boxes before allowing the subject to continue to the next Web page. Figure 1 presents a screen shot of the lying description used in the Web survey; Figure 2 is a screen shot of the survey question. Based on the 5 × 2 individual categories of responses, the row, column, and grand total frequencies (of lies per day) were aggregated for each subject.

      Figure 1 Screen shot of the Internet survey page giving a definition and description of lying.
      Figure 2 Screen shot of the Internet survey page with the question and response grid.

      Results

      The results of this national study are consistent with the oft-cited observation that on average Americans tell one to two lies per day (M = 1.65 lies per day, SD = 4.45, Mdn = 0, Mode = 0, N = 998, Max = 53 lies, 95% CI = 1.37–1.93).2 But the most intriguing finding is the distribution of responses, not the mean. As Figure 3 illustrates, the majority of people report telling no lies during the past 24 hours, and most of the reported lies are told by a few people. The 40.1% who reported lying told a total of 1,646 lies (M = 4.11, SD = 6.26, n = 400). Of these, 22.7% of all reported lies were told by 1% of the national sample. Results indicate that one-half of all reported lies are told by just 5.3% of American adults (M = 15.61, SD = 11.22, n = 53).
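      The gap between the mean and the median in such a skewed distribution can be seen with a short arithmetic sketch. The counts below are invented for illustration and are not the study's data; they merely mimic the pattern in which most respondents report zero lies while a few prolific liars report many:

```python
import statistics

# Illustrative daily lie counts for 20 hypothetical respondents:
# most report zero, two report a great many.
lies = [0] * 12 + [1, 1, 2, 2, 3, 5, 12, 20]

mean = statistics.mean(lies)      # pulled upward by the two prolific liars
median = statistics.median(lies)  # the typical respondent told no lies

print(mean, median)
```

      With these invented counts the mean exceeds two lies per day even though the median respondent reports none, which is why the article cautions that the mean number of lies per day is misleading on its own.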

      Figure 3 (A) The majority of Americans reported they did not lie in the past 24 hours (59.9%). (B) Percentage distribution by number of lies told; 32.2% told one to five lies and 7.9% reported telling six or more lies.

      Figure 4 indicates that, among those who reported lying, the proportion of people who report a particular number of lies per day decreases as a function of the number of lies. Moreover, observation of this curve suggests that the decrease is a standard power function. Fitting a power function curve to these data produces the equation y = 152.225 × x^(−1.209), where x is the number of lies reported per day and y is the frequency of people reporting a given number of lies. The function's sizeable intercept indicates that the majority of respondents who report lying do so in moderation, and the steep negative slope indicates the substantial decrease in frequency as the number of daily lies increases. Figure 4 also indicates that this pattern holds regardless of mode of communication. Although the slopes for both face-to-face lies and mediated lies are even steeper than that of the aggregated data, both retain the power law character of the total data set. Regardless of mode, most people report telling no lies, and, as the curve fitting for number of lies reinforces, among those who report lying, most tell very few lies; but in each case, there are a few subjects who account for a large proportion of the lies being told.
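      A power function of the form y = a·x^b can be fit by ordinary least squares in log-log space, since log(y) = log(a) + b·log(x). The sketch below illustrates the method only; the frequency counts are invented and do not reproduce the study's coefficients:

```python
import numpy as np

# Hypothetical frequency data: x = number of lies reported per day,
# y = number of respondents reporting exactly x lies (illustrative only).
x = np.array([1, 2, 3, 4, 5, 6, 8, 10, 15, 20], dtype=float)
y = np.array([150, 66, 40, 28, 22, 17, 12, 9, 6, 4], dtype=float)

# A power law is linear in log-log coordinates, so a degree-1
# polynomial fit recovers the exponent b and log of the intercept a.
b, log_a = np.polyfit(np.log(x), np.log(y), deg=1)
a = np.exp(log_a)

print(f"fitted: y = {a:.1f} * x^({b:.3f})")
```

      The negative fitted exponent corresponds to the steep downward slope described above: each additional daily lie is reported by progressively fewer respondents.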

      Figure 4 Frequencies of total, face-to-face, and mediated lies are represented by similar standard power functions.

      Similarly, Figure 5 (among those telling face-to-face lies) and Figure 6 (among those telling mediated lies) indicate that lying behavior replicates the fractal character of power functions observed in other disciplines such as biological systems (Brown et al., 2002) and market segmentation (Anderson, 2008). Although each group of lies told to family members or friends or other types of message receivers represents only a small portion of the total lies, within each of the 10 mode-receiver combinations, the power function pattern is repeated. Most of the variation is among the intercepts and reflects that more lies are typically told to family members or friends than to acquaintances or total strangers.

      Figure 5 Standard power function curves fit the frequencies of face-to-face lies plotted by number of lies for all receiver types.
      Figure 6 Standard power function curves fit the frequencies of mediated lies plotted by number of lies for all receiver types.

      These data do not include the number of interactions by type; therefore, it is possible that this variation is as much due to the number of opportunities for lying as it is that people are more likely to lie to others with whom they are familiar. Establishing the proportion of interactions involving lies may be more difficult than establishing the rate of lying for a prescribed time period. All of the diary studies to which these results are compared are flawed with regard to the interaction ratio. In order to capture as many lies as possible, DePaulo et al. (1996) specified that subjects were to record all interactions of 10 minutes or more. Subjects were told to record all lies that occurred during these interactions and, importantly, record lies occurring during shorter interactions as well. As a result, the ratio of total lies, regardless of interaction duration, to 10-minute interactions distorts the true relationship. If the number of 10-minute interactions varies across respondents or modes of communication, these comparisons may be more misleading than comparisons of the number of lies in each category during the fixed 24-hour time frame. Subsequent diary studies incorporate the lie per interaction distortion created by the DePaulo et al. methodology.

      In order to consider more fully the possibility of individual differences in the propensity to lie, a multiple regression analysis was performed. Initially, a natural logarithm transformation was applied to the continuous lying measure as a means of reducing its nonnormality. Although not eliminating the nonnormality completely, this transformation had the effect of decreasing skewness by a factor of approximately 4 and kurtosis by a factor of approximately 18. To assess the impact of the demographic measures on lying, the natural logarithm transformed lying measure was regressed onto all demographic measures. Trivial predictors were dropped, the analysis was iterated, and two important, albeit modest in magnitude, predictors emerged (R = .21, F(2,979) = 22.82, p < .001). First, age was an important predictor (β = −.18, t(979) = −5.71, p < .001), and the effect was such that a decrease in lying was associated with increasing age. Second, race (Caucasian vs. other) was an important predictor (β = −.09, t(979) = −2.99, p = .003), and the effect was such that Caucasians reported lying less than other racial groups. This analysis was replicated with the dichotomous not lie/lie measure, with the same two important predictors emerging in the subsequent logistic regression analysis.
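      The skew-reducing effect of a logarithmic transformation can be sketched as follows. The data here are simulated from a generic heavy-tailed distribution, not drawn from the survey; log(1 + x) is used rather than a plain logarithm because it is defined at zero lies:

```python
import numpy as np

rng = np.random.default_rng(0)

def skewness(v):
    """Sample skewness: the third standardized moment."""
    v = np.asarray(v, dtype=float)
    return float(np.mean(((v - v.mean()) / v.std()) ** 3))

# Simulated heavy-tailed "lies per day" values (illustrative only).
lies = rng.pareto(2.0, size=1000)

# log(1 + x) compresses the long right tail while handling zero counts.
transformed = np.log1p(lies)

print(round(skewness(lies), 2), round(skewness(transformed), 2))
```

      As in the analysis above, the transformation does not eliminate nonnormality, but it shrinks the skewness substantially, making the transformed measure a more reasonable dependent variable for ordinary regression.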

      No sex differences were observed when controlling for other demographic predictors. Bivariate analysis showed that the overall rate of lying by men (M = 1.93, SD = 4.81, n = 482) and women (M = 1.39, SD = 4.08, n = 516) is directionally but not significantly different when using conventional criteria for statistical significance (t(996) = 1.89, p = .059/ns, two-tailed, d = 0.12).3 This apparent gender difference is in the opposite direction observed by DePaulo et al. (1996) but is consistent with the finding that women find lying less acceptable than men (Levine, McCornack, & Avery, 1992).


      Discussion

      The results of this national study are consistent with the DePaulo et al. (1996) diary study and suggest that on average Americans lie once or twice a day. However, the important findings are that many people do not lie on a given day, that the majority of lies are told by a few prolific liars, and that, because the distribution is highly skewed, the mean number of lies per day is misleading. This pattern is consistent across modes of communication and varies little on the basis of who is being lied to. Examination of individual differences suggests some variation, but in most cases the differences are small.

      The representativeness of online panel data is debatable. Use of poststratification (such as that employed by Synovate) and propensity (likelihood of response) weighting schemes is the usual solution for matching panel samples to the population. In general, the use of a large number of small and internally homogeneous strata will enhance the proportional fit of the sample to the population. However, non-coverage and self-selection can create sampling problems in Web-based research that weighting does not solve (Loosveldt & Sonck, 2008). These representativeness issues are of particular concern when the measured values are correlated with the underlying reasons for the selection bias. However, when measuring socially undesirable behavior, assessment of representativeness is confounded. Loosveldt and Sonck obtained measures for validation of questions influenced by social desirability using face-to-face interviews; this introduces potential mode effects. Even with the selection problem, the social distance advantage of the Internet may actually produce better data. Birnbaum (2004) provides an argument for representativeness when the results are homogeneous across strata. Because selection bias tends to vary across the levels of a stratified sample, small individual differences for a measure across the stratification variables are evidence that the aggregate finding transcends the sampling issues and indicate that the sample result is representative of the phenomenon in the population as a whole. In Study 1, confidence in representativeness is enhanced by the homogeneity of the results across strata, and convergent validity is subsequently established by consistency with the results in Studies 2 and 3, and by the advantage of the Internet for creating social distance in the measurement of a sensitive topic.

      Although the findings were generally homogeneous across the sample of adult Americans, some small demographic differences were apparent. Notably, age and race/ethnicity account for small but statistically significant variation. Further, the difference between reports of lying by men and women approached statistical significance. These findings may have theoretical, social, and cultural implications.

      Perhaps the most interesting individual difference is the negative association between lying and age. Lying is acquired in early childhood, and the ability to lie is correlated with the acquisition of perspective-taking, theory of mind, and communication skills (Vasek, 1986; see Knapp, 2008, pp. 91–116 for a summary discussion of lying and development). As the child reaches adolescence, lying skill is perfected, but the acceptability of lying declines in early adulthood (Jensen et al., 2004). The difference between rates of lies reported by the DePaulo et al. (1996) student and (adult) community samples suggests that maturity tempers the use of lying as a strategy for goal attainment, and the current findings of the national data in Study 1 are consistent with the claim that lying declines with age.

      With regard to the finding that Caucasians report fewer lies than those of other ethnic or racial groups, it would be irresponsible to conclude simply that White people are more honest in general than those of other races. Research on race and deception is limited, and more research is required to make sense of this finding. The current study also found a marginally significant trend toward men reporting more lies than women. Some studies suggest men lie more than women while others suggest the opposite (e.g., DePaulo et al., 1996; Levine et al., 1992). Other research finds that sex differences vary by the topic of the lie (Haselton, Buss, Oubaid, & Angleitner, 2005).

      Study 2

      The results of Study 1 replicate, with a large and nationally representative sample, the often repeated conclusion that people lie, on average, once or twice per day. The results also document that the distribution of lies per day is substantially skewed. Most reported lies were told by a few prolific liars.

      These findings have important implications. Most importantly, the nature of the distribution makes conclusions drawn from sample means misleading. Although the mean lies per day reported in the literature appear reflective of aggregate reality, the mean as a measure of central tendency does not reflect the lying behavior of the typical person. Instead, most people reported telling no lies at all on a given day, with the median and mode both being zero. We are not the first to note this shape for the lie frequency distribution. A similar pattern, with many telling a few lies and a few telling most of the lies, was reported by Feldman, Forrest, and Happ (2002) in a laboratory study on self-presentation. In their analysis, however, the skew was treated as a methodological limitation rather than as a finding. The clarity of the results observed in the national study raises the question of whether this pattern existed in previous studies reporting lying frequency.
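The distributional point can be made concrete with a small sketch. The counts below are hypothetical, shaped to mimic the skewed pattern described above; they are not the study's data:

```python
from statistics import mean, median, mode

# Hypothetical daily lie counts for 20 respondents (illustrative only):
# a majority report zero lies, while a few prolific liars tell most lies.
lies_per_day = [0] * 11 + [1, 1, 2, 2, 3, 4, 5, 8, 12]

print(mean(lies_per_day))    # 1.9 -- "once or twice a day" on average
print(median(lies_per_day))  # 0
print(mode(lies_per_day))    # 0
```

Here the mean suggests routine daily lying even though most of the hypothetical respondents told no lies at all, which is exactly why the mean misrepresents the typical person under a skewed distribution.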


      The raw data were obtained from the student phase of the DePaulo et al. (1996) study and from both phases of the George and Robb (2008) study. A distribution of lie frequency was partially reconstructed from the results reported by Feldman et al. (2002). The shape of each of the four distributions (excluding those reporting no lies) is examined, and the resulting distributions are compared with the overall results from the national survey. Data from the DePaulo et al. community sample and Hancock et al. (2004) studies were not available.

      DePaulo et al. (1996)

      Of the 77 students sampled by DePaulo et al., 76 reported telling at least one lie over the period of 1 week (Mweek = 13.74, SDweek = 11.40; this is equivalent to Mday = 1.96, SDday = 1.63); the total number of lies told was 1,058; and the most lies told by one person was 46 (equivalent to 6.6 lies per day). Curve-fitting yields a power function for these data of y = 5.366 × x−0.290 (n = 76, r2 = .289). This function has a poor fit; however, no equation provided a fit better than r2 = .350. Even so, the data exhibit the overall distributional properties of a few lies told by most of the subjects and most of the lies told by a few liars. Of the students in this study, 66.2% told the equivalent of two lies per day or fewer. Conversely, the seven most frequent liars (9.2% of the sample) told more than the equivalent of five lies per day, accounting for 26.2% of all lies reported. Although raw data for the community sample are not available, DePaulo et al. reported that 64 of 70 subjects told 477 lies with Mweek = 6.79 (SDweek = 6.86) and Mdnweek = 4.5; these measures suggest a positive skew similar to that observed with the student sample.

      George and Robb (2008)

      Two studies were conducted following the diary methodology used by DePaulo et al. (1996) and Hancock et al. (2004). The objective of the two studies was to examine variations in deceptive behavior by media use. A key difference between these studies and the prior diary studies was the use of PDAs instead of paper and pencil to record interactions and lies. In the first study, George and Robb used the 10-minute minimum time established in the prior diary studies to define an interaction; in the second study, the minimum interaction time was reduced to 5 minutes. In both studies, subjects were instructed to record lies even when the interaction was shorter than the prescribed time.

      Results of this research are notably different from the prior diary studies. In each study, the mean number of lies was small. In the first study, 24 of 25 subjects reported lying over the period of a week (Mweek = 4.16, SDweek = 2.59; this is equivalent to Mday = 0.59, SDday = 0.37; Mdn = 4); the total number of lies told was 104; and the most lies told by one individual was 11 (the equivalent of 1.6 lies per day). Despite the low average, only one subject who reported lying told fewer than two lies; consequently, a power function could not be fit to the distribution (if this individual is eliminated a power function with a modest r2 = .758 can be fit to the remaining data). Even with this poor fit, the distributional properties of the study demonstrated a positive skew similar to the national survey and the DePaulo et al. (1996) diary study. Half of the subjects (48% of the sample) told just 25% of the lies while the three subjects (12%) reporting the most lies told 26.9% of the lies.

      In George and Robb's (2008) second study, the length of interaction was shortened in order to encourage more reporting. As a result, 23 of 24 subjects reported lying over the period of a week and the lying frequency increased from the first study (Mweek = 6.33, SDweek = 3.78; this is equivalent to Mday = 0.90, SDday = 0.54; Mdn = 5). The total number of lies reported was 152 and the most lies told by one individual was 14 (the equivalent of two lies per day). Similar to the first study, and despite the overall low number of lies told, those reporting lies told a minimum of two lies; consequently, a power function could not be fit to the data. Nonetheless, the positive skew is again apparent in the pattern of responses. Just three subjects (12.5% of the sample) told 40 of 152 lies (26.3% of the total).

      Although addressing a much shorter time frame in an experimental setting, Feldman et al. (2002) observed the same distinct skew in their lying data as appeared in the daily national study and the weekly student diary studies. This study examined lying as a component of self-presentation; a sample of 121 students was divided into two induction groups and an experimental control. Across all subjects, a total of 211 lies were analyzed with a mean of 1.75 lies per subject and 2.92 lies per subject who lied.4 Of the 121 subjects, 49 told no lies (40.5%), 23 told one lie (19.0%), and 18 told two lies (14.9%). The maximum number of lies was 12. Thus, 74.4% of the total sample accounted for only 28% of the lies told. The remaining 31 subjects (25.6%) told 3–12 lies, accounting for 72% of the lies (M = 4.9 lies per subject).

      When prior research reporting the frequency of lies is reexamined, results show the small diary samples, the experimental self-presentation study, and our large, national self-report survey have similarly skewed distributions. In all cases, the infrequent liars are a large part of the sample and account for a disproportionately small share of the lies reported. Conversely, each study includes a small number of individuals who account for a disproportionately large share of the lies.

      Study 3

      Several issues are of concern despite the consistency of the findings across studies and the face validity obtained by reanalysis of prior studies reporting lie frequency. Primarily, these concerns have to do with the accuracy of the study findings using the survey approach to obtaining self-report data. Further, the apparent discrepancy between the numbers of nonliars when the time frame is 1 day versus the number when the time frame is 1 week needs to be resolved. For these reasons, the national survey was replicated using student samples and additional measures.

      Accuracy of Reporting Lies

      An obvious concern with self-report, mass survey research is accuracy of reporting. Bias in the national study data, if it exists, would likely manifest itself as underreporting. Given the pervasive cultural prohibitions against lying, self-presentation motives favor under- rather than overreporting. Methodological research on other sensitive topics such as drinking (Lemmens, Tan, & Knibbe, 1992) and sexual behavior (Ramjee, Weber, & Morar, 1999) indicates that diary studies produce higher mean scores than self-report questionnaires. The diary method used by DePaulo et al. (1996) is expected to be less susceptible to underreporting bias even though the study had the added limitations of a smaller and nonrepresentative sample. Because the mean number of lies per day in the national survey was greater than the mean observed in the diary research, the concern that the self-report questionnaire method led to underreporting is not consistent with the results of most comparisons of survey methods to the alternative diary method. Thus, the observed frequency of lying is not likely attributable solely to the survey methodology used to collect the national study data.

      Nonetheless, a survey of lying behavior invites the question: “How do you know the subjects are not lying?” One answer resides in the methods used to address sensitive questions. Tourangeau and Yan (2007) identify four techniques for asking sensitive questions that might contribute to improved prevalence measurement: Use of self-administered rather than interviewer-administered questions, forgiving wording, randomized response techniques (RRT), and use of the bogus pipeline (BPL) approach. The national survey of lying was conducted as a self-administered study and used a forgiving wording preamble. The meta-analysis by Tourangeau and Yan (2007) provides clear evidence that self-administration yields more behavioral reporting for sensitive topics than does interviewer-administration. There is limited evidence that forgiving wording may also be helpful. Catania et al. (1996) found increased reporting of sexual activity but a series of experiments by Holtgraves, Eck, and Lasky (1997) suggest that forgiving wording is more likely to improve attitudinal responses than behavioral reporting.

      A second answer can be obtained through the use of additional measures in order to assess the social desirability bias (SDB) in the self-report measures (Fisher & Katz, 2000). This is sometimes done directly by obtaining measures of SDB using a scale of SDB traits (Crowne & Marlowe, 1960; Reynolds, 1982) and adjusting for the subjects' level of bias. Alternatively, Fisher and Katz suggest the validity of reports of a socially undesirable behavior such as lying can be assessed indirectly, for example, by comparing the self-report of that behavior to subjects' estimates of the extent of that behavior in others. Subjects in Study 3 were asked to report the total number of times others had lied to them in the past 24 hours.

      Minimum Observable Differences

      A second key issue raised by the results of the national study is the minimum observable difference of one lie per day. Those who lie but do so less than once a day may be recorded as having told no lies. Because the frequency of lies is typically reported as the rate of lying in a fixed interval of time (i.e., one to two lies per day), expanding the duration in which the behavior can be observed is likely to increase the overall reported incidence of the behavior; measures taken over a longer period and converted to a daily rate will also increase precision. In order to observe those who lie once a week or once a month or even less frequently, it is necessary to use a wider time aperture. In the student sample of the DePaulo et al. (1996) study, 30% of diaries recorded six or fewer lies per week (less than once per day). If the number of lies told in a week is evenly distributed across the days of the week (and there is no reason to believe an even distribution is a good assumption), we might expect that as many as 17% (those reporting four, five, or six lies in a week) have an above-average likelihood of being included as liars in a 1-day study. But the other 13% (reporting one, two, or three lies in a week) would be more likely to report not lying on the survey day. By identifying infrequent liars, whether by repeating the daily survey on 2 or more days with the same sample or by asking those who report no lies to identify other times when they had told a lie, we would expect to narrow substantially the gap between the national study observation that 40% told a lie on a given day and the DePaulo et al. result that 95% of subjects (both studies combined) reported at least one lie over the period of a week.

      Participants and Design

      Students were recruited from communication, advertising, and marketing classes at two large universities in the Midwestern United States. Of the 229 subjects, two failed to provide complete information regarding the number of lies told and two provided answers that were determined to be statistical outliers using a discordancy test. These subjects were excluded from the analysis. Of the remaining 225 respondents, 140 were female (62.2%) and 85 were male (37.8%). Participation was voluntary.

      The primary purpose of this study was to cross-validate the results of the national study using a separate sample. A nonexperimental survey design similar to that of the national study was used in order to obtain comparable descriptive measures for the incidence of lying in the student sample. Results from this survey are compared with those of Study 1 and the reanalysis of prior studies in Study 2.

      Procedures and Measures

      The study was administered in five separate classroom settings using a paper questionnaire. The introduction and frequency of lies question were identical to that of the online survey used with the national sample except that students could leave parts of the 5 × 2 grid blank (the online survey forced subjects to place zeroes in all cells for which no lies were told before they could continue with the questionnaire). Blank cells were coded as zero lies. In addition, those subjects who reported telling no lies in the past 24 hours were asked a follow-up closed-ended question regarding when they last told a lie. Response categories included, “more than 24 hours ago but within the last 2 days,” “more than 2 days ago but within the last week,” “more than a week ago but within the last month,” “more than a month ago,” and “never.” Subjects were asked how many lies they had been told by others in the past 24 hours and what percentage of the U.S. adult population lies on a given day. One group of subjects was asked about the difficulty of the lie reporting task.


      The pattern of lie frequency for the total student sample is consistent with the distribution of lies for the national survey using the same frequency measure. The data fit a power function of y = 53.891 × x−1.012 with a goodness-of-fit of r2 = .894. The pattern of many telling few lies versus a few telling many lies is reproduced in the student sample frequencies. Of the total subjects in the student sample, 68.4% told one, two, or no lies (accounting for 24.5% of 526 total lies); conversely, the 13 most prolific liars (5.8% of the total sample) told 22.4% of the lies. On the basis of the relationship of the means in the DePaulo et al. (1996) student and community diary studies, we expected the mean of the student survey to be higher than the mean of the national adult sample. Results from the student sample (M = 2.34 lies per day, SD = 2.94, Mdn = 1, Mode = 0, N = 225, Max = 21 lies, 95% CI = 1.95–2.72) indicate higher frequency than in the national survey (nonoverlapping 95% confidence intervals; t(1221) = 2.62, p < .01, d = 0.18).
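Power-function fits of the form y = a × x^b, like the one reported above, are conventionally obtained by ordinary least squares on log-transformed data. A minimal sketch; the frequency counts below are illustrative, not the study's raw data (y is the number of respondents reporting x lies):

```python
import math

def fit_power(xs, ys):
    """Fit y = a * x**b by least squares in log-log space.
    Returns (a, b, r_squared), with r_squared computed in log-log space."""
    lx = [math.log(x) for x in xs]
    ly = [math.log(y) for y in ys]
    n = len(lx)
    mx, my = sum(lx) / n, sum(ly) / n
    sxy = sum((u - mx) * (v - my) for u, v in zip(lx, ly))
    sxx = sum((u - mx) ** 2 for u in lx)
    syy = sum((v - my) ** 2 for v in ly)
    b = sxy / sxx                      # exponent
    a = math.exp(my - b * mx)          # scale constant
    r2 = sxy ** 2 / (sxx * syy)        # goodness of fit
    return a, b, r2

# Illustrative counts shaped like a long-tail lie-frequency distribution.
xs = [1, 2, 3, 4, 5, 8, 12]
ys = [54, 27, 17, 13, 10, 6, 4]
a, b, r2 = fit_power(xs, ys)
# For data this close to a power law, b comes out near -1 (as in the
# student-sample fit above) and r2 is high.
```

A negative exponent near −1 means the number of people reporting x lies falls off roughly in proportion to 1/x, the signature of the "many tell few, few tell many" pattern.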

      Those subjects who reported telling no lies in the past 24 hours (28.9%) were asked the follow-up question regarding when they last told a lie. Combining those who indicated their most recent lie was within the past week (though not in the past 24 hours) with those who reported at least one lie in the past 24 hours, a total of 92.4% reported telling a lie in the past week (204 of 221 subjects; 4 did not answer the follow-up question). This result is reasonably consistent with the DePaulo et al. (1996) result of 95%.

      Since there is a legitimate concern about SDB in the self-report of lying, a check on the subjects' reported rate of lying (lies per day) was obtained by asking subjects in the student sample how many times they believe they were lied to by others in the past 24 hours. Subjects reported that others told them an average of 2.79 lies in the past 24 hours (SD = 2.82, N = 198). Although 28.9% reported telling no lies, 17.2% reported not being lied to by others. Own lies (M = 2.34, SD = 2.94) are significantly lower than others' lies; the paired sample t(197) = 2.18, p < .05. Despite this significant difference, the distribution of others' lies is strikingly similar to that of own lies, following the long-tail pattern observed in the self-report of lies (the power function for others' lies is y = 44.552 × x−0.790 with a modest goodness-of-fit of r2 = .700). It is important to note that individual subjects' reports of others' lies did not mimic their self-reports of lying. The correlation of the two measures is a meager r = .073 (p = .152, ns).

      Participants were also asked what proportion of the U.S. adult population (18 years or older) told at least one lie on any given day. The mean estimate is 75.8% (SD = 21.7). The proportion of students (who are a subsegment of the adult population) telling at least one lie in the past 24 hours is 71.1% (CI = ±5.9% points). Although not directly comparable due to the different ways in which the proportions were derived, in both instances one percentage falls within the confidence interval of the other, suggesting that the two proportions would not be significantly different.
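The ±5.9-point interval quoted above follows from the usual normal approximation for the standard error of a proportion; a quick arithmetic check:

```python
import math

p, n = 0.711, 225   # students reporting at least one lie in the past 24 hours
z = 1.96            # two-sided 95% critical value

# Half-width of the 95% confidence interval: z * sqrt(p(1-p)/n)
half_width = z * math.sqrt(p * (1 - p) / n)
print(round(100 * half_width, 1))  # -> 5.9 (percentage points)
```

The population estimate of 75.8% falls inside 71.1% ± 5.9, consistent with the text's conclusion that the two proportions do not differ significantly.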


      The results of Study 3 fit a power function similar to that of the national survey and further indicate that, based on self-reports of lying behavior, most people tell few or no lies on a given day but a few prolific liars tell a disproportionately large share of the daily lies. Survey results, diary studies, and the distribution of lies in an experimental setting share this pattern, and the consistency of results provides strong evidence that the frequency of lying has a strong positive skew. The similarity of subjects' estimates of others' lies and the distribution of the number of lies told provides convergent validity for the self-report measure of lying.

      With regard to the accuracy of mean prevalence figures, Study 3 provides mixed results. The mean reported by this student sample is 2.34 lies per day and expands the range of responses across all studies to 0.59–2.34 lies per day. The mean for the student survey is significantly higher than the mean of the national survey of American adults. This is similar to the relationship observed by DePaulo et al. (1996) for their student and community samples. Consistent with the Study 1 finding that younger people tend to lie more than older people, the predominantly young student sample has a higher frequency of lying. Two measures were obtained to assess the validity of the reported lying frequency. One measure estimated the proportion of the population that lies on a given day; the result of 75.8% is similar to the Study 3 result showing that 71.1% of the subjects told at least one lie in the past 24 hours. However, an estimate of others' lies to the subjects (M = 2.79) is significantly higher than the average of the subjects' own lies. Either the subjects truly believe that other people lie more than they do or the subjects are slightly but systematically underestimating their own lying behavior.

      Regardless, the difference is one of degree, not kind, and the relevant finding is the consistent skew of the distribution, not the average number of lies told. In fact, the mean of a long-tail (power) distribution is in part a function of sample size. As the sample size increases, the probability increases that rare but legitimate extreme values, which are not statistical outliers, will be reported. In Study 1, the maximum number of lies by one person was 46 (N = 998). Study 3 consisted of five subsamples (separate classroom administrations of the survey), and the maximum number of lies told was closely associated with sample size. In the largest subgroup (n = 118), the maximum number of lies was 21. In the next largest group (n = 31), the most lies told was 12. In the three smallest subsamples (n = 26, 25, and 25, respectively), the most lies told by one person were 10, 8, and 6.
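The dependence of the observed maximum on sample size can be illustrated by simulation. A sketch using inverse-transform draws from a continuous Pareto (power-law) distribution; the shape parameter is an arbitrary illustrative choice, not estimated from the study data:

```python
import random

def pareto_sample(n, alpha=1.2, xmin=1.0, rng=random):
    """Draw n values from a Pareto tail via inverse-transform sampling:
    x = xmin * (1 - u) ** (-1 / alpha) for uniform u in [0, 1)."""
    return [xmin * (1 - rng.random()) ** (-1 / alpha) for _ in range(n)]

rng = random.Random(7)
for n in (25, 100, 1000):
    # The expected maximum grows with n: rare but legitimate extreme
    # values become increasingly likely to appear in larger samples.
    print(n, round(max(pareto_sample(n, rng=rng)), 1))
```

Because extreme values pull the mean upward, this is another reason the skew pattern, rather than any single mean estimate, is the robust finding across samples of different sizes.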

      A question related to accuracy may be raised with regard to the difficulty of the task (since it might be expected that a more difficult task would produce increased variance). We note that researchers familiar with the topic of deception tend to struggle more with the question asked in the self-administered survey than do naive subjects. Those familiar with the deception literature wrestle with the explication of lying and, perhaps because of this cognitive effort, their own ability to recall precisely over a 24-hour period. Naive subjects seem to have no such problem. As a check, 25 students in one Study 3 subsample completed the recall of lies task and were subsequently asked to rate the difficulty of answering the question. On a 1–10 scale (1 = not at all difficult; 10 = extremely difficult), the subjects reported M = 3.48 (SD = 3.03) with the Mdn = 2 and the Mode = 1. Most subjects seemed to feel that they were able to complete the task with little difficulty. During a verbal debriefing of this subject group, subjects reiterated that most had made an earnest effort to estimate the number of times they had lied and most felt the task was not difficult.

      Finally, Study 3 appears to have resolved the discrepancy in the proportion of nonliars reported in the daily survey and weekly diary results. All of the diary studies report that more than 90% of the subjects told at least one lie in the period of 1 week. However, many of those subjects lie at a rate that translates to less than one lie per day. When Study 3 subjects who did not report lying in the 24 hours preceding the survey were asked when they last told a lie, results indicate that most did lie in the previous week. Therefore, the apparent discrepancy is due to the precision of the question being asked. The daily questions do not allow for fractional responses, whereas the conversion of weekly data to daily data inherently creates fractional reporting. When the precision issue is accounted for by additional measurement, the discrepancy disappears.

      General Discussion

      The results of Study 1 reproduce with a large, national sample the conclusion that people lie, on average (arithmetic mean), once or twice per day. More importantly, the results document that the distribution of self-reported lies per day is substantially skewed. Most people report telling few or no lies on a given day and most reported lies are told by a few prolific liars. The reanalyses in Study 2 and the replication of the survey methodology in Study 3 provide strong evidence in support of this finding. All the data reported here are consistent with the claim that most lies are told by a few prolific liars.

      The highly skewed, long-tail distribution that results from reports of lying may be emblematic of a larger class of behaviors. Although much of what we measure and observe relies on the principle of randomness and results in the well-known bell-shaped curve of observations distributed around a central tendency for a phenomenon, scientists continue to find new evidence for a self-organizing principle in nature that produces distributions that are scale-free, or lacking in a characteristic tendency (Barabási, 2003; Barabási & Albert, 1999). This principle is found in the examination of atomic structure, biological systems, economic theory (the Pareto Law or “80/20 Rule”), and the emergence of nodes on the Internet. The latter describes a pattern consisting of a few very large nodes that are connected with extremely high frequency (e.g., Google, eBay, and http://Amazon.com) and surrounded by over 100 million smaller and, in many cases, isolated Websites. But it is the human tendency to search, use, and link to these few extremely large sites while ignoring so many others that creates this apparent scale-free pattern in the first place. It may be less the case that lying is some unique form of communicative behavior that divides people into those who do it (with vigor) and those who do not than an indicator that we need to reexamine a broad range of social and symbolic behaviors, look for scale-free distributions, and consider broadly the implications for all forms of communication research.

      Our finding with regard to the distribution of reported lies has specific implications for the conclusions drawn from deception detection accuracy experiments. A meta-analysis shows that people do only slightly better than chance (54%) when distinguishing between truths and lies (Bond & DePaulo, 2006). In the experiments cited by Bond and DePaulo, the deception base rate is almost always 50% and subjects are typically truth-biased. Truth bias leads to the veracity effect, meaning that accuracy is higher for detecting honest messages than for detecting lies (Levine, Park, & McCornack, 1999). As a consequence, honesty base rates impact accuracy such that people are increasingly accurate as the proportion of truthful to deceptive messages increases (Levine, Kim, Park, & Hughes, 2006). The finding that most people are honest most of the time suggests that experiments employing nonrepresentative base rates (i.e., a lower proportion of truths to lies than is typical of everyday interactions) will underestimate accuracy. Findings from this general population study indicate that truth bias may be functional. Since self-report data suggest most people do not lie or tell very few lies in the course of a typical day, it is reasonable for people to believe others (that is, to be truth-biased) most of the time.

      However, although laboratory deception detection experiments may underestimate overall accuracy due to nonrepresentative base rates, lie detection rates may be overestimated. The current findings suggest substantial individual differences for truth telling (or lying) with an honest majority and a few prolific liars. Bond and DePaulo (2008) report substantial variation in both people's demeanor and people's ability to lie. People's ability to lie successfully likely impacts how often they lie (Kashy & DePaulo, 1996). We speculate that the prolific liars are likely those people with especially honest demeanor and that unusually transparent liars avoid lying. If most lies outside the laboratory are told by people who are usually believed, lie detection rates would be lower than those observed in randomized experiments.

      One caution relates to Rozin's (2001) concern for achieving “context- and culture-sensitive scientific social psychology.” The large sample of American adults used in the current research is more representative and allows greater generalizability than previous studies using students and other convenience samples. Nevertheless, caution should be taken before assuming that these results will hold for other cultures, individuals under 18 years of age, or subsets of the population with characteristics or beliefs that are sharply discrepant from the norm. Anecdotal evidence suggests that base rates may well vary across cultures and studies of lying involving young children and adolescents show considerable variation in the ways lying occurs (Knapp, 2008). Criminals, political extremists, and those at the far ends of the socioeconomic spectrum may well behave differently than the vast majority of the American population. The domain of deception research would benefit from cross-cultural, subcultural, and cross-generational examination of the prevalence and distribution of lying behavior.


      Over time, most Americans probably lie at least occasionally. But the inference of pervasive daily lying, drawn from the statistical average of one to two lies per day, and reinforced by media coverage of corporate deception and political malfeasance, as well as pop culture portrayals of deception detectors, belies the basic honesty present in most people's everyday communication. Self-report data for the U.S. adult population show the average rate of lying is around 1.65 lies per day; but the data are not normally distributed. On any given day, the majority of lies are told by a small portion of the population, and nearly 6 out of 10 Americans claim to have told no lies at all. As researchers continue to examine the nature of lying and look for ways to detect deception effectively, both the theories of deception, and the methods used to test those theories, necessarily must take into account that veracity is not a constant across the population and that the propensity to lie can be an important moderator.


      The authors thank Deborah Kashy and Joey George for providing the original data from their studies for additional analysis.


      1 Synovate, Inc. (Chicago) conducts marketing research and strategic studies for business, industry, and government agencies. White papers on the omnibus methodology are available at http://www.synovate.com. The omnibus panel was surveyed on April 30, 2008.

      2 Two subjects, one with an extreme value of 134 lies (20.8 standard deviations above the mean of 1.80 lies) and a corresponding subject with no lies, were eliminated from the sample by α-trimming. Using procedures outlined by Barnett and Lewis (1978), a discordancy test appropriate for gamma distributions was applied to the data. Even though the potential for extreme values is inherent in the tail of the power function, the most extreme value recorded in these data is well beyond the limits of statistical probability.

      3 The application of a t-test for establishing the significance of differences for nonnormal distributions is problematic. Nonetheless, in those situations where the shapes of the distributions are similar, the sample sizes are similar, and the samples are sufficiently large, the t-test will provide an acceptable test of significance even though the assumption of normality has been violated (Johnson, 1978). Although most of the small student sample studies fail to meet all of these conditions, the power distributions in this large-scale national study replicate for most subsamples and do so with large enough samples to allow for robust statistical testing. An alternative is to make the nonparametric assumption and apply difference testing accordingly. Calculating chi-square for age (under 45 years old vs. 45 years and older) and lying (did vs. did not lie) yields χ2(1, N = 998) = 36.59, p < .001; this leads to a conclusion similar to the one reported in the text, specifically that younger people are more likely to lie than older people. Calculating chi-square for gender and lying yields χ2(1, N = 998) = 2.84, p = .092; again the result is consistent with the reported t-test showing that men are directionally but not significantly more likely to lie than are women.
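The chi-square values in this note are standard 2 × 2 contingency tests, which reduce to a closed form for a table [[a, b], [c, d]]. A sketch with hypothetical cell counts (the note reports only the totals and the statistic, so the counts below are illustrative, not the study's):

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square (df = 1, no continuity correction) for the
    2x2 table [[a, b], [c, d]], via n(ad - bc)^2 / (row and column totals)."""
    n = a + b + c + d
    num = n * (a * d - b * c) ** 2
    den = (a + b) * (c + d) * (a + c) * (b + d)
    return num / den

# Hypothetical counts summing to N = 998: rows = younger/older,
# columns = lied / did not lie on the survey day.
chi2 = chi_square_2x2(260, 240, 160, 338)
```

With df = 1, a statistic above roughly 3.84 is significant at p < .05, so a value of this size comfortably supports the age difference described in the note.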

      4 Despite including the induction groups, using a student sample, and being constrained to only those who told lies, the mean of 2.92 lies has been misinterpreted in the media (online program description, About Lie to Me, 2009, http://www.fox.com/lietome/about) and in scholarly work as claiming “in the course of a 10-minute interaction … the average person tells two to three lies” (Harrington, 2009, p. 4).

      About Lie to Me. (2009). Online program description. Retrieved April 10, 2009, from http://www.fox.com/lietome/about.
      Anderson, C. (2008). The long tail. New York: Hyperion.
      Asch, S. E. (1987). Social psychology (Original work published 1952). New York: Oxford University Press.
      Barabási, A.-L. (2003). Linked. New York: Plume.
      Barabási, A.-L., & Albert, R. (1999). Emergence of scaling in random networks. Science, 286, 509–512.
      Barnett, V., & Lewis, T. (1978). Outliers in statistical data. Chichester, UK: Wiley.
      Birnbaum, M. H. (2004). Human research and data collection via the Internet. Annual Review of Psychology, 55, 803–832.
      Bok, S. (1999). Lying: Moral choice in public and private life. New York: Vintage.
      Bond, C. F. Jr., & DePaulo, B. M. (2006). Accuracy of deception judgments. Personality and Social Psychology Review, 10, 214–234.
      Bond, C. F. Jr., & DePaulo, B. M. (2008). Individual differences in judging deception: Accuracy and bias. Psychological Bulletin, 134, 477–492.
      Brown, J. H., Gupta, V. K., Li, B.-L., Milne, B. T., Restrepo, C., & West, G. B. (2002). The fractal nature of nature: Power laws, ecological complexity, and biodiversity. Philosophical Transactions of the Royal Society B, 357, 619–626.
      Catania, J. A., Binson, D., Canchola, J., Pollack, L. M., Hauck, W., & Coates, T. J. (1996). Effects of interviewer gender, interviewer choice, and item wording on responses to questions concerning sexual behavior. Public Opinion Quarterly, 60, 345–375.
      Cochran, S. D., & Mays, V. M. (1990). Sex, lies and HIV. New England Journal of Medicine, 322(11), 774–775.
      Crowne, D. P., & Marlowe, D. (1960). A new scale of social desirability independent of psychopathology. Journal of Consulting Psychology, 24, 349–354.
      DePaulo, B. M., Kashy, D. A., Kirkendol, S. E., Wyer, M. M., & Epstein, J. A. (1996). Lying in everyday life. Journal of Personality and Social Psychology, 70, 979–995.
      Feldman, R. S., Forrest, J. A., & Happ, B. R. (2002). Self-presentation and verbal deception: Do self-presenters lie more? Basic and Applied Social Psychology, 24(2), 163–170.
      Fisher, R. J., & Katz, J. E. (2000). Social desirability bias and the validity of self-reported values. Psychology & Marketing, 17(2), 105–120.
      Ford, C. V., King, B. H., & Hollender, M. H. (1988). Lies and liars: Psychiatric aspects of prevarication. American Journal of Psychiatry, 145, 554–562.
      George, J. F., & Robb, A. (2008). Deception and computer-mediated communication in daily life. Communication Reports, 21, 92–103.
      Hancock, J. T., Thom-Santelli, J., & Ritchie, T. (2004). Deception and design: The impact of communication technology on lying behavior. CHI Letters, 6(1), 129–134.
      Harrington, B. (Ed.). (2009). Deception: From ancient empires to Internet dating. Stanford, CA: Stanford University Press.
      Haselton, M. G., Buss, D. M., Oubaid, V., & Angleitner, A. (2005). Sex, lies, and strategic interference: The psychology of deception between the sexes. Personality and Social Psychology Bulletin, 31, 3–23.
      Holtgraves, T., Eck, J., & Lasky, B. (1997). Face management, question wording, and social desirability. Journal of Applied Social Psychology, 27, 1650–1671.
      Jensen, L., Arnett, J., Feldman, S., & Cauffman, E. (2004). The right to do wrong: Lying to parents among adolescents and emerging adults. Journal of Youth and Adolescence, 33(2), 101–112.
      Johnson, N. J. (1978). Modified t tests and confidence intervals for asymmetric populations. Journal of the American Statistical Association, 73, 536–544.
      Kalish, N. (2004, January). How honest are you? Reader's Digest, 114–119.
      Kashy, D. A., & DePaulo, B. M. (1996). Who lies? Journal of Personality and Social Psychology, 70, 1037–1051.
      Kish, L. (1965). Survey sampling. New York: John Wiley & Sons.
      Knapp, M. L. (2008). Lying and deception in human interaction. Boston: Pearson Education.
      Knox, D., Schacht, C., Holt, J., & Turner, J. (1993). Sexual lies among university students. College Student Journal, 26, 269–272.
      Lemmens, P., Tan, E. S., & Knibbe, R. A. (1992). Measuring quantity and frequency of drinking in a general population survey: A comparison of five indices. Journal of Studies on Alcohol, 53, 476–486.
      Levashina, J., & Campion, M. (2007). Measuring faking in the employment interview: Development and validation of an interview faking behavior scale. Journal of Applied Psychology, 92(6), 1638–1656.
      Levine, T. R., Kim, R. K., & Hamel, L. M. (2009). People lie for a reason: An experimental test of the principle of veracity. Unpublished manuscript, Michigan State University.
      Levine, T. R., Kim, R. K., Park, H. S., & Hughes, M. (2006). Deception detection accuracy is a predictable linear function of message veracity base-rate: A formal test of Park and Levine's probability model. Communication Monographs, 73, 243–260.
      Levine, T. R., McCornack, S. A., & Avery, P. B. (1992). Sex differences in emotional reactions to discovered deception. Communication Quarterly, 40, 289–296.
      Levine, T. R., Park, H. S., & McCornack, S. A. (1999). Accuracy in detecting truths and lies: Documenting the “veracity effect.” Communication Monographs, 66, 125–144.
      Loosveldt, G., & Sonck, N. (2008). An evaluation of the weighting procedures for an online access panel survey. Survey Research Methods, 2(2), 93–105.
      Patterson, J., & Kim, P. (1991). The day America told the truth. New York: Prentice-Hall.
      Ramjee, G., Weber, A. E., & Morar, N. S. (1999). Recording sexual behavior: Comparison of recall questionnaires with a coital diary. Sexually Transmitted Diseases, 26, 374–380.
      Reynolds, W. M. (1982). Development of reliable and valid short forms of the Marlowe-Crowne social desirability scale. Journal of Clinical Psychology, 38, 119–125.
      Rozin, P. (2001). Social psychology and science: Some lessons from Solomon Asch. Personality and Social Psychology Review, 5(1), 2–14.
      The Real Truth About Lying (1996, September/October). Psychology Today, 29, 16.
      Tosone, C. (2006). Living everyday lies: The experience of self. Clinical Social Work Journal, 34(3), 335–348.
      Tourangeau, R., & Yan, T. (2007). Sensitive questions in surveys. Psychological Bulletin, 133(5), 859–883.
      U.S. Census Bureau (2008). Current population survey. Retrieved June 5, 2008, from http://www.census.gov/cps/.
      Vasek, M. E. (1986). Lying as a skill: The development of deception in children. In R. W. Mitchell & N. S. Thompson (Eds.), Deception: Perspectives on human and nonhuman deceit (pp. 271–292). Albany: State University of New York Press.

      Photo Credits

      VOLUME 1 U.S. Navy: 2 (Elliott Fabrizio), 116 (right, Amber K. Whittington), 127, 288 (Jeremy L. Grisham); National Archives and Records: 9, 152, 194; Boston Public Library: 18; Wikimedia: 25, 32, 41, 49 (Kok Leng Yeo), 55, 75, 84 (Matt Waldron), 93, 180, 187, 203 (Damon D'Amato), 217, 247, 265, 295, 347 (Frank Kovalchek), 357, 367 (Jay Tamboli), 374, 379, 391, 405 (West Midlands Police), 419, 440 (Arasmus Photo), 448 (Walters Art Museum), 455, 462, 467, 483; Flickr: 62 (Keith Allison), 68, 166 (Bill Humason), 226, 324 (Peter A. Smith), 352 (Dan Silvers), 386 (Tanaka Juuyoh), 474 (Anneli Salo); National Institutes of Health: 98; Department of Defense: 103, 159, 283, 303, 317 (bottom), 338 (Kevin J. Wastler); U.S. Department of Justice: 110; Library of Congress: 116 (left), 143, 173, 412, 436; MorgueFile: 136, 237, 240, 272, 305, 331, 426; National Museum of Health and Medicine: 210; Photo Booth: 258; DES Daughters: 312; U.S. Marine Corps: 317 (top); Food and Drug Administration: 360; © Duke University Photography/Jim Wallace: 400.

      VOLUME 2 Library of Congress: 492, 712, 878, 947; Flickr: 497, 594 (Matthew Hayles), 639 (Ed Schipul), 686 (Government Press Office), 701, 789, 865 (Charlie Neibergall); ThinkStock: 506, 582, 603; Wikimedia: 513 (left), 513 (right, Jay Joslin), 526, 556, 572, 587 (West Midlands Police), 632, 651, 694, 708, 726, 761, 766, 773 (David Bethune, Limestone Technologies), 782 (Keith Allison), 815, 822, 837 (Ulrich Bendele), 846, 851, 859, 872 (David Shankbone); MorgueFile: 519, 617, 622, 661, 801, 808, 885, 894, 899, 913, 917, 926, 938; White House Photo/Lawrence Jackson: 542; Department of Defense: 549, 579; U.S. Navy: 565 (Timothy Walter); © Yann Caradec: 610; Food and Drug Administration: 644, 890; U.S. Department of Homeland Security: 667 (Barry Bahler); U.S. Marine Corps: 672 (Jeremy Fasci); Yorck Project: 679; National Archives and Records: 719, 746, 755, 908; U.S. Air Force: 731; U.S. Coast Guard/Photo courtesy of Helge Fykse, Norway: 738; Orange County Archives: 794; Photo courtesy of Senator Sam J. Ervin Jr. Library and Museum: 931 (left); North Carolina Museum of History: 931 (right).
