Posts Tagged ‘honesty’

They say honesty is the best policy — and now there’s scientific evidence to prove it.

An unconventional study that gave unsuspecting people the chance to pocket nearly $100 found that the more they stood to gain by hiding the truth, the more likely they were to come clean.

The results, published recently in the journal Science, offer surprising insights into the ways that money influences honesty.

The findings could help shape policies that encourage conscientious behavior in a range of situations, researchers said. The Internal Revenue Service could design its forms in a way that discourages people from cheating on their taxes, for example, while insurance companies could change the way they collect information about a car accident so that lying becomes less appealing.

“Honesty is essential for almost all social and economic relations,” said Michel Maréchal, an economics professor at the University of Zurich who helped lead the study.

Scientists have studied how the temptation of money affects honesty in laboratory settings, but little is known about how the two are related in the real world — especially on a global scale.

So a team of economists devised an experiment involving more than 17,000 “lost” wallets that turned up in 355 cities across 40 countries, from Indonesia to Ghana to Brazil.

Each wallet contained a grocery list, a key, and three business cards with the same male name and email address. Some of them had no money, while others had the equivalent of about $13.50.


The contents of a wallet used in the experiment. All three business cards were identical.

To kick off the experiment, a research assistant brought a wallet to the front desk of a hotel, bank, post office or other public place. He or she would claim to have found it outside and push it toward the person behind the desk.

“Somebody must have lost it,” the research assistant would tell the unwitting employee. “I’m in a hurry and have to go. Can you please take care of it?”

The study authors hypothesized that employees would be more likely to email the wallet’s “owner” (using the address on the business card) if it contained no cash. After all, economists assume people will behave rationally and maximize their utility — which in this case would mean keeping the $13.50 windfall.

But that’s not what happened.

“Much to our surprise, we observed the opposite effect,” Maréchal said. “People were more likely to return the wallet when it contained a higher amount of money.”

In the experiment, 51% of wallets with cash were returned, compared with 40% of those without bills or coins.

And people were not treating themselves to a finder’s fee. Spot checks of the wallets showed that 98% of the money was turned in.

“When there was something missing, it was coins,” said study leader Alain Cohn, a behavioral economist at the University of Michigan. In those cases, he said, the researchers figured the coins simply “fell out of the wallet.”

Perhaps $13.50 was such a pittance “that people simply wouldn’t bother stealing it,” said co-author Christian Zünd, a graduate student at the University of Zurich. So the researchers expanded their experiment to include wallets with the equivalent of about $94.

The extra cash seemed to induce yet more honesty. Wallets with the big bucks were returned 72% of the time, compared with 61% of the wallets with less money and 46% of wallets with no cash. Once again, the economists were stumped.

“Why are people more likely to return a wallet that contains more rather than less money?” Zünd said.


Across 40 countries, the trend was clear: Wallets with money in them were more likely to be returned than wallets without cash.

Perhaps people were afraid of getting into legal trouble if they kept the money for themselves. The researchers checked to see if the return rates were higher when the wallet was dropped off in the presence of witnesses or security cameras, but those factors didn’t seem to make a difference. Nor was there a correlation between return rates and local lost-property laws.

If people weren’t acting out of fear or being influenced by peer pressure, maybe they were sincerely concerned about the well-being of the wallet’s “owner.” So the researchers introduced “lost” wallets that contained money but no key. The result: Employees were 9.2 percentage points more likely to return a wallet with a key than a wallet without one — a sign of altruism, the study authors said.

But that wouldn’t explain why wallets with $94 were returned at higher rates than wallets with $13.50. The researchers conducted a survey in the U.S., the United Kingdom and Poland (the three places where big-money wallets were “lost”) and asked people to rate how much they would feel like thieves if they kept a wallet, with or without money.

Keeping a wallet with no money in it did not feel like stealing, Zünd said. “With money, however, it suddenly feels like stealing, and it feels even more like stealing when the money in the wallet increases.”

In other words, the more money the “lost” wallet contained, the greater the psychological cost of seeing oneself as a thief. That was a price people didn’t want to pay.

The results bolster the idea “that people care about maintaining a positive moral view of themselves,” said Nina Mažar, a Boston University behavioral economist who was not involved in the study.

Policymakers could use this insight to encourage better behavior among citizens “by making it more difficult for them to convince themselves that they are honest, when in fact, they did something wrong,” Cohn said.

Previous research has shown that reminding people about their moral standards just before they perform a specific task helps reduce their temptation to cheat. For example, people were more likely to fill out an insurance form honestly if they had to sign an honor statement at the beginning of the form rather than at the end.

Future studies should test whether these results would hold up if the owner of the “lost” wallet appeared to be a foreigner, according to Shaul Shalvi, who studies behavioral ethics and economics at the University of Amsterdam.

“People find it worthwhile to act kindly toward members of their own group but not members of other groups,” he wrote in an essay that accompanies the study.

Mažar said she’d like to know more about how honest behavior varies across countries. In the experiment, for instance, the odds of a wallet with money being returned were more than three times higher in Switzerland than in China.

“We want to understand, ‘What are the commonalities, what are the differences?’” Mažar said. “Because if we understand those, maybe we’ll have a better sense of how we could increase civil honesty on a much larger scale or reduce corruption.”

https://www.latimes.com/science/la-sci-people-are-honest-lost-wallets-experiment-20190620-story.html

By Starre Vartan

Linguists from Lund University in Sweden have discovered a previously undocumented language — a perfect example of why field research is so important in the social sciences. Only spoken by about 280 people in northern Peninsular Malaysia, this language includes a “rich vocabulary of words to describe exchanging and sharing,” according to researchers Niclas Burenhult and Joanne Yager, who published their findings in the journal Linguistic Typology.

Burenhult and Yager discovered the language while surveying for a subproject of the DOBES (Documentation of Endangered Languages) initiative. Under the Tongues of the Semang project, they were looking for language data from various speakers of the Aslian languages.

They named the new language Jedek. “Jedek is not a language spoken by an unknown tribe in the jungle, as you would perhaps imagine, but in a village previously studied by anthropologists. As linguists, we had a different set of questions and found something that the anthropologists missed,” Burenhult, an associate professor of general linguistics, said in a university release.

The people who speak Jedek are settled hunter-gatherers, and their language may influence — or reflect — other aspects of their culture. As detailed by the linguists, “There are no indigenous verbs to denote ownership such as borrow, steal, buy or sell, but there is a rich vocabulary of words to describe exchanging and sharing.”

The community in which Jedek is spoken differs in more ways than its emphasis on sharing over owning. It’s more gender-equal than Western societies, according to the linguists. They also report that there are no professions; everyone knows how to do everything. “There are no indigenous words for occupations or for courts of law. There is almost no interpersonal violence, they consciously encourage their children not to compete, and there are no laws or courts.”

https://www.mnn.com/lifestyle/arts-culture/blogs/malaysias-jedek-language-rich-vocabulary-words-describe-sharing-cooperation

By Suzanne Allard Levingston

With her hair pulled back and her casual office attire, Ellie is a comforting presence. She’s trained to put patients at ease as she conducts mental health interviews with total confidentiality.

She draws you into conversation: “So how are you doing today?” “When was the last time you felt really happy?” She notices if you look away or fidget or pause, and she follows up with a nod of encouragement or a question: “Can you tell me more about that?”

Not bad for an interviewer who’s not human.

Ellie is a virtual human created by scientists at the University of Southern California to help patients feel comfortable talking about themselves so they’ll be honest with their doctors. She was born of two lines of findings: that anonymity can help people be more truthful and that rapport with a trained caregiver fosters deep disclosure. In some cases, research has shown, the less human involvement, the better. In a 2014 study of 239 people, participants who were told that Ellie was operating automatically, as opposed to being controlled by a person nearby, said they felt less fearful about self-disclosure, better able to express sadness and more willing to disclose.

Getting a patient’s full story is crucial in medicine. Many technological tools are being used to help with this quest: virtual humans such as Ellie, electronic health records, secure e-mail, computer databases. Although these technologies often smooth the way, they sometimes create hurdles.

Honesty with doctors is a bedrock of proper care. If we hedge in answering their questions, we’re hampering their ability to help keep us well.

But some people resist divulging their secrets. In a 2009 national opinion survey conducted by GE, the Cleveland Clinic and Ochsner Health System, 28 percent of patients said they “sometimes lie to their health care professional or omit facts about their health.” The survey was conducted by telephone with 2,000 patients.

The Hippocratic Oath imposes a code of confidentiality on doctors: “I will respect the privacy of my patients, for their problems are not disclosed to me that the world may know.”

Nonetheless, patients may not share sensitive, potentially stigmatizing health information on topics such as drug and alcohol abuse, mental health problems and reproductive and sexual history. Patients also might fib about less-fraught issues such as following doctor’s orders or sticking to a diet and exercise plan.

Why patients don’t tell the full truth is complicated. Some want to disclose only information that makes the doctor view them positively. Others fear being judged.

“We never say everything that we’re thinking and everything that we know to another human being, for a lot of different reasons,” says William Tierney, president and chief executive of the Regenstrief Institute, which studies how to improve health-care systems and is associated with the Indiana University School of Medicine.

In his work as an internist at an Indianapolis hospital, Tierney has encountered many situations in which patients aren’t honest. Sometimes they say they took their blood pressure medications even though it’s clear that they haven’t; they may be embarrassed because they can’t pay for the medications or may dislike the medication but don’t want to offend the doctor. Other patients ask for extra pain medicine without admitting that they illegally share or sell the drug.

Incomplete or incorrect information can cause problems. A patient who lies about taking his blood pressure medication, for example, may end up being prescribed a higher dose, which could send the patient into shock, Tierney said.

Leah Wolfe, a primary care physician who trains students, residents and faculty at the Johns Hopkins School of Medicine in Baltimore, said that doctors need to help patients understand why questions are being asked. It helps to normalize sensitive questions by explaining, for example, why all patients are asked about their sexual history.

“I’m a firm believer that 95 percent of diagnosis is history,” she said. “The physician has a lot of responsibility here in helping people understand why they’re asking the questions that they’re asking.”

Technology, which can improve health care, can also have unintended consequences for doctor-patient rapport. In a recent study of 4,700 patients published in the Journal of the American Medical Informatics Association, 13 percent of patients said they had kept information from a doctor because of concerns about privacy and security, and this withholding was more likely among patients whose doctors used electronic health records than among those whose doctors used paper charts.

“It was surprising that it would actually have a negative consequence for that doctor-patient interaction,” said lead author Celeste Campos-Castillo of the University of Wisconsin at Milwaukee. Campos-Castillo suggests that doctors talk to their patients about their computerized-record systems and the security measures that protect those systems.

When given a choice, some patients would use technology to withhold information from providers. In a six-month study published in January in the Journal of General Internal Medicine, Regenstrief Institute researchers gave 105 patients the option to control access to their electronic health records, deciding who could see the record and what kinds of information to share. Nearly half chose to place some limits on access to their health records.

While patient control can empower, it can also obstruct. Tierney, who was not involved as a provider in that study, said that if he had a patient who would not allow him full access to health information, he would help the patient find another physician because he would feel unable to provide the best and safest care possible.

“Hamstringing my ability to provide such care is unacceptable to me,” he wrote in a companion article to the study.

Technology can also help patients feel comfortable sharing private information.

A study conducted by the Veterans Health Administration found that some patients used secure e-mail messaging with their providers to address sensitive topics — such as erectile dysfunction and sexually transmitted diseases — a fact that they had not acknowledged in face-to-face interviews with the research team.

“Nobody wants to be judged,” said Jolie Haun, lead author of the 2014 study and a researcher at the Center of Innovation on Disability and Rehabilitation Research at the James A. Haley VA Hospital in Tampa. “We realized that this electronic form of communication created this somewhat removed, confidential, secure, safe space for individuals to bring up these topics with their provider, while avoiding those social issues around shame and embarrassment and discomfort in general.”

USC’s Ellie shows promise as a mental health screening tool. With a microphone, webcam and an infrared camera device that tracks a person’s body posture and movements, Ellie can process such cues as tone of voice or change in gaze and react with a nod, encouragement or question. But the technology can neither understand deeply what the person is saying nor offer therapeutic support.

“Some people make the mistake when they see Ellie — they assume she’s a therapist and that’s absolutely not the case,” says Jonathan Gratch, director for virtual human research at USC’s Institute for Creative Technologies.

The anonymity and rapport created by virtual humans factor into an unpublished USC study of screenings for post-traumatic stress disorder. Members of a National Guard unit were interviewed by a virtual human before and after a year of service in Afghanistan. Talking to the animated character elicited more reports of PTSD symptoms than completing a computerized form did.

One challenge for doctors arises when a new patient seeks a prescription for a controlled substance. Doctors may be concerned that the drug will be used illegally, a possibility that’s hard to predict.

Here, technology is a powerful lever for honesty. Maryland, like almost all states, keeps a database of prescriptions. When her patients request narcotics, Wolfe explains that it’s her office’s practice to check all such requests against the database that monitors where and when a patient filled a prescription for a controlled substance. This technology-based information helps foster honest give-and-take.

“You’ve created a transparent environment where they are going to be motivated to tell you the truth because they don’t want to get caught in a lie,” she said. “And that totally changes the dynamics.”

It remains to be seen how technology will evolve to help patients share or withhold their secrets. But what will not change is a doctor’s need for full, open communication with patients.

“It has to be personal,” Tierney says. “I have to get to know that patient deeply if I want to understand what’s the right decision for them.”