Episodes

  • Consider some of the conflicts bubbling or boiling in the world today, and then plot where education – both schooling and less formal means of learning – fits in. Is it a victim, suffering from the conflict or perhaps a target of violence or repression? Maybe you see it as complicit in the violence, a perpetrator, so to speak. Or perhaps you see it as a liberator, offering a way out of a system that is unjust in your opinion. Or just maybe, its role is as a peacebuilder.

    Those scenarios are the framework in which Tejendra Pherali, a professor of education, conflict and peace at University College London, researches the intersection of education and conflict. In this Social Science Bites podcast, Pherali discusses the various roles education takes in a world of violence.

    “We tend to think about education as teaching and learning in mathematics and so forth,” he tells interviewer David Edmonds. “But numeracy and literacy are always about something, so when we talk about the content, then we begin to talk about power, who decides what content is relevant and important, and for what purpose?”

    Pherali walks us through cases illustrating these roles in locales as varied as Gaza, Northern Ireland and his native Nepal, and while seeing education as a perpetrator might seem a dispiriting task, his overall work endorses the value of and need for education in both peace and war.

    He closes with a nod to the real heroes of education in these scenarios.

    “No matter where you go to, teachers are the most inspirational actors in educational systems. Yet, when we talk about education in conflict and crisis, teachers are not prioritized. Their issues, their lack of incentives, their lack of career progression, their stability in their lives, all of those issues do not feature as the important priorities in these programs. This is my conviction that if we really want to mitigate the adverse effects of conflict and crisis on education of millions of children, we need to invest in teachers.”

    A fellow of the Academy of Social Sciences and of the Higher Education Academy, he is a co-research director of Education Research in Conflict and Crisis and chair of the British Association for International and Comparative Education.

  • The work of human hands retains evidence of the humans who created the works. While this might seem obvious in the case of something like a painting, where the artist’s touch is the featured aspect, it’s much less obvious in things that aren’t supposed to betray their humanity. Take the algorithms that power search engines, which are expected to produce unvarnished and unbiased results, but which nonetheless reveal the thinking and implicit biases of their programmers.

    In an age when facial recognition and financial software algorithms are routinely shown to reproduce the prejudices of their creators, this may seem obvious. It was much less so earlier in the century, when researchers like Safiya Umoja Noble were dissecting search engine results and revealing the sometimes appalling material they highlighted.

    In this Social Science Bites podcast, Noble -- the David O. Sears Presidential Endowed Chair of Social Sciences and professor of gender studies, African American studies, and information studies at the University of California, Los Angeles -- explains her findings, insights and recommendations for improvement with host David Edmonds.

    And while we’ve presented this idea of residual digital bias as something somewhat intuitive, getting here was an uphill struggle, Noble reveals. “It was a bit like pushing a boulder up a mountain -- people really didn't believe that search engines could hold these kinds of really value-laden sensibilities that are programmed into the algorithm by the makers of these technologies. Even getting this idea that the search engine results hold values, and those values are biased or discriminatory or harmful, is probably the thrust of the contribution that I've made in a scholarly way.”

    But through her academic work – directing the Center on Race & Digital Justice and co-directing the Minderoo Initiative on Tech & Power at the UCLA Center for Critical Internet Inquiry – and books like the 2018 title Algorithms of Oppression: How Search Engines Reinforce Racism, the scale of the problem and the harm it leaves behind are becoming known. Noble’s own contributions have been recognized, too: she was named a MacArthur Foundation fellow in 2021 and the inaugural NAACP-Archewell Digital Civil Rights Award winner in 2022.

  • Most of us recognize the presence of ritual, whether in a religious observance, an athlete’s weird pre-competition tics, or even the cadence of our own morning ablutions. In general, most of these rituals are seen as harmless and probably a little unnecessary (or even silly). But according to cognitive anthropologist Dimitris Xygalatas, ritual often serves a positive purpose for individuals – synchronizing them with their communities or relieving their stress.

    In this Social Science Bites podcast, Xygalatas defines for host David Edmonds what his research considers ritual, citing two important characteristics: causal opacity (rain dances, for example, do not actually create precipitation) and the fact that the ritual matters, often greatly, to the participants. What isn’t ritual, he notes, is habit – although habits can veer into ritual.

    “Utilitarian actions can become ritualized,” Xygalatas says, “and to that extent, they can be considered as rituals. So … because I am a very avid consumer of coffee, when I get up in the morning, I always have to make a cup of coffee – [and] it always has to be in the same cup.”

    Xygalatas then describes fieldwork he’s done on “high-intensity” rituals, ranging from firewalking in Spain to an “excruciating” annual religious procession in Mauritius. These efforts – part ethnography and part lab experiment – have given him unique insight into the results of jointly experienced ritual, much of which he detailed in his recent book, Ritual: How Seemingly Senseless Acts Make Life Worth Living. (In a blurb, Jane Goodall wrote the book shows “how and why our most irrational behaviors are a key driver of our success.”)

    An associate professor in anthropology and psychological sciences at the University of Connecticut – where he heads the Experimental Anthropology Lab – Xygalatas also discusses the transdisciplinary scope of his work. This reflects his own roots in both anthropology and religious studies (he is a past president of the International Association for the Cognitive and Evolutionary Sciences of Religion).

  • At the end of every interview that host David Edmonds conducts for the Social Science Bites podcast, he poses the same question: Whose work most influenced you? Those exchanges don’t appear in the regular podcast; we save them up and present them as quick-fire montages that in turn create a fascinating mosaic of the breadth and variety of the social and behavioral science enterprise itself.

    In this, the fifth such montage, we offer the latest collection. Again, a wide spectrum of influences reveals itself, including nods to non-social-science figures like philosopher Derek Parfit and primatologist Jane Goodall, historical heavyweights like Adam Smith and the couple Sidney and Beatrice Webb, and two past guests on Social Science Bites itself, Nobel Prize laureates Angus Deaton and Daniel Kahneman.

  • Is giving to a charitable cause essentially equivalent to any other economic decision made by a human being, bounded by the same rational and irrational inputs as any other expenditure? Based on research by psychologist Deborah Small and others working in the area of philanthropy and altruism, the answer is a resounding no.

    In this Social Science Bites podcast, Small, the Adrian C. Israel Professor of Marketing at Yale University, details some of the thought processes and outcomes that research provides about charitable giving. For example, she tells interviewer David Edmonds, putting a face to the need – such as a specific hungry child or struggling parent – tends to be more successful at producing giving than a statistic revealing that tens of thousands of children or mothers are similarly suffering.

    This “identifiable victim effect,” as the phenomenon is dubbed, means that benefits of charity may be inequitably distributed and thus do less to provide succor than intended. “[T]he kind of paradox here,” Small explains, “is that we end up in many cases concentrating resources on one person or on certain causes that happened to be well represented by a single identifiable victim, when we could ultimately do a lot more good, or save a lot more lives, help a lot more people, if – psychologically -- we were more motivated to care for ‘statistical’ victims.”

    That particular effect is one of several Small discusses in the conversation. Another is the “drop in the bucket effect,” in which the magnitude of a problem makes individuals throw up their hands and not contribute at all rather than do even a small part toward remedying it.

    Another phenomenon is the “braggart’s dilemma,” in which giving is perceived as a good thing, but the person who notes their giving is seen as less admirable than the person whose gift is made without fanfare. And yet, the fact that someone goes public about their good deed can influence others to join in.

    “[O]ne of the big lessons in marketing,” Small details, “is that word of mouth is really powerful. So, it's much more effective if I tell you about a product that I really like than if the company tells you about the product, right? You trust me; I'm like you. And that's a very effective form of persuasion, and it works for charities, too.”

    Small joined the Yale School of Management in 2022, moving from the University of Pennsylvania’s Wharton School, where she had been the Laura and John J. Pomerantz Professor of Marketing since 2015. In 2018, she was named a fellow of the American Psychological Society and a Marketing Science Institute Scholar.

  • On his institutional web homepage at the University of California-Los Angeles’s Anderson School of Management, psychologist Hal Hershfield posts one statement in big italic type: “My research asks, ‘How can we help move people from who they are now to who they’ll be in the future in a way that maximizes well-being?’”

    In this Social Science Bites podcast, Hershfield and interviewer David Edmonds discuss what that means in practice, whether in our finances or our families, and how humans can make better decisions. Hershfield’s new book, Your Future Self: How to Make Tomorrow Better Today, offers a popular synthesis of these same questions.

    Much of his research centers on this key observation: “humans have this unique ability to engage in what we call ‘mental time travel,’ the ability to project ourselves ahead and look back on the past and even project ourselves ahead and look back on the past while we're doing so. But despite this ability to engage in mental time travel, we don't always do it in a way that affords us the types of benefits that it could.”

    Those benefits might include better health from future-looking medical decisions, better wealth thanks to future-looking spending and savings decisions, or greater contentment based on placing current events in a future-looking context. Which raises the question – when does the future start?

    “The people who think the future starts sooner,” Hershfield explains, “are the ones who are more likely to do things for that future, which in some ways makes sense. It's closer, it's a little more vivid. There's a sort of a clean break between now and it. That said, it is a pretty abstract question. And I think what you're asking about what counts in five years, 10 years, 20 years? That's a deeper question that also needs to be examined.”

    Regardless of when someone thinks the future kicks off, people remain acutely aware that time is passing, even if for many their actions belie that. Proof of this comes from studies of how individuals react when made acutely aware of the advance of time, Hershfield notes. “People place special value on these milestone birthdays and almost use them as an excuse to perform sort of a meaningfulness audit of their lives. … This is a common finding, we’ve actually found this in our research, that people are more likely to do these sorts of meaning-making activities as they confront these big milestones. But it also to some degree represents a break between who you are now and some future person who you will become.”

    Hershfield concludes the interview noting how his research has changed him, using the example of how he now makes time when he might be doing professional work to spend with his family. “I want my future self to look back and say, ‘You were there. You were present. You saw those things,’ and not have looked up and said, ‘Shoot, I missed out on that.’ I would say that's the main way that I've really started to shift my thinking from this work.”

  • A common trope in America depicts a traditional family of a married husband and wife and their 2.5 (yes, 2.5) children as the norm, if not perhaps the ideal. Leaving aside the idea of a “traditional” coupling or what the right number of children might be, is there an advantage to growing up with married parents?

    Definitely, argues Melissa Kearney, author of The Two-Parent Privilege: How Americans Stopped Getting Married and Started Falling Behind and the Neil Moskowitz Professor of Economics at the University of Maryland. In this Social Science Bites podcast, she reviews the long-term benefits of growing up in a two-parent household and details some of the reasons why such units have declined in the last four decades.

    As befits her training, Kearney uses economics to analyze marriage. “Marriage,” she tells host David Edmonds, “is fundamentally an economic contract between two individuals—here, I'm gonna sound very unromantic—but it really is about two people making a long-term commitment to pool resources and consume and produce things together.”

    In her own research, Kearney looks specifically at being legally married within the United States over the last 40 years and what that means when children are involved. Her findings both fascinate her and, she admits, worry her.

    “We talk at length in this country about inequality as we should, but this divergence in family structure and access to two parents and all the resources that brings to kids and the benefits it gives kids in terms of having a leg up in sort of achieving things throughout their life—getting ahead economically, attaining higher levels of education—[well,] we will not close class gaps without addressing this.”

    She provides data showing that the percentage of young Americans living with married parents is indeed falling. In 2020, 63 percent of U.S. children lived with married parents, compared to 77 percent 40 years earlier. Meanwhile, 40 percent of children are born to unmarried parents.

    While these percentages are fairly evenly distributed across the geography of the U.S., they vary much more across the nation’s demographics. For example, children born to mothers who are white or Asian, more educated, or richer are more likely to be born in wedlock.

    “The mechanical drivers of this,” Kearney explains, “are a reduction in marriage and a reduction in the share of births being born inside of marital union, not a rise in divorce, not a rise in birth rates to young or teen moms.” But economics does seem to be a driver, Kearney says – especially among men.

    As cultural tumult saw marriage itself growing less popular starting in the 1960s, non-college-educated men saw their economic prospects dimming. “We saw a reduction in male earnings or a reduction in male employment and a corresponding reduction in marriage and rise in the share of kids born outside of marital union. So, there is a causal effect here, economic shocks that have widened inequality hurt the economic security of non-college educated men, and this rising college gap and family structure.”

    Over time, new social norms were established, so even when the economic prospects of non-college-educated men rise, there is not a corresponding increase in marriage and decrease in non-marital births. “Once a social norm has been established, where this insistence on sort of having and raising kids in a marital union is broken, then we get this response to economic shocks that we might not have gotten if the social norm towards two-parent households and married-parent households was tighter.”

    In addition to her work at the University of Maryland, Kearney maintains a large footprint in the policy world. She is director of the Aspen Economic Strategy Group; a research associate at the National Bureau of Economic Research; a non-resident senior fellow at Brookings; a scholar affiliate and member of the board of the Notre Dame Wilson-Sheehan Lab for Economic Opportunities; and a scholar affiliate of the MIT Abdul Jameel Poverty Action Lab, known as J-PAL.

    So it’s no surprise that she closes her interview with some policy suggestions.

    “[I]mproving the economic position of non-college educated men, I think, is necessary but won't be sufficient. We need more wage subsidies. We need a lot of investment in community colleges throughout the country—they train workers throughout the country—we need to be shoring up those institutions. We need to be stopping bottlenecks in the workforce that make it harder for people without a four-year college degree, or for people who have criminal past, right, criminal history—all of those things. We need to be removing barriers to employment, investing in training, investing in skills, investing in paths to families to sustaining employment.”

  • While it seems intuitively obvious that good management is important to the success of an organization, perhaps that obvious point needs some evidence given how so many institutions seem to muddle through regardless. Enter Raffaela Sadun, the Charles E. Wilson Professor of Business Administration at Harvard Business School and co-leader of the Digital Reskilling Lab there. Working through several managerial mega-projects she co-founded, Sadun can both identify traits of successful management and even put a quantitative value on what good management can bring to a firm (spoiler alert – as Sadun will explain, it’s a big number).

    In this Social Science Bites podcast, Sadun discusses her research findings with host David Edmonds, who opens his inquiry with a very basic question: What, exactly, do we mean by ‘management’?

    “It's a complicated answer,” Sadun replies. “I think that management is the consistent application of processes that relate to both the operations of the organization as well as the management of human resources. And at the end of the day, management is not that difficult. It’s being able to implement these processes and update them and sort of adapt them to the context of the organization.”

    In a practical sense, that involves things like monitoring workers, solving problems and coordinating disparate activities, activities that ultimately require someone “to be in charge.” But not just anyone, Sadun details, and not just someone who happens to be higher up. “The most effective managers are the ones that are able to empower and get information and reliable information from their team, which is fundamentally a bottom-up approach rather than a top-down approach.”

    If that sounds a little different from the adversarial relationship many expect between workers and managers, well, good management is a little different, she continues. “I can see how you can think of this as being a trade-off (profit versus well-being of workers), but if you look at the type of practices that we measure, as I said, they're not exploitations, but they are ways to get people engaged and empowered to sort of participate into the work. It’s always possible that there are organizations that push so much on one side of the equation that make people very unhappy. In my experience, these type of situations are not sustainable.”

    Good people – the ones employers prize -- won’t put up with too much garbage. “Talented people are attracted--to the extent that they want to work for somebody else—they're attracted to places where their life is not miserable.”

    Sadun came to her conclusions through projects like the World Management Survey, which she co-founded two decades ago. “We spoke with more than 20,000 managers to date—around 35 countries, [and ..] collected typically [by] talking with middle managers.” Other big projects include the Executive Time Use Study and MOPS-H, the first large-scale management survey in hospitals, conducted in partnership with the US Census Bureau. In her native Italy, Sadun was an economic adviser to the Italian government in the early 2020s, earning the highest honor possible from the government, the Grande Ufficiale dell'Ordine "Al Merito della Repubblica Italiana." In the United States, she serves as director of the National Bureau of Economic Research Working Group in Organizational Economics and is faculty co-chair of the Harvard Project on the Workforce.

  • “We have been evolving into a species that is super-cooperative: we work together with strangers, we can empathize with people, we are really an empathic flock,” begins Carsten de Dreu, a professor at Leiden University. “And at the same time, there is increasing evidence from archaeological excavations all around the world that already 10, 20 and 30 thousand years ago, people were actually violently killing each other.”

    Trained as a social psychologist, de Dreu uses behavioral science, history, economics, archaeology, primatology and biology, among other disciplines, to study the basis of conflict and cooperation among humans. In this Social Science Bites podcast, he discusses how conflict and violence – which he takes pains to note are not the same – mark our shared humanity and offers some suggestions on how our species might tamp down the violence.

    “Violence,” de Dreu explains to host David Edmonds, “is not the same as conflict – you can’t have violence without conflict, but you can have conflict without violence.” Conflict, he continues, is a situation, while violence is a behavior. Conflict, he says, likely always will be with us, but resorting to violence need not be.

    The psychologist says behavior has a biological basis, and various hormones may ‘support’ violent actions. For example, greater concentrations of oxytocin – which helps maintain in-group bonds and has been dubbed “the love hormone” – are found in primate poo after group fights. But, he cautions, that is not to say we are innately violent.

    But when we do get violent, it’s worse when we’re in groups. Then, the potential for violence, as he puts it, “to get out of hand,” increases, escalating faster and well beyond the violence seen between individuals (even if that one-on-one violence sometimes can be horrific).

    “In an interpersonal fight, the only trigger is the antagonist. In intergroup violence, what we see is that people are sometimes blinded to the enemy – they might not even recognize who they were because they were so concerned with each other.”

    What drives this violence is both obvious and not, de Dreu suggests. “Even among my colleagues, there is sometimes fierce debate - conflicts sometimes about what are conflicts! But if you zoom out, there are two core things that groups fight about:” resources and ideas.

    Fighting over resources is not unique to humans – groups of primates are known to battle over land or mates. But fighting over ideas is uniquely human. And unlike resource conflicts, which have the potential to be negotiated, “for these truth conflicts ... there is no middle ground, no trade-off.” Regardless, he argues, values have value.

    Citing recent work with colleagues, de Dreu says he thinks “these values, these truths, these worldviews that we have, that we share within our groups and our communities, within our countries sometimes, they are the ‘oil’ of the system. To work together so that we all can survive and prosper, we need certain rules, a certain shared view of how the world operates, what is good and bad, what is right and wrong. These are very important shared values we need to have in order to function as a complex social system.”

    But “when these values get questioned, or attacked, or debunked, that’s threatening.” Depending on how severe the threat is perceived to be, violence may be deployed. And sometimes, as even a casual observer may divine, it’s not the direct quest for resources or to protect values that sparks violence, but what de Dreu terms “collateral damage” from leaders cynically weaponizing these drivers – or even inventing threats to them – while actually pursuing their own goals.

    But de Dreu ends the podcast on a (mostly) upbeat note. He says we can break the cycles of violence, even if there’s no neat linear trajectory to do so, and concludes by offering some rays of hope.

  • In the Global North, media and political depictions of migration tend to be relentless images of little boats crossing bodies of water or crowds of people stacking up at a dotted line on a map. These depictions presume two things – that this is a generally comprehensive picture of migration and that, regardless of where you stand, the situation around migration is relatively dire.

    Enter Heaven Crawley, who heads equitable development and migration at United Nations University Centre for Policy Research. She also holds a chair in international migration at Coventry University’s Centre for Trust, Peace and Social Relations, and since 2019 has directed the South-South Migration, Inequality and Development Hub, a project supported by UK Research and Innovation’s Global Challenges Research Fund. From her perch, spanning government, academe and field research, she says confidently in this Social Science Bites podcast that international migration “is not an entirely positive story, but neither is it an entirely negative one. What we’re lacking in the media conversation and in the political discussion is any nuance.”

    Connecting nearly all the regional debates about migration “is the lack of an honest conversation about what migration is and what it has been historically. It has historically been the very thing that has developed the societies in which we live, and it is something on which the clock cannot be turned back.

    “And none of us, frankly, if migration was to end tomorrow, would benefit from that.”

    Trying to bring a clear eye to the debate, she explains to host David Edmonds that roughly 3.6 percent of the world’s population, or 280 million people, could be considered migrants. Of that, about 32 million fit under the rubric of “refugee.” And while the sheer number of migrants is growing, the percentage of the world’s population involved has been “more or less the same” for the last three decades.

    And while this might surprise European listeners, almost 40 percent of migration originates from Asia – mostly India, Pakistan and Bangladesh – followed by Mexico. There is a lot of migration from African countries, Crawley notes, which jibes with European media coverage, but most of that migration isn’t to Europe but within the African continent.

    Who are these migrants? Overall, she says, most people who move are younger than 45. Nonetheless, “the gender, the age really depends on the category you’re looking at and also the region you are looking at.” Generalizations about their qualifications can be fraught: low-skill migrants ready to fill so-called “dirty, difficult and dangerous” jobs and high-skill migrants draining their country’s brains can often depart from the same nation.

    Crawley agrees that migration currently is a politically potent wedge issue, but she notes it has been in the past, too. She suggests that migration per se isn’t even the issue in many migration debates. “A whole set of other things are going on in the world that people find very anxiety-producing” – rapid changes in society spanning security, the economy, demographics and more, all against a backdrop of “migration simultaneously increasing (in the number of people on the move, not the proportion) and the variety of people also increasing.”

    This creates an easy out for policymakers, she says. “Politicians know that if they’ve got problems going on in society, it’s very easy to blame migration, to blame migrants. It really is a very good distraction from lots of other problems they really don’t want to deal with.”

    This is also why, she suggests, responses such as deterrence are more popular than more successful interventions like addressing the inequalities that drive migration in the first place.

    Crawley’s career saw her head asylum and migration research at the UK Home Office, serve three separate times as a specialist adviser to the UK Parliament’s Home Affairs Committee and Joint Committee on Human Rights, and be associate director at the Institute for Public Policy Research. In 2012, in recognition of her contribution to the social sciences and to evidence-based policymaking, she was named a fellow of Britain’s Academy of Social Sciences.

  • In the 1970s and early 1980s, when Shinobu Kitayama was studying psychology at Kyoto University, Cognitive Dissonance Theory and Attribution Theory were “really hot topics” that he found “intellectually interesting” ways of describing human behavior.

    “But when I came here [to the University of Michigan] and looked at my graduate students, colleagues, and friends, I realized that those ideas are really active elements of their mind in a way they were not to me as Japanese individual.”

    He continues, “Obviously there are many cultural shocks – for example, I felt hesitant in speaking up in graduate seminar, but I got the impression that American friends end up saying a lot of things seemingly without thinking anything. That’s the kind of experience that made me feel that something more profound might be going on in terms of culture and its influence on psychological processes.”

    His own perch, he explains in this Social Science Bites podcast, helped focus his personal research into comparing people from East Asia, such as Japan, China, and the Philippines, with people in America. His research ranges from simple exercises involving redrawing a line within a box to brain-scanning technology (“culture gets under the skin,” he jokes before adding, “I find neuroscience indispensable”) and examinations of subsistence agriculture. The Robert B. Zajonc Collegiate Professor of Psychology at Michigan since 2011, he now runs the Culture & Cognition Lab at the school’s Psychology Department.

    He starts his conversation with interviewer David Edmonds by offering a description of a prominent cultural difference between East Asia and Anglo-America – the idea of ‘independence’ and ‘interdependence.’

    “In some cultures, particularly in Western traditions, ‘self’ is believed to be the independent entity that is composed of internal attributes, maybe your attitudes, maybe your personality traits and aspirations, which guide your behavior. Social relationships come out of those individual preferences.

    “In many other cultures, the conception of the person is much more social and relational. There’s a fundamental belief that humans are humans because they are connected to formal social relationships.”

    Kitayama offers some examples of these differences. “Americans tend to believe that what you hear somebody say must be what this person believes. If somebody says ‘yes,’ he must mean yes. But in many countries, ‘yes’ and ‘no’ carry very different meanings, depending on the context.” While someone from, say, the West may realize this on an intellectual level, in practice they often forget and assume a yes means, well, yes. “We found this fundamental attribution error,” he concludes, “is much less, and often even nonexistent, in East Asian, and particularly Japanese, contexts.”

    Or take happiness.

    “Oftentimes, we believe that happiness is happiness. If Americans are happy, it must be in the way that Japanese are happy. We try to challenge this conception to see what people might mean when they claim they are happy. One easy way to do this is to ask people to write down what they mean by happiness, reasons for happiness, conditions in which happiness happens. Core elements of happiness, like elation, relaxation, feeling of excitement, are fairly common between U.S. and Japan.”

    But what leads to those states are quite different, with Japanese respondents often citing social harmony while Americans cite personal achievement.

    In the interview, Kitayama touches on why these differences might have arisen, including one idea that the cultivation of mainstay grains across thousands of years helped create the conditions that led to the cultural traits. The Asian staple of rice, for example, requires a more collective effort – “tight social coordination,” as Kitayama puts it -- to raise and harvest. Meanwhile, the Western staple of wheat requires less collaboration. These underlying agrarian requirements for supremely important foodstuffs may in turn, he says, “promote very different ideologies and social structures and institutions which then lay the ground for contemporary culture.”

    Kitayama has published widely in English and in Japanese and served as editor of the Journal of Personality and Social Psychology: Attitudes and Social Cognition and the Personality and Social Psychology Bulletin. He was a fellow of the Center for Advanced Studies of Behavioral Science at Stanford in 1995 and again in 2007 and a Guggenheim Fellow in 2010; he was inducted as a fellow of the American Association for the Advancement of Science in 2012 and served as president of the Association for Psychological Science in 2020.

  • Everyone, it is said, is allowed their own opinion. But what if someone’s own opinion was in fact one foisted on them by someone else, and yet the original opinion holder in turn holds the changeling opinion as their own?

    Unlikely? Actually, not so unlikely, as the research of Petter Johansson and Lars Hall into ‘choice blindness’ shows. In this Social Science Bites podcast, Johansson – who with Hall runs the Choice Blindness Laboratory at Sweden’s Lund University – reveals some of the unexpected aspects of self-interpretation and how there’s been a very large natural example in the United States of this blindness in action.

    We are “less aware of the reasons for our choices than we think we are,” he has determined, and reasoning, as we call it, is often conducted post hoc.

    Johansson starts his discussion with host David Edmonds by describing his and Hall’s first forays into the study of “how we come to know our own minds.” Their work built on others’ research into something called “change blindness,” which describes not noticing a change – even a major one – that occurs before your eyes.

    (Inattentional blindness – such as in the famous gorilla basketball video – is when we miss something obvious but unexpected right before us because we’re focusing on something else in the tableau. “I’ve seen this at conferences on monster-sized screens, when it is practically King Kong walking in the background, but still people miss this.”)

    Johansson describes how the research partners ‘magically’ morphed this line of inquiry into studies of what they call “choice blindness” using a card trick. “When you have the appearance of free choice,” he says, “when you have the magician say, ‘Pick a card, any card you want,’ the only thing you know is that the choice is no longer free. This was the aspect we wanted to incorporate into our experiments.”

    In the initial experiment, subjects were shown pairs of faces on cards, and asked to choose which they found more attractive. The researcher then handed them that card and asked why they chose it over the other. But sometimes, using sleight of hand, the researcher handed the subject the card with the other face, and asked again why they chose that face.

    “Even when the faces were drastically dissimilar, and the [subjects] could look at the cards for as long as they want, only 25 to 30 percent of the participants detect that the switch has been made,” Johansson reveals. “But it’s not only that they [don’t] pick it up – they then must start constructing reasons why they picked this face,” justifying a choice they didn’t make.

    Subsequent experimentation found that opinions on taste, smell, consumer choice, and more could be subject to such blindness. The researchers, for example, set up a tasting station at a local supermarket, and after having the ol’ switcheroo played on their choice of jam, the subjects came up with “similar types of elaborate explanations” for why the jam they didn’t choose was in fact the better one. The researchers also worked with pairs of people, asking them who they might choose to share a flat with. And here the resulting confabulation was collective.

    The researchers also found choice blindness in politics (especially when a reasonable case could be made for the other opinion). People on the street were asked to participate in a survey about a policy position, and the interviewer would respond with “you clearly believe …” followed by a position they didn’t choose. And, as you will by now expect, the subjects defended their ‘new’ stance.

    “This says something about what a belief is, or an attitude is,” Johansson says. The source of the opinion matters: if you think it comes from you – even when it in fact did not – there must be good reason to hold the opinion. “People don’t like being told what’s right or wrong. But if you can tell yourself what’s right or wrong, it’s much more likely to stick.”

    And this can also be outsourced when your “team” makes a call, and partisans “quickly change their own attitudes to match.”

    Which brings us to former U.S. President Donald Trump. Under Trump, Johansson says, “It felt like there was four years of showing this point almost every day. Trump would change the policies or long-held beliefs almost every day and Fox and Friends and all these voters would just fall in line and quickly construct arguments why this was the right view all along.”

    While this might seem a dour outcome with opinion chameleons calling the shots, Johansson sees a bright side. “It does show we are probably more flexible than we think. We have the ability to change.”

  • “Ah, but a man’s reach should exceed his grasp,” the poet Robert Browning once opined, “or what’s a heaven for?” That’s not a very satisfying maxim for someone trying to lose weight, learn a language, or improve themselves in general on this earthly plane. But there are ways to maximize one’s grasping ability, and that’s an area where psychologist Ayelet Fishbach can help.

    Fishbach, the Jeffrey Breakenridge Keller Professor of Behavioral Science and Marketing at the University of Chicago Booth School of Business, studies goals and motivations. It’s work that saw her serve as president of the Society for the Science of Motivation and the International Social Cognition Network and pen the 2022 book, Get it Done: Surprising Lessons from the Science of Motivation.

    In this Social Science Bites podcast, she tells interviewer David Edmonds that one tip for setting goals is to make them concrete. So, for example, resolving to ‘be a good husband’ works, but resolving to ‘be happy’ does not. ‘Being happy’ is just too abstract. “You need to get to the level of abstractness that is motivating … but not too abstract that it is no longer connected to an action,” Fishbach explains, adding that there must be “a clear connection between the goal and the means.”

    However, she continues, research suggests that people – while focused on the ends – tend to scrimp on the means. Fishbach notes that research on MBA students found they were willing to pay $23 for a particular book – but only willing to pay $11 for a tote bag that they knew also contained the book. The value of the bag, which was negligible but still an extra step to getting the book, was therefore negative. “Which makes no sense,” she acknowledges, “but it illustrates the point.”

    Goals, she says, should be things we can “do,” what we can achieve, as opposed to prohibitions on actions, those “do nots” that describe what we should avoid. “Do” prompts, she continues, “are more intrinsically motivating. You are more excited about them. It feels good and right.” Plus, focusing on what we’re avoiding puts that very thing front of mind – which makes it harder to ignore.

    Fishbach calls for measuring your “do” activities and setting targets. She cites a study showing that marathon finishing times in the United States are not evenly distributed but clump just below milestone times like three-and-a-half or four hours, suggesting runners push themselves to hit their personal targets.

    And where there are targets, there can be rewards. “Rewards work better than punishments,” she says, “but they don’t always work in the way they were intended to work.” If we incentivize the wrong things, behavior bends toward the incentive rather than the underlying goal.

    Oddly enough, “uncertain incentives seem to work better than known ones.” Fishbach was part of a research team that found people would work harder for a $1 or $2 prize, with the amount determined by a coin flip, than for a guaranteed $2 prize. “The excitement of resolving uncertainty is always better than the reward you are getting.”

    Other topics Fishbach addresses in this episode include internal motivations (immediate returns trumped longer-term rewards), how to sustain motivation, and whether we truly learn more from failure than success.

  • In this Social Science Bites podcast, interviewer David Edmonds asks psychologist Kathryn Paige Harden what she could divine about his educational achievements if all she knew about him was his complete genome. “Based just on your genetic information,” she starts, “I would be able to guess about as well as I would be able to guess if I knew how much money your parents had made per year when you were growing up.”

    Based on current knowledge drawn from recent samples in the United States, Harden estimates that an “educational attainment polygenic score” accounts for 15 to 17 percent of the variance in educational attainment, which is defined by years of formal education. The strength of that relationship is similar to the relationship between environmental factors such as family wealth and educational attainment, or between educational attainment and wages.

    Harden’s “guess” is about as educated as anyone in the field could make – she directs the Developmental Behavior Genetics Lab and co-directs the Texas Twin Project at the University of Texas. Her first book was 2021’s The Genetic Lottery: Why DNA Matters for Social Equality.

    One thing she stresses is that genetic influence on human behavior is not the single-factor ideal youngsters learn about in their first brush with Gregor Mendel and his pea plants.

    “Almost nothing we study as psychologists is monogenetic, influenced by one gene. It’s all polygenetic, meaning that there are thousands of genetic variants, each of which has a tiny probabilistic effect. If you add up all of that information, all of that genetic difference, it ends up making a difference for people’s likelihood of developing schizophrenia or doing better on intelligence test scores or having an autism spectrum disorder – but none of these things are influenced by just one gene.”

    Plus, that “polygenic score” varies based on environmental factors, such as whether you were raised in an authoritarian state. “If I had my exact DNA that I have now,” she details, “but I was raised in 1850s France compared to 1980s America, my educational output would be different, obviously, because my gender would have been interacting with those opportunity structures in a different way.”

    The more those structures evolve into ladders instead of roadblocks, the more utility we can derive from knowing the role of genetics.

    “The more we ‘level the playing field,’ the more that people have environments that are rich and conducive to their individual flourishing, the more we should expect to see, and the more in empirical practice we do see, the role of genetic differences in people.”

    In the shadow of the legacy of eugenics and other genetics-based pseudo-sciences, is harnessing that genetic influence for policy use good or bad? As Harden has experienced since her book was published, “you can’t really talk about genes and education without fairly quickly running into some contested issues about fairness and equality.”

    In fact, she argues that much of her work on heritability doesn’t so much answer social science questions as it “poses a problem for the social sciences.”

    In the podcast Harden discusses the Genome-wide Association Study, which she describes with a laugh as “a giant fishing expedition” in which researchers measure the DNA – the genotype – of thousands or even millions of individuals and then look across the genome, for what comes down to “a giant correlational exercise. Which genes are more common in people who are high on a trait versus low on a trait, or who have a disease versus don’t have a disease?”

    Harden also addresses the reasons she studies identical twins in her research, the cooption of genetic tropes to advance toxic worldviews, and how race – which she rejects as a proxy for genetic differences – plays out in the real world as opposed to the lab.

  • In the most innocent interpretation, suggesting someone should ‘do their own research’ is a reasonable bit of advice. But in the superheated world of social media discourse, #DoYourOwnResearch is a spicy rejoinder that essentially challenges someone to Google the subject since they clearly don’t know what they’re talking about. But Googling, social psychologist David Dunning pointedly notes, is not research. “The beauty and the terror of the internet,” he tells interviewer David Edmonds in this Social Science Bites podcast, “is that there’s a lot of terrific information, but there’s also a lot of misinformation and sometimes outright fraud.

    “People often don’t have the wherewithal to distinguish.”

    This distinguishing is an area where Dunning, a professor at the University of Michigan, does his own research.

    While doing your own internet sleuthing isn’t toxic on its face, Dunning suggests that often “you don’t know when you’re researching your way into a false conclusion, and … you don’t know when to stop. The real hard problem with DYOR is when do you know when to stop: you go and you look at a couple of web pages, and ‘Well, you’ve learned something! Terrific!’ But you don’t know how much there is behind it that you still need to learn.”

    One driver of DYOR, Dunning adds, is the idea that gaining (and deploying) knowledge is one’s own responsibility, which pretty much runs counter to science, which sees gaining knowledge as a collective enterprise.

    One piece of collective effort in which Dunning has made a very public mark is in describing what’s come to be known as the Dunning-Kruger effect, named for Dunning and fellow social psychologist Justin Kruger of New York University, after work they originally described two decades ago in “Unskilled and unaware of it: How difficulties in recognizing one's own incompetence lead to inflated self-assessments” in the Journal of Personality and Social Psychology. The popular definition of the Dunning-Kruger effect, Dunning explains, is that “people who are incompetent or unskilled or not expert in a field lack expertise to recognize that they lack expertise. So they come to conclusions, decisions, opinions that they think are just fine when they’re, well, wrong.”

    Dunning and Kruger’s initial research was based on simple tests – of grammar, logical thinking, classical psychology quizzes, even sense of humor – asking subjects how well they think they’re doing relative to everyone else. They found that the bottom 25 percent of participants tended to think they were doing above average. “But no.”

    “To know what you don’t know,” he offers, “you need to know what you need to know to realize that your thinking diverges from that.”

    It’s not true in every endeavor, he adds. “I’m a terrible golfer,” Dunning says. “And I’m fully aware that I‘m a terrible golfer!” The effect tends to show up when the skill of assessing outcomes is roughly similar to the skill of achieving outcomes. So when your golf ball flies into the nearby body of water, you don’t need special skills to know that’s bad.

    Becoming an expert in everything is out of the question; the real skill will be in identifying who is a legitimate expert and drawing on their insights. (And the right expert, Dunning notes, “is the right experts. With an S on the end.”)

    For the record, the pair – who just received the 2023 Grawemeyer Award in Psychology for their Dunning-Kruger effect work – did not name the concept after themselves, although, as Dunning says, they’re “tickled pink that our names will forever be associated with the nincompoops, incompetent ignorant cranks, if you will.”

  • Historically and into the present day, female workers overall make less than men. Looking at college-educated women in the United States, Harvard University economic historian Claudia Goldin studies the origins, causes and persistence of that gap, which she discusses in this Social Science Bites podcast.

    Goldin, whose most recent book is Career & Family: Women's Century-Long Journey toward Equity, details for host David Edmonds how the figures she uses are determined. Specifically, it’s the ratio of female-to-male weekly earnings for those working full-time and year-round, with the median woman compared to the median man. “Expressed in this way, there has been real progress” in the last century, she says. Today in the United States, where Goldin’s studies occur, that number is below 85 cents on the dollar.

    While that trend is good news, it’s not the whole story. “By expressing this gap in this single number we miss the really, really important dynamics, and that is that the gender earnings pay gap widens a lot with age and it widens a lot with [having] children, and it widens in the corporate, banking and finance, and law sectors.”

    And while the gap may have narrowed, it shows no evidence it’s about to close.

    Acknowledging the “persistent frustration” about the pay gap’s durability, Goldin points a finger at structural inequities, bias and sexual harassment, but she also argues that “greedy work” is a major factor. Greedy work “is a job that pays disproportionately more on a per hour basis when someone works a greater number of hours or has less control over those hours.” Hence, the gap persists “not so much [because] men and women go into completely different occupations,” she explains, but because women are financially “penalized” for choosing work that allows flexibility within that occupation.

    “The important point,” she adds, “is that both lose. Men are able to have the family and step up because women step back in terms of their jobs, but both are deprived. Men forgo time with their family and women often forgo their career.”

    But losers can win – eventually. The more that workers say to their supervisors that “we want our own time,” the more the labor market will change, she explains, pointing to current trends. One caveat, though, is that the situation is worse among women without college educations.

    Goldin is the Henry Lee Professor of Economics at Harvard University and was the director of the National Bureau of Economic Research’s Development of the American Economy program from 1989 to 2017. She is a co-director of the NBER's Gender in the Economy group.

    She was president of the American Economic Association in 2013 and was president of the Economic History Association in 1999/2000. She is a member of the National Academy of Sciences and the American Philosophical Society and a fellow of the American Academy of Political and Social Science, the American Academy of Arts and Sciences, the Society of Labor Economists (which awarded her its Mincer Prize for life-time contributions in 2009), the Econometric Society, and the Cliometric Society. She received the IZA Prize in Labor Economics in 2016, the 2019 BBVA Frontiers in Knowledge award, and the 2020 Nemmers award, the latter two both in economics.

  • Political economist and journalist Will Hutton, author of the influential 1995 book The State We’re In, offers a state of the field report on the social sciences in this Social Science Bites podcast. Hutton, who was appointed in 2021 to a six-year term as president of Britain’s Academy of Social Sciences, addresses various critiques of modern social science – especially in its British incarnations -- from host David Edmonds.

    As defined by the academy that he now heads, “social science is the understanding of society in all its dimensions,” and encompasses the societal, economic, behavioral and geospatial sciences. Despite that broad remit, the first question posed is whether social and behavioral sciences take a back seat to the natural sciences in the public imagination.

    Hutton, for his part, says no – although he does see them not always getting their due. He notes that in combatting the COVID-19 pandemic, yeoman’s work was conducted by social and behavioral science. “It wasn’t called social science, but it was driven by social science.” The same, he continues, is happening as Britain confronts its economic demons.

    “Academic prowess is a kind of team,” he details. “You need your humanities, you need your physical scientists, your natural scientists, your medical scientists and your social scientists on the pitch. Sometimes the ball falls to their feet and you look to them to make the killer pass.”

    One thing that might help in achieving that overdue recognition, he explains later, would be if the social sciences themselves shared their commonality as opposed to denying it. “[T]he Academy of Social Science was established 40 years ago, because we felt that good as the British Academy is, it couldn’t represent humanities and social science co-equally. Social science needed its own voice. Four decades on, I would say that social science’s standing in the world is higher than it was 40 years ago. But if [a score of] 100 is what you want to get to, we probably haven’t gotten beyond 20 or 30.”

    Impacting society, meanwhile, is how the sciences must improve their score (although Hutton acknowledges the vagaries of what impact looks like by saying “I’m not willing to castigate people if it looks as if what they are immediately doing is not impactful or having an impact.”) Asked what he sees as the “most fundamental issue” social science should tackle straightaway, Hutton offers four broad avenues to move down: economics, governance, changing behavior to keep the planet in good shape, and constructing a civil society of institutions that serve both individual and community needs. Among those, he concludes, “I think combining ‘the we and the I’ is the most important thing that social science can do.”

    Hutton’s wide-ranging answers follow from a wide-ranging career. He served as editor-in-chief of The Observer newspaper, was chief executive of the then Industrial Society, was principal of Hertford College, Oxford from 2011 to 2020, and has authored a number of bestsellers since The State We’re In: Why Britain Is in Crisis and How to Overcome It. Those books include 2008’s The Writing on the Wall: China and the West in the 21st Century, 2011’s Them and Us, 2015’s How Good We Can Be, and 2018’s Saving Britain: How We Can Prosper in a New European Future (written with Andrew Adonis).

  • There’s the always charming notion that “deep down we’re all the same,” suggesting that all of humanity shares a universal core of emotions.

    Batja Mesquita, a social psychologist at Belgium’s University of Leuven where she is director of the Center for Social and Cultural Psychology, begs to differ. Based on her pioneering work in the field of cultural psychology, she theorizes that what many would consider universal emotions – say, anger or maternal love – are actually products of culture. “We’re making these categories that obviously have things in common,” she acknowledges, “but they’re not a ‘thing’ that’s in your head. When you compare between cultures, the commonalities become fewer and fewer.”

    In this Social Science Bites podcast, she explains how this is so to interviewer David Edmonds. “In contrast to how many Western people think about emotions, there’s not a thing that you can see when you lift the skull – there’s no thing there for you to discover,” Mesquita says. “What we call emotions are often events in the world that feel a certain way … certain physical experiences.”

    She gives the example of anger.

    “In many cultures there is something like not liking what another person imposes on you, or not liking another person’s behavior, but anger, and all the instances of anger that we think about when we think about anger, that is not universal. I’m saying ‘instances of anger’ because I also don’t think that emotions are necessarily ‘in the head,’ that they’re inside you as feelings. What we recognize as emotions are often happening between people.”

    That idea that emotions are not some ‘thing’ residing individually in each of our collective heads informs much of Mesquita’s message, in particular her delineation between MINE and OUR emotions (a subject she fleshes out in depth in her latest book, Between Us: How cultures create emotion).

    MINE emotions, as the name suggests, are the mental feelings within the person. OUR emotions are the emotions that happen between people, emotions that are relational and dependent on the situation. Does this communal emotion-making sound revolutionary to many ears? Perhaps that’s because it deviates from the Western tradition.

    “We haven’t done very much research aside from university students in Western cultures,” Mesquita notes. “The people who have developed emotion theories were all from the same cultures and were mostly doing research with the same cultures, and so they were comfortably confirmed in their hypotheses.”

    Also, she continues, Western psychology looks at psychological processes as things, such as ‘memories’ or ‘cognition.’ “We like to think if we went deep enough into the brain we would find these things.

    “The new brain science doesn’t actually find these things. But it’s still a very attractive way to analyze human emotion.” Just, in her view, the wrong way.

  • In the West we routinely witness instances of intergenerational sniping – Boomers taking potshots at over-privileged and under-motivated Millennials, and Millennials responding with a curt, “OK, Boomer.” What do we make of this, and is it anything new?

    These are questions Bobby Duffy, professor of public policy and director of the Policy Institute at King’s College London, addresses in his latest book, Generations – Does when you’re born shape who you are? (published as The Generation Myth in the United States). In this Social Science Bites podcast, Duffy offers some key takeaways from the book and his research into the myths and stereotypes that have anchored themselves to generational trends.

    “My one-sentence overview of the book,” Duffy tells interviewer David Edmonds, “is that generational thinking is a really big idea throughout the history of sociology and philosophy, but it’s been horribly corrupted by a whole slew of terrible stereotypes, myths and cliches that we get fed from media and social media about these various differences between generations. My task is not to say whether it’s all nonsense or it’s all true; it’s really to separate the myth from reality so we don’t throw out the baby with the bathwater.”

    One thing he’s learned is that the template for generational conflict is fairly standard over time, even if the specifics of what’s being contested are not.

    “The issues change,” he explains, “but the gap between young and old at any one point in time is actually pretty constant. … We’re not living through a time of particularly ‘snowflake,’ ‘social justice warrior’ young people vs. a very reactionary older group – it’s just the issues have changed. The pattern is the same, but the issues have changed.”

    Taking a look at climate change, for example, he notes that there’s a narrative that caring young people are fighting a careless cadre of oldsters unwilling to sacrifice for the future good. Not so fast, Duffy says: “The myth that only young people care about climate is a myth. We are unthinkingly encouraging an ageism within climate campaigning that is not only incorrect, but it is self-destructive.” That example, he notes, adds evidence to his contention that “the fake generational battles we have set up between the generations are just that – they are fake.”

    In the podcast, Duffy outlines the breakdowns his book (and larger society in general) uses to identify cohorts of living generations:

    Pre-war generation: those born before the end of World War II in 1945. Duffy says this could be broken down further – the so-called Silent Generation or the Greatest Generation, for example – but for 2022 purposes the larger grouping serves well.
    Baby Boomers: born from 1945 to 1965.
    Generation X: 1966 to 1979. (This is Duffy’s own generation, and so, with tongue in cheek, he calls it “the best generation”!)
    Millennials: 1980 to around 1995.
    Gen Z: ending around 2012.

    He notes that people are already talking about Generation Alpha, but given that generation’s youth it’s hard to make good generalizations about them.

    These generation-based groupings are identity groups that only some people freely adopt. “We’re not as clearly defined by these types of groupings as we are by, say, our age or educational status or our gender or our ethnicity.” His research finds between a third and half of people do identify with their generation, and the only one with “a real demographic reality” (as opposed to a solely cultural one) is the Baby Boomers, who in two blasts really did create a demographic bulge.

    Duffy, in addition to his work at King’s College London, is currently the chair of the Campaign for Social Science, the advocacy arm of Britain’s Academy of Social Sciences. Over a 30-year career in policy research and evaluation, he has worked across most public policy areas, including being seconded to the Prime Minister’s Strategy Unit. Before joining KCL he was global director of the Ipsos Social Research Institute.

    His first book, 2018’s The Perils of Perception – Why we’re wrong about nearly everything, draws on Ipsos’s own Perils of Perception studies to examine how people misperceive key social realities.