Bonus Episode Transcript - Simine Vazire

Samara Greenwood: Hello and welcome to a very special bonus episode of the HPS podcast with Professor of Psychology Simine Vazire, discussing the ways in which HPS scholars and scientists can work together to create better science.

We are releasing this episode now to coincide with a campaign put together by Simine and others to support the legal defence of Data Colada, a group of professors who have worked hard to identify concerns about the integrity of published research.

Members of Data Colada are being sued by Francesca Gino, a Harvard Business School professor, after they published blog posts raising concerns about the data integrity of four papers on which Gino was a co-author. These blog posts were published only after Harvard Business School had completed their own investigation, and Gino had been placed on administrative leave.

Simine is one of the key organizers of the GoFundMe campaign to support the legal costs of Data Colada. As the group says, "defending science requires defending legitimate scientific criticism against legal bullying."

In this podcast episode, my co-host Indigo Keel talks with Simine about her connection to history and philosophy of science, about issues that have arisen out of the replication crisis, and about cases of alleged scientific misconduct, including the Francesca Gino case, highlighting the challenges faced by scholars today who attempt to improve the quality, reliability, and objectivity of science.

Indigo Keel: Hi all, from the team here at the HPS podcast, we welcome you to another episode. I'm your host Indigo Keel and my co-producer is Samara Greenwood. Today we welcome Simine Vazire to the podcast. It's a bit of a special episode as Simine is not officially an HPS scholar, but her work in psychology revolves around principles commonly discussed in the philosophy of science.

What is the state of psychology and social sciences today? How can we - both in HPS and in the sciences - work together to create a better science? And what is the value of HPS to those who aren't strictly in the field?

Hi Simine, welcome to the HPS podcast.

Simine Vazire: Hi, thanks for having me.

Indigo Keel: I'm going to start with a slightly different question to what we normally do. What is your connection to HPS?

Simine Vazire: So, my connection to HPS is kind of as an outsider or, I would say, like an amateur philosopher of science. I don't have any formal training in HPS and my background is in psychology. I started connecting with people who do philosophy of science pretty early in my career, but for different reasons than I connect to them now. So, I had kind of several waves of connections to HPS. Early on, soon after my PhD, I joined a reading group with mostly philosophers who were interested in philosophy of social science and especially how to measure social science constructs like happiness - things that present a lot of interesting challenges of how to measure them.

This was when I lived in St. Louis, so we had this reading group with philosophers Anna Alexandrova and Dan Haybron and some other people. It was really fun and really stretched my mind to think about some of the issues that psychologists were talking about, but this was on a much deeper level and much more fundamental questions. I never really did much with that, I mostly just listened and kind of absorbed stuff.

Then, when the replication crisis hit psychology around 2011-2012, I started asking much more fundamental questions, like - Is psychology even a science? Is what we're doing scientific? What does it mean to be committed to scientific values?

Within psychology we were having a lot of debates. There were very different views about whether we were in a crisis and whether what we were doing was okay. I felt like we were talking in circles, and it became pretty clear that we weren't going to get anywhere unless we established what we had in common, what values we shared, and whether we could sort out our disagreements by appealing to those core values.

A lot of the debates early on were about whether we have an obligation to be transparent in our work. There was a lot of resistance to that - that it indicated mistrust, that we should trust each other, and things like that. I found a lot of answers, or support, in philosophy of science and in HPS scholarship about what it means to have scientific values and to enact those values and so on. For me, it helped a lot with communicating within my field about what we owe to the public, what we owe to society, if we want to be called a science, if we want to be scientific. So I started leaning heavily on history and philosophy of science to get through those debates and discussions, which my field is still, I think, grappling with a lot.

I'd be curious to know whether a historian or philosopher of science thinks that what I'm doing now is HPS. I feel like I'm using a lot of HPS stuff to make arguments both in my scholarship but also in my kind of activism to try to get my field to raise its standards. I'm an observer and fan of HPS, I would say, and maybe dabbling in amateur HPS.

Indigo Keel: As we all, in HPS, think that more scientists should be doing. So, what do you see as the value of HPS?

Simine Vazire: In my experience, a lot of people in my field, at least - I don't know if I should say social scientists or psychology researchers - but a lot of us, and I suspect it's not specific just to us, don't really think about what it is we're signing up for when we sign up to be researchers or scientists. It's kind of shocking, actually, when you think about it.

I had taught undergraduate research methods for many, many years to psychology students, and actually it was a required class for many other majors besides psychology, too. For many of the students, it was the only class they were taking that kind of got to the fundamentals of like, "what do you have to do if you're saying that you're being scientific?"

I was teaching out of a textbook that talked about the scientific method, and I was repeating that stuff for a long time. But then, because of the replication crisis, because of the crisis in my field, I started rethinking that and asking myself, "Well, what is it that makes us scientific?" I learned about the demarcation question and realized, "Okay, I don't necessarily want to solve that big question," but I did want even just a more practical answer to it, and HPS, and philosophers of science more generally, are the people grappling with that question. I found the most satisfying answers in that field, specifically in social epistemology and ideas about how objectivity emerges out of a social process. That resonated a lot with my experience in the field, so I stopped teaching that what makes something scientific is the scientific method, and it changed a lot about how I approach my defence of psychology as a science. But it also really made me question whether or not psychology is a science.

I'm really on the fence about that, and I think that it's something we actually shouldn't take for granted. We shouldn't just repeat that we're scientific, and we use a scientific method, and so on, but actually ask ourselves - what is our commitment to being scientific? And how does that differ from pseudosciences or others who might claim to be a science? But we think that we're different than them.

Indigo Keel: Yeah, absolutely.

I was hoping you could talk a bit to what it is that you're doing day to day, whether we call it HPS or not, and what its relation to HPS is?

Simine Vazire: So, my day to day life is pretty weird for a social scientist, I would say, because I have started spending most of my time on these issues of improving my field and trying to get back to fundamental values and rethink things - question everything almost in our research practices, our publication practices, all of that from basics. Like, starting with the basics of what are our core values and then how would we build a system if we really were trying to increase our scientific-ness? I don't know. I do research on that.

So, together with my colleagues in my lab, we do research on things like - What are the common research practices in the field? And do those match our stated values, what we tell society is the value of our field and how we prioritize things?

So, for example, just to take a very concrete project, Beth Clarke, a PhD student here in psychology, did a project looking at what limitations authors report in their empirical psychology papers. And I think that's a really good way of holding the mirror up to ourselves to say, "Okay, what do we state publicly in our papers as the biggest weaknesses in our research, and does that match other evidence about where our biggest flaws are?"

Are we really owning our limitations? And also, how do we talk about them? Are we excusing them? Are we brushing them aside? Versus really grappling with them and talking about the implications of our limitations for our conclusions and actually constraining our conclusions in line with the limitations we acknowledge.

I think we would all agree that scientists ought to really grapple with their limitations, own them, incorporate them into their conclusions, and so on.

And so, if we find that that's not what researchers are doing - for example, in their limitations they might say our design doesn't allow us to make causal inferences, but then their title has a causal claim in it, or something like that - I think it's really useful to hold the mirror up and say, "we say we want to live up to this standard, but this is what we're doing, so we need to reconcile that."

We do a lot of research like that. Just trying to do very, very descriptive research about what is actually getting published, what are we actually doing in our field, and does that live up to what our standards are and what we want the public to see us as doing.

A lot of it is about research practices, but some of it is also about publication practices and peer review and journals. I find that we've made more progress as a field on research practices - on what we expect individual researchers to do - and there's a lot more scrutiny of that now and more discussion of what the best practices are, but we kind of let journals and reviewers and editors off the hook.

It's the same people really in the field, but we don't put as much scrutiny on -

Is peer review actually incentivizing the right practices, and is it held accountable itself? If a journal has a lot of prestige, that means it has a lot of power to make or break people's careers. Are we making sure that they're actually handling that power appropriately by evaluating papers on the dimensions that the field thinks should be valued, and rewarding the right kinds of practices?

This is less on my research side and more on my, I guess activism, for lack of a better word - service to the field. I don't know if everyone sees it as a service to the field, but I do try to do a lot of raising questions and poking at journals and publication practices and trying to improve those, both in my own editorial work, so I'm a journal editor myself, and I try to implement better policies and practices at the journals that I have influence over, but also just doing research on journals.

For example, one project we're doing looks at how often editors publish in the journal that they're the editor of - counting how frequent that is, how it differs across journals and over time, and things like that. And again, we don't have an answer for what it should be, but we think it's important to know what it is, and then to ask: is that what we want as a field? What do we want the norms to be?

So in my research and in my service - the things you do when you're not doing research or teaching - I'm doing a lot of stuff that I guess is like "HPS in action". I know there's a conference on Philosophy of Science in Practice, and maybe that's the right label for it. Although I've never been to the conference and I don't know what the people who use that label think of themselves as doing, I feel like that's kind of what I'm doing in my research and in my activism.

Indigo Keel: Brilliant. What is a topic that you believe would be of interest and value to a broader audience and how does it relate to the work that you do with HPS?

Simine Vazire: I think one really important topic is around science communication and I think that a lot of scientists think of science communication as giving away their findings or even selling the importance of their field and their research.

But I don't think there's enough attention on the incentives to exaggerate in those contexts.

I don't think researchers intentionally exaggerate. But if I think about what advice and training we get about science communication, it's almost all in the positive direction of how to reach more people, be more convincing, and have more impact. And that's within professional societies and within the university. Impact is always seen as a good thing. Influence is always seen as a good thing. I find that really disturbing because obviously researchers are already going to have a kind of motivated reasoning and biases in favour of their findings and their hypotheses and their pet theories and so on. You don't really need to incentivize them even more to sell those things. If anything, I think we need some safeguards to stop researchers exaggerating, because it's so hard for anyone else to check - especially people outside the field, but even within the field, we aren't experts on each other's theories and constructs.

Obviously, the authors are often going to have more expertise than almost anyone else. So, if they come out and say “it has these implications, it should lead to these public policies or you should change this thing in your life because of my findings” very few people are in a position to question that with a lot of authority.

We really need to place a lot of the burden on not exaggerating, not over claiming, on the researchers themselves.

And right now, I think most researchers are getting the opposite message: that they should be making as big a deal as possible about their findings. They're going to get promoted more if they do that. They're going to get more grants. In popular topics like psychology, they have opportunities for TED Talks and book deals and speaking gigs and things like that, which can bring in a lot of money as well as fame and so on.

I don't think there's enough discussion of what guardrails we have around that, what incentives we have to balance out the incentives to really hype up one's work. And that's something where I think HPS could have a role. I mean, certainly the intersection of scientists and the public and what scientists owe the public, not just in terms of their research, but in calibrating their claims and being responsible and keeping themselves in check because it's very hard for anyone else to.

Indigo Keel: Yeah, absolutely. And I was wondering how this relates back to the field of HPS. Why is the work that you do on trying to change or adjust the way that science works in practice important to those who work in HPS?

Simine Vazire: Yeah, that's maybe a question for people in HPS to answer. I just see so clearly how much we need really good scholarship on the core values and ethical responsibilities of scientists. We can't just count on scientists to feel a responsibility to do those things. We need actual incentives and guardrails and checks and balances.

And I see HPS as one of the disciplines that's core to making that case that trust in science, the value of science, really depends on keeping it calibrated, keeping it honest, and that's not going to happen naturally just because scientists are noble. So, what are the social processes that are going to keep that balance?

I think that's a problem that scientists aren't going to solve. First of all, they don't have the incentives to solve it, but they also don't have the expertise. And so, I think it really needs to be a collaboration with scholars of science who can see the bigger picture, who can see the systemic issues, and who are a little bit outside of it, to be able to help us come up with systemic solutions to these challenges, identify when things are going off the rails, etc.

It's interesting, talking to some scholars of science, HPS, and other similar disciplines. Sometimes I get the reaction that, “that's not our job, we don't fix disciplines.” But I think that's underselling the value of their work.

Even if HPS scholars see themselves as working on more basic issues, not applied issues like identifying scientific communities that have gone wrong and helping them fix themselves, their work does have very clear relevance for that. And there's such a gap. There are just no solutions within science, I think, for that kind of thing. It has to come from these disciplines that study science a bit from the outside. So, I hope that HPS people see a connection and see that even their very basic, abstract intellectual work has really practical implications for fields that are struggling. And I think every field should struggle with this question.

Some aren't reflecting enough to know that they should struggle with this question - this question of “Are we doing enough to stay calibrated, to stay scientific, to not let other incentives drag us, pull us, more towards hype and exaggeration?”

Indigo Keel: In talking about the crisis of replication in psychology and the ways in which you are working to remedy this, how would that be of value to a broader audience, to people in the everyday?

Simine Vazire: I suspect that a lot of people in their everyday life see a lot of headlines about psychology or other social sciences, especially, that they're kind of sceptical about.

I think that it's not that hard to laugh a little bit at some of the psychology headlines. And in the USA there's this comedian, Paula Poundstone, who often makes social science headlines the butt of her jokes. I used to listen to her on NPR's show, 'Wait, Wait, Don't Tell Me', and I think she speaks for a lot of people, kind of rolling her eyes at some of the headlines that come out. It's not just psychology, obviously. Nutrition science was often the butt of jokes because one week something's good for you and the next week it's bad for you, or things like that. And so there's the question of: should we trust these people? Are they policing themselves? Are they keeping themselves honest? Are they committed to actually getting things right? And then there's the question of how an outsider - whether it's just a member of the public who's interested, or someone who actually needs this research, like someone working for a non-profit who needs to know how to change people's behavior - can evaluate whether a scientific community deserves their trust.

I think it's going to be very hard for an outsider to evaluate individual findings. Maybe that's not an attainable goal. But I think it's possible for outsiders to evaluate scientific communities and be able to judge their culture and their values and their priorities. Try to look for really concrete signs that they are prioritizing getting it right rather than just being popular or something like that. And I think that's something that psychology is grappling with right now and watching how we deal with this replication crisis I think could inform more broadly how the public should allocate its trust in different scientific disciplines and communities.

Indigo Keel: I'm wondering if there's any examples of the replication crisis that have popped up recently.

Simine Vazire: Most of the replication crisis is about misconduct of the very light variety, where researchers aren't aware that they're engaging in bad practices. So, things like p-hacking - you might have heard that term - where researchers are unintentionally tweaking their data to get the result they want.

But it goes all the way to fraud and more serious misconduct which is, again, not the main problem, but they're connected.

I think the same incentives that lead researchers to inadvertently massage their results could also lead researchers to more consciously falsify or fabricate their results.

There's a couple of cases recently that have gotten a lot of attention. In my field of psychology and behavioral science, there's some papers by researchers Francesca Gino and Dan Ariely, and many co-authors of theirs as well.

Some of the papers have been retracted. Francesca Gino has been placed on leave without pay by Harvard University after an investigation. She has since sued Harvard and the researchers who uncovered the evidence of problems in her and her co-authors' papers and publicised that evidence - at least, the ones who identified themselves; some remained anonymous. So that's become a really big deal in the last few weeks in my world.

There's the question of, 'Yeah, what happened with those papers?' We still don't really have clear answers. I think pretty much everybody agrees that fraud happened, but everybody's saying, 'It wasn't me.'

And then, I mean, honestly, almost the bigger story is the fact that Gino has sued.

So, suing Harvard, I don't think much of that because Harvard's rich, they can afford it. If they didn't do anything wrong, they can defend themselves. But suing individual researchers! In this case it's three bloggers who are researchers themselves in the same field, and - full disclosure - they're friends of mine, but they're friends of mine because I respect their work. We met each other through working on Replication Crisis stuff. Suing them for posting very professional scholarly criticisms of her work - that seems crazy to me. That seems like clearly the point is to punish them and scare off other people who might do the same thing. There's nothing in their blog posts or anything they've said publicly that comes close to being defamatory, in my opinion, and that's what she's suing them for.

There are already so many disincentives to critique each other's work that with the threat of a lawsuit on top of that, I'm just so depressed about how hard it's going to be for anyone to ever say anything critical.

The other part of it that really upsets me is that her allegations include allegations of sexism against both Harvard and the researchers who blogged about her work.

She brings up another case that got a lot of attention maybe five or ten years ago, Power Posing, where Amy Cuddy, the author who popularised the idea (she has the second most viewed TED Talk ever), also accused her critics of sexism. But interestingly, the first author on that work, who's also a woman, did not accuse her critics of sexism, and in fact publicly said that, yes, they did p-hack in their work, that she no longer stands by it, and that other people should not believe it anymore. So it's kind of interesting: the second author took up this victim position of 'it's sexism, that's why she's being criticized', when even the first author of the paper admits that the criticism is valid and is, in fact, convinced by it.

But what really upsets me about using sexism as a defence is that obviously there is a lot of sexism, and there's sexism in science, and there's sexism in scientific criticism, even. But I don't believe that the instances that they're pointing to, these blogs by these researchers, are examples of sexism. I don't think there's sexism motivating them. I don't think there's any evidence that there is. And to me, it minimizes the real problem of sexism and bullying and unfair treatment in science and in science criticism, which exists for real and probably even happened to Gino and to Amy Cuddy. I'm not saying it didn't, but the specific instances they're pointing to of really, I think, professional careful criticism are not instances of that.

And I think blurring that line does a disservice to women and to victims of sexism, and it does a disservice to criticism in science. It's going to be very hard for us as a field to be taken seriously if even extremely careful calibrated criticism is painted as inappropriate or crossing a line.

The other thing it might do is that it might lead to people being more hesitant to critique women's work, and then women scholars have a harder time earning the same credibility as men scholars because everyone understands that we're being handled with kid gloves or something like that. I mean, I don't think it's at that point, but I worry that that's one of the kind of downstream effects of painting completely appropriate careful criticism as biased or inappropriate.

Indigo Keel: How would you say that HPS and psychology can help each other moving forward?

Simine Vazire: One thing I want to make sure I say is that I really, really want to encourage students especially, but really anyone who's interested in building this bridge between HPS and science, especially psychology, because that's my field.

I think that people who have training in both are rare and extremely valuable, and some of the people I've learned the most from are people who've had some exposure to both fields, HPS and then my home discipline.

That probably goes for any other discipline that someone wants to specialize in both. I think those people bring a really unique perspective and toolkit to the work. And so, if anyone out there is thinking of maybe having a foot in both worlds, I think that's a really valuable career path.

Indigo Keel: I think so too, but I might be biased. Thank you so much for coming on the HPS podcast, Simine.

Simine Vazire: Thanks for having me.

Indigo Keel: Thank you for listening to the first season of the HPS podcast, where we discuss all things history, philosophy, and social studies of science. We want to thank the School of Historical and Philosophical Studies at the University of Melbourne for their support. To learn more, check out our website, where you can also find links to our blog and our social media, as well as show notes for today's topic. I'm Indigo Keel and my co-producer is Samara Greenwood. We look forward to having you back again next time.