
Past seminars 2017

Wednesday 29 November

Questionable Research Practices in Psychology


Professor Franca Agnoli (Psychology, University of Padua, Italy)


Questionable research practices (QRPs) increase the likelihood of finding evidence in support of a hypothesis, but the evidence may be spurious. John, Loewenstein, and Prelec (2012) surveyed academic psychologists at U.S. universities and found that a surprisingly large percentage had engaged in QRPs. We investigated the prevalence of these practices within the Italian psychological research community. We surveyed 277 members of the Association of Italian Psychologists (AIP) regarding their use of the same 10 QRPs studied by John et al. The results are strikingly similar to those obtained for U.S. psychologists, showing that QRPs are about equally widespread in both research communities. For example, more than 50% in both research communities reported that they had decided whether to collect more data after first checking whether the results were significant, whereas less than 3% in both communities reported that they had falsified data. In this talk, we will describe the ten QRPs studied and the frequency of their use within the Italian research community. The frequencies of use differed systematically across QRPs, and some respondents explained in a free-text section of the survey why they considered the use of more frequently adopted practices to be justifiable. These results confirm that extreme forms of scientific misconduct are rare, but a large percentage of psychologists employ questionable practices. Some researchers consider these practices justifiable in certain circumstances, despite the risk of finding spurious evidence in support of the research hypothesis.
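
To see concretely why one of the most widely reported practices above (deciding whether to collect more data after first checking significance) can generate spurious evidence even without fraud, here is a minimal simulation, added for illustration and not drawn from the talk. With no true effect at all, this kind of optional stopping pushes the false-positive rate above the nominal 5% level; the sample sizes and number of runs below are arbitrary.

# Minimal illustration (not from the talk): "collect more data after checking
# significance" inflates the false-positive rate even when there is no effect.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_sims, n_initial, n_extra = 5000, 20, 20      # arbitrary illustrative choices
false_positives = 0

for _ in range(n_sims):
    group_a = rng.normal(0, 1, n_initial)      # both groups drawn from the
    group_b = rng.normal(0, 1, n_initial)      # same distribution: no true effect
    _, p = stats.ttest_ind(group_a, group_b)
    if p >= 0.05:                              # not significant yet -> add more data
        group_a = np.concatenate([group_a, rng.normal(0, 1, n_extra)])
        group_b = np.concatenate([group_b, rng.normal(0, 1, n_extra)])
        _, p = stats.ttest_ind(group_a, group_b)
    if p < 0.05:
        false_positives += 1

print(f"false-positive rate: {false_positives / n_sims:.3f} (nominal level: 0.05)")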

Wednesday 15 November

Ω


Professor Alan Hajek (Philosophy, ANU)


Probability theory is the dominant approach to modeling uncertainty. We begin with a set of possibilities or outcomes, usually designated ‘Ω’. We then assign probabilities—real numbers between 0 and 1 inclusive—to subsets of Ω. Nearly all of the action in the mathematics and philosophy of probability for over three and a half centuries has concerned the probabilities: their axiomatization, their associated theorems, and their interpretation.
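
For readers who have not met the formalism, the standard setup Hájek alludes to is the Kolmogorov probability space; the die-roll example of coarser and finer choices of Ω is added here for orientation and is not taken from the talk.

% A probability space is a triple (Omega, F, P): a set of outcomes, a family of
% events (subsets of Omega), and an assignment of probabilities to those events.
(\Omega, \mathcal{F}, P), \qquad P(A) \in [0,1], \qquad P(\Omega) = 1, \qquad
P\Big(\bigcup_i A_i\Big) = \sum_i P(A_i) \quad \text{for disjoint } A_i \in \mathcal{F}.

% The same experiment admits coarser or finer-grained choices of Omega,
% e.g. for a single roll of a die:
\Omega_{\text{coarse}} = \{\text{even}, \text{odd}\}, \qquad
\Omega_{\text{fine}} = \{1, 2, 3, 4, 5, 6\}.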


I want instead to put Ω in the spotlight. Ω is a set of possibilities, but which possibilities? While the probability calculus constrains our numerical assignments, and its interpretation guides us further regarding them, we are entirely left to our own devices regarding Ω. What makes one Ω better than another? Its members are typically not exhaustive—but which possibilities should be excluded? Its members are typically not maximally fine-grained—but how refined should they be? I will discuss both philosophical and practical problems with the construction of a good Ω. I will offer some desirable features that a given Ω might have, and some heuristics for coming up with an Ω that has them, and for improving an Ω that we already have.


Alan Hájek’s research interests include the philosophical foundations of probability and decision theory, epistemology, the philosophy of science, metaphysics, and the philosophy of religion. His paper “What Conditional Probability Could Not Be” won the 2004 American Philosophical Association Article Prize for “the best article published in the previous two years” by a “younger scholar”. The Philosopher’s Annual selected his “Waging War on Pascal’s Wager” as one of the ten best articles in philosophy in 2003. He is a Fellow of the Australian Academy of the Humanities. He was the keynote speaker at the 2007 Chinese Analytic Philosophy Association conference, Wuhan. He was the President of the Australasian Association of Philosophy in 2009-2010. He received the 2012 Award for Excellence in Supervision, ANU College of Arts and Social Sciences. In 2013 he won the ANU-wide Vice-Chancellor’s Award for Excellence in Supervision.

Wednesday 8 November

Dual Seminar


The Internet of Things - An Ontological Conversation


Paul Siemers (HPS, University of Melbourne)


The Internet of Things (IoT) is much heralded as a technological revolution, with tens of billions of things – from cars to light bulbs to artificial hearts – being connected via the internet.  IoT is widely credited with the potential to transform societies, economies, and human experience.  But discussion of what IoT really is – the ontology of IoT – is limited and often facile.  The aim of this research is to elucidate the ontology of IoT by bringing the IoT literature into dialogue with theories on the ontology of technology drawn from the philosophy of technology and from STS.


Paul Siemers holds an Honours Degree in Information Systems and a First Class Honours degree in Mathematics from the University of Cape Town, and an MBA from Deakin University. He has more than twenty years’ international experience as a technology strategist, and is currently the Manager of Digital Strategy, Architecture and Governance at Melbourne Water.


Sex, Sexism and Goolags: Initial Data from an Exploration of Online Conversations about James Damore's Diversity Memo


Ben Wills (HPS, University of Melbourne)


On August 5, 2017, a ten-page internal memo circulated by Google software engineer, James Damore, was leaked to the public. In it, Damore argued that Google’s approach to workplace diversity privileges certain identities (e.g., gender, race) over others (e.g., ideology) and ignores potential biological explanations for the gender gap in software engineering. He suggested that this is maintained through a “politically correct monoculture” where “some ideas are too sacred to be honestly discussed.” The public’s response to the memo, and his firing soon after it was leaked, was remarkable. Many saw rehashed themes of sexism and yet another example of science being used to naturalize inequality, while others were disturbed that he could be fired for writing something they saw as fact and noncontroversial. The very public discussion surrounding Damore’s memo and firing affords an opportunity to study the way the public entertains, engages with, and argues about the science of sex and gender differences and workplace diversity. This presentation offers a peek at our first forays into this subject.


Ben Wills is a Maguire Fellow at the University of Melbourne, studying public perceptions of gender science. He has previously worked as a legal assistant, psychology research assistant, and facilitator for men’s anti-domestic violence groups. He studied cognitive science at Vassar College.

Wednesday 1 November

Abductive Reasoning in Psychology


Professor Brian Haig (Psychology, University of Canterbury, New Zealand)


A broad theory of scientific method is sketched that has particular relevance for psychology and the behavioural sciences. This theory of method assembles a complex of specific strategies and methods that are used in the detection of empirical phenomena and the subsequent construction of explanatory theories. A characterization of the nature of phenomena is given, and the process of their detection is briefly described in terms of a multistage model of data analysis. The construction of explanatory theories is shown to involve their generation through abductive, or explanatory reasoning, their development through analogical modelling, and their fuller appraisal in terms of judgments of the best of competing explanations. The nature and limits of this abductive theory of method are discussed in the light of relevant developments in scientific methodology.

 

Brian Haig is a Professor in the Department of Psychology at the University of Canterbury, and a Visiting Professor in the Department of Education at the University of Bath. He is a theoretical psychologist who has published numerous articles in psychology, education, and philosophy journals on the conceptual foundations of quantitative and qualitative research methods, and the nature of psychological science more generally. He is the author of Investigating the Psychological World (MIT Press, 2014) and The Philosophy of Quantitative Methods (Oxford University Press, 2017), and co-author of Realist Inquiry in Social Science (Sage, 2016). He is a Fellow of the Association for Psychological Science and the New Zealand Psychological Society.

Wednesday 25 October

It’s a Euclidean man’s world: The ‘geometric order’ in the Dutch Republic


Dr Gerhard Wiesenfeldt (HPS, University of Melbourne)


At least since Alexandre Koyré’s account From the Closed World to the Infinite Universe, the concepts of absolute time and absolute space – emerging during the seventeenth century – have been given a central role in the development of modern science. In the early modern era, the notion of absolute space was often interpreted as an outcome of Euclidean geometry. However, Euclidean geometry also had many other functions. It served as a paragon of truth and the human ability to achieve certain knowledge; it thus became a model for other kinds of knowledge to echo – from optics to ethics and even poetry. In particular, this pertained to a new rhetorical, i.e. argumentative, ideal of science as based on the axiomatic method. But Euclidean geometry was also used in a very practical way for tasks in surveying, navigation, engineering, and architecture. Hence, Euclid’s Elements were read not only by scholars, but also by mathematical practitioners, artisans and aspirational craftsmen, who in turn could use their mastery of geometry to increase their social status.

 

In my talk, I will look at the different ways Euclid was used, read, and discussed in the Dutch Republic – a country in which the Elements had a particularly strong impact. I will consider the different audiences of Euclid and their understanding of geometry. A key interest will be the question of the extent to which Euclidean geometry was seen as an expression of practices that had been present in the Dutch life-world for generations and were now reinterpreted and reshaped in the geometric order.

Wednesday 18 October

The Evolution of Rationality


Dr Stephen Ames (HPS, University of Melbourne)
 

The non-reductive physicalist N. Murphy, informed by the work of F. Dretske, D. Campbell and J. Kim, offers an account of how a sequence of mental events ordered in terms of reasons can be reconciled with an account of those same events connected by neurobiological causes. Reconciliation is one thing, but Murphy’s goal is to tell this story without having to presuppose rational agency. I will argue that Murphy’s account fails and that this conclusion has implications for an account of the evolution of rationality, which I will examine in discussion with the work of W.S. Cooper, J. Wilkins and P. Griffiths, and P. Raatikainen.

Wednesday 11 October

Using Concepts as Experimental Tools: Mental Imagery and Hallucinations

 

Eden Smith (HPS, University of Melbourne)

PhD Completion seminar


Using the concepts of mental imagery and hallucinations respectively, neuroimaging experiments can investigate discrete types of sensory-like mental phenomena (SLMP). Comparing these experiments highlights a known puzzle – equivalent neuroimaging data can support diverging knowledge-claims. I have taken a three-step approach to this puzzle: examining how mental imagery and hallucinations are used as independent concepts that reliably individuate discrete experiences of SLMP; exploring the intersecting histories within which mental imagery and hallucinations emerged as independent concepts; and comparing how each concept is used within the documented design and implementation of neuroimaging experiments. In this talk I will draw these analyses together by building on converging insights from diverse accounts of scientific practice. In doing so, I will propose that the structured uses of the concepts of mental imagery and hallucinations function as taken-for-granted tools that can actively contribute to the heterogeneous dynamics of neuroimaging experiments.

Wednesday 4 October

Cultural Canalization: Robustness without Genetic Change


Dr Rachael Brown (Director of The Centre for Philosophy of Sciences, ANU)


The reliable and robust development of phenotypic traits is typically taken to be an indication of their development and inheritance being genetically underwritten. One mechanism that can generate such reliable and robust development is the process of “Genetic Canalisation” famously proposed by Waddington (1942). In genetic canalisation the cumulative effects of persistent selection produce a suite of genetic resources that buffer trait development against genetic and environmental change. In this paper, we offer another route to developmental robustness in “Cultural Canalisation”. Using human social cognition and reading as examples, we will demonstrate how a robust cognitive phenotype can evolve over time via the effects of stabilising selection on culturally constructed developmental channels. 

This paper represents joint work undertaken with Richard Menary, Macquarie University.

Wednesday 27 September

Can we be rational? Computational complexity and human decision-making


Dr Carsten Murawski (Brain, Mind & Markets Laboratory, School of Finance, University of Melbourne)


Most modern theories of decision-making are based on the rationality principle, which postulates that decision-makers always choose the best action available to them. The principle does not take into account the difficulty of finding the best option. Here, we propose that computational complexity theory provides a framework for defining and quantifying the difficulty of decisions. We review evidence showing that human decision-making is affected by computational complexity. Building on this evidence, we argue that most models of decision-making, and of meta-cognition, are intractable from a computational perspective. We argue that in order to be plausible, future theories of decision-making will need to take into account both the resources required for implementing the computations implied by the theory and the resource constraints imposed on the decision-maker by biology.
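
As one concrete illustration of the kind of difficulty at stake (added here; it is not claimed to be the task used in the speakers’ research), consider choosing the best affordable bundle of options. The only generally reliable exhaustive strategy checks every subset, and the number of subsets doubles with each extra option, which is the sense in which “choosing the best action” can be computationally intractable.

# Illustrative sketch: exhaustively finding the best affordable bundle of items
# (a 0/1 knapsack problem) examines all 2**n subsets of n items.
from itertools import combinations

def best_bundle(values, costs, budget):
    """Brute force: return the highest-value subset of items within budget."""
    n = len(values)
    best_value, best_items = 0, ()
    for r in range(n + 1):
        for items in combinations(range(n), r):
            cost = sum(costs[i] for i in items)
            value = sum(values[i] for i in items)
            if cost <= budget and value > best_value:
                best_value, best_items = value, items
    return best_value, best_items

# Toy numbers (arbitrary): 4 items means 16 candidate bundles; 20 items means
# over a million; 60 items means more than 10**18.
print(best_bundle(values=[10, 7, 4, 3], costs=[5, 4, 3, 1], budget=8))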

Wednesday 20 September
 

Imagining Solar Geoengineering


Dr Jeremy Baskin (Melbourne School of Government, University of Melbourne)


Solar geoengineering is a highly controversial proposed climate policy aimed at cooling the planet by injecting sulphate aerosols into the stratosphere to reflect incoming sunlight. It has a number of promoters, including in parts of the climate science epistemic community. It is not yet a deployed technology, nor has it been officially embraced by governments or the IPCC.


My interest is in understanding the ways in which scientific knowledge production is deeply enmeshed with power and values. This is especially evident in the making of new technologies such as solar geoengineering. What ‘is’, with regard to climate change, is intimately tied up with understandings of how the world ‘ought’ to be. In this sense solar geoengineering is not simply a technoscientific endeavour. It is a sociotechnical project, an exercise in world-making, and should be clearly understood as such.

 

In this talk I will look at some of the competing narratives, both past and present, of solar geoengineering. I will show how Cold War notions of mastery over nature and climate are no longer easily available, and how efforts to recruit climate ‘emergency’ arguments have met with pushback from both social science and climate science. Absent a coherent sociotechnical imaginary (Jasanoff’s term), or a unilateral intervention by the USA, solar geoengineering will struggle to be born as an operating technology. Solar geoengineering’s boosters are working on both. This talk reflects work in progress on a monograph provisionally titled ‘Imagining Geoengineering’.

 

Dr. Jeremy Baskin completed his PhD at the University of Melbourne in 2017.  He works at the intersection of Politics and STS.  He is currently a Senior Research Fellow at the Melbourne School of Government working on a project looking at the decline in trust in ‘experts’ and the making of public policy.  He has previously worked internationally in and with the private sector, government and civil society on issues related to environmental and social justice.

Wednesday 13 September

Empathy in Historical Understanding


Dr Tyson Retz (History, University of Melbourne)


In this presentation I explain the links between empathy, the history discipline, the philosophy of history and history education. I explore how empathy was central to history defining itself as an autonomous discipline in the nineteenth century and follow its development through twentieth-century continental and analytical historical thought. I identify inadequacies in empathy’s methodological formulation and propose an alternative way of approaching the investigation of historical context grounded in the logic of question and answer.

Wednesday 6 September

The cargo cult of diabetes care

 

Professor Fergus Cameron (Royal Children’s Hospital)


Current medical practice is appropriately centred around notions of patient-centred care and personalised medicine. These laudable practices are occurring against a background of increasing patient empowerment and disruptive patterns of knowledge transfer. Health care consumers are now interconnected and highly aware of biotechnological advances. Both health care providers and consumers want the latest and “best” in therapies; however, all too frequently these therapies are both expensive and non-transformative. In a resource-constrained environment these new medical realities are increasingly problematic and potentially unsustainable. How then does one equitably and effectively run a clinical service in a public hospital, balancing the needs of the patient in the consulting room against the needs of the patient in the waiting room in a zero-sum game? The diabetes service at RCH has had to consider these issues in an ongoing manner, with a rapid increase in treatment complexity occurring against a background of increasing patient numbers and constrained resource allocation. These issues will be analysed through a prism of equity, credibility and control. Potential strategies will be explored, looking at determinants of consumer attitudes, clinical decision making and knowledge transfer/implementation science.

 

Professor Fergus Cameron is a paediatric endocrinologist at The Royal Children’s Hospital and has been involved in diabetes care at RCH over a period of many changes in clinical practice. He has struggled to reconcile his Kantian inclinations with an intellectual legacy that has emerged from a long line of reductionists, cynics and doubters. His dilemma is unresolved but hopefully enlightening.

Wednesday 30 August

Causal learning algorithms find causal relationships without doing experiments. How do we know if they work?


Dr Lizzie Silver (HPS, University of Melbourne)


Causal structure learning algorithms take observational data from a set of random variables, and return a network representing the causal relationships among those variables. For my PhD, I applied causal structure learning algorithms to gene expression data in order to learn genetic regulatory networks. The algorithms are provably correct under a set of assumptions that all fail in the genetic context. How can we test how robust these methods are when their assumptions fail?

 

I’ll explain how to interpret the causal network representation so you understand the inputs and outputs, leaving the algorithm as a black box. I’ll describe four ways to evaluate the methods without understanding how they work: (1) performance on synthetic data, (2) agreement with background knowledge, (3) agreement with mechanistic information, and (4) experimental validation. I’ll also describe several different standards for “success”. 
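
As an illustration of evaluation strategy (1), performance on synthetic data, the following sketch (mine, not code from the talk; the edge lists are made-up placeholders) scores a learned network against the ground-truth network that generated the data, treating the learning algorithm itself as a black box and comparing only the edges it returns.

# Sketch of strategy (1): compare the edges of a learned causal graph with the
# ground-truth graph used to generate the synthetic data. The edge sets here are
# placeholders; in practice, learned_edges would come from running a structure-
# learning algorithm on data simulated from true_edges.
true_edges = {("geneA", "geneB"), ("geneB", "geneC"), ("geneA", "geneD")}
learned_edges = {("geneA", "geneB"), ("geneB", "geneC"), ("geneC", "geneD")}

true_positives = learned_edges & true_edges
precision = len(true_positives) / len(learned_edges)  # fraction of reported edges that are real
recall = len(true_positives) / len(true_edges)        # fraction of real edges that were found

print(f"edge precision = {precision:.2f}, edge recall = {recall:.2f}")
# Different "standards for success" amount to different demands on scores like
# these, and to whether edge orientation (direction) is required to match.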

Wednesday 23 August

Sympathy, Spirits and Masticating Corpses: Articulating a Dynamic Theory of Matter in Michael Ranfft’s Von dem Kauen und Schmatzen der Todten (1734).


Dr Michael Pickering (Trinity College, University of Melbourne)


In this paper, I consider the 1734 edition of the Mastication of the Dead in their Graves by Lutheran minister and historian, Michael Ranfft, in the context of the so-called ‘vampire debate’ of the early-mid 1730s. After describing Ranfft’s theory of post-mortem sentience and how this links to the purported vampire cases in Serbia in the 1720s, I delve into the question of whether we should ascribe to Ranfft a ‘spirit-based’ interpretation. Indeed, Ranfft appears, at first glance, to propose such an interpretation via recourse to aspects of the preternatural philosophy. However, I would like to suggest that a careful reading of his biological model (in tandem with part three of his treatise, newly added to the 1734 edition), as well as a consideration of the ontological place of spirit in Ranfft’s discussion, lends credence to the view that the Lutheran minister was supportive of a dynamic theory of matter.


Wednesday 16 August

What is Psychedelic Experience?

 

Dr Aidan Lyon (Philosophy, University of Maryland)


Recently, there has been an explosion of scientific research into the effects of psychedelic substances. Initial results indicate that psychedelic experiences can treat psychological problems such as depression, PTSD, and drug addiction. Initial results also seem to be shedding new light on the structure of the mind and the functional structure of the brain. For example, a common phenomenological component of psychedelic experience is the dissolution of the ego, and recent research suggests that this correlates with a reduction in the orthogonality of the default mode network and the task positive networks.

The concept of psychedelic experience has thus now made its way out from the wild 60’s counterculture and into serious contemporary scientific research. This research is also leading to the development of new theories. For example, it has led some researchers to develop the theory that different conscious states correspond to brain states with different levels of entropy and that psychedelic experiences correspond to brain states with higher-than-usual entropy.
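
The entropy in question is the standard information-theoretic quantity; the gloss below is added for readers and is not part of the abstract.

% Shannon entropy of a probability distribution p over a repertoire of (brain) states:
H(p) = -\sum_i p_i \log p_i
% H is high when many states are roughly equally likely (a rich, disordered
% repertoire) and low when activity is concentrated in a few states.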

Like any concept that has recently transitioned from folk theorising to  scientific theorising, the concept of psychedelic experience needs to be examined carefully. In this talk, I’ll do this by raising the question “what is psychedelic experience?” and developing an answer to it.

Wednesday 9 August

Classical mechanics may not be deterministic, but…

 

Dr Keith Hutchison (HPS, University of Melbourne)


It is often said that Newtonian mechanics is deterministic, a claim haphazardly defended by remarks about the uniqueness of the solutions to some of the equations governing the motions of classical systems.  In philosophical circles, at least, it is well-known that these claims are exaggerated, for the solutions in question are not always unique. This fact is often presented as indicating that classical mechanics is not deterministic.  (I myself published such an interpretation some 20+ years ago).
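
A textbook example of the kind of non-uniqueness at issue, given here for orientation rather than as Hutchison’s own example, is an initial-value problem whose right-hand side is not Lipschitz at the starting point; mechanical cases such as a mass resting on a suitably shaped dome exhibit the same failure in second-order form.

% A standard illustration of non-unique solutions to an initial-value problem:
\dot{x} = 3\,x^{2/3}, \qquad x(0) = 0 .
% It is satisfied both by the system staying put forever and by it spontaneously
% moving off at any time T >= 0:
x(t) = 0
\qquad\text{and}\qquad
x(t) =
\begin{cases}
0, & t \le T, \\
(t - T)^{3}, & t > T ,
\end{cases}
% so the initial state does not fix a unique future evolution.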

In today’s talk, I argue against this understanding — pointing out that it relies on a simple error of logic. For showing that an argument fails to establish some conclusion does not show that the conclusion is false. Examining some of the examples adduced (by me and others) to support the fallacious conclusion, I observe that a deterministic understanding of the motions at issue is not just feasible, but quite normal, and the one we spontaneously adopt in our understandings of the world about us. Furthermore, the uniqueness theorems cited in this context miss the point, by invoking too narrow a characterisation of determinism, one that is often adequate, but one that patently fails in many familiar situations (like the toppling of a house of cards).

Such a discussion leaves the big question open, of course, and my analysis will not defuse all counterexamples to the deterministic thesis. To show this, I will quickly introduce the motion of a particle which mechanical theory declares to be speedily ejected from the universe. What ultimately to make of this, I have no idea!

Wednesday 2 August

Using Corroboration to Evaluate Scientific Explanations


Ariel Kruger (Philosophy, University of Melbourne)

PhD Completion seminar


Popper’s corroboration function furnishes us with an objective measure by which we can evaluate the strength of competing scientific explanations. Specifically, in cases where causal scientific explanations compete with non-causal ones, a corroboration-based function will always favour the causal mode. In this presentation, I will argue that the corroborability of a theory is directly related to how manipulable it is, where the definition of ‘manipulable’ is informed by James Woodward’s interventionist account of causation. Insofar as the corroborability of a scientific theory is a good thing, it follows that we should prefer causal explanations to non-causal ones in cases where they compete to explain the same phenomenon.

Wednesday 26 July

Chance, Determinism, Levels of Description, and the Best System Analysis of Laws


David Kinney (Philosophy, London School of Economics)


Wednesday 19 July

A reference class problem for causal explanation?


Associate Professor Katie Steele (Philosophy, ANU)


Much ink has been spilled over the value judgments associated with ‘inductive risk’, i.e., the risk one assumes when determining whether there is adequate evidence to assert some (scientific) claim. Comparatively little emphasis is given to other potential ‘risks’ involved in science communication, specifically, when providing causal explanations of phenomena. This talk seeks to rectify this imbalance by focussing on the choices associated with selecting particular causes as most significant for an effect. I argue that value, and not just epistemic, judgments are needed to resolve what is effectively a reference class problem associated with causal selection.

Wednesday 7 June

Scientific expertise in the US courtroom: A recent history


Prof David Caudill (Law, Villanova University, US & Senior Fellow, Melbourne Law School)


About 20 years ago, in response to concerns over junk science in the courtroom, the US Supreme Court justices (or their clerks) read a little philosophy of science and, in the Daubert case, established a new regime for admissibility of scientific experts in the courtroom. The standards for reliability reflected an idealized view of science (and not, for example, the sociological study of scientific knowledge). There have, however, been some useful interventions by science studies scholars oriented to law, and the idealism of Daubert has been mediated by the criticism (by mainstream scientists) of most fields of forensic “science” that have appeared in the courtroom. At the same time, the so-called “third wave” of science studies—associated with the Cardiff School or “Studies in Expertise and Experience” (S.E.E)—tends to be less skeptical of science, such that (i) sociological approaches to expertise are not viewed as threatening to mainstream science in law, and (ii) the mainstream scientists themselves are more skeptical of the “science” allowed in the courts. This situation promises a new cooperative atmosphere regarding scientific expertise in law, quite the opposite of the science wars that took place at the time Daubert was decided.

Wednesday 31 May

Dual Seminar


When is it irrational to trust science?: The epistemic consequences of the replication crisis


Ashley Barnett (HPS, University of Melbourne)


The anti-science arguments of science skeptics would be much stronger if they could overcome their science aversion and read the meta-science on replication, peer review and publication bias. Some are starting to do this. I attempt to present the strongest case possible for the claim that it is irrational to trust the scientific consensus on almost any issue, outlining the empirical evidence for why the safeguards designed to ensure that science tracks the truth are so poorly implemented that deferring to scientific experts is rarely a good idea. Then I would like to discuss some objections to this argument, and decide if it is fatally flawed, or good enough to sell to a lobby group that advocates for alternative facts, or both. I will also present the results from a recent survey that shows that questionable research practices are as prevalent in ecology as they were previously found to be in psychology.


The replication crisis in science: How do publication bias and low statistical power contribute?


Steve Kambouris (HPS, University of Melbourne)


The replication crisis in the sciences has highlighted concerns about the quality and reliability of research through repeated failures to replicate published results. A number of reproducibility projects across different disciplines have found that at best, just under 50% of the studies chosen for replication could have their original results reproduced. Publication bias in academic journals has been nominated as a main underlying reason for replication failures; journals have been shown to tend to publish studies with statistically significant results rather than studies with non-significant results.

 


Another contributing factor is the publication of studies with significant results, but low statistical power. Surveys of the literature in psychology and in ecology have shown that on average statistical power is low, well below the threshold of 0.8, which is widely considered to be the minimum required. I show some preliminary work exploring how different levels of statistical power result in the over-estimation of effect sizes in the published literature (which will then be less likely to be replicated), for some simple models of publication bias.
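
The mechanism can be made vivid with a small simulation (my illustration with arbitrary numbers, not the models from the talk): give many underpowered studies a small true effect, “publish” only those reaching p < 0.05, and compare the average published effect with the true one.

# Sketch: low statistical power plus publish-only-if-significant inflates the
# average published effect size. All numbers are arbitrary illustrations.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
true_effect, n_per_group, n_studies = 0.2, 30, 5000   # small effect, small samples -> low power
published = []

for _ in range(n_studies):
    control = rng.normal(0.0, 1.0, n_per_group)
    treatment = rng.normal(true_effect, 1.0, n_per_group)
    _, p = stats.ttest_ind(treatment, control)
    observed = treatment.mean() - control.mean()      # observed effect (population SD = 1)
    if p < 0.05:                                      # publication bias: only significant
        published.append(observed)                    # results reach the literature

print(f"true effect: {true_effect}")
print(f"mean published effect: {np.mean(published):.2f} "
      f"(from {len(published)} of {n_studies} studies)")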

Wednesday 24 May

History of Antarctic science and environments


Dr  Alessandro Antonello (McKenzie Postdoctoral Fellow, History, University of Melbourne)


The Antarctic region today is governed by a suite of international treaties that concentrate especially on the protection and management of the environment and natural resources. This paper will outline the period between 1959 and 1980, when the diplomats and scientists of twelve states laid the intellectual, scientific, legal, and geopolitical foundations of the contemporary Antarctic. It will describe the connected processes of conceptualising the Antarctic environment, the formation and settling of these conceptions through international diplomacy and scientific research and collaboration, and the codification of these ideas in international treaties; this is a story of Antarctica as imagined and real in the texts of scientists and diplomats. This paper charts profound conceptual changes during the first two decades of the Antarctic Treaty regime: from conceiving of it as a cold, abiotic, and bleak wilderness, a lifeless and inert stage for geopolitical competition, into a fragile environment and ecosystem demanding international protection and management. At the heart of the story are contests for geopolitical and epistemic power, institution building, and ever-developing knowledge and sensibilities.


Alessandro Antonello is a McKenzie Postdoctoral Fellow in the School of Historical and Philosophical Studies, University of Melbourne. Before joining Melbourne in mid-2016, he was for two years a postdoctoral research fellow in the Clark Honors College, University of Oregon, and before that completed his PhD at the Australian National University. His research investigates international environmental history in the twentieth century, currently concentrating on the Antarctic and Southern Ocean. His work has appeared in Environmental History, Australian Journal of Politics and History, Progress in Human Geography, and The Polar Journal as well as several edited collections.

Wednesday 17 May

Dual Seminar


Applied conservation consequences of philosophical taxonomy problems


Hannah Fraser (Centre for Excellence in Biosecurity Risk Assessment, University of Melbourne)


I recently submitted my PhD in the Quantitative and Applied Ecology Group (School of BioSciences), but a large part of my work was more in line with History and Philosophy of Science than with ecology. A passion for birds and a supervisor interested in woodlands ignited my interest in ‘woodland birds’. However, when I started researching ‘woodland birds’ it became clear that studies that ostensibly consider the same group refer to different sets of species, yet results from these studies are compared or combined to understand ‘woodland bird’ ecology. This seemed like a reasonably large scientific flaw, and I spent my PhD trying to understand why it occurs and how problematic it is. I will discuss some of my key findings and conclusions in this seminar.


Trusting Expert Judgement


Victoria Hemming (Centre for Excellence in Biosecurity Risk Assessment, University of Melbourne)


I’m in the final year of my PhD, exploring how we can improve expert judgement for conservation decisions when data are absent or uninformative, which in conservation is often the rule rather than the exception. Studies of expert judgement show that poor selection and elicitation of experts’ opinions can lead to poor judgements. However, selecting a diverse group of knowledgeable individuals and eliciting their judgements using structured protocols can often yield accurate and well-calibrated judgements. My research aims to explore whether the group’s performance can be further improved through weighted aggregations and test questions. However, to develop reliable test questions we need a model of the system in question, or at least an understanding of the limits of domain knowledge. In fields such as ecology, domains are not well defined and models are rarely agreed upon; even where models are agreed, data may not be available to test experts. So how do we develop good test questions in these domains? Is it possible? And if not, is there still a purpose for test questions? In this presentation I will present some results from my PhD.
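
To make the idea of weighting by test questions concrete, here is one deliberately crude scheme (a schematic illustration only, not the elicitation protocol or weighting method used in this research): score each expert’s accuracy on questions with known answers and use those scores to weight their judgements on the question of interest.

# Schematic only: weight each expert's judgement by accuracy on test questions
# with known answers. Names, numbers and the weighting rule are illustrative.
test_answers = [10.0, 4.0, 25.0]                      # quantities the facilitator already knows

experts = {
    "expert_1": {"tests": [11.0, 5.0, 24.0], "judgement": 42.0},
    "expert_2": {"tests": [30.0, 1.0, 60.0], "judgement": 90.0},
    "expert_3": {"tests": [9.0, 4.5, 27.0], "judgement": 50.0},
}

def performance_weight(estimates):
    # Smaller average relative error on the test questions -> larger weight.
    errors = [abs(est - truth) / truth for est, truth in zip(estimates, test_answers)]
    return 1.0 / (1e-9 + sum(errors) / len(errors))

weights = {name: performance_weight(e["tests"]) for name, e in experts.items()}
total = sum(weights.values())
aggregate = sum(weights[n] * experts[n]["judgement"] for n in experts) / total

print({name: round(w / total, 2) for name, w in weights.items()})
print(f"performance-weighted aggregate judgement: {aggregate:.1f}")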

Wednesday 10 May

The Unity of the Individual


Dr Jane McDonnell (Philosophy, Monash University)


I contend that the problem of the unity of the individual is at bottom the problem of reconciling the phenomenological view of the world with the view of physics. The Problem of the Many is used as illustration. The world cannot be described without reference to observers and many intractable problems in physics stem from how observers are implicitly treated in the theory. This needs to be made explicit by either (i) defining observers fully in terms of the underlying physics, or (ii) accepting observers as fundamental. I argue that (i) can’t be done so we should go the way of (ii). Quantum monadology builds observers into theory at the fundamental level. Minds are true unities: “everything else is only phenomena, abstractions, or relations”.

Wednesday 3 May

Homeopathy and the Defence of Medical Pluralism in Nineteenth Century New South Wales


Tao Bak (Victoria University)


Unlike Victoria, the colony of New South Wales did not pass effective regulatory medical legislation until as late as 1900. In accounting for this, existing accounts have noted the emphasis on private practice, the general slowness in the process of professionalisation, and the large number of irregular modalities in the colony. This paper adds to the understanding of medicine in nineteenth-century New South Wales by focussing on the most influential of the irregular modalities, homeopathy, and the particular role it played in staving off restrictive medical legislation, particularly in the final third of the nineteenth century. The specific focus of this paper is on the public and political debates surrounding the proposed regulatory medical bills of 1875 and 1880. I suggest that examination of these debates reveals the extent to which homeopathy successfully aligned itself with liberal sensibilities in the colony to engage in what amounted to an effective and sustained defence of medical pluralism, at a time when what constituted scientific medicine was itself undergoing significant transformation and renegotiation.

Wednesday 26 April

Galileo and the conflict between religion and science


Dr Greg Dawes (Philosophy & Religion, Otago University) 


It is common for writers on science and religion to reject what they call the “conflict” or “warfare” thesis, which they see exemplified in the work of John William Draper (1811–82) and Andrew Dickson White (1832–1918). Taking the Galileo affair as my case study, I shall defend the thesis they reject. In Galileo’s encounter with the Church, two tendencies came into conflict. The first was a reaffirmation of the certainty that was thought to accompany what is known “by faith.” The second was the gradual realization that scientific theories could claim nothing more than a high degree of probability. These tendencies put Galileo in an impossible position and made his trial inevitable. Have these attitudes changed? For the most part they have not, which means that conflicts between religion and science are just as likely to occur today.

Wednesday 19 April
 

The poverty of the philosophical/psychometrician/psychology experiment tradition of understanding reasoning


Dr Neil Thomason (HPS, University of Melbourne)


There is a rich philosophical/psychometric/experimental psychology literature about good reasoning, one that dates back to Aristotle and has grown by leaps and bounds since the middle of the last century.   It is endlessly fascinating.

Yet, when you turn from artificial baby problems to assessing real reasoning (good and bad) done by intelligent people of goodwill, this tradition provides little guidance.  It fails in two ways:

(i) Despite widespread claims about its usefulness for thinking, even informal logic has remarkably limited use.  This is true even if you, like me, are happy to consider the discoveries of psychologists such as Wason, Kahneman & Tversky as part of informal logic.

(ii) Within its self-chosen limited domain, important issues have barely been explored, ones that should have been resolved centuries ago.

Wednesday 12 April

Three touchstones for evaluating norms of belief formation


John Wilcox (HPS, University of Melbourne)


Our beliefs guide our lives—or, at the very least, the decisions we make in our lives. Yet our beliefs can fall prey to a variety of biases and fallacies that lead us astray. Thankfully, to avoid these cognitive pitfalls, norms have been proposed in fields as diverse as philosophy of science, psychology, statistics, epistemology, logic, social science and even computer science. These norms instruct us as to what beliefs we should have or how we should arrive at these beliefs; but not all norms are equally good, and some norms are both misled and misleading. How, then, do we determine which norms we should accept as guides for our belief formation? I will argue that scholars (tacitly) advocate at least one of three distinct touchstones for determining what norms to accept. These touchstones correspond to three considerations: 1) whether accepting a given norm is in our best interests, 2) whether the norm reflects the way that the world is structured or 3) whether the norm encodes our foundational intuitions about rational belief. I further argue that all of these touchstones are problematic, that all reflect distinct but somewhat unrecognised conceptions of rationality, that all are intimately related to each other and that appealing to all three touchstones to justify the same norm is an interesting project for future research. The seminar will focus on formal epistemology and degrees of belief. Nevertheless, it should be accessible to audiences outside of philosophy.

Wednesday 5 April

Confidence, confusion and scientific misconceptions: Using technology for conceptual change


Dr Jason Lodge (Centre for the Study of Higher Education, University of Melbourne)


People tend to hold intuitive or ‘folk’ notions of the world based on their everyday experiences. These experiences, however, do not always match with reality as understood through science. Often misconceptions about science tend to be held with high confidence and are therefore persistent. Recent advances in neuroscience and cognitive psychology are allowing for a deeper understanding of conceptual knowledge and how to update it. This research is suggesting that confusion and surprise are critical for shifting persistent misconceptions. This research will be discussed in relation to education, technology and the rising tide of anti-science and anti-expertise in the broader community.

Wednesday 29 March

Challenging the Dichotomy of Cognitive and Non-Cognitive Values: Feminist Values and Evolutionary Psychology


Silvia IV (Tilburg Centre for Logic, Ethics and Philosophy of Science, Tilburg University, The Netherlands)


Philosophy of science has seen a passionate debate over the influence of non-cognitive values on theory choice. In this talk, I argue against a dichotomous divide between cognitive and non-cognitive values and for the possibility of a dual role for feminist values. By analyzing the influence of feminist values on evolutionary psychology and evolutionary biology, I show how they have cognitive and non-cognitive functions at the same time.

Wednesday 22 March

The Crowdsourced Reasoning Evidence Argumentation Thinking and Evaluation (CREATE) project


Tim van Gelder (Biosciences), Richard de Rosario (Biosciences) & Fiona Fidler (Biosciences & HPS)



Wednesday 15 March

Privacy’s Blueprint: The Battle to Control the Design of New Technologies


Professor Woodrow Hartzog (Samford University, USA)
 

“Technological design that affects our privacy is now so pervasive, we hardly even notice it. Every day the devices and software we use (social media, mobile apps, databases and smart phones) are built to give away our stories. And the law barely cares. In this talk, based on his forthcoming book, Professor Woodrow Hartzog will argue that the law must address technological design and propose how to do so in a way that is flexible and not unduly constraining. We must ask and answer hard questions: should the law prohibit malicious interfaces designed to trick us into personal disclosure? Should designers be forced to build backdoors into encryption for the government? Should there be minimum data security standards for technologies in order to keep our information safe from hackers? Privacy law must take design more seriously. To get it right, we need a blueprint.”


Woodrow Hartzog is the Starnes Professor of Law at Samford University’s Cumberland School of Law and an Affiliate Scholar at the Center for Internet and Society at Stanford Law School. His research on privacy, media, and robotics has been published in numerous law reviews and peer-reviewed publications such as the Yale Law Journal, Columbia Law Review, California Law Review, and Michigan Law Review. He has also written for popular publications such as The Guardian, Wired, The Atlantic, CNN and BBC. His book, Privacy’s Blueprint: The Battle to Control the Design of New Technologies, is under contract with Harvard University Press.
