New publication in TICS! Developing the Metacognitive and Interoceptive Self-Inference (MISE) Model.

Yesterday we published a new mini-review article in Trends in Cognitive Sciences. Needless to say we’re pretty happy about this. Our article “Unravelling the Neurobiology of Interoceptive Inference” discusses two recent landmark papers in the systems neuroscience of interoception, and proposes a novel “interoceptive self-inference” model of insula function to account for these new findings.

You can read the article here:
https://www.sciencedirect.com/science/article/abs/pii/S1364661320300541

And an open access preprint is available here:
https://psyarxiv.com/7xgkr

What exactly is “self-inference”? This is something we’ve been working on for a number of years. The basic concept is that the brain not only encodes predictions of interoceptive states, but also uses these predictions to estimate its own expected uncertainty or precision. Expected uncertainty is critical for optimizing learning, attention, and metacognition. Our model then suggests that when we estimate how reliable or certain our future beliefs and sensations will be, we take into account the constant influence of the body on the noise levels of our sensory apparatus. Put simply, our model argues that the “state noise” term considered in most formal accounts of decision-making is in large part driven by visceral fluctuations, and that the brain uses interoception to sample and control these noise trajectories, i.e. through metacognitive active inference.
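To make this concrete, here is a toy simulation (our own illustration for this post, not a model taken from the paper): a slowly drifting hidden state is observed through sensory noise whose level waxes and wanes with a simulated “visceral” signal. An observer whose precision estimate tracks that body-driven noise weights its belief updates accordingly, while a second observer assumes the noise is constant. All variable names and numbers below are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated "visceral" fluctuation that slowly scales sensory noise over time.
T = 500
visceral = 1.0 + 0.5 * np.sin(np.linspace(0, 8 * np.pi, T))

process_sd = 0.05                      # how fast the hidden state drifts
x = np.zeros(T)                        # hidden state of the world
y = np.zeros(T)                        # noisy sensory observations
for t in range(1, T):
    x[t] = x[t - 1] + rng.normal(0, process_sd)
    y[t] = x[t] + rng.normal(0, 0.2 * visceral[t])   # observation noise scales with the body

def track(assumed_noise_sd):
    """Precision-weighted updating: the gain (learning rate) reflects expected noise."""
    est = np.zeros(T)
    P = 1.0                                          # posterior variance of the estimate
    for t in range(1, T):
        P_prior = P + process_sd ** 2                # predict: uncertainty grows with drift
        R = assumed_noise_sd[t] ** 2                 # expected observation noise (inverse precision)
        K = P_prior / (P_prior + R)                  # gain is high when observations are trusted
        est[t] = est[t - 1] + K * (y[t] - est[t - 1])
        P = (1 - K) * P_prior
    return est

# "Body-aware" observer estimates the visceral contribution to sensory noise...
err_intero = np.mean((track(0.2 * visceral) - x) ** 2)
# ...while a "body-blind" observer assumes the noise level is constant.
err_blind = np.mean((track(np.full(T, 0.2)) - x) ** 2)
print(f"tracking error, body-aware: {err_intero:.4f}  body-blind: {err_blind:.4f}")
```

On most runs the body-aware observer shows the lower tracking error, which is the intuition behind treating visceral fluctuations as part of the brain’s expected-uncertainty estimate.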

So what has the ‘self’ got to do with that? That is something we are continuously expanding on – in the next year we have another major review in the pipeline, as well as a book chapter focused on consciousness science. In a very basic sense, we call it “self-inference” because the brain combines predictions about the body and the world to produce an estimate of its own reliability. You could think of it as something like a hybrid between global workspace, FEP, and higher-order thought theories of consciousness. In practical terms, it is a bit like a computing cluster that monitors its CPU temperatures to limit thermal throttling, spreading load to cooler CPUs when needed. By monitoring our own noise levels, we build more precise estimates of expected uncertainty, which can then be leveraged to determine how much we should update our models of the world (and self) when we encounter surprising events.

The implications of the self-inference model are fascinating, to us at least. For one, we pick up on existing strands of theory suggesting that Bayesian metacognitive inferences over expected precision underlie selfhood, and perhaps even consciousness itself. By inferring (and controlling, through active inference) our own embodied noise trajectories, we are effectively estimating the influence of ‘the self’ on our re-afferent data streams. That is to say, we’re actively accounting for the minimal self, or minimal embodied perspective, as the constant source of noise influencing sensory flow and beliefs. The implication here is that disruptions in self-belief can alter our visceral experiences (e.g., embodied hallucination), and conversely, disrupted interoception can ‘leak’ into our perceptual and metacognitive inferences, producing a variety of affective and decision biases. At the extreme, we think that maladaptive self-inference underlies a wide spectrum of psychiatric and even neurological disorders.

But that is just a sneak peek. You can follow the development of the self-inference model (perhaps we’ll call it MISE, for “metacognitive and interoceptive self-inference”) in our recent publications:

1. A forthcoming BBS commentary article:

https://psyarxiv.com/5j2f3/

2. An early book chapter with Manos Tsakiris:

https://www.oxfordscholarship.com/view/10.1093/oso/9780198811930.001.0001/oso-9780198811930-chapter-2

3. Our recent computational model of interoceptive self-inference:
https://www.biorxiv.org/content/10.1101/603928v1.full.pdf

Stay tuned for more! We have one or two more major theory articles in the works this year, which should fully ‘flesh’ out the self-inference model and its implications for the phenomenology of the self, consciousness, and computational psychiatry.




Systole: a Python Package for Processing, Analyzing, and Synchronizing Cardiac Data!

Today we are thrilled to announce the release of our lab’s first software package, Systole! Systole is a comprehensive Python package intended to help you clean, transform, and analyze your cardiac time-series data, particularly in the context of psychophysiological research. In addition to these basic functions, Systole offers built-in support for synchronizing your PsychoPy experiments with the heartbeat, making it easier to present stimuli at specific phases of the cardiac cycle. This is of particular interest for research on brain-body interaction and interoception. Unfortunately, most new studies in this area do not use open code, limiting reproducibility. Enter Systole!

Example presentation of auditory tones at three cardiac phases.
The Nonin Xpod USB pulse ox and soft-clip sensor (© 2020 Nonin).
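To give a flavour of what cardiac-phase-locked presentation involves, here is a deliberately simplified sketch of the underlying logic: read streamed pulse samples, detect the rising pulse wave online, wait a fixed delay to land on the target phase, and then trigger the stimulus. None of the function names below are Systole’s actual API (in practice, Systole handles recording from the Nonin oximeter and the PsychoPy integration for you), and the sampling rate, delay, and refractory values are placeholders.

```python
import time
import numpy as np

rng = np.random.default_rng(0)
sfreq = 75                    # the Nonin Xpod streams pulse data at 75 Hz

def read_ppg_sample(t):
    """Stand-in for reading one sample from the oximeter; here we simulate a 1 Hz pulse wave."""
    return np.sin(2 * np.pi * 1.0 * t / sfreq) + 0.05 * rng.normal()

def present_stimulus():
    """Stand-in for your PsychoPy sound or visual stimulus."""
    print("beep at", round(time.time(), 2))

phase_delay = 0.25              # (placeholder) seconds after detected pulse onset
refractory = int(0.4 * sfreq)   # ignore re-crossings for 400 ms after a trigger
cooldown, prev = 0, 0.0

for t in range(10 * sfreq):     # run for roughly ten seconds
    sample = read_ppg_sample(t)
    # crude online detection: an upward zero-crossing marks the rising pulse wave
    if cooldown == 0 and prev <= 0 < sample:
        time.sleep(phase_delay)          # wait until the target cardiac phase
        present_stimulus()
        cooldown = refractory
    cooldown = max(0, cooldown - 1)
    prev = sample
    time.sleep(1 / sfreq)
```

In a real task you would replace the simulated reader with the oximeter stream and the print statement with your stimulus presentation call.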

As of today, Systole is provided as an early ‘pre-release’, and is far from feature complete – so please use with caution! For our initial release, we’ve focused on offering native support for the popular Nonin Xpod USB pulse oximeter, which is frequently used in interoception experiments and offers a cheap (~300 GBP), easy-to-use platform for plethysmographic cardiac data collection. We hope that this will enable users to design robust interoception and cardiac synchrony tasks, and to share them with the community.

The toolbox also supports basic formats such as RR-interval time series or instantaneous heart-rate data. Future packages and releases will include other data formats, such as ECG, electrogastrography, and respiration, and we hope the community will help to add support for other devices.

At present, Systole offers the following core features:

  • Online and offline beat detection
  • Detection and correction of outlier heartbeats (e.g., ectopic beats, recording artefacts)
  • Full-suite heart-rate variability (HRV) analysis, including pre-processing and nonlinear time-frequency analyses
  • Tools for event-related analyses, e.g. data epoching and analysis of instantaneous heart rate

… plus an entire suite of plotting functions to visualize all of the above, and more!
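As a rough illustration of the kind of offline pipeline Systole automates, here is a hand-rolled NumPy sketch (explicitly not Systole’s API, and the peak times are made up): go from detected beats to RR intervals, reject implausible intervals, and compute a few standard time-domain HRV summaries.

```python
import numpy as np

# Suppose we already have beat (peak) times in seconds from a PPG or ECG recording.
peaks = np.array([0.0, 0.81, 1.63, 2.41, 3.80, 4.02, 4.83])   # includes one missed + one spurious beat

rr = np.diff(peaks) * 1000                      # RR intervals in milliseconds

# crude artefact rejection: flag intervals far outside the local median
median_rr = np.median(rr)
good = np.abs(rr - median_rr) < 0.3 * median_rr
rr_clean = rr[good]

# simple time-domain HRV summaries
mean_hr = 60000 / rr_clean.mean()               # mean heart rate in beats per minute
sdnn = rr_clean.std(ddof=1)                     # overall variability (ms)
rmssd = np.sqrt(np.mean(np.diff(rr_clean) ** 2))  # short-term variability (ms)

print(f"HR = {mean_hr:.1f} bpm, SDNN = {sdnn:.1f} ms, RMSSD = {rmssd:.1f} ms")
```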

Further, at the Systole website you can find complete documentation as well as interactive tutorials for a variety of workflows.

We’ll be using Systole extensively in our ongoing Visceral Mind Project, and expect to continuously revise and improve it as we add new features and discover use-cases. Systole is also provided as fully open software, so we invite you to contribute your own additions through GitHub!

In the near future, we will publish a methods paper + tutorial detailing the Systole package. In the meantime, if you use Systole in your research, please cite:

Legrand, N., & Allen, M. (2020, January 14). embodied-computation-group/systole: Pre-Alpha (Version 0.0.1). Zenodo. http://doi.org/10.5281/zenodo.3607913

Please let us know what you think! Happy heartbeat counting 😉

ECG open lab meetings!

The Embodied Computation Group (ECG) will be hosting an open journal club/lab meeting every other week, where both internal and external researchers at all career levels are invited to give brief, informal presentations (20-30 minutes), followed by discussion and networking. Although our group focuses on decision-making, interoceptive inference, and perception, we welcome any speaker who would like to present their work in these or other domains.

You are welcome to attend as many meetings as you like, but we ask that you also commit to giving at least one presentation at the lab meeting as the ‘price of entry’. Note that you are also welcome to present the work of others, e.g. journal-club style, if there is a paper you would like to discuss.

The meetings will be held at 2pm DK time, on alternate Fridays, via Zoom. To attend, just subscribe to our new meeting list (https://maillist.au.dk/mailman/listinfo/ecg.meetings) so you can receive regular updates with talk and Zoom details.

The program for Winter 2020-2021 is fully booked, but we’re still looking for speakers after June 2021, so get in touch!

Psychophysics @ Home

The COVID-19 lockdown has pushed us to find new ways of connecting and collaborating. 

Our ways of working are changing, from free online conferences to Zoom drinks with colleagues from around the world. Why not bring our experiments online as well? 

In a tweet last month, Micah suggested that we take this opportunity to do each other’s experiments at home.

The basic idea is this: let’s get our experiments online and share them to collect data, check robustness across platforms, and get feedback on them. Since many perception scientists are already sitting at home with a testing laptop, we could start to pool our time and energy to get some research done!

As well as helping us acquire data during this time, sharing our experimental scripts and discussing ways of conducting psychophysical and behavioural experiments remotely will let us assemble a set of guidelines to help others do it in the future. We like to think of this project as a kind of guerrilla, low-overhead “psychophysics accelerator”. By sharing experimental scripts directly, we can optimize them on our screens at home, and carry on developing tasks and collecting data even during the shutdown. Afterwards, Psychophysics @ Home might even become a lasting community resource!

Of course, for many cognition experiments you may want to use Gorilla, Pavlovia, PsychoJS (i.e., PsychoPy), or other online testing tools, particularly for full-blown experiments with random sampling. These are great, and there have recently been a number of discussions and resources made available, including a ‘virtual chinrest’ to control stimulus size and viewing distance remotely. We imagine Psychophysics @ Home to be more for piloting, testing, and collaborating on tasks where you may already have optimized code, or where you need more rigorous control over visual presentation, response timing, and other experimental factors common in psychophysics experiments. And more simply, many in the community may not have the time or expertise to port these experiments to a fully online setting. So let’s staircase together!
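And since we mentioned staircases: here is a minimal sketch of the kind of 1-up/2-down adaptive procedure many of these tasks rely on, run on a simulated observer. This is purely illustrative (in practice you would use the staircase handlers built into your presentation tool), and the observer, step size, and trial count are made-up placeholders.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulated_response(intensity, threshold=0.5, slope=10.0):
    """Toy observer: probability of a correct response rises smoothly with intensity."""
    p_correct = 1.0 / (1.0 + np.exp(-slope * (intensity - threshold)))
    return rng.random() < p_correct

intensity, step = 1.0, 0.1        # start easy, fixed step size
correct_streak, direction = 0, 0
reversals = []

for trial in range(80):
    if simulated_response(intensity):
        correct_streak += 1
        if correct_streak == 2:                  # two correct in a row -> make it harder
            if direction == +1:                  # direction flipped: record a reversal
                reversals.append(intensity)
            direction = -1
            intensity = max(0.0, intensity - step)
            correct_streak = 0
    else:                                        # a single error -> make it easier
        if direction == -1:
            reversals.append(intensity)
        direction = +1
        intensity += step
        correct_streak = 0

# the 1-up/2-down rule converges near the 70.7%-correct point; average late reversals
print("estimated threshold:", round(float(np.mean(reversals[-6:])), 3))
```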

To get started, we’ve created an open Slack channel so we can discuss how to share experiments and structure the initiative. We imagine that most experiments will be fairly standard psychophysics projects, i.e. within-subject designs with more trials than subjects, as these are the most suited to this kind of testing. To share a task, just use your favourite presentation tool (PsychoPy, Matlab, etc.), detail the dependencies (libraries, toolboxes) and running instructions, and recruit some users from the pool. We suggest that users participate in at least as many experiments as they expect to recruit subjects for. Since most tasks would be run on only a few subjects (~10), there should be plenty of testing time to go around. We suggest creating separate channels for individual projects, to keep the #general channel readable. But it will also be nice to hear from the community and see what we can come up with. You can join here:

Psychophysics @ Home Slack channel:

https://join.slack.com/t/psychophysicsathome/shared_invite/zt-e1f2yua8-aFp6OSLjE6zWYSe5r2BWbA

We’re looking forward to seeing your experiments!

CANNABODIES ERC Grant Submitted!

Last week we successfully submitted an ERC Starting Grant for our ongoing work investigating how cannabinoids influence learning, interoception, and brain-body interaction! The grant proposes a series of four interlocking experiments probing the neurovisceral mechanisms underlying the behavioral, subjective, and interoceptive effects of pharmaceutical THC and CBD. This work continues our recent collaboration with Dr. Samia Joca, investigating the influence of CBD on affective bias. We’re excited to continue this work, as it will open up fascinating new research questions for our lab, and will also provide us with new pharmacological manipulations and models we can apply in our computational psychiatry research. We’ll keep our fingers crossed for good reviews – watch this space in the next 6 months!