Modelling time-order effects in haptic perception with a Bayesian dynamical framework
Gastón Avetta, Jose Lobera, Juan José Zárate, Inés Samengo, Damián G. Hernández
TLDR
This paper introduces a dynamical Bayesian model that explains time-order effects in haptic perception, quantitatively reproducing both the order-dependent biases and the variability across individuals.
Key contributions
- Proposes a dynamical Bayesian model to explain time-order effects in haptic perception.
- Formalizes perception as an inference process where prior expectations evolve over time.
- Quantitatively reproduces time-order effects and inter-individual variability in psychophysical data.
- Reveals a transformed stimulus space where perceptual judgments exhibit approximate symmetries.
Why it matters
This work provides a computational framework for understanding how temporal biases arise in haptic perception. It suggests that these biases are a consequence of dynamical inference, which imposes geometric constraints on perceptual representations, offering new insight into the fundamental mechanisms of sensory processing.
Original Abstract
Perceptual judgments of sequential stimuli are systematically biased by prior expectations and by the temporal structure of sensory input. In haptic discrimination tasks, these effects often manifest as time-order asymmetries, whereby the perceived difference between two stimuli depends on their presentation order. Here, we introduce a dynamical Bayesian model that accounts for these biases by combining noisy sensory measurements with an evolving internal representation of stimulus intensity. The model formalizes perception as an inference process in which prior expectations are updated by incoming stimuli and propagate in time between observations. We test the model on psychophysical data from vibrotactile discrimination experiments, in which participants compare pairs of sequential stimuli with varying intensities. With a small number of parameters, the model quantitatively reproduces both the direction and magnitude of time-order effects across subjects, as well as the observed inter-individual variability. The inferred parameters provide a compact description of perceptual biases in terms of prior expectations and noise characteristics. Beyond fitting the data, the model induces a transformation of stimulus space, leading to a subject-dependent geometry of perceived stimuli. In this transformed space, perceptual judgments exhibit approximate symmetries that are absent in the physical stimulus coordinates. These results suggest that temporal biases in perception can be understood as a consequence of dynamical inference, and that they impose non-trivial geometric constraints on perceptual representations.
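The mechanism the abstract describes, a prior that is updated by each stimulus and drifts back toward a long-run expectation between observations, can be illustrated with a minimal Gaussian observer. This is a generic contraction-bias sketch in the spirit of the paper, not the authors' actual model; the parameters `mu0`, `var0`, `var_noise`, and the drift factor `lam` are hypothetical choices for illustration.

```python
import math

def posterior(mu_prior, var_prior, x, var_noise):
    """Conjugate Gaussian update: combine the prior with a noisy measurement."""
    k = var_prior / (var_prior + var_noise)  # Kalman-like gain
    return mu_prior + k * (x - mu_prior), (1 - k) * var_prior

def relax(mu, var, mu0, var0, lam):
    """Between observations the stored estimate drifts toward the long-run prior."""
    return lam * mu + (1 - lam) * mu0, lam**2 * var + (1 - lam**2) * var0

def p_second_larger(s1, s2, mu0=10.0, var0=9.0, var_noise=4.0, lam=0.5):
    """Probability the observer judges the second stimulus as more intense.

    Illustrative parameters (hypothetical): mu0/var0 = long-run prior,
    var_noise = sensory noise, lam = memory retention during the gap.
    """
    # Percept of s1 is pulled toward mu0 by the prior (contraction bias).
    mu1, var1 = posterior(mu0, var0, s1, var_noise)
    # The remembered estimate drifts further toward mu0 before s2 arrives.
    mu1, var1 = relax(mu1, var1, mu0, var0, lam)
    # Judgment: P(noisy reading of s2 exceeds the remembered estimate of s1).
    z = (s2 - mu1) / math.sqrt(var_noise + var1)
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))
```

For stimulus pairs whose intensities lie above the long-run prior mean, this toy observer is more accurate when the weaker stimulus comes first (`p_second_larger(12, 14)` exceeds `1 - p_second_larger(14, 12)`), i.e. discrimination accuracy depends on presentation order, which is the time-order asymmetry the paper models.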