Locomotion generates a visual motion pattern known as optic flow. To explore how locomotor adjustments are affected by this pattern, an experimental paradigm was developed to eliminate optic flow during obstacle avoidance: the aim was to investigate the contribution of optic flow to obstacle avoidance by using a stroboscopic lamp. Ten young adults walked along an 8 m pathway and stepped over obstacles at two heights. Visual sampling was controlled by the stroboscopic lamp (static versus dynamic visual sampling). Three-dimensional kinematic data showed that the visual information about self-motion provided by optic flow was crucial for estimating the distance from, and the height of, the obstacle. Participants showed conservative obstacle-avoidance behavior under the experimental visual sampling conditions, which suggests that optic flow favors the coupling of vision to adaptive behavior for obstacle avoidance.
The representation of navigational optic flow across the inferior parietal lobule was assessed using optical imaging of intrinsic signals in behaving monkeys. The exposed cortex, corresponding to the dorsal-most portion of area 7a and the dorsal prelunate area (DP), was imaged in two hemispheres of two rhesus monkeys. The monkeys actively attended to changes in motion stimuli while fixating. Radial expansion and contraction, and clockwise and counter-clockwise rotation optic flow stimuli were presented concentric to the fixation point at two angles of gaze to assess the interrelationship between eye position and the optic flow signal. The cortical response depended upon the type of flow and was modulated by eye position. The optic flow selectivity was embedded in a patchy architecture within the gain field architecture. All four optic flow stimuli tested were represented in areas 7a and DP. The location of the patches varied across days; however, the spatial periodicity of the patches remained constant across days, at ∼950 and 1100 µm for the two animals examined. These optical recordings agree with previous electrophysiological studies of area 7a, and provide new evidence for flow selectivity in DP and a fine-scale description of its cortical topography. That the functional architectures for optic flow can change over time was unexpected. These and earlier results, also from the inferior parietal lobule, support the inclusion of both static and dynamic functional architectures among the features that define association cortical areas and ultimately support complex cognitive function.
Two strategies may guide walking to a stationary goal: (1) the optic flow strategy, in which one aligns the direction of locomotion or “heading” specified by optic flow with the visual goal [1, 2]; and (2) the egocentric direction strategy, in which one aligns the locomotor axis with the perceived egocentric direction of the goal [3, 4], so that error results in optical target drift. Optic flow appears to dominate steering control in richly structured visual environments [2, 6-8], while the egocentric direction strategy prevails in visually sparse environments [2, 3, 9]. Here we determine whether optic flow also drives visuo-locomotor adaptation in visually structured environments. Participants adapted to walking with the virtual heading direction displaced 10° to the right of the actual walking direction, and were then tested with a normally aligned heading. Two environments, visually structured and visually sparse, were crossed in adaptation and test phases. Adaptation of the walking path was more rapid and complete in the structured environment, with twice the negative aftereffect on path deviation, indicating that optic flow contributes over and above target drift alone. Optic flow thus plays a central role in both online control of walking and adaptation of the visuo-locomotor mapping.
How do flying insects monitor foraging efficiency? Honeybees (Apis mellifera) use optic flow information as an odometer to estimate distance travelled, but here we tested whether optic flow also informs the estimation of foraging costs. Bees were trained to feeders in flight tunnels such that they experienced the greatest optic flow en route to the feeder closest to the hive. Analyses of dance communication showed that, as expected, bees indicated the close feeder as being farther away, but they also indicated this feeder as the more profitable, and preferentially visited it when given a choice. We show that honeybee estimates of foraging cost do not rely on optic flow information. Rather, bees can assess distance and profitability independently and signal these aspects as separate elements of their dances. The optic flow signal is sensitive to the nature of the environment travelled by the bee and is therefore not a good index of flight energetic costs, but it provides a good indication of distance travelled for the purposes of navigation and communication, as long as the dancer and recruit travel similar routes. This study suggests an adaptive dual processing system in honeybees for communicating and navigating distance flown and for evaluating its energetic costs.
For a small flying insect, correcting unplanned course perturbations is essential for navigating through the world. Visual course control relies on estimating optic flow patterns, which, in flies, are encoded by interneurons of the third optic ganglion. However, the rules that translate optic flow into flight motor commands remain poorly understood. Here, we measured the temporal dynamics of optomotor responses in tethered flies to optic flow fields about the three cardinal axes. For each condition, we used white noise analysis to determine the optimal linear filters linking optic flow to the sum and difference of left and right wing beat amplitudes. The estimated filters indicate that flies react very quickly to perturbations of the motion field, with pure delays on the order of ~20 ms and times-to-peak of ~100 ms. By convolution, the filters also predict responses to arbitrary stimulus sequences, accounting for over half the variance in 5 of our 6 stimulus types and demonstrating the approximate linearity of the system with respect to optic flow variables. In the remaining case, yaw optic flow, we improved predictability by measuring individual flies, which also allowed us to analyze the variability of optomotor responses within a population. Finally...
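The filter-identification step described above can be sketched with reverse correlation on synthetic data; the kernel shape, sampling rate, and noise level below are illustrative stand-ins, not the study's measurements:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "true" linear filter: a pure delay of ~20 ms and a kernel
# peaking near ~100 ms, loosely matching the abstract (1 ms sampling).
dt = 0.001
t = np.arange(0, 0.3, dt)
true_filter = np.where(t > 0.02, (t - 0.02) * np.exp(-(t - 0.02) / 0.08), 0.0)

# White-noise optic-flow stimulus and simulated wing-beat response.
n = 100_000
stim = rng.standard_normal(n)
resp = np.convolve(stim, true_filter)[:n] + 0.1 * rng.standard_normal(n)

# For Gaussian white-noise input, the optimal linear filter is
# proportional to the stimulus-response cross-correlation.
lags = len(true_filter)
est = np.array([resp[tau:] @ stim[:n - tau] for tau in range(lags)]) / (n * stim.var())

# The estimate recovers the true kernel up to residual noise.
err = np.linalg.norm(est - true_filter) / np.linalg.norm(true_filter)
print(f"relative error: {err:.3f}")
```

White noise is the stimulus of choice here because its flat autocorrelation makes the cross-correlation estimator coincide with the optimal (Wiener) linear filter.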
To avoid collisions when navigating through cluttered environments, flying insects must control their flight so that their sensory systems have time to detect obstacles and avoid them. To do this, day-active insects rely primarily on the pattern of apparent motion generated on the retina during flight (optic flow). However, many flying insects are active at night, when obtaining reliable visual information for flight control presents much more of a challenge. To assess whether nocturnal flying insects also rely on optic flow cues to control flight in dim light, we recorded flights of the nocturnal neotropical sweat bee, Megalopta genalis, flying along an experimental tunnel when: (i) the visual texture on each wall generated strong horizontal (front-to-back) optic flow cues, (ii) the texture on only one wall generated these cues, and (iii) horizontal optic flow cues were removed from both walls. We find that Megalopta increase their groundspeed when horizontal motion cues in the tunnel are reduced (conditions (ii) and (iii)). However, differences in the amount of horizontal optic flow on each wall of the tunnel (condition (ii)) do not affect the centred position of the bee within the flight tunnel. To better understand the behavioural response of Megalopta...
Information from the vestibular, sensorimotor, or visual systems can affect the firing of grid cells recorded in the entorhinal cortex of rats. Optic flow provides information about the rat’s linear and rotational velocity and thus could influence the firing pattern of grid cells. To investigate this possible link, we model parts of the rat’s visual system and analyze its capability in estimating linear and rotational velocity. In our model, a simulated rat moves along trajectories recorded from rats foraging on a circular ground platform; we thus preserve the intrinsic statistics of real rats’ movements. Visual image motion is analytically computed for a spherical camera model and superimposed with noise in order to model the optic flow that would be available to the rat. This optic flow is fed into a template model to estimate the rat’s linear and rotational velocities, which in turn are fed into an oscillatory interference model of grid cell firing. Grid scores are reported while altering the flow noise, the tilt angle of the optical axis with respect to the ground, the number of flow templates, and the frequency used in the oscillatory interference model. Activity patterns are compatible with those of grid cells, suggesting that optic flow can contribute to their firing.
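The template-matching stage can be sketched as follows; for brevity this uses a planar small-angle flow model rather than the study's spherical camera, and all speeds, grid sizes, and noise levels are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)

# Planar-camera stand-in: flow over a 15 x 15 image grid at an assumed
# constant depth Z, for forward speed v and yaw rate w (small angles).
x, y = np.meshgrid(np.linspace(-1, 1, 15), np.linspace(-1, 1, 15))
Z = 2.0

def flow(v, w):
    # translational (radial) component plus yaw-rotational component
    u = v * x / Z + w * (1 + x**2)
    vy = v * y / Z + w * x * y
    return np.stack([u, vy])

# Template bank over candidate (linear, rotational) velocity pairs.
templates = {(v, w): flow(v, w)
             for v in np.linspace(0.0, 1.0, 21)
             for w in np.linspace(-0.5, 0.5, 21)}

# Observed flow: true motion plus superimposed sensor noise.
v_true, w_true = 0.6, -0.2
observed = flow(v_true, w_true) + 0.02 * rng.standard_normal((2, 15, 15))

# Template matching: pick the candidate minimizing squared flow error.
v_est, w_est = min(templates, key=lambda k: np.sum((templates[k] - observed) ** 2))
print(v_est, w_est)
```

The recovered pair would then drive the oscillatory interference model in place of vestibular or sensorimotor velocity signals.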
The detection of looming, the motion of objects in depth, underlies many behavioral tasks, including the perception of self-motion and time-to-collision. A number of studies have demonstrated that one of the most important cues for looming detection is optic flow, the pattern of motion across the retina. Schrater et al. have suggested that changes in spatial frequency over time, or scale changes, may also support looming detection in the absence of optic flow (P. R. Schrater, D. C. Knill, & E. P. Simoncelli, 2001). Here we used an adaptation paradigm to determine whether the perception of looming from optic flow and scale changes is mediated by single or separate mechanisms. We show first that when the adaptation and test stimuli were the same (both optic flow or both scale change), observer performance was significantly impaired compared to a dynamic (non-motion, non-scale change) null adaptation control. Second, we found no evidence of cross-cue adaptation, either from optic flow to scale change, or vice versa. Taken together, our data suggest that optic flow and scale changes are processed by separate mechanisms, providing multiple pathways for the detection of looming.
The retinal image changes that occur during locomotion, the optic flow, carry information about self-motion and the three-dimensional structure of the environment. Fast-moving animals with little binocular vision especially depend on these depth cues for maneuvering, and they actively control their gaze to facilitate perception of depth based on cues in the optic flow. In the visual system of birds, nucleus rotundus neurons were originally found to respond to object motion but not to background motion. However, when background and object were both moving, responses increased the more the direction and velocity of object and background motion on the retina differed. These properties may play a role in representing depth cues in the optic flow. We therefore investigated how neurons in nucleus rotundus respond to optic flow that contains depth cues. We presented simplified and naturalistic optic flow on a panoramic LED display while recording from single neurons in nucleus rotundus of anaesthetized zebra finches. Unlike most studies on motion vision in birds, our stimuli included depth information. We found extensive responses of motion-selective neurons in nucleus rotundus to optic flow stimuli. Simplified stimuli revealed preferences for optic flow reflecting translational or rotational self-motion. Naturalistic optic flow stimuli elicited complex response modulations...
We have recently suggested that neural flow parsing mechanisms subtract the global optic flow consistent with observer movement to aid in detecting and assessing scene-relative object movement. Here, we examine whether flow parsing can occur independently of heading estimation. To address this question we used stimuli comprising two superimposed limited-lifetime-dot optic flow fields (one planar and one radial). This stimulus gives rise to the so-called optic flow illusion (OFI), in which perceived heading is biased in the direction of the planar flow field. Observers were asked to report the perceived direction of motion of a probe object placed in the OFI stimulus. If flow parsing depends upon a prior estimate of heading, then the perceived trajectory should reflect global subtraction of a field consistent with the heading experienced under the OFI. In Experiment 1 we tested this prediction directly, finding instead that the perceived trajectory was biased markedly in the direction opposite to that predicted under the OFI. In Experiment 2 we demonstrate that the results of Experiment 1 are consistent with a positively weighted vector sum of the effects seen when viewing the probe together with the individual radial and planar flow fields. These results suggest that flow parsing is not necessarily dependent on prior estimation of heading direction. We discuss the implications of this finding for our understanding of the mechanisms of flow parsing.
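The flow-parsing logic, and the heading-dependence prediction tested in Experiment 1, can be sketched in a few lines; the flow model and all numbers are hypothetical:

```python
import numpy as np

# Minimal flow-parsing sketch: the retinal motion of a probe is the sum
# of its scene-relative motion and the self-motion flow at its location;
# parsing subtracts the flow field predicted from a heading estimate.

def radial_flow(p, foe, rate=1.0):
    # expansion flow at image point p for a focus of expansion at foe
    return rate * (np.asarray(p, float) - np.asarray(foe, float))

probe_pos = np.array([0.4, 0.1])
object_motion = np.array([0.0, 0.3])   # true scene-relative motion
foe_true = np.array([0.0, 0.0])

# Retinal probe motion combines object motion and the observer's flow.
retinal = object_motion + radial_flow(probe_pos, foe_true)

# If parsing relied on a heading estimate biased by the OFI (a shifted
# FOE), the parsed trajectory would inherit a matching bias; this is the
# prediction tested, and rejected, in Experiment 1.
foe_biased = np.array([0.2, 0.0])
parsed = retinal - radial_flow(probe_pos, foe_biased)
print(parsed - object_motion)          # residual bias from the shifted FOE
```

In this toy model the residual bias equals the FOE shift itself, so any heading error would transfer one-to-one to the perceived probe trajectory.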
[Purpose] We investigated the effects of modulation of the optic flow speed on gait parameters in children with hemiplegic cerebral palsy. [Methods] We examined 10 children with hemiplegic cerebral palsy. The children underwent gait analysis under 3 different conditions of optic flow speed: slow, normal, and fast. The children walked across the walkway of a GAITRite system while watching a virtual reality screen, and walking velocity, cadence, stride length, step length, single support time, and double support time were recorded. [Results] Compared with the other applied flow speed conditions, the fast optic flow speed (2 times the normal speed) significantly increased walking velocity, cadence, normalized step length, base of support, and single support cycle of both the paretic and non-paretic lower limbs. Moreover, compared with the other applied flow speed conditions, the slow optic flow speed (0.25 times the normal speed) yielded a significantly decreased walking velocity, cadence, normalized step length, base of support, and single support cycle for both the paretic and non-paretic lower limbs. [Conclusion] The gait parameters of children with hemiplegic cerebral palsy are altered by modulation of the optic flow speed. Thus...
Visual input provides vital information for helping us modify our walking pattern. For example, artificial optic flow can drive changes in step length during locomotion and may also be useful for augmenting locomotor training for individuals with gait asymmetries. Here we asked whether optic flow could modify the acquisition of a symmetric walking pattern during split-belt treadmill adaptation. Participants walked on a split-belt treadmill while watching a virtual scene that produced artificial optic flow. For the Stance Congruent group, the scene moved at the slow belt speed at foot strike on the slow belt and then moved at the fast belt speed at foot strike on the fast belt. This approximates what participants would see if they moved over ground with the same walking pattern. For the Stance Incongruent group, the scene moved fast during slow stance and vice versa. In this case, flow speed does not match what the foot is experiencing, but predicts the belt speed for the next foot strike. Results showed that the Stance Incongruent group learned more quickly than the Stance Congruent group even though each group learned the same amount during adaptation. The increase in learning rate was primarily driven by changes in spatial control of each limb...
The visual input created by the relative motion between an individual and the environment, also called optic flow, influences the sense of self-motion, postural orientation, veering of gait, and visuospatial cognition. An optic flow network comprising visual motion areas V6, V3A, and MT+, as well as visuo-vestibular areas including posterior insula vestibular cortex (PIVC) and cingulate sulcus visual area (CSv), has been described as uniquely selective for parsing egomotion depth cues in humans. Individuals with Parkinson’s disease (PD) have known behavioral deficits in optic flow perception and visuospatial cognition compared to age- and education-matched control adults (MC). The present study used functional magnetic resonance imaging (fMRI) to investigate neural correlates related to impaired optic flow perception in PD. We conducted fMRI on 40 non-demented participants (23 PD and 17 MC) during passive viewing of simulated optic flow motion and random motion. We hypothesized that compared to the MC group, PD participants would show abnormal neural activity in regions comprising this optic flow network. MC participants showed robust activation across all regions in the optic flow network, consistent with studies in young adults...
In this paper we describe a variational approach to computing dense optic flow in the case of non-rigid motion. We optimise a global energy to compute the optic flow between each image in a sequence and a reference frame simultaneously. Our approach is based on subspace constraints, which allow the optic flow at each pixel to be expressed compactly as a linear combination of a 2D motion basis that can be pre-estimated from a set of reliable 2D tracks. We reformulate the multi-frame optic flow problem as the estimation of the coefficients that, multiplied with the known basis, give the displacement vectors for each pixel. We adopt a variational framework in which we optimise a non-linearised global brightness constancy to cope with large displacements and impose homogeneous regularization on the multi-frame motion basis coefficients. Our approach has two strengths: first, the dramatic reduction in the number of variables to be computed (typically one order of magnitude), which has obvious computational advantages; and second, the ability to deal with large displacements due to strong deformations. We conduct experiments on various sequences of non-rigid objects which show that our approach provides results comparable to state-of-the-art variational multi-frame optic flow methods.
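The subspace constraint at the heart of the method can be illustrated on synthetic tracks; the dimensions and data below are made up, and this sketch omits the variational brightness-constancy energy entirely:

```python
import numpy as np

rng = np.random.default_rng(2)

# Trajectories of P reliable tracks over F frames lie (by assumption)
# in a low-rank subspace; an SVD of the track matrix pre-estimates a
# rank-r 2D motion basis.
F, P, r = 10, 50, 3
basis_true = rng.standard_normal((2 * F, r))   # 2F x r motion basis
coeffs_true = rng.standard_normal((r, P))
tracks = basis_true @ coeffs_true              # 2F x P track matrix

U, S, Vt = np.linalg.svd(tracks, full_matrices=False)
basis = U[:, :r]                               # pre-estimated basis

# A new pixel's multi-frame displacement vector...
new_traj = basis_true @ rng.standard_normal(r)

# ...is recovered by solving for just r coefficients (least squares),
# instead of the 2F unknowns of unconstrained multi-frame flow.
c, *_ = np.linalg.lstsq(basis, new_traj, rcond=None)
recon = basis @ c
print(np.allclose(recon, new_traj))
```

The drop from 2F unknowns per pixel to r coefficients mirrors the order-of-magnitude reduction in variables the abstract reports.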
Visual flow fields are patterns of visual motion, expanding from the fovea, that arise on the retina when, during forward movement, we fixate a point toward which we are steering; they can be used to determine the direction of self-motion. The analysis of such flow fields becomes more difficult, however, when we additionally perform slow smooth-pursuit eye movements (SPEM): in this case the retinal flow field is composed of two components, one resulting from the forward self-motion (radial component) and one caused by the rotation of the eyes (horizontal component). To correctly estimate the direction of self-motion from the visual flow field, the expanding component, the radial flow, must therefore be isolated. After suitable stimulus conditions had been developed, psychophysical experiments were carried out with human participants to help answer the following questions:
1. What are the decisive signals that compensate for the flow-field distortions caused by SPEM?
2. Are extraretinal signals adjusted to match perceptual requirements?
A first series of experiments tested...
Humans are usually accurate when estimating heading or path from optic flow, even in the presence of independently moving objects (IMOs) in an otherwise rigid scene. To induce significant biases in perceived heading, IMOs have to be large and obscure the focus of expansion (FOE) in the image plane, which is the point of approach. For the estimation of path during curvilinear self-motion, no significant biases were found in the presence of IMOs. What makes humans robust in their estimation of heading or path using optic flow? We derive analytical models of optic flow for linear and curvilinear self-motion using geometric scene models. Heading biases of a linear least squares method built upon these analytical models are large, larger than those reported for humans. This motivated us to study segmentation cues that are available from optic flow. We derive models of accretion/deletion, expansion/contraction, acceleration/deceleration, local spatial curvature, and local temporal curvature, to be used as cues to segment an IMO from the background. Integrating these segmentation cues into our method of estimating heading or path now explains human psychophysical data and extends, as well as unifies, previous investigations. Our analysis suggests that various cues available from optic flow help to segment IMOs and...
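For the purely translational case, heading estimation can be sketched as a linear least-squares fit of the focus of expansion; this is a generic textbook formulation with hypothetical values, not the authors' exact model:

```python
import numpy as np

rng = np.random.default_rng(3)

# Image flow for pure translation through a rigid scene radiates from
# the focus of expansion (FOE); its location encodes the heading.
foe_true = np.array([0.1, -0.05])              # heading in image coords
pts = rng.uniform(-1, 1, (200, 2))             # sampled image points
depths = rng.uniform(1.0, 5.0, 200)            # unknown scene depths

# Translational flow: direction points away from the FOE, magnitude
# scales with inverse depth; add sensor noise.
flow = (pts - foe_true) / depths[:, None]
flow += 0.01 * rng.standard_normal(flow.shape)

# Each flow vector (vx, vy) at point p constrains the FOE e to its line:
#   vy * (px - ex) - vx * (py - ey) = 0
A = np.column_stack([flow[:, 1], -flow[:, 0]])
b = flow[:, 1] * pts[:, 0] - flow[:, 0] * pts[:, 1]
foe_est, *_ = np.linalg.lstsq(A, b, rcond=None)
print(foe_est)
```

With a rigid scene this fit is accurate; an unsegmented IMO would inject flow vectors violating the line constraints, which is how large heading biases arise in such linear methods.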
Optic flow, the pattern of apparent motion elicited on the retina during movement, has been demonstrated to be widely used by animals living in the aerial habitat, whereas underwater optic flow has not been intensively studied so far. However, optic flow would also provide aquatic animals with valuable information about their own movement relative to the environment, even under conditions in which vision is generally thought to be drastically impaired, e.g., in turbid waters. Here, we tested underwater optic flow perception for the first time in a semi-aquatic mammal, the harbor seal, by simulating a forward movement on a straight path through a cloud of dots on an underwater projection. The translatory motion pattern expanded radially out of a singular point along the direction of heading, the focus of expansion. We assessed the seal's accuracy in determining the simulated heading in a task in which the seal had to judge whether a cross superimposed on the flow field deviated from or was congruent with the actual focus of expansion. The seal perceived optic flow and detected deviations from the simulated heading with a threshold of 0.6 deg of visual angle. Optic flow is thus a source of information that seals, fish, and most likely aquatic species in general may rely on for, e.g., controlling locomotion and orientation under water. This leads to the notion that optic flow seems to be a tool universally used by any moving organism possessing eyes.
Optic flow informs moving observers about their heading direction. Neurons in monkey medial superior temporal (MST) cortex show heading selective responses to optic flow and planar direction selective responses to patches of local motion. We recorded MST neuronal responses to a 90 × 90° optic flow display and to a 3 × 3 array of local motion patches covering the same area. Our goal was to test the hypothesis that the optic flow responses reflect the sum of the local motion responses. The local motion responses of each neuron were modeled as mixtures of Gaussians, combining the effects of two Gaussian response functions derived using a genetic algorithm, and then used to predict that neuron's optic flow responses. Some neurons showed good correspondence between local motion models and optic flow responses, others showed substantial differences. We used the genetic algorithm to modulate the relative strength of each local motion segment's responses to accommodate interactions between segments that might modulate their relative efficacy during co-activation by global patterns of optic flow. These gain modulated models showed uniformly better fits to the optic flow responses, suggesting that coactivation of receptive field segments alters neuronal response properties. We tested this hypothesis by simultaneously presenting local motion stimuli at two different sites. These two-segment stimuli revealed that interactions between response segments have direction and location specific effects that can account for aspects of optic flow selectivity. We conclude that MST's optic flow selectivity reflects dynamic interactions between spatially distributed local planar motion response mechanisms.
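The local-motion modeling idea can be sketched as follows; the two-Gaussian tuning curves, segment layout, and gain terms are illustrative stand-ins for the fitted models described above:

```python
import numpy as np

rng = np.random.default_rng(4)

def tuning(theta, mu, w_pref=1.0, w_anti=0.3, sigma=40.0):
    # two-Gaussian direction tuning (degrees): a preferred-direction
    # lobe plus a weaker lobe 180 deg opposite
    d1 = (theta - mu + 180) % 360 - 180
    d2 = (theta - mu) % 360 - 180
    return (w_pref * np.exp(-d1**2 / (2 * sigma**2))
            + w_anti * np.exp(-d2**2 / (2 * sigma**2)))

# Eight surround segments of a 3 x 3 array (center omitted, since
# expansion flow is near zero there); under an expansion pattern each
# segment sees motion directed radially outward from the array center.
centers = np.array([(x, y) for y in (1, 0, -1) for x in (-1, 0, 1)
                    if (x, y) != (0, 0)], float)
local_dirs = np.degrees(np.arctan2(centers[:, 1], centers[:, 0])) % 360

prefs = rng.uniform(0, 360, 8)    # hypothetical preferred directions
gains = rng.uniform(0.5, 1.5, 8)  # hypothetical co-activation gains

seg_resp = tuning(local_dirs, prefs)
plain_sum = seg_resp.sum()        # local-motion-sum prediction
gain_sum = (gains * seg_resp).sum()  # gain-modulated prediction
print(plain_sum, gain_sum)
```

The comparison between the plain sum and the gain-modulated sum is the crux of the abstract's argument: if the two differ systematically from measured flow responses, segment interactions beyond simple summation are implicated.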
The present study explored whether the optic flow deficit in Alzheimer’s disease (AD) reported in the literature transfers to different types of optic flow, in particular one that specifies collision impacts with upcoming surfaces, with a special focus on the effect of retinal eccentricity. Displays simulated observer movement over a ground plane toward obstacles lying in the observer’s path. Optical expansion was modulated by varying τ˙. The visual field was masked either centrally (peripheral vision) or peripherally (central vision) using masks ranging from 10° to 30° in diameter in steps of 10°. Participants were asked to indicate whether their approach would result in “collision” or “no collision” with the obstacles. Results showed that AD patients’ sensitivity to τ˙ was severely compromised, not only for central vision but also for peripheral vision, compared to age- and education-matched elderly controls. The results demonstrate that AD patients’ optic flow deficit is not limited to radial optic flow but also includes the optical pattern engendered by τ˙. Deterioration in the capacity to extract τ˙ to determine potential collisions, in conjunction with the inability to extract heading information from radial optic flow, would further exacerbate AD patients’ difficulties in navigation and visuospatial orientation.
In birds, the nucleus of the basal optic root (nBOR) of the accessory optic system (AOS) and the pretectal nucleus lentiformis mesencephali (LM) are involved in the analysis of optic flow and the generation of the optokinetic response. In several species, it has been shown that the AOS and pretectum receive input from visual areas of the telencephalon. Previous studies in pigeons using anterograde tracers have shown that both nBOR and LM receive input from the visual Wulst, the putative homolog of mammalian primary visual cortex. In the present study, we used retrograde and anterograde tracing techniques to further characterize these projections in pigeons. After injections of the retrograde tracer cholera toxin subunit B (CTB) into either LM or nBOR, retrograde labeling in the telencephalon was restricted to the hyperpallium apicale (HA) of the Wulst. From the LM injections, retrograde labeling appeared as a discrete band of cells restricted to the lateral edge of HA. From the nBOR injections, the retrograde labeling was more distributed in HA, generally dorsal and dorso-medial to the LM-projecting neurons. In the anterograde experiments, biotinylated dextran amine (BDA) was injected into HA and individual axons were reconstructed to terminal fields in the LM and nBOR. Those fibers projecting to the nBOR also innervated the adjacent ventral tegmental area. However...