Research-related videos
Below, you will find a selection of short videos illustrating some of the work going on in my lab!
Modeling Deer-Vehicle Interactions to Improve Intelligent Vehicle Safety
The video below shows a simulation of a deer-vehicle interaction. The vehicle is simulated using a planar vehicle model with nonlinear tires, steered by an MPC-based collision avoidance system. See Font and Brown (2020) for more information.
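For readers unfamiliar with planar vehicle models, the sketch below shows the general flavor of one: a minimal kinematic bicycle model stepped forward with Euler integration. This is a deliberately simplified illustration, not the nonlinear-tire model or MPC controller from Font and Brown (2020); the wheelbase and time step values are arbitrary placeholders.

```python
import math

def bicycle_step(x, y, psi, v, delta, a, L=2.7, dt=0.01):
    """One Euler step of a kinematic bicycle model (illustrative only).

    x, y: position (m); psi: heading (rad); v: speed (m/s)
    delta: steering angle (rad); a: longitudinal acceleration (m/s^2)
    L: wheelbase (m, placeholder value); dt: time step (s)
    """
    x += v * math.cos(psi) * dt          # advance position along heading
    y += v * math.sin(psi) * dt
    psi += (v / L) * math.tan(delta) * dt  # yaw rate from steering angle
    v += a * dt                           # update speed
    return x, y, psi, v

# Drive straight at 20 m/s for one second of simulated time
state = (0.0, 0.0, 0.0, 20.0)
for _ in range(100):
    state = bicycle_step(*state, delta=0.0, a=0.0)
```

An MPC collision avoidance system would repeatedly roll a model like this forward over a short horizon, choosing the steering and acceleration inputs that best avoid the predicted deer position.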
Fish-Robot Interaction
The video below shows an interaction between our robotic fish platform (see Utter and Brown, 2020, for more information) and a group of three banded archerfish. A target appears for the fish at the top left corner of the tank. The robot reacts to the appearance of the target and engages in archerfish-like hunting behavior to elicit kleptoparasitism and/or other social behaviors in the fish. Fish and robot are recorded with a monocular camera and tracked in post-processing using a YOLOv4 object detection system coupled with a multi-target Kalman filter. We are interested in manipulating the robot's embodiment and goal-directed motion to explore what facets of the robot's behavior are important in eliciting social hunting behavior in the fish.
Results from this experiment are being prepared for an invited submission to Biological Cybernetics as of Spring 2021.
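To give a flavor of the tracking pipeline, the sketch below shows the predict/update cycle at the heart of a Kalman filter, reduced to a single scalar state with a random-walk motion model. Our actual system tracks multiple targets with richer state from YOLOv4 detections; the noise variances here are arbitrary illustrative values.

```python
def kalman_update(x, p, z, r, q):
    """One predict+update cycle of a scalar Kalman filter.

    x, p: state estimate and its variance
    z, r: measurement and its variance
    q: process noise variance (random-walk motion model)
    All values here are illustrative placeholders.
    """
    # Predict: state is unchanged under a random walk, uncertainty grows
    p = p + q
    # Update: blend the prediction with the new measurement
    k = p / (p + r)          # Kalman gain
    x = x + k * (z - x)      # move estimate toward measurement
    p = (1 - k) * p          # shrink uncertainty accordingly
    return x, p

# Track a (roughly stationary) target from four noisy measurements
x, p = 0.0, 1.0
for z in [1.2, 0.9, 1.1, 1.0]:
    x, p = kalman_update(x, p, z, r=0.5, q=0.01)
```

In a multi-target setting, one such filter runs per tracked animal, with each new detection assigned to the filter whose prediction it best matches.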
Modeling Goal-Directed Aiming in Humans
The video below shows a single pilot trial in a study of how humans aim at a series of targets that move in complex, difficult-to-predict trajectories (see Brown 2016 and Steinberg, Brown, and Schettino 2019 for more info). Competitive marksmanship tasks were chosen to study human aiming because they demand both target tracking (point) and a decision to act (click), while exercising full-body biomechanics and hand-eye coordination. During these experiments, subject gaze was tracked using a binocular eye tracking system, and shot locations were tracked using an infrared camera built into the target simulator.
Driving Simulator Development
The video below showcases some of the functionality of the immersive driving simulator I developed collaboratively with Lafayette mechanical engineering students in 2015-2016. It is built around a vehicle simulation core and motion cueing algorithm that were both developed in-house, runs on a 6-DOF motion platform, and uses an OSVR open-source virtual reality headset for visual rendering.
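The details of our in-house motion cueing algorithm are not covered here, but classical washout-style cueing schemes share a common core: high-pass filtering the simulated vehicle acceleration so the platform reproduces transients while sustained accelerations "wash out" within its limited travel. The sketch below shows that core idea with a first-order high-pass filter; the cutoff frequency and step input are illustrative assumptions, not our algorithm's actual parameters.

```python
import math

def high_pass(signal, dt, fc):
    """First-order high-pass filter (illustrative washout stage).

    signal: list of acceleration samples (m/s^2)
    dt: sample period (s); fc: cutoff frequency (Hz, placeholder)
    Transients pass through; sustained inputs decay toward zero.
    """
    rc = 1.0 / (2.0 * math.pi * fc)
    alpha = rc / (rc + dt)
    out = [signal[0]]
    for i in range(1, len(signal)):
        # Standard discrete first-order high-pass recurrence
        out.append(alpha * (out[-1] + signal[i] - signal[i - 1]))
    return out

# A sustained 1 m/s^2 step: the platform command spikes, then decays,
# returning the motion base toward its neutral position.
step = [0.0] * 10 + [1.0] * 200
cue = high_pass(step, dt=0.01, fc=0.5)
```

A full washout algorithm adds scaling, tilt coordination (using slow platform tilt to mimic sustained acceleration via gravity), and rotational channels on top of this filtering stage.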