Research-related videos

Below, you will find a selection of short videos illustrating some of the work going on in my lab!

Modeling and Experimentally Validating Motorcycle Dynamics

There are over 600 million motorcycles on the road globally, and, at least in the United States, motorcycles account for a disproportionate share of fatal accidents. Making motorcycles safer by developing policies, infrastructure, and rider-assist technology is important, but experiments with human riders carry substantial risk. In a perfect world, most of the technology, infrastructure, and policy development could be done using high-fidelity computer simulations. The problem is that commercial simulation packages are extremely expensive and relatively inflexible: productive simulation work requires detailed manipulation of the virtual motorcycle, its rider, traffic, any advanced safety systems on the bike, and road surfaces, to name just a few things. In my lab, we have developed a flexible motorcycle simulation based on free, open-source software to predict dynamic motorcycle behavior. To validate the software and ensure that it predicts motorcycle dynamics accurately, we have built a robotic, self-driving electric motorcycle in the lab that will be capable of keeping itself upright and steering automatically to follow paths defined by GPS points.

With the help of research students Bryson Kronheim, Sam Milhaven, Ben Arky, Paris Francis, and Wenjia Li, my lab’s latest robot, a self-driving electric minibike, is now balancing and steering on its own! Based on a Razor MX350, the bike features a 3 N·m steering motor along with custom sensors and motor controllers, all coordinated by a Raspberry Pi 4 running ROS2.
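The core balancing idea is steer-to-balance: steer toward the direction of the fall so the contact patch moves back under the center of mass. Below is a minimal sketch of that loop on a linearized inverted-pendulum motorcycle model. All parameters and gains here are illustrative guesses for demonstration, not the MX350's actual values or our actual controller.

```python
# Steer-to-balance sketch on a linearized inverted-pendulum model.
# All parameters are illustrative, not measured from the real bike.
g = 9.81            # gravity, m/s^2
h = 0.6             # center-of-mass height, m (assumed)
b = 1.0             # wheelbase, m (assumed)
v = 3.0             # forward speed, m/s (assumed)
kp, kd = 4.0, 1.0   # PD gains on roll angle and roll rate (assumed)

phi, phi_dot = 0.1, 0.0   # initial roll perturbation, rad
dt = 0.001
for _ in range(3000):     # simulate 3 seconds
    delta = kp * phi + kd * phi_dot            # steer into the fall
    # Linearized roll dynamics: gravity tips the bike over,
    # steering-induced lateral acceleration rights it.
    phi_ddot = (g * phi - v**2 * delta / b) / h
    phi_dot += phi_ddot * dt
    phi += phi_dot * dt

print(f"roll after 3 s: {phi:.6f} rad")   # perturbation dies out
```

With these (assumed) gains the closed-loop roll dynamics are overdamped, so the initial 0.1 rad lean decays smoothly back to upright.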

Collecting data about the bike during real-life experiments will let us see how accurately our simulation software can replicate how the motorcycle behaves. Once we have confidence in our simulation software, we can use it to tackle more complex motorcycle safety issues. Development and evaluation of Advanced Rider Assistance Systems (ARAS) is one of our long-term goals. These days, premium motorcycles are beginning to hit the market with features like adaptive cruise control and lean-sensitive anti-lock brakes. Motorcycles’ dynamic complexity makes developing this type of technology challenging, so it is a rich area for growth in the next decade or so as we work to make these advanced safety systems perform better and become more accessible and affordable. The video below shows our progress as of August 2023.

Modeling Deer-Vehicle Interactions to Improve Intelligent Vehicle Safety

The video below shows a simulation of a deer-vehicle interaction. The vehicle is simulated using a planar vehicle model with nonlinear tires under the direction of an MPC-based collision avoidance system. See Font and Brown (2020) for more information.
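The receding-horizon idea behind MPC-style collision avoidance can be illustrated with a very crude sketch: roll a set of candidate steering inputs forward through a vehicle model and keep the one that stays farthest from the obstacle. This sketch uses a kinematic bicycle model rather than the nonlinear-tire planar model from Font and Brown (2020), and all numbers are illustrative.

```python
import math

def bicycle_step(x, y, psi, v, delta, L=2.7, dt=0.05):
    # One Euler step of a kinematic bicycle model (wheelbase L assumed).
    x += v * math.cos(psi) * dt
    y += v * math.sin(psi) * dt
    psi += (v / L) * math.tan(delta) * dt
    return x, y, psi

def pick_steer(state, obstacle, candidates, steps=20):
    # Roll each constant steering candidate over a short horizon and
    # keep the one that stays farthest from the obstacle, with a small
    # penalty on steering effort (a toy stand-in for an MPC cost).
    def cost(delta):
        x, y, psi, v = state
        min_d = math.dist((x, y), obstacle)
        for _ in range(steps):
            x, y, psi = bicycle_step(x, y, psi, v, delta)
            min_d = min(min_d, math.dist((x, y), obstacle))
        return -min_d + 0.5 * abs(delta)
    return min(candidates, key=cost)

state = (0.0, 0.0, 0.0, 20.0)   # x, y, heading, speed (illustrative)
deer = (30.0, 0.0)              # obstacle directly ahead
delta = pick_steer(state, deer, [-0.3, -0.15, 0.0, 0.15, 0.3])
```

With a deer directly ahead, the zero-steer candidate closes to within a car length, so the picker chooses a hard swerve; a real MPC optimizes a full input sequence subject to dynamics and actuator constraints rather than sampling constant inputs.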

Fish-Robot Interaction

The video below shows an interaction between our robotic fish platform (see Utter and Brown, 2020 for more information) and a group of three banded archerfish. A target appears for the fish at the top left corner of the tank. The robot reacts to the appearance of the target and engages in archerfish-like hunting behavior to elicit kleptoparasitism and/or other social behaviors in the fish. Fish and robot are recorded with a monocular camera, and the footage is post-processed using a YOLOv4 object detection system coupled with a multi-target Kalman filter. We are interested in manipulating the robot’s embodiment and goal-directed motion to explore what facets of the robot’s behavior are important in eliciting social hunting behavior in the fish.
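The building block of such a tracker is a constant-velocity Kalman filter run per target: predict each track forward with a motion model, then correct it with the nearest detection. The sketch below shows a single track; the frame rate, noise covariances, and synthetic detections are illustrative guesses, not our pipeline's tuning.

```python
import numpy as np

dt = 1 / 30  # camera frame period (assumed)
# Constant-velocity model: state = [x, y, vx, vy] in pixels, px/s
F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], float)
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]], float)   # detections give position only
Q = 1e-3 * np.eye(4)                  # process noise (tuning guess)
R = 4.0 * np.eye(2)                   # detection noise (tuning guess)

x = np.zeros(4)            # initial state
P = 100.0 * np.eye(4)      # large initial uncertainty

def kf_step(x, P, z):
    # Predict with the motion model, then correct with detection z.
    x = F @ x
    P = F @ P @ F.T + Q
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(4) - K @ H) @ P
    return x, P

# Feed synthetic YOLO-style detections of a fish swimming right
# at 2 px/frame; the estimate converges onto the track.
for k in range(60):
    z = np.array([2.0 * k, 50.0])
    x, P = kf_step(x, P, z)
```

In the multi-target case, detections are assigned to tracks (e.g. by nearest-neighbor gating) before each correction step, and the filter's predictions carry a track through frames where a fish is briefly occluded.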

Results from this experiment are being prepared for an invited submission to Biological Cybernetics as of Spring 2021.

Modeling Goal-Directed Aiming in Humans

The video below shows a single pilot trial in a study that looked at how humans aim at a series of targets that move in complex, difficult-to-predict trajectories (see Brown 2016 and Steinberg, Brown, and Schettino 2019 for more info). Competitive marksmanship tasks were chosen to study human aiming because they demand both target tracking (point) and a decision to act (click), while engaging full-body biomechanics and hand-eye coordination. During these experiments, subject gaze was tracked using a binocular eye tracking system, and shot locations were tracked using an infrared camera built into the target simulator.


Driving Simulator Development

The video below showcases some of the functionality of the immersive driving simulator I developed collaboratively with Lafayette mechanical engineering students in 2015-2016. It features a vehicle simulation core and a motion cueing algorithm that were both developed in-house. The system includes a 6-DOF motion platform and uses an OSVR open-source virtual reality headset for visual rendering.
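One channel of a classical washout-style motion cueing scheme can be sketched as a high-pass filter on the simulated acceleration: the platform surges briefly to convey an acceleration onset, then "washes out" back toward center so it never runs out of travel. This is only a one-channel illustration with made-up constants, not the structure or tuning of the in-house algorithm.

```python
# One-channel washout sketch: high-pass filter the simulated
# longitudinal acceleration to produce a transient platform cue.
# Filter constant and inputs are illustrative, not the real tuning.
dt = 0.01
tau = 0.5                   # washout time constant, s (assumed)
alpha = tau / (tau + dt)    # discrete first-order high-pass coefficient

cue, prev_a = 0.0, 0.0
history = []
for k in range(300):        # 3 seconds at 100 Hz
    a = 2.0 if k < 100 else 0.0       # step in simulated acceleration
    cue = alpha * (cue + a - prev_a)  # high-pass filtered cue
    prev_a = a
    history.append(cue)
```

The step onset produces a strong initial cue that decays toward zero within about a second, and releasing the pedal produces an opposite-sign cue; in a full algorithm, sustained accelerations are additionally conveyed by tilt coordination, slowly pitching the platform so gravity mimics the steady component.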