Xray Eyeglasses: Mixed-reality Goggles Enhancing Situational Awareness of Astronauts and Soldiers
The long-term goal of this project is to leverage mixed reality to integrate human capability and machine intelligence in the space and defense domains. The specific goal of this undergraduate research project is to explore using the Microsoft HoloLens to enhance the situational awareness of astronauts and soldiers, in particular by allowing them to "see through" obstacles such as the walls of the ISS or the fog of war. Frederica Finney-Long and Russel Evangelista are currently working on this project.
LegoDrones: Reconfigurable UAVs for Future Transportation and Space Exploration
One of the challenges we face is that current UAVs are not flexible enough to conduct a wide range of missions. Imagine the following scenario: we have built a human habitat on Mars, and we need to design a UAV (or a group of UAVs) that can transport either a small container of soil samples or a heavy rover back to the base. With the state-of-the-art design methodology, we would need to design one type of drone for each mission, yet a UAV optimized for the former mission might be quite inefficient, or completely useless, for the latter. The long-term goal of this project is to develop a swarm of UAVs that adapts to a wide range of missions; for instance, a single UAV might handle the former mission, while four smaller, identical UAVs could be assembled into a larger one for the latter. The specific goal of this capstone project is to develop and demonstrate a mechanical and power mechanism that allows two quadrotor UAVs to attach and detach. The students working on this project are Lea Serrar, Jonathan Doyle, William Michael Fish, Taylor Parker, and Diego Valenzuela.
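The sizing trade-off above can be illustrated with a minimal sketch: given a payload mass, how many identical quadrotor modules must be joined so the assembly can lift it? The per-module mass, thrust capacity, and safety margin below are illustrative assumptions, not parameters from the actual project.

```python
def units_needed(payload_kg, unit_mass_kg=1.2, unit_max_thrust_kg=3.0, margin=1.5):
    """Number of identical quadrotor modules needed to lift a payload.

    Each added module contributes thrust but also its own mass; the
    assembly must carry the total with a safety margin left for
    maneuvering. All parameters are illustrative, not measured values.
    """
    for n in range(1, 65):
        total_mass_kg = payload_kg + n * unit_mass_kg
        if n * unit_max_thrust_kg >= margin * total_mass_kg:
            return n
    raise ValueError("payload too heavy for this module design")
```

With these example numbers, a light soil-sample container needs only one module, while a heavy rover forces a much larger assembly, which is exactly the flexibility a reconfigurable swarm provides.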
RoboBees: Drone-based Pollinators
The long-term goal of this project is to develop a swarm of drone-based pollinators (RoboBees) that can effectively and efficiently pollinate an agricultural field without any human intervention. For instance, the RoboBees must be able to autonomously detect target plants, plan optimal routes, mitigate the effects of wind, and return to their base (beehive) to resupply and recharge. The specific goal of this senior design project was to develop a drone-based mechanical pollinator that uses the downdraft of the drone to precisely release pollen grains onto the target plants. A senior design team of five MAE undergraduates, Kaden Jeppesen, Kevin Mclaughlin, Bradley John Pluschkell, Vishal Jay Sharma, and Trevor Andrew Vidano, worked on this project. The project was co-supervised by Prof. Mohsen Mesgaran from the Department of Plant Sciences.
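The route-planning task mentioned above can be sketched with a simple nearest-neighbor heuristic: from the hive, repeatedly fly to the closest unvisited plant. This is an assumption about one possible approach, not the project's actual planner, and the heuristic is fast but not guaranteed optimal.

```python
import math

def greedy_route(hive, plants):
    """Plan a visiting order over target plant coordinates: from the
    current position, always fly to the nearest unvisited plant.
    A nearest-neighbor heuristic, not an optimal tour."""
    route, pos, todo = [], hive, list(plants)
    while todo:
        nxt = min(todo, key=lambda p: math.dist(pos, p))
        todo.remove(nxt)
        route.append(nxt)
        pos = nxt
    return route
```

For example, `greedy_route((0, 0), [(5, 0), (1, 0), (2, 0)])` visits the plants in the order `(1, 0)`, `(2, 0)`, `(5, 0)` before the drone returns to the hive to resupply.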
Controlling a Robot with Your Brain!
The goal of the project is to build an EEG-controlled robot that can perform simple tasks, such as navigating a 3D environment with multiple obstacles. A wheeled robot, Create 2 from iRobot, has been used as the robotic platform; in the future, quadcopters will also be used. Tools from signal processing and machine learning will be implemented to decode the operator's intention. The project is currently being carried out by a group of undergraduate researchers from UC Davis, including Eric Po, Gabriel Simmons, Jonathan Summer, Anthony Thai, Hong Truong, Albert Yeh, and Chenxiong Yi. Alumni of the project include Ariana Jagodzinski, Eric Chavez, Brett Bacharach, and Guilherme Piovezan Otto, as well as a high school student, Nikhil Gupta from Mira Loma High School in Sacramento.
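One common signal-processing building block for decoding intent from EEG is band power, e.g. in the 8-12 Hz mu rhythm, which attenuates during imagined movement. The sketch below, a minimal FFT-based estimate using only NumPy, shows the idea; it is an illustrative assumption about the pipeline, not the project's actual decoder.

```python
import numpy as np

def band_power(signal, fs, band):
    """Average spectral power of one EEG channel within a frequency band.

    signal: 1-D array of samples; fs: sampling rate in Hz;
    band: (low_hz, high_hz). Band-power features like this are a
    common input to classifiers that decode the operator's intention.
    """
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()
```

A pure 10 Hz test tone sampled at 250 Hz, for instance, yields far more power in the 8-12 Hz band than in 20-30 Hz, which is the kind of contrast a classifier would threshold or learn.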
A Robot Photographer
The goal of the project was to build a robot capable of following a person around and taking pictures of them upon request. A wheeled robot, Create 2 from iRobot, was used as the robotic platform. The project was conducted by a group of MAE and CS undergraduate students, Wilhelmena Figueiredo, Anthony Thai, and Siruis Zhang, led by Wilhelmena Figueiredo.
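Person-following on a differential-drive base like the Create 2 is often done with a simple proportional controller: steer to keep the person centered in the camera image and drive to hold a set distance. The sketch below illustrates that idea; the gains, image width, and target distance are illustrative assumptions, not the project's tuned values.

```python
def follow_controller(offset_px, distance_m,
                      target_dist_m=1.5, k_turn=0.005, k_fwd=0.6):
    """Proportional person-following controller.

    offset_px: horizontal pixel offset of the person from image center
    distance_m: estimated range to the person
    Returns (forward_speed, turn_rate); gains are illustrative.
    """
    turn = k_turn * offset_px                    # steer toward the person
    forward = k_fwd * (distance_m - target_dist_m)  # close or open the gap
    return forward, turn
```

When the person is centered and at the target distance, both commands go to zero and the robot holds position, ready to take a picture on request.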
3D Printing Micro-UAVs
This was an MAE senior design project. The goal of the project was to use 3D printing to build a VTOL (vertical takeoff and landing) aircraft capable of performing basic maneuvers, such as hovering. The project was conducted by three MAE undergraduate students, Kyle Hossli, Mike Liszka, and Mark Wong. It was co-supervised by Dr. Barbara Linke. The project was selected as one of the three finalists for the Sandia Design Award.
Dancing with a Quadcopter
The goal of the project was to build a quadcopter capable of dancing with a human partner. A Kinect sensor was used to recognize the human partner's gestures, and the quadcopter responded with artful motions. The project was conducted by a group of seven MAE undergraduate students, Eric Chavez, Ariana Jagodzinski, Salim Hasin, Jimmy Tran, Tung Tran, Hong Truong, and Albert Yeh, led by Jimmy Tran.
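The gesture-to-motion mapping can be sketched by comparing Kinect skeleton joint heights: raised hands trigger a corresponding quadcopter move. The joint names, thresholds, and command labels below are illustrative assumptions, not the project's actual gesture set.

```python
def classify_gesture(skeleton):
    """Map a Kinect-style skeleton frame to a dance command.

    skeleton: dict of joint name -> (x, y, z), with y pointing up.
    Joint names and the resulting commands are illustrative only.
    """
    wrist_r_y = skeleton["wrist_right"][1]
    wrist_l_y = skeleton["wrist_left"][1]
    head_y = skeleton["head"][1]
    if wrist_r_y > head_y and wrist_l_y > head_y:
        return "ascend"      # both hands above the head -> climb
    if wrist_r_y > head_y:
        return "spin_right"  # one hand raised -> pirouette that way
    if wrist_l_y > head_y:
        return "spin_left"
    return "hover"           # no gesture -> hold position
```

In a full system, each command would be translated into a smooth trajectory so the quadcopter's response reads as artful motion rather than an abrupt maneuver.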