FRC

The ever-growing applications of Unmanned Aerial Vehicles (UAVs) require UAVs to navigate at low altitudes, below 2000 feet. Traditionally, a UAV is equipped with a single GPS receiver. When flying at low altitude, a single GPS receiver may receive signals from fewer than four GPS satellites in the partially visible sky, which is not sufficient to perform trilateration. In such a situation, GPS coordinates become unavailable and the partial GPS information is discarded. A GPS receiver may also suffer from multipath errors, making the navigation solution inaccurate and unreliable.
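To see why four satellites are the usual minimum, note that each pseudorange constrains four unknowns: the receiver's three position coordinates and its clock bias. The sketch below illustrates this with invented satellite positions and pseudoranges; it is my own illustration, not material from the talk.

```python
import numpy as np
from scipy.optimize import least_squares

# Four pseudoranges for four unknowns (x, y, z, clock bias b); with fewer than
# four measurements the system below would be underdetermined. All numbers are
# made up for illustration.
sats = np.array([[15600e3,  7540e3, 20140e3],
                 [18760e3,  2750e3, 18610e3],
                 [17610e3, 14630e3, 13480e3],
                 [19170e3,   610e3, 18390e3]])
pseudoranges = np.array([21.0e6, 20.3e6, 21.5e6, 20.8e6])

def residuals(state):
    x, y, z, b = state
    predicted = np.linalg.norm(sats - np.array([x, y, z]), axis=1) + b
    return predicted - pseudoranges

solution = least_squares(residuals, x0=np.zeros(4))
x, y, z, clock_bias = solution.x
```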

In this talk, we present our recent work on UAV navigation using not one but multiple GPS receivers, either on the same UAV or across different UAVs, fused with other navigation sensors such as IMUs and vision. We integrate and make use of the partial GPS information from peer GPS receivers and are able to dramatically improve GPS availability. We apply advanced filtering algorithms to multiple GPS measurements on the same UAV to mitigate multipath errors. Furthermore, multiple UAVs equipped with on-board communication capabilities can cooperate by forming a UAV network to further improve navigation accuracy, reliability, and security.

Grace Xingxin Gao is an assistant professor in the Aerospace Engineering Department at the University of Illinois at Urbana-Champaign. She obtained her Ph.D. degree in Electrical Engineering from the GPS Laboratory at Stanford University in 2008. Before joining Illinois at Urbana-Champaign as an assistant professor in 2012, Prof. Gao was a research associate at Stanford University.

Prof. Gao has won a number of awards, including the RTCA William E. Jackson Award and the Institute of Navigation Early Achievement Award. She was named one of 50 GNSS Leaders to Watch by GPS World magazine. She has won Best Paper/Presentation of the Session awards 10 times at ION GNSS+ conferences. For her teaching, Prof. Gao has appeared on the List of Teachers Ranked as Excellent by Their Students at the University of Illinois multiple times. She won the College of Engineering Everitt Award for Teaching Excellence at the University of Illinois at Urbana-Champaign in 2015 and was chosen as the American Institute of Aeronautics and Astronautics (AIAA) Illinois Chapter’s Teacher of the Year in 2016.

Lava tubes are caves beneath the lunar surface. Until recently, lunar cave exploration was impossible, since there was no known way to enter the closed tubes. Large holes, or "pits," have recently been discovered from orbit, and some of them appear to offer robotic access to caves.

Space agencies and private institutions plan to visit these pits and investigate the caves as potential lunar habitat sites.

My research has investigated rover configuration, mobility, electronics, power, and operations for exploring lunar pits and caves with small robots. I will present some of my PhD research related to these issues.

John Walker completed his aerospace PhD at Tohoku University in 2016. He earned his Mechanical Engineering degree at the University of Alberta in 2005, and in 2010 he attended the International Space University in Strasbourg, France. This was followed by an internship at the Space Robotics Lab at Tohoku University in Japan, where he began lunar rover research to support Hakuto, a leading Google Lunar X-Prize team. He joined Hakuto officially as the rover development leader and completed his PhD in the Space Exploration Lab with research on lunar cave exploration robots.

Autel Robotics was founded in 2014, based out of Shenzhen, China, with a presence in Seattle and Silicon Valley in the USA and in Munich, Germany. Autel, the parent company, is one of the world's leading manufacturers and suppliers of professional diagnostic tools, equipment, and accessories for the automotive aftermarket. In mid-2014, we established Autel Robotics, dedicated to delivering ground-breaking solutions for aerial exploration through our quadcopter and camera drone technology.

We are a team of industry professionals with a genuine passion for technology and years of engineering experience. We focus on transforming complex technology into simple solutions, creating easy-to-use aerial devices for photography, filming, and imaging. Our current products include the X-Star, a consumer quadcopter with an integrated 4K stabilized camera and indoor positioning system; it is a combination of complex algorithms and advanced engineering delivered in one concise package. We also have the Kestrol, a tilt-rotor UAV that combines a fixed-wing airplane’s energy-saving design with a quadcopter's VTOL ability.

Please RSVP to ensure an accurate headcount.

About the Speakers:

Xinye Liu: Xinye is currently the Sr. Product Manager at Autel Robotics. She entered the drone industry right after graduating from Carnegie Mellon University in 2014. Since then, Xinye has combined her solid engineering background with business strategy, successfully leading multiple projects including consumer quadcopters and other unmanned aerial vehicles. As an experienced drone industry professional, Xinye will share her experience from her past two years in the industry.

Angela Wang: Angela is currently the Global Overseas HR Director of Autel Robotics. Angela has over 20 years of experience at large multinational companies, including Procter & Gamble, Dell, and Symantec, working in an international capacity in China, Canada, and the USA. In recent years she has served in HR Business Partner roles, helping business leaders develop and implement people and organization development strategies. Angela joined Autel in February 2016, is based in Silicon Valley, and is responsible for human resources and talent management across all Autel locations outside of China.

We hope to see you there!

Branching structures are ubiquitous in many environments on Earth, from trees found in nature to man-made trusses and power lines. Navigating such environments is a difficult challenge for robots ill-equipped to handle the task. In nature, apes solve locomotion through these environments with a process called brachiation, in which movement is performed by hand-over-hand swinging.

This thesis outlines the development of a two-link brachiating robot. We will present our work on implementing an energy-based controller that injects energy into or removes energy from the system before the robot assumes the grasping posture. We will show that the controller can solve the ladder problem and the swing-up for continuous-contact brachiating gaits, and we compare it to other control approaches in simulation. We will also present our work on developing a real-world brachiating robot and show the implementation of our controller on this robot.
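As a rough, hypothetical illustration of energy-based swing control (the thesis uses a two-link model and its own control law; the single-pendulum pump law below is my own sketch, not the controller presented in the talk):

```python
import numpy as np

# Hypothetical energy-pumping torque for a single-pendulum abstraction of the
# swinging arm. E_des is the energy of the desired grasping state; torquing in
# phase with the angular velocity adds energy when E < E_des and removes it
# when E > E_des. All parameters are illustrative only.
def energy_pump_torque(theta, theta_dot, E_des, m=5.0, l=0.6, g=9.81, k=2.0):
    E = 0.5 * m * (l * theta_dot) ** 2 - m * g * l * np.cos(theta)  # current energy
    return k * (E_des - E) * np.sign(theta_dot)
```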

Zongyi Yang is an M.S. student in the Robotics Institute advised by David Wettergreen. He received a B.S. in Engineering Science (ECE Option) from the University of Toronto in 2014. His current research focuses on robot brachiation.

Committee Members:
David Wettergreen (Advisor)
Hartmut Geyer
Nitish Thatte

Globally, horticulture is facing many challenges. The most significant of these include scalability to meet growing food demands, the impact of constantly increasing labour requirements, produce loss (waste), yield security, and hygiene. Most people can conceptualise how robotics will assist with many labour-intensive horticultural roles to minimise or replace the direct labour requirement. This will ultimately help with other challenges like scalability, yield security, hygiene, and food security, but can the disruptive impact of robotics go beyond this? We think it will!

Each year around 50% of all produce is wasted. Taking an example from the kiwifruit industry, we look at how the integration of MARS (mechanisation, automation, robotics and sensors) technologies through the value chain can minimise waste, further adding to the scalability and food security benefits of robotics. Two case studies of technologies we have developed to deliver these benefits are presented: an autonomous orchard robot for tasks such as harvesting and pollination, and a robotic apple packer that integrates into current packhouse systems.

Steven Saunders (Ngai Te Ahi) has 30 years’ experience in the horticultural sector and is the founder, owner, and Managing Director of the Plus Group of companies, specialising in horticulture management consultancy, global pollen production, robotics development, soil consultancy, international ventures, applied technology, research and development, innovation, and science.

Steven is the co-founder of Newnham Park Innovation Centre in Te Puna, Tauranga, which hosts eight local export-award-winning independent companies, predominantly in the food sector. Steven is also a major stakeholder in the kiwifruit postharvest sector, an active angel investor (a number of his angel investments are food based), a Seed Co-Investment Fund (SCIF) director and investor representative, a director of a number of privately owned companies, an elected member of the executive board of Priority One (driving economic growth in the Bay of Plenty), a board member of Enterprise Angels Tauranga, a Crown-appointed director of Landcare Research, and a member of the Stanford primary sector alumni.
Pollen Plus Ltd won the 2010 Bay of Plenty (BOP) Emerging Exporter of the Year award and Gro Plus won the 2007 Environment BOP Gallagher Innovation award and the 2007 Balance Nutrient Management award.

Steven is a founder of the “WNT Ventures” tech incubator, which was one of the three NZ Callaghan-awarded tech incubators (a partnership between private-sector investment and the government). Robotics Plus was awarded a $10 million targeted research grant in a collaboration between Auckland University, Waikato University, and Plant and Food Research. Robotics Plus features in the NZTE New Zealand Story.

The Plus Group (beneficially owned by the Saunders Family Trust) has a strong history of supporting New Zealand based research and development activities, including Vision Mātauranga. This support extends well beyond co-funding several government-assisted projects, directly into self-funding research in the commercial and Maori environment. Steven was awarded the Tauranga Chamber of Commerce Westpac “Business Leadership Award” in 2014.

Today’s planetary robotic exploration is carried out by large, lumbering rovers. Due to the expense of such rovers, the resulting missions are risk-averse. Small, high-cadence, minimalist rovers are poised to break new ground by expanding space exploration capabilities. Whether by decreasing overall mission costs or by enabling symbiotic exploration among multiple low-cost rovers, these minimalist rovers can allow missions to be more risk-tolerant and to more rapidly explore areas previously deemed too treacherous.

Existing methods of localization and route determination of planetary rovers are expensive, both in computational time and power requirements. As a result, they are limited in their speed and performance even on today’s large, expensive rovers. The ability to quickly and efficiently estimate a rover’s route becomes even more crucial as the size, mass, computation, and power budgets continue to shrink. A method for inexpensive route determination will enable safer, faster, and smarter navigation of these minimalist rovers.

This research presents a novel approach to computationally efficient visual odometry with an unactuated, downward-looking monocular fisheye camera, which is feasible for minimalist rovers from both a computational and an electromechanical standpoint. This new visual odometry approach is combined with a sun compass and pose graph optimization to provide high-fidelity route determination.
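A toy version of the pose-graph idea, in 2D and with invented weights (not the formulation used in this research), combines relative visual-odometry steps and absolute sun-compass headings in a single least-squares problem:

```python
import numpy as np
from scipy.optimize import least_squares

# Each pose is (x, y, theta); visual odometry supplies relative motions between
# consecutive poses and the sun compass supplies noisy absolute headings.
# Angle wrap-around is ignored for brevity; weights w_vo and w_sun are invented.
def residuals(flat_poses, vo, sun, w_vo=1.0, w_sun=0.5):
    poses = flat_poses.reshape(-1, 3)
    res = []
    for i, (dx, dy, dth) in enumerate(vo):          # odometry factor i -> i+1
        x, y, th = poses[i]
        c, s = np.cos(th), np.sin(th)
        pred = np.array([x + c * dx - s * dy, y + s * dx + c * dy, th + dth])
        res.extend(w_vo * (poses[i + 1] - pred))
    for i, heading in enumerate(sun):               # absolute heading factor
        res.append(w_sun * (poses[i, 2] - heading))
    return np.asarray(res)

vo = [(1.0, 0.0, 0.1), (1.0, 0.0, 0.1)]             # two relative VO steps
sun = [0.0, 0.1, 0.2]                               # three sun-compass headings
sol = least_squares(residuals, x0=np.zeros(9), args=(vo, sun))
poses = sol.x.reshape(-1, 3)                        # estimated route
```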

Eugene Fang is an M.S. student in the Robotics Institute advised by William “Red” Whittaker. He received a B.S. in Electrical Engineering and Computer Sciences from the University of California, Berkeley in 2014. His current research focuses on route determination for planetary rovers.

Committee Members:
Red Whittaker (Advisor)
David Wettergreen
Christopher Cunningham

This talk will serve as an RI MS speaking qualifier.

Propulsive spacecraft enable scientific discovery and exploration of the worlds beyond Earth. Autonomous spacecraft have landed on Earth, the Moon, Mars, Mercury, Venus, Titan, asteroids, and a comet. Recently discovered planetary pits allow access to subsurface voids valuable for scientific discovery and sustained exploration. With recent advancements in embedded convex optimization software and trajectory optimization theory, increasingly sophisticated autonomous missions will be able to safely and efficiently reach these unexplored destinations.

This research develops and tests an algorithm for fuel-optimal landing into planetary pits. By representing the safe regions outside and inside a planetary pit as distinct convex spaces, techniques for optimal guidance based on convex optimization are extended to find trajectories into pits. A search routine for time of flight and time of entry into the pit finds globally fuel-optimal landing trajectories. This time search softens constraints on maximum thrust and landed vehicle mass to reliably find solutions without sensitivity to initialization. The algorithm is implemented within a modeling language and uses an embedded solver for convex optimization. The resulting implementation is therefore practical and effective for use in future missions.
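As a hedged sketch of the kind of convex subproblem involved (a simplified 3-DoF powered descent modeled as a double integrator with a fixed time of flight, written with the cvxpy modeling language; the actual constraints, vehicle model, and solver used in this work may differ):

```python
import numpy as np
import cvxpy as cp

# Illustrative boundary conditions, gravity, and thrust bound; not values from
# this research. A second-order cone constraint bounds thrust magnitude, and
# minimizing the slack sum serves as a fuel proxy.
N, dt = 50, 0.5
g = np.array([0.0, 0.0, -1.62])          # lunar gravity, m/s^2

r = cp.Variable((N + 1, 3))              # position
v = cp.Variable((N + 1, 3))              # velocity
u = cp.Variable((N, 3))                  # commanded thrust acceleration
s = cp.Variable(N)                       # slack bounding ||u_k||

constraints = [r[0] == np.array([200.0, 50.0, 500.0]),
               v[0] == np.array([0.0, 0.0, -20.0]),
               r[N] == np.zeros(3), v[N] == np.zeros(3)]
for k in range(N):
    constraints += [r[k + 1] == r[k] + dt * v[k] + 0.5 * dt**2 * (u[k] + g),
                    v[k + 1] == v[k] + dt * (u[k] + g),
                    cp.norm(u[k]) <= s[k],      # second-order cone constraint
                    s[k] <= 10.0]               # maximum thrust acceleration
problem = cp.Problem(cp.Minimize(dt * cp.sum(s)), constraints)
problem.solve()                               # fuel-optimal profile for this fixed flight time
```

In the actual algorithm, an outer search over time of flight and time of entry into the pit would wrap a subproblem of roughly this form.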

The algorithm is tested in landing scenarios that vary vehicle parameters, mission constraints, and pit dimensions. The feasibility and optimality of generated trajectory solutions are examined along with algorithm runtime. This research determines that fuel-optimal guidance capable of landing within planetary pits is viable for future missions.

Neal Bhasin is an M.S. student in the Robotics Institute advised by Prof. Red Whittaker. Neal received his bachelor's degree in Computer Science from Carnegie Mellon in 2015 and has done research in the Planetary Robotics Lab since 2012. He served as team leader on the NASA-funded instrument project "Flyover Mapping and Modeling of Terrain Features", which culminated in a successful flight demonstration of high-resolution 3D terrain modeling using camera and LIDAR data collected onboard a reusable launch vehicle.

Committee Members:
William "Red" Whittaker (Advisor)
Christopher Atkeson
Christopher Cunningham

Traditional planetary exploration missions have relied upon a single rover acting as the sole mission asset. As exploration pushes us closer and closer to high-risk, high-reward locales on the Moon and Mars, such an approach is no longer ideal. Instead, multiple heterogeneous rovers operating in a symbiotic fashion, complementing each other’s strengths and weaknesses, can lead to much greater scientific payout while simultaneously reducing mission risk. This is the core idea behind Symbiotic Exploration.

Symbiotic Exploration offers many benefits but poses algorithmic challenges in the context of path planning. These challenges include resource-aware planning, rendezvous, and maintaining communication between the rovers for the duration of their plans. While these have been addressed individually in previous work, this research proposes and implements a symbiotic path planning algorithm capable of addressing all of these constraints simultaneously while planning routes through highly dynamic planetary environments.

This research shows that routes do exist to high-interest, permanently shadowed sites on the Moon while maintaining symbiotic constraints. The capability set required of each rover to explore these sites is analyzed and determined. Such regions have been previously considered inaccessible but, through the paradigm of Symbiotic Exploration, can be thoroughly explored with significantly reduced risk.

Joseph Amato is an M.S. student in the Robotics Institute at Carnegie Mellon University, co-advised by Profs. William "Red" Whittaker and David Wettergreen. He received his B.S. in Robotics Engineering from Worcester Polytechnic Institute in 2012 and spent two years working for the Army Operational Test Command at Fort Hood, Texas, before beginning graduate school. His current research focuses on path planning for multiple rovers in planetary environments.

Committee Members:
William "Red" Whittaker (Co-advisor)
David Wettergreen (Co-advisor)
Bernadine Dias
Christopher Cunningham

Human pilots are complex, adaptive, and non-linear controllers. Studying and understanding human pilot behavior is important for the safety of manned aircraft. Considerable resources are invested during the design phase of an aircraft to ensure that it has desirable handling qualities. However, human pilots exhibit a wide range of control behaviors, which are a function of external stimuli, aircraft dynamics, and human psychological factors (workload, stress, confidence, pucker factor, etc.).

This variability is difficult to address comprehensively during the design phase and may lead to undesirable pilot-aircraft interactions such as Pilot Induced Oscillations (PIO). This emphasizes the need to track human pilot performance during flight in order to monitor Pilot Vehicle System (PVS) stability. It can be costly and dangerous to study human pilots on manned aircraft across all possible scenarios.

This work explores the use of remotely controlled aircraft for human pilot studies in the aircraft's longitudinal axis. To replicate different flight conditions, failures (time delay, elevator rate limit, and actuator failure) were injected during flight. To model the human pilot, a McRuer pilot model was used, and the pilot model parameters were estimated online using a Kalman filter approach. The estimated parameters were then used to analyze the stability of the closed-loop PVS and to predict the onset of pilot-related Loss of Control (LOC) events.
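For reference, a McRuer-style pilot describing function has the form Y_p(s) = K_p (T_L s + 1)/(T_I s + 1) e^{-τs}. The sketch below simulates such a model with illustrative, fixed parameter values, using a first-order Padé approximation for the pure delay; the online Kalman-filter estimation of K_p, T_L, T_I, and τ used in this work is not shown.

```python
import numpy as np
from scipy import signal

# Illustrative (not estimated) pilot parameters: gain, lead, lag, delay.
K_p, T_L, T_I, tau = 2.0, 0.4, 2.0, 0.25

lead_lag = signal.TransferFunction([K_p * T_L, K_p], [T_I, 1.0])
delay = signal.TransferFunction([-tau / 2, 1.0], [tau / 2, 1.0])  # Pade(1,1)

# Cascade the lead-lag block and the delay approximation.
num = np.polymul(lead_lag.num, delay.num)
den = np.polymul(lead_lag.den, delay.den)
pilot = signal.TransferFunction(num, den)

t = np.linspace(0, 10, 1000)
error = np.sin(0.5 * t)                         # tracking error seen by the pilot
_, stick, _ = signal.lsim(pilot, U=error, T=t)  # simulated pilot stick command
```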

Tanmay Kumar Mandal is an Aerospace Engineering Ph.D. candidate in the Interactive Robotics Laboratory at West Virginia University. He received his Dual Degree (B.Tech + M.Tech) in Aerospace Engineering from the Indian Institute of Technology Kharagpur in 2011. His current research interests are aerial robotics, guidance, navigation, control, and sensor fusion. He has more than four years of hands-on experience designing and flight-testing unmanned aerial systems.

Although the ocean spans most of the Earth’s surface, our ability to explore and perform tasks underwater is still limited to shallow depths and short missions. Toward expanding the possibilities of underwater operations, imaging sonar, or forward-looking sonar (FLS), is commonly used for autonomous underwater vehicle (AUV) navigation and perception. An FLS provides bearing and range information to a target, but the elevation of the target within the sensor’s field of view is unknown. Hence, current state-of-the-art techniques commonly make a flat-surface (planar) assumption so that FLS data can be used for navigation.
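As a minimal sketch of the measurement geometry described above (my own illustration, not code from this work): a 3D point in the sonar frame maps to a range and a bearing, while its elevation angle is integrated out by the vertical beam.

```python
import numpy as np

# Simplified forward-looking-sonar measurement model: range and bearing are
# observed, elevation is not, which is the ambiguity ASFM resolves.
def fls_measurement(p_sonar):
    x, y, z = p_sonar
    rng = np.sqrt(x**2 + y**2 + z**2)   # slant range to the target
    bearing = np.arctan2(y, x)          # azimuth in the sonar plane
    # elevation = np.arctan2(z, np.hypot(x, y))  # physically exists, but the
    # sonar integrates over the vertical beam, so it is not returned
    return rng, bearing
```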

A novel approach, called acoustic structure from motion (ASFM), is presented for recovering 3D scene structure from multiple 2D sonar images while simultaneously localizing the sonar. Unlike other methods, ASFM does not require a flat-surface assumption and can use information from many frames, as opposed to pairwise methods that can only gather information from two frames at once. The optimization of several sonar readings of the same scene from different poses (the acoustic equivalent of bundle adjustment), together with automatic data association, is formulated and evaluated on both simulated data and real FLS data.

Tiffany Huang is an M.S. student in the Robotics Institute at Carnegie Mellon University advised by Prof. Michael Kaess. She received her B.S. with honors in Mechanical Engineering from the California Institute of Technology in 2014. Her current research focuses on perception and simultaneous localization and mapping (SLAM) algorithms for autonomous underwater vehicles.

Committee Members:
Michael Kaess (Advisor)
David Wettergreen
Sanjiban Choudhury

This talk will serve as an RI Speaking Qualifier.

