Autel Robotics was founded in 2014 in Shenzhen, China, with offices in Seattle and Silicon Valley in the USA and Munich, Germany. Autel, the parent company, is one of the world's leading manufacturers and suppliers of professional diagnostic tools, equipment, and accessories for the automotive aftermarket. In mid-2014 we established Autel Robotics, dedicated to delivering ground-breaking solutions for aerial exploration through our quadcopter and camera-drone technology.
We are a team of industry professionals with a genuine passion for technology and years of engineering experience. We focus on transforming complex technology into simple solutions, creating easy-to-use aerial devices for photography, filming, and imaging. Our current products include the X-Star, a consumer quadcopter integrated with a stabilized 4K camera and an indoor positioning system; it delivers a combination of complex algorithms and advanced engineering in one concise package. We also offer the Kestrel, a tilt-rotor UAV that combines a fixed-wing airplane's energy efficiency with a quadcopter's VTOL capability.
Please RSVP to ensure an accurate headcount.
About the Speakers:
Xinye Liu: Xinye is currently the Sr. Product Manager of Autel Robotics. She entered the drone industry right after graduating from Carnegie Mellon University in 2014 and has since combined her solid engineering background with business strategy, successfully leading multiple projects including consumer quadcopters and unmanned aerial vehicles. As an experienced drone industry professional, Xinye will share her experience from her past two years in the industry.
Angela Wang: Angela is currently the Global Overseas HR Director of Autel Robotics. Angela has over 20 years of experience at large multinational companies including Procter & Gamble, Dell, and Symantec, working in international capacities in China, Canada, and the USA. In recent years she has held HR Business Partner roles, helping business leaders develop and implement people and organization development strategies. Angela joined Autel in February 2016, is based in Silicon Valley, and is responsible for Human Resources and talent management at all Autel locations outside of China.
We hope to see you there!
Branching structures are ubiquitous in many environments on Earth, from trees found in nature to man-made trusses and power lines. Navigating such environments poses a difficult challenge for robots ill-equipped to handle the task. In nature, apes solve locomotion through such environments by a process called brachiation, in which movement is performed by hand-over-hand swinging.
This thesis outlines the development of a two-link brachiating robot. We present our work on an energy-based controller that injects energy into, or removes energy from, the system before the robot assumes the grasping posture. We show that the controller can solve the ladder problem and the swing-up for continuous-contact brachiating gaits, and we compare it to other control approaches in simulation. We also present our work on developing a real-world brachiating robot and the implementation of our controller on this robot.
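The energy-regulation idea can be illustrated on a toy system. The sketch below uses a single torque-driven pendulum with an illustrative pumping gain, not the thesis's two-link robot model: the controller drives the system's mechanical energy toward the value of the upright posture, after which a grasp-planning layer would take over.

```python
import math

# Illustrative parameters only; not the thesis's robot.
m, l, g = 1.0, 1.0, 9.81   # mass (kg), link length (m), gravity (m/s^2)
k = 2.0                    # energy-pumping gain (assumed)
E_d = m * g * l            # energy of the upright equilibrium

def energy(theta, omega):
    """Total mechanical energy (theta = 0 is the hanging position)."""
    return 0.5 * m * l**2 * omega**2 - m * g * l * math.cos(theta)

def swing_up(theta=0.1, omega=0.0, dt=5e-4, steps=40000):
    """Energy controller: with torque u, dE/dt = omega * u, so choosing
    u = k * (E_d - E) * omega injects energy when E < E_d and removes
    it when E > E_d."""
    for _ in range(steps):
        u = k * (E_d - energy(theta, omega)) * omega
        omega += dt * (-(g / l) * math.sin(theta) + u / (m * l**2))
        theta += dt * omega
    return energy(theta, omega)
```

Starting near the hanging position, the energy error shrinks toward zero over the simulated interval; the choice of gain only affects how quickly the target energy is reached.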
Zongyi Yang is an M.S. student in the Robotics Institute advised by David Wettergreen. He received a B.S. in Engineering Science (ECE Option) from the University of Toronto in 2014. His current research focuses on robot brachiation.
David Wettergreen (Advisor)
Globally, horticulture is facing many challenges, the most significant of which range from scaling to meet growing food demands to constantly increasing labour requirements, produce loss (waste), yield security, and hygiene. Most people can conceptualise how robotics will assist with many labour-intensive horticultural roles to minimise or replace the direct labour requirement. This will ultimately help with other challenges such as scalability, yield security, hygiene and food security, but can the disruptive impact of robotics go beyond this? We think it will!
Each year, around 50% of all produce is wasted. Taking an example from the kiwifruit industry, we look at how integrating MARS (mechanisation, automation, robotics and sensors) technologies through the value chain can minimise waste, further adding to the scalability and food-security benefits of robotics. We present two case studies of technologies we have developed to help deliver these benefits: an autonomous orchard robot for tasks such as harvesting and pollination, and a robotic apple packer that integrates into current packhouse systems.
Steven Saunders (Ngai Te Ahi) has 30 years' experience in the horticultural sector and is the founder, owner and Managing Director of the Plus Group of companies, specialising in horticulture management consultancy, global pollen production, robotics development, soil consultancy, international ventures, applied technology, research and development/innovation, and science.
Steven is the co-founder of Newnham Park Innovation Centre in Te Puna, Tauranga, which hosts eight local export-award-winning independent companies, predominantly in the food sector. Steven is also a major stakeholder in the kiwifruit postharvest sector, an active Angel investor (a number of his Angel investments are food based), a Seed Co-Investment Fund (SCIF) Director and investor representative, a Director of a number of privately owned companies, an elected member of the executive board of Priority One (driving economic growth in the Bay of Plenty), a Board member of Enterprise Angels Tauranga, a Crown-appointed Director of Landcare Research, and a member of the Stanford Primary Sector Alumni. Pollen Plus Ltd won the 2010 Bay of Plenty (BOP) Emerging Exporter of the Year award, and Gro Plus won the 2007 Environment BOP Gallagher Innovation award and the 2007 Balance Nutrient Management award.
Steven is a founder of the "WNT Ventures" tech incubator, one of the three Callaghan-awarded tech incubators in New Zealand (a partnership between private-sector investment and the government). Robotics Plus was awarded a $10 million targeted research grant in a collaboration between Auckland University, Waikato University and Plant and Food Research. Robotics Plus features in the NZTE New Zealand Story.
The Plus Group (beneficially owned by the Saunders Family Trust) has a strong history of supporting New Zealand-based research and development activities, including Vision Mātauranga. This support extends well beyond co-funding several government-assisted projects, directly into self-funded research in the commercial and Māori environment. Steven was awarded the Tauranga Chamber of Commerce Westpac Business Leadership Award in 2014.
Today’s planetary robotic exploration is carried out by large, lumbering rovers. Due to the expense of such rovers, the resulting missions are risk averse. Small, high-cadence, minimalist rovers are poised to break new ground by expanding space exploration capabilities. Whether by decreasing overall mission costs or enabling symbiotic exploration among multiple low-cost rovers, these minimalist rovers can allow missions to be more risk tolerant and more rapidly explore areas previously deemed too treacherous.
Existing methods of localization and route determination of planetary rovers are expensive, both in computational time and power requirements. As a result, they are limited in their speed and performance even on today’s large, expensive rovers. The ability to quickly and efficiently estimate a rover’s route becomes even more crucial as the size, mass, computation, and power budgets continue to shrink. A method for inexpensive route determination will enable safer, faster, and smarter navigation of these minimalist rovers.
This research presents a novel approach for computationally efficient visual odometry with an unactuated, downward-looking monocular fisheye camera, which is feasible for minimalist rovers from both a computational and an electromechanical standpoint. This visual odometry approach is combined with a sun compass and pose graph optimization to provide high-fidelity route determination.
Eugene Fang is an M.S. student in the Robotics Institute advised by William “Red” Whittaker. He received a B.S. in Electrical Engineering and Computer Sciences from the University of California, Berkeley in 2014. His current research focuses on route determination for planetary rovers.
Red Whittaker (Advisor)
This talk will serve as an RI MS speaking qualifier.
Propulsive spacecraft enable scientific discovery and exploration of the worlds beyond Earth. Autonomous spacecraft have landed on Earth, the Moon, Mars, Mercury, Venus, Titan, asteroids, and a comet. Recently discovered planetary pits allow access to subsurface voids valuable for scientific discovery and sustained exploration. With recent advancements in embedded convex optimization software and trajectory optimization theory, increasingly sophisticated autonomous missions will be able to safely and efficiently reach these unexplored destinations.
This research develops and tests an algorithm for fuel-optimal landing into planetary pits. By representing the safe regions outside and inside a planetary pit as distinct convex spaces, techniques for optimal guidance based on convex optimization are extended to find trajectories into pits. A search routine for time of flight and time of entry into the pit finds globally fuel-optimal landing trajectories. This time search softens constraints on maximum thrust and landed vehicle mass to reliably find solutions without sensitivity to initialization. The algorithm is implemented within a modeling language and uses an embedded solver for convex optimization. The resulting implementation is therefore practical and effective for use in future missions.
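The outer time-of-flight search can be sketched with a toy stand-in for the inner convex problem. Below, the inner solve is replaced by a closed-form delta-v model for a 1-D descent from rest at altitude h (an impulse to speed h/T, gravity compensation g*T during descent, and an impulse to stop, giving Δv(T) = g*T + 2h/T, which is unimodal in T). The real algorithm instead solves a full convex trajectory optimization at each candidate time; all numbers here are illustrative.

```python
import math

g, h = 1.62, 100.0   # lunar gravity (m/s^2) and initial altitude (m); illustrative

def fuel_cost(T):
    """Toy stand-in for the inner problem: delta-v of a constant-rate
    descent profile, Δv(T) = g*T + 2h/T (unimodal in T)."""
    return g * T + 2.0 * h / T

def golden_section(f, a, b, tol=1e-6):
    """Minimize a unimodal function f on [a, b] by golden-section search."""
    inv_phi = (math.sqrt(5.0) - 1.0) / 2.0
    c, d = b - inv_phi * (b - a), a + inv_phi * (b - a)
    yc, yd = f(c), f(d)
    while b - a > tol:
        if yc < yd:                      # minimum lies in [a, d]
            b, d, yd = d, c, yc
            c = b - inv_phi * (b - a)
            yc = f(c)
        else:                            # minimum lies in [c, b]
            a, c, yc = c, d, yd
            d = a + inv_phi * (b - a)
            yd = f(d)
    return 0.5 * (a + b)

T_opt = golden_section(fuel_cost, 1.0, 100.0)
```

For this toy cost the optimum has the closed form T* = sqrt(2h/g), which the search recovers; in the full algorithm each cost evaluation would be an embedded convex solve.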
The algorithm is tested in landing scenarios that vary vehicle parameters, mission constraints, and pit dimensions. The feasibility and optimality of generated trajectory solutions are examined along with algorithm runtime. This research determines that fuel-optimal guidance capable of landing within planetary pits is viable for future missions.
Neal Bhasin is an M.S. student in the Robotics Institute advised by Prof. Red Whittaker. Neal received his bachelor's degree in Computer Science from Carnegie Mellon in 2015 and has done research in the Planetary Robotics Lab since 2012. He served as team leader on the NASA-funded instrument project "Flyover Mapping and Modeling of Terrain Features", which culminated in a successful flight demonstration of high-resolution 3D terrain modeling using camera and LIDAR data collected onboard a reusable launch vehicle.
William "Red" Whittaker (Advisor)
Traditional planetary exploration missions have relied upon a single rover acting as the sole mission asset. As exploration pushes us closer and closer to high-risk high-reward locales on the Moon and Mars, such an approach is no longer ideal. Instead, multiple heterogeneous rovers operating in a symbiotic fashion, complementing each other’s strengths and weaknesses, can lead to much greater scientific payout while simultaneously reducing mission risk. This is the core idea behind Symbiotic Exploration.
Symbiotic Exploration offers many benefits but poses algorithmic challenges for path planning. Such challenges include resource-aware planning, rendezvous, and maintaining communications between the rovers for the duration of their plans. While these have been addressed individually in previous work, this research proposes and implements a symbiotic path planning algorithm capable of simultaneously addressing all of these constraints while planning routes through highly dynamic planetary environments.
This research shows that routes do exist to high-interest, permanently shadowed sites on the Moon while maintaining symbiotic constraints. The capability set required of each rover to explore these sites is analyzed and determined. Such regions have been previously considered inaccessible but, through the paradigm of Symbiotic Exploration, can be thoroughly explored with significantly reduced risk.
Joseph Amato is an M.S. student in the Robotics Institute at Carnegie Mellon University co-advised by Profs. William "Red" Whittaker and David Wettergreen. He received his B.S. in Robotics Engineering from Worcester Polytechnic Institute in 2012 and spent two years working for the Army Operational Test Command at Ft. Hood, Texas, before beginning graduate school. His current research focuses on path planning for multiple rovers in planetary environments.
William "Red" Whittaker (Co-advisor)
David Wettergreen (Co-advisor)
Human pilots are complex, adaptive, non-linear controllers, and studying and understanding their behavior is important for the safety of manned aircraft. Considerable resources are invested during the design phase of an aircraft to ensure that it has desirable handling qualities. However, human pilots exhibit a wide range of control behavior that is a function of external stimuli, aircraft dynamics, and human psychological factors (workload, stress, confidence, pucker factor, etc.).
This variability is difficult to address comprehensively during the design phase and may lead to undesirable pilot-aircraft interactions such as Pilot-Induced Oscillations (PIO). This emphasizes the need to track human pilot performance during flight to monitor the stability of the Pilot Vehicle System (PVS). However, it can be costly and dangerous to study human pilots on manned aircraft across all possible scenarios.
This work explores the use of remotely controlled aircraft for human pilot studies in the aircraft's longitudinal axis. To replicate different flight conditions, failures (time delay, elevator rate limiting, and actuator failure) were injected during flight. The human pilot was modeled with a McRuer pilot model, and the pilot model parameters were estimated online using a Kalman filter approach. The estimated parameters were then used to analyze the stability of the closed-loop PVS and predict the onset of pilot-related Loss of Control (LOC) events.
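The online identification step can be sketched for a drastically simplified, linear-in-parameters pilot model. Below, the pilot output is modeled as u = Kp*e + Kd*ė (gains on tracking error and its rate), with the gains treated as slowly varying states of a scalar-measurement Kalman filter. This is a toy stand-in: the actual McRuer model also includes lead/lag dynamics and time delay, and the gains, noise levels, and data here are all assumed for illustration.

```python
import math

def kf_estimate(samples, R=0.01):
    """Scalar-measurement Kalman filter for x = [Kp, Kd] with identity
    (constant-parameter) dynamics. Each sample is (e, edot, u)."""
    x = [0.0, 0.0]                        # parameter estimate
    P = [[1.0, 0.0], [0.0, 1.0]]          # estimate covariance
    for e, edot, u in samples:
        H = [e, edot]                     # measurement row: u = H . x + noise
        PHt = [P[0][0] * H[0] + P[0][1] * H[1],
               P[1][0] * H[0] + P[1][1] * H[1]]
        S = H[0] * PHt[0] + H[1] * PHt[1] + R   # innovation variance
        K = [PHt[0] / S, PHt[1] / S]            # Kalman gain
        r = u - (H[0] * x[0] + H[1] * x[1])     # innovation
        x = [x[0] + K[0] * r, x[1] + K[1] * r]
        # Covariance update P = (I - K H) P
        P = [[P[0][0] - K[0] * PHt[0], P[0][1] - K[0] * PHt[1]],
             [P[1][0] - K[1] * PHt[0], P[1][1] - K[1] * PHt[1]]]
    return x

# Synthetic data from a "pilot" with true gains Kp = 2.0, Kd = 0.5.
data = [(math.sin(0.1 * k), 0.1 * math.cos(0.1 * k),
         2.0 * math.sin(0.1 * k) + 0.5 * 0.1 * math.cos(0.1 * k))
        for k in range(200)]
Kp_hat, Kd_hat = kf_estimate(data)
```

In the flight-test setting, the recursion would run on measured tracking error and stick input, and the evolving gain estimates would feed the closed-loop PVS stability analysis.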
Tanmay Kumar Mandal is an Aerospace Engineering Ph.D. candidate in the Interactive Robotics Laboratory at West Virginia University. He received his Dual Degree (B.Tech + M.Tech) in Aerospace Engineering from the Indian Institute of Technology Kharagpur in 2011. His current research interests are aerial robotics, guidance, navigation, control, and sensor fusion. He has more than four years of hands-on experience designing and flight testing unmanned aerial systems.
Although the ocean spans most of the Earth’s surface, our ability to explore and perform tasks underwater is still limited to shallow depths and short missions. Toward expanding the possibilities of underwater operations, imaging sonar, or forward-looking sonar (FLS), is commonly used for autonomous underwater vehicle (AUV) navigation and perception. An FLS provides bearing and range to a target, but the target's elevation within the sensor’s field of view is unknown. Hence, current state-of-the-art techniques commonly make a flat-surface (planar) assumption so that FLS data can be used for navigation.
A novel approach, called acoustic structure from motion (ASFM), is presented for recovering 3D scene structure from multiple 2D sonar images while simultaneously localizing the sonar. Unlike other methods, ASFM does not require a flat-surface assumption and can utilize information from many frames, as opposed to pairwise methods that gather information from only two frames at a time. The joint optimization of several sonar readings of the same scene from different poses (the acoustic equivalent of bundle adjustment) and automatic data association are formulated and evaluated on both simulated and real FLS data.
Tiffany Huang is an M.S. student in the Robotics Institute at Carnegie Mellon University advised by Prof. Michael Kaess. She received her B.S. with honors in Mechanical Engineering from the California Institute of Technology in 2014. Her current research focuses on perception and simultaneous localization and mapping (SLAM) algorithms for autonomous underwater vehicles.
Michael Kaess (advisor)
This talk will serve as an RI Speaking Qualifier.
The use of aerial robots to inspect bridges, buildings, and other types of infrastructure is becoming widespread due to their advantages in providing sensor mobility, enabling consistent data collection over long time intervals, and improving the safety of human inspectors. Inspection missions require aerial robots to fly outdoors in close proximity to inspection targets, often under conditions where freestream wind and surface-induced airflow adversely impact flight performance.
First-principles computational fluid dynamics techniques for predicting the influence of aerodynamic disturbances on vehicle dynamics are generally too computationally expensive to run online, limiting their utility for inspection.
We develop a regression-based strategy to predict aerodynamic disturbances based on the vehicle's velocity, the geometry of its surrounding environment, and the freestream wind. Disturbances encountered in past flight experiences are used to train a model for predicting disturbances along potential inspection paths, which can be used in cost functions that enable planners to generate trajectories adapted to local aerodynamic conditions. To allow the model to generalize across different environments, we develop descriptors that parametrize a class of rectangular geometries commonly encountered in engineered structures.
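One simple instance of such a regression strategy can be sketched with a k-nearest-neighbour regressor, which is an illustrative stand-in, not the authors' actual model: the disturbance at a queried flight state is predicted from the most similar past experiences, and the predictions along a candidate path are summed into a planner cost term. The feature set (airspeed, distance to surface, freestream wind) and the synthetic ground-truth function are assumptions for the sketch.

```python
import math

def synth_disturbance(v, d, w):
    """Made-up ground-truth disturbance (N): grows with airspeed and wind,
    decays with distance from the surface. For illustration only."""
    return 0.4 * v + 1.5 * w / (1.0 + d)

# "Past flight experience": (airspeed m/s, distance m, wind m/s) -> disturbance
train = [((v, d, w), synth_disturbance(v, d, w))
         for v in (0.5 * i for i in range(1, 9))
         for d in (0.5 * j for j in range(1, 9))
         for w in (0.5 * n for n in range(0, 7))]

def predict(x, k=3):
    """k-nearest-neighbour regression over past flight experience."""
    nearest = sorted(train, key=lambda s: math.dist(s[0], x))[:k]
    return sum(y for _, y in nearest) / k

def path_cost(waypoint_states, k=3):
    """Disturbance-aware cost term for a candidate inspection path."""
    return sum(predict(x, k) for x in waypoint_states)
```

A planner would add `path_cost` to its usual path-length or energy objective, biasing trajectories away from regions where past flights experienced large disturbances.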
Towards demonstrating the applicability of the proposed approach in outdoor inspection scenarios, we show preliminary results from flight tests in tunnel-like environments as well as in artificially generated wind under laboratory conditions.
John Yao is a Ph.D. student in the Robotics Institute at Carnegie Mellon University, advised by Dr. Nathan Michael. He received a B.A.Sc. in Engineering Science (Aerospace Major) from the University of Toronto in 2013. He is interested in applying inference-based approaches to improve state estimation and control for vehicles with dynamics that are significantly affected by stochastic, spatially varying environmental forces.
Achieving meaningful exploration and discovery in our universe pivots on knowing where we are as we navigate the unknown. In many terrestrial cases this localization problem has been remedied by precise a priori mapping of environments and by advanced infrastructure such as the Global Positioning System (GPS). Current extraterrestrial surface exploration robots, however, do not have these absolute measurement tools at their disposal, and achieving precise localization remains a limiting and time-consuming problem. Even though observing the sun and stars as absolute orientation references is essential to modern spacecraft and naval vessels, it has not yet been fully exploited in real time on surface rovers. To this end, this work asserts that using a visual solar compass as a navigational aid to continuously determine absolute bearing vastly improves the ability of planetary rovers to navigate and localize, and it supports this claim with experimental results from the development and field testing of a visual solar compass.
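The core bearing computation of a solar compass can be sketched as follows: an ephemeris predicts the sun's azimuth in the world frame, the camera measures the sun's bearing in the rover body frame, and the difference gives the rover's absolute heading. The azimuth formula below is standard spherical astronomy but deliberately simplified (no refraction or equation-of-time handling), and the frame conventions are assumptions of this sketch, not the thesis's implementation.

```python
import math

def solar_azimuth(lat_deg, decl_deg, hour_angle_deg):
    """Sun azimuth (degrees east of true north) from observer latitude,
    solar declination, and hour angle (0 at local solar noon, positive
    afterward). Simplified spherical-astronomy model."""
    lat, decl, H = map(math.radians, (lat_deg, decl_deg, hour_angle_deg))
    # atan2 form gives azimuth measured from south; shift to from-north.
    az_south = math.atan2(math.sin(H),
                          math.cos(H) * math.sin(lat)
                          - math.tan(decl) * math.cos(lat))
    return (math.degrees(az_south) + 180.0) % 360.0

def rover_heading(sun_az_pred_deg, sun_bearing_body_deg):
    """Absolute heading from the predicted sun azimuth and the sun's
    measured bearing in the body frame (degrees clockwise from the
    rover's forward axis)."""
    return (sun_az_pred_deg - sun_bearing_body_deg) % 360.0
```

Sanity checks: at an equinox (declination 0) from 45° N, the sun is due south (180°) at solar noon and due west (270°) six hours later; if the camera then sees the sun 30° right of forward at noon, the rover is heading 150°.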
Curtis Boirum is an M.S. student in the Robotics Institute advised by William “Red” Whittaker. He received an M.S. and a B.S. in Mechanical Engineering from Bradley University in Peoria, IL, in 2011 and 2009, respectively. He also received a B.S. in Physical Science from Eureka College in Eureka, IL, in 2010. He has previously researched novel omnidirectional propulsion mechanisms for small multi-rotor aircraft and surface rovers. His current research explores novel methods to improve the capabilities of surface rovers through the application of new technology and theory. Navigation, and specifically localization, for small planetary rovers has been his focus at Carnegie Mellon University.