Friday, April 26, 2019

UAS and Manned Aircraft Autonomy


Describe the levels of automation and how they are applied to UAS operations.

According to Marshall et al. (2011), there are several definitions for the levels of automation that can be utilized on an Unmanned Aerial System (UAS).  One of these definitions separates automation into four classes: information acquisition, information analysis, decision and action selection, and action implementation.  What is interesting about this definition is that it mirrors human cognition: the way we acquire and understand information, and the methods we use to make decisions.  Another definition presented by Marshall et al. (2011) is a method described by NASA.  This definition utilizes the OODA Loop (Observe, Orient, Decide, and Act) process to describe what autonomy must be able to accomplish, assigning five levels to each step of the OODA Loop cycle.  At the lowest level, the entire process is completed by the human operator; at the highest level, the OODA Loop is completed by the UAS itself.
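To make the NASA-style scheme concrete, here is a small illustrative Python sketch of how an autonomy profile could assign one of five levels to each OODA stage.  The level names and the example profile are my own invention for illustration; Marshall et al. (2011) describe the concept, not a software interface.

```python
from enum import IntEnum

class AutomationLevel(IntEnum):
    """1 = human performs the stage entirely; 5 = the UAS performs it entirely."""
    MANUAL = 1
    ASSISTED = 2
    SHARED = 3
    SUPERVISED = 4
    AUTONOMOUS = 5

# A system's autonomy profile: one level per OODA stage (illustrative values).
ooda_profile = {
    "Observe": AutomationLevel.AUTONOMOUS,   # onboard sensors gather all data
    "Orient":  AutomationLevel.SUPERVISED,   # software fuses data, pilot reviews
    "Decide":  AutomationLevel.SHARED,       # pilot and software choose together
    "Act":     AutomationLevel.AUTONOMOUS,   # autopilot flies the chosen action
}

for stage, level in ooda_profile.items():
    print(f"{stage:8s} -> level {int(level)} ({level.name})")
```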

Are there different considerations for manned aircraft versus UAS operations when it comes to automation?

There are several key differences between the use of automation on manned and unmanned aircraft.  The biggest difference is how automation must behave in abnormal or emergency situations.  For unmanned aircraft, automation must be able to control the aircraft in the event of a communication loss between the pilot and the aircraft.  This means that it must be extremely reliable, able to handle the aircraft in just about any situation, and able to make decisions when the pilot cannot.  For manned aircraft, however, the pilot needs to be able to take control extremely quickly.  This became evident for the pilot of Qantas Flight 72 when its automated systems caused the aircraft to exceed its normal operating parameters and enter an extreme descent.  When this happened, the pilot was helpless because the plane was ignoring his control inputs and prioritizing the automation's logic above the pilot's (The Sydney Morning Herald, 2017).  While this situation would be bad for manned and unmanned aircraft alike, on a manned aircraft there is always a pilot available to take control.  For UAS this is not always the case, and the autopilot needs to take priority when no pilot control is available.

Do you think that the aviation industry currently uses the appropriate amount, too little, or too much automation?

Right now, I think that the aviation industry has just the right amount of automation.  The reason is that automation is not yet good enough for planes to fly completely on their own, but it is good enough to keep pilots from becoming task-saturated and to aid them during abnormal situations.  Because automation cannot fly planes completely by itself in all scenarios, it is imperative that pilots maintain proficiency in their skills so that they can react adequately in abnormal and emergency situations.  If automation were pushed any further, pilots would be manually flying less and thus losing proficiency, potentially increasing risk and the chances of an accident occurring.

References:

Marshall, D. M., Barnhart, R. K., Hottman, S. B., Shappee, E., & Most, M. T. (Eds.). (2011). Introduction to unmanned aircraft systems. Retrieved from https://ebookcentral.proquest.com

[The Sydney Morning Herald]. (2017, May 14). When ‘psycho’ automation left this pilot powerless [Video file]. Retrieved from https://www.youtube.com/watch?v=2cSh_Wo_mcY&feature=youtu.be

Friday, April 19, 2019

Physiological Issues in UAS


Which OTC medications do you think pose the most significant risk to UAS operators?

There are many medications that can affect you while piloting a UAS or a manned aircraft, and regardless of what you are flying they can have significant and dangerous side-effects.  Most over-the-counter cold and flu medications, such as antihistamines and decongestants, use ingredients that significantly impair cognitive ability and can cause significant drowsiness (Pilot Safety, 2019).  This makes them extremely dangerous to use without first properly assessing how they affect you and the side-effects that you might experience.  Unfortunately, because they are over-the-counter medications, many people do not think twice about taking them prior to participating in cognitively demanding activities such as flying a UAS.  For this reason, I think that these types of medications are far more dangerous than narcotics or other drugs with more severe side-effects, because of the relaxed attitude that some people have toward them.

What do you think are the most effective mitigation strategies from a human factors perspective that operators can use when conducting UAS operations?

The best way to mitigate the human factors issues that come with the use of medications and other physiological issues is education.  There are countless studies, articles, and FAA circulars that discuss the effects of fatigue, stress, medication, and other physiological factors on UAS and manned pilots alike.  Making sure that new and experienced pilots alike get continuous exposure to training on the effects of medications, both prescription and over-the-counter, is imperative to pilots remaining safe and aware of the risks.  The second most important way to mitigate these physiological factors is to ensure pilots have adequate and easily remembered techniques to help them fly only when it is safe to do so.  The IMSAFE technique is a very popular and easy-to-remember acronym that can help pilots evaluate themselves before taking the controls of a UAS.  IMSAFE stands for (Pilots | Health, n.d.):

            I – Illness: Are there any illnesses or recent illnesses affecting the pilot?
            M – Medication: Are any medications affecting or impairing your ability to fly?
            S – Stress: Are you experiencing any unusual stress or pressure?
            A – Alcohol: Any alcohol in the last 8 hours and/or feeling the effects of alcohol?
            F – Fatigue: Are you tired and/or not properly rested?
            E – Emotion: Are you upset or distracted by anything?

This popular checklist (or something similar) should be used by every pilot before every flight to assess their condition and determine whether they should be flying.  When combined with proper training and knowledge of the effects of medication, this simple checklist can be a very powerful tool in protecting UAS pilots and manned pilots alike (Pilots | Health, n.d.).
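To show how mechanical this self-check can be, here is a minimal illustrative Python sketch of an IMSAFE go/no-go gate.  The question wording is adapted from the list above, and the conservative rule that any unanswered item is a no-go is my own assumption.

```python
# Minimal IMSAFE pre-flight self-check: any "yes" answer is a no-go.
IMSAFE_QUESTIONS = {
    "Illness":    "Are any illnesses or recent illnesses affecting you?",
    "Medication": "Are any medications affecting or impairing your ability to fly?",
    "Stress":     "Are you experiencing any unusual stress or pressure?",
    "Alcohol":    "Any alcohol in the last 8 hours, or are you feeling its effects?",
    "Fatigue":    "Are you tired or not properly rested?",
    "Emotion":    "Are you upset or distracted by anything?",
}

def imsafe_check(answers: dict[str, bool]) -> bool:
    """Return True (safe to fly) only if every IMSAFE answer is 'no' (False)."""
    for item, question in IMSAFE_QUESTIONS.items():
        if answers.get(item, True):  # treat an unanswered item as a no-go
            print(f"NO-GO: {item} - {question}")
            return False
    return True

print(imsafe_check({k: False for k in IMSAFE_QUESTIONS}))  # True: cleared to fly
```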

Describe how fatigue and stress affect the safe operation of UAS

Fatigue and stress affect the pilots of UAS in the same ways that they affect the pilots of manned aircraft.  Fatigue can cause a pilot to have reduced cognitive abilities, trouble concentrating, and reduced dexterity.  These symptoms, just some of what fatigue can cause, can wreak havoc on a pilot's ability to operate their UAS safely and effectively.  Reduced cognitive function can allow unsafe situations to develop by impairing the pilot's decision-making and data-processing abilities.  These factors can lead to missed safety concerns or malfunctions in the UAS going undetected, potentially leading to property or UAS damage and injury (Salazar, n.d.).

References:

Pilots | Health | Readiness | IMSAFE | Checklist. (n.d.). Retrieved April 19, 2019, from https://www.businessaircraftcenter.com/articles/pilot-s-health-readiness-IMSAFE-check-list-art1014.htm

Pilot Safety: Flying During Cold and Flu Season. (2019, January 03). Retrieved April 19, 2019, from http://hartzellprop.com/pilot-safety-flying-during-cold-and-flu-season/

Salazar, J. (n.d.). Fatigue In Aviation. Retrieved April 19, 2019, from https://www.faa.gov/pilots/safety/pilotsafetybrochures/media/Fatigue_Aviation.pdf

Saturday, April 13, 2019

Aeronautical Decision Making for UAS


            Aeronautical Decision Making (ADM) is a critical aspect of every flight that occurs in our skies.  Whether UAS operators realize it or not, they are always using aspects of ADM when evaluating whether they should launch their UAS for an intended flight.  Proper use of ADM techniques and procedures can enhance safety for the operator, the UAS, and bystanders by preventing unsafe situations from occurring or developing in the first place.  ADM involves several essential components that allow a UAS operator to assess risk.  Several methods are available, including the IMSAFE and PAVE acronyms, which give pilots a memory jogger covering all the aspects of risk assessment and mitigation.  These checks include assessing the pilot's health and stress levels, the mission and aircraft requirements, and the weather and external pressures that may exist and contribute to unsafe situations.  These procedures may seem tedious and unnecessary, but they have been proven to generate consistently safer flights and, in many cases, prevent mishaps from occurring (United States, 2016).
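One common way to make risk assessment concrete is a likelihood-by-severity matrix of the kind used in FAA guidance.  The Python sketch below is my own illustrative scoring of that idea; the category weights and go/no-go thresholds are invented, not the FAA's.

```python
# Illustrative likelihood x severity risk matrix for pre-flight ADM.
LIKELIHOOD = {"improbable": 1, "remote": 2, "occasional": 3, "probable": 4}
SEVERITY   = {"negligible": 1, "marginal": 2, "critical": 3, "catastrophic": 4}

def assess_hazard(likelihood: str, severity: str, accept_below: int = 6) -> str:
    """Score a hazard and return a go/mitigate/no-go recommendation."""
    score = LIKELIHOOD[likelihood] * SEVERITY[severity]
    if score < accept_below:
        return f"score {score}: acceptable, proceed"
    if score < 12:
        return f"score {score}: mitigate before flight"
    return f"score {score}: no-go"

print(assess_hazard("remote", "critical"))        # score 6: mitigate before flight
print(assess_hazard("probable", "catastrophic"))  # score 16: no-go
```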

            The biggest area of ADM and risk management that sticks out to me in the UAS realm is the lack of ADM use and risk assessment prior to flights.  Cody (2018) cited numerous incidents in which UAS were spotted flying near airports and manned aircraft, as well as several incidents involving UAS striking and damaging manned aircraft.  These accidents near airports are almost completely avoidable through sound judgment and risk assessment techniques.

Another major area of ADM for UAS that jumps out at me is the requirement for UAS operators to understand a multitude of local, state, and federal laws and when each applies during their flights.  Traditional manned aircraft follow the same rules regardless of where they fly in the United States.  UAS, however, must follow the local rules set by the municipality they are flying in, especially for knowing when and where they can launch and fly their UAS.  Then, once airborne, they must understand and abide by all federal rules that govern the airspace over those local municipalities.  This creates a unique human factors challenge for UAS operators, greatly increasing the risk of misunderstandings and mistakes.  It is exacerbated by the extreme portability of many small UAS platforms, which makes it easy for operators to fly in many different locations during their travels.  This forces them to learn even more rules and regulations for all the areas they fly in, further increasing the chances of confusion (Fact Sheet, 2016).
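As a rough illustration of this layering problem, the Python sketch below checks a hypothetical flight against both a local launch ordinance and a federal airspace rule.  All city names and rules here are invented for illustration; real operators must consult the actual regulations for each jurisdiction.

```python
# Illustrative sketch of the layered rule problem: a launch must satisfy the
# local ordinance of the launch site AND the federal rules of the airspace.
# All place names and rules here are invented for illustration.
LOCAL_RULES = {
    "Springfield": {"launch_from_parks": False},
    "Shelbyville": {"launch_from_parks": True},
}

def launch_permitted(city: str, from_park: bool, in_controlled_airspace: bool,
                     has_atc_authorization: bool) -> bool:
    """A flight is legal only if every applicable layer of rules allows it."""
    local_ok = (not from_park) or LOCAL_RULES.get(city, {}).get("launch_from_parks", False)
    federal_ok = (not in_controlled_airspace) or has_atc_authorization  # Part 107 style
    return local_ok and federal_ok

print(launch_permitted("Springfield", from_park=True,
                       in_controlled_airspace=False, has_atc_authorization=False))  # False
```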

References:

Cody, N. (2018). Flight and Federalism: Federal Preemption of State and Local Drone Laws. Washington Law Review, 93(3). Retrieved April 13, 2019, from https://digital.law.washington.edu/dspace-law/handle/1773.1/1840.

Fact Sheet – Small Unmanned Aircraft Regulations (Part 107). (2016, June 21). Retrieved April 12, 2019, from https://www.faa.gov/news/fact_sheets/news_story.cfm?newsId=20516

United States, Federal Aviation Administration. (2016, August). Remote Pilot – Small Unmanned Aircraft Systems Study Guide. Retrieved April 13, 2019, from https://www.faa.gov/regulations_policies/handbooks_manuals/aviation/media/remote_pilot_study_guide.pdf

Saturday, April 6, 2019

UAM, UTM, and NextGen


UAM, UTM, NextGen

Urban Air Mobility (UAM) is an up-and-coming concept that promises to revolutionize passenger and cargo transportation in dense urban environments.  This will help not only to reduce the delays of congested roads but also to benefit the environment through reduced carbon emissions.  These initiatives revolve around the development and integration of unmanned aerial systems (UAS) into the National Airspace System (NAS).  To enable this integration, research and development into UAS Traffic Management (UTM) systems will be required.  There are many challenges to this integration, including current rules and regulations that lack adequate aircraft density capacities to handle UAM platforms.  New and revolutionary methods of UTM will be required to enable safe and seamless integration with current and future commercial and general aviation industries (Mueller, 2017).  The FAA's NextGen modernization movement is aimed at increasing the efficiency and effectiveness of the NAS through innovation and advanced technologies, including ADS-B, increased automation, data communications, performance-based navigation, information management, and decision support systems (New Technology, 2019).

UAS and FAA NextGen

The FAA's NextGen modernization includes many steps to help integrate UAS into the NAS.  Many of the new technologies included in NextGen will bring the FAA into the modern era of communication and data sharing, features that many UAS already leverage in their designs.  The integration of these technologies into the NAS will allow UAS to tap into them to overcome many of the shortcomings they experience under the current system.  The NAS Voice System (NVS) will allow ground-based UAS pilots to communicate directly with ATC controllers instead of relying on current line-of-sight radio communication methods.  Data Communications is another new system the FAA will be incorporating, allowing UAS pilots to communicate with ATC via digital, text-based messages while also sharing critical flight information such as location, direction, speed, and altitude.  The System Wide Information Management (SWIM) servers will also provide UAS pilots with real-time access to weather and other mission-affecting data.  This will give UAS pilots increased situational awareness and decision-making ability, further enhancing safety and efficiency (Williams, 2015).
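As an illustration of what a text-based exchange like Data Communications might carry, here is a hypothetical Python sketch of a structured position report.  The field names and JSON format are my own invention; the FAA has not published a UAS message schema in this form.

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class PositionReport:
    """Illustrative digital position report a UAS pilot might exchange with ATC.
    Field names are invented for this sketch, not an actual Data Comm format."""
    callsign: str
    lat_deg: float
    lon_deg: float
    altitude_ft: float
    track_deg: float
    groundspeed_kt: float

report = PositionReport("UAS123", 29.19, -81.05, 400.0, 270.0, 35.0)
print(json.dumps(asdict(report)))  # text-based message, ready to transmit
```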

DSA and UAS NAS Integration

While the FAA's NextGen initiative does a lot to support increased UAS integration into the NAS, there is one thing that it cannot fix by itself.  Manned aircraft pilots have the inherent ability to look outside their aircraft and scan for obstacles and hazards such as terrain, weather, and other aircraft.  The same task for a UAS pilot, who could be separated from their aircraft by thousands of miles, is nearly impossible.  This poses one of the greatest challenges to UAS integration into the NAS.  The ability of a UAS to Detect, Sense and Avoid (DSA) is essential to safe operation within the NAS.  Many aviation organizations, including NASA, have begun to invest in technologies that will allow safe Detect and Avoid (DAA) systems and standards to be developed.  These technologies will need to be capable of detecting, tracking, and warning a UAS pilot of any potential threats to the UAS and, when required, even redirecting the UAS away from the threat automatically (Shively, 2018).  This technology is not only one of the biggest challenges faced by UAS of all shapes and sizes; it is essential to the safe integration of UAS into the NAS.
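A common textbook building block for DAA logic is to project straight-line trajectories and flag a conflict when the predicted miss distance falls below a separation threshold.  The 2-D Python sketch below implements that closest-point-of-approach idea; the 30-second horizon and 150-meter separation values are arbitrary illustrations, not values from Shively (2018).

```python
import math

def cpa(rel_pos, rel_vel):
    """Time and distance of closest point of approach for straight-line motion.
    rel_pos: intruder position minus ownship position (m); rel_vel: relative velocity (m/s)."""
    v2 = rel_vel[0]**2 + rel_vel[1]**2
    if v2 == 0.0:
        return 0.0, math.hypot(*rel_pos)  # no relative motion: range is constant
    t = max(0.0, -(rel_pos[0]*rel_vel[0] + rel_pos[1]*rel_vel[1]) / v2)
    dx, dy = rel_pos[0] + rel_vel[0]*t, rel_pos[1] + rel_vel[1]*t
    return t, math.hypot(dx, dy)

def is_threat(rel_pos, rel_vel, horizon_s=30.0, min_sep_m=150.0):
    """Flag a conflict if the predicted miss distance within the look-ahead
    horizon falls below the required separation (thresholds illustrative)."""
    t, miss = cpa(rel_pos, rel_vel)
    return t <= horizon_s and miss < min_sep_m

# Intruder 1 km ahead, closing head-on at 50 m/s: conflict in ~20 s.
print(is_threat(rel_pos=(1000.0, 0.0), rel_vel=(-50.0, 0.0)))  # True
```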

UAS Lost Link Implications

Lost link situations are an important and common occurrence in current UAS operations.  The effects of a UAS going lost link can ripple to aircraft operating around it, and without proper care and reaction during lost link scenarios, the consequences can be catastrophic.  One of the biggest concerns in a lost link scenario is the loss of communication with air traffic control.  With the current reliance on line-of-sight radio communications to ATC, UAS operators cannot immediately communicate with ATC when they do not have a communication link with their aircraft.  Some larger UAS that are flown from fixed or semi-permanent ground control stations may have telephone lines available, but this is not the case for all UAS.  The second consideration is the actions taken by the aircraft when it goes lost link.  In most cases, the UAS will fly a lost link flight plan that is preprogrammed into its operating system.  Sometimes this can be as simple as flying to a home point; in other cases it can be programmed by the operator before and during flight to meet mission or ATC requirements.  Of course, human factors come into play in these scenarios if the operators do not adequately plan for all lost link factors and contingencies, placing the aircraft on an unsafe flight path.  These factors can be further exacerbated when operating around unpredictable general aviation and other manned aircraft.  This highlights the need for advanced DSA and DAA technologies to augment human-controlled UAS when they go lost link, as well as to keep autonomously controlled UAS safe as they fly their missions.
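To illustrate the kind of preprogrammed lost link behavior described above, here is a minimal Python sketch of a contingency ladder keyed to outage duration.  The timeouts and actions are invented for illustration; real lost link plans are coordinated with ATC and mission requirements.

```python
# Illustrative lost-link contingency ladder; timeouts and actions are invented.
CONTINGENCY_PLAN = [
    (10.0,  "continue current flight plan"),        # brief dropout: hold course
    (60.0,  "climb to preplanned lost-link altitude"),
    (300.0, "fly preprogrammed route to home point"),
    (float("inf"), "land at nearest preplanned divert site"),
]

def lost_link_action(seconds_since_last_packet: float) -> str:
    """Pick the contingency action for the current link outage duration."""
    for timeout, action in CONTINGENCY_PLAN:
        if seconds_since_last_packet <= timeout:
            return action
    return CONTINGENCY_PLAN[-1][1]

print(lost_link_action(5.0))    # continue current flight plan
print(lost_link_action(120.0))  # fly preprogrammed route to home point
```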

References:

Mueller, E. (2017, April 26). Enabling Airspace Integration for High Density Urban Air Mobility. Lecture presented at Uber Elevate Summit in Texas, Dallas. Retrieved April 6, 2019, from https://ntrs.nasa.gov/search.jsp?R=20180000385

New Technology. (2019, March 11). Retrieved April 6, 2019, from https://www.faa.gov/nextgen/how_nextgen_works/new_technology/

Shively, J. (2018, March 14). UAS Integration in the NAS: Detect and Avoid. Lecture. Retrieved April 6, 2019, from https://ntrs.nasa.gov/archive/nasa/casi.ntrs.nasa.gov/20180002420.pdf

Williams, J. H. (2015, January 21). Unmanned Aircraft Systems (UAS) Research and Development. Retrieved from US Department of Transportation: https://www.transportation.gov/content/unmanned-aircraft-systems-uas-research-and-development

Saturday, March 30, 2019

UAS Design in ARVL


            Designing a UAS takes a lot of careful consideration of the mission it will be asked to accomplish.  UAS can carry a multitude of payloads and sensors and can have a very wide performance envelope.  Because of these vast capabilities, one must be careful not to overdesign a UAS by adding features and components that are not required and do not contribute to the mission.  For the crash lab mission that I selected, I went through several design iterations to get the right combination of components, which highlighted some of these challenges for me.

UAS Design

            For my original design I started with the Gadfly Quadrotor base chassis; however, I quickly realized that I was not able to fit the required components onto the design.  The chassis could not support a gimbaled camera or a camera with both Electro-Optical (EO) and Infrared (IR) spectrums.  Due to these limitations I was quickly forced to switch to the larger Condor Octorotor Professional base vehicle.  This chassis allowed a gimbaled camera with both EO and IR capabilities to be fitted, as well as a Synthetic Aperture Radar (SAR).  For the ground control station, the handheld controller was selected to allow for easy operation, minimal operator training, and maximum portability.  While the ARVL controls were rather cumbersome to use, the real-world handheld controller would have been the perfect match for this system because it would allow precise control of the UAS and its cameras (ERAU Hub, n.d.).

Performance

            Performance was an important aspect of this UAS design.  The initial design utilizing the Gadfly Quadrotor had the highest performance from both the speed and agility aspect and the battery performance aspect.  In fact, this high performance turned out to be somewhat of a challenge to control.  At maximum speed the Gadfly would reach an airspeed of about 40 (no units are given in ARVL).  This performance was simply not necessary for the mission the UAS was being designed for, and it increased the skill level required of the operator to ensure safe operation.  The design change to the Condor Octorotor solved this problem by cutting the max speed in half and making the UAS much easier to control.  Unfortunately, due to the increased power requirements of eight electric motors and the added sensor payloads, the battery life was cut from 18 minutes to 14 minutes.  However, due to the small size of the batteries this was not an issue, because the operator could easily carry multiple batteries and change them out as required.  The final performance aspect was communication signal range.  The dipole antenna was selected for both systems; however, range still seemed to be more limited on the Gadfly chassis.  This limitation was not an issue for the crash lab mission due to the short-range requirements.  The switch to the larger Condor chassis increased the range, but the change was almost unnoticeable (ERAU Hub, n.d.).
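The underlying battery trade is simple arithmetic: for a fixed battery, endurance scales inversely with average power draw, so adding motors and sensors shortens flight time.  The Python sketch below uses invented capacity and power numbers chosen only to reproduce the sim's 18- and 14-minute figures; ARVL does not expose its actual power model.

```python
def endurance_min(capacity_wh: float, avg_power_w: float) -> float:
    """Hover endurance in minutes for a given battery and average power draw."""
    return capacity_wh / avg_power_w * 60.0

battery_wh = 90.0                        # illustrative battery capacity
print(endurance_min(battery_wh, 300.0))  # 4 motors, light payload -> 18.0 min
print(endurance_min(battery_wh, 385.7))  # 8 motors + extra sensors -> ~14.0 min
```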

Mission

            Mission accomplishment forced one change to the UAS design.  The original design's control scheme was automatic.  However, this presented planning challenges due to the rudimentary planning software included in ARVL.  It was difficult to get the UAS into the right position and impossible to keep it there once it began the mission.  While the automatic flight planning did reduce the operator's workload, making camera control easier, it made getting the right shot for the mission almost impossible.  For this reason, the second major design change was from automatic control to manual control.  This allowed the operator to fly the UAS into the proper position, let it automatically hover in position, and then adjust the camera as required for the images they needed to capture.  This did increase operator workload, but it was essential to ensure mission accomplishment.  The only other change made during the mission testing phase was that the operator view had to be switched from the pilot perspective to the onboard perspective.  This was due to the UAS being too small to see clearly at the distances it was flying from the operator; it was impossible to determine orientation and camera angles without the onboard view (ERAU Hub, n.d.).
In the end, the UAS I designed was able to accomplish the mission with no limiting factors or problems.  While it took multiple iterations of the design, this was expected; had it not occurred it would have been due to pure luck, as modifications are always expected when designing a new system.


References:

ERAU Hub [Computer software]. (n.d.). Retrieved September 28, 2018, from https://erau.instructure.com/courses/84078/pages/resources-virtual-hub?module_item_id=4529679


Friday, March 22, 2019

UAS Human Factors


            Human factors are a multi-faceted issue faced by the operators of machines such as cars, aircraft, and ships.  Unmanned aerial systems are no exception, and in many ways they suffer from some of the toughest human factors challenges.  According to Howe (2017), “human factors is the multidisciplinary study of human capabilities and limitations…applied to equipment, systems, facilities, procedures, jobs, environments, training, staffing, and personnel management…for safe, comfortable, and effective human performance.”  In other words, human factors is the study of how man and machine work together and how to improve the way humans interface with machines to operate them safely and effectively.

            One of the biggest areas of study in human factors is how automation can be utilized to improve the working relationship between the human and the machine.  Automation can increase safety in many ways, such as reducing workload (and thus decreasing stress and fatigue) and increasing an operator's ability to monitor and think about what they are doing, which in turn increases safety and reduces the chances of human error.  Of course, these benefits do not come without concerns.  Automation, if used improperly, can lead to complacency and over-reliance, as well as misleading indications or warnings in the event of malfunctions (Howe, 2017).  The other main human factors challenge that unmanned systems face is the separation of the operator from the vehicle they are controlling.  This leads to the loss of most sensory cues, generally leaving only visual cues for operators to rely on.  There are solutions to this challenge, such as haptic feedback systems and auditory cues, but many unmanned systems have yet to incorporate them into their designs (Landry, 2017).

            These challenges present a unique but very interesting hurdle for unmanned systems to overcome.  The bottom line is that human factors design is a balancing act of information.  Too much information and the operator can become task-saturated and overwhelmed, leading to mistakes or missed indications.  Too little information (or too much automation) and operators can be uninformed about what their system is doing, which can lead to complacency or unawareness of malfunctions that are occurring.

References

Howe, S. (2017, June). The Leading Human Factors Deficiencies in Unmanned Aircraft Systems. Retrieved from https://ntrs.nasa.gov/archive/nasa/casi.ntrs.nasa.gov/20170005590.pdf

Landry, S. J. (Ed.). (2017). Handbook of human factors in air transportation systems. Retrieved from https://ebookcentral.proquest.com

Saturday, August 18, 2018

Sense and Avoid Sensor Selection


            Sense and avoid is a critical piece of operating all types of aircraft, manned and unmanned alike.  Unmanned aircraft, however, are not afforded the flexibility and reliability of the human eye and brain for seeing obstacles and other traffic.  For this reason, unmanned aircraft must rely on complex sensors and computer algorithms to detect and avoid objects.  Small unmanned aerial systems pose a particular challenge to this requirement due to their small size and low power budgets.  Because of these limiting factors, detect and avoid sensor packages for small unmanned aerial vehicles must be size, weight, power, and cost (SWaP-C) optimized while still providing top-notch detection and avoidance capabilities.  There are many approaches that can be taken and many sensors available to create a detect and avoid system for a small unmanned system, such as light detection and ranging (LIDAR), ultrasonic sensors, electro-optical sensors (cameras), and even small radar systems.  Proper selection of these systems is important, as each has its pros, cons, and limitations, making no one type a perfect all-around solution (Corrigan, 2018).
            One system that is currently being developed is the Iris Automation Collision Avoidance System (Figure 1).  The system, designed and built by Iris Automation, utilizes computer vision (Figure 2) and advanced algorithms to detect obstacles such as aircraft, terrain, and even wildlife.  Utilizing a monocular vision camera system and advanced algorithms that generate a highly detailed and robust computer vision image, the collision avoidance system can reliably detect, track, and avoid both static and moving obstacles (Technology, n.d.).  Monocular vision sensors enable single images to be processed to create three-dimensional spatial reconstructions.  With these reconstructions, distances can be determined by comparing real-time images captured by the camera to pictorial depth cues that are programmed into complex algorithms.  Monocular systems are commonly seen on small unmanned vehicles due to their low cost, weight, and power requirements (Corrigan, 2018).  A sketch of this monocular ranging idea follows Figure 2 below.

Figure 1: Iris Collision Avoidance System (Technology, n.d.)

Figure 2: Computer Vision Example (Technology, n.d.)
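Iris Automation has not published its algorithms, so as background only, here is a Python sketch of the textbook pinhole-camera relation that makes monocular ranging possible once a true size for the target is assumed.  The wingspan, pixel span, and focal length values below are invented for illustration.

```python
def monocular_range_m(true_size_m: float, size_in_image_px: float,
                      focal_length_px: float) -> float:
    """Pinhole-camera range estimate: distance = focal * real_size / image_size.
    Requires an assumed true size for the target (a 'pictorial depth cue')."""
    return focal_length_px * true_size_m / size_in_image_px

# A Cessna-class wingspan (~11 m) spanning 80 px with a 1100 px focal length:
print(monocular_range_m(11.0, 80.0, 1100.0))  # ~151 m
```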

The system is designed to be highly modular and configurable so that users can incorporate varying numbers of cameras for up to 360-degree situational awareness.  Additionally, the system will be available as a software-only package so that users can integrate third-party sensors into the system via a seamless plug-and-play interface.  This allows the vision system to be fitted to nearly any type of small unmanned system, including single- and multi-rotor helicopters and fixed-wing aircraft.  The collision avoidance system is designed to be rugged, is built to MIL-SPEC standards, and has been tested in varied and extreme weather conditions and environments.  While specific details of the size, weight, power, and cost are not yet available because the system is still in the design and testing phase, Iris Automation has promised that this system will be ultra-low SWaP-C optimized and a perfect addition to any small unmanned aerial vehicle (Technology, n.d.).
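The plug-and-play claim suggests a detector-agnostic software interface.  The Python sketch below is purely my own illustration of what such an abstraction could look like; it is not Iris Automation's API, and every name in it is hypothetical.

```python
from abc import ABC, abstractmethod

class DetectionSource(ABC):
    """Illustrative plug-in interface for third-party sensors; hypothetical."""
    @abstractmethod
    def detections(self) -> list[dict]:
        """Return detected obstacles as {'bearing_deg': ..., 'range_m': ...}."""

class MonocularCamera(DetectionSource):
    def detections(self) -> list[dict]:
        return [{"bearing_deg": 10.0, "range_m": 150.0}]  # stubbed output

def nearest_obstacle(sources: list[DetectionSource]) -> dict | None:
    """Fuse all plugged-in sources and return the closest detection."""
    all_hits = [d for s in sources for d in s.detections()]
    return min(all_hits, key=lambda d: d["range_m"], default=None)

print(nearest_obstacle([MonocularCamera()]))
```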
            To further prove the capabilities of the Iris Collision Avoidance System, several tests have been completed utilizing a live flight with a manned aircraft for the system to detect and avoid.  During one such test, the obstacle aircraft (a manned Cessna 162) made three passes at varying distances.  The first two passes were conducted at 150 meters and 200 meters respectively.  On each pass the detection system was able to not only detect and track the aircraft but also determine its distance from the unmanned vehicle and provide a baseline classification of the size of the obstacle aircraft (Figure 3).  On the third pass, the obstacle aircraft came within approximately 500 meters of the unmanned vehicle.  During this pass the system was able to reliably detect and track the aircraft but was not able to classify the aircraft's size.  This test demonstrates the system's ability to detect and track even distant objects to ensure full awareness (Technology, n.d.).

Figure 3: Example of what the vision system sees.  Image captured on the first pass at about 150 meters (Technology, n.d.)

            Overall, the Iris Automation Collision Avoidance System is a perfect fit for almost all types of small unmanned aerial vehicles.  Its small size, weight, and power requirements, combined with its expected low cost, will make it not only easy to integrate into all types of vehicles but also easily attainable for even the most entry-level consumer in the drone industry and market.

References

Corrigan, F. (2018, June 17). Top Collision Avoidance Drones And Obstacle Detection Explained. Retrieved August 17, 2018, from https://www.dronezon.com/learn-about-drones-quadcopters/top-drones-with-obstacle-detection-collision-avoidance-sensors-explained/

Technology. (n.d.). Retrieved August 17, 2018, from https://www.irisonboard.com/technology/

UAS Crewmember/Operator Requirements

What do you think are the most important factors when selecting, certifying, and training UAS Operators?

There are many im...