Friday, July 27, 2018

Unmanned System Data Protocol and Format

There are many highly capable and effective unmanned aerial systems (UAS) available to consumers for missions of all shapes and sizes. One of the most common types of system for everyday civilian use is one designed for aerial photography and videography. The DJI Mavic Air is one of those systems (Figure 1). Designed to be small, compact and portable, the Mavic Air can capture 4K video and take high-resolution photos with ease (Mavic, n.d.).

Figure 1: DJI Mavic Air (Mavic, n.d.)

The Mavic Air employs a host of technology for capturing, or aiding in the capture of, high-quality video and photography. The system is designed around a small but capable 12-megapixel CMOS camera. The camera can capture video from 720p at 120 frames per second all the way up to 4K at 30 frames per second, and still pictures in multiple modes including panorama, sphere, interval and high dynamic range (Mavic Air, n.d.). This camera, while highly capable, can only capture high-quality images when given a smooth, stable platform. This is accomplished partly through a 3-axis gimbal system and partly through automation. One of the hallmark technologies of DJI's systems is Flight Autonomy 2.0, a system that does exactly what is needed: keep the camera as stable as possible. It uses a suite of sensors to enable both precision autonomous flight and reliable obstacle avoidance, allowing the operator to focus on getting the perfect shot. The system consists of the primary camera described above, dual-vision cameras mounted on the front, rear and bottom of the vehicle, and a downward-pointing infrared sensor (Figure 2). The information from these sensors is combined with data from dual inertial measurement units and sent to a set of processors that interpret it and generate precise commands for vehicle control (Mavic, n.d.).

Figure 2: Mavic Air Dual-Vision Sensors for Flight Autonomy 2.0 (Mavic Air, n.d.)

Each of the sensors in the Flight Autonomy 2.0 system plays a vital role in the control and safety of the vehicle. The dual-vision cameras are utilized to the maximum extent possible and play a two-fold role in the system. The first role is positioning using vision analysis: the cameras constantly monitor the position and motion of the vehicle to determine its location in space, a function of particular importance during indoor operations where GPS signals can be limited. Secondly, the cameras provide a 3-dimensional map to the control system, which can greatly increase the safety of the vehicle by improving its obstacle avoidance abilities. The infrared sensor is also used for height determination and 3D map building, ensuring that no objects or obstacles are missed by the vision system (Mavic, n.d.).

To power the system, the Mavic Air utilizes DJI's Intelligent Flight Battery system. The battery is a densely packed, 3-cell lithium polymer design with a capacity of 2375 milliamp-hours, operating at 11.55 volts (Figure 3). This allows for flight times of up to 21 minutes in ideal (no wind) conditions while flying at a speed of 25 kilometers per hour. Not only does this battery drive the four electric motors, it also powers all the sensors and cameras installed on the vehicle. Power consumption data for the individual sensors and onboard computers was not available, only figures for the overall system (Atkinson, 2018).

Figure 3: Mavic Air Intelligent Flight Battery (Mavic Air, n.d.)
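As a rough sanity check, the figures above can be turned into a small energy budget. This is only a sketch based on the quoted capacity, voltage and 21-minute endurance; the resulting average power draw is my own arithmetic, not a published DJI figure.

```python
# Rough energy budget for the Mavic Air battery (figures from the text above).
CAPACITY_AH = 2.375    # 2375 milliamp-hours
VOLTAGE_V = 11.55      # nominal pack voltage
FLIGHT_MIN = 21        # endurance in ideal (no wind) conditions

energy_wh = CAPACITY_AH * VOLTAGE_V          # ~27.4 Wh stored in the pack
avg_power_w = energy_wh / (FLIGHT_MIN / 60)  # average draw over a full flight

print(f"Stored energy: {energy_wh:.1f} Wh")
print(f"Average power draw: {avg_power_w:.1f} W")
```

That whole-flight average of roughly 78 watts covers the motors, sensors and cameras together, which is consistent with only overall system consumption figures being available.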

Data storage and processing for the Mavic Air is all done internally; the video sent back to the smart controller is only used for control purposes for both the vehicle and camera system and is not saved on the receiving end. The Mavic Air contains eight gigabytes of internal data storage utilizing the FAT32 file system. Video is captured and stored in either the MP4 or MOV file format, and still imagery is saved in either the JPEG or DNG (RAW) format. The size of captured videos varies drastically with the camera settings used and the duration of the flights the operator is conducting. Since 4K video files (the worst-case scenario for storage) can quickly become rather large, eight gigabytes is often not enough storage. For this reason, the Mavic Air has expandable storage through an external SD card slot, which accepts cards up to 128 gigabytes in size. Files stored on the SD card are processed and stored using the same file system and file formats. In general, data from the sensing suite utilized by the Flight Autonomy 2.0 system is not stored by the Mavic Air. The only exception is the 3-dimensional map that the system generates during its flights; however, no information was readily available on how large these files are or what file format is used (Mavic Air, n.d.).
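To see why eight gigabytes fills up quickly, here is a back-of-the-envelope estimate of recording time. The 100 Mbit/s 4K bitrate is an assumption for the sketch (actual bitrate depends heavily on camera settings), and decimal gigabytes are used throughout.

```python
# Back-of-the-envelope 4K recording time, assuming a 100 Mbit/s bitrate
# (the worst case described above; real bitrates vary with settings).
BITRATE_MBPS = 100          # assumed maximum video bitrate
INTERNAL_GB = 8             # internal FAT32 storage
SD_GB = 128                 # largest supported SD card

def recording_minutes(storage_gb, bitrate_mbps=BITRATE_MBPS):
    """Minutes of video that fit in storage_gb gigabytes at bitrate_mbps."""
    bits = storage_gb * 8e9                # decimal GB -> bits
    return bits / (bitrate_mbps * 1e6) / 60

print(f"Internal: {recording_minutes(INTERNAL_GB):.1f} min")
print(f"128 GB SD: {recording_minutes(SD_GB):.1f} min")
```

Under these assumptions the internal storage holds only about ten minutes of worst-case video. Note also that FAT32 caps individual files at 4 GiB, so long recordings are typically split across multiple files regardless of card capacity.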

There are a few ways that the Mavic Air could be improved upon when it comes to file storage and processing. One thing the system can never have enough of is storage. Currently, the vehicle has only one SD card slot, but because these cards are so small and light, it would be very easy to add additional slots to increase the amount of storage available. Another data handling enhancement could be the ability to store primary camera data off the vehicle via a transmission system. Not only could this allow storage to be increased easily, since weight would be a much smaller factor, but it would also reduce the power consumption and weight of the vehicle, which could improve flight time and performance. One limitation of this style of data management would be bandwidth: the current transmission system has only a limited amount, so bandwidth allocations would need to be adjusted. Additionally, high-resolution video would take longer to transmit, introducing delays in the video feed that would make real-time, manual control impossible for the operator. Despite these shortfalls, the advantages of increased flight times and storage capacities could greatly enhance the Mavic Air.
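The bandwidth concern can be made concrete with a simple ratio. Both numbers below are assumptions chosen for illustration; neither the recording bitrate nor the usable downlink bandwidth is specified in the sources above.

```python
# Illustrative only: how link bandwidth limits live offload of 4K video.
# Both figures are assumptions for the sketch, not published DJI specs.
VIDEO_BITRATE_MBPS = 100   # assumed 4K recording bitrate
LINK_MBPS = 40             # assumed usable downlink bandwidth

# Seconds of transmission needed per second of captured footage.
backlog_ratio = VIDEO_BITRATE_MBPS / LINK_MBPS
print(f"{backlog_ratio:.1f} s of link time per 1 s of footage")
```

Any ratio above 1.0 means the feed falls steadily behind real time, which is exactly why live, manual control over such a link would be impossible without reallocating bandwidth or reducing video quality.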

References

Atkinson, D. (2018, February 23). DJI Mavic Air in-depth series part 1: Intelligent Flight Battery. Heliguy Insider. Retrieved July 26, 2018, from https://www.heliguy.com/blog/2018/02/23/dji-mavic-air-in-depth-series-part-1-intelligent-flight-battery/

Mavic Air: Adventure Unfolds. (n.d.). Retrieved July 26, 2018, from https://www.dji.com/mavic-air?site=brandsite&from=nav

Mavic Air– Specs, FAQ, Tutorials and Downloads. (n.d.). Retrieved July 26, 2018, from https://www.dji.com/mavic-air/info#specs

Thursday, July 19, 2018

UAS Sensor Placement


There are many things that one must take into consideration when purchasing an unmanned aerial system.  Of course, one must first decide what purpose the UAS will serve.  This will then lead to a decision on what sensors and capabilities the UAS will need to have.  One thing a user must determine is where sensors must be placed to best fulfill their purpose on the UAS.  While UASs designed for professional video and image capture and those designed for first-person-view racing can look very similar, they are designed quite differently.  The remainder of this discussion will focus on the design criteria and selection of a commercially available UAS for each purpose.

There are many things that must be considered when selecting an unmanned aerial system (UAS) for aerial video and photography.  One of the most important factors is the placement and capability of the primary camera.  There are many purpose-built unmanned aerial systems for video and image capture, and one of them, the DJI Phantom 4 Pro V2.0 (Figure 1), is arguably among the best on the market.  It is equipped with a one-inch, 20-megapixel camera which can capture 4K video at 60 frames per second and still imagery at up to 14 frames per second.  The Phantom 4 Pro V2.0 can stream live video back to the controller during flight using DJI's OcuSync HD transmission system.  Utilizing dual-frequency transmission, OcuSync can provide crystal-clear streaming video at ranges of up to seven kilometers.  Finally, the transmission system constantly assesses the environment it is operating in and adjusts to maintain the optimum signal with minimal interference (Phantom 4 pro v2, n.d.).

Figure 1: DJI Phantom 4 Pro V2.0 UAS for aerial video and imagery capture (Phantom 4 pro v2, n.d.)

When it comes to the Phantom 4 Pro V2.0, there are two main reasons that I selected it for use in video and still photography.  First is the camera placement.  Mounted underneath the main body of the UAS, the camera is free from interference by the propellers and the body of the vehicle.  The only obstacle the camera must worry about is the vehicle's landing gear, and the highly capable Flight Autonomy system that DJI has created can compensate for that.  This is exactly why the Flight Autonomy system is the second reason I picked the Phantom 4 Pro V2.0.  The Flight Autonomy software features full, five-direction obstacle avoidance to help keep operators' minds at ease and focused on getting the perfect shot.  Finally, Flight Autonomy features a host of tracking and safety software to help operators fly the UAS efficiently (Phantom 4 Pro, n.d.).

A racing drone has many different parts that must be considered.  Eachine makes a highly capable first-person-view racing drone called the Wizard X220 (Figure 2).  This racing UAS can achieve speeds of up to 50 miles per hour and is highly maneuverable thanks to its motor mounting and its light weight of only 364 grams (without a battery).  It can fly for up to twelve minutes using either a 4S or 3S battery and has directional indicator lights for easy directional control.  It is controlled via a 5.8 GHz handheld transmitter, which can also receive a live video stream from the UAS for control purposes.  Many racers also use first-person-view goggles when flying, which this UAS is easily set up for.  The main reason for choosing this UAS for racing is the way its front-mounted camera is set up.  Many racing UAS have the camera fixed to look directly forward.  However, the Wizard X220 has a front-mounted camera that can tilt up and down.  This gives the operator maximum forward view, even when the UAS is flying in the forward-tilted attitude for maximum speed (Eachine, n.d.).  This extra viewing angle could potentially help the operator edge out the competition when using the Wizard X220.

Figure 2: Wizard X220 First Person View Racing UAS (Eachine, n.d.)

References:

Eachine Wizard X220 FPV Racing Drone. (n.d.). Retrieved July 19, 2018, from https://www.eachine.com/Eachine-Wizard-X220-FPV-Racing-Drone-Blheli_S-F3-6DOF-2205-2300KV-Motors-5_8G-48CH-200MW-VTX-700TVL-Camera-ARF-Version-p-569.html
Phantom 4 Pro - Professional aerial filmmaking made easy. (n.d.). Retrieved July 19, 2018, from https://www.dji.com/phantom-4-pro
Phantom 4 pro v2 | quadcopter for aerial photography. (n.d.). Retrieved July 19, 2018, from https://www.dji.com/phantom-4-pro-v2?site=brandsite&from=homepage

Saturday, July 14, 2018

Unmanned Systems Maritime Search and Rescue


Malaysia Airlines flight MH370 disappeared somewhere over the Indian Ocean over four years ago, with all 239 souls lost.  The aircraft is believed to have crashed somewhere between 1,500 and 2,700 kilometers west of Australia (Figure 1) (A Fantastical, 2018).  The depth of the ocean in this area ranges from 1,500 meters to almost 6,000 meters, presenting a huge challenge for any search mission (Deep, 2014).  Over the past four years, Fugro has mounted a sizeable search for the ill-fated MH370 using autonomous underwater vehicles (AUVs) such as the Bluefin-21, covering over 120,000 square kilometers of ocean floor, but none have returned any findings.  Many believe that with the amount of time that has passed, all hope of ever finding the missing aircraft on the unusually rugged seafloor has been lost (A Fantastical, 2018).

Figure 1: Depth profile of Indian Ocean where MH370 was believed to have crashed (Deep, 2014)

One company believes there is still hope.  Ocean Infinity, a Houston, Texas based company, has chartered the Seabed Constructor, a research ship, and fitted it with eight HUGIN 6000 AUVs.  They believe that by using eight HUGIN 6000s they will be able to cover over 1,200 square kilometers per day, allowing them to search much more quickly than previous efforts and thus to cover considerably more ocean floor over the duration of their voyage.  The new search area, covering roughly 25,000 square kilometers, sits just north of the original search area and was chosen by experts through careful evaluation of wreckage that has washed ashore since the crash (Figure 2).  If nothing is found, the vessel will continue even further north to a location some independent experts think might yield results (A Fantastical, 2018).  So why did Ocean Infinity choose to launch an unmanned mission?  Simply put, the ocean is a dangerous place, especially at the depths where MH370 is believed to lie.  The HUGIN can stay submerged for up to 60 hours, far longer than any human mission could, and if anything goes wrong and a sub is lost, no lives are put at risk.  A manned mission does not provide any benefits over an unmanned one; the sensors are the same.  The main differences would be duration and analysis: the only benefit a human crew could provide is real-time data analysis, which would allow a manned sub to immediately investigate further if a possible discovery is made.

Figure 2: New search area, based on wreckage that washed ashore (A Fantastical, 2018)
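A quick division shows why the eight-vehicle approach matters. Using the coverage rate and search area quoted above (and assuming continuous, weather-free operations):

```python
# How long the new search area takes to cover at the quoted rate.
AREA_KM2 = 25_000         # new search area, square kilometers
RATE_KM2_PER_DAY = 1_200  # with all eight HUGIN 6000s working

days = AREA_KM2 / RATE_KM2_PER_DAY
print(f"~{days:.0f} days of continuous operations")
```

Roughly three weeks for the whole area; at one-eighth the rate, a single AUV doing the same job would need closer to half a year.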

The HUGIN 6000 AUV, the vehicle of choice for Ocean Infinity, is made by Kongsberg, a company whose passion lies in the water.  The HUGIN is a highly capable system that can reach depths of up to 6,000 meters and is fitted with numerous sensors for underwater search and surveillance.  Fitted with a massive 300-kilogram lithium-polymer battery pack, each of the eight AUVs will be able to stay submerged for up to 60 hours (A Fantastical, 2018).  This endurance, coupled with a sustainable speed of 3.6 knots, will allow the AUVs to cover significant ground on each excursion to the deep (Figure 3).  For search and surveillance, the HUGIN is fitted with multiple sensors specifically made for maritime environments.  These include an EdgeTech 2205 side-scan sonar, Kongsberg's own EM 2040 echosounder, an EdgeTech 2-16 kHz bottom profiler, an HD CathX color still camera, a turbidity sensor and a magnetometer (Figure 4).  The AUV is also equipped with an acoustic tether, allowing it to be operated in either a semi-autonomous or fully autonomous manner during its 60-hour missions (HUGIN, n.d.).
Figure 3: Internal view of HUGIN 6000 (HUGIN, n.d.)
Figure 4: External view of HUGIN 6000 (HUGIN, n.d.)
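The endurance and speed figures above imply a substantial survey track per sortie. A small sketch of the arithmetic (the result ignores descent, ascent and turns, so it is an upper bound):

```python
# Distance a single HUGIN can travel in one sortie, from the endurance
# and sustained speed quoted above.
ENDURANCE_H = 60     # maximum submerged time, hours
SPEED_KN = 3.6       # sustainable speed, knots
KM_PER_NM = 1.852    # kilometers per nautical mile

range_km = ENDURANCE_H * SPEED_KN * KM_PER_NM
print(f"~{range_km:.0f} km of track per 60-hour sortie")
```

About 400 kilometers of survey track per dive, which, multiplied across eight vehicles and wide sonar swaths, is what makes the 1,200 square kilometers per day coverage rate plausible.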

Not all the sensors fitted on the HUGIN are required to search for MH370.  However, three of its exteroceptive sensors are very well suited to the maritime environment in which MH370 likely came to its final resting place.  The bottom profiler, a sonar system, will be used to profile the area below the sub to map the ocean floor, aid in navigation and control, and look for debris.  The EdgeTech 2205 side-scan sonar is where the HUGIN really makes its money.  This sonar scans the ocean floor on each side of the sub and essentially measures the reflectivity, or intensity, of the acoustic signals that return.  Since metal reflects sound more strongly than sand, the aircraft should stand out well if it is found.  Then, if a possible metallic object is detected, the magnetometer will spring into action to confirm the discovery of metal on the ocean floor.  All of this data is stored internally on the sub for evaluation by humans post-mission.  If, after human review, the stored data presents a possible finding, one of the eight HUGINs will be sent back to that area for a more detailed check and will be assigned to take pictures of the area with its onboard HD camera (A Fantastical, 2018).
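The detection flow described above (side-scan first, then magnetometer and camera to confirm) can be sketched as a simple thresholding step. Everything here, from the intensity values to the background level and factor, is invented for illustration; it is not Ocean Infinity's actual processing pipeline.

```python
# Toy illustration of side-scan target flagging: mark returns whose
# intensity stands out from the sandy-seafloor background.

def flag_candidates(intensities, background=0.2, factor=3.0):
    """Return indices whose return intensity exceeds factor * background."""
    return [i for i, v in enumerate(intensities)
            if v > factor * background]

# Simulated swath: mostly sand (~0.2), one strongly reflective object.
swath = [0.18, 0.21, 0.19, 0.95, 0.20, 0.22]
candidates = flag_candidates(swath)
print(candidates)  # positions to revisit with the magnetometer and camera
```

In a real system the background level would itself be estimated from the surrounding returns, but the principle is the same: strong acoustic reflectors become candidate positions for the magnetometer and camera pass.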

While the HUGIN is a highly capable AUV, there are several things that could make it better.  First is better autonomy for data analysis.  This would allow the sub to scrub through the information it is collecting in real time to determine if there is anything of interest on the sea floor, then maneuver and activate its HD camera right away to capture images of any possible findings.  That information could be stored for human analysis post-mission while the sub resumed its original routing.  This would increase the efficiency of the mission, since subs would not have to return to a location except in the case of a likely positive discovery.  Another possible upgrade is the addition of unmanned aerial systems (UAS).  Each AUV could be fitted with a communications relay that sends data to an orbiting UAS during short periods at the surface throughout its mission.  The UAS could then send that data over the horizon to a mother ship for analysis in near real time.  If the data showed something of interest, new instructions could be relayed back to the sub to gather more information in a specific location.  This could again increase the efficiency of the mission.

References:

A fantastical ship has set out to seek Malaysian Airlines flight 370. (2018, January 02). Retrieved July 13, 2018, from https://www.economist.com/science-and-technology/2018/01/02/a-fantastical-ship-has-set-out-to-seek-malaysian-airlines-flight-370

Deep sea challenge for MH370 search. (2014, April 15). Retrieved July 13, 2018, from https://www.bbc.com/news/world-asia-27033921

HUGIN 6000. (n.d.). Retrieved July 14, 2018, from https://oceaninfinity.com/wp-content/uploads/AUV_Factsheet.pdf

Saturday, July 7, 2018

Environmental Monitoring Using Unmanned Vehicles


Response to the article: Development and validation of a UAV based system for air pollution measurements by Villa, Salimi, Morton, Morawska, & Gonzalez (2016).

Environmental monitoring is one of the numerous applications that have yet to reach their full potential in the unmanned realm.  Villa, Salimi, Morton, Morawska and Gonzalez (2016) researched the possibility of using unmanned systems for pollution measurement and developed an unmanned aerial vehicle optimized for this task.  While the selection criteria for the vehicle itself are not covered extensively in the article, the authors discussed some of the challenges of sensor placement on the vehicle due to interference from the wash of its propeller system.  For their platform, they chose the DJI S800 EVO, a hexacopter that can carry up to a 20-kilogram payload (Figure 1).  After modifications, the S800 EVO was able to sustain flight for about 12-13 minutes with the payload required for the study (Villa, Salimi, Morton, Morawska & Gonzalez, 2016).

Figure 1: S800 EVO Hexacopter (Villa et al., 2016)

For sensors, the S800 EVO was fitted with four exteroceptive gas sampling sensors, three for detection of CO, NO and NO2 and one for detection of CO2 (Figure 1).  For CO, NO and NO2 sensing, the Alphasense gas sensor was selected, which utilizes electrochemical cells to detect gas levels.  For CO2 detection, the SprintIR sensor was selected; it uses non-dispersive infrared (NDIR) technology to detect gas levels.  To evaluate the readings from the probes, the DISCmini was selected.  This monitoring system can detect concentrations from 10³ to 10⁶ p/cm³ while remaining small and light enough to be fitted to the S800 EVO UAV platform (Villa et al., 2016).

During the two tests conducted by these researchers, they explored the effects of the propeller wash on the readings of the probes as well as the best placement of the sensors to minimize the effects the vehicle had on the readings.  The first test explored the best positioning for the gas probes.  Using a boom with the gas sensors attached, experiments were conducted with the boom in three different locations: to the side of, above and below the UAV.  In each of these tests they aimed to find where the effects of propeller air mixing and turbulence would be lowest.  Figure 2 shows the results of the four variations completed during test one.  The results clearly indicate that the lowest effect was experienced along the X-axis of the vehicle at distances between 1000 mm and 1200 mm.

Figure 2: Results from test 1 (Villa et al., 2016)
            
The second test they conducted was to determine the effect of the propeller wash on the concentration of the gases.  They conducted several variations of the test with the sensor placed in various positions relative to the gas and with the vehicle's propellers on and off.  Figures 3 and 4 describe the test results.  As expected, the propellers have a significant effect on concentration levels.  Additionally, when the sensors are positioned within the gas plume, the concentration levels rise (Villa et al., 2016).

Figure 3: Gas concentration comparison with UAV propellers on and off (Villa et al., 2016)


Figure 4: Gas concentration with sensor above, below and inside gas plume (Villa et al., 2016)
           
While the results of these tests are only conclusive for the gases that were tested, they show that environmental monitoring using UAVs is a possibility.  There will be several limitations, such as gas type, wind speed and UAV design, but with further research some of these challenges could be overcome (Villa et al., 2016).  Many developers and companies have begun similar work.  The EPA has expressed interest in the use of UAVs for many different types of environmental monitoring, highlighting the potential of these systems to revolutionize the industry.  Its interests range from gas and pollutant monitoring to wildlife, wildfire and industrial emission monitoring (The Role, 2015).  This technology, once properly researched and perfected, could have a profound effect on the health of our planet and the steps that we as humans take to protect it.


References:

The Role of Unmanned Aerial Systems-Sensors in Air Quality Research. (2015, November 16). Retrieved July 7, 2018, from https://cfpub.epa.gov/si/si_public_record_report.cfm?dirEntryId=310260
Villa, T. F., Salimi, F., Morton, K., Morawska, L., & Gonzalez, F. (2016). Development and validation of a UAV based system for air pollution measurements. Sensors, 16(12), 2202. doi:http://dx.doi.org.ezproxy.libproxy.db.erau.edu/10.3390/s16122202

About the Author

Welcome All!

My name is Douglas Weidman and I am an Officer in the United States Air Force.  I currently work as a Remotely Piloted Aircraft (RPA) Pilot.

I graduated from Missouri University of Science and Technology in 2012 with a Bachelor's degree in Aerospace Engineering.  After seven years of working in the unmanned systems field, I have decided to pursue a Master's degree in Unmanned Systems from Embry-Riddle Aeronautical University to further my knowledge and understanding of all things unmanned.  My passion is aviation, but I am open to exploring unmanned systems across all domains.

This blog is dedicated to my studies of unmanned systems at Embry-Riddle and will feature articles and discussions about all things unmanned.

You are welcome to leave a comment on any of my posts, but please remain professional and constructive.
