Advances in Autonomous Systems


In 2013, within the context of the UAS Roadmap and the Next Generation Air Transportation System (NextGen), Congress asked the FAA to integrate UAS into the National Airspace System (NAS) by 2015, but planners have acknowledged that this integration has proven challenging. NextGen will use satellite-based information to enhance the efficiency of the NAS, with the ability to integrate routine UAS flights. A primary concern of NextGen and the integration of UAS is the safety of manned aircraft. To ensure the safety of manned and other aircraft, UAS must have sense-and-avoid systems: either communication with, or autonomous recognition of, other aircraft. Some experts in the field have estimated that the necessary sense-and-avoid technology will not be available for at least another decade (Snow, 2016).

Even though Congress's deadline may have been overly ambitious, advances in sense-and-avoid technology are indeed coming along. In 2015, General Atomics successfully demonstrated its onboard Due Regard Radar (DRR), an air-to-air radar system that allows an unmanned aircraft to detect other aircraft even when those aircraft carry no compatible communication equipment (Merlin, 2015). "The DRR is comprised of a two panel Active Electronically Scanned Array (AESA) Antenna and a Radar Electronics Assembly (REA) that give the RPA pilot the ability to detect and track aircraft across the same Field-of-View (FOV) as a manned aircraft. AESA technology allows DRR to track multiple targets while simultaneously continuing to scan for new aircraft" (General Atomics, 2017). Additionally, companies like DJI and Intel, which produce small commercial UAS, have introduced sensor suites that allow these aircraft to see and react to their environment (obstacle avoidance) on popular product lines such as the Phantom quadcopter. "The DJI Phantom 4 drone can intelligently navigate around obstacles by using computer vision to reconstruct a 3D model of its environment and then use that information to avoid objects on the return home" (J., C., 2016).
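To make the sense-and-avoid idea concrete, here is a minimal sketch of the kind of conflict check such a system must perform. This is purely illustrative and is not the DRR algorithm: the track format, the 500 m safety radius, and the straight-line motion assumption are all my own simplifications. Given radar tracks expressed relative to the ownship, it computes each intruder's closest point of approach (CPA) and flags any track that will penetrate the safety bubble.

```python
# Hypothetical sense-and-avoid conflict check (not the DRR algorithm).
# Assumes straight-line relative motion and a made-up 500 m safety radius.
from dataclasses import dataclass

@dataclass
class Track:
    x: float   # relative position east, metres
    y: float   # relative position north, metres
    vx: float  # relative velocity east, m/s
    vy: float  # relative velocity north, m/s

SAFETY_RADIUS_M = 500.0  # assumed separation minimum (illustrative)

def time_to_cpa(t: Track) -> float:
    """Future time (s) at which relative distance is smallest; 0 if diverging."""
    v2 = t.vx ** 2 + t.vy ** 2
    if v2 == 0.0:
        return 0.0
    tau = -(t.x * t.vx + t.y * t.vy) / v2
    return max(tau, 0.0)

def cpa_distance(t: Track) -> float:
    """Separation (m) at the closest point of approach."""
    tau = time_to_cpa(t)
    dx, dy = t.x + t.vx * tau, t.y + t.vy * tau
    return (dx ** 2 + dy ** 2) ** 0.5

def conflicts(tracks):
    """Return the tracks predicted to violate the safety bubble."""
    return [t for t in tracks if cpa_distance(t) < SAFETY_RADIUS_M]

tracks = [
    Track(x=3000, y=0, vx=-60, vy=0),     # head-on closure: conflict
    Track(x=3000, y=2000, vx=-60, vy=0),  # passes 2 km abeam: no conflict
]
print(len(conflicts(tracks)))  # 1
```

A real system would add sensor uncertainty, turning targets, and a resolution maneuver, but the CPA test above is the core geometric question every sense-and-avoid loop has to answer.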

Another innovation in the field of unmanned systems and robotics concerns the way in which they are controlled. Semi-autonomous unmanned systems that require a degree of human-in-the-loop control are operated through a ground control station (GCS). GCS complexity varies with the platform, as do the command actuators, for example joysticks or buttons. The Predator, a large fixed-wing UAV used by the military, is controlled and managed by human pilots through a large, cockpit-style GCS often housed in a trailer. Smaller commercial UAS are often controlled with a handheld controller linked to a mobile device.

Brain-computer interface (BCI) technology has been around since the 1970s and has primarily been used for medical purposes, including the control of mechanical appendages. However, BCI is also applicable to the control systems of unmanned systems. A BCI interprets electrical signals from the brain into machine codes or commands (Chen et al., 2010). These signals are detected by surface electrodes in a cap or headset, though electrodes can also be surgically implanted (Yuan & He, 2014). A headset is "calibrated to identify the electrical activity associated with particular thoughts in each wearer's brain — recording, for example, where neurons fire when the wearer imagines pushing a chair across the floor. Programmers write code to translate these 'imaginary motion' signals into commands that computers send to the drones" (Dearen, 2016). In this way, BCI technology should be differentiated from mind control by will alone: the requirement of thinking specific thoughts to produce specific brain signals is a current limitation of the technology. Still, BCI could reduce the latency that inhibits human pilots' reaction speed and could streamline the control process, shrinking the footprint of current GCS. Various university departments across the United States have invested time and study in this exciting field.
The world’s first “mind control drone race” was held at the University of Florida in 2016 (Furness, 2016).
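The calibrate-then-translate loop Dearen describes can be sketched in a few lines. Everything here is a toy assumption of mine: the feature vectors stand in for per-wearer EEG band-power measurements recorded during calibration, and the command names are invented. At run time, the template nearest to the live features wins and is mapped to a drone command.

```python
# Toy sketch of the BCI control loop described above (illustrative only).
# TEMPLATES holds hypothetical calibration features recorded while the
# wearer imagined each motion; the values are made up for this example.
import math

TEMPLATES = {
    "forward": [0.9, 0.1, 0.2],
    "left":    [0.2, 0.8, 0.1],
    "right":   [0.1, 0.2, 0.9],
}
COMMANDS = {"forward": "PITCH_FWD", "left": "ROLL_LEFT", "right": "ROLL_RIGHT"}

def classify(features):
    """Return the imagined-motion label whose calibration template is nearest."""
    return min(TEMPLATES, key=lambda k: math.dist(TEMPLATES[k], features))

def to_command(features):
    """Translate live signal features into a drone command."""
    return COMMANDS[classify(features)]

print(to_command([0.85, 0.15, 0.25]))  # PITCH_FWD
```

Real BCIs classify far noisier signals with statistical models rather than a nearest-template rule, which is exactly why each headset must be calibrated to its individual wearer.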
References:

Chen, W., Zhang, J., Zhang, J., Li, Y., Qi, Y., Wu, B., Zhang, S., Dai, J., Zheng, X., & Xu, D. (2010). A P300-based online brain-computer interface system for virtual hand control. Journal of Zhejiang University: Science C, 11(8), 587-597. doi:10.1631/jzus.C0910530. Retrieved from http://www.researchgate.net/publication/225808225_A_P300_based_online_brain-computer_interface_system_for_virtual_hand_control

Dearen, J. (2016). Mind-controlled drones race to the future. Powering the New Engineer. Retrieved 5 June 2017.

Furness, D. (2016). The University of Florida just held the world's first mind-controlled drone race. Digitaltrends.com. Retrieved 5 June 2017.

General Atomics. (2017). Due Regard Radar (developmental). General Atomics Aeronautical Systems, Inc. Retrieved 5 June 2017.

Yuan, H., & He, B. (2014). Brain-computer interfaces using sensorimotor rhythms: Current state and future perspectives. IEEE Transactions on Biomedical Engineering, 61(5), 1425-1435. doi:10.1109/TBME.2014.2312397. Retrieved from http://ieeexplore.ieee.org.ezproxy.libproxy.db.erau.edu/stamp/stamp.jsp?tp=&arnumber=6775293&isnumber=6800026


J., C. (2016). The new DJI Phantom 4 with auto obstacle avoidance technology. Air Drone Craze. Retrieved 5 June 2017, from http://www.airdronecraze.com/the-new-dji-phantom-4-with-auto-obstacle-avoidance-technology/

Merlin, P. (2015). NASA, FAA, industry conduct initial sense-and-avoid test. NASA. Retrieved 28 Oct. 2016, from https://www.nasa.gov/centers/armstrong/Features/acas_xu_paves_the_way.html

Snow, C. (2016). Sense and avoid for drones is no easy feat. Drone Analyst. Retrieved 28 Oct. 2016, from http://droneanalyst.com/2016/09/22/sense-and-avoid-for-drones-is-no-easy-feat/
