Human Factors in the Use of Detect-and-Avoid Decision Support Tools by Remote Pilots of Unmanned Aircraft Systems

Authors

  • Nimal Perera, Department of Mechanical Engineering, University of Ruhuna, Matara Road No. 67, Matara 81000, Sri Lanka
  • Sajith Fernando, Department of Mechanical Engineering, South Eastern University of Sri Lanka, University Park, Oluvil 32360, Sri Lanka
  • Dulanjan Jayasuriya, Department of Mechanical and Manufacturing Engineering, University of Moratuwa, Katubedda, Moratuwa 10400, Sri Lanka

Abstract

Unmanned aircraft systems are being deployed in increasingly dense and heterogeneous airspace, with remote pilots operating beyond visual line of sight under constrained, mediated access to the external environment. Detect-and-avoid decision support tools have emerged to assist these operators in maintaining safe separation, resolving conflicts, and coordinating with conventional air traffic services. However, the effective use of such tools depends on how human cognitive, perceptual, and strategic processes adapt to complex automation that filters, transforms, and prioritizes information about surrounding traffic and environmental constraints. This paper examines human factors in the use of detect-and-avoid decision support tools by remote pilots of unmanned aircraft systems through an integrated, model-based lens that links operator workload, trust calibration, attention allocation, and decision dynamics to tool design characteristics and operational demands. A conceptual task analysis is combined with formal modeling of alert processing, evidence accumulation, and compliance with recommended maneuvers, and with a simulation-based framework that represents variable traffic geometries, uncertainty in sensor and surveillance inputs, and differing display configurations. Results from these models are used to articulate conditions under which detect-and-avoid support may mitigate, preserve, or shift error modes for remote pilots supervising single or multiple aircraft. The discussion emphasizes parameterized trade-offs, highlighting how apparently incremental changes in alerting thresholds or visualization methods can alter cognitive demands and decision latencies. The paper concludes with implications for design, training, and regulation that aim to support reliable, transparent, and predictable human use of detect-and-avoid tools, without assuming automation infallibility.
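The abstract's formal modeling of alert processing and compliance can be illustrated with a minimal evidence-accumulation (drift-diffusion) sketch, in which a remote pilot integrates noisy evidence after a detect-and-avoid alert until reaching a "comply" or "override" bound. The model, function name, and all parameter values below are illustrative assumptions, not the paper's actual formulation:

```python
import random

def simulate_alert_response(drift=0.12, threshold=1.0, noise=0.3,
                            dt=0.05, max_time=30.0, seed=0):
    """Simulate one pilot decision after a detect-and-avoid alert as a
    drift-diffusion process (illustrative only, not the paper's model).

    Evidence drifts toward +threshold ("comply" with the advised maneuver)
    or -threshold ("override"). Returns (decision, latency_seconds).
    """
    rng = random.Random(seed)
    evidence, t = 0.0, 0.0
    while abs(evidence) < threshold and t < max_time:
        # Gaussian increment: mean drift*dt, standard deviation noise*sqrt(dt)
        evidence += drift * dt + rng.gauss(0.0, noise) * dt ** 0.5
        t += dt
    decision = "comply" if evidence >= threshold else "override"
    return decision, t

# Sweep alert "strength" (drift) to see how compliance and latency trade off,
# mirroring the abstract's point that small alerting-threshold changes can
# shift decision latencies.
for drift in (0.05, 0.15, 0.30):
    outcomes = [simulate_alert_response(drift=drift, seed=s) for s in range(200)]
    comply = sum(1 for d, _ in outcomes if d == "comply") / len(outcomes)
    mean_rt = sum(t for _, t in outcomes) / len(outcomes)
    print(f"drift={drift:.2f}  P(comply)={comply:.2f}  mean latency={mean_rt:.1f}s")
```

In sketches like this, stronger or clearer alerting (higher drift) typically raises compliance rates and shortens decision latencies, which is one way to make the abstract's parameterized trade-offs concrete.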

Published

2025-11-04

How to Cite

Human Factors in the Use of Detect-and-Avoid Decision Support Tools by Remote Pilots of Unmanned Aircraft Systems. (2025). Journal of Experimental and Computational Methods in Applied Sciences, 10(11), 1-13. https://openscis.com/index.php/JECMAS/article/view/2025-11-04