Airside Surveillance by Computer Vision in Low-Visibility and Low-Fidelity Environment
Abstract
Low visibility can severely reduce the airside capacity of an airport and can cause ground delays and runway/taxiway incursions. With the advent of digital towers, enabled through live camera feeds, computer vision can contribute to airside surveillance to enhance safety and improve operational efficiency. However, digital camera technology presents its own challenges: technical issues may degrade video quality, producing low-fidelity transmission effects such as blurring, pixelation, or JPEG compression. Furthermore, poor weather conditions in an aerodrome, including rain, fog, and mist, can greatly reduce visibility, whether in the digital video or the out-of-tower view, degrading the visual situational awareness of tower controllers. This paper proposes a computer vision framework with deep learning algorithms to detect and track aircraft in low-visibility (due to bad weather) and low-fidelity (due to technical issues) environments, enhancing visibility from digital video input. The framework adopts a Convolutional Neural Network to detect aircraft and applies a Kalman filter to track them, especially under low-visibility conditions. Performance is further improved by pre- and post-processing algorithms, including object filtering, corrupted-image detection, and image enhancement. The proposed framework achieves a tracking accuracy of 0.91 on clean videos, and 0.79 and 0.74 in low-fidelity and low-visibility environments, respectively. On an airport video dataset from Houston airport, the framework is shown to be effective at improving visibility in poor weather conditions.
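The detect-then-track pipeline described above can be sketched as follows. This is a minimal illustrative implementation, not the paper's actual code: a constant-velocity Kalman filter tracks an aircraft's image-plane position, with the CNN detector abstracted to (x, y) measurements; all matrix values (noise covariances, initial uncertainty) are assumptions chosen for illustration.

```python
import numpy as np

class KalmanTracker:
    """Constant-velocity Kalman filter over state [x, y, vx, vy].

    Only the position (x, y) is observed, e.g. the centroid of a CNN
    detection. predict() can be called without update() to coast a
    track when detection fails (aircraft obscured by fog or noise).
    """

    def __init__(self, x0, y0, dt=1.0):
        self.x = np.array([x0, y0, 0.0, 0.0])          # state estimate
        self.P = np.eye(4) * 10.0                      # state covariance
        self.F = np.array([[1, 0, dt, 0],              # constant-velocity
                           [0, 1, 0, dt],              # transition model
                           [0, 0, 1, 0],
                           [0, 0, 0, 1]], dtype=float)
        self.H = np.array([[1, 0, 0, 0],               # observe position only
                           [0, 1, 0, 0]], dtype=float)
        self.Q = np.eye(4) * 0.01                      # process noise (assumed)
        self.R = np.eye(2) * 1.0                       # measurement noise (assumed)

    def predict(self):
        # Propagate the state one time step ahead.
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:2]

    def update(self, z):
        # Fuse a detector measurement z = (x, y) into the track.
        innovation = np.asarray(z, dtype=float) - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)       # Kalman gain
        self.x = self.x + K @ innovation
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]
```

A track would be fed one detection per video frame; when the detector returns nothing for a frame, calling only `predict()` keeps the track alive, which is how Kalman filtering helps maintain aircraft identity through visibility dropouts.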