Autonomous Aerial Vehicle Vision and Sensor Guided Landing
School of Computer Science & Engineering Faculty Publications
  • Gabriel Bitencourt, Sacred Heart University
  • Elijah J. Brown, Sacred Heart University
  • Cedric Bleimling, Sacred Heart University
  • Gilbert Lai, Quanser, Markham, Canada
  • Arman Molki, Quanser, Markham, Canada
  • Tolga Kaya, Sacred Heart University
Document Type
Peer-Reviewed Article
Publication Date
5-1-2021
Abstract

Demand for the autonomous landing of aerial vehicles is increasing. Applications of this capability range from simple drone delivery to unmanned military missions. The ability to land at a spot identified by local information, such as a visual marker, creates an efficient and versatile solution and allows for a more user/consumer-friendly device overall. To achieve this goal, the use of computer vision and an array of ranging sensors is explored. In our approach, we utilized an AprilTag as our location identifier and point of reference. The MATLAB/Simulink interface was used to develop the platform environment.

"Walmart has since upgraded its experimentation to delivering COVID-19 tests in the area around its store location in North Las Vegas"--p.1.

Comments

This research was a collaboration between SHU Engineering and Quanser on a drone system. It was the first international conference paper by undergraduate students Bitencourt and Brown.

This project was also presented at the 2021 Academic Festival, where it received an Honorable Mention for the Dean's Prize: Welch College of Business and Technology.

Creative Commons License
Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International
Citation Information

Bitencourt, G., Brown, E. J., Bleimling, C., Lai, G., Molki, A., & Kaya, T. (2021, May). Autonomous aerial vehicle vision and sensor guided landing. IEEE International Conference on Electro/Information Technology, Central Michigan University, Mount Pleasant, MI.