The objective of this project is to explore how an autonomous vehicle should safely respond to different classes of emergency vehicles using sound, vision, and other onboard sensors. Emergency vehicles may belong to police, fire, medical, and other responders. In the presence of an emergency vehicle, the autonomous vehicle must be able to accurately sense its surroundings in real time and safely yield to the emergency vehicle. System safety is the main theme of this work, conducted with TEEX Law Enforcement and Security Training and, through them, with local police and fire departments. Currently, no integrated system approach exists for autonomous vehicles in emergency scenarios that considers both sensing and control together with test procedures and safety analysis. This project addresses that gap and experimentally verifies the performance of the proposed algorithms on our autonomous vehicle (a Ford Lincoln MKZ) available through the Connected Autonomous Safe Transportation (CAST) program. The research performed through this project will also be published in peer-reviewed conferences and journals and will support the theses of two graduate students.
- Novel vision-based techniques for identifying emergency vehicles (EVs) were developed, along with control algorithms for safely parking the autonomous vehicle on the curbside after an EV is identified.
- A large set of videos containing emergency responders in action was collected at RELLIS Campus to test and benchmark the performance of the vision algorithms used in this project.
EWD & T2 Products
S. Rathinam and S. Gopalswamy presented a summary of the demonstrations performed in this project at the Autonomous Vehicles Conference, Brookings Institution, Washington, D.C., on July 25, 2019. Research slides and videos presented: https://sites.google.com/tamu.edu/ravev/publications?authuser=0
Other products (e.g., databases, physical collections, audio or video products, software or NetWare, models, educational aids or curricula, instruments or equipment, data, and research material): Videos of all our demonstrations and data sets are available at the following links: Publications: https://sites.google.com/tamu.edu/ravev/publications?authuser=0; Research Data: https://sites.google.com/tamu.edu/ravev/research-data?authuser=0
Student Impact Statement – Abhishek Nayak and Mengke Liu (pdf): The students working on this project provided an impact statement describing what the project allowed them to learn, do, and practice, and how it benefited their education.
Nayak, A., Gopalswamy, S., and Rathinam, S. Vision-Based Techniques for Identifying Emergency Vehicles. SAE Technical Paper 2019-01-0889, 2019, doi:10.4271/2019-01-0889. (Published)
Nayak, A., Rathinam, S., and Gopalswamy, S. Texas A&M Transportation Technology Conference, 2018. (Accepted)
Research Investigators (PI*)
Sivakumar Rathinam (TAMU)*
Swaminathan Gopalswamy (TTI/TAMU)
Start Date: 01/01/2018
End Date: 10/15/2019
Grant Number: 69A3551747115
Total Funding: $272,758
Source Organization: Safe-D National UTC
Project Number: 03-051
Safe-D Theme Areas
Safe-D Application Areas
Planning for Safety
UTC Project Information Form
Office of the Assistant Secretary for Research and Technology
University Transportation Centers Program
Department of Transportation
Washington, DC 20590 United States
Texas A&M University
Texas A&M Transportation Institute
College Station, Texas 77843-3135