Please use this identifier to cite or link to this item: http://dx.doi.org/10.18419/opus-12373
Author(s): Patel, Kanil
Title: Improving automotive radar spectra object classification using deep learning and multi-class uncertainty calibration
Publication date: 2022
Document type: Dissertation
Pages: 197
URI: http://nbn-resolving.de/urn:nbn:de:bsz:93-opus-ds-123923
http://elib.uni-stuttgart.de/handle/11682/12392
http://dx.doi.org/10.18419/opus-12373
Abstract: Improving the perception capabilities of vehicles is a prerequisite for successful automated driving and is of paramount importance for reliable and robust scene understanding. Scene understanding, which is required for decision-making in autonomous vehicles, becomes particularly challenging in adverse weather and lighting conditions; situations that often pose challenges for human drivers as well. Automotive radars can greatly assist the sensors currently deployed on vehicles by providing robust measurements, especially in challenging conditions where other sensors often fail to operate reliably. However, classification using radar sensors is often limited to a few classes (e.g. cars, humans, and stop signs), controlled laboratory settings, and/or simulations. Since radars already offer reliable distance, azimuth, and velocity estimates of the objects in a scene, improving radar-based classification greatly expands the usage of radar sensors for tackling multiple driving-related tasks that are often performed by other, less robust sensors. This thesis investigates how automated driving perception can be improved through multi-class radar classification, using deep learning algorithms to exploit the object class characteristics captured in radar spectra. Despite their highly accurate predictions, deep learning classifiers exhibit severe over-confidence, which can lead decision-making systems to false conclusions with possibly catastrophic consequences; for automated driving, often a matter of life and death. Consequently, high-quality, robust, and interpretable uncertainty estimates are indispensable characteristics of any trustworthy automated driving system. With the goal of providing uncertainty estimates for real-time predictive systems, this thesis also tackles the prominent over-confidence of deep learning classification models, which persists across all data modalities.
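The over-confidence described above is commonly illustrated with post-hoc calibration methods such as temperature scaling, a standard technique from the calibration literature (the thesis's own methods are not detailed in this abstract). The sketch below, with illustrative logit values, shows how dividing logits by a temperature T > 1 softens an over-confident softmax prediction without changing the predicted class:

```python
import numpy as np

def softmax(logits, axis=-1):
    # Numerically stable softmax: subtract the max before exponentiating.
    z = logits - logits.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def temperature_scale(logits, T):
    """Post-hoc calibration sketch: scale logits by 1/T before the softmax.

    T > 1 flattens the distribution (less confident), T < 1 sharpens it;
    the argmax, and hence the predicted class, is unchanged.
    """
    return softmax(logits / T)

# A hypothetical over-confident prediction: a large logit gap yields
# near-certain confidence, which temperature scaling softens.
logits = np.array([[8.0, 1.0, 0.5]])
raw_conf = temperature_scale(logits, 1.0).max()       # close to 1.0
calibrated_conf = temperature_scale(logits, 3.0).max()  # noticeably lower
```

In practice the temperature is not chosen by hand but fitted on a held-out validation set, e.g. by minimizing the negative log-likelihood.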
Since calibration is an important measure of the quality of uncertainty estimates, this work focuses on accurately estimating the calibration of trained classifiers and presents novel techniques for improving it. The presented solutions offer high-quality real-time confidence estimates for classification models of all data modalities (including non-radar applications), covering both classifiers that are already trained and in practical use and new training strategies for learning new classifiers. Furthermore, the presented uncertainty calibration algorithms could also be extended to tasks other than classification, for example regression and segmentation. On a challenging new realistic automated driving radar dataset, the solutions proposed in this thesis show that radar classifiers are able to generalize to novel driving environments, driving patterns, and object instances in realistic static driving scenes. To further replicate realistic encounters of autonomous vehicles, we study the behaviour of the classifiers under spectra corruptions and for outlier detection of unknown objects, showing significant performance improvements in safely handling these prevalent encounters through accurate uncertainty estimates. With the proposed generalization and the requisite accurate uncertainty estimation techniques, the radar classifiers in this study greatly improve radar-based perception for scene understanding and lay a solid foundation for current sensor fusion techniques to leverage radar measurements for object classification.
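Calibration quality of the kind discussed above is commonly quantified with the Expected Calibration Error (ECE), a standard binned metric from the calibration literature (the abstract does not specify which measure the thesis uses). A minimal sketch: predictions are grouped into confidence bins, and the ECE is the weighted average gap between each bin's mean confidence and its empirical accuracy:

```python
import numpy as np

def expected_calibration_error(confidences, correct, n_bins=10):
    """Binned ECE sketch: sum over bins of (bin weight) * |accuracy - confidence|.

    confidences: per-sample predicted confidence in (0, 1].
    correct:     per-sample 0/1 indicator of a correct prediction.
    A perfectly calibrated model has ECE 0; over-confidence inflates it.
    """
    confidences = np.asarray(confidences, dtype=float)
    correct = np.asarray(correct, dtype=float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (confidences > lo) & (confidences <= hi)
        if mask.any():
            gap = abs(correct[mask].mean() - confidences[mask].mean())
            ece += mask.mean() * gap  # weight by fraction of samples in the bin
    return ece

# Toy data: a model that claims 90% confidence but is right only half the time
# is over-confident, and the ECE reflects the 0.4 confidence/accuracy gap.
over_confident = expected_calibration_error([0.9] * 10, [1] * 5 + [0] * 5)
well_calibrated = expected_calibration_error([0.8] * 10, [1] * 8 + [0] * 2)
```

Binning choices (equal-width vs. equal-mass bins, number of bins) affect the estimate, which is one reason accurate calibration estimation is itself a research question.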
Appears in collections: 05 Fakultät Informatik, Elektrotechnik und Informationstechnik

Files in this item:
File: final_thesis_publications.pdf    Size: 63.27 MB    Format: Adobe PDF


All items in this repository are protected by copyright.