Improving 3d pedestrian detection for wearable sensor data with 2d human pose

dc.identifier.uri http://dx.doi.org/10.15488/15933
dc.identifier.uri https://www.repo.uni-hannover.de/handle/123456789/16059
dc.contributor.author Kamalasanan, V.
dc.contributor.author Feng, Y.
dc.contributor.author Sester, M.
dc.contributor.editor Zlatanova, S.
dc.contributor.editor Sithole, G.
dc.contributor.editor Barton, J.
dc.date.accessioned 2024-01-17T11:03:02Z
dc.date.available 2024-01-17T11:03:02Z
dc.date.issued 2022
dc.identifier.citation Kamalasanan, V.; Feng, Y.; Sester, M.: Improving 3d pedestrian detection for wearable sensor data with 2d human pose. In: Zlatanova, S.; Sithole, G.; Barton, J. (Eds.): XXIV ISPRS Congress “Imaging today, foreseeing tomorrow”, Commission IV. Katlenburg-Lindau : Copernicus Publications, 2022 (ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences ; V-4-2022), S. 219-226. DOI: https://doi.org/10.5194/isprs-annals-v-4-2022-219-2022
dc.description.abstract Collisions and safety are important concepts when dealing with urban designs like shared spaces. As pedestrians (especially the elderly and disabled people) are more vulnerable to accidents, realising an intelligent mobility aid to avoid collisions is a direction of research that could improve safety using a wearable device. Also, with the improvements in visualisation technologies and their capabilities to render 3D virtual content, AR devices could be used to realise virtual infrastructure and virtual traffic systems. Such devices (e.g., HoloLens) scan the environment using stereo and ToF (Time-of-Flight) sensors, which in principle can be used to detect surrounding objects, including dynamic agents such as pedestrians. This can serve as a basis for predicting collisions. To envision an AR device as a safety aid and demonstrate its 3D object detection capability (in particular: pedestrian detection), we propose an improvement to the 3D object detection framework Frustum Pointnet using human pose and apply it to the data from an AR device. Using the data from such a device in an indoor setting, we conducted a comparative study to investigate how the high-level 2D human pose features in our approach could help to improve the detection performance of oriented 3D pedestrian instances over Frustum Pointnet. eng
dc.language.iso eng
dc.publisher Katlenburg-Lindau : Copernicus Publications
dc.relation.ispartof XXIV ISPRS Congress “Imaging today, foreseeing tomorrow”, Commission IV
dc.relation.ispartofseries ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences ; V-4-2022
dc.rights CC BY 4.0 International
dc.rights.uri https://creativecommons.org/licenses/by/4.0
dc.subject 3D pedestrian detection eng
dc.subject augmented reality eng
dc.subject human pose estimation eng
dc.subject shared space eng
dc.subject wearable sensor eng
dc.subject.classification Conference proceedings ger
dc.subject.ddc 550 | Earth sciences
dc.title Improving 3d pedestrian detection for wearable sensor data with 2d human pose eng
dc.type BookPart
dc.type Text
dc.relation.essn 2194-9050
dc.relation.doi https://doi.org/10.5194/isprs-annals-v-4-2022-219-2022
dc.bibliographicCitation.volume V-4-2022
dc.bibliographicCitation.firstPage 219
dc.bibliographicCitation.lastPage 226
dc.description.version publishedVersion
tib.accessRights freely accessible

