Gensytskyy, Oleksiy, Nandi, Pratyush, Otis, Martin J.-D., Tabi, Clinton Enow and Ayena, Johannes C. (2023). Soil friction coefficient estimation using CNN included in an assistive system for walking in urban areas. Journal of Ambient Intelligence and Humanized Computing.
PDF (accepted version, 1 MB)
Official URL: http://dx.doi.org/10.1007/s12652-023-04667-w
Abstract
We present a smartphone-based solution to help visually impaired people navigate autonomously in urban environments. Contrary to previous works in the literature, the system proposed in this paper determines the coefficient of friction (COF) of the ground to aid the safe movement of blind and visually impaired people (BVIP). Through convolutional neural networks (CNNs) trained to output this measure, our investigation offers the possibility of predicting a risk of falling at the next step by determining the maximum static friction force of the ground. A commercial smartphone’s camera captures video of the ground and sends frames as inputs to the CNN model for image segmentation and COF computation. We validated the proposed model in real experiments carried out on 8 types of soils, while also experimenting with different CNN models and different optimizers. The proposed ResNet50 CNN-based system achieves an accuracy of 96% in the classification of soil type, enabling the guidance of visually impaired persons. Combined with the COF associated with the soil type, it is possible to estimate a risk of fall (stick or slip) for the user’s next step from the previously measured interaction force between the soil and the sole (obtained with an instrumented insole). Traditional navigation approaches do not consider soil characteristics such as the COF when guiding blind and visually impaired persons.
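The slip-risk decision described in the abstract can be sketched as a simple comparison: once the CNN has classified the soil type, a lookup yields its COF, and the maximum static friction force (COF times the normal force) is compared against the shear force measured by the instrumented insole. The soil names, COF values, and function below are illustrative assumptions, not values from the paper:

```python
# Hypothetical sketch of the slip-risk check: the classified soil type
# maps to a coefficient of friction (COF), and slipping is predicted
# when the measured shear force exceeds the maximum static friction
# force mu * N. All numbers here are illustrative placeholders.

SOIL_COF = {  # assumed lookup table, not the paper's measured values
    "dry_concrete": 0.9,
    "wet_concrete": 0.6,
    "ice": 0.1,
}

def slip_risk(soil_type: str, normal_force_n: float, shear_force_n: float) -> bool:
    """Return True when the measured shear force exceeds the maximum
    static friction force mu * N for the classified soil type."""
    mu = SOIL_COF[soil_type]
    max_static_friction = mu * normal_force_n
    return shear_force_n > max_static_friction

# A 700 N normal load with 120 N of shear slips on ice (max 70 N of
# friction) but not on dry concrete (max 630 N).
print(slip_risk("ice", 700.0, 120.0))           # prints True
print(slip_risk("dry_concrete", 700.0, 120.0))  # prints False
```

This is only the final decision step; the paper's contribution is producing the COF from camera frames via CNN segmentation and classification.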
Document type: Article published in a peer-reviewed journal
ISSN: 1868-5137
Peer reviewed: Yes
Date: 2023
Unique identifier: 10.1007/s12652-023-04667-w
Subjects: Natural Sciences and Engineering > Engineering; Natural Sciences and Engineering > Engineering > Computer and Software Engineering; Natural Sciences and Engineering > Applied Sciences
Department, module, and research unit: Research units > Laboratoire d’automatique et de robotique interactive (LAR.i); Departments and modules > Département des sciences appliquées > Module d'ingénierie
Keywords: smartphone, risk of fall, deep learning, computer vision
Deposited on: 18 August 2023, 16:28
Last modified: 10 August 2024, 04:00