Constellation, the institutional repository of the Université du Québec à Chicoutimi

Soil friction coefficient estimation using CNN included in an assistive system for walking in urban areas

Gensytskyy Oleksiy, Nandi Pratyush, Otis Martin J.-D., Tabi Clinton Enow and Ayena Johannes C. (2023). Soil friction coefficient estimation using CNN included in an assistive system for walking in urban areas. Journal of Ambient Intelligence and Humanized Computing,

[thumbnail of Soil_COF_paper_2round.pdf] PDF - Accepted version
Restricted to administrators until August 10, 2024.




We present a smartphone-based solution to help visually impaired people navigate autonomously in urban environments. Unlike previous works in the literature, the system proposed in this paper determines the coefficient of friction (COF) of the soil to aid the safe movement of blind and visually impaired people (BVIP). Through convolutional neural networks (CNNs) trained to output this measure, our investigation offers the possibility of predicting a risk of falling at the next step by determining the maximum static friction force of the ground. A commercial smartphone's camera captures video of the ground and sends frames as input to the CNN model for image segmentation and COF computation. We validated the proposed model in real experiments carried out on 8 types of soil, while also experimenting with different CNN models and different optimizers. The proposed ResNet50 CNN-based system achieves an accuracy of 96% in the classification of soil type, enabling guidance of visually impaired persons. Combined with the associated COF of the soil type, it is possible to estimate a risk of fall (stick or slip) for the next step in front of the user from the previously measured interaction force (using an instrumented insole) between the soil and the sole. Traditional navigation approaches do not consider soil characteristics such as COF when guiding blind and visually impaired persons.
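The slip-risk idea described in the abstract (compare the shear force measured by the instrumented insole against the maximum static friction F_max = μ·N derived from the COF of the CNN-predicted soil class) can be sketched as follows. This is a minimal illustration: the soil class names, COF values, and forces are assumptions for the example, not data from the paper.

```python
# Illustrative soil-class -> COF lookup; values are assumed, not the paper's.
SOIL_COF = {
    "dry_asphalt": 0.8,
    "wet_asphalt": 0.5,
    "gravel": 0.6,
    "ice": 0.1,
}

def max_static_friction(soil_class: str, normal_force_n: float) -> float:
    """Maximum static friction force F_max = mu * N for the predicted soil."""
    mu = SOIL_COF[soil_class]
    return mu * normal_force_n

def slip_risk(soil_class: str, normal_force_n: float, shear_force_n: float) -> bool:
    """Flag a slip risk when the horizontal (shear) force measured at the
    insole exceeds the maximum static friction the soil can provide."""
    return shear_force_n > max_static_friction(soil_class, normal_force_n)

# Example: a 700 N user pushing with 100 N of horizontal force on ice.
# F_max = 0.1 * 700 = 70 N < 100 N, so a slip is flagged.
print(slip_risk("ice", 700.0, 100.0))  # -> True
```

In the full system, `soil_class` would come from the ResNet50 classifier on the smartphone camera frames, and the forces from the instrumented insole.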

Document type: Article published in a peer-reviewed journal
Peer-reviewed version: Yes
Unique identifier: 10.1007/s12652-023-04667-w
Subjects: Natural sciences and engineering > Engineering
Natural sciences and engineering > Engineering > Computer engineering and software engineering
Natural sciences and engineering > Applied sciences
Department, module, service and research unit: Research units > Laboratoire d’automatique et de robotique interactive (LAR.i)
Departments and modules > Département des sciences appliquées > Module d'ingénierie
Keywords: smartphone, risk of fall, deep learning, computer vision
Deposited on: 18 August 2023 16:28
Last modified: 18 August 2023 16:28


Creative Commons License: Unless otherwise indicated, documents archived in Constellation are made available under the terms of the Creative Commons "Attribution, Non-Commercial, No Derivatives" 2.5 Canada license.

Bibliothèque Paul-Émile-Boulet, UQAC
555, boulevard de l'Université
Chicoutimi (Québec)  CANADA G7H 2B1
418 545-5011, ext. 5630