In Press



Abstract:
Touch, one of the fundamental human senses, is essential for understanding the environment: it enables object identification and stable, controlled movement. This ability has inspired significant advances in artificial neural networks for object recognition, texture identification, and slip detection. However, despite their remarkable capacity to simulate tactile perception, artificial neural networks consume considerable energy, which limits their broader adoption. Recent developments in electronic-skin technology have brought robots closer to human-like tactile perception by enabling asynchronous responses to temperature and pressure changes, thereby improving robotic precision in tasks such as object manipulation and grasping.
This research presents a Spiking Graph Convolutional Network (SGCN) designed for processing tactile data in object recognition tasks. The model addresses the redundancy in spiking-format input data by employing two key techniques: (1) data compression to reduce the input size and (2) batch normalization to standardize the data. Experimental results demonstrated a 93.75% accuracy on the EvTouch-Objects dataset, reflecting a 4.31% improvement, and a 78.33% accuracy on the EvTouch-Containers dataset, representing an 18% improvement. These results underscore the SGCN's effectiveness in reducing data redundancy, decreasing required time steps, and optimizing tactile data processing to enhance robotic performance in object recognition.
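To make the idea concrete, the following is a minimal, illustrative sketch of a spiking graph-convolution layer that combines neighbor aggregation, batch normalization, and leaky integrate-and-fire (LIF) dynamics. It assumes a PyTorch setting; the layer sizes, adjacency matrix, LIF parameters, and class/variable names are generic choices made for illustration and are not taken from the paper or its released code.

# Illustrative sketch only -- not the authors' implementation.
# Assumes PyTorch; shapes, adjacency, and LIF dynamics are placeholder choices.
import torch
import torch.nn as nn


class SpikingGraphConvLayer(nn.Module):
    """One graph convolution followed by batch normalization and a
    simple leaky integrate-and-fire (LIF) spiking nonlinearity."""

    def __init__(self, in_features, out_features, threshold=1.0, decay=0.5):
        super().__init__()
        self.linear = nn.Linear(in_features, out_features)
        self.bn = nn.BatchNorm1d(out_features)  # standardizes pre-activations
        self.threshold = threshold
        self.decay = decay

    def forward(self, x, adj, mem):
        # x:   (num_nodes, in_features)  node features for one time step
        # adj: (num_nodes, num_nodes)    normalized adjacency of the taxel graph
        # mem: (num_nodes, out_features) membrane potential carried across steps
        h = self.linear(adj @ x)          # aggregate neighbor features
        h = self.bn(h)                    # batch normalization
        mem = self.decay * mem + h        # leaky integration
        spikes = (mem >= self.threshold).float()  # hard threshold (forward pass only;
                                                  # training would need a surrogate gradient)
        mem = mem * (1.0 - spikes)        # reset membrane where a spike fired
        return spikes, mem


if __name__ == "__main__":
    num_nodes, in_feats, out_feats, time_steps = 39, 2, 16, 10  # placeholder sizes
    layer = SpikingGraphConvLayer(in_feats, out_feats)
    adj = torch.eye(num_nodes)                   # placeholder adjacency
    mem = torch.zeros(num_nodes, out_feats)
    # "Compressed" input: event data binned into a small number of time steps per node
    events = torch.rand(time_steps, num_nodes, in_feats)
    for t in range(time_steps):
        spikes, mem = layer(events[t], adj, mem)
    print(spikes.shape)  # torch.Size([39, 16])

In this sketch, binning event data into a reduced number of time steps stands in for the data-compression step described in the abstract, and the batch-normalization layer standardizes the aggregated features before the spiking nonlinearity; the exact compression scheme and network depth used in the paper are not reproduced here.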
Full-Text [PDF 1589 kb]
Type of Study: Research Paper | Subject: Robotics
Received: 2024/09/16 | Revised: 2025/01/22 | Accepted: 2025/01/13

Rights and permissions
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.

© 2022 by the authors. Licensee IUST, Tehran, Iran. This is an open access journal distributed under the terms and conditions of the Creative Commons Attribution-NonCommercial 4.0 International (CC BY-NC 4.0) license.