
A New Method Combining Neural Networks and Hough Transform (HT) in Robotic Vision

Mahfoud Hamada and Abdelhalim Boutarfa

Vision is an essential sense for the navigation of mobile robots. This work presents a solution to an important problem in that domain: visual beacon detection for mobile robots. The proposed approach combines a neural-network pixel classifier with the Hough Transform to detect shapes in incoming images. One objective is to enable the robot (CESA) to move in an unspecified environment and acquire the information required for its vision. Given the positive results obtained with a momentum of 0.001 and a learning rate of 0.015, we conclude that our system is robust. The algorithm also reduces computation time significantly and can therefore be used in real-time applications. Moreover, the proposed architecture can easily be implemented on Field Programmable Gate Array (FPGA) reconfigurable devices: the steady performance gains of this technology allow complex applications to be realized while meeting the real-time constraints of Hough Transform line detection for most video transmission standards. From this perspective, the present work constitutes an important step toward a better understanding of the problem and proposes a solution that is robust under diverse conditions.

Keywords: Hough Transform, Neural Network, Robot Vision, FPGA.



Mahfoud Hamada
Electrotechnical Laboratory of Batna (LEB), HL University of BATNA, Algeria

Abdelhalim Boutarfa
Electronics Advanced Laboratory (LEA), HL University of BATNA, Algeria

IJCSI is a refereed open access international journal for scientific papers in all areas of computer science research.

Learn more »
Join Us

Read the most frequently asked questions about IJCSI.

Frequently Asked Questions (FAQs) »
Get in touch

Phone: +230 911 5482

More contact details »