Development of a learning algorithm for a mobile robot to detect obstacles in a confined space
Authors: Avseeva O.V., Larina M.V.
Journal: Вестник Воронежского государственного университета инженерных технологий
Section: Information technology, modeling and control
Published in issue: 3 (73), 2017.
The twenty-first century is the age of high technology. Our lives are now surrounded by information technology: everyone has a cell phone, a computer, and other devices (sometimes called gadgets). Without noticing it ourselves, we use them more and more in everyday life. It has already reached the point where we use robots for our own purposes, and in less than ten years everyone may have a personal mobile robot. A properly programmed robot can perform many tasks. One of the common problems encountered in the design of robots is the development of an effective obstacle avoidance algorithm: while moving, the robot should not get stuck or stop until it has accomplished its task. To perform this task successfully, a map of the workspace, marking the locations of obstacles, must be stored in the robot's memory. It is therefore necessary to develop an algorithm that allows the robot to build a map of the area in a finite (minimal) time. The map represents the surface of the enclosed space divided into squares, where the area of each square equals the area of the robot's base, or the square of the robot's step length. To explore the space, an algorithm based on the "spiral" movement of the robot and "movement along the wall" is used. There are many obstacle-traversal algorithms; the most effective is Dijkstra's algorithm, which applies only to graphs with non-negative edge weights. Dijkstra's algorithm finds the shortest paths to the vertices of the graph in order of their distance from a given source vertex. Central to Dijkstra's algorithm is the observation that it is enough to compare the lengths of such candidate paths, settling at each step the vertex with the smallest tentative distance.
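The abstract does not spell out the exploration routine itself, so the following is only a minimal Python sketch of the cell visit order that a "spiral" coverage of an obstacle-free rectangular grid could follow; the function name spiral_order, the top-left starting corner, and the inward direction of the spiral are all assumptions made for illustration, not details taken from the paper.

```python
def spiral_order(rows, cols):
    """Yield grid cells (row, col) in an inward spiral, starting at the
    top-left corner. Obstacles are ignored in this simplified sketch."""
    top, bottom, left, right = 0, rows - 1, 0, cols - 1
    while top <= bottom and left <= right:
        for c in range(left, right + 1):   # walk the top edge left -> right
            yield (top, c)
        top += 1
        for r in range(top, bottom + 1):   # walk the right edge top -> bottom
            yield (r, right)
        right -= 1
        if top <= bottom:
            for c in range(right, left - 1, -1):  # bottom edge right -> left
                yield (bottom, c)
            bottom -= 1
        if left <= right:
            for r in range(bottom, top - 1, -1):  # left edge bottom -> top
                yield (r, left)
            left += 1

# Example: visit order for a 3 x 3 workspace
print(list(spiral_order(3, 3)))
# [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0), (1, 0), (1, 1)]
```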
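Since Dijkstra's algorithm is the one the abstract names, here is a minimal Python sketch of it running on a 4-connected grid map whose cells are marked free or occupied; the cell encoding (0 = free, 1 = obstacle) and the unit edge weights are illustrative assumptions, not the paper's implementation.

```python
import heapq

def dijkstra_grid(grid, start, goal):
    """Shortest path on a 4-connected grid; grid[r][c] == 1 marks an obstacle.
    Returns the list of cells from start to goal, or None if unreachable."""
    rows, cols = len(grid), len(grid[0])
    dist = {start: 0}
    prev = {}
    pq = [(0, start)]                      # priority queue of (distance, cell)
    while pq:
        d, cell = heapq.heappop(pq)
        if cell == goal:
            break
        if d > dist.get(cell, float("inf")):
            continue                       # stale queue entry, already settled
        r, c = cell
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                nd = d + 1                 # uniform, non-negative edge weight
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = cell
                    heapq.heappush(pq, (nd, (nr, nc)))
    if goal not in dist:
        return None
    path, cell = [goal], goal              # walk predecessors back to start
    while cell != start:
        cell = prev[cell]
        path.append(cell)
    return path[::-1]

# Example: route around an obstacle wall occupying (1, 0) and (1, 1)
grid = [
    [0, 0, 0],
    [1, 1, 0],
    [0, 0, 0],
]
print(dijkstra_grid(grid, (0, 0), (2, 0)))
# [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0)]
```

Because every edge weight here is the same non-negative constant, Dijkstra's algorithm behaves like breadth-first search on this map; non-uniform terrain costs would only change how nd is computed, and the non-negativity requirement mentioned in the abstract would still apply.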
Keywords: mobile robot, algorithm, map of the area
Short address: https://sciup.org/140229882
IDR: 140229882 | DOI: 10.20914/2310-1202-2017-3-65-67