Accuracy Testing of the Grid-Edge-Depth Map in a Prototype Vision Aid Using a Stereo Camera

Budi Rahmani(1*)
(1) STMIK Banjarbaru
(*) Corresponding Author
DOI : 10.35889/progresif.v18i2.924

Abstract

The Grid-Edge-Depth Map (GED-map) algorithm can measure the distance to objects between 64 cm and 500 cm and has previously been tested on a wheeled robot to help it avoid obstacles in an experimental environment. This study examines the accuracy of the GED-map when used to assist people with visual impairments. A stereo camera was mounted on a helmet-like device wearable by a human user, with processing performed on a laptop. Testing was carried out in three scenarios that required the user to stand at distances of 100 cm, 125 cm, and 150 cm from a table, chair, or wall, with 15 trials per scenario. The distances measured by the system were then corrected based on the mounting angle of the stereo camera and compared with the real distances. The results show an average accuracy of 94.86% across the three scenarios, which indicates that the device is feasible to implement. Further research and refinement are needed, however, to reduce the size of the distance-measurement processing computer, which is currently a laptop.
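The abstract describes two computations: correcting the system's measured distance for the mounting angle of the head-worn stereo camera, and comparing the corrected value against the real distance to obtain an accuracy percentage. A minimal sketch of both steps is shown below; the cosine projection for the angle correction and the relative-error accuracy formula are assumptions for illustration, since the paper's abstract does not state the exact formulas used.

```python
import math

def horizontal_distance(slant_cm: float, tilt_deg: float) -> float:
    """Project the slant range reported by a tilted, head-mounted stereo
    camera onto the horizontal plane (assumed cosine correction)."""
    return slant_cm * math.cos(math.radians(tilt_deg))

def accuracy_pct(measured_cm: float, real_cm: float) -> float:
    """Accuracy as 100% minus the relative error against the true distance."""
    return 100.0 - abs(measured_cm - real_cm) / real_cm * 100.0

# Example: a measurement of 105 cm against a true distance of 100 cm
# yields 95% accuracy, comparable to the 94.86% average reported here.
print(accuracy_pct(105.0, 100.0))
```

Averaging `accuracy_pct` over the 15 trials of each of the three scenarios would reproduce the kind of per-scenario accuracy figure reported in the study.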

Keywords: GED-map; Visual impairment; Stereo cameras; Viewing aids

 



