視覚障がい者のための物体認識と物体認識支援 (Object Recognition and Object Recognition Aid for the Visually Impaired)

Bibliographic Details
Published in 知能と情報 Vol. 32; No. 3; pp. 75-79
Main Author 滝沢, 穂高 (Takizawa, Hotaka)
Format Journal Article
Language English
Published 日本知能情報ファジィ学会 15.06.2020
ISSN 1347-7986
EISSN 1881-7203
DOI 10.3156/jsoft.32.3_75


Author 滝沢, 穂高 (Takizawa, Hotaka)
Author_xml – sequence: 1
  fullname: 滝沢, 穂高
  organization: 筑波大学システム情報系 (Faculty of Engineering, Information and Systems, University of Tsukuba)
ContentType Journal Article
Copyright 2020 日本知能情報ファジィ学会
Copyright_xml – notice: 2020 日本知能情報ファジィ学会
DOI 10.3156/jsoft.32.3_75
DeliveryMethod fulltext_linktorsrc
Discipline Engineering
Mathematics
EISSN 1881-7203
EndPage 79
ExternalDocumentID article_jsoft_32_3_32_75_article_char_ja
ISSN 1347-7986
IngestDate Wed Sep 03 06:31:10 EDT 2025
IsDoiOpenAccess true
IsOpenAccess true
IsPeerReviewed false
IsScholarly true
Issue 3
Language English
LinkModel OpenURL
OpenAccessLink https://www.jstage.jst.go.jp/article/jsoft/32/3/32_75/_article/-char/ja
PageCount 5
ParticipantIDs jstage_primary_article_jsoft_32_3_32_75_article_char_ja
PublicationCentury 2000
PublicationDate 2020/06/15
PublicationDateYYYYMMDD 2020-06-15
PublicationDate_xml – month: 06
  year: 2020
  text: 2020/06/15
  day: 15
PublicationDecade 2020
PublicationTitle 知能と情報
PublicationTitleAlternate 日本知能情報ファジィ学会誌
PublicationYear 2020
Publisher 日本知能情報ファジィ学会
Publisher_xml – name: 日本知能情報ファジィ学会
References [1] WHO, Blindness and vision impairment: https://www.who.int/news-room/fact-sheets/detail/blindness-and-visual-impairment [accessed Oct. 8, 2019]
[2] Japanese Service Dog Resource Center: https://www.jsdrc.jp/ [accessed Apr. 30, 2020]
[3] Ministry of Health, Labour and Welfare, Japan: https://www.mhlw.go.jp/stf/seisakunitsuite/bunya/0000165273.html [accessed Apr. 30, 2020]
[4] D. Dakopoulos and N. G. Bourbakis: “Wearable obstacle avoidance electronic travel aids for blind: A survey,” IEEE Trans. on Systems, Man, and Cybernetics, Part C: Applications and Reviews, Vol.40, No.1, pp. 25-35, 2010.
[5] D. Bolgiano and E. Meeks: “A laser cane for the blind,” IEEE J. of Quantum Electronics, Vol.3, No.6, p. 268, 1967.
[6] J. M. Benjamin, N. A. Ali, and A. F. Schepis: “A laser cane for the blind,” Proc. of the San Diego Biomedical Symp., Vol.12, pp. 53-57, 1973.
[7] J. M. Benjamin, Jr.: “The laser cane,” J. of Rehabilitation Research & Development, Vol.BPR10-22, pp. 443-450, 1974.
[8] R. Farcy and R. Damaschini: “Triangulating laser profilometer as a three-dimensional space perception system for the blind,” Appl. Opt., Vol.36, No.31, pp. 8227-8232, 1997.
[9] Y. Yasumuro, M. Murakami, M. Imura, T. Kuroda, Y. Manabe, and K. Chihara: “E-cane with situation presumption for the visually impaired,” Proc. of the User Interfaces for All 7th Int. Conf. on Universal Access: Theoretical Perspectives, Practice, and Experience, pp. 409-421, 2003.
[10] S. Saegusa, Y. Yasuda, Y. Uratani, E. Tanaka, T. Makino, and J. Y. Chang: “Development of a guide-dog robot: Human-robot interface considering walking conditions for a visually handicapped person,” Microsystem Technologies, Vol.17, Nos.5-7, pp. 1169-1174, 2011.
[11] A. Imadu, T. Kawai, Y. Takada, and T. Tajiri: “Walking guide interface mechanism and navigation system for the visually impaired,” Proc. of the 4th Int. Conf. on Human System Interactions, pp. 34-39, 2011.
[12] J. V. Gomez and F. E. Sandnes: “RoboGuideDog: Guiding blind users through physical environments with laser range scanners,” Procedia Computer Science, Vol.14, pp. 218-225, 2012.
[13] P. Vera, D. Zenteno, and J. Salas: “A smartphone-based virtual white cane,” Pattern Analysis and Applications, Vol.17, No.3, pp. 623-632, 2014.
[14] Q. K. Dang, Y. Chee, D. D. Pham, and Y. S. Suh: “A virtual blind cane using a line laser-based vision system and an inertial measurement unit,” Sensors, Vol.16, No.1, 2016.
[15] S. Tachi, K. Tanie, K. Komoriya, and M. Abe: “Electrocutaneous communication in a guide dog robot (MELDOG),” IEEE Trans. on Biomedical Engineering, Vol.BME-32, No.7, pp. 461-469, 1985.
[16] S. Kotani, H. Mori, and N. Kiyohiro: “Development of the robotic travel aid ‘HITOMI’,” Robotics and Autonomous Systems, Vol.17, Nos.1-2, pp. 119-128, 1996.
[17] S. Shoval, J. Borenstein, and Y. Koren: “The NavBelt – A computerized travel aid for the blind based on mobile robotics technology,” IEEE Trans. on Biomedical Engineering, Vol.45, No.11, pp. 1376-1386, 1998.
[18] I. Ulrich and J. Borenstein: “The GuideCane – Applying mobile robot technologies to assist the visually impaired,” IEEE Trans. on Systems, Man, and Cybernetics, Part A: Systems and Humans, Vol.31, No.2, pp. 131-136, 2001.
[19] M. Okayasu: “Newly developed walking apparatus for identification of obstructions by visually impaired people,” J. of Mechanical Science and Technology, Vol.24, No.6, pp. 1261-1264, 2010.
[20] S. Dambhare and A. Sakhare: “Smart stick for blind: Obstacle detection, artificial vision and real-time assistance via GPS,” IJCA Proc. on 2nd National Conf. on Information and Communication Technology (NCICT), No.6, pp. 31-33, 2011.
[21] M. H. A. Wahab, A. A. Talib, H. A. Kadir, A. Johari, A. Noraziah, R. M. Sidek, and A. A. Mutalib: “Smart cane: Assistive cane for visually-impaired people,” Int. J. of Computer Science Issues, Vol.8, Nos.4-2, pp. 21-27, 2011.
[22] S. K. Bahadir, V. Koncar, and F. Kalaoglu: “Wearable obstacle detection system fully integrated to textile structures for visually impaired people,” Sensors and Actuators A: Physical, Vol.179, pp. 297-311, 2012.
[23] M. H. Mahmud, R. Saha, and S. Islam: “Smart walking stick – An electronic approach to assist visually disabled persons,” Int. J. of Scientific & Engineering Research, Vol.4, No.10, pp. 111-114, 2013.
[24] G. Gayathr, M. Vishnupriya, R. Nandhini, and M. Banupriya: “Smart walking stick for visually impaired,” Int. J. of Engineering and Computer Science, Vol.3, No.3, pp. 4057-4061, 2014.
[25] D. Ni, A. Song, L. Tian, X. Xu, and D. Chen: “A walking assistant robotic system for the visually impaired based on computer vision and tactile perception,” Int. J. of Social Robotics, Vol.7, No.5, pp. 617-628, 2015.
[26] A. S, N. S, P. Alekhya, R. S N, and L. Jain: “Blind guide – An outdoor navigation application for visually impaired people,” Int. J. of Advances in Electronics and Computer Science, Vol.3, No.Sp, pp. 102-106, 2016.
[27] B. Mocanu, R. Tapu, and T. Zaharia: “When ultrasonic sensors and computer vision join forces for efficient obstacle detection and recognition,” Sensors, Vol.16, No.11, p. 1807, 2016.
[28] N. Molton, S. Se, J. M. Brady, D. Lee, and P. Probert: “A stereo vision-based aid for the visually impaired,” Image and Vision Computing, Vol.16, pp. 251-263, 1998.
[29] J. Zelek, R. Audette, J. Balthazaar, and C. Dunk: “A stereo-vision system for the visually impaired,” Technical report, University of Guelph, 2000.
[30] Y. Kawai and F. Tomita: “A support system for visually impaired persons to understand three-dimensional visual information using acoustic interface,” Proc. of the 16th Int. Conf. on Pattern Recognition, Vol.3, pp. 974-977, 2002.
[31] G. Balakrishnan, G. Sainarayanan, R. Nagarajan, and S. Yaacob: “A stereo image processing system for visually impaired,” World Academy of Science, Engineering and Technology, Vol.20, pp. 206-215, 2006.
[32] S. Meers and K. Ward: “Substitute three-dimensional perception using depth and colour sensors,” The 2007 Australasian Conf. on Robotics and Automation, pp. 1-5, 2007.
[33] G. Balakrishnan, G. Sainarayanan, R. Nagarajan, and S. Yaacob: “Wearable real-time stereo vision for the visually impaired,” Engineering Letters, Vol.14, No.2, pp. 1-9, 2007.
[34] L. Dunai, G. P. Fajarnes, V. S. Praderas, B. D. Garcia, and I. L. Lengua: “Real-time assistance prototype – A new navigation aid for blind people,” IECON 2010 – 36th Annual Conf. on IEEE Industrial Electronics Society, pp. 1173-1178, 2010.
[35] Y. H. Lee and G. Medioni: “RGB-D camera based navigation for the visually impaired,” RSS 2011 RGB-D: Advanced Reasoning with Depth Camera Workshop, pp. 1-6, 2011.
[36] A. Rodríguez, J. J. Yebes, P. F. Alcantarilla, L. M. Bergasa, J. Almazán, and A. Cela: “Assisting the visually impaired: Obstacle detection and warning system by acoustic feedback,” Sensors, Vol.12, No.12, pp. 17476-17496, 2012.
[37] A. Khan, F. Moideen, J. Lopez, W. L. Khoo, and Z. Zhu: “KinDetect: Kinect detecting objects,” 13th Int. Conf. on Computers Helping People with Special Needs, Vol.LNCS 7383, No.II, pp. 588-595, 2012.
[38] H. Pham, T. Le, and N. Vuillerme: “Real-time obstacle detection system in indoor environment for the visually impaired using Microsoft Kinect sensor,” J. of Sensors, Vol.2016, pp. 1-14, 2016.
[39] V.-N. Hoang, T.-H. Nguyen, T.-L. Le, T.-H. Tran, T.-P. Vuong, and N. Vuillerme: “Obstacle detection and warning system for visually impaired people based on electrode matrix and mobile Kinect,” Vietnam J. of Computer Science, Vol.4, No.2, pp. 71-83, 2017.
[40] M. Zöllner, S. Huber, H.-C. Jetter, and H. Reiterer: “NAVI – A proof-of-concept of a mobile navigational aid for visually impaired based on the Microsoft Kinect,” 13th IFIP TC13 Conf. on Human-Computer Interaction – INTERACT 2011, pp. 584-587, 2011.
[41] O. Halabi, M. Al-Ansari, Y. Halwani, F. Al-Mesaifri, and R. Al-Shaabi: “Navigation aid for blind people using depth information and augmented reality technology,” Proc. of NICOGRAPH Int. 2012, pp. 120-125, 2012.
[42] F. Ribeiro, D. Florencio, P. A. Chou, and Z. Zhang: “Auditory augmented reality: Object sonification for the visually impaired,” 2012 IEEE 14th Int. Workshop on Multimedia Signal Processing (MMSP), pp. 319-324, 2012.
[43] H. Takizawa and M. Aoyagi: “Assistive systems for the visually impaired based on image processing,” in Causes and Coping with Visual Impairment and Blindness, S. Rumelt ed., IntechOpen, Chapter 7, 2018.
[44] Ministry of Health, Labour and Welfare, Japan: https://www.mhlw.go.jp/www1/topics/kenko21_11/s1.html [accessed Apr. 30, 2020]
[45] H. Takizawa, S. Yamaguchi, M. Aoyagi, N. Ezaki, and S. Mizuno: “Kinect cane: An assistive system for the visually impaired based on the concept of object recognition aid,” Personal and Ubiquitous Computing, Vol.19, Nos.5-6, pp. 955-965, 2015.
[46] H. Takizawa, Y. Kuramochi, and M. Aoyagi: “Kinect cane system: Recognition aid of available seats for the visually impaired,” Proc. of 2019 IEEE 1st Global Conf. on Life Sciences and Technologies, pp. 189-193, 2019.
[47] S. Nakagawa, H. Takizawa, and M. Aoyagi: “Development of an Xtion Pro Live cane system and comparison with our Kinect cane system in object recognition,” IEICE Technical Report, Vol.116, No.139, pp. 7-10, 2016.
[48] S. Nakagawa, H. Takizawa, and M. Aoyagi: “Preliminary study on seat recognition by use of a RealSense 200 cane system for the visually impaired,” Proc. of the Sensory Substitution Symposium, pp. 1-3, 2017.
[49] D. Nakamura, H. Takizawa, M. Aoyagi, N. Ezaki, and S. Mizuno: “Smartphone-based escalator recognition for the visually impaired,” Sensors, Vol.17, No.5, 2017.
[50] T. Watanabe, H. Kaga, M. Kobayashi, and K. Minatani: “A survey of smartphone and tablet usage by blind people 2017,” IEICE Technical Report, Vol.117, No.251, pp. 69-74, 2017.
SSID ssj0069051
ssib023159992
ssib058986276
ssib029852168
ssib002484266
ssib001106413
ssib017172185
ssib002222523
ssib000937382
SourceID jstage
SourceType Publisher
StartPage 75
Title 視覚障がい者のための物体認識と物体認識支援 (Object Recognition and Object Recognition Aid for the Visually Impaired)
URI https://www.jstage.jst.go.jp/article/jsoft/32/3/32_75/_article/-char/ja
Volume 32
hasFullText 1
inHoldings 1
ispartofPNX 知能と情報, 2020/06/15, Vol.32(3), pp.75-79
journalDatabaseRights – providerCode: PRVAFT
  databaseName: Open Access Digital Library
  customDbUrl:
  eissn: 1881-7203
  dateEnd: 99991231
  omitProxy: true
  ssIdentifier: ssj0069051
  issn: 1347-7986
  databaseCode: KQ8
  dateStart: 20060101
  isFulltext: true
  titleUrlDefault: http://grweb.coalliance.org/oadl/oadl.html
  providerName: Colorado Alliance of Research Libraries
linkProvider Colorado Alliance of Research Libraries