Updated on 2024/04/11


 
MARU Noriaki
 
Name of department
Faculty of Systems Engineering, Mechatronics
Job title
Associate Professor
Concurrent post
Robotics Major (Associate Professor)

Education

  • 1986
    -
    1989

    Osaka University   Graduate School of Engineering Science

Degree

  • Doctor of Engineering   1993

Academic & Professional Experience

  • 2004
    -
    Now

    Wakayama University   Faculty of Systems Engineering   Associate Professor

  • 2002
    -
    2003

    Wakayama University   Faculty of Systems Engineering   Associate Professor

  • 1996

    Wakayama University   Faculty of Systems Engineering   Lecturer

  • 1995
    -
    1998

    Wakayama University   Faculty of Systems Engineering   Lecturer

  • 1992
    -
    1994

    Osaka University   School of Engineering Science   Research Associate

Association Memberships

  • 1998.04
    -
    Now

    The Japan Society of Mechanical Engineers

  • 1996.04
    -
    Now

    The Institute of Systems, Control and Information Engineers

  • 1993.04
    -
    Now

    The Society of Instrument and Control Engineers

  • 1984.04
    -
    Now

    The Robotics Society of Japan

Research Areas

  • Manufacturing technology (mechanical, electrical/electronic, chemical engineering) / Control and systems engineering / Robotics

Classes (including Experimental Classes, Seminars, Graduation Thesis Guidance, Graduation Research, and Topical Research)

  • 2022   Fundamentals of Robotics   Liberal Arts and Sciences Subjects

  • 2022   Graduation Research   Specialized Subjects

  • 2022   Graduation Research   Specialized Subjects

  • 2022   Graduation Research   Specialized Subjects

  • 2022   Experiments for Mechatronics   Specialized Subjects

  • 2022   Practice for Researches in Mechatronics   Specialized Subjects

  • 2022   Robotics   Specialized Subjects

  • 2022   Robot Vision   Specialized Subjects

  • 2021   Graduation Research   Specialized Subjects

  • 2021   Robot Vision   Specialized Subjects

  • 2021   Graduation Research   Specialized Subjects

  • 2021   Practice for Researches in Mechatronics   Specialized Subjects

  • 2021   Robotics   Specialized Subjects

  • 2021   Graduation Research   Specialized Subjects

  • 2021   Graduation Research   Specialized Subjects

  • 2021   Experiments for Mechatronics   Specialized Subjects

  • 2021   Introductory Seminar in Systems Engineering   Specialized Subjects

  • 2020   Graduation Research   Specialized Subjects

  • 2020   Graduation Research   Specialized Subjects

  • 2020   Practice for Researches in Mechatronics   Specialized Subjects

  • 2020   Robot Vision   Specialized Subjects

  • 2020   Experiments for Mechatronics   Specialized Subjects

  • 2020   Robotics   Specialized Subjects

  • 2019   Practice for Researches in Mechatronics   Specialized Subjects

  • 2019   Robot Vision   Specialized Subjects

  • 2019   Experiments for Mechatronics   Specialized Subjects

  • 2019   Introductory Seminar in Systems Engineering   Specialized Subjects

  • 2019   Robotics   Specialized Subjects

  • 2019   Computer Engineering   Specialized Subjects

  • 2018   NA   Specialized Subjects

  • 2018   NA   Specialized Subjects

  • 2018   NA   Specialized Subjects

  • 2018   Practice for Researches in Mechatronics   Specialized Subjects

  • 2018   Robot Vision   Specialized Subjects

  • 2018   Robotics   Specialized Subjects

  • 2018   Computer Engineering   Specialized Subjects

  • 2017   NA   Specialized Subjects

  • 2017   NA   Specialized Subjects

  • 2017   Practice for Researches   Specialized Subjects

  • 2017   NA   Specialized Subjects

  • 2017   Robot Vision   Specialized Subjects

  • 2017   Robotics   Specialized Subjects

  • 2017   Computer Engineering   Specialized Subjects

  • 2016   NA   Specialized Subjects

  • 2016   Experiments C for Opto-mechatronics   Specialized Subjects

  • 2016   Practice for Researches   Specialized Subjects

  • 2016   Applied Seminar   Specialized Subjects

  • 2016   Robotics   Specialized Subjects

  • 2016   Computer Engineering   Specialized Subjects

  • 2015   Experiments C for Opto-mechatronics   Specialized Subjects

  • 2015   Applied Seminar   Specialized Subjects

  • 2015   Introductory Seminar in Systems Engineering   Specialized Subjects

  • 2015   Practice for Researches   Specialized Subjects

  • 2015   Robotics   Specialized Subjects

  • 2015   Computer Engineering   Specialized Subjects

  • 2014   Applied Seminar   Specialized Subjects

  • 2014   Practice for Researches   Specialized Subjects

  • 2014   Experiments C for Opto-mechatronics   Specialized Subjects

  • 2014   Robotics   Specialized Subjects

  • 2014   Introduction to Opto-Mechatronics   Specialized Subjects

  • 2014   Computer Engineering   Specialized Subjects

  • 2013   Applied Seminar   Specialized Subjects

  • 2013   Practice for Researches   Specialized Subjects

  • 2013   Experiments C for Opto-mechatronics   Specialized Subjects

  • 2013   Robotics   Specialized Subjects

  • 2013   Introduction to Opto-Mechatronics   Specialized Subjects

  • 2013   Computer Engineering   Specialized Subjects

  • 2013   Mechatronics Used in Familiar Products   Liberal Arts and Sciences Subjects

  • 2013   Introductory Seminar   Liberal Arts and Sciences Subjects

  • 2012   NA   Specialized Subjects

  • 2012   Mechatronics Used in Familiar Products   Liberal Arts and Sciences Subjects

  • 2012   Introduction to Opto-Mechatronics   Specialized Subjects

  • 2012   Experiments C for Opto-mechatronics   Specialized Subjects

  • 2012   Applied Seminar   Specialized Subjects

  • 2012   Practice for Researches   Specialized Subjects

  • 2012   Robotics   Specialized Subjects

  • 2012   Computer Engineering   Specialized Subjects

  • 2011   NA   Specialized Subjects

  • 2011   Experiments C for Opto-mechatronics   Specialized Subjects

  • 2011   Applied Seminar   Specialized Subjects

  • 2011   Practice for Researches   Specialized Subjects

  • 2011   Introductory Seminar   Liberal Arts and Sciences Subjects

  • 2011   Robotics   Specialized Subjects

  • 2011   Computer Engineering   Specialized Subjects

  • 2011   Introduction to Opto-Mechatronics   Specialized Subjects

  • 2011   Mechatronics Used in Familiar Products   Liberal Arts and Sciences Subjects

  • 2010   NA   Specialized Subjects

  • 2010   Mechatronics Used in Familiar Products   Liberal Arts and Sciences Subjects

  • 2010   Experiments C for Opto-mechatronics   Specialized Subjects

  • 2010   Robotics   Specialized Subjects

  • 2010   Applied Seminar   Specialized Subjects

  • 2010   Introduction to Opto-Mechatronics   Specialized Subjects

  • 2010   Computer Engineering   Specialized Subjects

  • 2010   Practice for Researches   Specialized Subjects

  • 2010   Mechatronics Used in Familiar Products   Liberal Arts and Sciences Subjects

  • 2009   NA   Specialized Subjects

  • 2009   Practice for Researches   Specialized Subjects

  • 2009   NA   Specialized Subjects

  • 2009   Robotics   Specialized Subjects

  • 2009   NA   Specialized Subjects

  • 2009   Computer Engineering   Specialized Subjects

  • 2009   Introduction to Opto-Mechatronics   Specialized Subjects

  • 2009   NA   Specialized Subjects

  • 2009   NA   Specialized Subjects

  • 2008   NA   Specialized Subjects

  • 2008   Practice for Researches   Specialized Subjects

  • 2008   NA   Specialized Subjects

  • 2008   Robotics   Specialized Subjects

  • 2008   NA   Specialized Subjects

  • 2008   Computer Engineering   Specialized Subjects

  • 2008   Introduction to Opto-Mechatronics   Specialized Subjects

  • 2008   NA   Specialized Subjects

  • 2008   Mechatronics Used in Familiar Products   Specialized Subjects

  • 2007   NA   Specialized Subjects

  • 2007   Practice for Researches   Specialized Subjects

  • 2007   NA   Specialized Subjects

  • 2007   Robotics   Specialized Subjects

  • 2007   NA   Specialized Subjects

  • 2007   NA   Specialized Subjects

  • 2007   Computer Engineering   Specialized Subjects

  • 2007   Introduction to Opto-Mechatronics   Specialized Subjects

  • 2007   NA   Specialized Subjects


Satellite Courses

  • 2012   Mechatronics Used in Familiar Products   Liberal Arts and Sciences Subjects

Independent study

  • 2007   Building and controlling robots

  • 2007   Realizing a walking motion for a bipedal robot

Classes

  • 2022   Systems Engineering Global Seminar Ⅱ   Doctoral Course

  • 2022   Systems Engineering Global Seminar Ⅰ   Doctoral Course

  • 2022   Systems Engineering Advanced Research   Doctoral Course

  • 2022   Systems Engineering Advanced Seminar Ⅱ   Doctoral Course

  • 2022   Systems Engineering Advanced Seminar Ⅰ   Doctoral Course

  • 2022   Systems Engineering Project SeminarⅡB   Master's Course

  • 2022   Systems Engineering Project SeminarⅡA   Master's Course

  • 2022   Systems Engineering Project SeminarⅠB   Master's Course

  • 2022   Systems Engineering Project SeminarⅠA   Master's Course

  • 2022   Systems Engineering SeminarⅡB   Master's Course

  • 2022   Systems Engineering SeminarⅡA   Master's Course

  • 2022   Systems Engineering SeminarⅠB   Master's Course

  • 2022   Systems Engineering SeminarⅠA   Master's Course

  • 2021   Systems Engineering Global Seminar Ⅱ   Doctoral Course

  • 2021   Systems Engineering Global Seminar Ⅰ   Doctoral Course

  • 2021   Systems Engineering Advanced Research   Doctoral Course

  • 2021   Systems Engineering Advanced Seminar Ⅱ   Doctoral Course

  • 2021   Systems Engineering Advanced Seminar Ⅰ   Doctoral Course

  • 2021   Systems Engineering Project SeminarⅡB   Master's Course

  • 2021   Systems Engineering Project SeminarⅡA   Master's Course

  • 2021   Systems Engineering Project SeminarⅠB   Master's Course

  • 2021   Systems Engineering Project SeminarⅠA   Master's Course

  • 2021   Systems Engineering SeminarⅡB   Master's Course

  • 2021   Systems Engineering SeminarⅡA   Master's Course

  • 2021   Systems Engineering SeminarⅠB   Master's Course

  • 2021   Systems Engineering SeminarⅠA   Master's Course

  • 2020   Systems Engineering Global Seminar Ⅱ   Doctoral Course

  • 2020   Systems Engineering Global Seminar Ⅰ   Doctoral Course

  • 2020   Systems Engineering Advanced Research   Doctoral Course

  • 2020   Systems Engineering Advanced Seminar Ⅱ   Doctoral Course

  • 2020   Systems Engineering Advanced Seminar Ⅰ   Doctoral Course

  • 2020   Systems Engineering Project SeminarⅡB   Master's Course

  • 2020   Systems Engineering Project SeminarⅡA   Master's Course

  • 2020   Systems Engineering Project SeminarⅠB   Master's Course

  • 2020   Systems Engineering Project SeminarⅠA   Master's Course

  • 2020   Systems Engineering SeminarⅡB   Master's Course

  • 2020   Systems Engineering SeminarⅡA   Master's Course

  • 2020   Systems Engineering SeminarⅠB   Master's Course

  • 2020   Systems Engineering SeminarⅠA   Master's Course

  • 2019   Systems Engineering Advanced Seminar Ⅰ   Doctoral Course

  • 2019   Systems Engineering Advanced Seminar Ⅰ   Doctoral Course

  • 2019   Systems Engineering Advanced Research   Doctoral Course

  • 2019   Systems Engineering Advanced Research   Doctoral Course

  • 2019   Systems Engineering SeminarⅡB   Master's Course

  • 2019   Systems Engineering SeminarⅡA   Master's Course

  • 2019   Systems Engineering SeminarⅠB   Master's Course

  • 2019   Systems Engineering SeminarⅠA   Master's Course

  • 2019   Systems Engineering Project SeminarⅡB   Master's Course

  • 2019   Systems Engineering Project SeminarⅡA   Master's Course

  • 2019   Systems Engineering Project SeminarⅠB   Master's Course

  • 2019   Systems Engineering Project SeminarⅠA   Master's Course

  • 2018   Systems Engineering Global Seminar Ⅱ   Doctoral Course

  • 2018   Systems Engineering Global Seminar Ⅱ   Doctoral Course

  • 2018   Systems Engineering Advanced Research   Doctoral Course

  • 2018   Systems Engineering Advanced Research   Doctoral Course

  • 2018   Systems Engineering Project SeminarⅡB   Master's Course

  • 2018   Systems Engineering Project SeminarⅡA   Master's Course

  • 2018   Systems Engineering Project SeminarⅠB   Master's Course

  • 2018   Systems Engineering Project SeminarⅠA   Master's Course

  • 2018   Systems Engineering SeminarⅡB   Master's Course

  • 2018   Systems Engineering SeminarⅡA   Master's Course

  • 2018   Systems Engineering SeminarⅠB   Master's Course

  • 2018   Systems Engineering SeminarⅠA   Master's Course

  • 2018   Advanced Robotics   Master's Course

  • 2017   Systems Engineering Global Seminar Ⅱ   Doctoral Course

  • 2017   Systems Engineering Advanced Research   Doctoral Course

  • 2017   Systems Engineering Advanced Research   Doctoral Course

  • 2017   Systems Engineering Advanced Seminar Ⅱ   Doctoral Course

  • 2017   Systems Engineering Advanced Seminar Ⅱ   Doctoral Course

  • 2017   Systems Engineering Advanced Seminar Ⅰ   Doctoral Course

  • 2017   Systems Engineering Advanced Seminar Ⅰ   Doctoral Course

  • 2017   Systems Engineering Project SeminarⅡB   Master's Course

  • 2017   Systems Engineering Project SeminarⅡA   Master's Course

  • 2017   Systems Engineering Project SeminarⅠB   Master's Course

  • 2017   Systems Engineering Project SeminarⅠA   Master's Course

  • 2017   Advanced Robotics   Master's Course

  • 2017   Systems Engineering SeminarⅡB   Master's Course

  • 2017   Systems Engineering SeminarⅡA   Master's Course

  • 2017   Systems Engineering SeminarⅠB   Master's Course

  • 2017   Systems Engineering SeminarⅠA   Master's Course

  • 2016   Systems Engineering Global Seminar Ⅰ   Doctoral Course

  • 2016   Systems Engineering Global Seminar Ⅰ   Doctoral Course

  • 2016   Systems Engineering Advanced Research   Doctoral Course

  • 2016   Systems Engineering Advanced Research   Doctoral Course

  • 2016   Systems Engineering Advanced Seminar Ⅰ   Doctoral Course

  • 2016   Systems Engineering Advanced Seminar Ⅰ   Doctoral Course

  • 2016   Systems Engineering Project SeminarⅡB   Master's Course

  • 2016   Systems Engineering Project SeminarⅡA   Master's Course

  • 2016   Systems Engineering Project SeminarⅠB   Master's Course

  • 2016   Systems Engineering Project SeminarⅠA   Master's Course

  • 2016   Systems Engineering SeminarⅡB   Master's Course

  • 2016   Systems Engineering SeminarⅡA   Master's Course

  • 2016   Systems Engineering SeminarⅠB   Master's Course

  • 2016   Systems Engineering SeminarⅠA   Master's Course

  • 2016   Advanced Robotics   Master's Course

  • 2015   Systems Engineering Advanced Seminar Ⅰ  

  • 2015   Systems Engineering Advanced Research  

  • 2015   Systems Engineering SeminarⅡA  

  • 2015   Systems Engineering SeminarⅠA  

  • 2015   Systems Engineering Project SeminarⅡA  

  • 2015   Systems Engineering Project SeminarⅠA  

  • 2015   Advanced Robotics  

  • 2015   Systems Engineering Advanced Seminar Ⅰ  

  • 2015   Systems Engineering Advanced Research  

  • 2015   Systems Engineering SeminarⅡB  

  • 2015   Systems Engineering SeminarⅠB  

  • 2015   Systems Engineering Project SeminarⅡB  

  • 2015   Systems Engineering Project SeminarⅠB  

  • 2014   Systems Engineering Advanced Research  

  • 2014   Systems Engineering Advanced Research  

  • 2014   Systems Engineering Advanced Seminar Ⅱ  

  • 2014   Systems Engineering Advanced Seminar Ⅱ  

  • 2014   Systems Engineering Advanced Seminar Ⅰ  

  • 2014   Systems Engineering Advanced Seminar Ⅰ  

  • 2014   Systems Engineering Project SeminarⅡB  

  • 2014   Systems Engineering Project SeminarⅡA  

  • 2014   Systems Engineering Project SeminarⅠB  

  • 2014   Systems Engineering Project SeminarⅠA  

  • 2014   Advanced Robotics  

  • 2014   Systems Engineering SeminarⅡB  

  • 2014   Systems Engineering SeminarⅡA  

  • 2014   Systems Engineering SeminarⅠB  

  • 2014   Systems Engineering SeminarⅠA  

  • 2013   Systems Engineering Advanced Research  

  • 2013   Systems Engineering Advanced Research  

  • 2013   Systems Engineering Advanced Seminar Ⅱ  

  • 2013   Systems Engineering Advanced Seminar Ⅱ  

  • 2013   Systems Engineering Advanced Seminar Ⅰ  

  • 2013   Systems Engineering Advanced Seminar Ⅰ  

  • 2013   Systems Engineering Project SeminarⅡB  

  • 2013   Systems Engineering Project SeminarⅡA  

  • 2013   Systems Engineering Project SeminarⅠB  

  • 2013   Systems Engineering Project SeminarⅠA  

  • 2013   Advanced Robotics  

  • 2013   Systems Engineering SeminarⅡB  

  • 2013   Systems Engineering SeminarⅡA  

  • 2013   Systems Engineering SeminarⅠB  

  • 2013   Systems Engineering SeminarⅠA  

  • 2012   Systems Engineering Advanced Seminar Ⅱ  

  • 2012   Systems Engineering Advanced Seminar Ⅰ  

  • 2012   Systems Engineering Advanced Research  

  • 2012   Systems Engineering SeminarⅡA  

  • 2012   Systems Engineering SeminarⅠA  

  • 2012   Systems Engineering Project SeminarⅡA  

  • 2012   Systems Engineering Project SeminarⅠA  

  • 2012   Advanced Robotics  

  • 2012   Systems Engineering Advanced Seminar Ⅱ  

  • 2012   Systems Engineering Advanced Seminar Ⅰ  

  • 2012   Systems Engineering Advanced Research  

  • 2012   Systems Engineering SeminarⅡB  

  • 2012   Systems Engineering SeminarⅠB  

  • 2012   Systems Engineering Project SeminarⅡB  

  • 2012   Systems Engineering Project SeminarⅠB  

  • 2011   Systems Engineering Project SeminarⅡB  

  • 2011   Systems Engineering Project SeminarⅡA  

  • 2011   Systems Engineering Project SeminarⅠB  

  • 2011   Systems Engineering Project SeminarⅠA  

  • 2011   Systems Engineering Advanced Research  

  • 2011   Systems Engineering Advanced Research  

  • 2011   NA  

  • 2011   NA  

  • 2011   Systems Engineering Advanced Seminar Ⅱ  

  • 2011   Systems Engineering Advanced Seminar Ⅱ  

  • 2011   Systems Engineering Advanced Seminar Ⅰ  

  • 2011   Systems Engineering Advanced Seminar Ⅰ  

  • 2011   Advanced Robotics  

  • 2009   Advanced Robotics   Master's Course

  • 2009   NA   Master's Course

  • 2009   NA   Master's Course

  • 2009   NA   Master's Course

  • 2009   NA   Master's Course

  • 2008   Advanced Robotics   Master's Course

  • 2008   NA   Master's Course

  • 2008   NA   Master's Course

  • 2008   NA   Master's Course

  • 2008   NA   Master's Course

  • 2007   Advanced Robotics   Master's Course

  • 2007   NA   Master's Course

  • 2007   NA   Master's Course

  • 2007   NA   Master's Course

  • 2007   NA   Master's Course


Research Interests

  • Musculoskeletal robots

  • Visual-space visual servoing

  • Humanoid robots

  • Stereo vision systems

  • Compensatory eye movement

  • Visual-space-based visual servoing


Published Papers

  • Control of Eye-And-Hand robot arm by visual space based visual servoing

    Ryohei HIROSE, Noriaki MARU (Part: Last author, Corresponding author )

    Transactions of the JSME   88 ( 908 ) 1 - 12   2022.03  [Refereed]

    DOI

  • Position and Attitude Control of Eye-In-Hand System by Visual Servoing using Binocular Visual Space

    Atsushi Ozato, Noriaki Maru

    2014 WORLD AUTOMATION CONGRESS (WAC): EMERGING TECHNOLOGIES FOR A NEW PARADIGM IN SYSTEM OF SYSTEMS ENGINEERING ( IEEE )    2014  [Refereed]

     View Summary

    We propose the 3D position and attitude control method of Eye-in-Hand System by visual servoing using Binocular Visual Space. The position of the target is estimated based on the linear approximation between translational motion space and binocular visual space. The attitude of the target is also estimated by the linear approximation between rotational motion space and posture binocular visual space. The proposed method is robust to calibration error of camera angles, because it does not use camera angles to calculate feedback command for translational and rotational velocity. Simulation results are presented to demonstrate the effectiveness of the proposed method.

  • Gait of Quadruped Robot including Positioning Control using Linear Visual Servoing

    Y.Inoue, N.Maru (Part: Corresponding author )

    International Journal of Automation Technology   5 ( 5 ) 649 - 654   2011.09  [Refereed]

  • Linear Visual Servoing-Based Control of the Position and Attitude of Omnidirectional Mobile Robots

    A.Ozato, N.Maru (Part: Corresponding author )

    International Journal of Automation Technology   5 ( 4 ) 569 - 574   2011.07  [Refereed]

  • Guidance and Control of Nursing Care Robot using Gaze Point Detector and Linear Visual Servoing

    A.Imasato, N.Maru (Part: Corresponding author )

    International Journal of Automation Technology   5 ( 3 ) 452 - 457   2011.05  [Refereed]

  • Development of a Low Cost Fast Stereo Vision System using CMOS Imagers and DSP

    M.Yamashita, N.Maru (Part: Corresponding author )

    International Journal of Automation Technology   5 ( 3 ) 445 - 451   2011.05  [Refereed]

  • Position and Attitude Control of an Eye-In-Hand Robot by Dynamic Visual Servoing Based on the Virtual Spring-Damper Hypothesis Using Visual-Space Error

    Shoutaro Matsuura, Noriaki Maru (Part: Corresponding author )

    Transactions of the Japan Society of Mechanical Engineers, Series C   77 ( 776 ) 1366 - 1375   2011.04  [Refereed]

  • Development of a Low-Cost, High-Speed Stereo Vision System for Embedded Use Using CMOS Imagers and a DSP

    Makoto Yamashita, Noriaki Maru (Part: Corresponding author )

    Transactions of the Japan Society of Mechanical Engineers, Series C   77 ( 773 ) 158 - 165   2011.01  [Refereed]

  • Control of 6 DOF Arm of the Humanoid Robot by Linear Visual Servoing

    Yusaku Shibuya, Noriaki Maru

    ISIE: 2009 IEEE INTERNATIONAL SYMPOSIUM ON INDUSTRIAL ELECTRONICS ( IEEE )    1774 - 1779   2009  [Refereed]

     View Summary

    This paper proposes a position and posture control method of the 6 DOF arm of the Humanoid Robot by Linear Visual Servoing. It is based on the linear approximation of the forward kinematics which has a similar kinematic structure as a human being. Forward kinematics is the transformation from joint space to binocular visual space. Pseudo inverse matrix of the 3 linear approximation matrices of the 3 points makes it possible to realize position and posture control of 6 DOF arm by Linear Visual Servoing. It is very robust to calibration error of camera angles, because it uses neither camera angles nor joint angles to calculate feedback command. Furthermore, the amount of calculation is very small compared to the conventional visual servoing schemes. Although the conventional 6 DOF arm control methods based on pseudo inverse matrix need non-linear complex calculation using joint angle, the proposed method does not need it. Simulation results are presented to demonstrate the effectiveness of the proposed method.

  • Following Motion Control of a Mobile Robot Using Linear Visual Servoing

    Okamoto, Yamaguchi, Maru (Part: Corresponding author )

    Transactions of the Japan Society of Mechanical Engineers, Series C   72 ( 718 ) 1840 - 1847   2006.06  [Refereed]

  • Redundant arm control by linear visual servoing using pseudo inverse matrix

    Satoshi Mukai, Noriaki Maru

    2006 IEEE/RSJ International Conference on Intelligent Robots and Systems, Vols 1-12 ( IEEE )    1237 - 1242   2006  [Refereed]

     View Summary

    We proposed a simple visual servoing scheme called linear visual servoing (indicated as LVS). It is based on the linearity of the transformation from binocular visual space to joint space of the arm of the humanoid robot which has a similar kinetic structure as a human being. LVS is very robust to calibration errors, especially to camera angle errors, because it uses constant Jacobian matrix with neither camera angles nor joint angles to calculate feedback command. Furthermore, the amount of calculation is very small compared to conventional visual servoing schemes. Hence, it is especially suitable for humanoid robots which use active stereo vision. But conventional LVS can not deal with redundant arm, because it is based on linear approximation of inverse kinematics. In this paper, we propose a redundant arm positioning control method by linear visual servoing based on linear approximation of forward kinematics. Simulation and experimental results are presented to demonstrate the effectiveness of the proposed method.

  • Visual servoing based on coarse optical flow

    Takashi Mitsuda, Yoji Miyazaki, Noriaki Maru, Karl F. MacDorman, Atsushi Nishikawa, Fumio Miyazaki

    IFAC Proceedings Volumes ( Elsevier BV )  32 ( 2 ) 539 - 544   1999.07

    DOI

  • Precise Planar Positioning Using Visual Servoing Based on Coarse Optical Flow

    Takashi Mitsuda, Yoji Miyazaki, Noriaki Maru, Fumio Miyazaki (Part: Last author )

    Journal of the Robotics Society of Japan   17 ( 2 ) 71 - 77   1999.03  [Refereed]

  • Analysis of Surfaces Unreconstructible by Inverse Perspective Transformation and the Stability of Visual Servoing for a Stereo Camera with Calibration Errors

    Kazunobu Fujikawa, Noriaki Maru, Fumio Miyazaki (Part: Last author )

    Journal of the Robotics Society of Japan   16 ( 7 ) 150 - 152   1998.10  [Refereed]

  • Precise planar positioning using visual servoing based on coarse optical flow

    T Mitsuda, Y Miyazaki, N Maru, KF MacDorman, F Miyazaki

    1998 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS - PROCEEDINGS, VOLS 1-3 ( IEEE )    712 - 717   1998  [Refereed]

     View Summary

    In the drive towards miniaturization in manufacturing, accuracy in positioning minute objects by camera is vital. For visual servoing, the rapid and robust detection of features in camera images is also essential to production-line efficiency. Template matching provides a flexibility in achieving this that other methods often lack, because it avoids the need to set object-specific parameters. Unfortunately, standard methods of template matching require much calculation, especially for detecting feature rotation. The delay this causes means that for many applications template matching provides too slow a source of visual feedback. As an alternative, we propose a new method of detecting the translation and rotation of a feature from coarse optical flow, and apply it to visual servoing. Coarse optical flow is derived from the difference in intensity between a region of the initial and current image and their pixel-by-pixel intensity gradients. Unlike template matching, our method can detect large rotations with relatively little calculation. Image resolution is then adjusted from coarse to fine. Sub-pixel accuracy results in a 100-fold improvement in precision (by area). We show experimental results for precise planar positioning. (A toy least-squares sketch of this coarse-flow estimate is given at the end of this publication list.)

  • Reconstruction of Object Surfaces by using Occlusion Information from Active Stereo Vision

    Atsushi Nishikawa, Shinpei Ogawa, Noriaki Maru, Fumio Miyazaki (Part: Last author )

    Systems and Computers in Japan   28 ( 9 ) 86 - 97   1997.10  [Refereed]

  • Reconstruction of object surfaces by using occlusion information from active stereo vision

    Atsushi Nishikawa, Shinpei Ogawa, Noriaki Maru, Fumio Miyazaki

    Systems and Computers in Japan   28 ( 9 ) 86 - 97   1997.08

     View Summary

    With robots moving in an unknown 3D environment, it is necessary to work out a method for reconstruction of surface structures (that is, where a surface is present and where there is a free space). In this paper, a method is proposed to recover surfaces by means of active moving of a stereo camera. First, the surface structure of the environment under consideration is represented by a set of virtual 3D segments (arcs) obtained by connecting edge points that lie on the same epipolar line; then the 3D position information and occlusion information acquired by active moving of the stereo camera are used to decide if a physical surface exists between the arcs. Using surface structures that have been so far recovered, camera location that offers the best results in surface reconstruction is predicted, and the camera is moved to this new location to obtain more precise results. The proposed method offers a way to select view-points that ensure efficient surface reconstruction, which has not been discussed in any previous study. Experiments showed that the proposed method offers correct reconstruction of object surfaces with a small number of measurements. © 1997 Scripta Technica, Inc.

    DOI

  • Binocular Tracking Based on Retino-Cortical Mapping

    Naoki Oshiro, Noriaki Maru, Atsushi Nishikawa, Fumio Miyazaki (Part: Last author )

    Transactions of the Institute of Systems, Control and Information Engineers   10 ( 6 ) 287 - 296   1997.01  [Refereed]

  • Visual Servoing Based on the Use of Binocular Visual Space

    Takashi Mitsuda, Noriaki Maru, Kazunobu Fujikawa, Fumio Miyazaki (Part: Last author )

    Transactions of the Society of Instrument and Control Engineers   33 ( 1 ) 35 - 41   1997.01  [Refereed]

  • Foveated vision for scene exploration

    Naoki Oshiro, Atsushi Nishikawa, Noriaki Maru, Fumio Miyazaki

    Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)   1351   256 - 263   1997

     View Summary

    In this paper, foveated vision for scene exploration is implemented. Peripheral and central vision are the basic capabilities of foveated vision. The information obtained from the peripheral vision is used to determine the next gaze point. Due to the low resolution of the periphery, however, the determination is not always appropriate. To solve this problem, we propose to evaluate the target object with the central vision after gazing. We implement foveated vision based on the Log Polar Mapping (LPM) and construct an evaluation scheme for the target object in the central vision using LPM rotational invariance. The peripheral vision is realized by a Zero Disparity Filter for LPM stereo images. Some experimental results are also shown to demonstrate the effectiveness of the proposed method.

    DOI

  • Binocular visual servoing based on linear time-invariant mapping

    T Mitsuda, N Maru, K Fujikawa, F Miyazaki

    ADVANCED ROBOTICS ( VSP BV )  11 ( 5 ) 429 - 443   1997  [Refereed]

     View Summary

    We propose a simple visual servoing scheme based on the use of binocular visual space. When we use a hand-eye system which has a kinematic structure similar to a human being, we can approximate the transformation from a binocular visual space to a joint space of the manipulator as a linear time-invariant mapping. This relationship makes it possible to generate joint velocities from image observations using a constant linear mapping. This scheme is robust to calibration error, especially to camera turning, because it uses neither camera angles nor joint angles. Some experimental results are also shown to demonstrate that the positioning precision remained unchanged despite the calibration error. (A minimal numerical sketch of this constant-mapping control law is given at the end of this publication list.)

  • Linear Approximation of the Inverse Kinematics Using a Binocular Visual Space

    Takashi Mitsuda, Noriaki Maru, Kazunobu Fujikawa, Fumio Miyazaki (Part: Last author )

    Journal of the Robotics Society of Japan   14 ( 8 ) 1145 - 1151   1996.10  [Refereed]

  • Visual Servoing Based on Linear Approximation of the Inverse Kinematics

    Takashi Mitsuda, Noriaki Maru, Kazunobu Fujikawa, Fumio Miyazaki (Part: Last author )

    Journal of the Robotics Society of Japan   14 ( 5 ) 743 - 750   1996.08  [Refereed]

  • Binocular tracking using log polar mapping

    N Oshiro, N Maru, A Nishikawa, F Miyazaki

    IROS 96 - PROCEEDINGS OF THE 1996 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS - ROBOTIC INTELLIGENCE INTERACTING WITH DYNAMIC WORLDS, VOLS 1-3 ( I E E E )    791 - 798   1996  [Refereed]

     View Summary

    This paper describes a new binocular tracking method using Log Polar Mapping (LPM), which approximately represents the mapping of the retina into the visual cortex in primate vision. Using LPM makes it possible not only to obtain both a high central resolution and a wide field of view, but also to significantly reduce the image data to be processed. In this paper, LPM is performed in software by a lookup table method. Our tracking method utilizes a zero disparity filter (ZDF) for extracting the target object and the virtual horopter method for estimating binocular disparities, respectively. The performance of both target extraction and disparity estimation is improved in comparison with the conventional methods by using LPM. Some experimental results are also shown to demonstrate the effectiveness of the proposed method.

  • Reconstruction of Surface Structures Based on Occlusion Information from Active Stereo Vision

    Atsushi Nishikawa, Shinpei Ogawa, Noriaki Maru, Fumio Miyazaki (Part: Last author )

    The Transactions of the Institute of Electronics, Information and Communication Engineers   79 ( 2 ) 153 - 164   1995.12  [Refereed]

  • Detecting object surfaces by using occlusion information from active binocular stereo

    A NISHIKAWA, S OGAWA, N MARU, F MIYAZAKI

    PROCEEDINGS OF 1995 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION, VOLS 1-3 ( I E E E )    2974 - 2981   1995  [Refereed]

  • 3-D TRACKING OF A MOVING OBJECT BY AN ACTIVE STEREO VISION SYSTEM

    M TANAKA, N MARU, F MIYAZAKI

    IECON '94 - 20TH INTERNATIONAL CONFERENCE ON INDUSTRIAL ELECTRONICS, CONTROL AND INSTRUMENTATION, VOL 1-3 ( IEEE )    816 - 820   1994  [Refereed]

  • Detection of Occluding Contours of Curved Surfaces by Active Stereo Vision

    Atsushi Nishikawa, Noriaki Maru, Fumio Miyazaki (Part: Last author )

    The Transactions of the Institute of Electronics, Information and Communication Engineers   76 ( 8 ) 1654 - 1666   1993.08  [Refereed]

  • Manipulator Control by Visual Servoing with Stereo Vision

    Hiroshi Kase, Noriaki Maru, Atsushi Nishikawa, Fumio Miyazaki (Part: Corresponding author )

    Transactions of the Institute of Systems, Control and Information Engineers   6 ( 8 ) 360 - 367   1993.07  [Refereed]

  • Detection of Binocular Disparity Based on Active Camera Motion

    Noriaki Maru, Atsushi Nishikawa, Fumio Miyazaki, Suguru Arimoto (Part: Lead author )

    Journal of the Robotics Society of Japan   11 ( 2 ) 272 - 280   1993.03  [Refereed]

  • ACTIVE BINOCULAR STEREO

    N MARU, A NISHIKAWA, F MIYAZAKI, S ARIMOTO

    1993 IEEE COMPUTER SOCIETY CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION : PROCEEDINGS ( I E E E, COMPUTER SOC PRESS )    724 - 725   1993  [Refereed]

  • MANIPULATOR CONTROL BY VISUAL SERVOING WITH THE STEREO VISION

    N MARU, H KASE, S YAMADA, A NISHIKAWA, F MIYAZAKI

    IROS 93 : PROCEEDINGS OF THE 1993 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS, VOL 1-3 ( I E E E )    1866 - 1870   1993  [Refereed]

  • VISUAL SERVOING OF THE MANIPULATOR USING THE STEREO VISION

    H KASE, N MARU, A NISHIKAWA, S YAMADA, F MIYAZAKI

    PROCEEDINGS OF THE IECON 93 - INTERNATIONAL CONFERENCE ON INDUSTRIAL ELECTRONICS, CONTROL, AND INSTRUMENTATION, VOLS 1-3 ( I E E E )    1791 - 1796   1993  [Refereed]

  • Active detection of binocular disparities

    Noriaki Maru, Atsushi Nishikawa, Fumio Miyazaki, Suguru Arimoto

        263 - 268   1992

     View Summary

    The major problem in binocular stereo vision is the well-known correspondence problem. Most of the previous works solved this problem by imposing additional constraints such as the smoothness constraint or the ordering constraint. Therefore, they cannot find occlusions correctly or deal with transparent surfaces, because these constraints are violated. In this paper, we propose an active algorithm to detect binocular disparities without using additional constraints. It is based on the motion parallax obtained by a moving monocular camera. The search range of the binocular disparity is restricted based on the monocular motion parallax. The only condition needed to find binocular disparities is the uniqueness of disparity. Experimental results with a complicated scene are presented to demonstrate the effectiveness of this method.

  • An Adaptive Method for Detecting Binocular Disparity in Stereo Vision

    Noriaki Maru, Atsushi Nishikawa, Fumio Miyazaki, Suguru Arimoto

    Journal of the Robotics Society of Japan   9 ( 3 ) 287 - 294   1991.06


Books etc

  • Climbing and Walking Robots

    M.O.Tokhi, G.S.Wirk, M.A.Hossain( Part: Joint author,  Work: Explained a method of controlling all legs of a quadruped walking robot using stereo omnidirectional images.)

    Springer-Verlag  2006.01 

     View Summary

    Explained a method of controlling all legs of a quadruped walking robot using stereo omnidirectional images.

  • Experimental Robotics, Lecture Notes in Computer Science 1351

    Naoki Oshiro, Atsushi Nishikawa, Noriaki Maru, Fumio Miyazaki( Part: Joint author,  Work: Foveated Vision for Scene Exploration)

    Springer-Verlag  1997.01 

     View Summary

    This chapter describes a robot vision system with a fovea, modeled on the human visual system. Giving a robot a fovea, as humans have, makes it possible to search for objects in a scene efficiently. Concretely, it proposes how to build central and peripheral vision from log-polar-mapped (LPM) images, how to determine the next gaze point from the peripheral vision, and how to evaluate the validity of that gaze point with the central vision. It shows that using the rotational invariance of LPM images for this evaluation reduces the influence of the eye posture. (A minimal sketch of a log-polar-mapping lookup table is given after this book list.)

  • Introduction to Opto-Mechatronics (光メカトロニクス入門)

    Wakayama University Opto-Mechatronics Research Group( Part: Joint author,  Work: Wrote Chapter 6, Robots)

    Kyoritsu Shuppan  1996.04 

     View Summary

    This book explains how opto-mechatronics technology is used in robots. Concretely, it describes the technologies by which the human brain, legs, arms, and hands are realized in robots, and which opto-mechatronics products are used for them. It also introduces future intelligent robots that recognize their environment and act autonomously, and explains robot vision, which is indispensable for this, in comparison with human vision.

  • Experimental Robotics III, Lecture Notes in Control and Information Sciences 200

    Atsushi Nishikawa, Noriaki Maru, Fumio Miyazaki( Part: Joint author,  Work: Detection of Occluding Contours and Occlusion by Active Binocular Stereo)

    Springer-Verlag  1993.12 

     View Summary

    This chapter presents a method of detecting the occluding contours of curved object surfaces and occlusion quickly and reliably by active stereo vision, in which the stereo camera is moved actively. Because the method uses the coarse monocular range information obtained by actively moving the camera, it finds stereo correspondences without the conventional heuristic constraints, and is therefore effective even for complicated scenes where such constraints do not hold. The effectiveness of the proposed method is shown using real images captured with a stereo camera.

Misc

  • 428 Study on Control of the Hand Stiffness and Hand Trajectory of Musculoskeletal Arm Using Nonlinear Actuator Model

    FUJIMOTO Hiromi, MARU Noriaki

    Proceedings of the JSME Kansai Branch Annual Meeting ( The Japan Society of Mechanical Engineers )  2014 ( 89 ) "4 - 28"   2014.03

  • 710 Obstacle Avoidance Reaching with the Characteristics of Human Upper-Limb Movement by Binocular Visual Space Based Visual Servoing

    NAKAMURA Satoru, MARU Noriaki

    Proceedings of the JSME Kansai Branch Annual Meeting ( The Japan Society of Mechanical Engineers )  2014 ( 89 ) "7 - 10"   2014.03

  • 722 Research on an Adaptive Walking System for Irregular Terrain of a Bipedal Robot Based on Semi-Passive Dynamic Walking

    MORITA Yusuke, MARU Noriaki

    Proceedings of the JSME Kansai Branch Annual Meeting ( The Japan Society of Mechanical Engineers )  2011 ( 86 ) "7 - 22"   2011.03

  • Position and Orientation Control of Omnidirectional Mobile Robot by Linear Visual Servoing

    OZATO Atsushi, MARU Noriaki

    TRANSACTIONS OF THE JAPAN SOCIETY OF MECHANICAL ENGINEERS Series C ( The Japan Society of Mechanical Engineers )  77 ( 774 ) 460 - 469   2011

     View Summary

    We propose a position and orientation control method for an omnidirectional mobile robot by Linear Visual Servoing (LVS). We define both a new binocular visual space, which represents the attitude of the target using the binocular visual coordinates of two markers, and a new motion space of the omnidirectional mobile robot, in order to realize position and orientation control by LVS. The method is robust to calibration error of the camera angles, because it does not use the camera angles to calculate the feedback command. Simulation results are presented to demonstrate the effectiveness of the proposed method.

    DOI

  • Development of a Low Cost High Speed Stereo Vision System for Embedded Use Using CMOS Imagers and DSP

    YAMASHITA Makoto, MARU Noriaki

    TRANSACTIONS OF THE JAPAN SOCIETY OF MECHANICAL ENGINEERS Series C ( The Japan Society of Mechanical Engineers )  77 ( 773 ) 158 - 165   2011

     View Summary

    We developed a low-cost, high-speed stereo vision system for embedded use, built from off-the-shelf CMOS imagers and a TI DSP (Digital Signal Processor) board that can capture subframe images in a shorter period than the video rate. Because the proposed system is small, low in energy consumption, and low in cost, it is suitable for embedded use or for small autonomous mobile robots. Some experiments demonstrate that this vision system can capture a 96[pixel]×96[pixel] image in a 1[ms] period and detect the target position and track the target in a 3[ms] period.

    DOI

  • Position and Attitude Control of Eye-In-Hand Robot by Dynamic Visual Servoing Based on Virtual Spring-Damper Hypothesis Using Binocular Visual Space Error

    MATSUURA Shoutaro, MARU Noriaki

    TRANSACTIONS OF THE JAPAN SOCIETY OF MECHANICAL ENGINEERS Series C ( The Japan Society of Mechanical Engineers )  77 ( 776 ) 1366 - 1375   2011

     View Summary

    We propose a position and attitude control method for an Eye-In-Hand robot by dynamic visual servoing based on the virtual spring-damper hypothesis using the binocular visual space error. We obtain the 3D position and attitude error from the binocular visual space error using the Jacobian between Cartesian space and binocular visual space. The proposed method is robust to calibration error of the camera angles, because it does not use the camera angles to calculate the feedback command. Simulation results are presented to demonstrate the effectiveness of the proposed method. (A toy command of this virtual spring-damper form is sketched at the end of this list.)

    DOI

  • 1328 The high-speed reaching control of the robot-arm by torque control type visual servoing using binocular visual space

    SAKAGUCHI Hirotoshi, MARU Noriaki

    Proceedings of the JSME Kansai Branch Annual Meeting ( The Japan Society of Mechanical Engineers )  2009 ( 84 ) "13 - 28"   2009.03

  • B-10-104 Synchronous Imager Control on Visible Light Communication

    Yamashita Makoto, Maru Noriaki

    Proceedings of the IEICE General Conference ( The Institute of Electronics, Information and Communication Engineers )  2009 ( 2 ) 423 - 423   2009.03

  • Following Motion Control of the Mobile Robot by Using Linear Visual Servoing

    OKAMOTO Kazuya, YAMAGUCHI Kengo, MARU Noriaki

    Transactions of the Japan Society of Mechanical Engineers. C ( The Japan Society of Mechanical Engineers )  72 ( 718 ) 1840 - 1847   2006.06

     View Summary

    We propose a method for following motion control of the mobile robot by using Linear Visual Servoing (LVS). Following motion control is realized by the following motion of the body to the target object keeping a desired distance. Following motion control by using LVS is based on linear approximation of the transformation between binocular visual space and motion space of the mobile robot. Motion space is defined by translational velocity and rotational velocity of the robot coordinate system which is attached at the center of gravity of the mobile robot. Some experimental results are presen...

    DOI

  • Reaching control of the humanoid robot by linear visual servoing

    K Yamaguchi, K Namba, N Maru

    Proceedings of the Tenth IASTED International Conference on Robotics and Applications ( ACTA PRESS )    113 - 117   2004

     View Summary

    This paper proposes a reaching control method for the 3-d.o.f. arm of a humanoid robot by linear visual servoing. Linear visual servoing is based on the linear approximation between the binocular visual space and the joint space of the arm of a humanoid robot which has a kinematic structure similar to a human being. It is very robust to calibration error, especially to camera angle errors and joint angle errors, because it uses neither camera angles nor joint angles to calculate the feedback command. Although we showed that 3D linear visual servoing is effective in reaching control of the 3-d.o.f. arm of the humanoid robot, the reaching speed is slow because of the image sampling time. Furthermore, visual servoing is effective only when both the end tip of the arm and the target are in the image. In this paper, we propose a fast and accurate reaching control method for the humanoid robot by combining slow but accurate linear visual servoing with inaccurate but fast open-loop control. Some experimental results are presented to demonstrate the effectiveness of the proposed method.

  • Positioning control of the arm of the humanoid robot by linear visual servoing

    K Namba, N Maru

    2003 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION, VOLS 1-3, PROCEEDINGS ( IEEE )  2003   3036 - 3041   2003

     View Summary

    This paper presents a positioning control of the arm of the humanoid robot by linear visual servoing. Linear visual servoing is based on the linear approximation between binocular visual space and joint space of the arm of the humanoid robot. It is very robust to calibration error, especially to camera angle errors and joint angle errors, because it uses neither camera angles nor joint angles to calculate feedback command. In this paper, we propose a method to expand work space of linear visual servoing by using neck joint. We obtain the linear approximation matrix in wide space and express it as a function of the neck angle by using the neural network. Some experimental results are presented to demonstrate the effectiveness of the proposed method.

  • Precise Planar Positioning Method Using Visual Servoing Based on Coarse Optical Flow

    MITSUDA Takashi, MIYAZAKI Yoji, MARU Noriaki, MIYAZAKI Fumio

    Journal of the Robotics Society of Japan ( The Robotics Society of Japan )  17 ( 2 ) 227 - 233   1999.03

     View Summary

    The planar positioning of an object using a camera is an important technique for minute manufacturing. Detecting a feature in an image is an essential subject for it, and research has been actively pursued in this area. Template matching is a useful method of detecting a feature in an image. It doesn't require any complicated settings, and we can use it more easily than other methods (centroid determination in binary image processing, etc.) with equipment on the market. However, template matching is poor at detecting the rotation of a feature, and the amount of calculation is large. I...

    DOI

  • The Analysis of Unreconstructible Surface from Images, and the Effect of Calibration Error on Visual Servoing in a Stereo Vision System

    FUJIKAWA Kazunobu, KOARA Kengo, MIYAZAKI Fumio, MARU Noriaki

    Journal of the Robotics Society of Japan ( The Robotics Society of Japan )  16 ( 7 ) 1026 - 1028   1998.10

     View Summary

    For positioning a robot manipulator by visual servoing with stereo vision when the pan angles are not exactly calibrated, this paper presents an analysis of the surface that cannot be reconstructed from the images, the effect on visual servoing, and the region that is robust to calibration error in the pan angles.

    DOI

  • The Analysis of Unreconstructible Surface from Images, and the Effect of Calibration Error on Visual Servoing in a Stereo Vision System

    Fujikawa Kazunobu, Koara Kengo, Miyazaki Fumio, Maru Noriaki

    Journal of the Robotics Society of Japan ( The Robotics Society of Japan )  16 ( 7 ) 1026 - 1028   1998

     View Summary

    For positioning a robot manipulator by visual servoing with stereo vision when the pan angles are not exactly calibrated, this paper presents an analysis of the surface that cannot be reconstructed from the images, the effect on visual servoing, and the region that is robust to calibration error in the pan angles.

    DOI

  • Suitable kinematic relationships which simplify the transformation between visual and proprioceptive coordinate systems

    MITSUDA Takashi, MARU Noriaki, MIYAZAKI Fumio

    Proceedings of the Symposium on Biological and Physiological Engineering   12   413 - 416   1997.09

  • Binocular Tracking based on Retino-Cortical Mapping

    OSHIRO Naoki, MARU Noriaki, NISHIKAWA Atsushi, MIYAZAKI Fumio

    Transactions of the Institute of Systems, Control and Information Engineers ( The Institute of Systems, Control and Information Engineers )  10 ( 6 ) 287 - 296   1997.06

     View Summary

    This paper describes a new binocular tracking method using Log Polar Mapping (LPM) which approximately represents the mapping of the retina into the visual cortex in primate vision : Using LPM makes it possible not only to obtain both a high central resolution and a wide field of view, but also to significantly reduce processing image data. In this paper, LPM is performed in software by lookup table method. Our tracking method utilizes zero disparity filter (ZDF) for extracting the target object and virtual horopter method for estimating binocular disparities, respectively. The performance of both target extraction and disparity estimation is improved in comparison with the conventional methods, by using LPM. Some experimental results are also shown to demonstrate the effectiveness of the proposed method.

    DOI

  • Visual Servoing Based on the Use of Binocular Visual Space

    MITSUDA Takashi, MARU Noriaki, FUJIKAWA Kazunobu, MIYAZAKI Fumio

    Transactions of the Society of Instrument and Control Engineers ( The Society of Instrument and Control Engineers )  33 ( 1 ) 35 - 41   1997.01

     View Summary

    We propose a simple visual servoing scheme based on the use of binocular visual space. When we use a hand-eye system which has a similar kinematic structure to a human being, we can approximate the transformation from a binocular visual space to a joint space of the manipulator as a linear time-invariant mapping. This relationship makes it possible to generate joint velocities from image observations using a constant linear mapping. This scheme is robust to calibration error, because it uses neither camera angles nor joint angles. Some experimental results are also shown to demonstrate the positioning precision remained unchanged despite the calibration error.

    DOI

  • Linear Approximation of the Inverse Kinematics by Using a Binocular Visual Space

    MITSUDA Takashi, MARU Noriaki, FUJIKAWA Kazunobu, MIYAZAKI Fumio

    Journal of the Robotics Society of Japan ( The Robotics Society of Japan )  14 ( 8 ) 1145 - 1151   1996.11

     View Summary

    We propose a linear approximation method for the inverse kinematics of a manipulator by using a binocular visual space. When we use a hand-eye system which has a structure similar to a human being, we can approximate the transformation from a binocular visual space to a joint space of the manipulator as a linear function. We show that the most suitable structure of a stereo camera and a manipulator for linear approximation of the inverse kinematics is similar to that of a human being.

    DOI

  • Visual Servoing Based on Linear Approximation of the Inverse Kinematics

    MITSUDA Takashi, MARU Noriaki, FUJIKAWA Kazunobu, MIYAZAKI Fumio

    Journal of the Robotics Society of Japan ( The Robotics Society of Japan )  14 ( 5 ) 743 - 750   1996.07

     View Summary

    We propose a simple visual servoing scheme based on linear approximation of the inverse kinematics. When we use a hand-eye system which has a structure similar to a human being, we can approximate the transformation from a binocular visual space to a joint space of the manipulator as a linear function. This relationship makes it possible to produce the desired joint angles from the image data using a constant linear function instead of the variable nonlinear image Jacobian and robot Jacobian. This method is robust to calibration error, because it uses neither camera angles nor joint angles. We...

    DOI

  • Reconstruction of Object Surfaces by Using Occlusion Information from Active Stereo Vision

    NISHIKAWA Atsushi, OGAWA Shinpei, MARU Noriaki, MIYAZAKI Fumio

    The transactions of the Institute of Electronics, Information and Communication Engineers ( The Institute of Electronics, Information and Communication Engineers )  79 ( 2 ) 153 - 164   1996.02

     View Summary

    For a robot that must recognize and act in an unknown 3D environment, establishing a method for reconstructing the surface structure of the environment (where there are surfaces and where there is free space) is an essential problem. This paper proposes a method of reconstructing the surface structure of the environment by actively moving a stereo camera. First, the surface structure is represented by a set of virtual 3D segments (arcs), each corresponding to two edges that appear consecutively on the same epipolar line in the left and right images. Based on the 3D positions of the edge points and the occlusion information obtained by actively moving the stereo camera, the method decides whether an object surface exists between each pair of arcs. Furthermore, based on the surface structure reconstructed so far, it predicts the camera position from which the largest number of new surface/no-surface decisions could be made, moves the stereo camera to that position, and repeats the same processing until no arcs remain that could newly be decided. The method gives one answer to the camera movement strategy and viewpoint selection problem for efficient surface reconstruction, which had hardly been discussed in previous work. Experiments with real scenes show that the surface structure of the environment can be reconstructed correctly with a small number of observations.

  • Detection of Occluding Contours by Using Active Stereo Vision

    Nishikawa Atsushi, Maru Noriaki, Miyazaki Fumio

    The transactions of the Institute of Electronics, Information and Communication Engineers ( The Institute of Electronics, Information and Communication Engineers )  76 ( 8 ) 1654 - 1666   1993.08

     View Summary

    The image of a contour that is generated depending on the viewpoint, such as the silhouette of a sphere or the side of a cylinder (a contour generator), is called an occluding contour. Occluding contours carry much useful information for describing object surfaces. This paper proposes a method of detecting the occluding contours of curved surfaces by actively moving a stereo camera. First, a model (the occluding-contour model) describing the geometric relationship between the object surface and the occluding contour is derived. Fitting the image motion caused by the active camera motion to this model constrains the region in which the corresponding point can exist in the other camera. Stereo correspondence between the images is established based on this geometric constraint, and the occluding contour in the image is detected by least-squares fitting of the obtained sequence of corresponding points to the model. Because the proposed method performs the stereo correspondence using the occluding-contour model directly, it detects occluding contours more accurately than model-free methods or conventional methods based on heuristic constraints. It also has the advantage that the local 3D shape near the occluding contour can easily be estimated from the model. The effectiveness of the proposed method is shown by several experiments using both synthetic and real images.

  • Manipulator Control by Visual Servoing with Stereo Vision

    KASE Hiroshi, MARU Noriaki, NISHIKAWA Atsushi, MIYAZAKI Fumio

    Transactions of the Institute of Systems,Control and Information Engineers ( システム制御情報学会 )  6 ( 8 ) 360 - 367   1993.08

     View Summary

    This paper presents a new method of visual servoing with stereo vision to control the position of a manipulator with respect to an object. Conventional control methods by visual servoing use a monocular camera and have several problems. For example, either shape information or the desired distance of the target object from the camera must be given. Furthermore, the control is not stable if the initial positional error of the features in the image is very large. These problems are caused by the image Jacobian matrix; that is, an approximate value at the desired position is used instead of the correct one. By using stereo vision, the image Jacobian matrix can be calculated correctly at any position, so neither shape information nor the desired distance of the target object is required. It is also stable even if the initial error is very large. Both simulation and experimental results are presented to demonstrate the effectiveness of this method.

    DOI

  • Detection of Binocular Disparity based on Active Camera Motion

    MARU Noriaki, NISHIKAWA Atsushi, MIYAZAKI Fumio, ARIMOTO Suguru

    Journal of the Robotics Society of Japan ( The Robotics Society of Japan )  11 ( 2 ) 272 - 280   1993.03

     View Summary

    The major problem in binocular stereo vision is the well-known correspondence problem. Most of the previous works solved it by regularization with additional constraints such as the smoothness constraint or the ordering constraint. Therefore, they cannot find occlusions correctly or deal with transparent surfaces, because these constraints are violated in complicated scenes containing many occlusions. In this paper, we propose a method to detect binocular disparity and occlusion without using additional constraints by moving stereo cameras actively. It is based on the motion...

    DOI


Conference Activities & Talks

  • Axis-direction control of the stiffness ellipsoid of a musculoskeletal robot arm by internal muscle forces using iterative learning

    Kaneda, Maru

    ISCIE-SICE Joint Symposium  2022.01.10  

  • Trajectory control of a mobile robot using visual-space visual servoing and DWA

    Tsurumi, Maru

    JSME Kansai Branch Annual Meeting  2021.03.12  

  • Study on obstacle avoidance during drone landing using visual space and fuzzy control

    Takuya Harada, Noriaki Maru

    ISCIE-SICE Joint Symposium  2021.01.10  

  • Obstacle avoidance of a mobile robot using an angular potential method with visual-space-based visual servoing

    Hidaka, Maru

    SICE Kansai Branch and ISCIE Joint Symposium  2021.01  

  • Obstacle avoidance using a stereo fisheye camera in automatic landing of a quadcopter by visual-space visual servoing

    Fuchigami, Maru

    ISCIE-SICE Joint Conference  2020.01.10  

  • Trajectory control using iterative learning control in visual-space-based visual servoing of an Eye-And-Hand robot arm

    Hirose, Maru

    ISCIE-SICE Joint Conference  2020.01.10  

  • Control of an Eye-And-Hand robot arm by visual-space-based visual servoing

    Hirose, Maru

    ROBOMEC  2019.06  

  • Compensatory eye movement using the visual-space Jacobian in visual-space visual servoing

    Koshiba, Maru

    JSME Kansai Branch Annual Meeting  2019.03.19  

  • Trajectory control in visual-space-based visual servoing

    Horie, Maru

    2019.03.19  

  • Position and attitude control of an Eye-In-Hand robot arm based on visual space

    Horie, Maru

    Annual Conference of the Robotics Society of Japan  2018.09  

  • Position and attitude control of an Eye-In-Hand robot arm using visual-space visual servoing

    Matsuura, Maru

    SICE System Integration Division Annual Conference  2011.12  


Research Exchange

  • ISCIE-SICE Joint Symposium

    2022.01

  • JSME Kansai Branch Annual Meeting

    2021.03

  • ISCIE-SICE Joint Symposium

    2021.01

  • ISCIE-SICE Joint Symposium

    2021.01

  • ISCIE-SICE Joint Symposium

    2020.01

  • ISCIE-SICE Joint Symposium

    2020.01

  • ROBOMEC

    2019.06

  • ISCIE Annual Conference

    2017.05

  • ISCIE Annual Conference

    2016.05

  • ISCIE Annual Conference

    2015.05

  • ISCIE Annual Conference

    2014.05

  • ISCIE Annual Conference

    2013.05

  • Technical meeting of the Decentralized Autonomous Systems Technical Committee

    2012.06

  • ISCIE Annual Conference

    2012.05

  • 立石記念公園会

    2012.05

  • 10th ISCIE Industry Exchange Meeting

    2012.03

  • 12th SICE Control Division Conference

    2012.03

  • IEEE Kansai Section 66th Technical Lecture Meeting

    2011.12

  • Agricultural power-assist project progress meeting

    2011.12

  • Case studies of MATLAB use

    2011.10

  • nac Osaka MAC3D human measurement seminar

    2011.10

  • 2011 Opto-Mechatronics Technology at Wakayama University

    2011.09

  • Agricultural power-assist project progress meeting

    2011.09

  • Annual Conference of the Robotics Society of Japan

    2011.09

  • 26th Symposium on Biological and Physiological Engineering

    2011.09

  • Comprehensive Brain Science Network Summer Workshop

    2011.08

  • 15th ASSC Annual Meeting

    2011.06

  • 55th ISCIE Annual Conference

    2011.05
     


Public Funding (other government agencies or their auxiliary organs, local governments, etc.)

  • Development of a power-assist suit for agriculture

    2010.04
    -
    2012.03
     

    Co-investigator

Instructor for open lectures, peer review for academic journals, media appearances, etc.

  • Lecturer

    2020.12
    -
    2021.01.31

    Wakayama University

     View Details

    Recurrent education

    Recurrent education (Fundamentals of Robotics)

  • Lecturer

    2016.07
    -
    Now

    Career guidance study group "Kaichi Open Seminar"

     View Details

    Lecturer for public talks, etc.

    Lecturer

  • Peer review of papers

    2014.04
    -
    2015.03

    The Robotics Society of Japan

     View Details

    Editorial board member, reviewer, or referee for academic journals, etc.

    Peer review of papers, term: 2014.4.1 to 2015.3.31

  • Peer review of papers

    2013.04
    -
    Now

    The Robotics Society of Japan

     View Details

    Editorial board member, reviewer, or referee for academic journals, etc.

    Peer review of papers

  • Open hands-on learning session

    2013.04

    Unknown

     View Details

    Planning and lecturing for open lectures, public talks, etc.

    Had participants try operating robots in a hands-on class, date: November 24

  • Demonstration in a hands-on class

    2013.04

    Other

     View Details

    Trial faculty admission and outreach lectures for elementary, junior high, and high school students

    Gave a humanoid robot demonstration, date: 2013.11

  • Demonstration at a university information session

    2013.04

    Other

     View Details

    Trial faculty admission and outreach lectures for elementary, junior high, and high school students

    Gave a humanoid robot demonstration, date: 2013.8

  • Peer review of papers

    2012.04
    -
    2012.05

    Journal of the Robotics Society of Japan

     View Details

    Editorial board member, reviewer, or referee for academic journals, etc.

    Peer review of papers

  • Open hands-on learning session

    2012.04

    Unknown

     View Details

    Planning and lecturing for open lectures, public talks, etc.

    Had participants try operating robots in a hands-on class, date: November 24

  • Demonstration at a university information session

    2012.04

    Other

     View Details

    Trial faculty admission and outreach lectures for elementary, junior high, and high school students

    Gave a humanoid robot demonstration, date: 2012.8

  • Demonstration in a hands-on class

    2012.04

    Other

     View Details

    Trial faculty admission and outreach lectures for elementary, junior high, and high school students

    Gave a humanoid robot demonstration, date: 2012.11

  • Science camp

    2012.04

    Other

     View Details

    Trial faculty admission and outreach lectures for elementary, junior high, and high school students

    Ran a two-night, three-day science camp for high school students, date: August 20 to 22

  • Open hands-on learning session

    2011.04

    Unknown

     View Details

    Planning and lecturing for open lectures, public talks, etc.

    Had participants try operating robots in a hands-on class, date: November 24

  • Demonstration at a university information session

    2011.04

    Other

     View Details

    Trial faculty admission and outreach lectures for elementary, junior high, and high school students

    Gave a humanoid robot demonstration, date: 2011.8

  • Demonstration in a hands-on class

    2011.04

    Other

     View Details

    Trial faculty admission and outreach lectures for elementary, junior high, and high school students

    Gave a humanoid robot demonstration, date: 2011.11

  • Demonstration in a hands-on class

    2010.04

    Other

     View Details

    Trial faculty admission and outreach lectures for elementary, junior high, and high school students

    Gave a humanoid robot demonstration, date: 2010.11

  • Demonstration at a university information session

    2010.04

    Other

     View Details

    Trial faculty admission and outreach lectures for elementary, junior high, and high school students

    Gave a humanoid robot demonstration, date: 2010.8

  • Open campus

    2009.11

    Unknown

     View Details

    Planning and lecturing for open lectures, public talks, etc.

    Gave a humanoid robot demonstration at an open campus, date: 2009.11

  • Demonstration at a university information session

    2009.08

    Other

     View Details

    Trial faculty admission and outreach lectures for elementary, junior high, and high school students

    Gave a humanoid robot demonstration, date: 2009.8

  • Demonstration at a university information session

    2009.04

    Other

     View Details

    Trial faculty admission and outreach lectures for elementary, junior high, and high school students

    Gave a humanoid robot demonstration, date: 2008.8 to 2009.8

  • Demonstration at an open campus

    2008.11

    Unknown

     View Details

    Planning and lecturing for open lectures, public talks, etc.

    Demonstration at an open campus, date: 2008.11

  • Demonstration at a university information session

    2008.04

    Other

     View Details

    Trial faculty admission and outreach lectures for elementary, junior high, and high school students

    Gave a humanoid robot demonstration, date: 2008.8 to 2009.8

  • Demonstration at an open campus

    2007.11

    Unknown

     View Details

    Planning and lecturing for open lectures, public talks, etc.

    Gave a humanoid robot demonstration, date: 2007.11

  • Demonstration at a university information session

    2007.08

    Other

     View Details

    Trial faculty admission and outreach lectures for elementary, junior high, and high school students

    Gave a humanoid robot demonstration, date: 2007.8

  • Demonstration at an open campus

    2006.11

    Unknown

     View Details

    Planning and lecturing for open lectures, public talks, etc.

    Gave a humanoid robot demonstration, date: 2006.11

  • Demonstration at a university information session

    2006.08

    Other

     View Details

    Trial faculty admission and outreach lectures for elementary, junior high, and high school students

    Gave a humanoid robot demonstration, date: 2006.8

  • Encouragement Award selection committee member

    2006.04
    -
    2006.12

    The Robotics Society of Japan

     View Details

    Editorial board member, reviewer, or referee for academic journals, etc.

    Encouragement Award selection committee member, term: 1

  • Wakayama Commerce and Industry Festival

    2004.10

    Wakayama Commerce and Industry Festival

     View Details

    Planning and lecturing for open lectures, public talks, etc.

    At the request of the Wakayama Chamber of Commerce and Industry, gave a robot exhibition and demonstration at the Wakayama Commerce and Industry Festival, date: 2004.10

  • Super Science High School

    2004.07

    Other

     View Details

    Trial faculty admission and outreach lectures for elementary, junior high, and high school students

    At the request of Toin High School, as robotics education in its Super Science High School program, lectured on how robots work and supervised a hands-on building exercise using Mindstorms for 40 first-year science-track students, date: 2004.7

  • Open lecture

    2004.04

    The Open University of Japan

     View Details

    Planning and lecturing for open lectures, public talks, etc.

    At the request of The Open University of Japan, lectured to working adults on how robots work, date: 2003.4 to 2004.4

  • Super Science High School

    2004.04

    Other

     View Details

    Trial faculty admission and outreach lectures for elementary, junior high, and high school students

    At the request of Toin High School, as robotics education in its Super Science High School program, lectured on how robots work and supervised a hands-on building exercise using Mindstorms for 40 first-year science-track students, date: 2003.11 to 2004.11

  • Open lecture

    2003.04

    The Open University of Japan

     View Details

    Planning and lecturing for open lectures, public talks, etc.

    At the request of The Open University of Japan, lectured to working adults on how robots work, date: 2003.4 to 2004.4

  • Super Science High School

    2003.04

    Other

     View Details

    Trial faculty admission and outreach lectures for elementary, junior high, and high school students

    At the request of Toin High School, as robotics education in its Super Science High School program, lectured on how robots work and supervised a hands-on building exercise using Mindstorms for 40 first-year science-track students, date: 2003.11 to 2004.11

  • Super Science Program

    2003.01

    Other

     View Details

    Trial faculty admission and outreach lectures for elementary, junior high, and high school students

    In Toin High School's Super Science Program, taught about 40 first-year science-track high school students how robots work and supervised hands-on creation of motion programs, date: 2003.1

  • Journal editorial committee member

    2002.04
    -
    2004.03

    The Robotics Society of Japan

     View Details

    Editorial board member, reviewer, or referee for academic journals, etc.

    Journal editorial committee member, term: 2 years

  • Paper review subcommittee member

    2002.04
    -
    2004.03

    The Robotics Society of Japan

     View Details

    Editorial board member, reviewer, or referee for academic journals, etc.

    Paper review subcommittee member, term: 2 years

  • Media appearances

    2000.11

    Yomiuri Shimbun

     View Details

    Newspaper coverage and TV/radio appearances related to research results

    Newspaper coverage of an arm control method by linear visual servoing, based on a linear approximation between visual space and joint space, for a humanoid robot with a human-like structure.


Committee member history in academic associations, government agencies, municipalities, etc.

  • Vice advisor, JSME Kansai Student Association

    2022.04.01
    -
    2024.03.31


    The Japan Society of Mechanical Engineers

     View Details

    Mechanical engineering

    Vice advisor

  • Councilor

    2014.04
    -
    2015.03


    The Japan Society of Mechanical Engineers

     View Details

    Public committee member for academic societies, government, local governments, etc.

    Public committee member for academic societies, government, local governments, etc., term: 2014.4.1 to 2015.3.31

  • Councilor

    2013.04
    -
    Now


    The Japan Society of Mechanical Engineers

     View Details

    Public committee member for academic societies, government, local governments, etc.

    Public committee member for academic societies, government, local governments, etc., term: 2013.4.1 to 2014.3.31

  • Councilor

    2012.04
    -
    2014.03


    The Japan Society of Mechanical Engineers

     View Details

    Public committee member for academic societies, government, local governments, etc.

    Public committee member for academic societies, government, local governments, etc., term: April 1, 2012 to March 31, 2014

  • Councilor

    2010.04
    -
    2012.03


    The Robotics Society of Japan

     View Details

    Public committee member for academic societies, government, local governments, etc.

    Public committee member for academic societies, government, local governments, etc., term: 2010.4 to 2012.3

  • Executive committee member, Intelligent Mechatronics Workshop

    2009.07
    -
    2009.09


    The Japan Society for Precision Engineering

     View Details

    Public committee member for academic societies, government, local governments, etc.

    Public committee member for academic societies, government, local governments, etc., term: 2009.7 to 2009.9

  • Encouragement Award selection committee member

    2006.11
    -
    2007.09


    The Robotics Society of Japan

     View Details

    Committee member for national and local governments, other universities, research institutes, etc.

    Encouragement Award selection committee member, term: 2006.11 to 2007.9
