3D vision for mobile robots: Mapping, exploration and telepresence.
Record type: Bibliographic – electronic resource : Monograph/item
Title/Author: 3D vision for mobile robots: Mapping, exploration and telepresence.
Author: Du, Jianhao.
Publisher: Ann Arbor : ProQuest Dissertations & Theses, 2016
Extent: 170 p.
Note: Source: Dissertation Abstracts International, Volume: 78-05(E), Section: B.
Note: Adviser: Weihua Sheng.
Contained by: Dissertation Abstracts International, 78-05B(E).
Subject: Electrical engineering.
Electronic resource: http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=10188461
ISBN: 9781369360523
Du, Jianhao.
3D vision for mobile robots: Mapping, exploration and telepresence.
- Ann Arbor : ProQuest Dissertations & Theses, 2016 - 170 p.
Source: Dissertation Abstracts International, Volume: 78-05(E), Section: B.
Thesis (Ph.D.)--Oklahoma State University, 2016.
Visual perception is a fundamental task in many robotic applications. This dissertation focuses on developing a framework for mobile robots to intelligently perceive, explore and interact with 3D environments using an RGB-D camera. First, a motion-assisted 3D mapping method is proposed to improve the accuracy of 3D mapping. The motion data are fused with the point clouds after calibration and a multi-level ICP (Iterative Closest Point) method is applied for further refinement of the 3D map. Second, we propose a proactive strategy for robotic 3D mapping in which the robot consistently monitors its mapping performance and takes actions when the mapping performance deteriorates. A pattern is projected into the environment to increase the number of features when the pose estimation has large errors. Third, we propose a method to enable the robot to autonomously explore an unknown 3D indoor environment and efficiently generate a 3D map of the environment. The viewpoint planning is based on the frontiers extracted from a partial map and an entropy-based metric is used to select the next best viewpoint. Fourth, a human-robot collaborative 3D mapping framework is proposed which combines both robot's capability of quantitatively evaluating the map and human's high-level intelligence of global planning. The motion of the robot is fused using a Bayesian framework to improve the mapping accuracy. A self-recovery mechanism is introduced to improve the robustness. Fifth, a virtual-reality-based human-guided 3D mapping and virtual exploration framework is proposed and implemented using an Oculus Rift goggle. The 3D data are rendered to the goggle for stereoscopic display and the user's intentions are recognized to control the robot. Sixth, we implement a virtual-reality-based robotic telepresence system. The intentions of the user are inferred using hidden Markov models and four microphones are used to detect the direction of the sound. 
A two-stage collaborative control scheme is proposed to combine intention recognition and sound localization. Various experiments are conducted and the proposed intelligent robotic 3D system is fully evaluated for mapping, exploration and telepresence. Our work can be extended in many robotic applications.
ISBN: 9781369360523
Subjects--Topical Terms: Electrical engineering.
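The autonomous exploration described in the abstract plans viewpoints from frontiers of a partial map and uses an entropy-based metric to pick the next best viewpoint. A minimal sketch of that idea, assuming a probabilistic occupancy grid and a caller-supplied visibility model (the names and data layout here are illustrative, not the dissertation's implementation):

```python
import math

def map_entropy(occupancy_probs):
    """Shannon entropy (bits) of a set of cell occupancy probabilities.
    Unknown cells (p near 0.5) contribute the most entropy."""
    h = 0.0
    for p in occupancy_probs:
        if 0.0 < p < 1.0:
            h -= p * math.log2(p) + (1 - p) * math.log2(1 - p)
    return h

def select_next_viewpoint(frontier_viewpoints, visible_cells, grid):
    """Pick the frontier viewpoint whose visible region has the highest
    entropy, i.e. the view expected to reveal the most unknown space.
    `frontier_viewpoints`: candidate poses; `visible_cells(v)`: indices of
    cells observable from v; `grid`: cell index -> occupancy probability.
    All three are assumed inputs, not part of the original work."""
    def information_gain(v):
        return map_entropy(grid[c] for c in visible_cells(v))
    return max(frontier_viewpoints, key=information_gain)
```

For example, a viewpoint facing two fully unknown cells (p = 0.5 each, 2 bits of entropy) is preferred over one facing cells already mapped as mostly occupied or free.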
LDR    03164nmm a2200289 4500
001    502172
005    20170619070734.5
008    170818s2016 ||||||||||||||||| ||eng d
020    $a 9781369360523
035    $a (MiAaPQ)AAI10188461
035    $a AAI10188461
040    $a MiAaPQ $c MiAaPQ
100 1  $a Du, Jianhao. $3 766235
245 10 $a 3D vision for mobile robots: Mapping, exploration and telepresence.
260    $a Ann Arbor : $b ProQuest Dissertations & Theses, $c 2016
300    $a 170 p.
500    $a Source: Dissertation Abstracts International, Volume: 78-05(E), Section: B.
500    $a Adviser: Weihua Sheng.
502    $a Thesis (Ph.D.)--Oklahoma State University, 2016.
520    $a Visual perception is a fundamental task in many robotic applications. This dissertation focuses on developing a framework for mobile robots to intelligently perceive, explore and interact with 3D environments using an RGB-D camera. First, a motion-assisted 3D mapping method is proposed to improve the accuracy of 3D mapping. The motion data are fused with the point clouds after calibration and a multi-level ICP (Iterative Closest Point) method is applied for further refinement of the 3D map. Second, we propose a proactive strategy for robotic 3D mapping in which the robot consistently monitors its mapping performance and takes actions when the mapping performance deteriorates. A pattern is projected into the environment to increase the number of features when the pose estimation has large errors. Third, we propose a method to enable the robot to autonomously explore an unknown 3D indoor environment and efficiently generate a 3D map of the environment. The viewpoint planning is based on the frontiers extracted from a partial map and an entropy-based metric is used to select the next best viewpoint. Fourth, a human-robot collaborative 3D mapping framework is proposed which combines both robot's capability of quantitatively evaluating the map and human's high-level intelligence of global planning. The motion of the robot is fused using a Bayesian framework to improve the mapping accuracy. A self-recovery mechanism is introduced to improve the robustness. Fifth, a virtual-reality-based human-guided 3D mapping and virtual exploration framework is proposed and implemented using an Oculus Rift goggle. The 3D data are rendered to the goggle for stereoscopic display and the user's intentions are recognized to control the robot. Sixth, we implement a virtual-reality-based robotic telepresence system. The intentions of the user are inferred using hidden Markov models and four microphones are used to detect the direction of the sound. A two-stage collaborative control scheme is proposed to combine intention recognition and sound localization. Various experiments are conducted and the proposed intelligent robotic 3D system is fully evaluated for mapping, exploration and telepresence. Our work can be extended in many robotic applications.
590    $a School code: 0664.
650  4 $a Electrical engineering. $3 454503
650  4 $a Robotics. $3 181952
690    $a 0544
690    $a 0771
710 2  $a Oklahoma State University. $b Electrical Engineering. $3 492989
773 0  $t Dissertation Abstracts International $g 78-05B(E).
790    $a 0664
791    $a Ph.D.
792    $a 2016
793    $a English
856 40 $u http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=10188461
Holdings (1 item):
Barcode: 000000135110
Location: Electronic collection
Circulation category: Book
Material type: Thesis/dissertation
Call number: TH 2016
Use type: Normal
Loan status: On shelf
Holds: 0