3D vision for mobile robots: Mapping, exploration and telepresence.

Record Type: Electronic resources : Monograph/item
Title: 3D vision for mobile robots: Mapping, exploration and telepresence.
Author: Du, Jianhao.
Published: Ann Arbor : ProQuest Dissertations & Theses, 2016
Description: 170 p.
Notes: Source: Dissertation Abstracts International, Volume: 78-05(E), Section: B.
Notes: Adviser: Weihua Sheng.
Contained By: Dissertation Abstracts International, 78-05B(E).
Subject: Electrical engineering.
Online resource: http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=10188461
ISBN: 9781369360523
Thesis (Ph.D.)--Oklahoma State University, 2016.
Abstract: Visual perception is a fundamental task in many robotic applications. This dissertation focuses on developing a framework for mobile robots to intelligently perceive, explore and interact with 3D environments using an RGB-D camera. First, a motion-assisted 3D mapping method is proposed to improve the accuracy of 3D mapping. The motion data are fused with the point clouds after calibration and a multi-level ICP (Iterative Closest Point) method is applied for further refinement of the 3D map. Second, we propose a proactive strategy for robotic 3D mapping in which the robot consistently monitors its mapping performance and takes actions when the mapping performance deteriorates. A pattern is projected into the environment to increase the number of features when the pose estimation has large errors. Third, we propose a method to enable the robot to autonomously explore an unknown 3D indoor environment and efficiently generate a 3D map of the environment. The viewpoint planning is based on the frontiers extracted from a partial map and an entropy-based metric is used to select the next best viewpoint. Fourth, a human-robot collaborative 3D mapping framework is proposed which combines both the robot's capability of quantitatively evaluating the map and the human's high-level intelligence of global planning. The motion of the robot is fused using a Bayesian framework to improve the mapping accuracy. A self-recovery mechanism is introduced to improve the robustness. Fifth, a virtual-reality-based human-guided 3D mapping and virtual exploration framework is proposed and implemented using an Oculus Rift goggle. The 3D data are rendered to the goggle for stereoscopic display and the user's intentions are recognized to control the robot. Sixth, we implement a virtual-reality-based robotic telepresence system. The intentions of the user are inferred using hidden Markov models and four microphones are used to detect the direction of the sound. A two-stage collaborative control scheme is proposed to combine intention recognition and sound localization. Various experiments are conducted and the proposed intelligent robotic 3D system is fully evaluated for mapping, exploration and telepresence. Our work can be extended in many robotic applications.
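The abstract's third contribution pairs frontier extraction with an entropy-based metric for next-best-view selection. The sketch below is a minimal 2D illustration of that general idea, not the dissertation's actual implementation: the occupancy grid, thresholds, and function names are all invented for illustration, and the real system operates on 3D maps.

```python
import numpy as np

def cell_entropy(p):
    """Shannon entropy (bits) of occupancy probabilities; 0.5 = fully unknown."""
    p = np.clip(p, 1e-9, 1 - 1e-9)
    return -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

def find_frontiers(grid, unknown=0.5, free_thresh=0.3):
    """Frontier cells: free cells adjacent to at least one unknown cell."""
    frontiers = []
    h, w = grid.shape
    for r in range(h):
        for c in range(w):
            if grid[r, c] >= free_thresh:
                continue  # only free cells can be frontiers
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < h and 0 <= nc < w and grid[nr, nc] == unknown:
                    frontiers.append((r, c))
                    break
    return frontiers

def next_best_viewpoint(grid, frontiers, radius=3):
    """Pick the frontier whose neighborhood has the highest total entropy,
    i.e. the viewpoint expected to reveal the most unknown space."""
    best, best_gain = None, -1.0
    h, w = grid.shape
    for (r, c) in frontiers:
        r0, r1 = max(0, r - radius), min(h, r + radius + 1)
        c0, c1 = max(0, c - radius), min(w, c + radius + 1)
        gain = cell_entropy(grid[r0:r1, c0:c1]).sum()
        if gain > best_gain:
            best, best_gain = (r, c), gain
    return best, best_gain
```

On a toy grid whose left half is mapped free and right half is unknown, the frontiers lie along the boundary column, and the selected viewpoint is the boundary cell whose neighborhood contains the most unmapped cells.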
MARC Record:
LDR    03164nmm a2200289 4500
001    502172
005    20170619070734.5
008    170818s2016 ||||||||||||||||| ||eng d
020    $a 9781369360523
035    $a (MiAaPQ)AAI10188461
035    $a AAI10188461
040    $a MiAaPQ $c MiAaPQ
100 1  $a Du, Jianhao. $3 766235
245 10 $a 3D vision for mobile robots: Mapping, exploration and telepresence.
260    $a Ann Arbor : $b ProQuest Dissertations & Theses, $c 2016
300    $a 170 p.
500    $a Source: Dissertation Abstracts International, Volume: 78-05(E), Section: B.
500    $a Adviser: Weihua Sheng.
502    $a Thesis (Ph.D.)--Oklahoma State University, 2016.
520    $a Visual perception is a fundamental task in many robotic applications. This dissertation focuses on developing a framework for mobile robots to intelligently perceive, explore and interact with 3D environments using an RGB-D camera. First, a motion-assisted 3D mapping method is proposed to improve the accuracy of 3D mapping. The motion data are fused with the point clouds after calibration and a multi-level ICP (Iterative Closest Point) method is applied for further refinement of the 3D map. Second, we propose a proactive strategy for robotic 3D mapping in which the robot consistently monitors its mapping performance and takes actions when the mapping performance deteriorates. A pattern is projected into the environment to increase the number of features when the pose estimation has large errors. Third, we propose a method to enable the robot to autonomously explore an unknown 3D indoor environment and efficiently generate a 3D map of the environment. The viewpoint planning is based on the frontiers extracted from a partial map and an entropy-based metric is used to select the next best viewpoint. Fourth, a human-robot collaborative 3D mapping framework is proposed which combines both the robot's capability of quantitatively evaluating the map and the human's high-level intelligence of global planning. The motion of the robot is fused using a Bayesian framework to improve the mapping accuracy. A self-recovery mechanism is introduced to improve the robustness. Fifth, a virtual-reality-based human-guided 3D mapping and virtual exploration framework is proposed and implemented using an Oculus Rift goggle. The 3D data are rendered to the goggle for stereoscopic display and the user's intentions are recognized to control the robot. Sixth, we implement a virtual-reality-based robotic telepresence system. The intentions of the user are inferred using hidden Markov models and four microphones are used to detect the direction of the sound. A two-stage collaborative control scheme is proposed to combine intention recognition and sound localization. Various experiments are conducted and the proposed intelligent robotic 3D system is fully evaluated for mapping, exploration and telepresence. Our work can be extended in many robotic applications.
590    $a School code: 0664.
650  4 $a Electrical engineering. $3 454503
650  4 $a Robotics. $3 181952
690    $a 0544
690    $a 0771
710 2  $a Oklahoma State University. $b Electrical Engineering. $3 492989
773 0  $t Dissertation Abstracts International $g 78-05B(E).
790    $a 0664
791    $a Ph.D.
792    $a 2016
793    $a English
856 40 $u http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=10188461
Items (1 record):

Inventory Number: 000000135110
Location Name: Electronic Collection (電子館藏)
Item Class: Book (圖書)
Material type: Thesis (學位論文)
Call number: TH 2016
Usage Class: Normal (一般使用)
Loan Status: On shelf
No. of reservations: 0
Multimedia file: http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=10188461