<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://ims.ut.ee/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Robertvalner</id>
	<title>Intelligent Materials and Systems Lab - User contributions [en]</title>
	<link rel="self" type="application/atom+xml" href="https://ims.ut.ee/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Robertvalner"/>
	<link rel="alternate" type="text/html" href="https://ims.ut.ee/Special:Contributions/Robertvalner"/>
	<updated>2026-04-24T14:25:21Z</updated>
	<subtitle>User contributions</subtitle>
	<generator>MediaWiki 1.38.2</generator>
	<entry>
		<id>https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=19572</id>
		<title>Theses in Robotics</title>
		<link rel="alternate" type="text/html" href="https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=19572"/>
		<updated>2019-06-07T09:01:06Z</updated>

		<summary type="html">&lt;p&gt;Robertvalner: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:IMS-robotics]]&lt;br /&gt;
= Projects in Advanced Robotics =&lt;br /&gt;
''The main objective of the following projects is to give students experience in working with advanced robotics technology. Our group is active in several R&amp;amp;D projects involving human-robot collaboration, intuitive teleoperation of robots, and autonomous navigation of unmanned mobile platforms. Our main software platforms are [http://www.ros.org/ Robot Operating System (ROS)] for developing software for advanced robot systems and [http://gazebosim.org/ Gazebo] for running realistic robotic simulations.''&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
For further information, contact [[User:Karl|Karl Kruusamäe]]&lt;br /&gt;
&lt;br /&gt;
The following is not an exhaustive list of all available thesis/research topics.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Image:Youbot.png|200px|thumb|right|KUKA youBot]]&lt;br /&gt;
== Development of demonstrative and promotional applications for KUKA youBot ==&lt;br /&gt;
The goal of this project is to develop promotional use cases for the KUKA youBot that demonstrate the capabilities of modern robotics and inspire people to get involved with it.&lt;br /&gt;
Possible robotic demos include:&lt;br /&gt;
*demonstration of motion planning algorithms for mobile manipulation,&lt;br /&gt;
*using 3D vision for human and/or environment detection,&lt;br /&gt;
*interactive navigation,&lt;br /&gt;
*autonomous path planning,&lt;br /&gt;
*different pick-and-place applications,&lt;br /&gt;
*and human-robot collaboration.&lt;br /&gt;
&lt;br /&gt;
[[Image:Ur5 left.png|200px|thumb|left|Universal Robots UR5]]&lt;br /&gt;
&lt;br /&gt;
== Development of demonstrative and promotional applications for Universal Robots UR5 ==&lt;br /&gt;
Sample demonstrations include:&lt;br /&gt;
*autonomous pick-and-place,&lt;br /&gt;
*load-assistance for human-robot collaboration,&lt;br /&gt;
*packaging,&lt;br /&gt;
*physical compliance during human-robot interaction,&lt;br /&gt;
*tracing an object's surface during scanning,&lt;br /&gt;
*robotic kitting,&lt;br /&gt;
*grinding of non-flat surfaces.&lt;br /&gt;
&lt;br /&gt;
== Development of demonstrative and promotional applications for Clearpath Jackal ==&lt;br /&gt;
Sample demonstrations include:&lt;br /&gt;
*human-robot interaction,&lt;br /&gt;
*multi-robot mapping,&lt;br /&gt;
*autonomous driving.&lt;br /&gt;
&lt;br /&gt;
== ROS support and educational materials for open-source mobile robot ==&lt;br /&gt;
[[Image:RosLarge.png|left|100px|ROS]]&lt;br /&gt;
The goal of the project is to develop [http://www.ros.org/ Robot Operating System] (ROS) wrapper functions for the drivers of an open-source mobile robot platform that has been successfully used in Robotex competitions. Additionally, educational materials must be developed to integrate the robot platform into the ROS ecosystem. The outcome of this work will open up many educational uses and rapid prototyping opportunities for the platform.&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Ros_equation.png|x100px|What is ROS?]]&lt;br /&gt;
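To give a concrete flavour of the kind of function such a driver wrapper would expose, here is a minimal sketch (plain Python, no ROS dependency) of three-wheel omnidirectional drive kinematics; the wheel angles and base radius below are illustrative assumptions, not the parameters of any specific platform.

```python
import math

# Illustrative three-wheel omni-drive kinematics: convert a body-frame
# velocity command (vx, vy, wz) into individual wheel surface speeds.
# Wheel placement angles and base radius are example values only.
WHEEL_ANGLES = (math.radians(90), math.radians(210), math.radians(330))
BASE_RADIUS = 0.15  # metres from robot centre to each wheel (assumed)

def wheel_speeds(vx, vy, wz):
    """Return the linear speed each omni wheel must roll at (m/s)."""
    return tuple(
        -math.sin(a) * vx + math.cos(a) * vy + BASE_RADIUS * wz
        for a in WHEEL_ANGLES
    )
```

A pure rotation command, for instance, drives all three wheels at the same speed, while a pure forward command loads the wheels according to their mounting angles.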
&lt;br /&gt;
== Detecting features of urban and off-road surroundings ==&lt;br /&gt;
Accurate navigation of self-driving unmanned robotic platforms requires identification of traversable terrain. A combined analysis of point-cloud data with RGB information of the robot's environment can help autonomous systems make correct decisions. The goal of this work is to develop algorithms for terrain classification.&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Rtab-map.png|x90px|Mapping]]&lt;br /&gt;
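As a toy illustration of the terrain-classification idea, one simple geometric cue is the spread of point-cloud heights inside a grid cell; the threshold and labels below are invented for illustration, not part of any proposed algorithm.

```python
# Label a grid cell traversable when the spread of point heights
# (z-values, in metres) inside it is small. Threshold is illustrative.
def classify_cell(heights, max_spread=0.05):
    """heights: z-values (m) of the points falling into one grid cell."""
    if not heights:
        return "unknown"
    spread = max(heights) - min(heights)
    return "obstacle" if spread > max_spread else "traversable"
```

A real classifier would fuse such geometric features with RGB information, as described above.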
&lt;br /&gt;
== Robotic simulations (in Gazebo) ==&lt;br /&gt;
1) Developing large area simulation worlds for mobile robotics. In order to develop robot navigation algorithms, it is more time- and cost-efficient to test robot behavior in a wide range of realistically simulated worlds. These simulated worlds include both indoor and outdoor environments. This work focuses on designing robot simulation environments using an open-source platform [http://gazebosim.org Gazebo].&amp;lt;br&amp;gt;&lt;br /&gt;
2) Humans in Gazebo. Integrating walking and gesturing humans to Gazebo.&amp;lt;br&amp;gt;&lt;br /&gt;
3) Tracked robots in Gazebo. Creating tracked robots in Gazebo and testing different track configurations.&amp;lt;br&amp;gt;&lt;br /&gt;
4) Robot basketball simulation for game strategy and shot accuracy.&amp;lt;br&amp;gt;&lt;br /&gt;
5) Robotont at the Institute of Technology.&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Gazebo.png|x90px|Gazebo]] [[Image:Robonaut-2-simulator.png|x90px|NASA Robonaut simulation in Gazebo]]&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Follow-the-leader robotic demo ==&lt;br /&gt;
[[Image:LeapMotion.png|200px|thumb|right|Detecting 2 hands with Leap Motion Controller]]&lt;br /&gt;
The idea is to create a robotic demonstration where a mobile robot is using Kinect or similar depth-camera for identifying a person and then starts following that person. The project will be implemented using Robot Operating System (ROS) on either KUKA youbot or similar mobile robot platform.&lt;br /&gt;
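The core of such a demo is a small control loop: given the detected person's position in the camera frame, emit a velocity command that keeps a fixed following distance. The sketch below shows the idea with proportional control; the gains, target distance, and sign conventions are illustrative assumptions.

```python
# Minimal follow-the-leader control sketch. person_x is the lateral
# offset of the person (m, positive to the right of the camera),
# person_z the distance ahead (m). Gains are example values.
def follow_cmd(person_x, person_z, target_dist=1.5,
               k_lin=0.8, k_ang=1.2):
    """Return (linear, angular) velocity commands."""
    linear = k_lin * (person_z - target_dist)  # close the range error
    angular = -k_ang * person_x                # turn toward the person
    return linear, angular
```

In a ROS implementation the returned pair would be published as a velocity message, with the person's position coming from the depth-camera tracker.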
&lt;br /&gt;
== Detecting hand signals for intuitive human-robot interface ==&lt;br /&gt;
This project involves creating ROS libraries for using either a [https://www.leapmotion.com/ Leap Motion Controller] or an [http://www.intel.com/content/www/us/en/architecture-and-technology/realsense-overview.html RGB-D camera] to detect most common human hand signals (e.g., thumbs up, thumbs down, all clear, pointing into distance, inviting).&lt;br /&gt;
&lt;br /&gt;
== Virtual reality user interface (VRUI) for intuitive teleoperation system ==&lt;br /&gt;
Adding [http://www.osvr.org/hardware-devs.html virtual reality capability] to a [https://www.youtube.com/watch?v=L25HHFd00rc gesture- and natural-language-based robot teleoperation system].&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Temoto-working-principle-in-pics.png|x160px|Gesture-based teleoperation]]&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
{{doi-inline|10.1109/HSI.2016.7529630|Kruusamäe et al. (2016) High-precision telerobot with human-centered variable perspective and scalable gestural interface}}&lt;br /&gt;
&lt;br /&gt;
== Health monitor for intuitive telerobot ==&lt;br /&gt;
Intelligent status and error handling for an intuitive telerobotic system.&lt;br /&gt;
&lt;br /&gt;
== Dynamic stitching for achieving 360° FOV ==&lt;br /&gt;
Automated stitching of images from multiple camera sources to achieve a 360° field of view during mobile telerobotic inspection of remote areas.&lt;br /&gt;
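A useful back-of-the-envelope calculation for this topic is how many cameras of a given horizontal field of view are needed so that adjacent images still overlap enough for stitching; the overlap figure below is an assumed example, not a requirement of any particular stitching method.

```python
import math

# Each camera effectively contributes (hfov - overlap) degrees of
# unique coverage; full coverage needs 360° of unique coverage.
def cameras_needed(hfov_deg, overlap_deg=10.0):
    """Cameras required for a full 360° ring with stitching overlap."""
    effective = hfov_deg - overlap_deg
    if effective > 0:
        return math.ceil(360.0 / effective)
    raise ValueError("overlap must be smaller than the camera FOV")
```

For example, 90° cameras with 10° of overlap between neighbours would need five units to close the ring.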
&lt;br /&gt;
== 3D scanning of industrial objects ==&lt;br /&gt;
Using laser sensors and cameras to create accurate models of industrial products for quality control or further processing.&lt;br /&gt;
&lt;br /&gt;
== Modeling humans for human-robot interaction ==&lt;br /&gt;
True human-robot collaboration means that the robot must understand the actions, intention, and state of its human partner. This work involves using cameras and other sensors for digitally representing and modeling humans. There are multiple stages of modeling: a) physical models of human kinematics and dynamics; b) higher-level models for recognizing human intent.&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Skeletal ROS.PNG|x160px|ROS &amp;amp; Kinect &amp;amp; Skeleton-Markers Package]]&lt;br /&gt;
&lt;br /&gt;
== ROS wrapper for Estonian Speech Synthesizer ==&lt;br /&gt;
Creating a ROS package that enables robots to speak in Estonian. The basis of the work is the existing [https://www.eki.ee/heli/index.php?option=com_content&amp;amp;view=article&amp;amp;id=6&amp;amp;Itemid=465 Estonian language speech synthesizer], which needs to be integrated with the ROS [http://wiki.ros.org/sound_play sound_play] package or provided as a stand-alone ROS wrapper package.&lt;br /&gt;
&lt;br /&gt;
== Robotic avatar for telepresence ==&lt;br /&gt;
Integrating hand gestures and head movements to control a robot avatar in virtual reality user interface.&lt;br /&gt;
&lt;br /&gt;
== ROS driver for Artificial Muscle actuators ==&lt;br /&gt;
Designing a controller box and writing software for interfacing artificial muscle actuators [{{doi-inline|10.3390/act4010017|1}}, [https://www.youtube.com/watch?v=tspg_l49hSA&amp;amp;index=10&amp;amp;list=UU186z2gc0XiLh12hNvPZdUQ 2]] with ROS.&lt;br /&gt;
&lt;br /&gt;
== Upgrading KUKA youBot ==&lt;br /&gt;
KUKA youBot is an omnidirectional mobile manipulator platform. The project involves replacing the current onboard computer with a more capable one, installing newer versions of Ubuntu Linux and ROS, testing core capabilities, and creating documentation for the robot.&lt;br /&gt;
&lt;br /&gt;
== Development of strategies for inter-robot knowledge representation ==&lt;br /&gt;
An inseparable part of making robots work together is enabling them to share knowledge about the surrounding environment and each robot's intentions. This project focuses on developing and testing methodologies for inter-robot knowledge representation. This project is developed as a subsystem of [https://utnuclearroboticspublic.github.io/temoto2/index.html TeMoto].&lt;br /&gt;
&lt;br /&gt;
== TeMoto based smart home control ==&lt;br /&gt;
The project involves designing an open-source ROS+[https://utnuclearroboticspublic.github.io/temoto2/index.html TeMoto] based scalable smart home controller.&lt;br /&gt;
&lt;br /&gt;
== Detection of hardware and software resources for smart integration of robots ==&lt;br /&gt;
The vast majority of today’s robotic applications rely on hard-coded device and algorithm usage. This project focuses on developing Resource Snooper software that can detect the addition or removal of resources for the benefit of dynamic reconfiguration of robotic systems. This project is developed as a subsystem of [https://utnuclearroboticspublic.github.io/temoto2/index.html TeMoto].&lt;br /&gt;
&lt;br /&gt;
== Sonification of feedback during teleoperation of robots ==&lt;br /&gt;
Humans are used to receiving auditory feedback in their everyday lives. It helps us make decisions and be aware of potential dangers. Telerobotic interfaces can deploy the same idea to improve situational awareness and robotic task efficiency. The thesis project involves a study of different sonification solutions and the implementation of one in a telerobotic application using ROS.&lt;br /&gt;
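One common sonification mapping, shown here purely as an illustration, is pitch-coded proximity: the closer an obstacle, the higher the alert tone. The frequency range and distance limits below are assumed values.

```python
# Map an obstacle distance to an alert-tone frequency: near obstacles
# produce high pitches, far ones low pitches. All limits are assumed.
def distance_to_pitch(dist_m, d_min=0.2, d_max=3.0,
                      f_low=220.0, f_high=880.0):
    """Linearly map a distance in [d_min, d_max] to a tone in Hz."""
    d = min(max(dist_m, d_min), d_max)
    t = (d - d_min) / (d_max - d_min)     # 0 = very close, 1 = far
    return f_high - t * (f_high - f_low)  # close maps to high pitch
```

In a ROS system the resulting frequency would drive an audio output node while the distance comes from the robot's range sensors.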
&lt;br /&gt;
== Human-Robot and Robot-Robot collaboration applications ==&lt;br /&gt;
Creating a demo or analysis of software capabilities related to human-robot or robot-robot teams:&lt;br /&gt;
* human-robot collaborative assembly&lt;br /&gt;
* distributed mapping; analysis and demo of existing ROS (e.g., segmap https://youtu.be/JJhEkIA1xSE) packages for multi-robot mapping&lt;br /&gt;
* Inaccessible region teamwork&lt;br /&gt;
**youbot+drone - a drone maps the environment (for example a maze) and ground vehicle uses this information to traverse the maze&lt;br /&gt;
**youbot+clearbot - youbot cannot go up ledges but it can lift smaller robot, such as clearbot, up a ledge.&lt;br /&gt;
&lt;br /&gt;
== Developing ROS driver for a robotic gripper ==&lt;br /&gt;
The goal of this project is to develop ROS drivers for the LEHF32K2-64 gripper. The work is concluded by demonstrating the functionality of the gripper via a pick-and-place task.&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Smc gripper.jpg|120px|SMC LEHF32K2-64 gripper.]]&lt;br /&gt;
&lt;br /&gt;
== Mirroring human hand movements on industrial robots ==&lt;br /&gt;
The goal of this project is to integrate continuous control of an industrial robot manipulator with a gestural telerobotics interface. The recommended tools for this thesis project are a Leap Motion Controller or a standard web camera, a Universal Robots UR5 manipulator, and ROS.&lt;br /&gt;
&lt;br /&gt;
== ROS2-based robotics demo ==&lt;br /&gt;
Converting ROS demos and tutorials to ROS2.&lt;br /&gt;
&lt;br /&gt;
== ROS2 for robotont ==&lt;br /&gt;
Creating ROS2 support for the robotont mobile platform.&lt;br /&gt;
&lt;br /&gt;
== TeMoto for robotont ==&lt;br /&gt;
Swarm-management for robotont using [https://temoto-telerobotics.github.io TeMoto] framework.&lt;br /&gt;
&lt;br /&gt;
== 3D lidar for mobile robotics ==&lt;br /&gt;
Analysing the technical characteristics of a 3D lidar. Designing and constructing a mount for the Ouster OS-1 lidar and validating its applicability in indoor and outdoor scenarios.&lt;br /&gt;
&lt;br /&gt;
== Making KUKA youBot user friendly again ==&lt;br /&gt;
This thesis focuses on integrating the low-level software capabilities of KUKA youBot in order to achieve high-level commonly used functionalities such as &lt;br /&gt;
* teach mode - robot can replicate user demonstrated trajectories&lt;br /&gt;
* end-effector jogging&lt;br /&gt;
* gripper control&lt;br /&gt;
* gamepad integration - user can control the robot via gamepad&lt;br /&gt;
* web integration - user can control the robot via internet browser&lt;br /&gt;
&lt;br /&gt;
The thesis is suitable for both master's and bachelor's level, as the associated code can be scaled up to a generic &amp;quot;user-friendly control&amp;quot; package.&lt;br /&gt;
&lt;br /&gt;
== Flexible peer-to-peer network infrastructure for environments with restricted signal coverage==&lt;br /&gt;
A very common issue with robotics in real-world environments is that network coverage is highly dependent on the environment. This makes robot-to-base-station or robot-to-robot communication unreliable, potentially compromising the whole mission. This thesis focuses on implementing a peer-to-peer network system on mobile robot platforms, where the platforms extend the network coverage between, e.g., an operator and a worker robot. The work will be demonstrated in a real-world setting where common networking strategies for teleoperation (tethered or single-router based) do not work.&lt;br /&gt;
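Finding a chain of relays is, at its core, a shortest-path search over the current connectivity graph. The sketch below shows that idea with a breadth-first search; the node names and topology are a made-up example.

```python
from collections import deque

# Given which nodes can currently hear each other, find a chain of
# hops from the operator to the worker robot (fewest hops first).
def relay_path(links, src, dst):
    """links: dict mapping node -> set of directly reachable nodes."""
    prev, queue = {src: None}, deque([src])
    while queue:
        node = queue.popleft()
        if node == dst:
            path = []
            while node is not None:
                path.append(node)
                node = prev[node]
            return path[::-1]
        for nxt in links.get(node, ()):
            if nxt not in prev:
                prev[nxt] = node
                queue.append(nxt)
    return None  # no connectivity, even via relays
```

A real peer-to-peer stack (e.g., a mesh network daemon) handles this routing continuously as robots move; the function above only illustrates the path-finding step.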
&lt;br /&gt;
= Completed projects =&lt;br /&gt;
== Master's theses ==&lt;br /&gt;
*Madis K Nigol, tba&lt;br /&gt;
*Renno Raudmäe, tba&lt;br /&gt;
*Asif Sattar, tba&lt;br /&gt;
*Ragnar Margus, tba&lt;br /&gt;
*Pavel Šumejko, tba&lt;br /&gt;
*Dzvezdana Arsovska, tba&lt;br /&gt;
*Tõnis Tiimus, [http://hdl.handle.net/10062/60713 Visuaalselt esilekutsutud potentsiaalidel põhinev aju-arvuti liides robootika rakendustes (A VEP-based BCI for robotics applications)], MS thesis, 2018&lt;br /&gt;
*Martin Appo, [http://hdl.handle.net/10062/60718 Hardware-agnostic compliant control ROS package for collaborative industrial manipulators (Riistvarapaindlik ROSi tarkvarapakett tööstuslike robotite mööndlikuks juhtimiseks)], MS thesis, 2018&lt;br /&gt;
*Hassan Mahmoud Shehawy Elhanash, [http://hdl.handle.net/10062/60711 Optical Tracking of Forearm for Classifying Fingers Poses (Küünarvarre visuaalne jälgimine ennustamaks sõrmede asendeid)], MS thesis, 2018&lt;br /&gt;
&lt;br /&gt;
== Bachelor's theses ==&lt;br /&gt;
*Meelis Pihlap, tba&lt;br /&gt;
*Kaarel Mark, tba&lt;br /&gt;
*Kätriin Julle, tba&lt;br /&gt;
*Georg Astok, tba&lt;br /&gt;
*Martin Hallist, tba&lt;br /&gt;
*Ahmed Helmi, tba&lt;br /&gt;
*Kristo Allaje, [http://hdl.handle.net/10062/60294 Draiveripakett käte jälgimise seadme Leap Motion™ kontroller kasutamiseks robootika arendusplatvormil ROS], BS thesis, 2018&lt;br /&gt;
*Martin Maidla, [http://hdl.handle.net/10062/60288 Avatud robotiarendusplatvormi Robotont omniliikumise ja odomeetria arendamine], BS thesis, 2018&lt;br /&gt;
*Raid Vellerind, [http://hdl.handle.net/10062/56559 Avatud robotiarendusplatvormi ROS võimekuse loomine Tartu Ülikooli Robotexi robootikaplatvormile], BS thesis, 2017&lt;/div&gt;</summary>
		<author><name>Robertvalner</name></author>
	</entry>
	<entry>
		<id>https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=19568</id>
		<title>Theses in Robotics</title>
		<link rel="alternate" type="text/html" href="https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=19568"/>
		<updated>2019-06-03T13:16:11Z</updated>

		<summary type="html">&lt;p&gt;Robertvalner: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:IMS-robotics]]&lt;br /&gt;
= Projects in Advanced Robotics =&lt;br /&gt;
''The main objective of the following projects is to give students experience in working with advanced robotics technology. Our group is active in several R&amp;amp;D projects involving human-robot collaboration, intuitive teleoperation of robots, and autonomous navigation of unmanned mobile platforms. Our main software platforms are [http://www.ros.org/ Robot Operating System (ROS)] for developing software for advanced robot systems and [http://gazebosim.org/ Gazebo] for running realistic robotic simulations.''&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
For further information, contact [[User:Karl|Karl Kruusamäe]]&lt;br /&gt;
&lt;br /&gt;
The following is not an exhaustive list of all available thesis/research topics.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Image:Youbot.png|200px|thumb|right|KUKA youBot]]&lt;br /&gt;
== Development of demonstrative and promotional applications for KUKA youBot ==&lt;br /&gt;
The goal of this project is to develop promotional use cases for the KUKA youBot that demonstrate the capabilities of modern robotics and inspire people to get involved with it.&lt;br /&gt;
Possible robotic demos include:&lt;br /&gt;
*demonstration of motion planning algorithms for mobile manipulation,&lt;br /&gt;
*using 3D vision for human and/or environment detection,&lt;br /&gt;
*interactive navigation,&lt;br /&gt;
*autonomous path planning,&lt;br /&gt;
*different pick-and-place applications,&lt;br /&gt;
*and human-robot collaboration.&lt;br /&gt;
&lt;br /&gt;
[[Image:Ur5 left.png|200px|thumb|left|Universal Robots UR5]]&lt;br /&gt;
&lt;br /&gt;
== Development of demonstrative and promotional applications for Universal Robots UR5 ==&lt;br /&gt;
Sample demonstrations include:&lt;br /&gt;
*autonomous pick-and-place,&lt;br /&gt;
*load-assistance for human-robot collaboration,&lt;br /&gt;
*packaging,&lt;br /&gt;
*physical compliance during human-robot interaction,&lt;br /&gt;
*tracing an object's surface during scanning,&lt;br /&gt;
*robotic kitting,&lt;br /&gt;
*grinding of non-flat surfaces.&lt;br /&gt;
&lt;br /&gt;
== Development of demonstrative and promotional applications for Clearpath Jackal ==&lt;br /&gt;
Sample demonstrations include:&lt;br /&gt;
*human-robot interaction,&lt;br /&gt;
*multi-robot mapping,&lt;br /&gt;
*autonomous driving.&lt;br /&gt;
&lt;br /&gt;
== ROS support and educational materials for open-source mobile robot ==&lt;br /&gt;
[[Image:RosLarge.png|left|100px|ROS]]&lt;br /&gt;
The goal of the project is to develop [http://www.ros.org/ Robot Operating System] (ROS) wrapper functions for the drivers of an open-source mobile robot platform that has been successfully used in Robotex competitions. Additionally, educational materials must be developed to integrate the robot platform into the ROS ecosystem. The outcome of this work will open up many educational uses and rapid prototyping opportunities for the platform.&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Ros_equation.png|x100px|What is ROS?]]&lt;br /&gt;
&lt;br /&gt;
== Detecting features of urban and off-road surroundings ==&lt;br /&gt;
Accurate navigation of self-driving unmanned robotic platforms requires identification of traversable terrain. A combined analysis of point-cloud data with RGB information of the robot's environment can help autonomous systems make correct decisions. The goal of this work is to develop algorithms for terrain classification.&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Rtab-map.png|x90px|Mapping]]&lt;br /&gt;
&lt;br /&gt;
== Robotic simulations (in Gazebo) ==&lt;br /&gt;
1) Developing large area simulation worlds for mobile robotics. In order to develop robot navigation algorithms, it is more time- and cost-efficient to test robot behavior in a wide range of realistically simulated worlds. These simulated worlds include both indoor and outdoor environments. This work focuses on designing robot simulation environments using an open-source platform [http://gazebosim.org Gazebo].&amp;lt;br&amp;gt;&lt;br /&gt;
2) Humans in Gazebo. Integrating walking and gesturing humans to Gazebo.&amp;lt;br&amp;gt;&lt;br /&gt;
3) Tracked robots in Gazebo. Creating tracked robots in Gazebo and testing different track configurations.&amp;lt;br&amp;gt;&lt;br /&gt;
4) Robot basketball simulation for game strategy and shot accuracy.&amp;lt;br&amp;gt;&lt;br /&gt;
5) Robotont at the Institute of Technology.&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Gazebo.png|x90px|Gazebo]] [[Image:Robonaut-2-simulator.png|x90px|NASA Robonaut simulation in Gazebo]]&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Follow-the-leader robotic demo ==&lt;br /&gt;
[[Image:LeapMotion.png|200px|thumb|right|Detecting 2 hands with Leap Motion Controller]]&lt;br /&gt;
The idea is to create a robotic demonstration where a mobile robot is using Kinect or similar depth-camera for identifying a person and then starts following that person. The project will be implemented using Robot Operating System (ROS) on either KUKA youbot or similar mobile robot platform.&lt;br /&gt;
&lt;br /&gt;
== Detecting hand signals for intuitive human-robot interface ==&lt;br /&gt;
This project involves creating ROS libraries for using either a [https://www.leapmotion.com/ Leap Motion Controller] or an [http://www.intel.com/content/www/us/en/architecture-and-technology/realsense-overview.html RGB-D camera] to detect most common human hand signals (e.g., thumbs up, thumbs down, all clear, pointing into distance, inviting).&lt;br /&gt;
&lt;br /&gt;
== Virtual reality user interface (VRUI) for intuitive teleoperation system ==&lt;br /&gt;
Adding [http://www.osvr.org/hardware-devs.html virtual reality capability] to a [https://www.youtube.com/watch?v=L25HHFd00rc gesture- and natural-language-based robot teleoperation system].&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Temoto-working-principle-in-pics.png|x160px|Gesture-based teleoperation]]&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
{{doi-inline|10.1109/HSI.2016.7529630|Kruusamäe et al. (2016) High-precision telerobot with human-centered variable perspective and scalable gestural interface}}&lt;br /&gt;
&lt;br /&gt;
== Health monitor for intuitive telerobot ==&lt;br /&gt;
Intelligent status and error handling for an intuitive telerobotic system.&lt;br /&gt;
&lt;br /&gt;
== Dynamic stitching for achieving 360° FOV ==&lt;br /&gt;
Automated stitching of images from multiple camera sources to achieve a 360° field of view during mobile telerobotic inspection of remote areas.&lt;br /&gt;
&lt;br /&gt;
== 3D scanning of industrial objects ==&lt;br /&gt;
Using laser sensors and cameras to create accurate models of industrial products for quality control or further processing.&lt;br /&gt;
&lt;br /&gt;
== Modeling humans for human-robot interaction ==&lt;br /&gt;
True human-robot collaboration means that the robot must understand the actions, intention, and state of its human partner. This work involves using cameras and other sensors for digitally representing and modeling humans. There are multiple stages of modeling: a) physical models of human kinematics and dynamics; b) higher-level models for recognizing human intent.&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Skeletal ROS.PNG|x160px|ROS &amp;amp; Kinect &amp;amp; Skeleton-Markers Package]]&lt;br /&gt;
&lt;br /&gt;
== ROS wrapper for Estonian Speech Synthesizer ==&lt;br /&gt;
Creating a ROS package that enables robots to speak in Estonian. The basis of the work is the existing [https://www.eki.ee/heli/index.php?option=com_content&amp;amp;view=article&amp;amp;id=6&amp;amp;Itemid=465 Estonian language speech synthesizer], which needs to be integrated with the ROS [http://wiki.ros.org/sound_play sound_play] package or provided as a stand-alone ROS wrapper package.&lt;br /&gt;
&lt;br /&gt;
== Robotic avatar for telepresence ==&lt;br /&gt;
Integrating hand gestures and head movements to control a robot avatar in virtual reality user interface.&lt;br /&gt;
&lt;br /&gt;
== ROS driver for Artificial Muscle actuators ==&lt;br /&gt;
Designing a controller box and writing software for interfacing artificial muscle actuators [{{doi-inline|10.3390/act4010017|1}}, [https://www.youtube.com/watch?v=tspg_l49hSA&amp;amp;index=10&amp;amp;list=UU186z2gc0XiLh12hNvPZdUQ 2]] with ROS.&lt;br /&gt;
&lt;br /&gt;
== Upgrading KUKA youBot ==&lt;br /&gt;
KUKA youBot is an omnidirectional mobile manipulator platform. The project involves replacing the current onboard computer with a more capable one, installing newer versions of Ubuntu Linux and ROS, testing core capabilities, and creating documentation for the robot.&lt;br /&gt;
&lt;br /&gt;
== Development of strategies for inter-robot knowledge representation ==&lt;br /&gt;
An inseparable part of making robots work together is enabling them to share knowledge about the surrounding environment and each robot's intentions. This project focuses on developing and testing methodologies for inter-robot knowledge representation. This project is developed as a subsystem of [https://utnuclearroboticspublic.github.io/temoto2/index.html TeMoto].&lt;br /&gt;
&lt;br /&gt;
== TeMoto based smart home control ==&lt;br /&gt;
The project involves designing an open-source ROS+[https://utnuclearroboticspublic.github.io/temoto2/index.html TeMoto] based scalable smart home controller.&lt;br /&gt;
&lt;br /&gt;
== Detection of hardware and software resources for smart integration of robots ==&lt;br /&gt;
The vast majority of today’s robotic applications rely on hard-coded device and algorithm usage. This project focuses on developing Resource Snooper software that can detect the addition or removal of resources for the benefit of dynamic reconfiguration of robotic systems. This project is developed as a subsystem of [https://utnuclearroboticspublic.github.io/temoto2/index.html TeMoto].&lt;br /&gt;
&lt;br /&gt;
== Sonification of feedback during teleoperation of robots ==&lt;br /&gt;
Humans are used to receiving auditory feedback in their everyday lives. It helps us make decisions and be aware of potential dangers. Telerobotic interfaces can deploy the same idea to improve situational awareness and robotic task efficiency. The thesis project involves a study of different sonification solutions and the implementation of one in a telerobotic application using ROS.&lt;br /&gt;
&lt;br /&gt;
== Human-Robot and Robot-Robot collaboration applications ==&lt;br /&gt;
Creating a demo or analysis of software capabilities related to human-robot or robot-robot teams:&lt;br /&gt;
* human-robot collaborative assembly&lt;br /&gt;
* distributed mapping; analysis and demo of existing ROS (e.g., segmap https://youtu.be/JJhEkIA1xSE) packages for multi-robot mapping&lt;br /&gt;
&lt;br /&gt;
== Developing ROS driver for a robotic gripper ==&lt;br /&gt;
The goal of this project is to develop ROS drivers for the LEHF32K2-64 gripper. The work is concluded by demonstrating the functionality of the gripper via a pick-and-place task.&amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Smc gripper.jpg|120px|SMC LEHF32K2-64 gripper.]]&lt;br /&gt;
&lt;br /&gt;
== Mirroring human hand movements on industrial robots ==&lt;br /&gt;
The goal of this project is to integrate continuous control of an industrial robot manipulator with a gestural telerobotics interface. The recommended tools for this thesis project are a Leap Motion Controller or a standard web camera, a Universal Robots UR5 manipulator, and ROS.&lt;br /&gt;
&lt;br /&gt;
== ROS2-based robotics demo ==&lt;br /&gt;
Converting ROS demos and tutorials to ROS2.&lt;br /&gt;
&lt;br /&gt;
== ROS2 for robotont ==&lt;br /&gt;
Creating ROS2 support for the robotont mobile platform.&lt;br /&gt;
&lt;br /&gt;
== TeMoto for robotont ==&lt;br /&gt;
Swarm-management for robotont using [https://temoto-telerobotics.github.io TeMoto] framework.&lt;br /&gt;
&lt;br /&gt;
== 3D lidar for mobile robotics ==&lt;br /&gt;
Analysing the technical characteristics of a 3D lidar. Designing and constructing a mount for the Ouster OS-1 lidar and validating its applicability in indoor and outdoor scenarios.&lt;br /&gt;
&lt;br /&gt;
== Making KUKA youBot user friendly again ==&lt;br /&gt;
This thesis focuses on integrating the low-level software capabilities of KUKA youBot in order to achieve high-level commonly used functionalities such as &lt;br /&gt;
* teach mode - robot can replicate user demonstrated trajectories&lt;br /&gt;
* end-effector jogging&lt;br /&gt;
* gripper control&lt;br /&gt;
* gamepad integration - user can control the robot via gamepad&lt;br /&gt;
* web integration - user can control the robot via internet browser&lt;br /&gt;
&lt;br /&gt;
The thesis is suitable for both master's and bachelor's level, as the associated code can be scaled up to a generic &amp;quot;user-friendly control&amp;quot; package.&lt;br /&gt;
&lt;br /&gt;
= Completed projects =&lt;br /&gt;
== Master's theses ==&lt;br /&gt;
*Madis K Nigol, tba&lt;br /&gt;
*Renno Raudmäe, tba&lt;br /&gt;
*Asif Sattar, tba&lt;br /&gt;
*Ragnar Margus, tba&lt;br /&gt;
*Pavel Šumejko, tba&lt;br /&gt;
*Dzvezdana Arsovska, tba&lt;br /&gt;
*Tõnis Tiimus, [http://hdl.handle.net/10062/60713 Visuaalselt esilekutsutud potentsiaalidel põhinev aju-arvuti liides robootika rakendustes (A VEP-based BCI for robotics applications)], MS thesis, 2018&lt;br /&gt;
*Martin Appo, [http://hdl.handle.net/10062/60718 Hardware-agnostic compliant control ROS package for collaborative industrial manipulators (Riistvarapaindlik ROSi tarkvarapakett tööstuslike robotite mööndlikuks juhtimiseks)], MS thesis, 2018&lt;br /&gt;
*Hassan Mahmoud Shehawy Elhanash, [http://hdl.handle.net/10062/60711 Optical Tracking of Forearm for Classifying Fingers Poses (Küünarvarre visuaalne jälgimine ennustamaks sõrmede asendeid)], MS thesis, 2018&lt;br /&gt;
&lt;br /&gt;
== Bachelor's theses ==&lt;br /&gt;
*Meelis Pihlap, tba&lt;br /&gt;
*Kaarel Mark, tba&lt;br /&gt;
*Kätriin Julle, tba&lt;br /&gt;
*Georg Astok, tba&lt;br /&gt;
*Martin Hallist, tba&lt;br /&gt;
*Ahmed Helmi, tba&lt;br /&gt;
*Kristo Allaje, [http://hdl.handle.net/10062/60294 Draiveripakett käte jälgimise seadme Leap Motion™ kontroller kasutamiseks robootika arendusplatvormil ROS], BS thesis, 2018&lt;br /&gt;
*Martin Maidla, [http://hdl.handle.net/10062/60288 Avatud robotiarendusplatvormi Robotont omniliikumise ja odomeetria arendamine], BS thesis, 2018&lt;br /&gt;
*Raid Vellerind, [http://hdl.handle.net/10062/56559 Avatud robotiarendusplatvormi ROS võimekuse loomine Tartu Ülikooli Robotexi robootikaplatvormile], BS thesis, 2017&lt;/div&gt;</summary>
		<author><name>Robertvalner</name></author>
	</entry>
	<entry>
		<id>https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=18363</id>
		<title>Theses in Robotics</title>
		<link rel="alternate" type="text/html" href="https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=18363"/>
		<updated>2018-06-08T10:25:59Z</updated>

		<summary type="html">&lt;p&gt;Robertvalner: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:IMS-robotics]]&lt;br /&gt;
= Projects in Advanced Robotics =&lt;br /&gt;
''The main objective of the following projects is to give students experience in working with advanced robotics technology. Our group is active in several R&amp;amp;D projects involving human-robot collaboration, intuitive teleoperation of robots, and autonomous navigation of unmanned mobile platforms. Our main software platforms are [http://www.ros.org/ Robot Operating System (ROS)] for developing software for advanced robot systems and [http://www.gazebosim.org Gazebo] for running realistic robotic simulations.''&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
For further information, contact [[User:Karl|Karl Kruusamäe]]&lt;br /&gt;
&lt;br /&gt;
The following is not an exhaustive list of all available thesis/research topics.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Image:Youbot.png|200px|thumb|right|KUKA youBot]]&lt;br /&gt;
== Development of demonstrative and promotional applications for KUKA youBot ==&lt;br /&gt;
The goal of this project is to develop promotional use cases for the KUKA youBot that demonstrate the capabilities of modern robotics and inspire people to get involved with it. &lt;br /&gt;
The list of possible robotic demos includes:&lt;br /&gt;
*using 3D vision for human and/or environment detection,&lt;br /&gt;
*interactive navigation,&lt;br /&gt;
*autonomous path planning,&lt;br /&gt;
*different pick-and-place applications,&lt;br /&gt;
*and human-robot collaboration.&lt;br /&gt;
&lt;br /&gt;
[[Image:Ur5 left.png|200px|thumb|left|Universal Robots UR5]]&lt;br /&gt;
== Development of demonstrative and promotional applications for Universal Robots UR5 ==&lt;br /&gt;
Sample demonstrations include:&lt;br /&gt;
*autonomous pick-and-place,&lt;br /&gt;
*load-assistance for human-robot collaboration,&lt;br /&gt;
*packaging,&lt;br /&gt;
*physical compliance during human-robot interaction,&lt;br /&gt;
*tracing an object's surface during scanning,&lt;br /&gt;
*robotic kitting,&lt;br /&gt;
*grinding of non-flat surfaces.&lt;br /&gt;
&lt;br /&gt;
== ROS support and educational materials for open-source mobile robot ==&lt;br /&gt;
[[Image:RosLarge.png|left|100px|ROS]]&lt;br /&gt;
The goal of the project is to develop [http://www.ros.org/ Robot Operating System] (ROS) wrapper functions for the drivers of an open-source mobile robot platform that has been successfully used in Robotex competitions. Additionally, educational materials must be developed to integrate the robot platform into the ROS ecosystem. The outcome of this work will open up many educational uses and rapid-prototyping opportunities for the platform.&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Ros_equation.png|x100px|What is ROS?]]&lt;br /&gt;
&lt;br /&gt;
== Detecting features of urban and off-road surroundings ==&lt;br /&gt;
Accurate navigation of self-driving unmanned robotic platforms requires identification of traversable terrain. A combined analysis of point-cloud data with RGB information of the robot's environment can help autonomous systems make correct decisions. The goal of this work is to develop algorithms for terrain classification.&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Rtab-map.png|x90px|Mapping]]&lt;br /&gt;
&lt;br /&gt;
== Robotic simulations (in Gazebo) ==&lt;br /&gt;
1) Developing large area simulation worlds for mobile robotics. In order to develop robot navigation algorithms, it is more time- and cost-efficient to test robot behavior in a wide range of realistically simulated worlds. These simulated worlds include both indoor and outdoor environments. This work focuses on designing robot simulation environments using an open-source platform [http://gazebosim.org Gazebo].&amp;lt;br&amp;gt;&lt;br /&gt;
2) Humans in Gazebo. Integrating walking and gesturing humans to Gazebo.&amp;lt;br&amp;gt;&lt;br /&gt;
3) Tracked Robots in Gazebo. Creating tracked robots in Gazebo and testing different track configurations.&amp;lt;br&amp;gt;&lt;br /&gt;
4) Robot basketball simulation for game strategy and shot accuracy&amp;lt;br&amp;gt;&lt;br /&gt;
5) Robotont at the Institute of Technology&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Gazebo.png|x90px|Gazebo]] [[Image:Robonaut-2-simulator.png|x90px|NASA Robonaut simulation in Gazebo]]&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Follow-the-leader robotic demo ==&lt;br /&gt;
[[Image:LeapMotion.png|200px|thumb|right|Detecting 2 hands with Leap Motion Controller]]&lt;br /&gt;
The idea is to create a robotic demonstration where a mobile robot uses a Kinect or a similar depth camera to identify a person and then starts following that person. The project will be implemented using the Robot Operating System (ROS) on either the KUKA youBot or a similar mobile robot platform.&lt;br /&gt;
&lt;br /&gt;
== Detecting hand signals for intuitive human-robot interface ==&lt;br /&gt;
This project involves creating ROS libraries for using either a [https://www.leapmotion.com/ Leap Motion Controller] or an [http://www.intel.com/content/www/us/en/architecture-and-technology/realsense-overview.html RGB-D camera] to detect the most common human hand signals (e.g., thumbs up, thumbs down, all clear, pointing into the distance, inviting).&lt;br /&gt;
&lt;br /&gt;
== Virtual reality user interface (VRUI) for intuitive teleoperation system ==&lt;br /&gt;
Adding [http://www.osvr.org/hardware-devs.html virtual reality capability] to a [https://www.youtube.com/watch?v=L25HHFd00rc gesture- and natural-language-based robot teleoperation system].&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Temoto-working-principle-in-pics.png|x160px|Gesture-based teleoperation]]&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
{{doi-inline|10.1109/HSI.2016.7529630|Kruusamäe et al. (2016) High-precision telerobot with human-centered variable perspective and scalable gestural interface}}&lt;br /&gt;
&lt;br /&gt;
== Health monitor for intuitive telerobot ==&lt;br /&gt;
Intelligent status and error handling for an intuitive telerobotic system.&lt;br /&gt;
&lt;br /&gt;
== Dynamic stitching for achieving 360° FOV ==&lt;br /&gt;
Automated stitching of images from multiple camera sources to achieve a 360° field of view during mobile telerobotic inspection of remote areas.&lt;br /&gt;
&lt;br /&gt;
== 3D scanning of industrial objects ==&lt;br /&gt;
Using laser sensors and cameras to create accurate models of industrial products for quality control or further processing.&lt;br /&gt;
&lt;br /&gt;
== Modeling humans for human-robot interaction ==&lt;br /&gt;
True human-robot collaboration means that the robot must understand the actions, intentions, and state of its human partner. This work involves using cameras and other human-sensing devices to digitally represent and model humans. There are multiple stages of modeling: a) physical models of human kinematics and dynamics; b) higher-level models for recognizing human intent.&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Skeletal ROS.PNG|x160px|ROS &amp;amp; Kinect &amp;amp; Skeleton-Markers Package]]&lt;br /&gt;
&lt;br /&gt;
== ROS wrapper for Estonian Speech Synthesizer ==&lt;br /&gt;
Creating a ROS package that enables robots to speak in Estonian. The basis of the work is the existing [https://www.eki.ee/heli/index.php?option=com_content&amp;amp;view=article&amp;amp;id=6&amp;amp;Itemid=465 Estonian language speech synthesizer], which needs to be integrated with the ROS [http://wiki.ros.org/sound_play sound_play] package or provided as a stand-alone ROS wrapper package.&lt;br /&gt;
&lt;br /&gt;
== Robotic avatar for telepresence ==&lt;br /&gt;
Integrating hand gestures and head movements to control a robot avatar in a virtual reality user interface.&lt;br /&gt;
&lt;br /&gt;
== ROS driver for Artificial Muscle actuators ==&lt;br /&gt;
Designing a controller box and writing software for interfacing artificial muscle actuators [{{doi-inline|10.3390/act4010017|1}}, [https://www.youtube.com/watch?v=tspg_l49hSA&amp;amp;index=10&amp;amp;list=UU186z2gc0XiLh12hNvPZdUQ 2]] with ROS.&lt;br /&gt;
&lt;br /&gt;
== Upgrading KUKA youBot ==&lt;br /&gt;
KUKA youBot is an omnidirectional mobile manipulator platform. The project involves replacing the current onboard computer with a more capable one, installing newer versions of Ubuntu Linux and ROS, testing core capabilities, and creating documentation for the robot. &lt;br /&gt;
&lt;br /&gt;
== Development of strategies for inter-robot knowledge representation ==&lt;br /&gt;
An inseparable part of making robots work together is enabling them to share knowledge about the surrounding environment and each robot's intentions. This project focuses on developing and testing methodologies for inter-robot knowledge representation. It is developed as a subsystem of [https://utnuclearroboticspublic.github.io/temoto2/index.html TeMoto].&lt;br /&gt;
&lt;br /&gt;
== TeMoto based smart home control ==&lt;br /&gt;
The project involves designing an open-source ROS+[https://utnuclearroboticspublic.github.io/temoto2/index.html TeMoto] based scalable smart home controller.&lt;br /&gt;
&lt;br /&gt;
== Detection of hardware and software resources for smart integration of robots ==&lt;br /&gt;
The vast majority of today’s robotic applications rely on hard-coded device and algorithm usage. This project focuses on developing Resource Snooper software that can detect the addition or removal of resources, enabling dynamic reconfiguration of robotic systems. It is developed as a subsystem of [https://utnuclearroboticspublic.github.io/temoto2/index.html TeMoto].&lt;br /&gt;
&lt;br /&gt;
== Sonification of feedback during teleoperation of robots ==&lt;br /&gt;
Humans are used to receiving auditory feedback in their everyday lives. It helps us make decisions and be aware of potential dangers. Telerobotic interfaces can deploy the same idea to improve situational awareness and robotic task efficiency. The thesis project involves a study of different sonification solutions and the implementation of one in a telerobotic application using ROS. &lt;br /&gt;
&lt;br /&gt;
== Human-Robot and Robot-Robot collaboration applications ==&lt;br /&gt;
&lt;br /&gt;
== Developing ROS driver for a robotic gripper ==&lt;br /&gt;
The goal of this project is to develop ROS drivers for the LEHF32K2-64 gripper. The work is concluded by demonstrating the functionality of the gripper via a pick-and-place task. &amp;lt;br&amp;gt;&lt;br /&gt;
[[File:Smc gripper.jpg|120px|SMC LEHF32K2-64 gripper.]]&lt;br /&gt;
&lt;br /&gt;
= Completed projects =&lt;br /&gt;
*Kristo Allaje, [http://hdl.handle.net/10062/60294 Draiveripakett käte jälgimise seadme Leap Motion™ kontroller kasutamiseks robootika arendusplatvormil ROS], BS thesis, 2018&lt;br /&gt;
*Martin Maidla, [http://hdl.handle.net/10062/60288 Avatud robotiarendusplatvormi Robotont omniliikumise ja odomeetria arendamine], BS thesis, 2018&lt;br /&gt;
*Raid Vellerind, [http://hdl.handle.net/10062/56559 Avatud robotiarendusplatvormi ROS võimekuse loomine Tartu Ülikooli Robotexi robootikaplatvormile], BS thesis, 2017&lt;/div&gt;</summary>
		<author><name>Robertvalner</name></author>
	</entry>
	<entry>
		<id>https://ims.ut.ee/index.php?title=File:Smc_gripper.jpg&amp;diff=18362</id>
		<title>File:Smc gripper.jpg</title>
		<link rel="alternate" type="text/html" href="https://ims.ut.ee/index.php?title=File:Smc_gripper.jpg&amp;diff=18362"/>
		<updated>2018-06-08T10:01:08Z</updated>

		<summary type="html">&lt;p&gt;Robertvalner: User created page with UploadWizard&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=={{int:filedesc}}==&lt;br /&gt;
{{Information&lt;br /&gt;
|description={{en|1=SMC LEHF32K2-64 gripper.}}&lt;br /&gt;
|date=2018-06-08&lt;br /&gt;
|source=http://www.smcpneumatics.com/LEHF32K2-64.html&lt;br /&gt;
|author=SMC Pneumatics&lt;br /&gt;
|permission=&lt;br /&gt;
|other versions=&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
=={{int:license-header}}==&lt;br /&gt;
{{cc-by-sa-4.0}}&lt;/div&gt;</summary>
		<author><name>Robertvalner</name></author>
	</entry>
	<entry>
		<id>https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=18361</id>
		<title>Theses in Robotics</title>
		<link rel="alternate" type="text/html" href="https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=18361"/>
		<updated>2018-06-08T09:56:24Z</updated>

		<summary type="html">&lt;p&gt;Robertvalner: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:IMS-robotics]]&lt;br /&gt;
= Projects in Advanced Robotics =&lt;br /&gt;
''The main objective of the following projects is to give students experience in working with advanced robotics technology. Our group is active in several R&amp;amp;D projects involving human-robot collaboration, intuitive teleoperation of robots, and autonomous navigation of unmanned mobile platforms. Our main software platforms are [http://www.ros.org/ Robot Operating System (ROS)] for developing software for advanced robot systems and [http://www.gazebosim.org Gazebo] for running realistic robotic simulations.''&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
For further information, contact [[User:Karl|Karl Kruusamäe]]&lt;br /&gt;
&lt;br /&gt;
The following is not an exhaustive list of all available thesis/research topics.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Image:Youbot.png|200px|thumb|right|KUKA youBot]]&lt;br /&gt;
== Development of demonstrative and promotional applications for KUKA youBot ==&lt;br /&gt;
The goal of this project is to develop promotional use cases for the KUKA youBot that demonstrate the capabilities of modern robotics and inspire people to get involved with it. &lt;br /&gt;
The list of possible robotic demos includes:&lt;br /&gt;
*using 3D vision for human and/or environment detection,&lt;br /&gt;
*interactive navigation,&lt;br /&gt;
*autonomous path planning,&lt;br /&gt;
*different pick-and-place applications,&lt;br /&gt;
*and human-robot collaboration.&lt;br /&gt;
&lt;br /&gt;
[[Image:Ur5 left.png|200px|thumb|left|Universal Robots UR5]]&lt;br /&gt;
== Development of demonstrative and promotional applications for Universal Robots UR5 ==&lt;br /&gt;
Sample demonstrations include:&lt;br /&gt;
*autonomous pick-and-place,&lt;br /&gt;
*load-assistance for human-robot collaboration,&lt;br /&gt;
*packaging,&lt;br /&gt;
*physical compliance during human-robot interaction,&lt;br /&gt;
*tracing an object's surface during scanning,&lt;br /&gt;
*robotic kitting,&lt;br /&gt;
*grinding of non-flat surfaces.&lt;br /&gt;
&lt;br /&gt;
== ROS support and educational materials for open-source mobile robot ==&lt;br /&gt;
[[Image:RosLarge.png|left|100px|ROS]]&lt;br /&gt;
The goal of the project is to develop [http://www.ros.org/ Robot Operating System] (ROS) wrapper functions for the drivers of an open-source mobile robot platform that has been successfully used in Robotex competitions. Additionally, educational materials must be developed to integrate the robot platform into the ROS ecosystem. The outcome of this work will open up many educational uses and rapid-prototyping opportunities for the platform.&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Ros_equation.png|x100px|What is ROS?]]&lt;br /&gt;
&lt;br /&gt;
== Detecting features of urban and off-road surroundings ==&lt;br /&gt;
Accurate navigation of self-driving unmanned robotic platforms requires identification of traversable terrain. A combined analysis of point-cloud data with RGB information of the robot's environment can help autonomous systems make correct decisions. The goal of this work is to develop algorithms for terrain classification.&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Rtab-map.png|x90px|Mapping]]&lt;br /&gt;
&lt;br /&gt;
== Robotic simulations (in Gazebo) ==&lt;br /&gt;
1) Developing large area simulation worlds for mobile robotics. In order to develop robot navigation algorithms, it is more time- and cost-efficient to test robot behavior in a wide range of realistically simulated worlds. These simulated worlds include both indoor and outdoor environments. This work focuses on designing robot simulation environments using an open-source platform [http://gazebosim.org Gazebo].&amp;lt;br&amp;gt;&lt;br /&gt;
2) Humans in Gazebo. Integrating walking and gesturing humans to Gazebo.&amp;lt;br&amp;gt;&lt;br /&gt;
3) Tracked Robots in Gazebo. Creating tracked robots in Gazebo and testing different track configurations.&amp;lt;br&amp;gt;&lt;br /&gt;
4) Robot basketball simulation for game strategy and shot accuracy&amp;lt;br&amp;gt;&lt;br /&gt;
5) Robotont at the Institute of Technology&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Gazebo.png|x90px|Gazebo]] [[Image:Robonaut-2-simulator.png|x90px|NASA Robonaut simulation in Gazebo]]&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Follow-the-leader robotic demo ==&lt;br /&gt;
[[Image:LeapMotion.png|200px|thumb|right|Detecting 2 hands with Leap Motion Controller]]&lt;br /&gt;
The idea is to create a robotic demonstration where a mobile robot uses a Kinect or a similar depth camera to identify a person and then starts following that person. The project will be implemented using the Robot Operating System (ROS) on either the KUKA youBot or a similar mobile robot platform.&lt;br /&gt;
&lt;br /&gt;
== Detecting hand signals for intuitive human-robot interface ==&lt;br /&gt;
This project involves creating ROS libraries for using either a [https://www.leapmotion.com/ Leap Motion Controller] or an [http://www.intel.com/content/www/us/en/architecture-and-technology/realsense-overview.html RGB-D camera] to detect the most common human hand signals (e.g., thumbs up, thumbs down, all clear, pointing into the distance, inviting).&lt;br /&gt;
&lt;br /&gt;
== Virtual reality user interface (VRUI) for intuitive teleoperation system ==&lt;br /&gt;
Adding [http://www.osvr.org/hardware-devs.html virtual reality capability] to a [https://www.youtube.com/watch?v=L25HHFd00rc gesture- and natural-language-based robot teleoperation system].&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Temoto-working-principle-in-pics.png|x160px|Gesture-based teleoperation]]&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
{{doi-inline|10.1109/HSI.2016.7529630|Kruusamäe et al. (2016) High-precision telerobot with human-centered variable perspective and scalable gestural interface}}&lt;br /&gt;
&lt;br /&gt;
== Health monitor for intuitive telerobot ==&lt;br /&gt;
Intelligent status and error handling for an intuitive telerobotic system.&lt;br /&gt;
&lt;br /&gt;
== Dynamic stitching for achieving 360° FOV ==&lt;br /&gt;
Automated stitching of images from multiple camera sources to achieve a 360° field of view during mobile telerobotic inspection of remote areas.&lt;br /&gt;
&lt;br /&gt;
== 3D scanning of industrial objects ==&lt;br /&gt;
Using laser sensors and cameras to create accurate models of industrial products for quality control or further processing.&lt;br /&gt;
&lt;br /&gt;
== Modeling humans for human-robot interaction ==&lt;br /&gt;
True human-robot collaboration means that the robot must understand the actions, intentions, and state of its human partner. This work involves using cameras and other human-sensing devices to digitally represent and model humans. There are multiple stages of modeling: a) physical models of human kinematics and dynamics; b) higher-level models for recognizing human intent.&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Skeletal ROS.PNG|x160px|ROS &amp;amp; Kinect &amp;amp; Skeleton-Markers Package]]&lt;br /&gt;
&lt;br /&gt;
== ROS wrapper for Estonian Speech Synthesizer ==&lt;br /&gt;
Creating a ROS package that enables robots to speak in Estonian. The basis of the work is the existing [https://www.eki.ee/heli/index.php?option=com_content&amp;amp;view=article&amp;amp;id=6&amp;amp;Itemid=465 Estonian language speech synthesizer], which needs to be integrated with the ROS [http://wiki.ros.org/sound_play sound_play] package or provided as a stand-alone ROS wrapper package.&lt;br /&gt;
&lt;br /&gt;
== Robotic avatar for telepresence ==&lt;br /&gt;
Integrating hand gestures and head movements to control a robot avatar in a virtual reality user interface.&lt;br /&gt;
&lt;br /&gt;
== ROS driver for Artificial Muscle actuators ==&lt;br /&gt;
Designing a controller box and writing software for interfacing artificial muscle actuators [{{doi-inline|10.3390/act4010017|1}}, [https://www.youtube.com/watch?v=tspg_l49hSA&amp;amp;index=10&amp;amp;list=UU186z2gc0XiLh12hNvPZdUQ 2]] with ROS.&lt;br /&gt;
&lt;br /&gt;
== Upgrading KUKA youBot ==&lt;br /&gt;
KUKA youBot is an omnidirectional mobile manipulator platform. The project involves replacing the current onboard computer with a more capable one, installing newer versions of Ubuntu Linux and ROS, testing core capabilities, and creating documentation for the robot. &lt;br /&gt;
&lt;br /&gt;
== Development of strategies for inter-robot knowledge representation ==&lt;br /&gt;
An inseparable part of making robots work together is enabling them to share knowledge about the surrounding environment and each robot's intentions. This project focuses on developing and testing methodologies for inter-robot knowledge representation. It is developed as a subsystem of [https://utnuclearroboticspublic.github.io/temoto2/index.html TeMoto].&lt;br /&gt;
&lt;br /&gt;
== TeMoto based smart home control ==&lt;br /&gt;
The project involves designing an open-source ROS+[https://utnuclearroboticspublic.github.io/temoto2/index.html TeMoto] based scalable smart home controller.&lt;br /&gt;
&lt;br /&gt;
== Detection of hardware and software resources for smart integration of robots ==&lt;br /&gt;
The vast majority of today’s robotic applications rely on hard-coded device and algorithm usage. This project focuses on developing Resource Snooper software that can detect the addition or removal of resources, enabling dynamic reconfiguration of robotic systems. It is developed as a subsystem of [https://utnuclearroboticspublic.github.io/temoto2/index.html TeMoto].&lt;br /&gt;
&lt;br /&gt;
== Sonification of feedback during teleoperation of robots ==&lt;br /&gt;
Humans are used to receiving auditory feedback in their everyday lives. It helps us make decisions and be aware of potential dangers. Telerobotic interfaces can deploy the same idea to improve situational awareness and robotic task efficiency. The thesis project involves a study of different sonification solutions and the implementation of one in a telerobotic application using ROS. &lt;br /&gt;
&lt;br /&gt;
== Human-Robot and Robot-Robot collaboration applications ==&lt;br /&gt;
&lt;br /&gt;
== Developing ROS driver for a robotic gripper ==&lt;br /&gt;
The goal of this project is to develop ROS drivers for the LEHF32K2-64 gripper. The work is concluded by demonstrating the functionality of the gripper via a pick-and-place task.&lt;br /&gt;
[[Image:smc_gripper.png|200px|thumb|right|SMC LEHF32K2-64 gripper]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Completed projects =&lt;br /&gt;
*Kristo Allaje, [http://hdl.handle.net/10062/60294 Draiveripakett käte jälgimise seadme Leap Motion™ kontroller kasutamiseks robootika arendusplatvormil ROS], BS thesis, 2018&lt;br /&gt;
*Martin Maidla, [http://hdl.handle.net/10062/60288 Avatud robotiarendusplatvormi Robotont omniliikumise ja odomeetria arendamine], BS thesis, 2018&lt;br /&gt;
*Raid Vellerind, [http://hdl.handle.net/10062/56559 Avatud robotiarendusplatvormi ROS võimekuse loomine Tartu Ülikooli Robotexi robootikaplatvormile], BS thesis, 2017&lt;/div&gt;</summary>
		<author><name>Robertvalner</name></author>
	</entry>
	<entry>
		<id>https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=18346</id>
		<title>Theses in Robotics</title>
		<link rel="alternate" type="text/html" href="https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=18346"/>
		<updated>2018-06-01T12:10:18Z</updated>

		<summary type="html">&lt;p&gt;Robertvalner: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:IMS-robotics]]&lt;br /&gt;
= Projects in Advanced Robotics =&lt;br /&gt;
''The main objective of the following projects is to give students experience in working with advanced robotics technology. Our group is active in several R&amp;amp;D projects involving human-robot collaboration, intuitive teleoperation of robots, and autonomous navigation of unmanned mobile platforms. Our main software platforms are [http://www.ros.org/ Robot Operating System (ROS)] for developing software for advanced robot systems and [http://www.gazebosim.org Gazebo] for running realistic robotic simulations.''&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
For further information, contact [[User:Karl|Karl Kruusamäe]]&lt;br /&gt;
&lt;br /&gt;
The following is not an exhaustive list of all available thesis/research topics.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Image:Youbot.png|200px|thumb|right|KUKA youBot]]&lt;br /&gt;
== Development of demonstrative and promotional applications for KUKA youBot ==&lt;br /&gt;
The goal of this project is to develop promotional use cases for the KUKA youBot that demonstrate the capabilities of modern robotics and inspire people to get involved with it. &lt;br /&gt;
The list of possible robotic demos includes:&lt;br /&gt;
*using 3D vision for human and/or environment detection,&lt;br /&gt;
*interactive navigation,&lt;br /&gt;
*autonomous path planning,&lt;br /&gt;
*different pick-and-place applications,&lt;br /&gt;
*and human-robot collaboration.&lt;br /&gt;
&lt;br /&gt;
[[Image:Ur5 left.png|200px|thumb|left|Universal Robots UR5]]&lt;br /&gt;
== Development of demonstrative and promotional applications for Universal Robots UR5 ==&lt;br /&gt;
Sample demonstrations include:&lt;br /&gt;
*autonomous pick-and-place,&lt;br /&gt;
*load-assistance for human-robot collaboration,&lt;br /&gt;
*packaging,&lt;br /&gt;
*physical compliance during human-robot interaction,&lt;br /&gt;
*tracing an object's surface during scanning,&lt;br /&gt;
*robotic kitting,&lt;br /&gt;
*grinding of non-flat surfaces.&lt;br /&gt;
&lt;br /&gt;
== ROS support and educational materials for open-source mobile robot ==&lt;br /&gt;
[[Image:RosLarge.png|left|100px|ROS]]&lt;br /&gt;
The goal of the project is to develop [http://www.ros.org/ Robot Operating System] (ROS) wrapper functions for the drivers of an open-source mobile robot platform that has been successfully used in Robotex competitions. Additionally, educational materials must be developed to integrate the robot platform into the ROS ecosystem. The outcome of this work will open up many educational uses and rapid-prototyping opportunities for the platform.&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Ros_equation.png|x100px|What is ROS?]]&lt;br /&gt;
&lt;br /&gt;
== Detecting features of urban and off-road surroundings ==&lt;br /&gt;
Accurate navigation of self-driving unmanned robotic platforms requires identification of traversable terrain. A combined analysis of point-cloud data with RGB information of the robot's environment can help autonomous systems make correct decisions. The goal of this work is to develop algorithms for terrain classification.&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Rtab-map.png|x90px|Mapping]]&lt;br /&gt;
&lt;br /&gt;
== Robotic simulations (in Gazebo) ==&lt;br /&gt;
1) Developing large area simulation worlds for mobile robotics. In order to develop robot navigation algorithms, it is more time- and cost-efficient to test robot behavior in a wide range of realistically simulated worlds. These simulated worlds include both indoor and outdoor environments. This work focuses on designing robot simulation environments using an open-source platform [http://gazebosim.org Gazebo].&amp;lt;br&amp;gt;&lt;br /&gt;
2) Humans in Gazebo. Integrating walking and gesturing humans to Gazebo.&amp;lt;br&amp;gt;&lt;br /&gt;
3) Tracked Robots in Gazebo. Creating tracked robots in Gazebo and testing different track configurations.&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Gazebo.png|x90px|Gazebo]] [[Image:Robonaut-2-simulator.png|x90px|NASA Robonaut simulation in Gazebo]]&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Follow-the-leader robotic demo ==&lt;br /&gt;
[[Image:LeapMotion.png|200px|thumb|right|Detecting 2 hands with Leap Motion Controller]]&lt;br /&gt;
The idea is to create a robotic demonstration where a mobile robot uses a Kinect or a similar depth camera to identify a person and then starts following that person. The project will be implemented using the Robot Operating System (ROS) on either the KUKA youBot or a similar mobile robot platform.&lt;br /&gt;
&lt;br /&gt;
== Detecting hand signals for intuitive human-robot interface ==&lt;br /&gt;
This project involves creating ROS libraries for using either a [https://www.leapmotion.com/ Leap Motion Controller] or an [http://www.intel.com/content/www/us/en/architecture-and-technology/realsense-overview.html RGB-D camera] to detect the most common human hand signals (e.g., thumbs up, thumbs down, all clear, pointing into the distance, inviting).&lt;br /&gt;
&lt;br /&gt;
== Virtual reality user interface (VRUI) for intuitive teleoperation system ==&lt;br /&gt;
Adding [http://www.osvr.org/hardware-devs.html virtual reality capability] to a [https://www.youtube.com/watch?v=L25HHFd00rc gesture- and natural-language-based robot teleoperation system].&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Temoto-working-principle-in-pics.png|x160px|Gesture-based teleoperation]]&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
{{doi-inline|10.1109/HSI.2016.7529630|Kruusamäe et al. (2016) High-precision telerobot with human-centered variable perspective and scalable gestural interface}}&lt;br /&gt;
&lt;br /&gt;
== Health monitor for intuitive telerobot ==&lt;br /&gt;
Intelligent status and error handling for an intuitive telerobotic system.&lt;br /&gt;
&lt;br /&gt;
== Dynamic stitching for achieving 360° FOV ==&lt;br /&gt;
Automated stitching of images from multiple camera sources to achieve a 360° field of view during mobile telerobotic inspection of remote areas.&lt;br /&gt;
&lt;br /&gt;
== 3D scanning of industrial objects ==&lt;br /&gt;
Using laser sensors and cameras to create accurate models of industrial products for quality control or further processing.&lt;br /&gt;
&lt;br /&gt;
== Modeling humans for human-robot interaction ==&lt;br /&gt;
True human-robot collaboration means that the robot must understand the actions, intentions, and state of its human partner. This work involves using cameras and other human-sensing devices to digitally represent and model humans. There are multiple stages of modeling: a) physical models of human kinematics and dynamics; b) higher-level models for recognizing human intent.&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Skeletal ROS.PNG|x160px|ROS &amp;amp; Kinect &amp;amp; Skeleton-Markers Package]]&lt;br /&gt;
&lt;br /&gt;
== ROS wrapper for Estonian Speech Synthesizer ==&lt;br /&gt;
Creating a ROS package that enables robots to speak in Estonian. The basis of the work is the existing [https://www.eki.ee/heli/index.php?option=com_content&amp;amp;view=article&amp;amp;id=6&amp;amp;Itemid=465 Estonian language speech synthesizer], which needs to be integrated with the ROS [http://wiki.ros.org/sound_play sound_play] package or provided as a stand-alone ROS wrapper package.&lt;br /&gt;
&lt;br /&gt;
== Robotic avatar for telepresence ==&lt;br /&gt;
Integrating hand gestures and head movements to control a robot avatar in a virtual reality user interface.&lt;br /&gt;
&lt;br /&gt;
== ROS driver for Artificial Muscle actuators ==&lt;br /&gt;
Designing a controller box and writing software for interfacing artificial muscle actuators [{{doi-inline|10.3390/act4010017|1}}, [https://www.youtube.com/watch?v=tspg_l49hSA&amp;amp;index=10&amp;amp;list=UU186z2gc0XiLh12hNvPZdUQ 2]] with ROS.&lt;br /&gt;
&lt;br /&gt;
== Upgrading KUKA youBot ==&lt;br /&gt;
KUKA youBot is an omnidirectional mobile manipulator platform. The project involves replacing the current onboard computer with a more capable one, installing newer versions of Ubuntu Linux and ROS, testing core capabilities, and creating documentation for the robot. &lt;br /&gt;
&lt;br /&gt;
== Development of strategies for inter-robot knowledge representation ==&lt;br /&gt;
An inseparable part of making robots work together is enabling them to share knowledge about the surrounding environment and each robot's intentions. This project focuses on developing and testing methodologies for inter-robot knowledge representation. This project is developed as a subsystem of [https://utnuclearroboticspublic.github.io/temoto2/index.html TeMoto].&lt;br /&gt;
&lt;br /&gt;
== TeMoto based smart home control ==&lt;br /&gt;
The project involves designing an open-source ROS+[https://utnuclearroboticspublic.github.io/temoto2/index.html TeMoto] based scalable smart home controller.&lt;br /&gt;
&lt;br /&gt;
== Detection of hardware and software resources for smart integration of robots ==&lt;br /&gt;
The vast majority of today’s robotic applications rely on hard-coded devices and algorithms. This project focuses on developing Resource Snooper software that can detect the addition or removal of resources for the benefit of dynamic reconfiguration of robotic systems. This project is developed as a subsystem of [https://utnuclearroboticspublic.github.io/temoto2/index.html TeMoto].&lt;br /&gt;
&lt;br /&gt;
== Human-Robot and Robot-Robot collaboration applications ==&lt;br /&gt;
&lt;br /&gt;
= Completed projects =&lt;br /&gt;
*Kristo Allaje, Draiveripakett käte jälgimise seadme Leap Motion™ kontroller kasutamiseks robootika arendusplatvormil ROS, BS thesis, 2018&lt;br /&gt;
*Martin Maidla, Avatud robotiarendusplatvormi Robotont omniliikumise ja odomeetria arendamine, BS thesis, 2018&lt;br /&gt;
*Raid Vellerind, [http://hdl.handle.net/10062/56559 Avatud robotiarendusplatvormi ROS võimekuse loomine Tartu Ülikooli Robotexi robootikaplatvormile], BS thesis, 2017&lt;/div&gt;</summary>
		<author><name>Robertvalner</name></author>
	</entry>
	<entry>
		<id>https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=18345</id>
		<title>Theses in Robotics</title>
		<link rel="alternate" type="text/html" href="https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=18345"/>
		<updated>2018-06-01T12:06:25Z</updated>

		<summary type="html">&lt;p&gt;Robertvalner: /* TeMoto Based Smart Home Control */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:IMS-robotics]]&lt;br /&gt;
= Projects in Advanced Robotics =&lt;br /&gt;
''The main objective of the following projects is to give students experience in working with advanced robotics technology. Our group is active in several R&amp;amp;D projects involving human-robot collaboration, intuitive teleoperation of robots, and autonomous navigation of unmanned mobile platforms. Our main software platforms are [http://www.ros.org/ Robot Operating System (ROS)] for developing software for advanced robot systems and [http://www.gazebosim.org Gazebo] for running realistic robotic simulations.''&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
For further information, contact [[User:Karl|Karl Kruusamäe]]&lt;br /&gt;
&lt;br /&gt;
The following is not an exhaustive list of all available thesis/research topics.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Image:Youbot.png|200px|thumb|right|KUKA youBot]]&lt;br /&gt;
== Development of demonstrative and promotional applications for KUKA youBot ==&lt;br /&gt;
The goal of this project is to develop promotional use cases for KUKA youBot that demonstrate the capabilities of modern robotics and inspire people to get involved with it.&lt;br /&gt;
The list of possible robotic demos includes:&lt;br /&gt;
*using 3D vision for human and/or environment detection,&lt;br /&gt;
*interactive navigation,&lt;br /&gt;
*autonomous path planning,&lt;br /&gt;
*different pick-and-place applications,&lt;br /&gt;
*and human-robot collaboration.&lt;br /&gt;
&lt;br /&gt;
[[Image:Ur5 left.png|200px|thumb|left|Universal Robots UR5]]&lt;br /&gt;
== Development of demonstrative and promotional applications for Universal Robots UR5 ==&lt;br /&gt;
Sample demonstrations include:&lt;br /&gt;
*autonomous pick-and-place,&lt;br /&gt;
*load-assistance for human-robot collaboration,&lt;br /&gt;
*packaging,&lt;br /&gt;
*physical compliance during human-robot interaction,&lt;br /&gt;
*tracing an object's surface during scanning,&lt;br /&gt;
*robotic kitting,&lt;br /&gt;
*grinding of non-flat surfaces.&lt;br /&gt;
&lt;br /&gt;
== ROS support and educational materials for open-source mobile robot ==&lt;br /&gt;
[[Image:RosLarge.png|left|100px|ROS]]&lt;br /&gt;
The goal of the project is to develop [http://www.ros.org/ Robot Operating System] (ROS) wrapper functions for the drivers of an open-source mobile robot platform that has been successfully used in Robotex competitions. Additionally, educational materials must be developed to integrate the robot platform into the ROS ecosystem. The outcome of this work will open up many educational uses and rapid prototyping opportunities for the platform.&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Ros_equation.png|x100px|What is ROS?]]&lt;br /&gt;
&lt;br /&gt;
== Detecting features of urban and off-road surroundings ==&lt;br /&gt;
Accurate navigation of self-driving unmanned robotic platforms requires identification of traversable terrain. A combined analysis of point-cloud data with RGB information of the robot's environment can help autonomous systems make correct decisions. The goal of this work is to develop algorithms for terrain classification.&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Rtab-map.png|x90px|Mapping]]&lt;br /&gt;
&lt;br /&gt;
== Robotic simulations (in Gazebo) ==&lt;br /&gt;
1) Developing large area simulation worlds for mobile robotics. In order to develop robot navigation algorithms, it is more time- and cost-efficient to test robot behavior in a wide range of realistically simulated worlds. These simulated worlds include both indoor and outdoor environments. This work focuses on designing robot simulation environments using an open-source platform [http://gazebosim.org Gazebo].&amp;lt;br&amp;gt;&lt;br /&gt;
2) Humans in Gazebo. Integrating walking and gesturing humans into Gazebo.&amp;lt;br&amp;gt;&lt;br /&gt;
3) Tracked Robots in Gazebo. Creating tracked robots in Gazebo and testing different track configurations.&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Gazebo.png|x90px|Gazebo]] [[Image:Robonaut-2-simulator.png|x90px|NASA Robonaut simulation in Gazebo]]&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Follow-the-leader robotic demo ==&lt;br /&gt;
[[Image:LeapMotion.png|200px|thumb|right|Detecting 2 hands with Leap Motion Controller]]&lt;br /&gt;
The idea is to create a robotic demonstration where a mobile robot uses a Kinect or a similar depth camera to identify a person and then starts following that person. The project will be implemented using the Robot Operating System (ROS) on either a KUKA youBot or a similar mobile robot platform.&lt;br /&gt;
&lt;br /&gt;
== Detecting hand signals for intuitive human-robot interface ==&lt;br /&gt;
This project involves creating ROS libraries for using either a [https://www.leapmotion.com/ Leap Motion Controller] or an [http://www.intel.com/content/www/us/en/architecture-and-technology/realsense-overview.html RGB-D camera] to detect the most common human hand signals (e.g., thumbs up, thumbs down, all clear, pointing into the distance, inviting).&lt;br /&gt;
&lt;br /&gt;
== Virtual reality user interface (VRUI) for intuitive teleoperation system ==&lt;br /&gt;
Adding [http://www.osvr.org/hardware-devs.html virtual reality capability] to a [https://www.youtube.com/watch?v=L25HHFd00rc gesture- and natural-language-based robot teleoperation system].&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Temoto-working-principle-in-pics.png|x160px|Gesture-based teleoperation]]&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
{{doi-inline|10.1109/HSI.2016.7529630|Kruusamäe et al. (2016) High-precision telerobot with human-centered variable perspective and scalable gestural interface}}&lt;br /&gt;
&lt;br /&gt;
== Health monitor for intuitive telerobot ==&lt;br /&gt;
Intelligent status and error handling for an intuitive telerobotic system.&lt;br /&gt;
&lt;br /&gt;
== Dynamic stitching for achieving 360° FOV ==&lt;br /&gt;
Automated stitching of images from multiple camera sources to achieve a 360° field of view during mobile telerobotic inspection of remote areas.&lt;br /&gt;
&lt;br /&gt;
== 3D scanning of industrial objects ==&lt;br /&gt;
Using laser sensors and cameras to create accurate models of industrial products for quality control or further processing.&lt;br /&gt;
&lt;br /&gt;
== Modeling humans for human-robot interaction ==&lt;br /&gt;
True human-robot collaboration means that the robot must understand the actions, intentions, and state of its human partner. This work involves using cameras and other sensors to digitally represent and model humans. There are multiple stages of modeling: a) physical models of human kinematics and dynamics; b) higher-level models for recognizing human intent.&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Skeletal ROS.PNG|x160px|ROS &amp;amp; Kinect &amp;amp; Skeleton-Markers Package]]&lt;br /&gt;
&lt;br /&gt;
== ROS wrapper for Estonian Speech Synthesizer ==&lt;br /&gt;
Creating a ROS package that enables robots to speak in Estonian. The basis of the work is the existing [https://www.eki.ee/heli/index.php?option=com_content&amp;amp;view=article&amp;amp;id=6&amp;amp;Itemid=465 Estonian language speech synthesizer], which needs to be integrated with the ROS [http://wiki.ros.org/sound_play sound_play] package or provided as a stand-alone ROS wrapper package.&lt;br /&gt;
&lt;br /&gt;
== Robotic avatar for telepresence ==&lt;br /&gt;
Integrating hand gestures and head movements to control a robot avatar in a virtual reality user interface.&lt;br /&gt;
&lt;br /&gt;
== ROS driver for Artificial Muscle actuators ==&lt;br /&gt;
Designing a controller box and writing software for interfacing artificial muscle actuators [{{doi-inline|10.3390/act4010017|1}}, [https://www.youtube.com/watch?v=tspg_l49hSA&amp;amp;index=10&amp;amp;list=UU186z2gc0XiLh12hNvPZdUQ 2]] with ROS.&lt;br /&gt;
&lt;br /&gt;
== Upgrading KUKA youBot ==&lt;br /&gt;
KUKA youBot is an omnidirectional mobile manipulator platform. The project involves replacing the current onboard computer with a more capable one, installing newer versions of Ubuntu Linux and ROS, testing core capabilities, and creating documentation for the robot.&lt;br /&gt;
&lt;br /&gt;
== Development of strategies for inter-robot knowledge representation ==&lt;br /&gt;
An inseparable part of making robots work together is enabling them to share knowledge about the surrounding environment and each robot's intentions. This project focuses on developing and testing methodologies for inter-robot knowledge representation. This project is developed as a subsystem of [https://utnuclearroboticspublic.github.io/temoto2/index.html TeMoto].&lt;br /&gt;
&lt;br /&gt;
== TeMoto Based Smart Home Control ==&lt;br /&gt;
The project involves designing an open-source ROS+[https://utnuclearroboticspublic.github.io/temoto2/index.html TeMoto] based scalable smart home controller.&lt;br /&gt;
&lt;br /&gt;
== Detection of hardware and software resources for smart integration of robots ==&lt;br /&gt;
The vast majority of today’s robotic applications rely on hard-coded devices and algorithms. This project focuses on developing Resource Snooper software that can detect the addition or removal of resources for the benefit of dynamic reconfiguration of robotic systems. This project is developed as a subsystem of [https://utnuclearroboticspublic.github.io/temoto2/index.html TeMoto].&lt;br /&gt;
&lt;br /&gt;
= Completed projects =&lt;br /&gt;
*Kristo Allaje, Draiveripakett käte jälgimise seadme Leap Motion™ kontroller kasutamiseks robootika arendusplatvormil ROS, BS thesis, 2018&lt;br /&gt;
*Martin Maidla, Avatud robotiarendusplatvormi Robotont omniliikumise ja odomeetria arendamine, BS thesis, 2018&lt;br /&gt;
*Raid Vellerind, [http://hdl.handle.net/10062/56559 Avatud robotiarendusplatvormi ROS võimekuse loomine Tartu Ülikooli Robotexi robootikaplatvormile], BS thesis, 2017&lt;/div&gt;</summary>
		<author><name>Robertvalner</name></author>
	</entry>
	<entry>
		<id>https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=18344</id>
		<title>Theses in Robotics</title>
		<link rel="alternate" type="text/html" href="https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=18344"/>
		<updated>2018-06-01T12:05:48Z</updated>

		<summary type="html">&lt;p&gt;Robertvalner: /* Development of strategies for inter-robot knowledge representation */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:IMS-robotics]]&lt;br /&gt;
= Projects in Advanced Robotics =&lt;br /&gt;
''The main objective of the following projects is to give students experience in working with advanced robotics technology. Our group is active in several R&amp;amp;D projects involving human-robot collaboration, intuitive teleoperation of robots, and autonomous navigation of unmanned mobile platforms. Our main software platforms are [http://www.ros.org/ Robot Operating System (ROS)] for developing software for advanced robot systems and [http://www.gazebosim.org Gazebo] for running realistic robotic simulations.''&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
For further information, contact [[User:Karl|Karl Kruusamäe]]&lt;br /&gt;
&lt;br /&gt;
The following is not an exhaustive list of all available thesis/research topics.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Image:Youbot.png|200px|thumb|right|KUKA youBot]]&lt;br /&gt;
== Development of demonstrative and promotional applications for KUKA youBot ==&lt;br /&gt;
The goal of this project is to develop promotional use cases for KUKA youBot that demonstrate the capabilities of modern robotics and inspire people to get involved with it.&lt;br /&gt;
The list of possible robotic demos includes:&lt;br /&gt;
*using 3D vision for human and/or environment detection,&lt;br /&gt;
*interactive navigation,&lt;br /&gt;
*autonomous path planning,&lt;br /&gt;
*different pick-and-place applications,&lt;br /&gt;
*and human-robot collaboration.&lt;br /&gt;
&lt;br /&gt;
[[Image:Ur5 left.png|200px|thumb|left|Universal Robots UR5]]&lt;br /&gt;
== Development of demonstrative and promotional applications for Universal Robots UR5 ==&lt;br /&gt;
Sample demonstrations include:&lt;br /&gt;
*autonomous pick-and-place,&lt;br /&gt;
*load-assistance for human-robot collaboration,&lt;br /&gt;
*packaging,&lt;br /&gt;
*physical compliance during human-robot interaction,&lt;br /&gt;
*tracing an object's surface during scanning,&lt;br /&gt;
*robotic kitting,&lt;br /&gt;
*grinding of non-flat surfaces.&lt;br /&gt;
&lt;br /&gt;
== ROS support and educational materials for open-source mobile robot ==&lt;br /&gt;
[[Image:RosLarge.png|left|100px|ROS]]&lt;br /&gt;
The goal of the project is to develop [http://www.ros.org/ Robot Operating System] (ROS) wrapper functions for the drivers of an open-source mobile robot platform that has been successfully used in Robotex competitions. Additionally, educational materials must be developed to integrate the robot platform into the ROS ecosystem. The outcome of this work will open up many educational uses and rapid prototyping opportunities for the platform.&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Ros_equation.png|x100px|What is ROS?]]&lt;br /&gt;
&lt;br /&gt;
== Detecting features of urban and off-road surroundings ==&lt;br /&gt;
Accurate navigation of self-driving unmanned robotic platforms requires identification of traversable terrain. A combined analysis of point-cloud data with RGB information of the robot's environment can help autonomous systems make correct decisions. The goal of this work is to develop algorithms for terrain classification.&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Rtab-map.png|x90px|Mapping]]&lt;br /&gt;
&lt;br /&gt;
== Robotic simulations (in Gazebo) ==&lt;br /&gt;
1) Developing large area simulation worlds for mobile robotics. In order to develop robot navigation algorithms, it is more time- and cost-efficient to test robot behavior in a wide range of realistically simulated worlds. These simulated worlds include both indoor and outdoor environments. This work focuses on designing robot simulation environments using an open-source platform [http://gazebosim.org Gazebo].&amp;lt;br&amp;gt;&lt;br /&gt;
2) Humans in Gazebo. Integrating walking and gesturing humans into Gazebo.&amp;lt;br&amp;gt;&lt;br /&gt;
3) Tracked Robots in Gazebo. Creating tracked robots in Gazebo and testing different track configurations.&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Gazebo.png|x90px|Gazebo]] [[Image:Robonaut-2-simulator.png|x90px|NASA Robonaut simulation in Gazebo]]&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Follow-the-leader robotic demo ==&lt;br /&gt;
[[Image:LeapMotion.png|200px|thumb|right|Detecting 2 hands with Leap Motion Controller]]&lt;br /&gt;
The idea is to create a robotic demonstration where a mobile robot uses a Kinect or a similar depth camera to identify a person and then starts following that person. The project will be implemented using the Robot Operating System (ROS) on either a KUKA youBot or a similar mobile robot platform.&lt;br /&gt;
&lt;br /&gt;
== Detecting hand signals for intuitive human-robot interface ==&lt;br /&gt;
This project involves creating ROS libraries for using either a [https://www.leapmotion.com/ Leap Motion Controller] or an [http://www.intel.com/content/www/us/en/architecture-and-technology/realsense-overview.html RGB-D camera] to detect the most common human hand signals (e.g., thumbs up, thumbs down, all clear, pointing into the distance, inviting).&lt;br /&gt;
&lt;br /&gt;
== Virtual reality user interface (VRUI) for intuitive teleoperation system ==&lt;br /&gt;
Adding [http://www.osvr.org/hardware-devs.html virtual reality capability] to a [https://www.youtube.com/watch?v=L25HHFd00rc gesture- and natural-language-based robot teleoperation system].&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Temoto-working-principle-in-pics.png|x160px|Gesture-based teleoperation]]&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
{{doi-inline|10.1109/HSI.2016.7529630|Kruusamäe et al. (2016) High-precision telerobot with human-centered variable perspective and scalable gestural interface}}&lt;br /&gt;
&lt;br /&gt;
== Health monitor for intuitive telerobot ==&lt;br /&gt;
Intelligent status and error handling for an intuitive telerobotic system.&lt;br /&gt;
&lt;br /&gt;
== Dynamic stitching for achieving 360° FOV ==&lt;br /&gt;
Automated stitching of images from multiple camera sources to achieve a 360° field of view during mobile telerobotic inspection of remote areas.&lt;br /&gt;
&lt;br /&gt;
== 3D scanning of industrial objects ==&lt;br /&gt;
Using laser sensors and cameras to create accurate models of industrial products for quality control or further processing.&lt;br /&gt;
&lt;br /&gt;
== Modeling humans for human-robot interaction ==&lt;br /&gt;
True human-robot collaboration means that the robot must understand the actions, intentions, and state of its human partner. This work involves using cameras and other sensors to digitally represent and model humans. There are multiple stages of modeling: a) physical models of human kinematics and dynamics; b) higher-level models for recognizing human intent.&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Skeletal ROS.PNG|x160px|ROS &amp;amp; Kinect &amp;amp; Skeleton-Markers Package]]&lt;br /&gt;
&lt;br /&gt;
== ROS wrapper for Estonian Speech Synthesizer ==&lt;br /&gt;
Creating a ROS package that enables robots to speak in Estonian. The basis of the work is the existing [https://www.eki.ee/heli/index.php?option=com_content&amp;amp;view=article&amp;amp;id=6&amp;amp;Itemid=465 Estonian language speech synthesizer], which needs to be integrated with the ROS [http://wiki.ros.org/sound_play sound_play] package or provided as a stand-alone ROS wrapper package.&lt;br /&gt;
&lt;br /&gt;
== Robotic avatar for telepresence ==&lt;br /&gt;
Integrating hand gestures and head movements to control a robot avatar in a virtual reality user interface.&lt;br /&gt;
&lt;br /&gt;
== ROS driver for Artificial Muscle actuators ==&lt;br /&gt;
Designing a controller box and writing software for interfacing artificial muscle actuators [{{doi-inline|10.3390/act4010017|1}}, [https://www.youtube.com/watch?v=tspg_l49hSA&amp;amp;index=10&amp;amp;list=UU186z2gc0XiLh12hNvPZdUQ 2]] with ROS.&lt;br /&gt;
&lt;br /&gt;
== Upgrading KUKA youBot ==&lt;br /&gt;
KUKA youBot is an omnidirectional mobile manipulator platform. The project involves replacing the current onboard computer with a more capable one, installing newer versions of Ubuntu Linux and ROS, testing core capabilities, and creating documentation for the robot.&lt;br /&gt;
&lt;br /&gt;
== Development of strategies for inter-robot knowledge representation ==&lt;br /&gt;
An inseparable part of making robots work together is enabling them to share knowledge about the surrounding environment and each robot's intentions. This project focuses on developing and testing methodologies for inter-robot knowledge representation. This project is developed as a subsystem of [https://utnuclearroboticspublic.github.io/temoto2/index.html TeMoto].&lt;br /&gt;
&lt;br /&gt;
== TeMoto Based Smart Home Control ==&lt;br /&gt;
The project involves designing an open-source ROS+[https://utnuclearroboticspublic.github.io/temoto2/index.html TeMoto] based scalable smart home controller.&lt;br /&gt;
&lt;br /&gt;
== Detection of hardware and software resources for smart integration of robots ==&lt;br /&gt;
The vast majority of today’s robotic applications rely on hard-coded devices and algorithms. This project focuses on developing Resource Snooper software that can detect the addition or removal of resources for the benefit of dynamic reconfiguration of robotic systems. This project is developed as a subsystem of [https://utnuclearroboticspublic.github.io/temoto2/index.html TeMoto].&lt;br /&gt;
&lt;br /&gt;
= Completed projects =&lt;br /&gt;
*Kristo Allaje, Draiveripakett käte jälgimise seadme Leap Motion™ kontroller kasutamiseks robootika arendusplatvormil ROS, BS thesis, 2018&lt;br /&gt;
*Martin Maidla, Avatud robotiarendusplatvormi Robotont omniliikumise ja odomeetria arendamine, BS thesis, 2018&lt;br /&gt;
*Raid Vellerind, [http://hdl.handle.net/10062/56559 Avatud robotiarendusplatvormi ROS võimekuse loomine Tartu Ülikooli Robotexi robootikaplatvormile], BS thesis, 2017&lt;/div&gt;</summary>
		<author><name>Robertvalner</name></author>
	</entry>
	<entry>
		<id>https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=18343</id>
		<title>Theses in Robotics</title>
		<link rel="alternate" type="text/html" href="https://ims.ut.ee/index.php?title=Theses_in_Robotics&amp;diff=18343"/>
		<updated>2018-06-01T12:05:04Z</updated>

		<summary type="html">&lt;p&gt;Robertvalner: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:IMS-robotics]]&lt;br /&gt;
= Projects in Advanced Robotics =&lt;br /&gt;
''The main objective of the following projects is to give students experience in working with advanced robotics technology. Our group is active in several R&amp;amp;D projects involving human-robot collaboration, intuitive teleoperation of robots, and autonomous navigation of unmanned mobile platforms. Our main software platforms are [http://www.ros.org/ Robot Operating System (ROS)] for developing software for advanced robot systems and [http://www.gazebosim.org Gazebo] for running realistic robotic simulations.''&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
For further information, contact [[User:Karl|Karl Kruusamäe]]&lt;br /&gt;
&lt;br /&gt;
The following is not an exhaustive list of all available thesis/research topics.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Image:Youbot.png|200px|thumb|right|KUKA youBot]]&lt;br /&gt;
== Development of demonstrative and promotional applications for KUKA youBot ==&lt;br /&gt;
The goal of this project is to develop promotional use cases for KUKA youBot that demonstrate the capabilities of modern robotics and inspire people to get involved with it.&lt;br /&gt;
The list of possible robotic demos includes:&lt;br /&gt;
*using 3D vision for human and/or environment detection,&lt;br /&gt;
*interactive navigation,&lt;br /&gt;
*autonomous path planning,&lt;br /&gt;
*different pick-and-place applications,&lt;br /&gt;
*and human-robot collaboration.&lt;br /&gt;
&lt;br /&gt;
[[Image:Ur5 left.png|200px|thumb|left|Universal Robots UR5]]&lt;br /&gt;
== Development of demonstrative and promotional applications for Universal Robots UR5 ==&lt;br /&gt;
Sample demonstrations include:&lt;br /&gt;
*autonomous pick-and-place,&lt;br /&gt;
*load-assistance for human-robot collaboration,&lt;br /&gt;
*packaging,&lt;br /&gt;
*physical compliance during human-robot interaction,&lt;br /&gt;
*tracing an object's surface during scanning,&lt;br /&gt;
*robotic kitting,&lt;br /&gt;
*grinding of non-flat surfaces.&lt;br /&gt;
&lt;br /&gt;
== ROS support and educational materials for open-source mobile robot ==&lt;br /&gt;
[[Image:RosLarge.png|left|100px|ROS]]&lt;br /&gt;
The goal of the project is to develop [http://www.ros.org/ Robot Operating System] (ROS) wrapper functions for the drivers of an open-source mobile robot platform that has been successfully used in Robotex competitions. Additionally, educational materials must be developed to integrate the robot platform into the ROS ecosystem. The outcome of this work will open up many educational uses and rapid prototyping opportunities for the platform.&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Ros_equation.png|x100px|What is ROS?]]&lt;br /&gt;
&lt;br /&gt;
== Detecting features of urban and off-road surroundings ==&lt;br /&gt;
Accurate navigation of self-driving unmanned robotic platforms requires identification of traversable terrain. A combined analysis of point-cloud data with RGB information of the robot's environment can help autonomous systems make correct decisions. The goal of this work is to develop algorithms for terrain classification.&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Rtab-map.png|x90px|Mapping]]&lt;br /&gt;
&lt;br /&gt;
== Robotic simulations (in Gazebo) ==&lt;br /&gt;
1) Developing large area simulation worlds for mobile robotics. In order to develop robot navigation algorithms, it is more time- and cost-efficient to test robot behavior in a wide range of realistically simulated worlds. These simulated worlds include both indoor and outdoor environments. This work focuses on designing robot simulation environments using an open-source platform [http://gazebosim.org Gazebo].&amp;lt;br&amp;gt;&lt;br /&gt;
2) Humans in Gazebo. Integrating walking and gesturing humans into Gazebo.&amp;lt;br&amp;gt;&lt;br /&gt;
3) Tracked Robots in Gazebo. Creating tracked robots in Gazebo and testing different track configurations.&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Gazebo.png|x90px|Gazebo]] [[Image:Robonaut-2-simulator.png|x90px|NASA Robonaut simulation in Gazebo]]&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Follow-the-leader robotic demo ==&lt;br /&gt;
[[Image:LeapMotion.png|200px|thumb|right|Detecting 2 hands with Leap Motion Controller]]&lt;br /&gt;
The idea is to create a robotic demonstration where a mobile robot uses a Kinect or a similar depth camera to identify a person and then starts following that person. The project will be implemented using the Robot Operating System (ROS) on either a KUKA youBot or a similar mobile robot platform.&lt;br /&gt;
&lt;br /&gt;
== Detecting hand signals for intuitive human-robot interface ==&lt;br /&gt;
This project involves creating ROS libraries for using either a [https://www.leapmotion.com/ Leap Motion Controller] or an [http://www.intel.com/content/www/us/en/architecture-and-technology/realsense-overview.html RGB-D camera] to detect the most common human hand signals (e.g., thumbs up, thumbs down, all clear, pointing into the distance, inviting).&lt;br /&gt;
&lt;br /&gt;
== Virtual reality user interface (VRUI) for intuitive teleoperation system ==&lt;br /&gt;
Adding [http://www.osvr.org/hardware-devs.html virtual reality capability] to a [https://www.youtube.com/watch?v=L25HHFd00rc gesture- and natural-language-based robot teleoperation system].&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Temoto-working-principle-in-pics.png|x160px|Gesture-based teleoperation]]&lt;br /&gt;
&amp;lt;br&amp;gt;&lt;br /&gt;
{{doi-inline|10.1109/HSI.2016.7529630|Kruusamäe et al. (2016) High-precision telerobot with human-centered variable perspective and scalable gestural interface}}&lt;br /&gt;
&lt;br /&gt;
== Health monitor for intuitive telerobot ==&lt;br /&gt;
Intelligent status and error handling for an intuitive telerobotic system.&lt;br /&gt;
&lt;br /&gt;
== Dynamic stitching for achieving 360° FOV ==&lt;br /&gt;
Automated stitching of images from multiple camera sources to achieve a 360° field of view during mobile telerobotic inspection of remote areas.&lt;br /&gt;
&lt;br /&gt;
== 3D scanning of industrial objects ==&lt;br /&gt;
Using laser sensors and cameras to create accurate models of industrial products for quality control or further processing.&lt;br /&gt;
&lt;br /&gt;
== Modeling humans for human-robot interaction ==&lt;br /&gt;
True human-robot collaboration means that the robot must understand the actions, intentions, and state of its human partner. This work involves using cameras and other sensors to digitally represent and model humans. There are multiple stages of modeling: a) physical models of human kinematics and dynamics; b) higher-level models for recognizing human intent.&amp;lt;br&amp;gt;&lt;br /&gt;
[[Image:Skeletal ROS.PNG|x160px|ROS &amp;amp; Kinect &amp;amp; Skeleton-Markers Package]]&lt;br /&gt;
&lt;br /&gt;
== ROS wrapper for Estonian Speech Synthesizer ==&lt;br /&gt;
Creating a ROS package that enables robots to speak in Estonian. The basis of the work is the existing [https://www.eki.ee/heli/index.php?option=com_content&amp;amp;view=article&amp;amp;id=6&amp;amp;Itemid=465 Estonian language speech synthesizer], which needs to be integrated with the ROS [http://wiki.ros.org/sound_play sound_play] package or provided as a stand-alone ROS wrapper package.&lt;br /&gt;
&lt;br /&gt;
== Robotic avatar for telepresence ==&lt;br /&gt;
Integrating hand gestures and head movements to control a robot avatar in a virtual reality user interface.&lt;br /&gt;
&lt;br /&gt;
== ROS driver for Artificial Muscle actuators ==&lt;br /&gt;
Designing a controller box and writing software for interfacing artificial muscle actuators [{{doi-inline|10.3390/act4010017|1}}, [https://www.youtube.com/watch?v=tspg_l49hSA&amp;amp;index=10&amp;amp;list=UU186z2gc0XiLh12hNvPZdUQ 2]] with ROS.&lt;br /&gt;
&lt;br /&gt;
== Upgrading KUKA youBot ==&lt;br /&gt;
KUKA youBot is an omnidirectional mobile manipulator platform. The project involves replacing the current onboard computer with a more capable one, installing newer versions of Ubuntu Linux and ROS, testing core capabilities, and creating documentation for the robot.&lt;br /&gt;
&lt;br /&gt;
== Development of strategies for inter-robot knowledge representation ==&lt;br /&gt;
An inseparable part of making robots work together is enabling them to share knowledge about the surrounding environment and each robot's intentions. This project focuses on developing and testing methodologies for inter-robot knowledge representation. This project is developed as a subsystem of [https://utnuclearroboticspublic.github.io/temoto2/index.html TeMoto].&lt;br /&gt;
&lt;br /&gt;
== TeMoto Based Smart Home Control ==&lt;br /&gt;
The project involves designing an open-source ROS+[https://utnuclearroboticspublic.github.io/temoto2/index.html TeMoto] based scalable smart home controller.&lt;br /&gt;
&lt;br /&gt;
== Detection of hardware and software resources for smart integration of robots ==&lt;br /&gt;
The vast majority of today’s robotic applications rely on hard-coded devices and algorithms. This project focuses on developing Resource Snooper software that can detect the addition or removal of resources for the benefit of dynamic reconfiguration of robotic systems. This project is developed as a subsystem of [https://utnuclearroboticspublic.github.io/temoto2/index.html TeMoto].&lt;br /&gt;
&lt;br /&gt;
= Completed projects =&lt;br /&gt;
*Kristo Allaje, Draiveripakett käte jälgimise seadme Leap Motion™ kontroller kasutamiseks robootika arendusplatvormil ROS, BS thesis, 2018&lt;br /&gt;
*Martin Maidla, Avatud robotiarendusplatvormi Robotont omniliikumise ja odomeetria arendamine, BS thesis, 2018&lt;br /&gt;
*Raid Vellerind, [http://hdl.handle.net/10062/56559 Avatud robotiarendusplatvormi ROS võimekuse loomine Tartu Ülikooli Robotexi robootikaplatvormile], BS thesis, 2017&lt;/div&gt;</summary>
		<author><name>Robertvalner</name></author>
	</entry>
	<entry>
		<id>https://ims.ut.ee/index.php?title=Franka_Emika_Panda&amp;diff=17841</id>
		<title>Franka Emika Panda</title>
		<link rel="alternate" type="text/html" href="https://ims.ut.ee/index.php?title=Franka_Emika_Panda&amp;diff=17841"/>
		<updated>2018-02-14T08:07:13Z</updated>

		<summary type="html">&lt;p&gt;Robertvalner: Created page with &amp;quot;Category:IMS-robotics  '''The beginner guide for Panda robot can be accessed from here: https://github.com/ut-ims-robotics/tutorials/wiki/Franka-Emika-Panda-beginner-guide&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:IMS-robotics]]&lt;br /&gt;
&lt;br /&gt;
'''The beginner guide for the Panda robot can be accessed here:''' https://github.com/ut-ims-robotics/tutorials/wiki/Franka-Emika-Panda-beginner-guide&lt;/div&gt;</summary>
		<author><name>Robertvalner</name></author>
	</entry>
</feed>