Commit ff8f0a2b authored by Kai Biermeier

Merge branch 'feature/website_enhancement_v3' into 'develop'

Feature/website enhancement v3

See merge request skrings/varobotprojectgroup!114
parents 11f2dd83 179c001c
Showing changed files with 31 additions and 32 deletions
@@ -5,9 +5,9 @@
<meta name="viewport" content="width=device-width, initial-scale=1, shrink-to-fit=no" />
<meta name="description" content="" />
<meta name="author" content="" />
-<title>Augmented Reality - VARobot</title>
+<title>Robot Simulation - VARobot</title>
<!-- Favicon-->
-<link rel="icon" type="image/x-icon" href="assets/img/upbicon.ico" />
+<link rel="icon" type="image/x-icon" href="assets/img/upbicon.png" />
<!-- Font Awesome icons (free version)-->
<script src="https://use.fontawesome.com/releases/v5.15.1/js/all.js" crossorigin="anonymous"></script>
<!-- Google fonts-->
@@ -44,26 +44,28 @@
<div class="row justify-content-center">
<div class="col-lg-8 text-center">
<h2 class="text-white mt-0">Augmented Reality</h2>
-<hr class="divider light my-4" />
+<hr class="divider my-4" />
</div>
</div>
<div class="cardalt text-white bg-orange" style="max-width: 50%;">
<div class="cardalt" style="max-width: 50%;">
<div class="card-body">
<p class="card-text" style="text-align:justify">Augmented Reality (AR for short) describes the real-time combination of real-world input with virtual enhancements. This usually happened by displaying real-life camera footage and displaying virtual objects in it. A very commonly known example is the application “Pokémon Go”, where it is possible to see virtual creatures (augmented) in the real world. AR can commonly be used either on a handheld device, like a smartphone or tablet, or on a head mounted device like the HoloLens.<br />In our project, both kinds of devices can be used in an approach to achieve interoperability and a wider selection of interaction possibilities. The handheld devices are limited to Android-capable ones, since we used ARCore for its advanced features.</p>
<p class="card-text text-black" style="text-align:justify">Augmented Reality (AR for short) describes the real-time combination of real-world input with virtual enhancements. This usually happens by displaying real-life camera footage and displaying virtual objects in it. A very commonly known example is the application “Pokémon Go”, where it is possible to see virtual creatures (augmented) in the real world. AR can commonly be used either on a handheld device, like a smartphone or tablet, or on a head mounted device like the HoloLens.<br />In our project, both kinds of devices can be used in an approach to achieve interoperability and a wider selection of interaction possibilities. The handheld devices are limited to Android-capable ones, since we used ARCore for its advanced features.</p>
<div class="text-center">
<div class="">
<img src="assets/img/AR/ARCoreImage.png" width="910" height="910" alt="" class="img-fluid">
-<figcaption>The ARCore plane tracking: the plane is visualized by the dotted surface, the virtual Android robot in placed on top of it</figcaption>
+<figcaption>ARCore plane tracking: the plane is visualized by the dotted surface; the virtual Android robot is placed on top of it</figcaption>
</div>
</div>
<br />
<p class="card-text" style="text-align:justify">ARCore offers so-called “plane detection” which is the possibility to scan a room with the smartphone camera and detect any planes. On these planes, objects can be placed and they will be displayed in the chosen position even if the user moves around. For this, special sensors are necessary, but luckily, most modern smartphones are equipped with these.<br />For the head-mounted device we use the HoloLens in combination with the MRTK, a toolkit for head mounted AR. The MRTK enables us to use, among other features, hand gestures or even eye tracking, offering new interaction possibilities.</p>
<p class="card-text text-black" style="text-align:justify">ARCore offers so-called “plane detection” which is the possibility to scan a room with the smartphone camera and detect any planes. On these planes, objects can be placed and they will be displayed in the chosen position even if the user moves around. For this, special sensors are necessary, but luckily, most modern smartphones are equipped with these.<br />For the head-mounted device we use the HoloLens in combination with the MRTK, a toolkit for head mounted AR. The MRTK enables us to use, among other features, hand gestures or even eye tracking, offering new interaction possibilities.</p>
+<!--
<div class="text-center">
<div class="">
<img src="assets/img/" width="910" alt="" class="img-fluid">
<figcaption>An interface in AR. The information and control elements are in sight at all time without disturbing the workflow.</figcaption>
</div>
</div>
+-->
</div>
</div>
</section>
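The plane-detection flow described in the section above maps onto only a handful of ARCore calls. The following is a minimal sketch against the ARCore Android SDK (com.google.ar.core), not code from this repository; it assumes a Session has already been created and that camera permissions and rendering are handled elsewhere.

    import com.google.ar.core.Anchor;
    import com.google.ar.core.Config;
    import com.google.ar.core.Frame;
    import com.google.ar.core.Plane;
    import com.google.ar.core.Session;
    import com.google.ar.core.TrackingState;
    import com.google.ar.core.exceptions.CameraNotAvailableException;

    public final class PlaneDetectionSketch {
        // Enable plane detection once, right after the Session is created.
        static void enablePlaneDetection(Session session) {
            Config config = new Config(session);
            config.setPlaneFindingMode(Config.PlaneFindingMode.HORIZONTAL_AND_VERTICAL);
            session.configure(config);
        }

        // Per frame: anchor a virtual object (e.g. the robot model) to the first
        // plane ARCore is currently tracking. The Anchor keeps the object at its
        // chosen position even while the user moves around the room.
        static Anchor placeOnFirstPlane(Session session) throws CameraNotAvailableException {
            Frame frame = session.update();
            for (Plane plane : frame.getUpdatedTrackables(Plane.class)) {
                if (plane.getTrackingState() == TrackingState.TRACKING) {
                    return plane.createAnchor(plane.getCenterPose());
                }
            }
            return null; // no tracked plane yet; try again on the next frame
        }
    }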
@@ -72,13 +74,14 @@
<section class="white-page arial" id="benefits">
<div class="row justify-content-center">
<div class="col-lg-8 text-center">
<h2 class="text-black mt-0">Benefits of AR for our Use Case</h2>
<hr class="divideralt my-4" />
<h2 class="text-white mt-0">Benefits of AR for our Use Case</h2>
<hr class="divider my-4" />
</div>
</div>
<div class="cardalt text-black bg-white" style="max-width: 50%;">
<div class="cardalt text-black" style="max-width: 50%;">
<div class="card-body">
<p class="card-text" style="text-align:justify">The benefits of AR are diverse and play into our research in different ways. Firstly, of course, there is the general point that AR can make interactions easier. Especially when wearing the HoloLens, the user can use both hands while still seeing information. In our framework, we use this to display interface elements like the code editor, so the user always has them in view. The components can also still be clicked in a similar way to non-AR applications, but in addition, the user can use additional options, like the hand gestures on the HoloLens, for interaction. This way, the user can choose the option that is the most comfortable for them.<br />The central point that sets the VAROBOT framework apart from others, however, is the fact that AR is used to simulate robot program executions. This way, a created program can be executed on a simulated robot before trying it on a real robot. Since robot programming is highly complex and needs a lot of knowledge about the surroundings, conventional methods are often very time consuming and error prone. One big problem is the need to deploy the program to a real robot every time for testing it and risking the robot getting damaged if there was a mistake in the program and the real robot, for example, collides with something. This can, in the worst case, be extremely expensive and might even stall production processes. The SPEARED solves this issue by simulating the robot programs in AR, making it possible to see how the robot would react and move virtually in the real space the robot should operate in. This way, it is possible to see if there are any collisions and if the robot moves as expected. Furthermore, the real robot can even continue working undisturbed by the development, as a good part of testing out programs can take place in AR. If the robot program is completed in AR, there is also an option to deploy it directly to the robot.</p>
<p class="card-text" style="text-align:justify">The benefits of AR are diverse and play into our research in different ways. Firstly, of course, there is the general point that AR can make interactions easier. Especially when wearing the HoloLens, the user can use both hands while still seeing information. In our framework, we use this to display interface elements like the code editor, so the user always has them in view. The components can also still be clicked in a similar way to non-AR applications, but in addition, the user can use additional options, like the hand gestures on the HoloLens, for interaction. This way, the user can choose the option that is the most comfortable for them.<br />The central point that sets the VAROBOT framework apart from others, however, is the fact that AR is used to simulate robot program executions. This way, a created program can be executed on a simulated robot before trying it on a real robot. Since robot programming is highly complex and needs a lot of knowledge about the surroundings, conventional methods are often very time consuming and error prone. One big problem is the need to deploy the program to a real robot every time for testing it and risking the robot getting damaged if there was a mistake in the program and the real robot, for example, collides with something. This can, in the worst case, be extremely expensive and might even stall production processes. The VAROBOT framework solves this issue by simulating the robot programs in AR, making it possible to see how the robot would react and move virtually in the real space the robot should operate in. This way, it is possible to see if there are any collisions and if the robot moves as expected. Furthermore, the real robot can even continue working undisturbed by the development, as a good part of testing out programs can take place in AR. If the robot program is completed in AR, there is also an option to deploy it directly to the robot.</p>
<!--
<div class="text-center">
<div class="">
<img src="assets/img/" width="910" alt="" class="img-fluid">
@@ -86,7 +89,8 @@
</div>
</div>
<br />
<p class="card-text" style="text-align:justify">We hope to help to make the programming more effective and less time- and resource consuming, as this would save a lot of costs. This might also support new developments in robots, since developing new solutions becomes more feasible. A reduction in development costs could even make robots become more accessible for small and medium enterprises.</p>
-->
<p class="card-text" style="text-align:justify">We hope to help making the programming more effective and less time- and resource consuming, as this would save a lot of costs. This might also support new developments in robots, since developing new solutions becomes more feasible. A reduction in development costs could even make robots become more accessible for small and medium enterprises.</p>
</div>
</div>
@@ -7,7 +7,7 @@
<meta name="author" content="" />
<title>Architecture - VARobot</title>
<!-- Favicon-->
-<link rel="icon" type="image/x-icon" href="assets/img/upbicon.ico" />
+<link rel="icon" type="image/x-icon" href="assets/img/upbicon.png" />
<!-- Font Awesome icons (free version)-->
<script src="https://use.fontawesome.com/releases/v5.15.1/js/all.js" crossorigin="anonymous"></script>
<!-- Google fonts-->
@@ -43,29 +43,18 @@
<div class="row justify-content-center">
<div class="col-lg-8 text-center">
<h2 class="text-white mt-0">Architecture</h2>
<hr class="divider light my-4" />
<hr class="divider my-4" />
</div>
</div>
<div class="cardalt text-white bg-orange" style="max-width: 50%;">
<div class="cardalt text-black" style="max-width: 50%;">
<div class="card-body">
<p class="card-text" style="text-align:justify">The SPEARED 2.0 framework is targeted for Head-mounted AR devices and handheld devices.</p>
<div>
<ul>
<li>Architecture Overview of SPEARED 2.0 For Head-mounted AR Devices:</li>
<p class="card-text" style="text-align:justify">The architecture of our framework is composed of 5 different layers as illustrated below.<br />The <b>Concrete Layer</b> consists of the robots and their simulation models. Currently, we only support two different robots: the Lego Mindstorms NXT and the Dobot Magician system. Our framework both supports a real and a simulated variant.<br />The second layer is the <b>Translation Layer</b>. In this layer, all components that handle the execution of robots are defined. We implement components that describe how to execute programs on the real robot and on the simulated one. Furthermore, there are components that describe how programs in our formalism can be compiled.<br />On the <b>Conception Layer</b>, we have the Core and the Code WebEditor packages. In the Core Package, the actual program structure and components are defined. The Code WebEditor package allows defining programs based on the blockly framework. The blockly program can be translated to a program by means of our Core Package.<br />On the <b>Visualization Layer</b>, we can observe components for on-screen UI, AR UI, and voice commands. They are interconnected because they basically all implement the same abstract UI.<br />In the <b>Utilities</b> package, all special features are collected that do not fit well to another category. Especially, this holds for Session Management and Location Selection. Location Selection is often also labeled as target sphere selection or also sometimes programming by demonstration.<br />For getting more detailed insight into the behavior of most of the components, consider the motivation section.</p>
<div class="text-center">
<div class="">
<img src="assets/img/architecture/head-mounted.png" width="910" height="910" alt="" class="img-fluid">
<img src="assets/img/architecture.svg" width="910" height="500" alt="" class="img-fluid">
+<figcaption>Architecture overview of the framework</figcaption>
</div>
</div>
-<br />
-<li>Architecture Overview of SPEARED 2.0 For Handheld Devices:</li>
-<div class="text-center">
-<div class="">
-<img src="assets/img/architecture/handheld.png" width="910" height="910" alt="" class="img-fluid">
-</div>
-</div>
-</ul>
-</div>
</div>
</div>
</section>
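To make the layer boundaries described above concrete, here is a minimal sketch of how the Concrete, Translation, and Conception Layers could relate. All names below are hypothetical illustrations, not identifiers from the VARobot code base.

    import java.util.List;

    // Conception Layer: a program produced by the Core package, e.g. translated
    // from a Blockly workspace in the Code WebEditor. (Hypothetical name.)
    final class Program {
        final List<String> instructions;
        Program(List<String> instructions) { this.instructions = instructions; }
    }

    // Concrete Layer: a robot, either real (Lego Mindstorms NXT, Dobot Magician)
    // or its simulation model. (Hypothetical name.)
    interface Robot {
        void moveTo(double x, double y, double z);
    }

    // Translation Layer: knows how to execute a Program on a given Robot, so the
    // same program can run in the AR simulation or on real hardware. (Hypothetical.)
    interface ProgramExecutor {
        void execute(Program program, Robot target);
    }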
Image files in this diff:

Website/dist/assets/img/K8.png (299 KiB)
Website/dist/assets/img/SICP.png (19.9 KiB)
Website/dist/assets/img/Seminar.jpg (43.4 KiB)
Website/dist/assets/img/UIDesign.png (115 KiB)
Website/dist/assets/img/Usabilitystudy.png (63.2 KiB)
Website/dist/assets/img/Voicecomm.jpeg (15.2 KiB)
Website/dist/assets/img/Websitedev.jpg (50.2 KiB)
Website/dist/assets/img/architecture_nottrans.png (181 KiB)
Website/dist/assets/img/codePanel.png (101 KiB)
Website/dist/assets/img/dobot_pg.PNG (18.1 KiB)
Website/dist/assets/img/dobot_pg_trans.png (17 KiB)
Website/dist/assets/img/dobot_raw.png (3.26 MiB)
Website/dist/assets/img/grab/1.png (202 KiB)
Website/dist/assets/img/grab/2.png (199 KiB)
Website/dist/assets/img/grab/3.png (194 KiB)