VR Workbench

360 Image Menu

image.png

What is a 360 image?

A 360, or panorama, image is a special type of image that lets the viewer experience being inside the scene in 3D. It simulates standing in the shoes of the photographer who took the picture, looking left, right, up, and down as desired, and sometimes zooming. The user can view the scene on a mobile device, on Facebook, in other desktop viewers, or embed it into a website.

image.png

How to create a 360 image?

Create a VR camera and place it in a suitable position near the center of the scene. In Render Output Settings, it is preferred to set the output ratio to 2:1, SPP to 3000, and the Rendering Mode to Interior Renderer with Light and Specular depths set to 100. Then select Fixed to File from the 360 Image menu.

This process is shown in the following tutorial.

Ways to share your 360 images 

  1. On Facebook
  2. On mobile, using SimLab VR Viewer
  3. On desktop, using SimLab VR Viewer
  4. Adding code to embed 360 images in your website

Sharing 360 images on Facebook

For 360 images to be automatically detected on Facebook, the width-to-height ratio should be 2:1; for example, if the width is 2000 pixels, the height should be 1000 pixels. Another restriction is the maximum image size of 6000 x 3000 pixels.
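
As a quick sanity check before uploading, the ratio and size rules above can be verified (and fixed) with a short script. This is only a convenience sketch, assuming the Pillow library is installed; the file names are placeholders, not part of SimLab Composer.

```python
# Sketch: check that a 360 render meets Facebook's 2:1 ratio and 6000x3000 limit.
# Assumes Pillow (pip install Pillow); file names below are just examples.
from PIL import Image

MAX_W, MAX_H = 6000, 3000  # maximum dimensions mentioned above

with Image.open("my_360_render.png") as img:
    w, h = img.size
    if w != 2 * h:
        print(f"Warning: {w}x{h} is not 2:1, so Facebook may not detect it as a 360 image.")
    if w > MAX_W or h > MAX_H:
        scale = min(MAX_W / w, MAX_H / h)          # shrink while keeping the ratio
        new_size = (int(w * scale), int(h * scale))
        img.resize(new_size, Image.Resampling.LANCZOS).save("my_360_render_fb.png")
        print(f"Saved a resized copy at {new_size[0]}x{new_size[1]} pixels.")
```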

Sharing 360 images on Mobile

Using the free SimLab VR Viewer: with a 360 image created as a *.png file, transfer the image to a mobile device, and download File Manager (Mobile, AsusTek Computer Inc.).

image.png

image.png

image.png image.png

For more details and examples of how to embed a 360 image in a website, visit these links: Link_1, Link_2

Refer to the SimLab 360 Technology page.

Multiple 360s

Multiple 360s allow users to view 360 images that have been created and move among them in VR Viewer.

To use Multiple 360s, follow the steps below:

  1. Create a VR Camera from the 360 Image menu and move it to the proper location, then duplicate the camera to add multiple cameras to the scene (at least two).
  2. Select Multiple 360s from the 360 Image Menu to render multiple 360 images.
  3. When the renders are finished, the render result will appear in the Properties dialog of the VR Camera under the 360 section. There the user can preview and change the 360 images, make the camera position appear behind walls (it is Not Visible by default), and move among the 360 images in the VR Viewer.

    image.png


  4. To view multiple 360s in the VR Viewer, select Show in Viewer from the VR Viewer menu. This opens the Viewer Modes dialog; select Desktop, then select 360 from the Start Viewer dialog.


    image.png

    image.png


360/Panorama Grid (How to create tutorial):

Teleportation Points

Use Teleportation Points and VR cameras to navigate large scenes quickly, by activating the X-ray view in the SimLab VR/Desktop Viewer.

This guide shows the user how to walk through and teleport within a scene:

High Dynamic Range Image (HDR) Creation

HDR (High Dynamic Range) images, also called environment images, are one of the preferred methods for lighting a 3D scene for rendering.

Originally these images were generated using specialized cameras or specialized software. Due to the processes involved in generating HDRs, they usually carry licenses that prevent free distribution.

SimLab Composer supports creating: 3D scene HDRs, Studio HDRs, and HDR Studio.

image.png

Generating HDRs with SimLab Composer

SimLab Composer makes it easy to generate HDR images, with the freedom to distribute them like any other rendered image created in SimLab Composer. With SimLab Composer, the user can create multiple 360 renders from the VR Viewer faster and more easily, without consuming the device's processor.

View the following tutorial to learn how to create HDR images using SimLab Composer.

Sharing HDRs

If you generate a special HDR that you want to share with other SimLab Composer users, please send an email to support@simlab-soft.com and we will be glad to share it with the community.

VR Effects / Media Menu

This menu allows the user to add visual effects and dynamic elements to the VR Experience.

It is organized into the following groups:

 image.png

Particles Group  

Includes the following visual effects supported in SimLab Composer/VR Studio:

Make Fluid
Create Fire
Create Smoke
Create Spark 
Create Shower
Create Steam
Create Wire/Rope
Create Hose
Create Spring

Make Fluid

Enables users to convert selected 3D shapes into fluids. To learn more about this feature check the following tutorial:

Create Fire and Smoke

Enables the user to add a fire effect to the scene; this can be a large trailing fire, a stove, or a candle.
The fire effect automatically includes a smoke option to be shown with the fire. If the user wants smoke alone, without fire, Create Smoke can be used.

The following tutorial shows the process of creating fire and smoke in Composer/VR Studio:

Create Spark 

The spark effect can be used to show electrical sparks or a welding effect. The following tutorial shows how to create the spark effect:

Create Shower

A moving water or liquid effect in SimLab Composer or VR Studio can be created using the shower effect. It allows the user to control the color, speed, and strength of the water flow. The following tutorial shows the process of creating the shower effect:

Create Steam

Creates a steam effect in the VR Experience with the VR properties shown below. The user can change the steam strength and color in the Properties dialog.

image.png

Create Wire/Rope

Creates connecting wires in the scene, like connecting the soldering kit shown below with its electric plug.

Create_Wire2.png

The following tutorial shows the process of creating a wire:

Create Hose

Clicking this tool creates two points in the 3D area, a start point and an end point. Adjust them to the desired locations. For more about this tool, check this tutorial.

Create Spring

Creates a 3D spring model with the length specified by the start and end points selected by the user. A 3D Geom for the spring will be added to the Scene Browser with its two endpoints. These points can be selected and transformed, thus changing the shape/length of the spring. 

image.png

With the created spring selected, its Properties panel will appear, where the VR Spring properties Radius and Thickness can be edited.











Landscape Group

Make Water

Make Water works on one object or a group of objects; the process is as follows:

  1. From the Effects menu » select Make Water.
  2. Select 3D object(s) to become water, then click image-1641559302867.png

In the Scene Browser, notice that the icon for the 3D object(s) has changed to a water shape.

The water settings can be adjusted from the VR Water tab in the Properties dialog. Water settings are:

image-1640854173697.png

To remove the water effect, select VR Landscape from the top combo box in the Scene Browser. An 'X' mark will appear next to the water geometry; click it to remove the water effect.

Remove_water.png

Make Ground

Ground objects are objects the user can teleport to in VR. By default, the user does not need to set ground objects; any object with a suitable orientation (a normal close to the up vector) is considered ground.

If the user wants more control and wants to prevent others from moving all around the scene, Make Ground can be used to define the object(s) the user can teleport to. When the Make Ground tool is used, the default ground is turned off, and only objects defined as ground by the user are considered ground.
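
The default "normal close to the up vector" rule can be pictured with a small sketch. This is only an illustration of the idea; the up vector, the angle tolerance, and the data layout are assumptions for the example, not SimLab's internal values.

```python
# Illustration of "normal close to the up vector" deciding what counts as ground.
# The 30-degree tolerance and the up axis are assumed values for this example only.
import math

UP = (0.0, 0.0, 1.0)      # assumed world up vector
MAX_TILT_DEG = 30.0       # assumed tolerance for "close to up"

def counts_as_ground(normal):
    nx, ny, nz = normal
    length = math.sqrt(nx * nx + ny * ny + nz * nz)
    cos_angle = (nx * UP[0] + ny * UP[1] + nz * UP[2]) / length
    return cos_angle >= math.cos(math.radians(MAX_TILT_DEG))

print(counts_as_ground((0.0, 0.0, 1.0)))   # flat floor     -> True
print(counts_as_ground((0.2, 0.0, 1.0)))   # gentle slope   -> True
print(counts_as_ground((1.0, 0.0, 0.0)))   # vertical wall  -> False
```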

Push and hold the stick on a VR controller to activate teleportation mode, then release it to teleport to a valid destination.

image-1639765256812.png

image-1639765261188.png

Grass

Enables users to add a realistic grass effect to VR scenes. Selecting this VR effect opens the VR Grass Generator dialog.

image-1640855433922.png

The VR Grass Generator dialog allows the user to control the grass parameters shown above.

The following video shows a sample of Grass effect in action: 

The following tutorial shows how to use grass in Composer or VR Studio:

Media Group

Make Video

With Make Video effect, the user can convert any 3D object to a video player. The video should be an MP4 file.

The user can use a URL link, in which case the video will be streamed during the VR Experience, or a local link to a file on the user's machine. Local videos are automatically added to the VR Experience package, so they can still play when the VR Experience is distributed.

The following dialog shows the VR Video properties:

image-1639765420586.png

To learn how to use Video effect in VR, check the following tutorial:

Remove Video

To remove a video, click the "X" mark in the Scene Browser next to the 3D geometry containing the video.

image-1639765500793.png

The Video Action response in the VR Training Builder can be used to control video playback in VR. It can pause, play, toggle, or seek, as a response to an event. For more info about Video Actions in the VR Training Builder, check this tutorial.

Create Sound

This function adds a 3D Sound object to the scene and to the Scene Browser. Selecting the 3D Sound from the tree displays its properties in the Properties Panel, under the VR Sound tab. The 3D Sound object occupies a location in the 3D scene.

The user can determine whether the sound should be played at the start of the scene or not, and whether it is played once or in a loop.

The type of the 3D Sound can be either Spatial, which means the sound volume is automatically affected by the location of the 3D Sound in the scene, or Constant. A Constant sound, when played, has the same volume in all areas of the scene.

image-1640856128399.png

For more information about VR Sound check this tutorial
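
To make the Spatial versus Constant distinction above more concrete, the sketch below contrasts a distance-based volume with a fixed one. The inverse-distance falloff is only an assumed model for illustration; it is not a description of SimLab's actual attenuation.

```python
# Rough numerical contrast between Spatial and Constant sound volume.
# The inverse-distance falloff is an assumed illustrative model only.
import math

def spatial_volume(listener, source, base_volume=1.0, min_distance=1.0):
    d = math.dist(listener, source)
    return base_volume if d <= min_distance else base_volume * (min_distance / d)

def constant_volume(base_volume=1.0):
    return base_volume            # the same everywhere in the scene

source = (0.0, 0.0, 0.0)
for listener in [(1.0, 0.0, 0.0), (5.0, 0.0, 0.0), (20.0, 0.0, 0.0)]:
    print(listener, round(spatial_volume(listener, source), 2), constant_volume())
```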

Create Web Browser

Adds a flat-screen VR Web Browser element to the Scene Browser and the 3D area. In the Properties dialog, under the VR Web Browser tab, the user can input the URL for the browser to display.

image-1640856957969.png

The user can change the width and height of the Web Browser plane using the white points that appear on it when selected. The web browser can be placed on a PC screen, or a full wall in the VR Experience.

VR_WebBrowser.png

Create Surveillance Plane

The Surveillance Plane allows the user to view distant or hard-to-see parts of the scene using a VR Camera. The VR Surveillance Plane is linked to the camera, so it shows what the camera sees. The following tutorial shows how to use VR Surveillance.

Creating a VR Camera is available under the VR Viewer menu.

image-1640857427829.png

VR Slides Generator

Existing PowerPoint presentations or PDF files can now be easily utilized in VR Experiences. In a few clicks you will be able to get the presentation, and optionally a small podium version of it, for an amazing VR Experience.
The VR Slides Generator tool will automatically create the needed Training Builder blocks to make it easy to use slide switching as an event, for example to show/hide a 3D object that works well with your presentation. To learn more about this tool check this tutorial.

image.png




UI Panel Group

Create Variable Writer

This tool enables the user to track the value of a VR Variable in the VR experience.
Click Here to learn more about creating VR Variables.

The value of the variable is updated dynamically: whenever the variable value changes, the Variable Writer is updated to show the new value. The user can control the size and placement of the Variable Writer, so it can be aligned on a wall in the scene or on a monitor.

The user selects which variable to view, the color of the text, and the prefix text.

You can also change its horizontal and vertical text alignment, check "Max String Length" to keep the font size fixed, and finally check "Editable" if you want to be able to edit the contained variable directly while running the VR Experience in the Viewer.

Create Scene States Lister

The Scene State Lister automates the process of showing the Scene States in the scene and allows the user to switch between them during the VR Experience.

Adding a Scene State Lister adds it to the 3D area and the Scene Browser. The user can control the Scene State Lister parameters from the VR Scene State Lister tab in the Properties Panel, as shown in the following image.

image-1640859822645.png

The following tutorial shows how to use Scene State Lister in a VR Experience:

Interactions Menu

Allows users to define interactions in VR scenes.  

        image.png

Grabbing Group

Make Grabbable

Makes the selected object(s) grabbable in the scene. A grabbable object is an object that can be grabbed by a controller in a VR Experience, or by using the desktop hand, and moved from one place to another.

image-1640597943069.png

Grabbable objects are clearly highlighted in the Scene Browser.






Grabbable objects are highlighted in the Desktop or VR viewing modes with an orange or red highlight, as shown in the following image:

image-1640598102544.png

Make Grabbable Sequence

Grabbable sequences allow users to interact with 3D object(s) in the Viewer by linking them to animation sequences. In the image below, an animation to control the arm was saved as a sequence, then linked to the arm object using Grabbable Sequence. When the arm is grabbed in the Viewer, the path appears and the user can move the grabbed object along it.

image-1639835396166.png

The following tutorial shows how to create and use a Grabbable Sequence.

 

Dynamics Group

Enable Physics

The physics feature allows an object to have physical properties such as gravity, mass, and friction. To add physics to an object, select Enable Physics, then select the object you want.

Make Climbable

With this feature, selected objects in the scene, such as ladders, can be made climbable. Just click Make Climbable and select the object you want.

Check this tutorial for more about these two features.

 

To adjust the Density and Restitution properties, which impact how objects behave during physics simulations, follow these steps:

1- Choose the material(s) applied to the physics objects.
2- Navigate to the Physics properties.
3- Update the Density and Restitution values according to the instructions illustrated in the accompanying image.

image.png

The Training Builder gives the user the option to enable/disable physics on physics objects in the VR Experience.

The response for doing that is shown in the following image:

image.png


Clipping Planes

image-1639838990390.png

Also known as Section Planes, in VR they can be static, animated, grabbable, and turned on/off dynamically. By defining the Clipping Plane and the Clippable Objects, the user can create clipping effects for some or all objects in the scene. The clipping plane can be static, or it can be animated, revealing the Clippable Objects as it moves. The Clipping Planes tutorial shows how to use Clipping Planes and assign Clippable Objects.

From the VR Clipping Plane tab in the Properties Panel, the user can choose to show or hide the clipping plane. The direction of the clipping plane's normal determines the clipped section of the model.

image.png

Disassembly and Docking Group

Make Dock and Target

The Dock is the static object, and the Target is the (grabbable) moving one, so the Target needs to be made Grabbable first. After selecting this function, the user will be asked to Select Dock Node, then to Select Target Node, and finally to configure the object orientation.

In the VR Experience, the user grabs the Target; when it is released (grab ends) while intersecting the Dock, it is moved to the docking target's predefined position.

Remove Dock and Target

Selecting this function will prompt the user to Select Dock Node and Select Target Node; the relation will then be removed.

The following tutorial shows how to create Dock and Target interaction in SimLab Composer and Studio:

Manage (Dis)Assembly Process

If the assembly/disassembly process is linear, meaning it follows a defined order, this function can be used. It simplifies the process and constrains it to a very specific scenario. Highlights guide the user throughout the assembly/disassembly process with ease and without any confusion. A green highlight on an object in an assembly indicates the currently available part of the system that can be removed. A yellow highlight indicates that the object can be grabbed.

image-1639839159650.png

image-1639839164279.png


image-1639835151251.png

image-1639835155701.png

A blue highlight in the assembly indicates that the object can be placed there.

Press the right mouse button to toggle the hand. The hand has a fixed position in the view; it will teleport to the grabbable object when you click on it and will keep grabbing as long as you hold the button.

The assembly management interactions and settings:

Note: The Training Builder can be used to create a more open assembly/disassembly process. It can be used to define the logic for the processes, giving the user more control without forcing them to adjust the model structure in the Scene Browser or go through a single fixed path.


VR Assembly System

The new Assembly/Disassembly System was released in V14 of SimLab Composer and Studio (Ultimate).

Key Features of the New System:

Comprehensive Assembly/Disassembly Support: Seamlessly assemble and disassemble components with precision and flexibility.

Logical Order Enforcement: The system ensures a logical process, such as requiring screws to be removed before detaching the parts they secure. However, the sequence of removing the screws is flexible; take them out in any order you prefer.

Interchangeable Parts Recognition: Recognizes parts like washers and screws as interchangeable, allowing substitutions as needed.

Tool Integration: Supports the use of tools to remove or place objects, making the process more realistic and efficient.

Optional Ghost Guides: Enable ghosted visuals to easily identify the correct placement for objects, simplifying the assembly process.

This system is robust enough to handle even large and complex assemblies, as demonstrated in the video below.


Industries Solutions

SimLab Composer continuously adds new features to make its users' experience easier and more enjoyable.

Door/Cabinet Animation

Automatically creates an animation for an object depending on its type. This can be useful for scenes with multiple objects of the same type. There are two main methods for defining which objects to animate:

image-1639846169901.png


Object Selection

With this method, the user will manually select the components in the scene to create an animation for.
This method is suitable for scenes with a small number of objects. Check the following tutorial on automatic animation for Doors/Cabinets using the Object Selection method.

Object Name

With this method, the user will type in the name of the components, and the software will automatically find them in the scene and list them.
This method is more suitable for scenes with a large number of objects to animate. Check this tutorial on automatic animation for Doors/Cabinets using the Object Name method.

Doors and Cabinets Settings

image-1640601023394.png

Visualize Scene Options

Visualize scene options (scene states and animation sequences) by displaying them as entities in a 3D world for easier and more practical VR Experiences. Check this tutorial on how to visualize scene options in VR.

image-1639846690281.png

Press "3" on your keyboard in the VR Viewer to show available Pins, "LBUTTON" to select Scene States/Animation Sequences that appear as entities from the Pins.

image-1639848777780.png

1 Select a Scene Node from the Objects Tree or from the 3D area, then click this option to add a Pin to the selected node, showing its scene states when the user interacts with it in SimLab VR Viewer.
2 Remove a Trigger and eliminate any associated Pins and scene options.
3 The list of Triggers.
4 Activates the Scene States library so Scene States can be dragged and dropped from it into Pin nodes.
5 Activates the Animation Sequences library so Animation Sequences can be dragged and dropped from it into Pin nodes.
6 Add a new Pin to the selected node to hold other scene options chosen by the user. Once added, either Scene States or Animation Sequences from their corresponding libraries can be dragged and dropped into it.
7 Remove the selected Pin node from its diagram.
8 Scene States/Animation Sequences can be dragged-and-dropped here from their corresponding libraries.
9 Added Scene States/Animation Sequences
Visualize Scene Options - Step by step
  1. Create multiple Scene States/Animation sequences (at least two), for a 3D model.
  2. From the Interactions menu click Visualize Scene Options.
  3. Select the object for which to make a list (e.g., Bedside Table): select the table model, then add a Trigger by clicking image-1639851544234.png


  4. The object name (Bedside Table) should appear on the Pin and Scene Options side as shown in the image.

    image-1639851925566.png

  5. Drag and drop the Scene States/Animation Sequences created in step one from their libraries into VR List1, one by one.
    1. image-1639852036367.png
  6. The Scene States/Animation Sequences will appear under VR List1.
  7. Select List1 to display its properties, where its name and other properties can be changed.

    image-1639852422955.png


Snap and Socket

The VR Snapping feature makes building scenes in VR easier and more intuitive. Whether you're designing a kitchen or creating an educational lab, VR Snapping allows you to effortlessly add items and align them perfectly without any extra effort. Setting up VR Snapping in Composer/Studio is simple and gives you full control over how it behaves in the VR environment. Watch the video below to learn how to use VR Snapping.


XR Anchors

When viewing scenes in XR mode on your Quest headset, you'll be able to attach 3D models to specific physical locations in the real world.

This feature enables you to:
Place training models in actual examination rooms.
Add visual highlights to real-world objects.
Visualize furniture placement in your space.

Watch the tutorial below to see XR Anchors in action.


User Gadgets Group

Add User Gadgets

This function will add some gadgets to the Viewer start model, as shown in the dialog below:

image-1640602517072.png

Adjacent Objects: These are objects that move with the Viewer Start model in the VR environment. They can be tools such as a screwdriver or a hammer. Check this part of the tutorial for more about Adjacent Objects.

Controllers: This allows the user to replace the VR controllers with selected 3D models, such as hands or gloves. Controller appearance is only available in VR mode. The following video shows the effect of using Controllers in the VR Experience.

The following tutorial shows how to use the Controllers gadget in SimLab Composer/Studio.



Hud: The Hud is a plane shown in front of the user, making it easy to display instructions or the current status. The following video shows the Hud appearance in a VR Experience.

The following tutorial describes how to add and use the Hud and Adjacent Objects.

The Change Equipment State response in the Training Builder can be used to control the equipment state (Equipped or Unequipped) for all gadgets (Adjacent Objects, Controllers, and Hud):

image-1640603991677.png

Avatar Generator

In this dialog, users can build their own avatars from a collection of Faces, Uniforms, Headwear, and Glasses. Avatars are used in VR Collaboration in SimLab VR Viewer. For more about avatar types and creation, check this tutorial.

image.png  image.png

image.png image.png

Voice Commands

To improve VR-based training that involves interacting with humans in the real world, voice command support has been added.

Voice commands are new events that are fired when the corresponding phrase is spoken in the experience. SimLab Composer will support voice commands in 20 languages.

The following example shows how voice commands can be used in the VR experience.

The following tutorial shows how to add voice commands in SimLab Composer/Studio: 

AI Agents

image.png

The user can make an AI agent work with 3D Avatars by describing the agent's role, then talk to it and get answers in different languages.

The following image shows how to set your OpenAI API key in the Viewer.

image.png

To learn more about this feature, check the following tutorial.


Find & Control

Searchable Components 

For a complex CAD model with a good tree structure, this feature can be very useful. It allows users to search for objects, then show/hide, isolate, highlight, and grab them while running the experience in the Viewer.


Training Builder Menu

Training Builder is a visual tool that allows VR Experience designers to control the logic in the VR Experience without the need to write any code.

The Training Builder allows the user to link an Event with one or more Responses.

An Event is fired when something happens in the experience, for example when the user clicks on a specific object, or when an object collides with another object. A Response is a reaction the VR Experience should do when an Event takes place.  

For example, when the user clicks on the Globe object (the Earth_geo Scene Node) in a Node Triggered Event, it starts the rotation sequence through a Play Sequence Response, as shown in the following image:

image.png

There is no limit to the total number of elements in the Training Builder; the user can add as many elements as needed across diagrams.
Each diagram, however, can hold up to 300 elements, to keep things organized for big projects. For small projects one diagram should be enough; for larger projects the user should organize the work across multiple diagrams, keeping up to 300 elements in each.


Creating your first VR experience can serve as a good first interaction with the Training Builder to see it in action: 



The following video provides more details about using Training Builder: 



SimLab Academy (in the Library panel), under VR Creation, includes a section dedicated to Training Builder tutorials.

image.png

Variables and Expressions

The Training Builder enables users to add variables and use them when creating VR Experiences. Check this tutorial about using variables in the Training Builder.

Supported variables are of the following types:

String Variable: Stores object names, message data, etc.
Number Variable: Supports both integer and float numbers; can be used for calculating and storing values.
Time Variable: Can save the time at any stage of the VR Experience.

String Variables

The initial value can be set to any string in the variable editor in the Training Builder.
When used in a Change Variable response, the string should be surrounded by single quotation marks (' ').

Expression-supported operations include adding (concatenating) strings and taking substrings.

Number Variables

The initial value can be set to any float or integer number. The following operations are supported for number variables:
Increment: Adds one to the current value.
Decrement: Subtracts one from the current value.
Time Difference: Calculates the difference in seconds (to millisecond precision) between two time variables.
Expression: A large number of expressions are supported; the list of supported expressions can be found in the following link. The following tutorial shows how expressions can be used in the Training Builder:

 

Time Variables

The initial value of every time variable is set to the start time of the VR Experience. At any point during the VR Experience, the user can capture the current time and store it in a time variable.
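
As a rough analogy only (this is plain Python, not SimLab's expression syntax), the sketch below mirrors how the three variable types behave: single-quoted strings that can be concatenated, numbers that can be incremented and decremented, and time values whose difference is taken in seconds with millisecond precision.

```python
# Rough Python analogy of the three Training Builder variable types.
import time

# String variable: note the single quotes, as in a Change Variable response.
message = 'Step '
message = message + 'completed'        # adding (concatenating) strings

# Number variable.
attempts = 0
attempts += 1                          # Increment: adds one
attempts -= 1                          # Decrement: subtracts one

# Time variables: capture two moments, then take the Time Difference in seconds.
start = time.time()
time.sleep(0.25)                       # stand-in for whatever happens in between
end = time.time()
elapsed = round(end - start, 3)        # seconds, with millisecond precision

print(message, attempts, elapsed)
```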

Variable writer

This tool enables the user to track the value of a variable in the VR Experience

The value of the variable is updated dynamically, so whenever the variable value changes the variable writer will be updated to show the new value of the variable. Variable Writer is available under the VR Effects Menu.

The user can control its size and placement, so it can be aligned on a wall in the scene or on a monitor, and can select which variable to view, the color of the text, and the prefix text.

Create/Edit Variables

Variables can be created and edited by clicking the Edit Variables button at the top right of the Training Builder. It allows the user to view, delete, or change the initial values of existing variables, or create new variables.

Clicking '+' at the top right corner of the Edit Variables and Initial Values dialog will add a new variable. The user can click to change the name of the variable, select its type from the combo box, and set its initial value.  

image-1640786112008.png

Inputs

Inputs are entities used by blocks in the Training Builder diagram. For example, if multiple blocks in the diagram use the same Scene Node, it can be added as an input and connected to multiple blocks; updating the input once will then be reflected in all blocks using it.

image.png

Icons / Input Names

image.png

Sequence

image.png

Scene State

image.png

Scene Node (object)

image.png

String

image.png

Variable

image.png

Action

image.png

Boolean

image.png

Hand

Events

Events are triggered when something happens in the VR Experience.

image.png

The following image shows how to use Scene Start Event, which takes place as soon as the VR Experience starts, to play Sound Action Response.   

image.png

Icons / Event Names

image.png

Scene Start

image.png

Scene State Applied

image.png

Sequence Ended

image.png

Node Entered Node

image.png

Node Exited Node

image.png

Hand Entered Node

image.png

Hand Exited Node

image.png

Voice Command Recognizer 

image.png

Node Triggered

image.png

Node Grab Started

image.png

Node Grab Ended

image.png

Delay

image.png

Grip Pressed

image.png

Hand Entered Hand

image.png

Hand Exited Hand

image.png

Voice Command Test

To learn more about the Grip Pressed event, check this tutorial.

Responses

Responses are defined reactions to Events. To link a Response to an Event, the user simply connects the Execute channel from the Event to the Execute channel of the Response. The user can connect the output Execute channel of a Response to the input Execute channel of another Response to guarantee the order of execution and to link multiple Responses to an Event.

image.png

The diagram below shows the responses for clicking on an object (Node Triggered):
1- Play a Sequence
2- Wait for 2 seconds
3- Hide an object from the scene 

image.png

Icons / Responses List

image.png

Apply Scene State

image.png

Change Node Grabbable State

image.png

Play Sequence

image.png

Stop Sequence

image.png

Show/Hide

image.png

Show Message Box

image.png

Teleport to Camera

image.png

Change Node Action

image.png

Sound Action

image.png

Video Action

image.png

Report user-defined measurement

image.png

Active Quiz/Survey 

image.png

Change Equipment State

image.png

Change Grabbable Sequence  

image.png

Enable/Disable Physics

image.png

Vibrate Controller

image.png

Advanced Change Variable Value

image.png

Branch (Checks value, if true follows one path, if false follows the other)

image.png

Get Attribute Value

image.png

Set Attribute Value

image.png

Glow Object

image.png

Un-glow Object

image.png

Point To Object

image.png

Remove Point To Object

image.png

Get Position

image.png

Set Position

image.png

Fall to Surface

image.png

Reset Rotation

image.png

Set Parent

image.png

Reset Parent 

To learn more about Vibrate response check this tutorial

Booleans

Booleans include two groups: States and Operations.

image.png

States are similar to events, but the current state can always be read from them. For example, in the following image, Objects Overlap is shown: when the two objects overlap you get an OnTrue execution, and as soon as they no longer overlap you get an OnFalse execution. The output Boolean value can be checked at any point to see whether they are overlapping or not. OnTrue and OnFalse are only fired when the state changes.

image.png

Operations allow running Boolean operations on Boolean values. Supported operations include And, Or, and Not.
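
The "only fired when the state changes" behavior of States described above is essentially edge triggering. The sketch below is a plain-Python analogy of that idea; the per-frame Boolean values are made up, and in SimLab the overlap test comes from the 3D scene itself.

```python
# Analogy of a State: OnTrue / OnFalse fire only when the Boolean value changes.
previous = None

def update_overlap_state(overlapping):
    """Call once per frame with the current Boolean value of the state."""
    global previous
    if overlapping != previous:
        print("OnTrue fired" if overlapping else "OnFalse fired")
    previous = overlapping
    return overlapping          # the output Boolean can still be read at any time

for frame_value in [False, False, True, True, True, False]:
    update_overlap_state(frame_value)
# Prints three lines in total: OnFalse fired, OnTrue fired, OnFalse fired.
```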

Icons / Booleans List


image.png

Compare Variable Value

image.png

Objects Overlap

image.png

Object is Grabbed

image.png

And Operation

image.png

Or Operation

image.png

Not Operation

Templates 

The Training Builder works great for describing multi-step training scenarios. But what if the same behavior is repeated for a class of objects? For example, if the user grabs any of the tools in the training, the response should be for it to fall to the ground; this is when templates are used.

To create a Template diagram in the Training Builder, click the Create Template button at the top right side of the Training Builder.

image.png

After clicking Create Template, a new Template Diagram is added. A Template Diagram differs from a regular Training Builder diagram in its orange background marks and in the Template Scene Node block added to it. This block cannot be deleted.

image.png

The Template Scene Node block selects a group of nodes based on a query that can include one or more rows connected with And or Or, as shown in the following image.


image.png

Logic connected to the Template Scene Node block will be applied to each node that satisfies the selection query. For example, in the previous image, each object having "box" in its name and "wood" in the value of its material attribute will be selected.
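
In plain terms, the query behaves like a filter over the scene nodes. The short sketch below mirrors the example query ("box" in the name AND "wood" in the material attribute); the node list and attribute names are invented purely for illustration.

```python
# Analogy of the Template Scene Node query: name contains "box" AND material contains "wood".
# The nodes below are made-up stand-ins for objects in the Scene Browser.
nodes = [
    {"name": "box_01", "material": "wood"},
    {"name": "box_02", "material": "metal"},
    {"name": "crate",  "material": "wood"},
]

selected = [n for n in nodes
            if "box" in n["name"] and "wood" in n["material"]]

print([n["name"] for n in selected])   # the template logic would run on each match
```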

The following tutorial shows the power of Template Diagrams in the Training Builder:

Advanced Features 

Copy Part of a Diagram

To repeat the same logic for more than one object (in case templates do not do the job), part of the diagram can be copied. This is done by using the left mouse button to highlight the part of the diagram to copy; while the section is highlighted, press CTRL + C to copy it, then CTRL + V to paste. After that, the Scene Node object needs to be changed, along with any other block that needs updating.

image.png

Drag/Drop of the Scene

To add inputs like a Scene State or a Sequence, the following process is used:

1- Click on Sequence from the toolbar
2- From Sequence Library select the desired Sequence
3- Click attach Sequence 

As shown in the following image

image.png

If you already know the Sequence or the Scene State you want, you can directly drag it from the Library and drop it onto the Training Builder.

image.png

Right Mouse

Right-clicking in the Training Builder shows all toolbar elements, organized, so you can add any block without moving the mouse to the toolbar.

image.png

You can also type a few characters in the filter to find an element quickly, as shown in the following image:

image.png

Attributes 

Attributes can be used with the Training Builder in many ways.

It is a good idea to be familiar with attributes in order to create advanced VR Experiences. To learn more about attributes, check the following tutorial:

Enhancing Medical Training with VR Palpation Simulation

The Training Builder Hand source and events for VR medical simulations empower VR Experience designers with unprecedented control over advanced hand-skills training, particularly in processes like patient palpation.

Watch the demonstration in the video below to witness the immersive and customizable experience:

Tools to organize Training Builder experience 

Some users create large and complex training and virtual reality (VR) Experiences using the Training Builder. To help manage these complex diagrams, the following tools are provided:

image.png

Notes: Use notes to describe the logic, making it simpler for both you and others to understand and update the training builder diagrams in the future.

Groups: You can organize nodes into groups and give them clear names. This makes it easier to find and update the logic of your training builder.
Search: Suppose you've improved an animation sequence and want to replace the old one in the training builder. Use the search tool to find all instances of the old sequence and replace them with the new one.

image.png


Dynamic Builder

In Template diagrams, a user can define logic for a group of objects based on name or attribute criteria. Dynamic diagrams take this to the next level: they allow defining interactions between multiple templates in a dynamic way. This means that a user can set dynamic criteria for triggering actions and responses, mainly by defining multiple queries using multiple scene nodes.

When you have access to the Dynamic Builder, the default diagram will be a Dynamic diagram; to add a new one, click the highlighted Add a new dynamic diagram button at the top right side of the Training Builder diagram.

Add dyn.jpg

A new Dynamic Diagram will open, and the Dynamic Diagram menu will appear.

Dyn new.JPG

Dyn menu.JPG

The Dynamic Training Builder menu is divided into four groups:

Sources

A Source is an expression that generates a value that can be used to trigger an event in the Dynamic Builder. Sources are divided into two tabs: Main Sources and Template Sources.

Main Sources 

image.png

Icon / Source Name

image.png

Scene Node

image.png

Scene State

image.png

Expression 

image.png

Number 

image.png

Boolean

image.png

Hand

image.png

Sequence 

image.png

Variable 

image.png

String

Template Sources

image.png

Icon / Source Name

image.png

Scene Node Query

image.png

User Query

Events 

Events are triggered when something happens in the VR Experience. They are arranged into two tabs: Main Events and Object Interactions.

Main Events

image.png

Icon / Event Name

image.png

Scene Start

image.png

Node Triggered

image.png

Scene State Applied

image.png

Sequence Ended

image.png

Node Grab Started

image.png

Node Grab Ended

image.png

Grip Pressed

image.png

Grip Released

image.png

Voice Command Recognized

image.png

Ticker

Object Interactions

image.png

Icon / Event Name

image.png

Node Enter Node

image.png

Node Exited Node

image.png

User Enter Node

image.png

User Exited Node

image.png

Node Hover Started

image.png

Node Hover Ended

image.png

Hand Enter Node

image.png

Hand Exited Node

image.png

Hand Entered Hand

image.png

Hand Exited Hand

Responses

Responses in the SimLab Dynamic Builder are arranged in a number of tabs to make them easier to use.

Main Responses

image.png

Icon / Response Name

image.png

Play Animation Sequence

image.png

Apply Scene State

image.png

Set Parent Node

image.png

Reset Parent Node

image.png

Set Node Grabbable State

image.png

Delay

image.png

Branch on Expression

image.png

Loop

image.png

Show/Hide

image.png

Set Node Glow State

image.png

Enable Controller Vibration

image.png

Disable Controller Vibration

Objects Behavior

image.png

Animation Sequences 

image.png

Numbers

image.png

String

image.png

Time Variable

image.png

Media

image.png

Attributes 

image.png

User

image.png

States

image.png

The following video shows this behavior in action. If you do not yet see why this is useful, do not worry about it for now; in the future, when you create more dynamic experiences, you will be glad to have it at your disposal.

The following diagram shows how Loop end is used. It is combined with the fact that, in the Dynamic Builder, Show Message strings can also be dynamic, so we can show a message that depends on the execution of the experience.

image.png


VR Catalog Menu

image.png

Generate Catalog

With this feature, users can insert ready-made models, such as tables, chairs, doors, trees, and much more, into the scene directly in the VR Viewer, without the need to go back to SimLab Composer and run the Viewer again. New models can also be added to the Catalog and shared with others online.

image.png

To create a VR Catalog in SimLab Composer/Studio for the VR Viewer, follow the steps below:

  1. Export your scenes/models as *.vrpackage files into the same folder on your device, as shown in the screenshot below:

    image.png


  2. Create a preview image (PNG) for the catalog to display in the Catalog menu in the VR Viewer; a square image of 256x256 pixels is recommended (a small sketch for preparing such an image follows these steps).
  3. Select Generate Catalog, select the folder containing the VR Packages, and select the preview image. Give the catalog a name and click Generate. The new "*.vrcatalog" file will be created in the same folder where the "*.vrpackage" models are saved.

    image.png


  4. From the object tree select Environment, then in the Properties dialog, VR tab, select Add from file (the folder icon), then select the created (*.vrcatalog) file.

    image.png
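
For step 2, any image editor works; the snippet below is just a convenience sketch for producing a square 256x256 PNG preview, assuming the Pillow library is installed (the file names are placeholders).

```python
# Sketch: turn a screenshot/render into a square 256x256 catalog preview (step 2).
# Assumes Pillow (pip install Pillow); file names are examples only.
from PIL import Image, ImageOps

with Image.open("catalog_preview_source.png") as img:
    # Crop to a centered square, then resize to exactly 256x256 pixels.
    preview = ImageOps.fit(img, (256, 256), Image.Resampling.LANCZOS)
    preview.save("catalog_preview.png")
```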

The (*.vrcatalog) file can be uploaded online to a website, and its URL can be added by pressing the plus icon.

image.png

Now all models in the Industrial catalog we created will appear in the Catalog menu in the VR Viewer; just click a model once to insert it into the scene.

image.png

Generate Catalog List

In this dialog, the user can add more than one (*.vrcatalog) file from a local computer or from a website (as a URL link), then save the result as a *.vrcataloglist file. This catalog list can be loaded into other scenes to be used in the VR Viewer.

image.png

The tutorial below shows how to use models from the VR Catalog in SimLab VR Viewer.

The tutorial below shows how to load and use more than one VR catalog in SimLab VR Viewer.

You can visit the VR Catalog page on our website via this link.








VR Viewer Menu

The VR Viewer menu is where the user can export 3D scenes to the *.vrpackage file format and view them in either Desktop or VR mode.

 image.png 

Show in Viewer

SimLab VR Viewer enables users to experience rich and interactive environments in three navigation modes:

image-1639929466797.png

Texture Optimization 

More information about Texture Optimization can be found here.

Settings

Optimize Package: Merges all objects sharing the same material into one object, which speeds up and smooths the experience significantly, especially when the model has a large number of objects. Optimize Package affects static objects only; objects that have animation or scene states, or that are part of any interaction, are excluded from the process, so all interactions in VR will keep working without problems (a rough sketch of this grouping idea follows the image below).
VR Viewer Only: When checked, the exported VR Package cannot be imported back into SimLab Composer.
Enable VR Menu: The experience designer can control what the user running the VR Experience will be able to view, and how the user can interact with the scene in the VR Viewer.

image-1639929500514.png
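
The grouping idea behind Optimize Package can be pictured with a tiny sketch. This is only an analogy of the concept described above, with an invented object list; it is not SimLab's actual merging code.

```python
# Analogy: static objects sharing a material are grouped so each group can be
# merged into a single object; animated/interactive objects are left alone.
from collections import defaultdict

objects = [
    {"name": "chair_1", "material": "oak",   "static": True},
    {"name": "chair_2", "material": "oak",   "static": True},
    {"name": "door",    "material": "oak",   "static": False},  # animated -> excluded
    {"name": "lamp",    "material": "steel", "static": True},
]

groups = defaultdict(list)
for obj in objects:
    if obj["static"]:
        groups[obj["material"]].append(obj["name"])

for material, names in groups.items():
    print(material, "->", names)   # each group would become one merged object
```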

Open Viewer 

Opens the stand-alone VR Viewer, which can then be used to open a VR Package or to share a VR Package with others.


VR Environment

image.png

Displays the Properties dialog at the right side of the application window. In this dialog, with its different tabs, the user can change settings for the application and the VR environment.

We will explain some settings whose meaning may not be obvious:

Under the HDR tab, "Auto Adjust HDR Brightness (VR)" puts a limit on the brightness of the HDR when running the VR Experience in the Viewer; if you want high brightness in the Viewer, you should disable this setting.

The Ground tab affects only Rendering and thus is only found in Composer.

Under the VR tab, let's clarify the following settings:
"Enable VR flying": Since users can't switch between navigation modes while running experiences in VR mode, this is the setting to make the VR Experience run with flying navigation mode On.
"Use Light Maps": Allows you to use baked lighting, either created in SimLab Composer or imported with the model. Light maps simulate lighting effects without requiring real-time calculations, reducing performance demands and ensuring a smoother VR Experience, especially on lower-end devices.
"Multiple lights on Standalone Devices": This affects artificial lights (lights that you add from the "Create" menu), it will enable or disable them when specifically running them on standalone devices, since these lights can heavily affect the performance on them.
"Enable Highlights": This will enable/ disable the highlights that you see on objects that you can interact with or grab. 
"Accurate Physics": 
If you have physics in you experience, this will increase the accuracy by making more calculations for the simulation of physics, but with a cost on performance.
"Adding Catalogs": Catalogs are specialized asset collections that can be created in SimLab Composer and uploaded online for easy access. This feature allows you to link external assets to a scene, enabling users to import them dynamically during the VR experience runtime. Unlike standard assets, catalogs can be stored and retrieved from an online repository, making them accessible across multiple projects and devices.


VR Display Settings

The user can adjust the display settings in the VR Viewer using the dialog below. Check this tutorial for more information.

image.png








Set Start Position

Loads the Viewer Start model; clicking any place in the scene after clicking Set Start Position will snap the Viewer Start model to the picked location. The Viewer Start model defines the starting place for the VR Experience viewer and the orientation the experience will start with.

image-1640952738805.png

If the Viewer Start model was already added to the scene, no new model will be added, and the user can directly click to snap the Viewer Start model.

Users starting the VR Experience without calling Set Start Position will automatically have the Viewer Start model added to the scene and will be able to set its location in the scene.

The Viewer Start model can be used in the Training Builder; for example, it is used in a Node Entered Node Event in a Training Builder diagram to turn lights on when a VR Experience user enters a room.

image-1640952473898.png

Create VR Camera

Create VR Camera adds a new VR Camera to the scene. The user should place the camera and set its orientation; the currently selected VR Camera appears in wireframe mode.

image-1640953465481.png

The user can place multiple VR Cameras in the scene. While running the VR Experience, the user can activate Camera Teleport mode by clicking on it or by using the shortcut (4), as shown in the following image:

image-1640954049910.png

When Camera Teleport mode starts, the scene turns into a ghost view and the VR Cameras are shown as white spheres. Hovering over one of the spheres shows the view from that camera's perspective, and clicking on the white sphere moves the user to the VR Camera's location.

image-1640954119879.png

A VR Camera can also be used in the Training Builder, in the Teleport to Camera response block. When this response is called, the viewer is moved from the current location to the location of the VR Camera connected to the response block.

image-1640954432540.png

When moving the viewer to a VR Camera, we can choose to align the viewer orientation with the orientation of the VR Camera. This is done by checking the Apply Orientation option in the VR Camera Properties Panel, as shown in the following image:

image-1640954703768.png

A VR Camera can also be linked to the Surveillance Plane under the VR Effects menu. To learn more about using a VR Camera with a VR Surveillance Plane, use the following link.

Export VR Package

Export the current scene as a VR Package (*.vrpackage), which can be opened using the free SimLab VR Viewer. When exporting the VR Package the user can set:

image-1640958886443.png

This information will be visible in the VR Viewer to help users find the correct VR Experience.

Export VR Catalog Item

VR Catalog Items are very similar to VR Packages and share the same extension (*.vrpackage). The only difference is that they do not require adding a Viewer Start model, as they are not intended to be used as stand-alone VR Experiences; they are created to be used in VR Catalogs, visible while running the VR Experience, like the one shown in the image below:

image-1640959394922.png

Creation of new VR Catalogs requires the Ultimate edition of SimLab Composer.

Export EXE File

EXE files can be exported from the Windows version of SimLab Composer. They are similar to VR Packages, as they include the current scene, and they can be shared with users who do not have the VR Viewer installed.
When the EXE is run, it will try to find the VR Viewer; if found, it will use it to open the scene. If the VR Viewer is not found, it will automatically install the latest version of the VR Viewer and then use it to open the scene, which makes it easier to share VR Experiences with new users.

VR Store

The VR Store is designed to simplify the process for VR Experience designers to market and sell their creations. In earlier versions, we tested a VR warehouse model, but based on feedback from designers, it was clear that they preferred licensing their VR Experiences instead of selling zim files. This allowed them to retain copyrights for the zim files, enabling them to build new experiences on top of them or customize them for specific users.

SimLab VR Store addresses these concerns by offering an alternative avenue for marketing and selling VR Experiences, while still maintaining the availability of traditional methods. This new feature is particularly advantageous for larger organizations, providing them with an easier way to sell their VR content and gain more visibility. Any experiences created using the free SimLab Composer and VR Studio, after undergoing the review process, will be showcased in the VR Store. These experiences can be either paid or free and will appear without watermarks. This flexibility is beneficial for new users who want to start showcasing their work without any initial financial investment, and it also caters to the needs of educational institutes.

The VR Store is intended for creators, and we welcome any notes, requests, or feedback from users. Your input is valuable in helping us refine and improve this platform to better serve the VR community. Please feel free to share your thoughts with us.

image.png

Check this article to learn how to protect your VR product and prevent Copyright infringement.

Viewer Settings

This option gives users quick access to viewer settings, to check and update those settings before exporting the VR Package.

image.png

Languages

image.png

This option allows the user to make the VR Experience support multiple languages. The user defines special objects for each language; those objects can be Objects, Videos, Sounds, Presentations, or Text.
When starting the experience, the user selects the language they prefer to run the experience in; all objects in the other languages are automatically removed, so the VR Experience runs in the selected language.

The following video shows how to use multiple language support


Mixed Reality Elements

The Quest 2 and Quest 3 now offer mixed reality support, enabling users to engage in interactive experiences while maintaining visibility of their actual surroundings.

With the Mixed Reality option, designers can specify the environmental elements within the experience. When users run the experience in Mixed Reality (MR) or Augmented Reality (AR) mode, these elements will be concealed, whereas they will be visible in other modes. This flexibility empowers designers to craft a unified experience that seamlessly transitions across Virtual Reality (VR), Mixed Reality (MR), or Augmented Reality (AR) modes.


image.png

The following tutorial shows how to use the MR elements:

In addition to the single-user MR mode, MR mode can be used in collaboration. The following tutorial shows how users located in the same room synchronize to work effectively in MR mode.

VR Evaluation

image-1639929586893.png

The VR Evaluation menu gives the user access to the full VR Evaluation process, which includes:

Creating Quizzes/Surveys
Distributing VR experiences to students
Viewing Results



The VR Evaluation chapter goes through the process of creating and distributing a successful VR Evaluation.