Three JS, Unreal, Unity. VR/AR/MR without 9 circles of hell
1. Three JS, Unreal, Unity.
VR/AR without 9 circles of hell
by Marina Kolesnichenko, Software Engineer at ElifTech
2. Three.js cookbook
Three.js is a cross-browser
JavaScript library/API used to create
and display animated 3D computer
graphics in a web browser.
Three.js uses WebGL.
3. Quick start
http://threejs.org/
Click the “Download” link on the left side of your screen. Once the zip has
finished downloading, open it up and go to the build folder. Inside, you’ll
find a file called three.min.js and, if you’re following along, you should copy
this file into your local development directory.
What about NPM?
npm install three
The npm version of the library is limited at the moment,
so I recommend downloading it from the official site instead.
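A minimal scene sketch, assuming three.min.js sits next to your page and is loaded with a plain script tag:

<script src="three.min.js"></script>
<script>
// Scene, camera and WebGL renderer
var scene = new THREE.Scene();
var camera = new THREE.PerspectiveCamera( 75, window.innerWidth / window.innerHeight, 0.1, 1000 );
var renderer = new THREE.WebGLRenderer();
renderer.setSize( window.innerWidth, window.innerHeight );
document.body.appendChild( renderer.domElement );

// A simple cube to prove the setup works
var cube = new THREE.Mesh(
    new THREE.BoxGeometry( 1, 1, 1 ),
    new THREE.MeshBasicMaterial( { color: 0x00ff00 } )
);
scene.add( cube );
camera.position.z = 5;

// Render loop
function animate() {
    requestAnimationFrame( animate );
    cube.rotation.y += 0.01;
    renderer.render( scene, camera );
}
animate();
</script>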
8. Basic
var loader = new THREE.OBJLoader();

// load a resource
loader.load(
    // resource URL
    'models/monster.obj',
    // called when the resource is loaded
    function ( object ) {
        scene.add( object );
    }
);
10. What?
Material Library File (.mtl)
Material library files contain one or more material definitions, each of which
includes the color, texture, and reflection map of individual materials. These
are applied to the surfaces and vertices of objects. Material files are stored
in ASCII format and have the .mtl extension.
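A hedged sketch of loading an .obj together with its .mtl, assuming MTLLoader.js and OBJLoader.js from the three.js examples folder are included and that models/monster.mtl exists (the file names are illustrative):

var mtlLoader = new THREE.MTLLoader();
mtlLoader.setPath( 'models/' );
mtlLoader.load( 'monster.mtl', function ( materials ) {
    materials.preload();
    // Reuse the parsed materials when loading the geometry
    var objLoader = new THREE.OBJLoader();
    objLoader.setMaterials( materials );
    objLoader.setPath( 'models/' );
    objLoader.load( 'monster.obj', function ( object ) {
        scene.add( object );
    } );
} );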
18. Shaders
A Shader is a piece of code that runs directly on the
GPU. Most modern devices have powerful GPUs
designed to handle graphics effects without taxing the
CPU.
Two kinds: Pixel Shaders and Vertex Shaders.
or Babylon...
19. Pixels
Pixel Shaders modify or draw the pixels in a scene. They are used to render a 3D scene into pixels and are also typically used to add lighting and other effects to a 3D scene.
20. Pixels
Shaders that draw an image or texture directly. These
types of shaders can be loaded into a
THREE.ShaderMaterial to give cool textures to 3D
objects.
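For example, a minimal THREE.ShaderMaterial sketch (the GLSL and the time uniform here are illustrative, not from the talk):

var material = new THREE.ShaderMaterial( {
    uniforms: {
        time: { type: 'f', value: 0.0 }
    },
    vertexShader: [
        'void main() {',
        '    gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );',
        '}'
    ].join( '\n' ),
    fragmentShader: [
        'uniform float time;',
        'void main() {',
        '    // Pulse the red channel over time',
        '    gl_FragColor = vec4( abs( sin( time ) ), 0.2, 0.2, 1.0 );',
        '}'
    ].join( '\n' )
} );

var mesh = new THREE.Mesh( new THREE.SphereGeometry( 1, 32, 32 ), material );
scene.add( mesh );

// In the render loop: material.uniforms.time.value += 0.05;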
21. Pixels
Shaders that modify another image or texture. These
allow you to do post-processing on an existing texture,
for example to add a glow or blur to a 3D scene.
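A sketch of such post-processing with the EffectComposer from the three.js examples (EffectComposer.js, RenderPass.js, BloomPass.js, ShaderPass.js and CopyShader.js have to be included separately; the pass parameters are illustrative):

var composer = new THREE.EffectComposer( renderer );
// Render the scene as usual first
composer.addPass( new THREE.RenderPass( scene, camera ) );
// Then add a glow on top of it
composer.addPass( new THREE.BloomPass( 1.3 ) );
// Copy the result to the screen
var copyPass = new THREE.ShaderPass( THREE.CopyShader );
copyPass.renderToScreen = true;
composer.addPass( copyPass );

// In the render loop, call composer.render() instead of renderer.render( scene, camera );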
30. Raycaster
this.raycaster = new THREE.Raycaster();
// Cast a ray from the centre of the screen through the camera
this.raycaster.setFromCamera( new THREE.Vector2(), this.camera );
// true = also test the object's descendants recursively
let intersects = this.raycaster.intersectObject( object, true );
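To pick the object under the mouse instead of the screen centre, convert the pointer position to normalized device coordinates first (a sketch; this.camera and this.scene are assumed to exist as in the snippet above):

var mouse = new THREE.Vector2();
window.addEventListener( 'mousemove', function ( event ) {
    // Normalized device coordinates: x and y in the range [-1, 1]
    mouse.x = ( event.clientX / window.innerWidth ) * 2 - 1;
    mouse.y = - ( event.clientY / window.innerHeight ) * 2 + 1;
} );

// On click or in the render loop:
this.raycaster.setFromCamera( mouse, this.camera );
var intersects = this.raycaster.intersectObjects( this.scene.children, true );
if ( intersects.length > 0 ) {
    console.log( 'Hit:', intersects[ 0 ].object.name );
}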
31. OSGJS
OSGJS is based on the OpenSceneGraph API, which itself is built around a few core concepts. Getting a solid grasp of those concepts once and for all is mandatory before diving into the code.
32. Difference
// Create a viewer bound to an existing canvas
viewer = new osgViewer.Viewer( canvas, { antialias: true, alpha: true } );
// A transform node for building the scene graph
rotate = new osg.MatrixTransform();
// Orbit camera controls
viewer.setupManipulator( new osgGA.OrbitManipulator() );
viewer.getManipulator().setDistance( 20.0 );
viewer.run();
33. Why not?
1. Low FPS and freezes
2. Limited scene complexity
3. Animation and shaders are difficult to work with
35. Unreal Engine uses C++ and Unity uses mostly C# or JavaScript. Unreal
Engine has Blueprint visual scripting. Technically you don't ever need to
write a single line of code.
36. All pointers held by objects inside a level should point to other objects in that level, and objects outside the level cannot hold pointers to objects inside it.
37. VR
Virtual reality (VR), which can be referred to as immersive multimedia or
computer-simulated reality, replicates an environment that simulates a
physical presence in places in the real world or an imagined world, allowing
the user to interact in that world.
42. AR
Augmented reality (AR) is a live, direct or indirect view of a physical, real-world environment whose elements are augmented (or supplemented) by computer-generated sensory input such as sound, video, graphics or GPS data.
43. VR, ok! AR, ok!
MR O_O?
Mixed reality is the merging of real and virtual worlds to produce
new environments and visualisations where physical and digital
objects co-exist and interact in real time. Mixed reality is an
overlay of synthetic content on the real world that is anchored to
and interacts with the real world—picture surgeons overlaying
virtual ultrasound images on their patient while performing an
operation, for example. The key characteristic of MR is that the
synthetic content and the real-world content are able to react to
each other in real time.
46. Overdraw
The Overdraw view allows you to see which objects are drawn on top of one another, which is a waste of GPU time. Try to reduce overdraw as much as possible. You can view overdraw in the Scene View by using the Scene View Control Bar.
48. Occlusion
Occlusion Culling stops objects from being rendered if
they cannot be seen. For example, we don’t want to
render another room if a door is closed and it cannot be
seen.
50. Rendering
Approximating your own distortion solution, even when it “looks about
right,” is often discomforting for users.
Any deviation from the optical flow that accompanies real world head
movement creates oculomotor issues and bodily discomfort.
Consider supersampling and/or anti-aliasing to remedy low apparent
resolution, which will appear worst at the center of each eye’s screen.
51. Minimizing Latency
Your code should run at a frame rate equal to or greater than the Rift
display refresh rate, v-synced and unbuffered. Lag and dropped frames
produce judder which is discomforting in VR.
Ideally, target 20ms or less motion-to-photon latency (measurable with the
Rift’s built-in latency tester). Organize your code to minimize the time from
sensor fusion (reading the Rift sensors) to rendering.
Decrease eye-render buffer resolution to save video memory and increase
frame rate.
Avoid visuals that upset the user's sense of stability in their environment. Rotating or moving the horizon line or other large components of the user's environment in conflict with the user's real-world self-motion (or lack thereof) can be discomforting.