Q&A

Augmented Reality

Augmented Reality is the overlay of computer-generated information on top of the real world. In this context, the real world is experienced through the device's sensors. The additional content can include any media, such as sound, images and video. It can be experienced through a mobile phone or a wearable AR display such as Google Glass, Epson Glasses or Microsoft HoloLens.

Location-based Augmented Reality shows Augmented Reality content attached to a specific location, which can be outdoor or indoor.

Outdoor location-based Augmented Reality

Outdoor location-based Augmented Reality shows Augmented Reality content triggered by a specific GPS location.

In order for GPS-based Augmented Reality apps to work, the user's phone needs to have GPS turned on. GPS accuracy is affected by a number of factors, including satellite positions, noise in the radio signal, atmospheric conditions and natural barriers to the signal. Noise can introduce an error of 1 to 10 meters and results from static or interference from something near the receiver or something on the same frequency. Objects such as mountains or buildings between the satellite and the receiver can also produce error, sometimes up to 30 meters. A much more accurate position is therefore obtained when the user stands in an open field rather than in a city surrounded by buildings.
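As a sketch, an outdoor location trigger reduces to a geofence test: compute the great-circle distance between the user's GPS fix and the content's anchor point, and show the content when the user is within a radius chosen to absorb the error sources above. The function names and the 30 m default radius here are illustrative assumptions, not part of any specific API:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS fixes, in meters."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def should_show_content(user_fix, poi, radius_m=30.0):
    """Trigger AR content when the user is within radius_m of the point of
    interest.  A 30 m radius absorbs the worst-case GPS error from
    buildings described above; tune it for your use case."""
    return haversine_m(user_fix[0], user_fix[1], poi[0], poi[1]) <= radius_m
```

A larger radius makes triggering more forgiving in urban canyons at the cost of spatial precision.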

Indoor location-based Augmented Reality

GPS positioning is not really suitable for indoor use, since indoor use cases quite often require an accuracy of better than 10 m. Therefore, we offer two different indoor location-based Augmented Reality solutions.

Beacon (Bluetooth)

The position within a building can be determined through the use of Bluetooth beacons (iBeacon). Bluetooth Low Energy (BLE) technology lets beacons run on batteries for months without resorting to an external power supply. This means that temporary installations, such as exhibitions, can also be realized in unusual locations. The position inside enclosed buildings can be identified to within a few meters.
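A common way to turn beacon signals into position information is to estimate distance from received signal strength (RSSI) with the log-distance path-loss model. The sketch below is illustrative only: `tx_power` is the calibrated RSSI at 1 m that iBeacons broadcast, and the path-loss exponent `n` is an assumption (roughly 2 in free space, higher indoors):

```python
def estimate_distance_m(rssi, tx_power=-59, n=2.0):
    """Approximate beacon distance in meters from the log-distance
    path-loss model: rssi = tx_power - 10 * n * log10(d)."""
    return 10 ** ((tx_power - rssi) / (10 * n))

def nearest_beacon(readings):
    """readings: {beacon_id: rssi}.  Return the id of the beacon the
    user is estimated to be closest to."""
    return min(readings, key=lambda b: estimate_distance_m(readings[b]))
```

In practice RSSI is noisy, so readings are usually smoothed over a few seconds before being converted to distance.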

Smartphone sensors

The internal sensors of a smartphone also play an important role in indoor positioning and indoor navigation. Here, a variety of sensors in the mobile device are used and evaluated: GSM and 3G/4G (LTE) signals, the magnetometer (compass), the barometer (air pressure), the accelerometer and the gyroscope. Positioning inside enclosed buildings can be significantly improved by combining these technologies with the other navigation options.
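One classic way to combine such sensors is a complementary filter, for example for heading: integrate the fast but drifting gyroscope, then correct the result with the absolute but noisy compass. A minimal sketch, with an assumed blending factor `alpha`:

```python
def fuse_heading(prev_heading, gyro_rate_dps, compass_deg, dt, alpha=0.98):
    """Complementary filter for heading in degrees.

    prev_heading: last fused heading, gyro_rate_dps: gyroscope yaw rate
    in degrees/second, compass_deg: magnetometer heading, dt: seconds
    since the last sample.  alpha close to 1 trusts the gyro short-term.
    """
    gyro_heading = prev_heading + gyro_rate_dps * dt
    # Correct toward the compass along the shortest angular difference,
    # so blending behaves correctly across the 0/360 wraparound.
    error = (compass_deg - gyro_heading + 180.0) % 360.0 - 180.0
    return (gyro_heading + (1 - alpha) * error) % 360.0
```

The same blend-fast-with-slow idea extends to other pairs, e.g. accelerometer step counting corrected by radio-signal fixes.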

 

In order to give your users a smooth experience with your Augmented Reality apps, it is very important to use markers that work well with the app. We strongly recommend that you follow the tips below when designing your markers.

During the marker design process, you can check your marker quality yourself using the ARgenie Creator. When you upload a marker to the ARgenie Creator, the unique fingerprints on the marker are extracted and analyzed by ARgenie technology so that the marker can be recognized by the ARgenie mobile app.

The marker should:

  • have rich details
  • have good contrast
  • be rigid, not flexible, with a matte rather than glossy finish


The marker should NOT:

  • have a large portion of blank area
  • contain mainly text, or text only
  • only contain graphics that look the same in any orientation, such as the shape of a circle
  • only contain graphics with repetitive patterns

Marker format

  • Must be in PNG or JPG format
  • Must be less than 2 MB in size
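These two rules can be checked before upload with a small helper; this sketch is hypothetical and not part of the ARgenie Creator API:

```python
import os

MAX_BYTES = 2 * 1024 * 1024  # the 2 MB limit above

def marker_file_ok(path, size_bytes):
    """Return True if the marker file satisfies the format rules:
    PNG or JPG extension, and strictly less than 2 MB."""
    ext = os.path.splitext(path)[1].lower()
    return ext in (".png", ".jpg") and size_bytes < MAX_BYTES
```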

Marker physical size

  • The recommended size varies based on the actual target rating and the distance to the physical image target
  • A physical printed marker should be at least 3 cm in width and of reasonable height for a good Augmented Reality experience
  • Consider increasing the size of your marker if the viewing distance is larger
  • You can estimate the minimum size that your marker should have by dividing your camera-to-marker distance by approximately 10

For instance, a 20 cm wide marker would be detectable up to a distance of about 2 meters (20 cm x 10). Note, however, that this is just a rough indication and the actual working distance/size ratio can vary based on lighting conditions, camera focus, and marker rating.
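The rule of thumb above can be captured in a small helper; this is purely illustrative and uses the divide-by-10 heuristic and the 3 cm floor stated above:

```python
def min_marker_width_cm(camera_distance_cm):
    """Estimate the minimum marker width for reliable detection at a
    given camera-to-marker distance, never below the 3 cm floor."""
    return max(camera_distance_cm / 10.0, 3.0)

def max_detection_distance_cm(marker_width_cm):
    """Inverse rule: rough maximum working distance for a marker width."""
    return marker_width_cm * 10.0
```

As noted, treat the results as a starting point: lighting, camera focus and marker rating shift the usable range in practice.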

Lighting conditions

The lighting conditions in your test environment can significantly affect marker detection and tracking.

Make sure that there is enough light in your room or operating environment so that the scene details and marker features are well visible in the camera view. Markers should be viewed under moderately bright and diffuse lighting. The surfaces of the object should be evenly lit. Indoor scenarios generally work well.

If your application use case requires operating in dark environments, we recommend enabling the device's flash torch (if your device has one).

Printed marker - flatness

The quality of marker detection can degrade significantly when printed markers are not flat. When designing physical printouts, game boards or play pieces, make sure the markers do not bend or curl up and are not creased or wrinkled.

Printed marker - glossiness

Printouts from modern laser printers can be glossy. Under ambient lighting a glossy surface is not a problem, but at certain angles light sources such as a lamp, a window or the sun can create a reflection that covers up large parts of the original texture of the printout. Such reflections can cause issues with detection and tracking, since the effect is very similar to partially occluding the marker.

Viewing angle

Marker features are harder to detect, and tracking can be less stable, if you look at the marker from a very steep angle or the marker appears very oblique to the camera. When defining your use scenarios, keep in mind that a marker facing the camera, with its normal well aligned to the camera viewing direction, has a better chance of being detected and tracked.
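The "steep angle" criterion can be made concrete by measuring the angle between the marker's surface normal and the camera's viewing direction. The sketch below is illustrative, and the 60° threshold is an assumption, not a documented limit:

```python
import math

def _dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def _normalize(v):
    n = math.sqrt(_dot(v, v))
    return tuple(x / n for x in v)

def viewing_angle_deg(marker_normal, camera_dir):
    """Angle between the marker normal and the reversed camera viewing
    direction: 0 degrees means the marker squarely faces the camera."""
    toward_camera = tuple(-x for x in camera_dir)
    cos_a = _dot(_normalize(marker_normal), _normalize(toward_camera))
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_a))))

def angle_ok(marker_normal, camera_dir, max_deg=60.0):
    """True if the marker is not viewed too obliquely (assumed limit)."""
    return viewing_angle_deg(marker_normal, camera_dir) <= max_deg
```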

The proper use of Augmented Reality apps requires a device with a back camera, GPS, compass, accelerometer, gyroscope and an active Wi-Fi or mobile Internet connection.

We support both Android and iOS platforms.

Supported devices include the iPhone 4 and iPhone 5, the iPod touch (4th, 5th and 6th generation) running iOS 6.0 or later, and Android devices running Android 2.3.3 or later.

We can differentiate among supported formats for audio, video, 2D models and 3D models.

MP3 is the supported audio format and MP4 is the supported video format.

The supported 2D model format is PNG, with or without transparency.

The supported 3D model formats are FBX, OBJ and 3ds Max 2015.

The advanced capability is support for alpha channel transparency in videos.

There are cases where you may want to use video elements in a more immersive way. With alpha channel transparency support, videos can use masks to isolate items and create transparency, eliminating the rectangular frame. Think of it as the video equivalent of a transparent PNG, whereas a JPG always has a rectangular shape.

Take, for instance, the example below. Our original video asset of a tiger walking on a green screen can now be modified to make the green background disappear, making for a more immersive video when viewed in Augmented Reality. In the top left is the original video, below that is the alpha matte we created, and on the right is how the video would appear in Augmented Reality.

But you don't need green-screen footage to use transparency in videos. You can either create masks with video effects software (e.g. by rotoscoping, which can be very difficult), or simply use animated digital assets created with video effects or 3D modeling software (such as animated text, characters or logos) that are already isolated on a transparent background.

Supported 2D and 3D models

Supported 2D model format: PNG, with or without transparency

Supported 3D model formats: FBX, OBJ, 3ds Max 2015

  • Up to 100,000 triangles per scene under one camera view

NOTE: Consider what you’re making and how much screen space it will take up - you won’t need 10,000 triangles and a 1024 square texture for an object that may only be 3cm high on an iPad screen.

Textures/Shaders

  • The use of smaller textures is recommended where appropriate

  • Diffuse and normal maps are recommended

  • Displacement maps are NOT supported

  • Alpha on solid objects is not recommended; it is better suited to planes

  • Material names should contain the names of the objects they are applied to

  • Everything must be converted to editable polys with all modifiers collapsed, unless it is going to be rigged, in which case only the skinning modifier and the morpher (if used) should remain

  • Individual texture maps (.png format) must have power-of-two dimensions:

    • 64x64

    • 128x128

    • 256x256

    • 512x512

    • 1024x1024

    • no bigger than 1024x1024

  • No procedural shaders, nodes or textures: all textures must be baked to bitmaps or use a standard shader with a color value

  • Combine objects that share the same material so the scene runs faster on the device; multi/sub-object shaders are not supported, and there must be no more than one shader per object
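The power-of-two texture rule above can be validated before export; a hypothetical check, assuming (as the list implies) that valid sizes are square, from 64x64 up to 1024x1024:

```python
def texture_size_ok(width, height):
    """True if the texture is a square power of two between 64 and 1024."""
    def is_pow2(n):
        # A positive integer is a power of two iff it has one set bit.
        return n > 0 and (n & (n - 1)) == 0
    return width == height and is_pow2(width) and 64 <= width <= 1024
```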

Rig

  • Bone names must not contain spaces, brackets or special characters – replace spaces with underscores if needed

  • The export rig should contain a maximum of 100 bones to optimize playback – the performance hit comes largely from skinning

  • It is advisable to name all bones properly

  • Objects cannot be linked/parented to non-export objects – constraints are ok

  • Do not scale anything

  • Make sure every vertex has a skinning weight
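The rig rules above lend themselves to an automated pre-export sanity check. This sketch is hypothetical; the data it inspects (bone names and per-vertex total skinning weights) would come from your exporter:

```python
import re

MAX_BONES = 100  # export limit stated above

def check_rig(bone_names, vertex_weights):
    """Return a list of human-readable problems with the rig.

    bone_names: list of bone name strings.
    vertex_weights: list of per-vertex total skinning weights.
    """
    problems = []
    if len(bone_names) > MAX_BONES:
        problems.append(f"too many bones: {len(bone_names)} > {MAX_BONES}")
    for name in bone_names:
        # Only letters, digits and underscores: no spaces, brackets
        # or special characters.
        if not re.fullmatch(r"[A-Za-z0-9_]+", name):
            problems.append(f"bad bone name: {name!r}")
    for i, w in enumerate(vertex_weights):
        if w <= 0.0:
            problems.append(f"vertex {i} has no skinning weight")
    return problems
```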

Animation

  • No animated materials/material parameters

  • Scaling is correctly supported in animation

  • No animated light color

  • All animation transforms are supported (Translate, Rotate and Scale)

  • Animated groups of objects are not supported

Preparing the scene for export

It is not necessary to export every object in your 3D scene; controllers and animation handles, for example, should not be exported unless desired. You should export:

  • Geometry

  • Lights

  • Any bones/joints which deform geometry

Virtual Reality

Virtual Reality (VR) is a fully immersive 3D experience that users receive through VR wearables. A person interacts with this computer-generated environment, either becoming part of the virtual world or being immersed within it, and is able to manipulate objects or perform a series of actions.

The most popular VR devices can be classified into two groups: cardboard and plastic. The advantage of cardboard VR devices, predominantly based on Google's Cardboard design, is that they can easily be branded and customized. Plastic VR devices include the Homido VR, Samsung Gear VR, Oculus Rift and HTC Vive.

Google Cardboard is an affordable VR device that supports a variety of phones. The cardboard itself is customizable and can carry custom branding or printing.

Homido VR is a plastic VR headset supported by a range of phone types and brands.

Samsung Gear VR is adapted for specific Samsung phones (e.g. the Samsung Galaxy S6); when the phone is docked into the device, the Samsung VR store opens.

Oculus Rift is a high-performance VR device with two built-in HD screens. A number of custom-made apps and games are available for it, and it requires a high-end PC to run.

HTC Vive is a high-performance VR device with two built-in HD screens. It works with spatial awareness sensors that turn a room into a 3D VR environment, and it also features a number of custom-made apps and games. The HTC Vive requires a high-end PC to run.

Virtual Reality content can be grouped into 360° photos, 360° video, and 3D graphics and animation.

360-degree virtual reality is an audiovisual simulation of an altered environment that enables the user to look around in all directions, as if in real life.

There are diverse kinds of 360-degree VR, such as live or previously captured video, as well as pre-rendered computer-generated imagery (CGI) and real-time rendered 3D games.

Games

Serious games are designed for a purpose beyond pure entertainment and they are used in diverse professional contexts such as training, assessment, recruitment, knowledge management, education, innovation and scientific research.

These games use the motivational elements of game design (collaboration, curiosity, competition, individual challenge) and game media, whether board games with physical pieces or video games with avatars and 3D immersion. The aim of serious gaming is to give participants additional motivation to engage in dull or complex tasks.

Serious games have goals beyond entertainment and incorporate digital tools such as story, art and software. They can be classified into games that promote gamification principles in various industries and games for educational purposes. For example, serious games focus on behavior change in industries such as healthcare, marketing and business. In education, serious games can help users absorb, memorize and learn new material, train one another, and develop skills across all aspects of education. Regardless of whether games are used to develop a business brand or for education, they are created to engage, stimulate and reward users.