Introducing Augmented Reality Core – what you should know


Augmented reality (AR) is an interactive experience of a real-world environment aided by technology. Objects that reside in the real world are enhanced by computer-generated perceptual information, sometimes across multiple sensory modalities: visual, auditory, haptic, somatosensory, and olfactory.

Augmented Reality can be defined as a system that fulfills three basic features: 

  • Combining real and virtual worlds and environments.
  • Real-time interaction.
  • Accurate 3-dimensional registration of virtual and real objects.

The overlaid sensory information can be constructive (i.e. adding to the natural environment) or destructive (i.e. masking parts of it). The experience is seamlessly interwoven with the physical world so that it is perceived as an immersive aspect of the real environment.

In this way, augmented reality alters a person's ongoing perception of the real-world environment, whereas virtual reality completely replaces the user's real-world environment with a simulated one. AR is closely related to two largely synonymous terms: computer-mediated reality and mixed reality.

What was augmented reality like some time ago?

The first steps toward mobile augmented reality were taken on smartphones, with Google's Project Tango leading the way on Android. To run Tango, users needed a special device made and sold by Google, which limited its availability to a small number of developers. Hence, Google created Augmented Reality Core (ARCore).

The key advantage of ARCore is that it works without any additional hardware, so it can scale easily across the Android ecosystem. ARCore is already supported by a wide range of Android devices (and some iOS devices), and support keeps expanding with new device and system releases.

How does ARCore work?

ARCore ships as Software Development Kits (SDKs) for Java/OpenGL (Android), Unity, Unreal, and the Web, providing native APIs for all the essential features of augmented reality:

  • Motion Tracking: Using the phone's camera to observe feature points in a room, combined with IMU sensor data, ARCore determines both the position and orientation (pose) of the phone as it moves. Virtual objects therefore remain accurately placed.
  • Environmental Understanding: ARCore detects horizontal surfaces using the same feature points it uses for motion tracking. AR objects are commonly placed on such a surface (a floor or a table, for example).
  • Light Estimation: ARCore observes the ambient light in the environment, which makes it possible to light virtual objects so they match their surroundings and appear more realistic.
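The light estimation idea can be illustrated with a small standalone sketch. Note that this is not the ARCore API: the class and method names below (`LightEstimationSketch`, `estimateAmbientIntensity`, `applyAmbientLight`) are hypothetical; the real SDK reports lighting through its `LightEstimate` type. The sketch simply averages the luminance of a grayscale frame and uses the result to scale a virtual object's color.

```java
// Hypothetical sketch of the idea behind light estimation. ARCore itself
// reports a pixel intensity via its LightEstimate type; here we approximate
// it as the mean luminance of a grayscale camera frame (values in [0, 1])
// and use it to dim or brighten a virtual object's base color.
public class LightEstimationSketch {

    // Estimate ambient intensity as the average luminance of the frame.
    static double estimateAmbientIntensity(double[] grayscalePixels) {
        double sum = 0.0;
        for (double p : grayscalePixels) {
            sum += p;
        }
        return sum / grayscalePixels.length;
    }

    // Scale a virtual object's RGB color by the estimated intensity so it
    // blends with the real scene's lighting.
    static double[] applyAmbientLight(double[] rgb, double intensity) {
        return new double[] {
            rgb[0] * intensity,
            rgb[1] * intensity,
            rgb[2] * intensity
        };
    }
}
```

With a frame whose mean luminance is 0.5, a pure-red object (1, 0, 0) would be rendered at half intensity, (0.5, 0, 0).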

With these three key technologies, developers can create completely new AR experiences or enhance existing apps with AR features.
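To make the environmental-understanding side concrete, here is a hedged, standalone sketch of the ray-plane math that underlies placing an object where a tap ray meets a detected horizontal plane. The class and method names are hypothetical, not part of the ARCore SDK, which performs this work through its own hit-testing API.

```java
// Hypothetical sketch of the geometry behind a hit test: intersecting a ray
// (e.g. cast from a screen tap) with a detected horizontal plane. The real
// SDK handles this internally; this standalone version only illustrates the
// underlying ray-plane intersection math.
public class HitTestSketch {

    // Returns the intersection point of the ray (origin + t * direction)
    // with the plane through planePoint with the given normal, or null if
    // the ray is parallel to the plane or the plane lies behind the ray.
    static double[] rayPlaneIntersection(double[] origin, double[] direction,
                                         double[] planePoint, double[] normal) {
        double denom = dot(normal, direction);
        if (Math.abs(denom) < 1e-9) {
            return null; // ray is parallel to the plane
        }
        double t = (dot(normal, planePoint) - dot(normal, origin)) / denom;
        if (t < 0) {
            return null; // plane is behind the ray origin
        }
        return new double[] {
            origin[0] + t * direction[0],
            origin[1] + t * direction[1],
            origin[2] + t * direction[2]
        };
    }

    static double dot(double[] a, double[] b) {
        return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
    }
}
```

For example, a ray starting at (0, 2, 0) and pointing straight down hits a floor plane at y = 0 exactly at the origin.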

Basic concepts of ARCore

It is imperative to understand the following fundamental concepts of ARCore. They are crucial to getting started on experiences in which virtual content appears to rest on real surfaces or stays attached to real locations.

The basic concepts worth understanding are as follows:

  • Anchor: This describes a fixed location and orientation in the real world. To keep that location fixed in physical space, ARCore updates the numerical description of the position as its understanding of the real physical space improves.
  • HitResult: This defines an intersection between a ray (for example, one cast from a screen tap) and ARCore's estimate of the real-world geometry.
  • Plane: This describes ARCore's current best knowledge of a real-world planar surface.
  • PlaneHitResult: This defines an intersection between a ray and plane geometry.
  • PointCloud: This contains a set of observed 3D points and confidence values.
  • Pose: This represents an immutable rigid transformation from one coordinate frame to another. As provided by ARCore APIs, a pose always describes the transformation from an object's local coordinate frame to the world coordinate frame; in short, poses from ARCore APIs can be considered equivalent to OpenGL model matrices. The transformation is defined as a quaternion rotation about the origin, followed by a translation.
  • Session: This manages the AR system state and handles the session lifecycle, and it is the main entry point to the ARCore API. It allows users to create a session, configure it, start it, stop it, and, most importantly, receive frames that give access to the camera image and device pose.
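The Pose description above (a quaternion rotation about the origin followed by a translation) can be sketched in plain Java. This is not the real ARCore Pose class; `PoseSketch` and its `transformPoint` method are illustrative stand-ins that mirror the described behavior for a single 3D point.

```java
// Hypothetical sketch of what a Pose encodes: a rigid transformation applied
// as a quaternion rotation about the origin followed by a translation. The
// class and method names are illustrative, not the ARCore API.
public class PoseSketch {
    final double qx, qy, qz, qw;   // unit quaternion (rotation)
    final double tx, ty, tz;       // translation

    PoseSketch(double qx, double qy, double qz, double qw,
               double tx, double ty, double tz) {
        this.qx = qx; this.qy = qy; this.qz = qz; this.qw = qw;
        this.tx = tx; this.ty = ty; this.tz = tz;
    }

    // Rotate the point by the quaternion, then translate:
    // world = R(q) * local + t
    double[] transformPoint(double[] p) {
        // Standard identity: v' = v + 2w (q x v) + 2 (q x (q x v)),
        // where q = (qx, qy, qz) is the quaternion's vector part.
        double cx = qy * p[2] - qz * p[1];
        double cy = qz * p[0] - qx * p[2];
        double cz = qx * p[1] - qy * p[0];
        double rx = p[0] + 2 * (qw * cx + qy * cz - qz * cy);
        double ry = p[1] + 2 * (qw * cy + qz * cx - qx * cz);
        double rz = p[2] + 2 * (qw * cz + qx * cy - qy * cx);
        return new double[] { rx + tx, ry + ty, rz + tz };
    }
}
```

For a 90-degree rotation about the z-axis combined with a translation of (1, 2, 3), the local point (1, 0, 0) maps to the world point (1, 3, 3): the rotation carries it to (0, 1, 0), and the translation then shifts it.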


Jeff Bailey is a tech enthusiast and gadget guru with a profound understanding of the ever-evolving world of technology. With a keen eye for innovation and a passion for staying ahead of the curve, Jeff brings insightful perspectives on the latest gadgets and tech trends.