Monday, November 23, 2020

Introduction to ARCore

What is AR (Augmented Reality)?

  • Blends the real world into the digital world
  • Integrates virtual content with the real world, as seen through the phone's camera
ARCore is Google’s platform for building augmented reality experiences.
 


What is Google’s Tango Project (Prior to ARCore)
        ARCore has its origins in Tango, a more advanced AR toolkit that relied on special sensors built into consumer mobile devices.
        Google’s Peanut phone and Yellowstone tablet, Intel’s RealSense smartphone, Lenovo’s Phab 2, and the Asus ZenFone were equipped with these special sensors.
        Google later discontinued the Tango project as ARCore evolved to be much less dependent on dedicated sensors than Tango was.

Phone Hardware Used by ARCore

  • Fundamental concepts behind ARCore:
  • Motion tracking
  • Feature points (through the camera) + motion sensors (through the IMU: accelerometer, gyroscope)
  • Point, PointCloud, and Pose are some of the classes exposed by the ARCore SDK. In the typical demo, we can see how the user's position is tracked in relation to the feature points identified on a real couch.
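The camera-plus-IMU idea above can be sketched with a toy complementary filter, which fuses integrated gyroscope rates with the tilt implied by the accelerometer's gravity reading. This is only an illustration of sensor fusion, not ARCore's actual visual-inertial tracker, and all names here are made up for the sketch:

```java
// Toy 1-D sensor fusion: gyro integration gives a smooth short-term pitch
// estimate but drifts; the accelerometer's gravity vector corrects the drift.
// ARCore combines this kind of IMU data with camera feature points.
public class ComplementaryFilter {
    private double pitch;        // estimated pitch angle in radians
    private final double alpha;  // weight given to the gyro path (e.g. 0.98)

    public ComplementaryFilter(double alpha) { this.alpha = alpha; }

    /** Fuse one IMU sample: gyro rate (rad/s) over dt seconds plus accel axes. */
    public double update(double gyroRate, double dt, double accelY, double accelZ) {
        double gyroPitch = pitch + gyroRate * dt;        // integrate angular rate
        double accelPitch = Math.atan2(accelY, accelZ);  // tilt implied by gravity
        pitch = alpha * gyroPitch + (1 - alpha) * accelPitch;
        return pitch;
    }
}
```

With a device at rest and gravity entirely on the Z axis, the estimate stays near zero pitch even as small gyro errors accumulate, which is the point of blending the two sensors.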


  • Environmental understanding
  • Uses a technique called meshing
  • Clusters of feature points are used and returned to applications as planes, along with each plane’s boundary
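As a rough illustration of what meshing produces (a plane plus its boundary), here is a toy routine that collapses a cluster of roughly coplanar feature points into a horizontal plane. ARCore's real meshing is far more sophisticated; the method name and output layout are assumptions made for this sketch:

```java
// Toy "meshing": estimate a horizontal plane's height as the average Y of the
// feature points, then report the bounding rectangle of the inlier points as
// the plane's boundary.
public class PlaneFromPoints {
    /** Returns {y, minX, maxX, minZ, maxZ} for points within tol of the average height. */
    public static double[] fitHorizontalPlane(double[][] pts, double tol) {
        double sumY = 0;
        for (double[] p : pts) sumY += p[1];
        double y = sumY / pts.length;  // estimated plane height
        double minX = Double.MAX_VALUE, maxX = -Double.MAX_VALUE;
        double minZ = Double.MAX_VALUE, maxZ = -Double.MAX_VALUE;
        for (double[] p : pts) {
            if (Math.abs(p[1] - y) > tol) continue;  // skip points off the plane
            minX = Math.min(minX, p[0]); maxX = Math.max(maxX, p[0]);
            minZ = Math.min(minZ, p[2]); maxZ = Math.max(maxZ, p[2]);
        }
        return new double[]{y, minX, maxX, minZ, maxZ};
    }
}
```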

How does ARCore work?

ARCore fundamentally does two things:

    1. Motion tracking
    2. Building its own understanding of the real world (environmental understanding, light estimation, orientation points, anchors, trackables)

Environmental limitations
        For now, limitations that may hinder accurate understanding of surfaces include:

  • Flat surfaces without texture, such as a white desk
  • Environments with dim lighting
  • Extremely bright environments
  • Transparent or reflective surfaces like glass
  • Dynamic or moving surfaces, such as blades of grass or ripples in water
        When users encounter environmental limitations, indicate what went wrong and point them in the right direction.   

            Figure: an AR application that has identified a real-world surface through meshing. The plane is indicated by the white dots. In the background, we can see that the user has already placed various virtual objects on the surface.


Light estimation

        Estimates the average intensity and color correction of a given camera image.
        The scene's realism is increased by applying the same lighting conditions to virtual objects. LightEstimate and LightEstimate.State are some of the classes used to read the lighting conditions.
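A minimal sketch of what light estimation computes (not the ARCore LightEstimate API): average the luminance of the camera image, then scale a virtual object's base color by that intensity so it matches the real scene's brightness. The class and method names are invented for the example:

```java
// Toy light estimation: derive a single scene-brightness factor from the
// camera image and apply it to a virtual object's color channel.
public class LightEstimateSketch {
    /** Average pixel intensity of a grayscale image, normalized to [0, 1]. */
    public static double averageIntensity(int[] grayPixels) {
        long sum = 0;
        for (int p : grayPixels) sum += p;
        return sum / (255.0 * grayPixels.length);
    }

    /** Darken or brighten one color channel of a virtual object to match the scene. */
    public static int applyToChannel(int channel, double intensity) {
        return (int) Math.round(channel * intensity);
    }
}
```

A dim room therefore yields a low intensity, and the virtual object is rendered correspondingly darker instead of glowing unrealistically.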
User Interaction, Orientation Points, Anchors and Trackables

        Takes an (x, y) coordinate corresponding to the phone's screen.
        Projects a ray into the camera’s view of the real world.
        Returns the plane/feature points that the ray intersects, along with the Pose (similar to an OpenGL model matrix) of that intersection in world space.
        HitResult, Pose, and Anchor (a fixed location and orientation in the real world) are some of the classes ARCore provides in this context.
        A Trackable (interface) is something that ARCore can track and that Anchors can be attached to.
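The geometric core of such a hit test can be sketched as a ray-plane intersection: the ray from the camera through the tapped screen point is intersected with a detected horizontal plane, and the intersection becomes the position part of the returned pose, where an Anchor could then be placed. This is the underlying math, not ARCore's HitResult API, and the names are assumptions:

```java
// Toy hit test: intersect a camera ray with a horizontal plane y == planeY.
public class HitTestSketch {
    /** Returns the {x, y, z} intersection of origin + t*dir with the plane,
     *  or null if the ray is parallel to the plane or points away from it. */
    public static double[] hitHorizontalPlane(double[] origin, double[] dir, double planeY) {
        if (Math.abs(dir[1]) < 1e-9) return null;  // ray parallel to the plane
        double t = (planeY - origin[1]) / dir[1];
        if (t < 0) return null;                    // plane is behind the camera
        return new double[]{origin[0] + t * dir[0], planeY, origin[2] + t * dir[2]};
    }
}
```

For example, a camera 1.5 m above the floor looking diagonally down and forward hits the floor plane 1.5 m ahead of it.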

Note: 
        Because ARCore uses clusters of feature points to detect the surface's angle, surfaces without texture, such as a white wall, may not be detected properly.
        To reduce CPU costs, reuse anchors when possible and detach anchors that you no longer need.
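One hedged way to follow the reuse advice is a small cache that quantizes positions to a grid, so taps near an existing anchor reuse it instead of creating (and later having to detach) a new one. The Anchor class below is a stand-in, not com.google.ar.core.Anchor, and the whole cache is a hypothetical pattern, not an ARCore API:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical anchor cache: positions falling into the same grid cell share
// one anchor, keeping the number of live anchors (and CPU cost) down.
public class AnchorCache {
    static class Anchor {  // stand-in for a real AR anchor
        final double x, y, z;
        Anchor(double x, double y, double z) { this.x = x; this.y = y; this.z = z; }
    }

    private final Map<String, Anchor> anchors = new HashMap<>();
    private final double cellSize;  // grid resolution in meters

    public AnchorCache(double cellSize) { this.cellSize = cellSize; }

    /** Reuse the anchor in this grid cell if one exists, else create it. */
    public Anchor getOrCreate(double x, double y, double z) {
        String key = Math.round(x / cellSize) + ":" + Math.round(y / cellSize)
                   + ":" + Math.round(z / cellSize);
        return anchors.computeIfAbsent(key, k -> new Anchor(x, y, z));
    }

    public int size() { return anchors.size(); }
}
```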


Books on ARCore

Did you find our blog helpful? Share your thoughts in the comment section. Feel free to share on Twitter or Facebook using the share buttons.
