This new update to Google ARCore will make your life easier!
Read on to learn about the update to Google ARCore. Augmented Reality is in its commercial growth phase. In the past few years, we have witnessed the adoption of the technology grow exponentially, and Augmented Reality is gradually reaching the masses through its use cases.
CEOs of top-notch companies like Google, Apple, Facebook, and Microsoft are seen vouching for the technology. You can read what Tim Cook and Satya Nadella said here. In the current scenario, these companies are leaving no stone unturned to leave their footprint on the technology. The confidence of these leaders in Augmented Reality has encouraged thousands of entrepreneurs to embrace it. Google is all in to maximize the use cases of Augmented Reality. “In August 2020, Google Ventures invested around USD 14.5 million in Blue Vision Labs, an AR startup company based in the U.K.” Big companies are investing in and collaborating with startups to gain a winning edge over competitors.
Google launched ARCore in 2018, helping developers build applications with Augmented Reality. A few months ago, Google introduced Google 3D animals, Google AR ghosts on Halloween, and Google AR makeup, trying to connect the masses to Augmented Reality technology. Recently, Google's smart glasses have also been in the news. Google has now updated ARCore; let us learn more about it. But first things first,
What is Google ARCore?
Google ARCore is a Software Development Kit (SDK) developed by Google to help Augmented Reality developers. ARCore supports Android as well as iOS. You need to be well versed in SDKs to build dynamic apps. An AR SDK uses computer vision technology to build robust AR apps. We are living in an app-driven world, and applications have become an important part of our lifestyle. SDKs such as ARCore help build apps that ease users' lives or entertain them. All your favourite brands – such as Nykaa, Amazon, Flipkart, L'Oréal, IKEA, Toyota, etc. – are trying to integrate AR apps into their business models.
Google ARCore helps developers blend the digital and physical worlds. Some of its important features are environmental understanding, motion tracking, and light estimation. As per Google, “Roughly 850 million Android devices are certified to run ARCore now.”
Statistics show that the tremendous increase in the adoption of AR technology reflects the growing need for Augmented Reality apps. To know more about the different SDKs you can rely on to build Augmented Reality apps, click here.
Uses of Google ARCore
· Helps developers build apps that better sense the environment and place virtual objects in it.
· Users can interact with the information placed inside the environment.
What is Google I/O (Input/Output) 2021:
Google I/O 2021 is a conference organized by Google, primarily for developers. The 2021 event was held virtually, starting on May 18, 2021; the event has been held annually since May 28, 2008. At the event, CEO Sundar Pichai addresses the attendees and updates them on what Google has been doing. Here, we will go through a crisp summary of the updates Google announced for ARCore.
Update in Google ARCore:
As announced at Google I/O 2021, Google is adding a bunch of new features to ARCore. Developers get two new APIs to enhance the creation of AR experiences, available in ARCore version 1.24. As per Google, the first is the “Raw Depth API” and the second is the “Recording and Playback API”. The third update in ARCore is “depth hit tests”.
How will the big I/O 2021 update in Google ARCore help people?
· Use of Raw Depth API:
The Raw Depth API gives developers more detailed geometry of the objects present in a scene, increasing the accuracy and precision with which an app understands their shape and size. AR features are easily implemented on flat surfaces, but rough or irregular surfaces are harder to model; using the Raw Depth API, apps can capture the texture of rough surfaces too. Unlike the existing full Depth API, which returns a smoothed depth map, the Raw Depth API provides unsmoothed depth data together with a confidence image.
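As a rough sketch (not official sample code), enabling raw depth on an Android ARCore session and reading the per-frame images might look like this; the session is assumed to be created and resumed elsewhere in the app:

```kotlin
import com.google.ar.core.Config
import com.google.ar.core.Frame
import com.google.ar.core.Session
import com.google.ar.core.exceptions.NotYetAvailableException

// Sketch: enable raw depth on an existing session, if the device supports it.
fun enableRawDepth(session: Session) {
    val config = session.config
    // RAW_DEPTH_ONLY delivers unsmoothed depth plus a confidence image.
    if (session.isDepthModeSupported(Config.DepthMode.RAW_DEPTH_ONLY)) {
        config.depthMode = Config.DepthMode.RAW_DEPTH_ONLY
        session.configure(config)
    }
}

// Per frame: acquire the raw depth image and its confidence image.
fun readRawDepth(frame: Frame) {
    try {
        frame.acquireRawDepthImage().use { depthImage ->
            frame.acquireRawDepthConfidenceImage().use { confidenceImage ->
                // Depth pixels are distances in millimeters; confidence pixels
                // range 0..255. Process them here, e.g. to build a point cloud.
            }
        }
    } catch (e: NotYetAvailableException) {
        // Depth is not available in the first frames after the session starts.
    }
}
```

This only runs on an ARCore-certified Android device with an active session, so it is illustrative rather than standalone.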
· Use of Recording and Playback API:
The Recording and Playback API captures a session's camera footage along with sensor data such as depth and IMU (inertial measurement unit) readings, and lets apps access that data again during playback. Using the API, you can also place virtual objects in a scene after it has been captured, and you can add custom data during recording and retrieve it afterwards if needed. This helps developers save time and accelerates the overall process: instead of repeatedly testing in the same physical space, they can record once and iterate against the recording.
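A minimal sketch of the recording and playback flow might look as follows; the dataset file path here is purely illustrative:

```kotlin
import com.google.ar.core.RecordingConfig
import com.google.ar.core.Session

// Sketch: record a session to an MP4 dataset for later replay.
fun startRecording(session: Session) {
    val recordingConfig = RecordingConfig(session)
        .setMp4DatasetFilePath("/sdcard/Download/ar_session.mp4") // assumed path
        .setAutoStopOnPause(true)
    session.startRecording(recordingConfig)
}

fun stopRecording(session: Session) {
    session.stopRecording()
}

// Later: feed the recorded dataset into the session instead of the live camera.
// setPlaybackDataset must be called while the session is paused.
fun startPlayback(session: Session) {
    session.pause()
    session.setPlaybackDataset("/sdcard/Download/ar_session.mp4")
    session.resume()
}
```

During playback, the session replays the recorded camera and sensor streams, so tracking behaves as if the device were moving through the original scene.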
· Depth hit tests:
Initially, hit tests were limited to flat surfaces. Now they can be conducted on flat as well as rough surfaces: depth hit tests use both smoothed and raw depth information to give the best result for the scene's geometry.
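As a hypothetical tap handler, this is roughly how an app might consume hit tests that land on uneven geometry as well as detected planes. The `DepthPoint` trackable type is per current ARCore documentation and may not exist under that name in every SDK version:

```kotlin
import com.google.ar.core.DepthPoint
import com.google.ar.core.Frame
import com.google.ar.core.HitResult
import com.google.ar.core.Plane

// Sketch: pick the first usable hit for a screen tap. With depth enabled,
// results can include DepthPoint trackables on arbitrary surfaces,
// not just detected planes.
fun onTap(frame: Frame, tapX: Float, tapY: Float): HitResult? {
    return frame.hitTest(tapX, tapY).firstOrNull { hit ->
        when (val trackable = hit.trackable) {
            is Plane -> trackable.isPoseInPolygon(hit.hitPose)
            is DepthPoint -> true // depth hit on a rough or curved surface
            else -> false
        }
    }
}
```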
Apart from the above-mentioned changes, the new version of ARCore also fixes several bugs.
Features present in different versions of Google ARCore:
· New debugging tools:
The new debugging tools are for Android (Java/Kotlin), Android NDK (C), ARCore Extensions for AR Foundation, and the ARCore SDK for Unity.
· Dual camera support:
Dual-camera support was pending in ARCore version 1.23.0; this feature will roll out in the coming weeks. Users can have an augmented experience through the dual camera. The feature will support devices that have passed the Google certification process: Google checks the user experience through motion tracking before certifying a device. The official website says, “Certification is important because we want users to have a good experience with your AR application. This is primarily related to sensitive motion tracking, which is done by combining the camera image and the motion sensor input to determine how the user’s device moves through the real world.”
The update has enhanced the capabilities of Google ARCore, though users will have to wait to experience these features until developers adopt them in their apps. Augmented Reality has immense use cases; Google is working on a lot, and a lot is yet untapped.