
Apple has been planning this product for years

Apple’s rhetoric around its technological advances is, to say the least, distinctive. And when it comes to the mixed reality glasses expected to be unveiled at WWDC, the company has been dropping clues about its intentions since 2017, all without ever mentioning a product of its own. So in this post we will look at how the company has gradually introduced the technologies that lay the foundations for this new product.

A synthesis that makes perfect sense

Although it is not an official summary from the company, it is a retrospective that 9to5Mac has put together: from the introduction of ARKit, to Animoji, through the LiDAR sensors in the iPad, to the various applications that were already beginning to embrace augmented reality. This trail of clues begins in 2017, when the company started shipping the first versions of ARKit.

It is important to note, however, that the technology is one thing and the product itself is another. Everything Apple presented on this front was aimed, at first, only at the iPhone and iPad: seemingly small changes such as facial detection to unlock the device with Face ID, the incorporation of a LiDAR sensor into the iPhone’s cameras, or a face-mapping tool that let users play with the technology by creating animated emoji.

Animoji
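
For a sense of what that face-mapping layer exposes to developers, here is a minimal Swift sketch using ARKit’s face-tracking API, the same machinery that drives Animoji. The FaceTracker class and its logging are illustrative only, and the code requires a device with a TrueDepth camera:

```swift
import ARKit

// Minimal sketch of the face-tracking API behind features like Animoji.
// Requires a device with a TrueDepth camera (iPhone X or later).
final class FaceTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        guard ARFaceTrackingConfiguration.isSupported else {
            print("Face tracking requires a TrueDepth camera")
            return
        }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    // Each updated face anchor carries blend-shape coefficients (0...1) for
    // dozens of expressions; these are the values that animate an Animoji.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            if let jawOpen = faceAnchor.blendShapes[.jawOpen] {
                print("jawOpen: \(jawOpen.floatValue)")
            }
        }
    }
}
```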

The introduction of ARKit, to begin with, opened up a new way of creating and working. The ability to combine tangible reality with on-screen content made it possible to build new kinds of applications, not only for professional tasks but also for games that rely on technologies which, until then, were simply inconceivable.
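
As a rough illustration of that starting point, here is a minimal sketch of spinning up an ARKit world-tracking session with plane detection; a real app would attach the session to an ARSCNView or ARView to render virtual content on the detected surfaces:

```swift
import ARKit

// Minimal sketch: starting an ARKit world-tracking session with plane
// detection, the foundation for apps that blend on-screen content with
// the real world.
let session = ARSession()
let configuration = ARWorldTrackingConfiguration()
configuration.planeDetection = [.horizontal, .vertical]  // find tables, floors, walls
session.run(configuration)
```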

iPhone LiDAR sensor

The incorporation of new sensors into the iPhone and iPad also provided a new way of interacting with these devices. What at first seemed like an incremental change has ended up reshaping the product ecosystem and the very conception of those products. The LiDAR sensors in the iPad and iPhone cameras recognize spaces, depths, and measurements, so the device itself knows what is in front of it and can calculate the position of the surrounding elements.
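
For a sketch of how an app reads that LiDAR data, ARKit exposes it as per-frame scene depth; the DepthReader class below is illustrative only, and the code runs only on LiDAR-equipped iPhone and iPad models:

```swift
import ARKit
import CoreVideo

// Minimal sketch: reading the LiDAR depth map through ARKit's scene-depth
// frame semantics, available on LiDAR-equipped iPhone and iPad models.
final class DepthReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else {
            print("Scene depth requires a LiDAR sensor")
            return
        }
        let configuration = ARWorldTrackingConfiguration()
        configuration.frameSemantics = .sceneDepth
        session.delegate = self
        session.run(configuration)
    }

    // Each frame carries a per-pixel map of distances (in meters) from the
    // camera to the surfaces in front of it.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let depthMap = frame.sceneDepth?.depthMap else { return }
        print("Depth map: \(CVPixelBufferGetWidth(depthMap)) x \(CVPixelBufferGetHeight(depthMap)) pixels")
    }
}
```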

Then there is the Neural Engine, which processes everything related to that mixed reality much faster and more efficiently. These are small pieces of a puzzle that, in the end, have served to improve what Apple already had, but also to create something new out of the refinements made over all these years.
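
To give a concrete idea of how an app opts into that acceleration, here is a minimal Core ML sketch. “SceneClassifier” is a hypothetical model name used only for illustration, and Core ML decides at run time whether the Neural Engine actually executes the work:

```swift
import Foundation
import CoreML

// Minimal sketch: letting Core ML dispatch work to the Neural Engine.
// "SceneClassifier.mlmodelc" is a hypothetical compiled model.
func loadModel() throws -> MLModel {
    let configuration = MLModelConfiguration()
    configuration.computeUnits = .all  // CPU, GPU, and Neural Engine

    guard let url = Bundle.main.url(forResource: "SceneClassifier",
                                    withExtension: "mlmodelc") else {
        throw CocoaError(.fileNoSuchFile)
    }
    return try MLModel(contentsOf: url, configuration: configuration)
}
```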

And the most curious thing of all is that the technology itself has been shown to us in plain sight. Still, with WWDC imminent and the mixed reality glasses so highly anticipated, Apple may well have kept plenty of aces up its sleeve and not yet revealed the true “back room”.
