Google explains the technology behind the Motion Photos of the Pixel 2

Alongside the Google Pixel 2 and Pixel 2 XL, presented in October last year, Google introduced a host of exclusive software features designed to draw a clear line between what we know as “pure Android” and the version of Android that ships on this pair of devices.

Among these exclusive features, many were aimed at the camera. Some, like portrait mode, ended up being ported to other devices through modified apps, while others, such as augmented reality stickers or Motion Photos, remain to this day exclusive to this pair of flagship smartphones made by Google.

And it is precisely Motion Photos that we want to talk about today, because Google, through its official research and development blog, has explained how the fascinating technology behind these animated images works.

This is how the Pixel 2’s Motion Photos work

Before you head straight to the comments section to say that this feature is nothing more than a copy of the iPhone’s Live Photos, you should know that, although the two tools look similar on the surface, the way they work is quite different.

Google explains that, every time a photograph is taken with the Pixel 2, the camera app automatically, unless we disable this option, also captures up to three seconds of video to create the animated image. However, for Motion Photos to look perfect, the system relies on the advanced stabilization algorithm first introduced in the Motion Stills app.
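
To give an idea of what “capturing up to three seconds of video around a photo” implies, here is a minimal, purely illustrative sketch of a rolling frame buffer. Nothing in it (the 30 fps rate, the class or method names) comes from Google; it only mirrors the behaviour described above.

```python
# Illustrative sketch only, not Google's code: keep a short rolling buffer of
# frames so that, when the shutter fires, roughly three seconds of video
# around the still photo can be saved alongside it.
from collections import deque

FPS = 30            # assumed capture rate, for illustration
CLIP_SECONDS = 3    # "up to three seconds", per the article

class MotionClipBuffer:
    def __init__(self, fps=FPS, seconds=CLIP_SECONDS):
        # Rolling buffer that always holds the most recent `seconds` of frames.
        self.frames = deque(maxlen=fps * seconds)

    def push(self, frame):
        """Call once per camera frame while the viewfinder is running."""
        self.frames.append(frame)

    def snapshot_clip(self):
        """On shutter press, return the buffered frames as the motion clip."""
        return list(self.frames)
```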

This software uses data extracted directly from the frames of the video to detect and track objects across several consecutive frames, which are then classified as either foreground or background. However, Google admits that this system is not perfect and can fail in complex situations.
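
One common way to implement this kind of software-only background estimation is to track corner features between consecutive frames and fit a single global motion model, treating points that disagree with it as moving foreground. The sketch below, written with OpenCV and NumPy, is only an illustration of that general approach under our own assumptions; it is not Google’s actual algorithm.

```python
# A minimal sketch of software-based background motion estimation between two
# consecutive grayscale frames. This is NOT Google's implementation; it only
# illustrates tracking features and separating background from foreground.
import cv2
import numpy as np

def estimate_background_motion(prev_gray, curr_gray):
    """Track corners from prev_gray to curr_gray and fit a background model.

    Returns the estimated background homography, the tracked points in the
    current frame, and a boolean mask marking which points agree with the
    global model (background) versus disagree with it (moving foreground).
    """
    # 1. Detect trackable corner features in the previous frame.
    prev_pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=500,
                                       qualityLevel=0.01, minDistance=8)

    # 2. Track those features into the current frame with pyramidal Lucas-Kanade.
    curr_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray,
                                                   prev_pts, None)
    good_prev = prev_pts[status.flatten() == 1]
    good_curr = curr_pts[status.flatten() == 1]

    # 3. Fit a single global motion model (homography) with RANSAC. Points that
    #    fit the model are treated as background; outliers as moving foreground.
    H, inlier_mask = cv2.findHomography(good_prev, good_curr, cv2.RANSAC, 3.0)
    is_background = inlier_mask.flatten().astype(bool)
    return H, good_curr, is_background
```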

That is why, in addition to this software-based stabilization system, Motion Photos also take advantage of the device’s hardware. Every time a photo is captured, the three-second video includes motion metadata generated by the gyroscope and the optical image stabilizer (OIS), which helps the system crop and stabilize only the area that will be shown in the animated photograph.
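
Conceptually, gyroscope readings can be turned into a per-frame rotation that is then undone before cropping. The following sketch assumes a known camera intrinsic matrix, a simple small-angle integration of the gyro data and an arbitrary 5% crop margin; all of these are our own illustrative choices, not values published by Google.

```python
# Hedged sketch of gyro-driven frame stabilization. The intrinsics, the
# integration scheme and the crop margin are illustrative assumptions only.
import cv2
import numpy as np

def gyro_to_homography(omega, dt, K):
    """Turn one angular-velocity sample (rad/s) into a compensating homography.

    omega: (3,) gyro reading covering this frame interval
    dt:    duration of the interval in seconds
    K:     3x3 camera intrinsic matrix (assumed known from calibration)
    """
    # Small-angle rotation obtained by integrating the angular velocity.
    rotvec = (np.asarray(omega, dtype=np.float64) * dt).reshape(3, 1)
    R, _ = cv2.Rodrigues(rotvec)
    # For a pure camera rotation, image motion follows K * R * K^-1;
    # its inverse warps the frame back toward the reference orientation.
    return K @ np.linalg.inv(R) @ np.linalg.inv(K)

def stabilize_frame(frame, omega, dt, K, crop_margin=0.05):
    """Counter-rotate a frame with the gyro-derived homography, then crop."""
    h, w = frame.shape[:2]
    H = gyro_to_homography(omega, dt, K)
    warped = cv2.warpPerspective(frame, H, (w, h))
    # Crop a fixed margin so the black borders introduced by warping are hidden.
    mx, my = int(w * crop_margin), int(h * crop_margin)
    return warped[my:h - my, mx:w - mx]
```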

In this way, Google claims to have created a hybrid motion-estimation system, built into the Motion Photos of its latest phones, that aligns the background of the image more precisely than either the hardware or the software could on its own. Thanks to this system, it is possible to create perfectly stabilized Motion Photos even in the most complex situations. Google offers some examples of how this technology works on the Pixel 2.
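
A very rough way to picture such a hybrid scheme is to fall back on the sensor-based estimate whenever the feature-based one looks unreliable, and to blend the two otherwise. The thresholds and weights below are invented purely for illustration; Google has not published the details of its fusion step.

```python
# Speculative sketch of "hybrid" motion estimation: prefer the gyro/OIS
# estimate when the software estimate is weak, otherwise blend the two.
import numpy as np

def fuse_homographies(H_software, H_hardware, inlier_ratio,
                      min_inlier_ratio=0.3, software_weight=0.7):
    """Combine software- and hardware-derived background homographies."""
    if H_software is None or inlier_ratio < min_inlier_ratio:
        # Feature tracking failed or was dominated by moving foreground
        # objects: trust the sensor-based estimate alone.
        return H_hardware

    # Normalize so both matrices have H[2, 2] == 1 before mixing them.
    Hs = H_software / H_software[2, 2]
    Hh = H_hardware / H_hardware[2, 2]

    # Simple element-wise blend; a real system would more likely fuse the
    # estimates in a parameter space (rotation/translation) than on raw matrices.
    return software_weight * Hs + (1.0 - software_weight) * Hh
```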

Although it may seem like a simple feature, Motion Photos hide a fascinating technology that combines hardware and software so that users can capture any moment in small, perfectly stabilized three-second videos. It is worth mentioning that these images, developed jointly by the Google Research, Google Photos and Google Pixel teams, can already be exported as GIFs or videos in the latest version of the Google Photos app.
