Apple iPhone Xs draws its power from Machine Learning
Apple’s iPhone Xs has a new A12 Bionic chip, equipped with a Neural Engine dedicated to Machine Learning. This powerful component offers new possibilities for developers of iOS applications.
At its keynote on September 12, 2018, Apple unveiled the iPhone Xs, Xs Max and Xr. On the outside, these three new smartphones look much like the iPhone X of 2017. Under the hood, however, the new A12 Bionic chip brings an impressive performance gain, driven largely by artificial intelligence and Machine Learning.
Apple iPhone Xs ships with the A12 Bionic processor and its octa-core Neural Engine
The A12 Bionic is the smartphone industry's first chip built on a 7-nanometer process, and Apple presents it as its most powerful chip to date. It combines a six-core CPU and a four-core GPU with a "Neural Engine" processor dedicated to Machine Learning.
This Neural Engine has eight cores. By comparison, the one in the iPhone X's A11 Bionic processor had only two. As a result, the A12 Bionic can perform 5 trillion operations per second, against 600 billion for the A11 Bionic.
This new chip aims to help smartphones understand the world around them using artificial intelligence algorithms. This notably allows new iPhones to offer more convincing photographic effects, or even augmented reality experiences.
Machine Learning algorithms allow applications to understand and react to what is happening in photos and videos. For example, when the user presses the button to take a photo, the Neural Engine runs code to quickly make sense of the scene being photographed.
It can, for instance, distinguish a person from a landscape. This improves the bokeh effect of portrait mode, and even makes it possible to adjust the depth of field of a photo after it has been taken. Real-time Machine Learning will also let owners of the new iPhones create personalized Siri Shortcuts, or personalized "Memoji" emojis that take on the appearance of the user.
Apple’s iPhone Xs Helps Developers Use A12 Bionic with Core ML 2
In 2017, it was thanks to the Neural Engine of the A11 Bionic processor that the iPhone X could offer the Face ID facial recognition functionality. However, at the time, the Neural Engine was not connected to the Core ML framework which allows developers to inject AI into their applications. With the A12 Bionic, this is now the case. Thus, third-party developers will be able to exploit all the possibilities offered by this powerful chip.
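To give a rough idea of what this looks like in practice, the Swift sketch below shows how a third-party app might run an on-device image classifier through Core ML and Vision. The model name `MobileNetV2` and the `classify` function are assumptions for illustration, not code shown by Apple; any model compiled into the app bundle would work the same way.

```swift
import CoreML
import Vision
import UIKit

// Hypothetical example: classify a UIImage with a bundled Core ML model.
// "MobileNetV2" stands in for any model class generated by Xcode
// from a .mlmodel file added to the project.
func classify(_ image: UIImage) {
    guard let cgImage = image.cgImage,
          let model = try? VNCoreMLModel(for: MobileNetV2().model) else { return }

    // Vision wraps the Core ML model; on an A12 device, Core ML can
    // dispatch the work to the Neural Engine automatically.
    let request = VNCoreMLRequest(model: model) { request, _ in
        guard let top = (request.results as? [VNClassificationObservation])?.first
        else { return }
        print("\(top.identifier): \(top.confidence)")
    }

    let handler = VNImageRequestHandler(cgImage: cgImage)
    try? handler.perform([request])
}
```

The developer only describes the model and the request; deciding whether inference runs on the CPU, the GPU or the Neural Engine is left to Core ML, which is what allows the same code to speed up on the A12 Bionic.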
Machine Learning applications created with Core ML 2 can also run up to nine times faster on the A12 Bionic than on the iPhone X's A11 Bionic, while consuming a tenth of the energy.
To illustrate these possibilities, the developer Nex Team presented an improved version of its HomeCourt application. This new version combines Machine Learning with augmented reality.
Dedicated to basketball, the application analyzes video filmed by an iPhone or iPad in real time to detect shots, whether they are made or missed, and the player's position on the court. According to the developer, it took about two seconds on the iPhone X to present these statistics to the user. On the iPhone Xs and its new processor, the statistics are displayed in real time, without any delay.
Artificial intelligence has become one of the main battlegrounds of the smartphone industry. In October 2017, Google launched its Pixel 2 smartphone with an image processor capable of running artificial neural networks to take better photos.
Notably, in April 2018, Apple recruited John Giannandrea, Google's former head of AI. Today, Apple lists more than 400 job openings related to Machine Learning.