Apple debuts Core ML 3 with on-device machine learning

Apple today introduced Core ML 3, the latest iteration of its machine learning model framework for iOS developers. While Core ML has always run inference on-device, Core ML 3 will for the first time let iOS apps train and update models on-device for personalized experiences. Core ML 3 will also support training with multiple data sets for applications like object detection and sound identification.
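Apple exposes this on-device personalization through an update task API in Core ML 3. The sketch below is an illustration, not code from Apple's announcement: the model URL, batch provider, and output path are hypothetical placeholders a real app would supply from its own bundle and user data.

```swift
import CoreML

// Minimal sketch of Core ML 3's on-device update flow (MLUpdateTask).
// Assumes an updatable, compiled .mlmodelc bundled with the app and an
// MLBatchProvider built from examples collected on the device.
func personalizeModel(at modelURL: URL, with trainingData: MLBatchProvider) throws {
    let task = try MLUpdateTask(
        forModelAt: modelURL,          // compiled, updatable Core ML model
        trainingData: trainingData,    // user-specific examples; data never leaves the device
        configuration: nil,            // default MLModelConfiguration
        completionHandler: { context in
            // context.model is the newly personalized model; persist it for later sessions.
            let updatedURL = modelURL
                .deletingLastPathComponent()
                .appendingPathComponent("personalized.mlmodelc") // hypothetical output path
            try? context.model.write(to: updatedURL)
        }
    )
    task.resume() // training runs locally on the device, not on a server
}
```

Because the update runs entirely on the device, the user data used for personalization is never uploaded, which is the privacy argument Apple makes for this approach.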

Apple’s machine learning framework will be able to support models with more than 100 layer types.

On-device machine learning is growing in popularity as a way to deploy quickly on the edge and respect user privacy. In recent months, on-device capabilities have come to popular frameworks like Google’s TensorFlow and Facebook’s PyTorch through approaches like federated learning.

The news was announced at Apple’s Worldwide Developers Conference (WWDC) being held this week in San Jose, California.

VentureBeat has reached out to an Apple spokesperson for more details about Core ML 3. This story will be updated with additional details as they become available.

Also announced today: watchOS 6 with Voice Memos and menstrual cycle tracking, iOS 13 with a more expressive Siri and personalized results on Apple’s HomePod, a modular Mac Pro on wheels, and the ability to control Apple’s tvOS with PlayStation and Xbox controllers.

The Core ML framework is used internally at Apple to power services like Siri and the QuickType keyboard, and by third parties for features like language learning in the Memrise app and predictions in the Polarr photo editing app.

Last year at WWDC, Apple introduced Core ML 2, a framework Apple said was 30% faster than its predecessor, alongside Create ML, a GPU-accelerated framework for training custom AI models with Xcode and the Swift programming language. The initial Core ML framework was introduced at WWDC in 2017 and incorporated into iOS 11. Ready-made functionality that works out of the box includes Apple’s Vision API and Natural Language framework.


Unlike Google’s ML Kit, which works for both Android and iOS developers, Core ML is made exclusively for developers creating apps for Apple’s iOS operating system. Google added Core ML support to its TensorFlow Lite back in late 2017.
