Wednesday, June 6, 2018

iOS 12 Machine Learning

Apple introduced Core ML 2, which is said to enable apps to be built with on-device intelligence.

In addition to supporting extensive deep learning with over 30 layer types, Core ML also supports standard models such as tree ensembles, SVMs, and generalized linear models. Because it's built on top of low-level technologies like Metal and Accelerate, Core ML seamlessly takes advantage of the CPU and GPU to provide maximum performance and efficiency. Machine learning models run on the device, so data doesn't need to leave it to be analyzed.
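As a rough illustration, here is a minimal Swift sketch of an on-device prediction with Core ML. It assumes a hypothetical image classification model (say MobileNet.mlmodel from Apple's model page) has been added to the Xcode project, so Xcode generates a typed MobileNet class for it:

import CoreML
import CoreVideo

// Sketch only: assumes a MobileNet.mlmodel file has been added to the project,
// for which Xcode generates a `MobileNet` class with a typed prediction API.
func classify(pixelBuffer: CVPixelBuffer) {
    do {
        let model = MobileNet()
        // The prediction runs entirely on device, so the image never leaves it.
        let output = try model.prediction(image: pixelBuffer)
        print("Top label: \(output.classLabel)")
    } catch {
        print("Prediction failed: \(error)")
    }
}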

Apple's machine learning stack also includes the Vision framework's image analysis APIs. Supported features include face tracking, face detection, facial landmarks, text detection, rectangle detection, barcode detection, object tracking, and image registration. A small face detection example follows.
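Here is a hedged sketch of face detection with the Vision framework; the CGImage input is assumed to come from elsewhere in the app:

import Vision

// Sketch only: detect face rectangles in a CGImage on device.
func detectFaces(in image: CGImage) {
    let request = VNDetectFaceRectanglesRequest { request, error in
        guard let faces = request.results as? [VNFaceObservation] else { return }
        for face in faces {
            // boundingBox is in normalized coordinates (0...1, origin at bottom-left).
            print("Found face at \(face.boundingBox)")
        }
    }
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    do {
        try handler.perform([request])
    } catch {
        print("Face detection failed: \(error)")
    }
}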

The Natural Language framework is a new framework apps can use to analyze natural language text and deduce its language-specific metadata. Apps can use this framework with Create ML to train and deploy custom NLP models.
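A small sketch of what the Natural Language framework looks like in practice, detecting the dominant language of a string and tagging each word with its part of speech (the sample text is made up):

import NaturalLanguage

// Detect the dominant language of the text.
let text = "Core ML makes it easy to run models on device."
let recognizer = NLLanguageRecognizer()
recognizer.processString(text)
print("Dominant language: \(recognizer.dominantLanguage?.rawValue ?? "unknown")")

// Tag each word with its lexical class (noun, verb, etc.).
let tagger = NLTagger(tagSchemes: [.lexicalClass])
tagger.string = text
tagger.enumerateTags(in: text.startIndex..<text.endIndex,
                     unit: .word,
                     scheme: .lexicalClass,
                     options: [.omitPunctuation, .omitWhitespace]) { tag, range in
    if let tag = tag {
        print("\(text[range]): \(tag.rawValue)")
    }
    return true
}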

Core ML works with Core ML models (.mlmodel files). Apple provides tools to create new models, such as Create ML, and also offers a set of ready-made models you can download and use.
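To give a sense of the model-creation side, here is a hedged Create ML sketch. Create ML runs on macOS (for instance in an Xcode Playground), and the directory paths below are placeholders:

import CreateML
import Foundation

// Sketch only: train an image classifier from labeled sub-directories of images
// at a placeholder path, then export an .mlmodel file for use with Core ML.
let trainingDir = URL(fileURLWithPath: "/path/to/training-images")
let classifier = try MLImageClassifier(trainingData: .labeledDirectories(at: trainingDir))
try classifier.write(to: URL(fileURLWithPath: "/path/to/MyClassifier.mlmodel"))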

References:
https://developer.apple.com/machine-learning/
