Core ML and Machine Learning in iOS


Core ML enables apps to use machine learning models with low power consumption, efficient processing speed, and low memory usage. Core ML supports a variety of model types, including neural networks, tree ensembles, support vector machines, generalized linear models, feature engineering, and pipeline models. However, all of these models need to be converted to the .mlmodel file format.

Conversion workflow

In essence, a Core ML model is simply a trained machine learning model converted into an .mlmodel file.

Convert to Core ML

As of now, Apple supports conversion from the following formats:

  • Caffe
  • Keras
  • XGBoost
  • Scikit-learn
  • libsvm

Model formats

To convert already trained models to the .mlmodel format, Apple has introduced coremltools, an open-source Python package.

Note: Use coremltools with Python 2.7, as other Python versions on macOS may run into configuration errors.

Steps

Run these commands to set up a Python 2.7 virtual environment:

pip install virtualenv
virtualenv --python=/usr/bin/python2.7 python27
source python27/bin/activate

Check the current version of Python installed on your system:

python --version

Install coremltools in the environment:

pip install -U coremltools

The -U flag tells pip to upgrade coremltools if it is already installed. That completes the installation.

We are now ready to convert an already trained model into an .mlmodel file.
Files required: bvlc_alexnet.caffemodel, deploy.prototxt, class_labels.txt

In the terminal, change into the directory that contains these files, start the Python interpreter, and run:

import coremltools
# Convert a Caffe model to a classifier in Core ML
# class_labels points to the text file listing one class name per line
coreml_model = coremltools.converters.caffe.convert(('bvlc_alexnet.caffemodel', 'deploy.prototxt'),
                                                    class_labels='class_labels.txt')

Here,

deploy.prototxt – describes the structure of the neural network.
bvlc_alexnet.caffemodel – the already trained Caffe model (its weights).
class_labels.txt – the list of class names the model can predict, one per line.

To start the conversion, just press ENTER and the tool will begin converting the model.

The >>> prompt will reappear when the conversion is complete. Then save the model:

# Now save the model
coreml_model.save('BVLCObjectClassifier.mlmodel')

And you’ll see a ‘BVLCObjectClassifier.mlmodel’ file in the same folder. That’s it. Integrate the newly created model into Xcode and start identifying objects.

Integrating Core ML Model into Xcode Project

Drag the Core ML model into the Xcode project navigator. Xcode will detect it as a Core ML model and display its description, the model’s input and output parameters, and other necessary information.

Xcode integration

It may show a message like ‘Model is not part of any target…’. Just add the model to the app target by selecting the target.


Xcode will now generate a new class with the name of your model, and its properties will be generated automatically.

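As an illustration, here is a minimal sketch of using the generated class, assuming the model file is named BVLCObjectClassifier.mlmodel (so Xcode generates a class called BVLCObjectClassifier):

import CoreML

// Assumes Xcode generated a class named after the .mlmodel file
let classifier = BVLCObjectClassifier()

// The underlying MLModel exposes the same input/output descriptions
// that Xcode shows in the model inspector
print(classifier.model.modelDescription.inputDescriptionsByName)
print(classifier.model.modelDescription.outputDescriptionsByName)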

Now, in the ViewController, add the code to pick an image and then predict the result.

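The original code was shown as screenshots; as a stand-in, here is a hedged sketch of a ViewController that picks an image with UIImagePickerController and classifies it through the Vision framework. The BVLCObjectClassifier class name and the imageView/resultLabel outlets are assumptions; adapt them to your own model and storyboard.

import UIKit
import CoreML
import Vision

class ViewController: UIViewController, UIImagePickerControllerDelegate, UINavigationControllerDelegate {

    // Assumed outlets; wire them up in your storyboard
    @IBOutlet weak var imageView: UIImageView!
    @IBOutlet weak var resultLabel: UILabel!

    // Present the photo library when the user taps a button
    @IBAction func pickImageTapped(_ sender: UIButton) {
        let picker = UIImagePickerController()
        picker.delegate = self
        picker.sourceType = .photoLibrary
        present(picker, animated: true)
    }

    func imagePickerController(_ picker: UIImagePickerController,
                               didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
        picker.dismiss(animated: true)
        guard let image = info[.originalImage] as? UIImage else { return }
        imageView.image = image
        predict(image)
    }

    private func predict(_ image: UIImage) {
        // Wrap the generated model (assumed name: BVLCObjectClassifier) for use with Vision
        guard let ciImage = CIImage(image: image),
              let visionModel = try? VNCoreMLModel(for: BVLCObjectClassifier().model) else { return }

        let request = VNCoreMLRequest(model: visionModel) { [weak self] request, _ in
            // Vision returns classifications sorted by confidence; take the top one
            guard let top = (request.results as? [VNClassificationObservation])?.first else { return }
            DispatchQueue.main.async {
                self?.resultLabel.text = "\(top.identifier) (\(Int(top.confidence * 100))%)"
            }
        }

        // Run the request off the main thread
        DispatchQueue.global(qos: .userInitiated).async {
            try? VNImageRequestHandler(ciImage: ciImage).perform([request])
        }
    }
}

Depending on the iOS version, picking from Photos may also require the NSPhotoLibraryUsageDescription key in Info.plist.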

Now, run the app and pick an image from Photos. The model will predict what the picked image contains and display the label along with its probability.

Limitations of Core ML:

- At present, Apple only supports already trained models.

- Models cannot be trained (or retrained) inside the app.

- Conversion is limited to a few model formats.

- Developers have no option to choose between running on the CPU or the GPU.

- Bundling a model increases the size of the app.

- Results may differ across platforms, since the .mlmodel format is not supported everywhere.
