
Create a Generative Adversarial Network iOS App with CoreML

I created an app that generates handwritten images with CoreML on iOS.

I'm going to explain the process of releasing this app from beginning to end.

The software versions

The process of making a handwritten-image generator app on iOS and submitting it to the App Store

Figure 1 illustrates the path of distributing this app through the App Store:

  1. Create a GAN (Generative Adversarial Network) model with Keras
  2. Convert the Keras GAN model to a CoreML model
  3. Create an app that shows handwritten images generated by the CoreML model
  4. Distribute the app

Figure 1. The path of distributing this app through the App Store

Install Docker and run Jupyter Notebook that includes TensorFlow and Keras

First, to create a Keras model, we have to install Docker.

Install Docker

On Mac, get it from the Stable channel on this page and just install it.

Run Jupyter Notebook with TensorFlow

Fortunately, there is a Docker image that includes TensorFlow.

Just type this command:

$ docker run -d -p 8888:8888 --name notebook tensorflow/tensorflow:latest

Install Keras and CoreML converter

$ docker exec -it notebook /bin/bash
$ pip install -U keras
$ pip install -U coremltools

Check that you can access the Jupyter Notebook

Before accessing it, you need to get an access token.

$ docker logs notebook
# You will find an access token like this:
#
#    Copy/paste this URL into your browser when you connect for the first time,
#    to login with a token:
#        http://localhost:8888/?token=ecf1a2471670eb6863195ab530d6ac1d5cc27511faca0afe

Copy the URL with the access token and open it in your browser.

Now it's done! You can execute Keras code!
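
If you want to make sure the libraries are in place, you can run a cell like the one below. It only checks that the packages installed above can be imported; the exact versions printed will depend on when you pulled the image and ran pip.

# Sanity check: confirm TensorFlow, Keras, and coremltools can be imported
import tensorflow as tf
import keras
import coremltools

print(tf.__version__, keras.__version__, coremltools.__version__)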

Make a Keras model

Add a new notebook in Jupyter Notebook and write a GAN model. I referred to this source code. Copy that code, paste it into a cell of your Jupyter Notebook, and add the code below. It will save the models to a directory.

if __name__ == '__main__':
    gan = GAN()
    gan.train(epochs=30000, batch_size=32, save_interval=200)
    gan.discriminator.save('./discriminator.h5')  # Specify the path to save the discriminator model
    gan.generator.save('./generator.h5')  # Specify the path to save the generator model

Then just run the cell. It took about 25 minutes on my MacBook Pro 2016 to train this model.

Keras creates the trained models in the directory where the cell was executed. The files have the ".h5" extension.
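
For reference, the GAN class used above has roughly the structure sketched below. This is a simplified outline based on a common Keras MNIST GAN, not the referenced implementation itself; the layer sizes, the 100-dimensional latent vector, and the omitted training loop are illustrative assumptions.

from keras.models import Sequential, Model
from keras.layers import Dense, LeakyReLU, Reshape, Flatten, Input
from keras.optimizers import Adam

class GAN:
    def __init__(self):
        self.latent_dim = 100  # size of the random noise vector fed to the generator
        optimizer = Adam(0.0002, 0.5)

        # The discriminator classifies 28 x 28 images as real or generated
        self.discriminator = Sequential([
            Flatten(input_shape=(28, 28, 1)),
            Dense(512), LeakyReLU(alpha=0.2),
            Dense(256), LeakyReLU(alpha=0.2),
            Dense(1, activation='sigmoid'),
        ])
        self.discriminator.compile(loss='binary_crossentropy', optimizer=optimizer)

        # The generator maps a noise vector to a 28 x 28 image
        self.generator = Sequential([
            Dense(256, input_dim=self.latent_dim), LeakyReLU(alpha=0.2),
            Dense(512), LeakyReLU(alpha=0.2),
            Dense(28 * 28, activation='tanh'),
            Reshape((28, 28, 1)),
        ])

        # The combined model trains the generator to fool the (frozen) discriminator
        z = Input(shape=(self.latent_dim,))
        self.discriminator.trainable = False
        validity = self.discriminator(self.generator(z))
        self.combined = Model(z, validity)
        self.combined.compile(loss='binary_crossentropy', optimizer=optimizer)

    def train(self, epochs, batch_size=32, save_interval=200):
        # Alternately train the discriminator on real/generated MNIST batches
        # and the combined model on noise; see the referenced source for details.
        ...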

Convert a Keras model to CoreML model

$ coremlconverter --srcModelPath ./generator.h5 --dstModelPath ./coreml_model.mlmodel --inputNames ganInput --outputNames ganOutput

This command uses TensorFlow in the background and converts the model to a CoreML model. After the conversion, it creates a CoreML model with the .mlmodel extension in the specified directory. This file is the CoreML model converted from the Keras model.
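
If you prefer to run the conversion from Python, for example in the same notebook, coremltools also exposes a Keras converter API. Here is a minimal sketch, assuming the generator was saved as ./generator.h5 as above:

import coremltools

# Convert the saved Keras generator; the input/output names become the labels
# used by the Swift class that Xcode generates later (ganInput / ganOutput).
coreml_model = coremltools.converters.keras.convert(
    './generator.h5',
    input_names='ganInput',
    output_names='ganOutput',
)
coreml_model.save('./coreml_model.mlmodel')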

Create an app to show handwritten images

There are several ways to create an app that draws a handwritten image, for instance as a web application or a mobile application. This time, I created an app for iOS.

Create the app

Here is the source code for the app: https://github.com/yanak/gangen/blob/master/Gangen/HandwrittenImage.swift

Figure 2 is the app's screenshot.

Figure 2. The GANs generator app's screenshot

This app generates a 28 x 28 handwritten image using the CoreML model converted from Keras and renders the image with UIKit. There is a regenerate button that generates a new handwritten image.

Generate a handwritten image

First, import the CoreML model converted from the Keras model into the Xcode project directory. It looks like figure 3:

Figure 3. The CoreML model properties

After Xcode imports the model, a class for it is automatically generated by Xcode. This class has a prediction method, and it takes the Keras model's input as its first argument. The first argument's label uses the value of the CoreML converter's --inputNames option. In this case, the class defines the prediction input label as ganInput and the generated handwritten image's output as ganOutput. You can see these definitions in the Model Evaluation Parameters of the CoreML model (see figure 3).

// Create a gan instance
let model = gan()
// Prepare a random noise vector as the input (its size must match the Keras model's latent dimension)
let noise = try MLMultiArray(shape: [100], dataType: .double)
(0..<noise.count).forEach { noise[$0] = NSNumber(value: Double.random(in: -1...1)) }
// Generate a handwritten image from the noise vector
let output = try model.prediction(ganInput: noise)

// render hand-written image
let HEIGHT = 27
let WIDTH = 27

for i in 0...HEIGHT {
  for j in 0...WIDTH {
    //create the path
    let plusPath = UIBezierPath()

    //set the path's line width to the height of the stroke
    plusPath.lineWidth = Constants.plusLineWidth

    //move the initial point of the path
    //to the start of the horizontal stroke
    plusPath.move(to: CGPoint(
      x: CGFloat(j * 10),
      y: CGFloat(i * 10) + Constants.plusLineWidth / 2
    ))

    //add a point to the path at the end of the stroke
    plusPath.addLine(to: CGPoint(
      x: CGFloat((j * 10) + 10),
      y: CGFloat(i * 10) + Constants.plusLineWidth / 2
    ))

    //set the stroke color from the model's output pixel value
    let index: [NSNumber] = [0 as NSNumber, i as NSNumber, j as NSNumber]
    UIColor(white: CGFloat(truncating: output.ganOutput[index]), alpha: CGFloat(1)).setStroke()

    //draw the stroke
    plusPath.stroke()
  }
}

Open this project and you can run it in a simulator like this:

Enroll Apple Developer Program

After checking that the app runs, submit it to the App Store! If you haven't enrolled in the Apple Developer Program yet, you need to do it here because enrollment is required to submit the app to the App Store.

Submit to the App Store

For details, see Submitting Your Apps.

  1. Archiving: in Xcode, choose Product > Archive.

  2. Validating: Xcode shows the archived app in the Archives organizer. Validate the archived app before uploading it to the App Store.

  3. Uploading to iTunes Connect: click Upload to App Store.

Publish app

For details, see the iTunes Connect Developer Help page.

  1. Add an app in iTunes Connect
  2. Add app icon, app previews and screenshots.
  3. Submit an app to App Review
  4. Release an app