Using the GAN models in iOS

If you try to use the TensorFlow pod in your iOS app and load the gan_mnist.pb file, you'll get an error:

Could not create TensorFlow Graph: Invalid argument: No OpKernel was registered to support Op 'RandomStandardNormal' with these attrs. Registered devices: [CPU], Registered kernels:
<no registered kernels>
[[Node: z_1/RandomStandardNormal = RandomStandardNormal[T=DT_INT32, _output_shapes=[[50,100]], dtype=DT_FLOAT, seed=0, seed2=0](z_1/shape)]]

Make sure the tensorflow/contrib/makefile/tf_op_files.txt file includes tensorflow/core/kernels/random_op.cc, which implements the RandomStandardNormal operation, and that libtensorflow-core.a is rebuilt with tensorflow/contrib/makefile/build_all_ios.sh after that line has been added to tf_op_files.txt.

Furthermore, if you try to load pix2pix_transformed_memmapped.pb, even with a custom TensorFlow library built from TensorFlow 1.4, you'll get the following error:

No OpKernel was registered to support Op 'FIFOQueueV2' with these attrs. Registered devices: [CPU], Registered kernels:
<no registered kernels>
[[Node: batch/fifo_queue = FIFOQueueV2[_output_shapes=[[]], capacity=32, component_types=[DT_STRING, DT_FLOAT, DT_FLOAT], container="", shapes=[[], [256,256,1], [256,256,2]], shared_name=""]()]]

You'll need to add tensorflow/core/kernels/fifo_queue_op.cc to tf_op_files.txt and rebuild the iOS library. If you use TensorFlow 1.5 or 1.6, however, tensorflow/core/kernels/fifo_queue_op.cc is already included in tf_op_files.txt. With each new version of TensorFlow, more and more kernels are added to tf_op_files.txt by default.
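Both fixes follow the same pattern: append the missing kernel source file to tf_op_files.txt (idempotently, so re-running is harmless) and then rebuild. Here is a minimal sketch of that step; it operates on a temporary mock checkout so it is safe to run anywhere, but in practice TF_ROOT would be your real TensorFlow source tree:

```shell
# Assumption: in practice TF_ROOT is your TensorFlow checkout; here we
# create a mock one so the snippet can be run standalone.
TF_ROOT="$(mktemp -d)"
mkdir -p "$TF_ROOT/tensorflow/contrib/makefile"
OP_FILES="$TF_ROOT/tensorflow/contrib/makefile/tf_op_files.txt"
printf 'tensorflow/core/kernels/cast_op.cc\n' > "$OP_FILES"

# Append each missing kernel source only if it is not already listed.
for kernel in tensorflow/core/kernels/random_op.cc \
              tensorflow/core/kernels/fifo_queue_op.cc; do
    grep -qxF "$kernel" "$OP_FILES" || echo "$kernel" >> "$OP_FILES"
done

grep -cE 'random_op|fifo_queue_op' "$OP_FILES"   # prints 2
# Then rebuild the static library in the real checkout (slow):
#   cd "$TF_ROOT" && tensorflow/contrib/makefile/build_all_ios.sh
```

The `grep -qxF ... || echo ... >>` guard makes the append safe to repeat across TensorFlow upgrades, where newer versions may already list some of these kernels.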

With the TensorFlow iOS library built for our models, let's create a new project named GAN in Xcode and set up TensorFlow in the project as we did in Chapter 8, Predicting Stock Price with RNN, and the other chapters that don't use the TensorFlow pod. Then drag and drop the two model files, gan_mnist.pb and pix2pix_transformed_memmapped.pb, and one test image to the project. Also, copy the tensorflow_utils.h, tensorflow_utils.mm, ios_image_load.h, and ios_image_load.mm files from the iOS project in Chapter 6, Describing Images in Natural Language, to the GAN project. Finally, rename ViewController.m to ViewController.mm.

Now your Xcode project should look like Figure 9.2:

Figure 9.2: Showing the GAN app in Xcode

We'll create a button that, when tapped, prompts the user to pick a model to either generate digits or enhance an image:

- (IBAction)btnTapped:(id)sender {
    UIAlertAction* mnist = [UIAlertAction actionWithTitle:@"Generate Digits" style:UIAlertActionStyleDefault handler:^(UIAlertAction * action) {
        _iv.image = nil;
        dispatch_async(dispatch_get_global_queue(0, 0), ^{
            NSArray *arrayGreyscaleValues = [self runMNISTModel];
            dispatch_async(dispatch_get_main_queue(), ^{
                UIImage *imgDigit = [self createMNISTImageInRect:_iv.frame values:arrayGreyscaleValues];
                _iv.image = imgDigit;
            });
        });
    }];
    UIAlertAction* pix2pix = [UIAlertAction actionWithTitle:@"Enhance Image" style:UIAlertActionStyleDefault handler:^(UIAlertAction * action) {
        _iv.image = [UIImage imageNamed:image_name];
        dispatch_async(dispatch_get_global_queue(0, 0), ^{
            NSArray *arrayRGBValues = [self runPix2PixBlurryModel];
            dispatch_async(dispatch_get_main_queue(), ^{
                UIImage *imgTranslated = [self createTranslatedImageInRect:_iv.frame values:arrayRGBValues];
                _iv.image = imgTranslated;
            });
        });
    }];

    UIAlertAction* none = [UIAlertAction actionWithTitle:@"None" style:UIAlertActionStyleDefault handler:^(UIAlertAction * action) {}];

    UIAlertController* alert = [UIAlertController alertControllerWithTitle:@"Use GAN to" message:nil preferredStyle:UIAlertControllerStyleAlert];
    [alert addAction:mnist];
    [alert addAction:pix2pix];
    [alert addAction:none];
    [self presentViewController:alert animated:YES completion:nil];
}

The code here is pretty straightforward. The app's main functionality is implemented in four methods: runMNISTModel, runPix2PixBlurryModel, createMNISTImageInRect, and createTranslatedImageInRect.
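As a preview of the inference pattern the two run methods share, here is a minimal, hypothetical sketch of runMNISTModel built on the LoadModel helper from the tensorflow_utils files copied in earlier. The output node name "generator/Sigmoid" is an illustrative assumption, not taken from the actual model; check your frozen graph for the real node names:

```objc
#include <memory>
#include <vector>
#include "tensorflow/core/public/session.h"
#include "tensorflow_utils.h"

- (NSArray *)runMNISTModel {
    // LoadModel comes from tensorflow_utils.mm, copied from Chapter 6.
    std::unique_ptr<tensorflow::Session> session;
    tensorflow::Status load_status = LoadModel(@"gan_mnist", @"pb", &session);
    if (!load_status.ok()) return nil;

    // The frozen graph generates its own random z (the RandomStandardNormal
    // node seen in the earlier error), so no input feed is required.
    std::vector<tensorflow::Tensor> outputs;
    tensorflow::Status run_status =
        session->Run({}, {"generator/Sigmoid"}, {}, &outputs);
    if (!run_status.ok()) return nil;

    // Copy the flat float output into an NSArray of greyscale values.
    NSMutableArray *values = [NSMutableArray array];
    auto flat = outputs[0].flat<float>();
    for (int i = 0; i < flat.size(); i++) {
        [values addObject:@(flat(i))];
    }
    return values;
}
```

Running inference on a background queue, as btnTapped does, keeps the UI responsive; only the final NSArray crosses back to the main queue for drawing.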