Machine Learning with CoreML

Rise of CoreML

We iOS developers are a lucky bunch – apart from the usual holidays in December, we enjoy a special Christmas every June, thanks to Apple's Worldwide Developers Conference. 2017 was no exception. When Apple unwrapped the boxes for us, out came the new HomePod speaker, the new beast called iMac Pro, and macOS High Sierra – everything was awesome! Then there were the toys for developers. I must have been a very nice guy all year, because it was a pleasant surprise when Apple revealed CoreML, their new machine learning integration framework – out of professional curiosity, I had been dabbling with machine learning for the past few months. Having the opportunity to bring the power of machine learning to iOS, I could not wait to get my hands dirty! Here's an outline of what I learned:

What is Machine Learning

Before we jump off the cliff, let's discuss a little about what's beneath.

You see, when a human child takes her first step at the doorway of learning, she cannot learn by herself. Instead, her hand is carefully held by the teacher, and with intensive guidance she is steered along the path of acquiring knowledge. As she learns, she also gains experience.


The trusted friend who would, one day, take the job off her teacher's careful hands and become her lifelong guide and companion – growing with her as she passes through the oft-perilous ways of life. And exactly there, dear reader, a machine has differed from a human being thus far. A machine could be taught, but it could not teach itself – until machine learning evolved. Machine Learning provides the experience factor to the intelligence system of a machine, which is also known as Artificial Intelligence. It is the science of getting computers to act without being explicitly programmed.

A computer program is said to learn from experience E with respect to some class of tasks T and performance measure P if its performance at tasks in T, as measured by P, improves with experience E.
 – Mitchell, T. (1997). Machine Learning. McGraw-Hill. p. 2. ISBN 0-07-042807-7.

Types of Learning

Based on the algorithms used to train a machine to gain experience, Machine Learning can be grouped into two major categories – Supervised Learning and Unsupervised Learning. Supervised Learning is where a machine is trained with a complete set of labeled training data and outcomes; Unsupervised Learning is where the machine is trained without labeled data. Using supervised learning, a machine can solve classification or regression problems; using unsupervised learning, it can solve clustering and some other types of problems. Following are some examples of these problems:

  • Classification: The machine is given a dataset and, based on specific parameters, classifies the items into different categories. For example, based on the size and shape of a tumor, the machine can classify it as malignant or benign.
  • Regression: Based on various parameters – like product price, demand, and distribution – together with historical data, a machine can predict the profit for the current or future years.
  • Clustering: The best example of clustering is probably Google News. It uses an algorithm to group news items with the same topic and content and show them together. Pattern recognition plays a key part in clustering solutions.
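To make the classification example above concrete, here is a toy, self-contained sketch – plain Swift, nothing to do with CoreML, and all numbers and names made up for illustration – of a nearest-centroid classifier that labels a tumor from two features:

```swift
// Supervised learning in miniature: "train" on labeled samples by
// computing a mean feature vector (centroid) per label, then "predict"
// by picking the label whose centroid is closest.
struct Sample {
    let size: Double
    let irregularity: Double
    let label: String
}

func nearestCentroidClassifier(training: [Sample]) -> (Double, Double) -> String {
    // Training: sum features per label, then average into centroids.
    var sums: [String: (Double, Double, Int)] = [:]
    for s in training {
        let (x, y, n) = sums[s.label] ?? (0, 0, 0)
        sums[s.label] = (x + s.size, y + s.irregularity, n + 1)
    }
    let centroids = sums.mapValues { ($0.0 / Double($0.2), $0.1 / Double($0.2)) }

    // Prediction: squared Euclidean distance to each centroid.
    return { size, irregularity in
        func dist2(_ c: (Double, Double)) -> Double {
            let dx = c.0 - size
            let dy = c.1 - irregularity
            return dx * dx + dy * dy
        }
        return centroids.min { dist2($0.value) < dist2($1.value) }!.key
    }
}

let training = [
    Sample(size: 1.0, irregularity: 0.1, label: "benign"),
    Sample(size: 1.2, irregularity: 0.2, label: "benign"),
    Sample(size: 3.0, irregularity: 0.9, label: "malignant"),
    Sample(size: 3.4, irregularity: 0.8, label: "malignant"),
]
let classify = nearestCentroidClassifier(training: training)
print(classify(1.1, 0.15))  // benign
print(classify(3.2, 0.85))  // malignant
```

Real models learn far richer decision boundaries, of course, but the train-then-predict shape is the same one a CoreML model packages up for you.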

Once such an algorithm is trained, a model can be generated that the machine refers to for further predictions, inferences, and deductions. Machine learning tools can generate such a model, but once generated they cannot be used in iOS apps as is. They need to be converted to the Xcode-supported .mlmodel format.


Apple provides links to a few open-source CoreML models that solve classification problems, like detecting the major object(s) in a picture or detecting a scene from a picture.

However, apart from these, a machine learning model generated by any machine learning tool can be converted into a CoreML model using CoreML Tools, and then used in the app.

Core ML lets you integrate a broad variety of machine learning model types into your app. In addition to supporting extensive deep learning with over 30 layer types, it also supports standard models such as tree ensembles, SVMs, and generalized linear models. Because it’s built on top of low level technologies like Metal and Accelerate, Core ML seamlessly takes advantage of the CPU and GPU to provide maximum performance and efficiency. You can run machine learning models on the device so data doesn’t need to leave the device to be analyzed.

Using a CoreML model and the Vision framework, it's really easy to build an iOS app that – given a photo – can detect scenes or major objects in it and display them. I won't go into the details of building this app from scratch, but rather will discuss the heart of the application – the fun part – and it's just a few steps.

The Photo Recognition App

I will assume that the app is set up to provide an image, either by picking a photo from the native photo picker or by taking one with the camera.

Step 1. The first step is to download a machine learning model from Apple's website and include it in the app. Here I am using the Inceptionv3 model listed on Apple's website. It seems to be a very good model, with much better accuracy than the others – although a bit heavy in size. Now Xcode does some heavy lifting for you: as soon as the model is added, Xcode generates a model class named after the model. To see it, just highlight the model in Xcode's file navigator.


In the next steps, we will refer to this class as Inceptionv3.

Step 2. Now it's time for some code. Import the Vision and CoreML frameworks, which will aid us in our journey, then implement the following:

import Vision
import CoreML

guard let model = try? VNCoreMLModel(for: Inceptionv3().model) else {
    fatalError("Can't load CoreML model")
}

Here we create an instance of the VNCoreMLModel class for the CoreML model Inceptionv3. It's recommended to initialize this early so that recognition is faster once an image is selected.

Step 3. Now we need to create a VNCoreMLRequest, which will query the model with our image to find out what it is.

let request = VNCoreMLRequest(model: model) { [weak self] request, error in
    guard let results = request.results as? [VNClassificationObservation],
          let topResult = results.first else {
        fatalError("Result not in correct format")
    }

    DispatchQueue.main.async {
        self?.predictionLabel.text = "\(topResult.confidence * 100)% chance to be \(topResult.identifier)"
    }
}

Here we create a VNCoreMLRequest and specify a completion block to run once it finishes execution. The completion block just takes the first result from the prediction set, received as an array of VNClassificationObservation objects. As I discussed before, classification is one type of problem; there are others, like clustering and regression. Notice that VNClassificationObservation is a subclass of VNObservation.
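As a small aside, the string interpolation above prints the raw floating-point confidence, which can look noisy on a label. A tiny pure helper – the function name is my own, not part of Vision – keeps the text readable and is easy to unit test:

```swift
// Turn a classification identifier and confidence (0...1) into the
// label text used above, rounding the percentage for display.
func predictionText(identifier: String, confidence: Float) -> String {
    let percent = (confidence * 100).rounded()
    return "\(Int(percent))% chance to be \(identifier)"
}

print(predictionText(identifier: "golden retriever", confidence: 0.874))
// 87% chance to be golden retriever
```

In the request's completion block you would then write `self?.predictionLabel.text = predictionText(identifier: topResult.identifier, confidence: topResult.confidence)`.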

A VNCoreMLRequest uses a VNCoreMLModel, which wraps a CoreML model, to run predictions with that model. Depending on the model, the returned observation is a VNClassificationObservation for classifier models, a VNPixelBufferObservation for image-to-image models, or a VNCoreMLFeatureValueObservation for everything else.

Step 4. We are almost there. The last and final step is to actually execute the request – a job well suited for the one and only VNImageRequestHandler.

let handler = VNImageRequestHandler(ciImage: image)
DispatchQueue.global(qos: .userInteractive).async {
    do {
        try handler.perform([request])
    } catch {
        print(error)
    }
}

All the code listed above can be included in one method; once it executes, predictionLabel shows the name of the major object in the picture along with the confidence.


A note on accuracy

From the above screenshot, it might appear that the world of machine learning and prediction is all rainbows and unicorns, but in reality it's far from that. Machine Learning is still in its infancy and has much room for improvement. As for the iOS app, it all depends on the model used, and it's very easy to miss the optimal sweet spot and instead under-train or overtrain the model. In case of overtraining, the model starts focusing on the quirks of the training set, and its accuracy on new data diminishes.


Using a CoreML model and the Vision framework to leverage machine learning and create perception of the outside world opens up endless possibilities. Once the machine recognizes an object, responding to it is probably the next obvious step. In iOS 11, ARKit provides Augmented Reality – one of many options for doing something with this new superpower the iPhone has got. I intend to touch on that in my next post. Meanwhile, have fun and learn how to train your machine!


All copyrights of images belong to their respective owners.

Swift 2.0 and Unit Testing


As we mature as programmers and professionals, we see things differently. Our approaches and perspectives change, and our paradigms shift. We start weighing things on a different scale, silently laughing at the follies we made in adolescence. I admit, in the initial days of my journey as a programmer, I used to program like an artist. I would make a basic structure first, then slowly perfect it toward the requirement by debugging, narrowing the gap between the current and target state. It had been working fine, but the biggest problem with that approach was that debugging almost always takes more effort and time than actual development. Though I love debugging, in many cases I had to change a lot of code to optimize things – which could have been avoided by planning the coding approach up front, something evidently lacking in that modus operandi.

Then I was introduced to the Test Driven Development (TDD) approach. I wouldn't say I fell in love with it at first sight and started following it everywhere – no. Apart from the fact that, like most daytime programmers, I do not have the luxury of adopting every new approach as soon as I encounter it, I was actually skeptical about it and never thought it practical. However, I decided to give it a fair chance and tried it out on a couple of home-grown projects, and I liked it: a lot less debugging, much less worry about changing code and its impact, and a lot less effort to develop.

Today's post is not about TDD, but about unit testing. In the past, writing a unit test for one's own code seemed impractical and was frowned upon by developers, and the major part of unit testing was done manually by the developer before handing over to the QA team. However, as I mentioned before, the programming community is now mature enough not to ignore the boons of unit testing. Apple's flagship IDE, Xcode, comes with the XCTest unit testing framework bundled in. We will dive into the framework and testing techniques in this post.

What is Unit Testing

In the term Unit Testing, a unit represents the smallest testable bit of code – a method, a class, or a whole piece of functionality, depending on the programmer's viewpoint. A test is a piece of code that exercises the code written for an app, a library, or a feature, and reports a status of pass or fail based on given criteria. Pass or fail is determined by checking that certain objects are in the correct state after an operation, or that a piece of code throws an exception for a specific set of data where it is supposed to. There are performance tests too, which measure the execution time of a code block and determine the pass or fail status against preset benchmarks.
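The pass/fail idea can be boiled down to a few lines of plain Swift (modern syntax, deliberately not XCTest – just an illustration of the concept): a unit under test, plus a check that reports status against an expected outcome.

```swift
// The "unit": a trivially small piece of testable code.
func add(_ a: Int, _ b: Int) -> Int {
    return a + b
}

// The "test": exercise the unit and report pass or fail
// against a given criterion.
func check(_ condition: Bool, _ message: String) -> String {
    return condition ? "PASS" : "FAIL: \(message)"
}

print(check(add(2, 3) == 5, "add(2, 3) should equal 5"))  // PASS
print(check(add(2, 3) == 6, "add(2, 3) should equal 6"))  // FAIL: add(2, 3) should equal 6
```

XCTest does essentially this, plus discovery, reporting, and Xcode integration, through the XCTAssert family of functions.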

Different types of Unit Testing

As unit testing frameworks matured, more and more types of unit tests became possible. Along with functional testing, non-functional tests such as performance testing were added to the frameworks. In Xcode 6, Apple introduced performance testing capabilities in its XCTest framework; in Xcode 7, it introduced UI testing. We will look at each type of testing one by one and see how it can be done.

Setting up Unit Test Project

I will use Xcode 7.0 beta (7A121l) for demonstrating unit testing. When a project is created, Xcode also sets up a unit testing target with it if the "Include Unit Tests" checkbox is checked.


Once the project is created, you will find a test folder created alongside it. Now let's see how we write tests. The TDD approach says to write test cases before you even start writing code: write a test for code that does not yet exist, run it so it fails, then write the code to make it pass. However, for the sake of simplicity and given the very basic nature of this post, I will show a basic test scenario based on code already written.

Main Project

So, I will start with a very basic project, created just for the demonstration purposes of this article, called BookCatalog. The project is actually a slight variation of the timestamp sample you get when you create a master-detail project. There is a plus button at the top, and tapping it populates the table with book names from an array containing book names and their authors.


To demonstrate how tests work, I take the example of a method called "populateBookModel". The method looks something like below:

let books = ["The Great Gatsby by F. Scott Fitzgerald",
             "The Prince by Niccolo Machiavelli",
             "Slaughterhouse-Five by Kurt Vonnegut",
             "1984 by George Orwell",
             "The Republic by Plato",
             "Brothers Karamazov by Fyodor Dostoevsky",
             "The Catcher in the Rye by J.D. Salinger",
             "The Wealth of Nations by Adam Smith",
             "For Whom the Bell Tolls by Ernest Hemingway",
             "The Grapes of Wrath by John Steinbeck",
             "Brave New World by Aldous Huxley",
             "How To Win Friends And Influence People by Dale Carnegie",
             "The Rise of Theodore Roosevelt by Edmund Morris",
             "Dharma Bums by Jack Kerouac",
             "Catch-22 by Joseph Heller",
             "Walden by Henry David Thoreau",
             "Lord of the Flies by William Golding",
             "The Master and Margarita by Mikhail Bulgakov",
             "Bluebeard by Kurt Vonnegut",
             "Atlas Shrugged by Ayn Rand",
             "The Metamorphosis by Franz Kafka",
             "Another Roadside Attraction by Tom Robbins",
             "White Noise by Don Delillo",
             "Ulysses by James Joyce",
             "The Young Man’s Guide by William Alcott",
             "Blood Meridian, or the Evening Redness in the West by Cormac McCarthy",
             "Seek: Reports from the Edges of America & Beyond by Denis Johnson",
             "Crime And Punishment by Fyodor Dostoevsky",
             "Steppenwolf by Herman Hesse",
             "East of Eden by John Steinbeck",
             "Essential Manners for Men by Peter Post",]

var bookObjects = [AnyObject]()

func populateBookModel() {
    for book in books {
        let components = book.componentsSeparatedByString(" by ")
        bookObjects.append(Books(bookName: components[0], author: components[1]))
    }
}

The above code parses the books array and populates the Books model.
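As a standalone sketch of the same parsing step (the function name is my own), here it is as a pure function in modern Swift – splitting on the *last* " by ", so a title that itself contains " by " would still parse:

```swift
import Foundation

// Split "Title by Author" on the last " by " occurrence.
func parseBook(_ entry: String) -> (bookName: String, author: String)? {
    guard let range = entry.range(of: " by ", options: .backwards) else {
        return nil  // not in "Title by Author" form
    }
    return (String(entry[..<range.lowerBound]), String(entry[range.upperBound...]))
}

let parsed = parseBook("1984 by George Orwell")!
print(parsed.bookName)  // 1984
print(parsed.author)    // George Orwell
```

A pure function like this is also much easier to unit test than a method that mutates a view controller's state, which is where the next section is headed.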

class Books {
    let bookName: String
    let author: String
    init(bookName: String, author: String) {
        self.bookName = bookName
        self.author = author
    }
}

Test Project

So, I would like to test this populateBookModel method to check whether the books get populated properly. I create a file in the test project with a name that signifies the class I am going to test; it will contain all the tests for the methods and functionality I want to test in the MasterViewController class. Now, how do I verify that the method executed without any problem? If you examine the MasterViewController code above, you will see that I take books from the books array and populate them into the bookObjects array. So, if the counts of these two arrays match, the population was successful. To check this, I write the following test:

import XCTest
@testable import BookCatalog

class BookCatalogTests: XCTestCase {
    func testPopulateBookModel() {
        let masterVC = MasterViewController()
        masterVC.populateBookModel()
        XCTAssert(masterVC.bookObjects.count == masterVC.books.count, "Book objects are \(masterVC.bookObjects.count) and books are \(masterVC.books.count) in number")
    }
}

If you execute the above test, it passes, as the count of the books array matches the count of bookObjects – indicating that the population was successful.

Sample Project

I have put together a project which shows how to setup your unit testing project and write unit tests. All the examples in this article can be found there. You can grab the project at this location.


Unit testing is a vast subject, and this article has only touched the tip of the iceberg – I wanted to write down my learnings as quickly as possible. As I explore more, I will post more articles on unit testing. I hope the post was helpful and interesting, and that you will love writing unit tests as much as I do now. Please feel free to post comments and suggestions as always!

Asynchronous Networking Approaches…

How should asynchronous networking be handled? This is quite a common question in various places, from interviews to forums like Stack Overflow. Yet it cannot be answered in a sentence: there are several approaches, each with its own strengths and weaknesses. This article is a humble effort to outline them all.

NSURLConnection – The New Approach

The most modern of the NSURLConnection approaches is to use sendAsynchronousRequest:queue:completionHandler:.
Following is an example of the usage of the method:

[NSURLConnection sendAsynchronousRequest:request
                                   queue:[NSOperationQueue mainQueue]
                       completionHandler:^(NSURLResponse *response, NSData *data, NSError *error) {
    [self doSomethingWithData:data];
}];

This approach has the following benefits:

  1. It does away with the repetitive code that handles intermediate results.
  2. Race conditions, which sometimes occur with NSURLConnection delegates when part of the delegate gets prematurely released, are avoided; the block here is retained.
  3. With multiple asynchronous requests, the code handling each response is cleanly separated, so there is less chance of a mix-up.

But all these good things come with a price tag.

  • You lose some control over the operation. Imagine needing to cancel the download of a large chunk of data: with the above implementation, there is no way you can actually cancel the request without leaking memory.

You can try cancelling the NSOperation within the queue passed to the method, but that does not necessarily cancel the work. It merely marks the operation as cancelled, so that when you query its isCancelled property you get back a positive; you still have to stop all your own activities yourself based on that flag.
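To see what "merely marks the operation as cancelled" means in practice, here is a small sketch (in Swift for brevity; the class and property names are my own) where the work loop itself must poll isCancelled:

```swift
import Foundation

// Cooperative cancellation: cancel() only flips a flag, so the
// work must check isCancelled and bail out on its own.
final class ChunkedDownloadOperation: Operation {
    private(set) var processedChunks = 0

    override func main() {
        for _ in 0..<1000 {
            if isCancelled { return }  // our code honours the flag
            processedChunks += 1       // stand-in for handling one chunk
        }
    }
}

let op = ChunkedDownloadOperation()
op.cancel()  // merely marks the operation as cancelled
op.main()    // simulate the work loop running: it sees the flag and stops
print(op.processedChunks)  // 0
```

If the loop did not check isCancelled, all 1000 chunks would still be processed after cancel() – which is exactly the control you give up with the fire-and-forget block API.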

  • As hinted at in the first benefit above, you cannot handle intermediate results.
  • With this approach, a request either fails or succeeds – and it fails even for authentication challenges.

NSURLConnection – Traditional Approach

Then there is the traditional approach where we implement the NSURLConnectionDelegate methods and initiate the request with NSURLRequest. A quick example follows:

- (IBAction)didPressConnectButton:(id)sender {
    NSURL *url = [NSURL URLWithString:@""];
    NSURLRequest *request = [[NSURLRequest alloc] initWithURL:url];
    self.connection1 = [[NSURLConnection alloc] initWithRequest:request delegate:self];
}

#pragma mark - NSURLConnectionDataDelegate Methods

- (void)connection:(NSURLConnection *)connection didReceiveResponse:(NSURLResponse *)response {
    self.responseData = [[NSMutableData alloc] init];
}

- (void)connection:(NSURLConnection *)connection didReceiveData:(NSData *)data {
    [self.responseData appendData:data];
}


- (void)connectionDidFinishLoading:(NSURLConnection *)connection {
    if ([connection isEqual:self.connection1]) {
        NSData *data = self.responseData;
        // Do something with the data
    }
}

#pragma mark - NSURLConnectionDelegate Methods

- (void)connection:(NSURLConnection *)connection didFailWithError:(NSError *)error {
    // Handle error scenario
}

One benefit of the traditional NSURLConnection approach is that you get to handle authentication challenges through delegates. Handling them properly can be lengthy and difficult, but it is nonetheless possible.

Following is the delegate method which handles authentication challenge:

- (void)connection:(NSURLConnection *)connection willSendRequestForAuthenticationChallenge:(NSURLAuthenticationChallenge *)challenge {
    // Inspect the challenge and respond appropriately
}

But if there are multiple requests, it becomes difficult to tell, inside the challenge handler, which request the authentication challenge belongs to.

A Better Approach – NSURLSession

As we discussed, both of the above approaches have their pros and cons. So Apple came up with an approach that takes the best of both: NSURLSession.

Block based approach

NSString *imageUrl = @"";
NSURLSessionConfiguration *config = [NSURLSessionConfiguration defaultSessionConfiguration];
NSURLSession *session = [NSURLSession sessionWithConfiguration:config delegate:self delegateQueue:nil];

NSURLSessionTask *downloadTask = [session downloadTaskWithURL:[NSURL URLWithString:imageUrl] completionHandler:^(NSURL *location, NSURLResponse *response, NSError *error) {
    UIImage *downloadedImage = [UIImage imageWithData:[NSData dataWithContentsOfURL:location]];
    dispatch_async(dispatch_get_main_queue(), ^{
        self.imageView.image = downloadedImage;
    });
}];

[downloadTask resume];

Delegate based approach

- (void)downloadImage {
    NSString *imageUrl = @"";
    NSURLSessionConfiguration *config = [NSURLSessionConfiguration defaultSessionConfiguration];
    NSURLSession *session = [NSURLSession sessionWithConfiguration:config delegate:self delegateQueue:nil];
    NSURLSessionTask *downloadTask = [session downloadTaskWithURL:[NSURL URLWithString:imageUrl]];
    [downloadTask resume];
}

- (void)URLSession:(NSURLSession *)session downloadTask:(NSURLSessionDownloadTask *)downloadTask
didFinishDownloadingToURL:(NSURL *)location {
    // use the code above from the completion handler
}

// For progress indication
- (void)URLSession:(NSURLSession *)session downloadTask:(NSURLSessionDownloadTask *)downloadTask didWriteData:(int64_t)bytesWritten totalBytesWritten:(int64_t)totalBytesWritten totalBytesExpectedToWrite:(int64_t)totalBytesExpectedToWrite {
    NSLog(@"%f / %f", (double)totalBytesWritten, (double)totalBytesExpectedToWrite);
}

Finally, the best approach, in my humble opinion, is to use AFNetworking or RestKit. There are other third-party libraries too, like MKNetworkKit. I have not used MKNetworkKit by Mugunth Kumar, but the other two are really good when it comes to asynchronous networking and a myriad of related features.

With AFNetworking, the above task can be performed as:

NSURLSessionConfiguration *configuration = [NSURLSessionConfiguration defaultSessionConfiguration];
AFURLSessionManager *manager = [[AFURLSessionManager alloc] initWithSessionConfiguration:configuration];

NSURL *URL = [NSURL URLWithString:@""];
NSURLRequest *request = [NSURLRequest requestWithURL:URL];

NSURLSessionDownloadTask *downloadTask = [manager downloadTaskWithRequest:request progress:nil destination:^NSURL *(NSURL *targetPath, NSURLResponse *response) {
    NSURL *documentsDirectoryURL = [[NSFileManager defaultManager] URLForDirectory:NSDocumentDirectory inDomain:NSUserDomainMask appropriateForURL:nil create:NO error:nil];
    return [documentsDirectoryURL URLByAppendingPathComponent:[response suggestedFilename]];
} completionHandler:^(NSURLResponse *response, NSURL *filePath, NSError *error) {
    NSLog(@"File downloaded to: %@", filePath);
}];

[downloadTask resume];

AFNetworking also lets you track progress with multipart requests. The following is an example of an upload task with a progress indicator:

NSMutableURLRequest *request = [[AFHTTPRequestSerializer serializer] multipartFormRequestWithMethod:@"POST" URLString:@"" parameters:nil constructingBodyWithBlock:^(id<AFMultipartFormData> formData) {
        [formData appendPartWithFileURL:[NSURL fileURLWithPath:@"file://path/to/image.jpg"] name:@"file" fileName:@"filename.jpg" mimeType:@"image/jpeg" error:nil];
    } error:nil];

AFURLSessionManager *manager = [[AFURLSessionManager alloc] initWithSessionConfiguration:[NSURLSessionConfiguration defaultSessionConfiguration]];
NSProgress *progress = nil;

NSURLSessionUploadTask *uploadTask = [manager uploadTaskWithStreamedRequest:request progress:&progress completionHandler:^(NSURLResponse *response, id responseObject, NSError *error) {
    if (error) {
        NSLog(@"Error: %@", error);
    } else {
        NSLog(@"%@ %@", response, responseObject);
    }
}];

[uploadTask resume];

Apple Watch Glance and Inter Device Communications…

As a continuation to the series of articles I have been writing about Apple Watch and WatchKit (which you can find here and here), today I intend to discuss the "Glance" feature of Apple Watch. But as Christmas is drawing near, and Mr. Claus would like me to do so, I'll give you something extra in addition to Glance. Very recently, Apple revealed a new feature in WatchKit: inter-device communication, which bridges the information gap between the iPhone and the Apple Watch. I'll cover that too!

So, as an aggregation measure and in the urge to know everything right now and here, which is an instance of an insatiable inquisitiveness inherent in invariably all individuals such as I, this article enjoys the daylight.


Glances are a summary view of the actual application running on the Apple Watch. You can very well compare them with the live tiles on Windows Phone. In Apple's own words —

Glances are a browsable collection of timely and contextually relevant moments from the wearer’s favorite apps. Individually, a Glance is a quick view of your app’s most important content.

Keeping that intention in mind, Apple has put severe restrictions on the design of Glances:

  1. Template based: Glances are template based, so there is no way other than putting your views where Apple wants them.
  2. Single action: They can only host one single action – when the user taps the Glance, the Watch app launches.

Inter Device Communication

It would have been really good if the communication were truly two-way, in an impartial manner. Unfortunately, the iPhone turned out to be too shy to start the conversation with the Apple Watch. In such a scenario, the Apple Watch does just what a guy/girl does when trying their luck with a shy counterpart: it takes the initiative and approaches. Then, if it wishes to do so, the iPhone can reply back. Romantic, isn't it?

How to Glance (and not watch!)

Today I will not delve into a step-by-step tutorial, because making an Apple Watch app with Glances is really easy and only takes a heartfelt tick in the checkbox that says "Include Glance Scene".


I will rather explain what I want to present to you in terms of source code, which as usual is uploaded to GitHub (MIT license).

Application Overview (my big idea)

My idea of utilising both Glance and Inter Device Communication is as follows:

Let me tell you a secret: I love loans. There is no other thing in the world with such tremendous power to provide you with endless sleepless nights (for two) and, at the extreme, even the unique opportunity to be homeless again – all at the nominal cost of a little temporary happiness! That's why I would like to make a banking app that lets you view your loan balance and, unlike other selfish banks, encourages you to pay back the loan and get out of debt soon (so that you can borrow an even larger amount soon enough!)

The original iPhone application shows your loan account number and a pie chart that depicts how much you have paid back and how much outstanding amount you have to pay further.


This information will also be available in my Apple Watch app. In the Glance view, the user can see the same graph as in the iPhone app, which urges her to make the whole thing green.


And here will be the Glance view for the data.


The graph is generated using the famed Core Plot. The Apple Watch unfortunately does not have the guts to run Core Plot yet, so it has to make do with a PNG representation of the graph view, which is handed to the Watch app upon request.

iPhone App – with all her beauty, waits for her knight in shining armour

Our iPhone application has a JSON file containing the following data. Of course, in a real-life scenario, all this data would come from a server, locked in encryption with the keys thrown in the water.

{
    "LoanAmount": "70000",
    "Outstanding": "20000",
    "Paid": "50000",
    "AccountNo": "3423847289",
    "NextInstallment": "01/01/2015"
}

The iPhone app reads the data and generates the pie chart using the Core Plot API. Finally, the graph is converted into a PNG image and saved in the documents directory.
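For illustration, here is a hedged sketch (in Swift; the keys come from the JSON above, the rest is my own) of how the app might read that payload and compute the paid-back fraction the pie chart visualises:

```swift
import Foundation

// The loan payload, inlined here so the sketch is self-contained;
// the app reads the same structure from its bundled JSON file.
let json = """
{
    "LoanAmount": "70000",
    "Outstanding": "20000",
    "Paid": "50000",
    "AccountNo": "3423847289",
    "NextInstallment": "01/01/2015"
}
"""

let data = json.data(using: .utf8)!
let loan = try! JSONSerialization.jsonObject(with: data) as! [String: String]

// The pie chart only needs the paid vs outstanding split.
let paid = Double(loan["Paid"]!)!
let outstanding = Double(loan["Outstanding"]!)!
print(paid / (paid + outstanding))  // ≈ 0.714 paid back so far
```

Note the amounts are strings in the JSON, so they are converted before doing arithmetic – forgetting that is a classic source of charting bugs.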

What Apple (Knight) Watch does

The Apple Watch has the ability to invoke its parent app. So when the user taps the "Refresh" button, the iPhone app launches and generates the graph.

- (IBAction)refreshGlance {
    [self openParentAppToRefreshGraph];
}

- (void)openParentAppToRefreshGraph {
    [WKInterfaceController openParentApplication:[NSDictionary dictionaryWithObjectsAndKeys:@"chartImage.png", @"ImageName", nil] reply:^(NSDictionary *replyInfo, NSError *error) {
        NSData *pngData = replyInfo[@"Image"];
        NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
        NSString *documentsPath = [paths objectAtIndex:0]; // Get the docs directory
        NSString *filePath = [documentsPath stringByAppendingPathComponent:@"chartImage.png"]; // Add the file name
        [pngData writeToFile:filePath atomically:YES];
    }];
}

Her silent reply…

In the callback, the image generated from the graph is sent back to the Apple Watch for display.


- (void)application:(UIApplication *)application handleWatchKitExtensionRequest:(NSDictionary *)userInfo reply:(void (^)(NSDictionary *))reply {
    [[NSNotificationCenter defaultCenter] postNotificationName:@"WatchKitNotification" object:nil];
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsPath = [paths objectAtIndex:0]; // Get the docs directory
    NSString *filePath = [documentsPath stringByAppendingPathComponent:@"chartImage.png"];
    NSData *pngData = [NSData dataWithContentsOfFile:filePath];
    NSDictionary *response = @{@"Image" : pngData};
    reply(response);
}

Phew!… It’s a Yes !!

Once the data is received, the Apple Watch saves the image in its documents directory. So, when the user opens the Glance, the new updated graph is ready.

- (void)awakeWithContext:(id)context {
    [super awakeWithContext:context];

    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsPath = [paths objectAtIndex:0]; // Get the docs directory
    NSString *filePath = [documentsPath stringByAppendingPathComponent:@"chartImage.png"];
    NSData *pngData = [NSData dataWithContentsOfFile:filePath];
    [self.glanceImage setImageData:pngData];
}


And the Apple Watch and iPhone lived happily ever after (for many many years).

Hope you liked the story. The code is uploaded to GitHub, you can grab it here. You’re welcome! 🙂

Apple Watch Notifications

Remote and local notifications are not at all a new thing. From the very day they were introduced – the 17th of June, 2009 – they have arguably been the favourite mechanism to deliver messages to the user. The messages have ranged from proximity alerts for a store, new updates to an app and directional instructions, to illegal promotions and deranged gibberish that does not make any sense:


But on the Apple Watch, notifications take on a new dimension. From the way notifications are designed for the Watch, it appears quite evident that Apple has spent a considerable amount of energy making them better and more meaningful. Following is the flow showing how a notification is presented to an Apple Watch user.

  1. According to the Apple Watch Human Interface Guidelines and the WatchKit Programming Guide, when an app running on the iPhone receives a notification, iOS decides where to show the notification. Though this is a vague statement, as of now there seems to be no control over this, even if we specifically need the notification to be shown on the Watch. Also, I could not find any information on how iOS makes this decision. Guess we’ll have to wait a bit to know that.
  2. If the notification is sent to the Watch, the user feels a subtle vibration on his wrist or a mild audio cue, based on the notification’s payload.
  3. Alarmed, as the user raises his hand, a very minimalistic screen is presented, called the Short Look interface. This is an immutable, non-scrollable screen that conveys to the user the most important information about the notification. iOS designs it based on a predetermined template and presents it on the screen. This is the Watch’s interpretation of your everyday notification, with just a title provided by you in the notification payload.
  4. All work and no play makes Jack a dull boy. Who understands this better than Apple? So, here is the shining playful part: the customisable, scrollable, actionable notification screen. After the Short Look notification is displayed, if the user continues to keep his hand raised (in the hope that something else will happen…soon…well…anytime now…), or taps on the Short Look interface, the Long Look interface is displayed.

Apple has given you freedom within the aesthetic boundary to design the Long Look interface. You can add buttons and graphics and customise the behaviour when the user taps on it.

But what happens if you don’t provide the Long Look interface? Well, Apple has a backup plan. iOS displays a default interface with the app icon, title string and alert message. Tapping on it launches the app.

OK, so let’s not allow Apple to have all the fun and design our own Long Look interface!

A Long Look interface has three major parts —

  • The Sash at the top of the screen — this includes the title string and the icon of the app
  • The content — this is your playground. Add any graphics and buttons
  • The dismiss button — this is always present, added by iOS, and dismisses the notification once tapped

In the Sash section, as a third party developer, you have basic freedom: you can change the tint colour and the title of the notification.

In the content, you have much more liberty. You can modify and design the label to show your notification message. You can add buttons, but not directly: all the buttons should come in a specific format through the JSON payload that invokes the notification. The SDK already generates one such JSON payload file while creating the notification scene, for testing purposes.

Screen Shot 2014-12-04 at 02.29.48

Changing the alert, title and category controls what the notification screen will display.

As you see above, the “WatchKit Simulator Actions” array holds a collection of buttons in the form of dictionaries, which can be used to add/remove/modify buttons in the notification.

To create a notification, create a new project and add a new Watch Target as discussed in my previous post. This time keep the “Include Notification Scene” checkbox selected to include our notification interfaces.

Screen Shot 2014-12-04 at 02.35.44

Include all the necessary app icon images. Apple could not think of any more, so they want only the following few dimensions:

  • 29 X 29
  • 36 X 36
  • 58 X 58
  • 80 X 80
  • 88 X 88
  • 172 X 172
  • 196 X 196

Xcode will generate two extra interfaces for you in the Interface.storyboard (other than the usual screen for your Watch app) inside your project. They are —

  • Static Interface — A notification interface that can be configured at design time. It is mandatory to keep this interface in your app bundle.
  • Dynamic Interface — A notification interface that can be decorated with dynamic data at run time. This is not mandatory.

When running the app, iOS first looks for the Dynamic Interface. If it is not found, iOS falls back to the Static Interface. If the static interface suffices for your notification requirement, it is safe to delete the dynamic interface. Also, iOS can be explicitly instructed not to show the dynamic interface.

For the time being, let’s change the Static Interface. What we are trying to do here is —

  • show a notification stating that Japan is already enjoying the new year, with an action button
  • tapping on the “View” button will launch the app and
  • display the current time in Tokyo

Now create a new scheme for the notification and map the notification executable to that scheme to view the Apple Watch notification in the iOS simulator.

Screen Shot 2014-12-04 at 03.07.07

Screen Shot 2014-12-04 at 03.07.28

Screen Shot 2014-12-04 at 03.15.01

If you build and run your app now, the default static notification screen will show. Note that the message text, the button text, everything is being pulled from the JSON file included in the bundle. You can try changing them to see the changes show up in the notification.

Before we add some action, let’s modify the JSON file to suit our needs by changing the title, message and category (screenshot above). Let’s name them as follows:

"aps": {
        "alert": "Japan is already celebrating new year!",
        "title": "Happy New Year!",
        "category": "NewYear"
}

Also, in the Interface.storyboard file, select the “Category” under Static Notification Interface Controller and, in the Attributes Inspector, change it to “NewYear”. Make sure that the category names match in the JSON as well as in the storyboard. Otherwise, the notification will not use your custom interface.

Now we want the user to tap on the button and make something happen. Let’s add a date label to the interface of the Apple Watch app which will display the date based on the time zone set. Hook it up to the InterfaceController as dateLabel.

Inside the Interface Controller, we can handle the notification like so:

@IBOutlet weak var dateLabel: WKInterfaceDate!

override func handleActionWithIdentifier(identifier: String?, forRemoteNotification remoteNotification: [NSObject : AnyObject]) {

        if let id = identifier {
            if id == "firstButtonAction" {

                var plistKeys: NSDictionary?
                var timeZones: NSDictionary?

                if let path = NSBundle.mainBundle().pathForResource("Timezones", ofType: "plist") {
                    plistKeys = NSDictionary(contentsOfFile: path)!
                    timeZones = plistKeys!["TimeZones"] as NSDictionary?
                }

                if let dict = timeZones {
                    NSLog("%@", dict.valueForKey("Tokyo") as String)
                    dateLabel.setTimeZone(NSTimeZone(name: dict.valueForKey("Tokyo") as String))
                }
            }
        }
}


Now build and run the app. At first it will show your designed screen. Tapping the dismiss button dismisses the notification screen. Tapping the “View” button shows Tokyo’s current time.

Right now, the app is showing the static notification screen. If you want to show a custom dynamic notification with data that can only be supplied at run time, we need to modify the “Dynamic Interface”. The Dynamic Interface is controlled by NotificationController.swift. If you navigate there, you will find two functions commented out —

override func didReceiveRemoteNotification(remoteNotification: [NSObject : AnyObject], withCompletion completionHandler: ((WKUserNotificationInterfaceType) -> Void))

override func didReceiveLocalNotification(localNotification: UILocalNotification, withCompletion completionHandler: (WKUserNotificationInterfaceType) -> Void)

Uncomment the

didReceiveRemoteNotification(remoteNotification: [NSObject : AnyObject], withCompletion completionHandler: ((WKUserNotificationInterfaceType) -> Void))

and make sure that the completionHandler is set to be .Custom

override func didReceiveRemoteNotification(remoteNotification: [NSObject : AnyObject], withCompletion completionHandler: ((WKUserNotificationInterfaceType) -> Void)) {
        // Tell WatchKit to display the custom interface.
        completionHandler(.Custom)
}

Now if we make any modification to the dynamic interface, you will see that the dynamic interface, with its changes, is shown as the notification screen. This is because, as I mentioned earlier, iOS searches for the custom dynamic interface first. Only if it cannot find one does it load the static one.

Try changing the .Custom to .Default to see your static interface.

You can download the whole project from Github —

Hope you will enjoy building for the Apple Watch as much as I did. I will try putting in something more as I learn. Please do leave a reply and feel free to share if you like!

Hope this helps!

Watch and Learn!

“Tell me and I forget. Teach me and I remember. Involve me and I learn”
— Benjamin Franklin

WatchKit was released a few days back, and it’s probably the beginning of a new era. It is almost like the time when the first iPhone development kit was released for registered Apple developers. Though this time it is a little disheartening to see the limitations and constraints imposed on Apple Watch apps, knowing Apple, I am sure they won’t last long. It’s a really clever move from Apple to make the Watch extremely lightweight and thus very battery efficient. Still, I believe it’s just a matter of time before Apple finds a way to improve the hardware and battery life of the watches enough to make them self reliant and able to host their own native standalone apps.

It’s a very interesting time for iOS developers, quite unlike ever before. We have a new, faster, cleaner and better language and a new device to develop for. So, I decided to learn them together. You have guessed it quite right: this post is an effort to assimilate my knowledge of Swift and WatchKit. I was really inspired by Natasha the Robot, who has recently published a series of excellent posts describing in depth how to make Apple Watch apps. Her blog is worth a visit. She has a guest post on NSHipster as well! So, coming back to the point, I intend to follow her loosely as I embark on the journey to learn WatchKit and perform the same exercises in Swift. There is no better way to learn than doing it ourselves, is there?

Before we start, here is something on NDA on WatchKit in case you are wondering:

A Note on NDAs – while an Apple developer account with NDA is required to download Xcode betas and actually use WatchKit, all of the reference documentation is publicly available at this link.  The programming guide is similarly available publicly here. Since all of this information is public, we don’t have to worry about NDA trouble.



Before we delve into coding, let me explain the architecture of Apple Watch apps. This will help us understand the apps better, let us take informed design decisions and help cultivate useful and innovative ideas leveraging the Apple Watch and its capabilities.

The  Watch apps are nothing but extensions of app extensions and are designed the same way as the recently introduced extensions are designed. So, the WatchKit apps will have 2 parts —

  1. The application extension which will be running on a paired iPhone
  2. The app installed in the  Watch

As a basic principle of app extensions on Apple platforms, a container app and an app extension cannot interact directly; however, they may do so indirectly through an intermediary. Here, the intermediary is WatchKit.

All the heavy lifting – implementation of business logic, data manipulation etc. – is done in the app running on the iPhone, whereas the Watch app contains the main storyboard and other UI resources. These resources form the outward interface of the Watch app, which is powered by the iPhone app. In a simpler sense, the Watch app is the face and the iPhone app the brain behind it.

WatchKit Architecture

Watch App Architecture

In the above diagram, the left hand side box represents the iPhone and the right side box the Apple Watch. As the smaller boxes depict, the app extension (WatchKit Extension) runs on the iPhone and interacts with WatchKit. The Watch App contains all the UI elements required for displaying the app, and these resources cannot be changed at run time. So, if you need to show a custom view at some point in the life cycle of the application, you would need to plan ahead and keep a hidden view in the Watch app which you can unhide and display.

So, when the user of the Apple Watch touches a notification or views a Glance in the Watch app, the Watch invokes the app installed on it and opens up the appropriate storyboard scene and UI resources. The app then requests the WatchKit extension running on the iPhone, through the WatchKit framework, to respond to the events, and updates the UI based on the response received from it. This communication takes place for all user events, like touches and other gestures registered by the Watch app. The code executed in the iPhone app is responsible for updating the UI of the Watch app and performing any necessary operations, including generating dynamic content and sending it over to the Watch app for display.

Enough theory, let’s code!

For the following code we assume that Xcode 6.2 (beta at the time of writing) or above is installed.

Hello, World!

I will stick to tradition and start by creating an Apple Watch application which says to the wearer — “Hello, World!” Then we shall move on to create little more complex and interesting things.

  • Create a new project in Xcode with “Single View Application”. Let’s name it “AppleWatchDemo”. Make sure you select the language as “Swift” — we need to learn Swift too, don’t we?

 Screen Shot 2014-11-26 at 22.07.12

  • Add a new target by selecting “File –> New –> Target…” or by selecting the project file and then, in the targets pane, expanding the target dropdown and selecting “Add Target”.
Screen Shot 2014-11-26 at 22.10.49 Screen Shot 2014-11-26 at 22.11.25
  •  In the template selection window, go to “Apple Watch” section and well, you know what template to choose here. 🙂

Screen Shot 2014-11-26 at 22.20.28

  • Click on next and if you want, uncheck the “Include Notification Scene” and “Glance Scene” checkboxes.

Screen Shot 2014-11-26 at 22.24.59

  • Click on “Finish”. As Apple always promises, you already have a Watch app ready to be deployed. Only it’s blank. So let’s put something in it to show. We are going to show “Hello, World!” in a label.
  • Now, go to the Interface.storyboard file of the Watch app. You will see an interface of the Watch present there. At the top right there will be a time label, one of those fancy labels Apple has created which makes your life easier by showing the time or displaying a countdown timer. It’s a watch after all, why should we be surprised? 🙂 As you can clearly guess, if you run the app now, it will show a small screen with the current time at the top right corner.

Screen Shot 2014-11-26 at 22.56.21

  • Now let’s add a label. Drag and drop a label into the window. To centre align it, select the label and go to the Attributes Inspector. Under the “Position” section change both drop-downs to “Center”. This is how interface design works on the Apple Watch: there is no fixed frame, no Auto Layout. Everything you lay out on the screen is simply stacked one after another. Now change the text to read “Hello, World!”.

Screen Shot 2014-11-26 at 23.08.51

  • And…you are done. Select the target to be the Watch app and run the project. You will see the iPhone simulator and the Watch simulator launch together, and the Watch displays “Hello, World!”
  • Congrats! You have made your first Apple Watch app!

Delving Deeper into the world of clocks and watches

But we are serious developers, why should we be happy with just a “Hello World” app? We need more, don’t we? Let’s build some tableview goodness.

Along with many interesting features, Apple has also provided the Watch with some extra UI elements which are enhancements of previous primitive ones. Timer and Date labels are two of them. When a Date label is displayed, it shows the current time in your preferred format, without your having to write a single line of code for it. Leveraging them, we are going to build a world clock (watch!) application which will help us explore the tableview as well as the Date labels. The final product will look something like this —


As you can see, I am sitting in London and the current time, 9:43 PM, is displayed at the top right corner of the watch. The other cities are also displaying their respective times.

  • So, let’s remove the label displaying “Hello, World!” and add a Table to the Interface.storyboard, the main storyboard file of your Watch app. You will see a Table Row Controller get added to the Table automatically.
  • This will be the template for our rows. Let’s add a Label to show the city name and a Date label for the times. Vertically centre align the labels and set appropriate widths, the same way we did for the Hello World label.
  • Set the format of the date to be custom — hh:mm a — which will display similar to 09:00 PM. Set the font to “System Bold” with size 13. Also, for the City label, let’s make the font System and the size 13 as well.
  • The Attributes Inspector also provides lots of attributes to play with; tinker to your heart’s content!


Screen Shot 2014-11-27 at 21.49.18
Screen Shot 2014-11-27 at 21.49.01
  • Finally, some code. Create a new file in the WorldWatch WatchKit Extension and name it LocalTimeRowController.swift. Make it inherit from NSObject and import WatchKit into it.
    import WatchKit

    class LocalTimeRowController: NSObject {
        @IBOutlet weak var countryLabel: WKInterfaceLabel!
        @IBOutlet weak var localTimeLabel: WKInterfaceDate!
    }
  • Now let’s add a yellow background colour to the row, set its height to “Fixed Height” and make it 30.
    Screen Shot 2014-11-27 at 22.14.06
  • Move over to your Interface.storyboard in the WorldWatch Watch App and select the Table Row Controller in the left hand pane. Change the class name for the controller to LocalTimeRowController.
    Screen Shot 2014-11-27 at 22.41.31
  • Also, change the Row Controller identifier in the Attributes Inspector to “LocalTimeRowController”.
    Screen Shot 2014-11-27 at 22.41.42
  • Create Outlets of the labels we created in the LocalTimeRowController. They will help us set the text and attributes.


  • Since we will be showing times for multiple cities, we will need to know the time zone names for all the cities and how to refer to them in code. Fortunately, this useful Gist provides what we are looking for in a nice plist.
    I have extracted out the relevant part and you can download it from here. Download the file and include the plist in the extension project.
  • Now head over to InterfaceController.swift and let’s put the real logic there for populating the table. As you see, there is absolutely no code we are putting in the Watch app project. All the controlling logic goes into the extension, which runs on the iPhone. Implement the following method which populates the table —
      private func populateTable() {
          var plistKeys: NSDictionary?
          var timeZones: NSDictionary?

          if let path = NSBundle.mainBundle().pathForResource("Timezones", ofType: "plist") {
              plistKeys = NSDictionary(contentsOfFile: path)!
              timeZones = plistKeys!["TimeZones"] as NSDictionary?
          }

          if let dict = timeZones {
              table.setNumberOfRows(dict.count, withRowType: "LocalTimeRowController")
              var keyArray = dict.allKeys as [String]

              func alphabeticalSort(s1: String, s2: String) -> Bool {
                  return s1 < s2
              }

              var sortedCityNamesArray = sorted(keyArray, alphabeticalSort)

              for (index, key) in enumerate(sortedCityNamesArray) {
                  let row = table.rowControllerAtIndex(index) as LocalTimeRowController
                  row.countryLabel.setText(key as String)
                  var value: AnyObject? = dict[key as String]
                  row.localTimeLabel.setTimeZone(NSTimeZone(name: value as String))
              }
          }
      }
  • Essentially, we are just taking all the city names (the keys from the plist file we included in the bundle) and displaying them in the City labels. The values, which are names of time zones, are used to set the time zone of the Date labels. So, each of them shows the respective time for the time zone assigned to it.
  • Call the populateTable method from the init of the class.
        override init(context: AnyObject?) {
            super.init(context: context)
            populateTable()
        }
  • Now select the executable target to be WorldWatch Watch App and run the project. Voila! You can now see the times of each of the cities, updated in real time.

You can download the whole project from Github —

Hope you will enjoy building for the Apple Watch as much as I did. I will try putting in something more as I learn. Do leave a reply and feel free to share if you like!


Kung Fu Panda Image © Copyright respective owner

App to App Switching in iOS (Part I)

Hyperlinks in websites have been there since the dawn of time, and have served humankind well, redirecting the audience to helpful targets and easing cross referencing and seamless navigation. So, when civilisation moved on from websites to mobile apps, hyperlinks evolved into app links to keep pace with the ever increasing demand for inter-app travel. Even more so because mobile applications, having always been seen as something of a scaled down version of desktop apps to account for the low memory in smartphones, differ from their desktop ancestors mainly in that they do not, by their very nature, serve the user with diverse functionalities. One app can be very good at doing something, but it must depend on its siblings to do something else. This clear demarcation gives the user more choice and flexibility to get the best of everything, and keeps the competition tense, productive and healthy. What was once “One App to Rule ‘em all” is now a constellation of apps, each of which performs its part of the job (with absolute bureaucracy), with absolute brilliance, better than anyone else in the market.

Application switching guidelines

As we are talking about a two way communication between the apps, we need to consider both of the scenarios —

  • The current app is invoking another app and
  • The current app is being invoked by another app

Before moving into the discussion on how to achieve them, there are certain precautions and guidelines that one should be aware of.

Invoke another app from current app

While launching another app, the invoking app does not need to be very wary about its own security, unless some data is being exchanged between the apps. It is always good to remember the points below when invoking an application —

  1. The launched app might not be the one you intended to launch, and you can never really ensure that it is. An app can fake itself to be one of yours to fool the user into giving away their credentials. So, if you are transferring any data, be sure to encrypt it so that only the intended application has sufficient means to decrypt it.
  2. It is good user experience to allow the user to launch the Apple App Store to download the app if it is not installed. So, it is better to check whether the app is installed on the device and accordingly launch the App Store or the app itself.
  3. Unless you have a very obvious reason for not doing so, it is always advisable to inform the user that by invoking the other app he/she is actually going to leave the current app (in the background).
  4. Finally, your app is going into the background and runs the risk of getting terminated should memory become scarce. So, prepare by serialising and saving data to a persistent store, so that the user does not lose anything.
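Step 4 can be sketched in the app delegate along these lines; the property and key used here are purely illustrative:

```objc
- (void)applicationDidEnterBackground:(UIApplication *)application
{
    // Persist anything the user could lose if the app gets terminated.
    // self.draftText and the key "pendingDraft" are hypothetical names for this sketch.
    [[NSUserDefaults standardUserDefaults] setObject:self.draftText forKey:@"pendingDraft"];
    [[NSUserDefaults standardUserDefaults] synchronize];
}
```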

Invoke current app from another app

Invoking the current app from another app is the riskier part of the whole functionality. Any application can learn the URL scheme used by the current application and launch it by invoking the scheme. So, automatically, the potential risk of divulging data to an unintended recipient increases. Following are a few cautionary words to remember in this respect —

  1. A malicious app can pass many parameters along with the URL scheme which, if not handled properly and relied upon blindly, can cause havoc in the security of the system and of the user
  2. If you are accepting commands from trusted applications, be wary, because hackers will try every possible combination of keywords to guess the commands that you accept blindly from your trusted apps
  3. Do not accept any commands with arbitrary path names, as they can easily be substituted with other path names by people with malicious intentions

Application Switching Techniques

The following paragraphs delve into the simple application switching techniques most commonly used at the time of this writing. As discussed before, application switching has 2 aspects: launching another app from the current app, and launching the current app from another app. We will discuss them separately in the following paragraphs:

Invoke another app from current app

To invoke another app from the current app, the custom URL scheme of the other app is required. The custom URL scheme of an app is defined in its Info.plist file and is generally a short word. The app can be launched using the custom scheme as a protocol. For example, to launch an app with the custom URL scheme “myscheme”, the app must be launched with a URL like myscheme://. However, easy as it is to invoke an app from the current app, it also makes sense to take steps to direct the user to the app in the easiest way. Directing a user to the App Store when the app is installed on the device, or trying and failing to launch an app just because it is not installed, both have an adverse effect on the user experience. So, the following steps would provide a good user experience —

  1. On the tap of the application link, ask the user whether he/she is fine with moving away from the current app
  2. If no, take no action
  3. If yes, the application internally checks whether the app is installed on the device using the UIApplication class’s canOpenURL method, which returns YES when the targeted app is present on the device and responsive to the openURL message
  4. If canOpenURL returns YES, invoke the target application using openURL, passing the target application’s custom URL scheme to it as an NSURL object
  5. If canOpenURL returns NO, the application is not installed on the device and the App Store needs to be opened. The app can then send the openURL message to the UIApplication shared instance with the target application’s App Store URL as a parameter

Following is the code for the above mentioned logic —

  1. Code for showing alert about leaving the app
    [[[UIAlertView alloc] initWithTitle:@"Info" message:@"This will take you out of the current application and will try to launch the target app. Do you want to proceed?" delegate:self cancelButtonTitle:@"OK" otherButtonTitles:nil, nil] show];
  2. Code for launching the app store or the app conditionally based on the existence of the app in the device:
    - (void)alertView:(UIAlertView *)alertView clickedButtonAtIndex:(NSInteger)buttonIndex
    {
        NSURL *customURL = [NSURL URLWithString:@"myappurl://"];
        NSURL *appStoreURL = [NSURL URLWithString:@"itms://"];
        if ([[UIApplication sharedApplication] canOpenURL:customURL]) {
            [[UIApplication sharedApplication] openURL:customURL];
        } else {
            [[UIApplication sharedApplication] openURL:appStoreURL];
        }
    }
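On the other side, the target app advertises its custom scheme in its Info.plist, as mentioned earlier. A minimal sketch of that declaration, assuming the scheme “myscheme” and a hypothetical bundle identifier:

```xml
<key>CFBundleURLTypes</key>
<array>
    <dict>
        <key>CFBundleURLName</key>
        <string>com.example.myapp</string> <!-- hypothetical identifier -->
        <key>CFBundleURLSchemes</key>
        <array>
            <string>myscheme</string>
        </array>
    </dict>
</array>
```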

Invoke current app from another app

When an application is invoked from another application, the invoked application may or may not already be running in the background. Since in iOS only one instance of an application can run at a time, iOS intelligently decides whether to launch the app or to bring the copy already running in the background to the foreground. Following is a flow chart depicting the case when the application is launched (from the not-running state) —

Flow chart for launching app

Following is another flow chart which depicts the scenario when an app already running in the background gets invoked because of some other app called its custom URL scheme.

Launching app from background

Inside the launched app, all handling of the URL the app was invoked with should be dealt with in the following method —

- (BOOL)application:(UIApplication *)application openURL:(NSURL *)url sourceApplication:(NSString *) sourceApplication annotation:(id)annotation

If you intend to do nothing with any parameters passed into your app, just returning NO is safe and secure. However, if the received parameters are to be used, they should first be filtered for unwanted commands.
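A minimal sketch of that filtering, where the scheme, the trusted bundle identifier and the command whitelist are all illustrative assumptions:

```objc
- (BOOL)application:(UIApplication *)application openURL:(NSURL *)url sourceApplication:(NSString *)sourceApplication annotation:(id)annotation
{
    // Accept only our own scheme (hypothetical).
    if (![[url scheme] isEqualToString:@"myscheme"]) {
        return NO;
    }
    // Accept launches only from an app we trust (hypothetical bundle ID).
    if (![sourceApplication isEqualToString:@"com.example.trustedapp"]) {
        return NO;
    }
    // Whitelist the commands we understand; ignore everything else.
    NSArray *allowedCommands = @[@"showProfile", @"openSettings"];
    if (![allowedCommands containsObject:[url host]]) {
        return NO;
    }
    // Handle the whitelisted command here…
    return YES;
}
```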

Next Steps

The next step in this journey would be to display a bar at the top of your application which allows the user to glide back to the application that launched the current foreground application. Whether that is good practice or outright bad is an argument for another time; for now we will concentrate on how to achieve it. We will discuss this in the next part, which I am going to post soon.