2.3 SiriKit: Creating an Extension
SiriKit lets your app have a conversation with your user. In this lesson, we are going to create an extension so our app can work with SiriKit, and we'll start the interaction process.
Hi, and welcome back to What's New in iOS 10. In this lesson, we are going to create our SiriKit extension. Before I begin, let me start with a disclaimer: natural language processing is hard, and therefore Apple has limited the use cases of SiriKit to a few domains, including messaging, calling, payments, photos, workouts, CarPlay, and the one we are going to look at, ride booking. The reason I chose this domain is that it allows us to have a conversation with Siri about the pickup and drop-off locations as well as the party size. As before, we are going to add a new target to our project. This time it is an Intents extension. I'm going to call it RideIntent, and you can also choose to create a UI extension as well, to show a custom UI with Siri. I'm going to leave that option on, but we won't use it in this course. Then we can activate the intents scheme. The main entry point for your Intents extension is the IntentHandler class. It contains a lot of sample code for message intents, but we won't need that, so let's get rid of it. Okay, now we have a clean slate. Of course, the first thing we need to do is ask our users if we are allowed to use Siri. I'm going to put this in a helper method called requestAuthorization, which we can call from application(_:didFinishLaunchingWithOptions:). I'm also importing the Intents framework to have access to INPreferences, which is where the Siri authorization status is stored. You can ask for it by calling requestSiriAuthorization on INPreferences. It has a callback whose status can either be authorized, which is what we are aiming for, or denied if the user doesn't want to allow the app access to Siri. Let's just add print statements to get a result in the console. When asking for authorization, we can add a custom message that describes what we are planning to do with Siri. This can be set in the Info.plist file as the Privacy - Siri Usage Description key. In the Intents extension, we need to specify which intent types the app supports.
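A minimal sketch of the authorization step described above, assuming the helper lives in the app delegate (the method name requestAuthorization matches the lesson; the print messages are illustrative, and the prompt text itself comes from the Privacy - Siri Usage Description key in Info.plist):

```swift
import UIKit
import Intents

@UIApplicationMain
class AppDelegate: UIResponder, UIApplicationDelegate {

    var window: UIWindow?

    func application(_ application: UIApplication,
                     didFinishLaunchingWithOptions launchOptions: [UIApplicationLaunchOptionsKey: Any]?) -> Bool {
        requestAuthorization()
        return true
    }

    // Ask the user for permission to use Siri with this app.
    private func requestAuthorization() {
        INPreferences.requestSiriAuthorization { status in
            switch status {
            case .authorized:
                print("Siri authorization granted")
            case .denied:
                print("Siri authorization denied")
            case .restricted:
                print("Siri authorization restricted")
            case .notDetermined:
                print("Siri authorization not determined")
            }
        }
    }
}
```

The callback delivers an INSiriAuthorizationStatus; for this lesson we only care about authorized and denied, but the switch covers all four cases so it compiles exhaustively.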
I'm going to remove the predefined message intents and add INRequestRideIntent. Okay, now it's time to handle our intent. Your IntentHandler requires only one method, handler(for:). First, I'm checking that the intent is of the correct type, in case you support multiple intents and want to direct each one to the correct handler. Then I return a new instance of RideRequestHandler, which we are going to create in a minute. I'm also passing a fakeRides attribute to the handler that simulates rides in the city of Vienna, where I live. If you don't keep your shared data in a framework, you need to add the shared files to the Intents extension so they can be accessed. You can do this by selecting the relevant files and then checking RideIntent under Target Membership. It's time to create the RideRequestHandler so that we can finally interact with Siri. We need to import the Intents framework and then create a plain class with an initializer that assigns the fakeRides parameter to a constant on our instance. So how does SiriKit work with intents? Well, there are three phases in the process: resolution, confirmation, and execution. The first one is responsible for gathering all the necessary data, meaning asking for pickup and drop-off locations and so on. You can provide the information statically, but if a value is required and you don't have it, Siri is going to ask the user. The second phase, confirmation, summarizes the gathered data and presents to the user what is going to happen if they confirm the ride. In the execution phase, your app executes the request and reports its status back to Siri. Let's begin with resolution. First, we need to conform to the INRequestRideIntentHandling protocol. This gives us access to all the relevant functions for gathering the necessary information. First, we need a pickup location. To find it, we can use resolvePickupLocation. Within this method I can switch between a few possible cases.
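The IntentHandler entry point described above could be sketched like this. FakeRides stands in for the lesson's shared data type (its contents are placeholders), and the type check routes the intent to the RideRequestHandler we are about to build:

```swift
import Intents

class IntentHandler: INExtension {

    // Simulated rides around Vienna, shared with the main app.
    // FakeRides is the lesson's own type; treat it as a placeholder here.
    let fakeRides = FakeRides()

    // The one required entry point: return the object that
    // handles the given intent, per intent type.
    override func handler(for intent: INIntent) -> Any? {
        if intent is INRequestRideIntent {
            return RideRequestHandler(fakeRides: fakeRides)
        }
        return nil
    }
}
```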
If I don't already have a pickup location, I can call the completion block with the needsValue case, and Siri is going to request it from the user. If there is a location, I could verify it in my app, but in our test case I'm just going to report success and provide the location. The next step is the drop-off location. This works in a similar way. Some rides, however, don't need a drop-off location, and in that case you can use notRequired to indicate it. You can find all possible states in INPlacemarkResolutionResult. Again, if there is a drop-off location, I will return the success case. If I build and run now, there will be an error, since the protocol requires one more method to be present, which belongs to the execution phase and handles the final request. For now, let's just say we don't offer service in this area and require an app launch. In the execution phase, we don't just return a result; we need to initialize a response with a code and a user activity. When running for the first time, you need to run the main app to request authorization for Siri. If you do so in a simulator, you will run into one major issue: you need an entitlements file and, therefore, a development account. I need to add the Siri capability to our application, so let me quickly enter my data and enable Siri. Why do I need an entitlements file? Because Siri intents can't run in the simulator at all; I need to test this on an actual device. I have connected my iPhone to my Mac. You won't need an iPhone 7 for this; it just needs to be running iOS 10 and have Siri. You can see that it asks me for permission to use Siri with TutsplusCourses. As I know now, the app name is not ideal, and you will see why in the next few interactions with Siri. As I mentioned, natural language processing is hard, and even more so if you don't speak in your native language. Next, we need to run our intent and have our first conversation with Siri. Find me a ride with TutsplusCourses.
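Putting the resolution and execution steps together, the RideRequestHandler might look like the sketch below. FakeRides is again the lesson's placeholder data type; the class subclasses NSObject because the Intents protocols are Objective-C protocols, and the method names are the iOS 10 forms of INRequestRideIntentHandling:

```swift
import Intents

class RideRequestHandler: NSObject, INRequestRideIntentHandling {

    // Simulated ride data passed in from the IntentHandler.
    let fakeRides: FakeRides

    init(fakeRides: FakeRides) {
        self.fakeRides = fakeRides
        super.init()
    }

    // MARK: Resolution phase

    func resolvePickupLocation(forRequestRide intent: INRequestRideIntent,
                               with completion: @escaping (INPlacemarkResolutionResult) -> Void) {
        if let pickup = intent.pickupLocation {
            // A real app would verify the placemark; here we just accept it.
            completion(.success(with: pickup))
        } else {
            // No value yet, so Siri asks the user for a pickup location.
            completion(.needsValue())
        }
    }

    func resolveDropOffLocation(forRequestRide intent: INRequestRideIntent,
                                with completion: @escaping (INPlacemarkResolutionResult) -> Void) {
        if let dropOff = intent.dropOffLocation {
            completion(.success(with: dropOff))
        } else {
            // A drop-off location is optional for some services.
            completion(.notRequired())
        }
    }

    // MARK: Execution phase (the method the protocol requires)

    func handle(requestRide intent: INRequestRideIntent,
                completion: @escaping (INRequestRideIntentResponse) -> Void) {
        // For now, pretend we don't offer service here and
        // require an app launch instead of booking a ride.
        let response = INRequestRideIntentResponse(code: .failureRequiringAppLaunch,
                                                   userActivity: nil)
        completion(response)
    }
}
```

Note how each resolution method reports back through its completion block rather than a return value: that is how Siri decides whether to keep asking the user or move on to confirmation.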
>> Great, confirming you want to use TutsplusCourses. [SOUND]
>> Yes.
>> [SOUND] Sorry, I don't know where that is. Sorry, I missed that.
>> Book me a ride with-
>> What was that again?
>> Book me a ride with TutsplusCourses.
>> Would you like to use TutsplusCourses? [SOUND] Where should they pick you up? [SOUND]
>> Vienna Airport.
>> I wish I could, but the airport utility hasn't set that up with me yet. Sorry, I didn't get that.
>> Book me a ride with TutsplusCourses.
>> Would you like to use TutsplusCourses? Where should they pick you up? [SOUND]
>> Vienna State Opera. [SOUND]
>> TutsplusCourses can be there. Do you want to request it? [SOUND]
>> Yes. [SOUND]
>> Sorry, TutsplusCourses doesn't offer service in your area.
>> Well, it took me a few tries, but finally Siri understood what I wanted. And as expected, there is no service in the area. I can tell you that the iPhone 7 understands me much better than the iPhone 6 does, which is the phone I recorded with for video production reasons. Let's finish this lesson here and continue with the resolution and execution in the next lesson. See you there.