3.2 UI and Plane Detection
In this lesson we'll get the project started and create a simple, usable UI. We'll also enable plane detection.
Hi, and welcome back to Get Started With Augmented Reality for iOS. In this lesson, we get the project for our measurement app started and set up the UI.

So, let's create a new project for this app. I'm calling it Measure. We can leave all the other settings as they are. First of all, I'm going to select my iPhone as the run destination. Then I'm going to the ViewController and removing the pre-configured asset from the scene.

Now it's time to set up the storyboard. The way the template is set up, you can't really combine the scene view with other elements. So the first thing we have to do is replace the scene view with a standard UIView. Within that view, we can re-add the scene view and resize it to match the screen. I don't want to use constraints for the scene view, so I'm just going to use the autoresizing mask. Finally, I'm connecting it to the sceneView property of the view controller.

The next thing we need is a label. This will be our crosshair. I'm going to use constraints this time to center it horizontally and vertically within the scene view. The crosshair can be a simple plus sign, and it's going to be white.

Let's build and run the application for the first time and see if everything is working before we continue with development. As usual, during the first launch, we are asked for permission to use the camera, so that's a good sign. The crosshair is present as well. Awesome. Let's continue.

I also want a label beneath it that tells the user what to do. It's going to say something like "Tap screen to start measuring". I'm going to center it horizontally and then give it vertical spacing to the crosshair label. I'll also downsize it a bit and widen it. This label is going to indicate whether we are ready to measure or not: if it's hidden, we aren't, and if it's visible, the user can start measuring things. So I need an IBOutlet, which I'll call infoLabel. Now we can connect it in the storyboard to the actual label.
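The view controller at this point might look something like the sketch below. The outlet name `infoLabel` follows the transcript; `sceneView` comes from the ARKit template, and clearing the scene stands in for removing the template's pre-configured asset:

```swift
import UIKit
import SceneKit
import ARKit

class ViewController: UIViewController, ARSCNViewDelegate {

    // The scene view, re-added inside a plain UIView in the storyboard
    // and connected back to the template's outlet.
    @IBOutlet var sceneView: ARSCNView!

    // "Tap screen to start measuring" — hidden until tracking is ready.
    @IBOutlet weak var infoLabel: UILabel!

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.delegate = self

        // The template's pre-configured asset has been removed,
        // so we start with an empty scene.
        sceneView.scene = SCNScene()
    }
}
```

The crosshair label needs no outlet, since we never change it from code.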
The user interface is set up now, so it's time to change the ARKit configuration. We can add plane detection, and we want to hide the info label initially.

To determine whether the tracking system is ready to track surfaces, there is a callback method called session(_:cameraDidChangeTrackingState:). Whenever the tracking state changes to or from limited or unavailable, this function gets called. So we can check if the camera's tracking state is normal, or even better, use a switch statement: in the normal case we unhide the label, and in any other case we hide it. So if we aren't ready to track at any time, the info label will be hidden; otherwise, it will show. This can also happen mid-measurement.

Now let's build and run and see how it does. In the beginning, it doesn't show the label, since the system isn't initialized yet, but after a few seconds it shows up. Great.

Our user interface is now set up and plane detection is enabled. In the next lesson, we're going to start measuring by recording points and displaying them on the screen. See you there!
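The configuration change and the tracking-state callback described above could be sketched like this. `ARSCNViewDelegate` inherits from `ARSessionObserver`, so the scene view's delegate receives the tracking-state callback; the outlet names are the ones assumed earlier in this lesson:

```swift
import ARKit

// Inside the ViewController from before (ARSCNViewDelegate conformance
// gives us the ARSessionObserver callbacks).

override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)

    // Enable horizontal plane detection in the world-tracking configuration.
    let configuration = ARWorldTrackingConfiguration()
    configuration.planeDetection = .horizontal

    // Hide the info label until the tracking system is ready.
    infoLabel.isHidden = true

    sceneView.session.run(configuration)
}

// Called whenever tracking changes to or from limited / not available.
func session(_ session: ARSession, cameraDidChangeTrackingState camera: ARCamera) {
    switch camera.trackingState {
    case .normal:
        infoLabel.isHidden = false   // ready — user can start measuring
    default:
        infoLabel.isHidden = true    // limited or unavailable — hide the prompt
    }
}
```

Switching on `camera.trackingState` rather than a plain `if` keeps the two states explicit and makes it easy to handle `.limited` reasons separately later if needed.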