FREE · Lessons: 8 · Length: 41 minutes



2.1 Sample Exploration

In this lesson, we'll explore the sample project in order to learn the basic features of a Cardboard app. We'll look at the component parts of the sample app in more detail, and I'll give you an overview of how it works. We'll look at the Java and XML layout files that work together to present a stereoscopic view to the user.


Hello, and welcome back to the Tuts+ VR for Google Cardboard and Android course. My name's Sue Smith. Last time we got Android Studio set up to work with the Google sample app for Cardboard. This time we'll explore the project.

Let's start by opening the main Java activity file. At first glance it looks like a lot of code, but a large part of it is for modelling 3D shapes. The sample project defines its shapes in OpenGL, which takes a substantial amount of code. In reality, you might use Unity or another third-party tool to create your 3D shapes, so a lot of this code might not apply when you build your own projects.

Let's look through the key components that you'll need when you create any Cardboard app, starting with the manifest. We'll assume that you've used an Android manifest before, but for Cardboard apps a few entries are worth noting. The uses-feature element allows the app to use OpenGL ES version 2. The screen orientation must be set to landscape for VR apps, as the Cardboard view will only render in landscape. The NFC permission allows the app to access the NFC tag in the user's Cardboard viewer. The permissions to read and write external storage allow the Cardboard SDK to pair the user's device with the Cardboard viewer. The app uses the accelerometer and gyroscope in the user's device to track their head movements as they move around a VR scene. And finally, the Cardboard intent filter allows the device to list the app as Cardboard-compatible.

Let's go back and look at the main activity in more detail. The activity extends CardboardActivity. When the device presents a Cardboard activity, it uses the full screen, so the user isn't able to access other parts of the user interface while viewing a Cardboard app. The activity also implements the CardboardView.StereoRenderer interface. If you've used any Cardboard apps already, you'll have noticed that the view presents two images side by side, one for each eye.
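Taken together, the manifest entries described above might look roughly like the fragment below. This is a sketch for orientation only: the package name and activity name are placeholders, and the exact entries can vary between versions of the Cardboard sample.

```xml
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="com.example.cardboardsample">

    <!-- Cardboard rendering requires OpenGL ES 2.0 -->
    <uses-feature android:glEsVersion="0x00020000" android:required="true" />

    <!-- Access the NFC tag in the Cardboard viewer -->
    <uses-permission android:name="android.permission.NFC" />
    <!-- Let the Cardboard SDK pair the device with the viewer -->
    <uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
    <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />

    <application android:label="Cardboard Sample">
        <!-- VR views only render in landscape -->
        <activity android:name=".MainActivity"
            android:screenOrientation="landscape">
            <intent-filter>
                <action android:name="android.intent.action.MAIN" />
                <category android:name="android.intent.category.LAUNCHER" />
                <!-- Marks the app as Cardboard-compatible -->
                <category android:name="com.google.intent.category.CARDBOARD" />
            </intent-filter>
        </activity>
    </application>
</manifest>
```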
This is known as stereoscopic rendering: the two images create the 3D effect when the user views the app, and the stereo renderer handles the details of presenting the scene in this way. Implementing the interface requires us to provide certain methods within the class.

Let's scroll down to the onCreate method. We see that the activity uses a view called CardboardView, which we can find in the app's layout directory. We'll switch to the text view so that we can see the code; at the moment, Android Studio isn't able to render the Cardboard layout resources graphically, because these features are still experimental. Notice that the layout file contains two views. One of them is a CardboardView. This is the view you're likely to use whenever you create Cardboard apps: it presents the stereo images for the scene that you define in your app. The sample app also uses a CardboardOverlayView. If you've run the app already, you'll have seen the overlay text appear in this view.

If we go back to the main activity, we see that the code also sets a renderer on the view. It's able to use the activity class because that implements the stereo renderer interface we looked at earlier.

Let's look at the interface methods. onNewFrame executes every time the app renders a scene. You can see that, as part of the method, the sample app retrieves the user's head position, which it will use to determine whether or not the user is looking at an object in the 3D scene. The onDrawEye method executes when the scene is rendered for each eye. It uses the OpenGL shape code to draw the visible shapes, with some additional helper methods we'll look at in more detail later in the course.

Finally, let's look at two methods in the class that handle user interaction. The app uses a helper method to determine whether or not the user is looking at the cube shape. When the user looks at the cube, the app renders it in a different color.
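To make the look-detection idea concrete, here's a minimal, self-contained sketch in plain Java. It mimics the general approach the sample takes (transforming the object's position by the head view matrix, then checking the pitch and yaw angles against limits), but the multiplyMV helper, the limit values, and the class and method names here are our own stand-ins, not the sample's actual code.

```java
public class LookDetector {
    // Multiply a column-major 4x4 matrix by a 4-vector. This is a hypothetical
    // stand-in for android.opengl.Matrix.multiplyMV, so the example runs
    // outside Android.
    static float[] multiplyMV(float[] m, float[] v) {
        float[] r = new float[4];
        for (int row = 0; row < 4; row++) {
            r[row] = m[row] * v[0] + m[row + 4] * v[1]
                   + m[row + 8] * v[2] + m[row + 12] * v[3];
        }
        return r;
    }

    // Returns true if the object, transformed into eye space by headView,
    // sits within the given pitch/yaw limits of the view direction (-Z).
    static boolean isLookingAtObject(float[] headView, float[] objectPos,
                                     float pitchLimit, float yawLimit) {
        float[] eye = multiplyMV(headView,
                new float[] { objectPos[0], objectPos[1], objectPos[2], 1f });
        float pitch = (float) Math.atan2(eye[1], -eye[2]);
        float yaw = (float) Math.atan2(eye[0], -eye[2]);
        return Math.abs(pitch) < pitchLimit && Math.abs(yaw) < yawLimit;
    }

    public static void main(String[] args) {
        // Identity head view: the user is looking straight down -Z.
        float[] identity = { 1,0,0,0, 0,1,0,0, 0,0,1,0, 0,0,0,1 };
        // Cube directly ahead at z = -8: within the limits.
        System.out.println(isLookingAtObject(identity,
                new float[] { 0f, 0f, -8f }, 0.2f, 0.2f)); // true
        // Cube far off to the side: outside the limits.
        System.out.println(isLookingAtObject(identity,
                new float[] { 8f, 0f, -1f }, 0.2f, 0.2f)); // false
    }
}
```

The key design point is that the test happens in eye space: once the object's position has been run through the head view matrix, "is the user looking at it" reduces to two small-angle comparisons.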
When the user presses the trigger on the Cardboard viewer, the app calls another method, displaying the text overlay if the user was looking at the shape. At that point the app also moves the cube to a different location. Next time we'll start making some changes to the sample app code.
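The "move the cube" step can be sketched as a pure-math helper: pick a random direction and place the cube at a fixed distance from the user, keeping it near eye level so it stays findable. This is loosely modelled on what the sample does after a trigger press, but the method name, the angle limits, and the distance used here are illustrative assumptions, not the sample's exact values.

```java
public class CubeMover {
    // Returns a new {x, y, z} position for the cube at the given distance
    // from the user. Yaw is unconstrained (anywhere around the user); pitch
    // is clamped to roughly +/-25 degrees so the cube stays near eye level.
    static float[] newCubePosition(float objectDistance, java.util.Random rng) {
        float yaw = (float) (rng.nextFloat() * 2.0 * Math.PI);
        float pitch = (float) Math.toRadians(rng.nextFloat() * 50.0 - 25.0);
        float x = (float) (objectDistance * Math.cos(pitch) * Math.sin(yaw));
        float y = (float) (objectDistance * Math.sin(pitch));
        float z = (float) (-objectDistance * Math.cos(pitch) * Math.cos(yaw));
        return new float[] { x, y, z };
    }

    public static void main(String[] args) {
        float[] p = newCubePosition(8f, new java.util.Random());
        System.out.printf("New cube position: (%.2f, %.2f, %.2f)%n",
                p[0], p[1], p[2]);
    }
}
```

Because the position is built from spherical angles at a fixed radius, the cube always ends up exactly objectDistance units from the user, wherever it lands.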
