Functions are the core concept of serverless. In this lesson, I'll teach you how to write and deploy them.
Hi, and welcome back to Introduction to Serverless. In this lesson, you are going to learn about the core concept of serverless technology: functions. I've already told you about the Functions as a Service model in the last lesson. One of the most important ways to eliminate servers from your architecture is to remove the server runtime that needs to be running all the time. The major cloud providers all have services that allow you to run individual functions. On Amazon Web Services it's called Lambda, on Google Cloud it's called Cloud Functions, and on Microsoft Azure it's Azure Functions. They all share the same functionality: you define a function, add a trigger to have it executed under certain circumstances, and you can connect it to other services very easily. The main point is that you pay per invocation, based on execution time and the amount of memory used. It is a minuscule amount, so you need to be in the millions of executions to pay more than $1 per month. I'm going to continue this lesson using AWS Lambda as an example, but it's similar on the other providers as well. Let's have a look at an example function. I'm already logged in to my account. To get to the Lambda service, you can either type Lambda into the search box or go to Services and, under the Compute section, which is the first one in the list, you'll find Lambda. If you don't have any functions yet, you will be greeted with this screen. Keep in mind that Lambda is region-aware, so a function created in Northern Virginia won't be accessible in the Ohio region, for instance. So let's create a new function with the button on the right. You will be asked to select a blueprint. If you know what you're doing, you can skip that and use Author from scratch to start with a blank slate. We're going to use an example function, however. As you can see, there are a lot of predefined functions. They are written in either Node.js or Python, but AWS Lambda supports Java 8 and .NET Core as well.
As long as the binaries are in 64-bit mode. Let's filter for the Node.js 6.10 runtime. As you can see, Alexa skills are a very prominent use case for Lambda, but we are going to use the simple hello-world function. Now, Amazon asks us to choose a trigger to execute the function. You can choose not to have one, or configure it later. You have a few options here: the API Gateway, which we will talk about in the next lesson, different CloudWatch events, and triggers for S3, the Simple Notification Service (SNS), and more. I'm not going to add one for now. It's time to configure the function. You have to give it a name that is unique to you and the region you're in. Then you can add a description, which is optional but always a good idea. Finally, you need to choose the runtime. As mentioned before, the options are C#, Java, Node.js, or Python. If you use one of the last two, you can edit your functions inline below, provided they are simple enough and don't have external dependencies. Otherwise, you have to upload a ZIP file containing your code. As you can see, the function exports a handler function that simply logs the values of key1, key2, and key3 that were passed into it. Finally, it calls callback to indicate that the function has finished. This is very important, as you can have asynchronous operations, for instance a database write, that aren't finished when the function returns. Every invocation of a Lambda function is passed three values: an event with a payload, a context object that contains metadata about the execution, and the callback function. You can also add environment variables that are set prior to the execution. This is useful if you have secrets that you don't want committed to a repository; these can also be encrypted. Finally, we have to configure what to execute when the function runs. The handler will be the file name plus the exported handler.
In our case, we just have one file, which is automatically called index, and we exported the function as handler. So index.handler is the value to give. When you have more complex setups, you'll want to change this, but I recommend handler as a good convention for the exported function name in every file. We also need a role that is used to execute the function. This role determines what the function can access. You can either choose an existing role or create a new one from a template or from scratch. I'm going to use a template. You need at least the basic Lambda permission set so the function can log its output. In one of the next lessons, we are also going to use more advanced permissions to access external services. If you want to know what each of the templates does, click the Learn more link. You can also add tags to your function, like you can with almost every resource in AWS. And if you have advanced needs, there is also a way to increase the memory up to 1.5 gigabytes, or the timeout up to 5 minutes. The timeout determines when the function will be terminated forcefully. For the memory, use as little as you need. Function invocations are billed in gigabyte-seconds, multiplying the allocated memory by the execution time. A function that reserves 128 megabytes can be executed twice as often for the same price as a function that uses 256 megabytes. You can also choose to use a DLQ, or dead-letter queue, for whenever an execution fails. Normally, Lambda will retry the execution of an event twice, but you can choose to send it to an SNS topic or SQS queue instead for further processing by yourself. If you want to access resources that are within a custom VPC, you have to choose it here. You will need to give your Lambda role the appropriate permissions to configure it, though. Finally, if you have performance problems with your function, enable active tracing to allow AWS X-Ray to monitor and trace your function executions.
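The gigabyte-seconds math can be sketched as a quick back-of-envelope calculation. This assumes duration is rounded up to the nearest 100 milliseconds, which was Lambda's billing granularity at the time of this lesson; the helper name is my own.

```javascript
// Back-of-envelope Lambda cost comparison: billed gigabyte-seconds are
// the allocated memory (in GB) multiplied by the billed execution time.
// Assumption: duration rounds up to the nearest 100 ms increment.
function gbSeconds(memoryMb, durationMs) {
    const billedMs = Math.ceil(durationMs / 100) * 100; // round up to 100 ms
    return (memoryMb / 1024) * (billedMs / 1000);
}

// A 128 MB function accrues half the GB-seconds of a 256 MB function
// with the same duration, so it can run twice as often for the same price.
console.log(gbSeconds(128, 200)); // 0.025 GB-s
console.log(gbSeconds(256, 200)); // 0.05 GB-s
```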
Now that we went over all the settings, we can review what we set, and then create the function. Let's test the function manually. When I press Test for the first time, it shows me a configuration screen to define the event data. There is preconfigured test data for many event sources; a DynamoDB update event, for instance, is quite complex. Let's save and test to see our function in action. For the test execution, you can see the result at the top of the page. It shows the value returned by the function, as well as the execution time, billed duration, memory used, and the complete log output. There, all the output from console.log is visible. In the Monitoring tab, you can also see statistics about the function invocations, including minimum, maximum, and average values. To review the log files in CloudWatch, you can either click the link provided here or go to Services > Management Tools > CloudWatch. On the left side, go to Logs and you will see a list of log groups. In our case, it's just the one Lambda function we have. Select it, and also select the one log stream there is. You can see I hit the Test button a bunch of times to generate some data for monitoring, and this is all the output. To recap: functions are invoked whenever they are needed and eliminate the need for a server executable to run continuously. The amount you pay for each function invocation is extremely small. Triggers execute functions based on predefined conditions. On AWS, logging during invocations happens via CloudWatch. In the next lesson, we are going to talk about using functions to run HTTP endpoints. See you there.
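Since the Test button essentially just invokes the handler with a hand-crafted event, you can mimic the same cycle locally while developing. This is a minimal sketch, with an inline handler and a made-up test event, not an official AWS testing tool.

```javascript
// A minimal local stand-in for the console's Test button: invoke the
// handler with a hand-crafted event, an empty stub context, and a
// callback that reports the outcome. All names here are illustrative.
const handler = (event, context, callback) => {
    console.log('Received keys:', Object.keys(event).join(', '));
    callback(null, `Hello, ${event.key1}`);
};

// Hand-crafted event data, like the JSON entered in the Test dialog.
const testEvent = { key1: 'value1', key2: 'value2', key3: 'value3' };

handler(testEvent, {}, (err, result) => {
    if (err) {
        console.error('Execution failed:', err);
    } else {
        console.log('Execution result:', result);
    }
});
```

Running this with Node.js prints the same kind of output you would otherwise dig out of the CloudWatch log stream.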