# What to Expect From PHP 5.5

The first PHP 5.5 alpha has been publicly released. After having some time to test and experiment with it, we can now bring you our in-depth overview of what to look forward to!

## Installation

If you'd like to follow along with this article, you'll need to install PHP 5.5 for yourself. You can find the link to the source bundle here. Additionally, if you need the Windows binary file, you can grab it from the same page.

Once you have the source code downloaded, extract everything into a folder and navigate to it with your favorite Terminal program. You can install PHP to wherever you like, but, for convenience, I'm going to create a directory in the root of my drive, called PHP5.5. To create a new folder and then make your user the owner of said folder, type the following into your terminal:

```shell
sudo mkdir /PHP5.5
sudo chown username /PHP5.5
```

Next, you have to decide which extensions and features you want installed with your copy of PHP. Since this is an Alpha version, meant for testing only, I'm not going to worry about making it fully featured. For this tutorial, I am only going to install cURL, but there might be other things that you'd want to add, such as MySQL or zip support. To view a full list of configuration options, run:

```shell
./configure -h
```

Besides the option to install cURL, there are two other properties that we need to set: the --prefix and --with-config-file-path options. These set up the location of the PHP installation and of the .ini file, respectively. So, in the terminal, type:

```shell
./configure --prefix=/PHP5.5 --with-config-file-path=/PHP5.5 --with-curl=ext/curl
make
make install
```

The first line configures PHP with cURL, and sets the location to the folder we made earlier. The next two lines build PHP and move the files to the specified directory. The next step is to copy the sample php.ini file to the PHP folder. To do this, run:

```shell
cp php.ini-development /PHP5.5/php.ini
```

You should now have everything installed correctly. The easiest way to test this new version out is to run its built-in web server. Navigate to the bin folder inside the PHP5.5 directory (`cd /PHP5.5/bin`), and type `./php -t /Path/To/Directory -S 0.0.0.0:4444`.

• The -t option sets the server's root directory (i.e. the location where you will place your PHP files).
• The -S option sets the IP address and port number to which the server should bind. Using 0.0.0.0 for the IP address tells the server to listen on all IP addresses (e.g. localhost, 127.0.0.1, 192.168.0.100, etc.).

If all goes well, you should be greeted with a message telling you that the server is listening on the IP/port specified, and it will tell you the document root where it's serving from. We can now start toying with the new features!
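To confirm the server is actually running your fresh build, you could drop a small script into the document root (the file name and the checks it performs are just a suggestion of mine):

```php
// version.php - a quick sanity check that the built-in server
// is serving the freshly built PHP 5.5 binary.
echo "Running PHP " . PHP_VERSION . "\n";

// The Generator class only exists as of PHP 5.5.
echo "Generators available: " . (class_exists('Generator') ? "yes" : "no") . "\n";
```

Visiting http://localhost:4444/version.php should then report a 5.5.x version string.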

## Generators

The biggest addition to PHP 5.5 has got to be generators. Generators allow you to create custom functions that retain state between runs. They work off a new keyword, yield, which can be used in a function both to output and to receive data.

Essentially, when the function gets to a line that contains the yield command, PHP will freeze the function's execution and go back to running the rest of your program. When you call for the function to continue - either by telling it to move on or by sending it data - PHP will go back to the function and pick up where it left off, preserving the values of any local variables that were defined up to there.

This may sound somewhat cool at first, but if you give it some thought, this opens doors to a lot of interesting design options. Firstly, it simulates the effects of other programming languages that have "Lazy Evaluation," like Haskell. This alone allows you to define infinite data sets and model math functions after their actual definition. Besides that, you don't have to create as many global variables; if a variable is only meant for a specific function, you can include it in a generator, and things like counters happen automatically by the generator itself in the form of the returned object's key.

Well that's enough theory for now; let's take a look at a generator in action. To start off, navigate to the document root you defined when running the PHP server, and create a file, called "index.php". Now, open up the file and type in the following:

```php
function fibonacci(){
    $a = 0;
    $b = 1;

    while(true)
    {
        $a = $a + $b;
        $b = $a - $b;
        yield $a;
    }
}
```

This is the "Hello World" function of infinite datasets: a generator that will output every Fibonacci number. To use the generator, all you have to do is type:

```php
$fib = fibonacci();
$fib->current();
$fib->next();
$fib->current();
//...
```

What's happening here is that we're making $fib a generator object, which gives us access to the underlying commands, like current() and next(). The current() function returns the current value of the generator; this is the value of whatever you yielded in the function - in our case, $a. You can call this function multiple times and you will always get the same value, because current() doesn't tell the generator to continue evaluating its code. That's where next() comes in; next() unfreezes the iterator and continues on with the function. Since our function is inside an infinite while loop, it will just freeze again at the next yield command, and we can get the next value with another call to current().

If you needed to accomplish something like this in the past, you would have had to write some kind of for loop that pre-calculates the values into an array, halting after a certain number of iterations (e.g. 100) so as not to overload PHP. The benefit of generators is that the local variables are persistent, and you can just write what the function is supposed to do, as opposed to how it should do it. In other words, you simply write the task, without worrying about global variables or how many iterations should be performed.

The other way yield can be used is to receive data.
It works in the same way as before: when the function gets to a line with the yield keyword, it will stop, but, instead of reading data with current(), we give it data with the send() function. Here is an example of this in action:

```php
function Logger(){
    $log_num = 1;

    while(true){
        $f = yield;
        echo "Log #" . $log_num++ . ": " . $f;
    }
}

$logger = Logger();

for($i = 0; $i < 10; $i++){
    $logger->send($i * 2);
}
```

This is a simple generator for displaying log messages. All generators start off in a paused state; a call to send() (or even current()) will start the generator and run it until it hits a yield command. The send() call will then pass in the sent data and continue processing the function until it reaches the next yield. Every subsequent call to send() processes one loop iteration: it enters the sent data into $f, then continues until the loop comes back around to the next yield.

So why not just put this into a regular function? Well, you could, but, then, you would either need a separate global variable to keep track of the log number, or you would need to create a custom class.
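For comparison, here's a sketch of the kind of small class you would otherwise write just to carry the counter (the class design here is my own, not from PHP itself):

```php
// Pre-generator alternative: a class whose only job is to
// remember the log number between calls.
class CounterLogger {
    private $log_num = 1;

    public function log($message){
        echo "Log #" . $this->log_num++ . ": " . $message . "\n";
    }
}

$logger = new CounterLogger();
$logger->log("first entry");  // Log #1: first entry
$logger->log("second entry"); // Log #2: second entry
```

It works, but it's a lot of ceremony for a single persistent variable; the generator version expresses the same idea in half the code.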

Don't think of generators as a way to do something that was never possible, but rather as a tool to do things faster and more efficiently.

Even infinite sets were possible before, but your code would either need to reprocess the list from the beginning each time (i.e. go through the math until it reached the current index), or store all of its data within global variables. With generators, your code is much cleaner and more concise.
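To illustrate the automatic keys mentioned earlier, here's a minimal finite generator (the function name is my own invention); each yielded value gets an auto-incrementing key, so the counter comes for free:

```php
// Yields the first $n even numbers; the generator assigns
// the keys 0, 1, 2, ... automatically.
function evens($n){
    for($i = 0; $i < $n; $i++){
        yield $i * 2;
    }
}

foreach(evens(5) as $key => $value){
    echo "#" . $key . " => " . $value . "\n"; // #0 => 0, #1 => 2, ...
}
```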

## Lists in foreach Statements

The next update that I found to be quite helpful is the ability to break a nested array into local variables within a foreach statement. The list construct has been around for a while (since PHP 4); it maps the elements of an array onto a list of variables. So, instead of writing something like:

```php
$data = array("John", "Smith", "032-234-4923");

$fName = $data[0];
$lName = $data[1];
$cell  = $data[2];
```

You could just write:

```php
$data = array("John", "Smith", "032-234-4923");

list($fName, $lName, $cell) = $data;
```

The only problem was that, if you had an array of arrays (a nested array) that you wanted to map, you couldn't do the mapping directly in a foreach loop. Instead, you would have to assign each foreach result to a local variable, and only then map it with a list construct inside the loop.

As of version 5.5, you can cut out the middle man and clean up your code. Here's an example of the old way, versus the new:

```php
//--Old Method--//
foreach($parentArr as $childArr){
    list($one, $two) = $childArr;
    //Continue with loop
}

//--New Method--//
foreach($parentArr as list($one, $two)){
    //Continue with loop
}
```

The old way might not seem like too much trouble, but it's messy and makes the code less readable.
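Here's the same idea with some concrete data (the sample records are made up for illustration):

```php
$people = array(
    array("John", "Smith", "032-234-4923"),
    array("Jane", "Doe", "032-555-1234"),
);

// PHP 5.5: unpack each child array directly in the loop header.
foreach($people as list($fName, $lName, $cell)){
    echo $fName . " " . $lName . " (" . $cell . ")\n";
}
```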

## Password Hashing

Now this one requires a little knowledge of hashes and encryption to fully appreciate.

The easiest way to hash a password in PHP has been to use something like MD5 or a SHA algorithm. The problem with hash functions like these is that they are incredibly easy to compute. They aren't useful anymore for security; nowadays, they should only be used for verifying a file's integrity. I installed a GPU hasher on my computer to test out this claim. On my Mac's built-in graphics card, I was able to go through over 200 million hashes a second! If you were dedicated, and invested in a top-of-the-line multi-graphics-card setup, you could potentially go through billions of hashes a second.

The technology behind these methods was not meant to last.

So how do you fix this problem? The answer is to impose an adjustable burden on the algorithm - in other words, you make it expensive to compute. Not so expensive that it takes a couple of seconds per hash, as that would ruin the user's experience. But imagine that you made it take half a second to generate. A user likely wouldn't even notice the delay, yet someone attempting to brute-force the hash would have to run through millions of tries - if not more - and all those half seconds would add up to decades and centuries. What about the problem of computers getting faster over time? That is where the "adjustable" part comes in: every so often, you raise the complexity of generating a hash, ensuring that it always takes roughly the same amount of time. This is what the developers of PHP are trying to encourage people to do.

The new PHP library is a hashing "workflow," which lets people easily hash, verify and upgrade hashes and their respective complexities over time. It currently only ships with the bcrypt algorithm, but the PHP team has also added an option named PASSWORD_DEFAULT, which you can use; it will point your hashes at the most secure algorithm available if a new default is ever added.

The way bcrypt works is that it runs your password through Blowfish encryption a number of times, but, instead of using Blowfish with a key so that you could reverse it later, it passes the result of the previous run as the key to the next iteration. According to Wikipedia, it runs your password through 2^x iterations, where x is the adjustable cost parameter. Say, right now, you want to use a cost level of 5: bcrypt will run your hash through 2^5, or 32, iterations. This may seem like a low number, but, since the cost parameter scales the function exponentially, changing it to 15 would run it through 2^15, or 32,768, iterations. The default cost level is 10, but this is configurable in the source code.
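To pick a sensible cost for your own hardware, you can benchmark it. This sketch (the 50 ms target and the starting cost are arbitrary choices of mine) raises the cost until a single hash crosses the target time:

```php
$target = 0.05; // aim for roughly 50 ms per hash
$cost = 8;

do {
    $cost++;
    $start = microtime(true);
    // Hash a throwaway string at the candidate cost and time it.
    password_hash("benchmark", PASSWORD_BCRYPT, array("cost" => $cost));
    $elapsed = microtime(true) - $start;
} while ($elapsed < $target);

echo "Appropriate cost for this machine: " . $cost;
```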

With the theory out of the way, let's take a look at a complete example.

```php
$pass = "Secret_Password";
$hash = password_hash($pass, PASSWORD_BCRYPT, array('cost' => 12, 'salt' => "twenty.two.letter.salt"));

if(password_verify($pass, $hash)){
    if(password_needs_rehash($hash, PASSWORD_DEFAULT, array('cost' => 15))){
        $hash = password_hash($pass, PASSWORD_DEFAULT, array('cost' => 15));
    }
    //Do something with hash here.
}
```

The password_hash function accepts three parameters: the string to hash, a constant for the hashing algorithm, and an optional array of settings, which includes the salt and cost. The next function, password_verify, checks that a plain-text string matches the given hash, and, finally, the password_needs_rehash function checks whether a hash conforms to the parameters given. In our case, we created the hash with a cost of twelve, but here we are specifying fifteen, so the function will return true, meaning that the hash needs to be regenerated.

You may have noticed that, in the password_verify and password_needs_rehash functions, you don't have to specify the hashing method used, the salt, or the cost. This is because those details are prepended to the hash string.
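You can see this for yourself with password_get_info, which parses those prepended details back out of the hash string:

```php
$hash = password_hash("Secret_Password", PASSWORD_BCRYPT, array('cost' => 12));

echo $hash . "\n"; // e.g. $2y$12$... - the algorithm and cost are visible up front

$info = password_get_info($hash);
print_r($info['options']); // Array ( [cost] => 12 )
echo $info['algoName'];    // bcrypt
```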

The reason why it's okay to bundle the cost and salt along with the hash, rather than keep them secret, is how the system's strengths combine. The cost doesn't need to be a secret, because it is meant to impose a big enough computational burden on its own: even if someone gets your hash and determines that your cost level requires half a second per attempt, he will know what level to brute-force at, but it will still take him far too long to crack (e.g. decades).

Salts are used to prevent hashes from being precomputed into a rainbow table.

A rainbow table is basically a key-value store containing a "dictionary" of words, with their corresponding hashes as the keys.
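As a toy illustration (the word list here is made up), an unsalted hash can be reversed with a single lookup:

```php
// Build a tiny "rainbow table": hash => original word.
$table = array();
foreach (array("password", "123456", "letmein") as $word) {
    $table[md5($word)] = $word;
}

// Reversing a stolen, unsalted MD5 hash is now just an array lookup.
$stolen = md5("letmein");
echo isset($table[$stolen]) ? "Cracked: " . $table[$stolen] : "Not found";

// A unique salt per password defeats this: md5($salt . "letmein")
// would no longer match any precomputed key.
```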

## Bits and Bobs

One quite cool addition is support for constant array/string dereferencing. What this means is that you can access individual characters of a string literal as if the string were an array of characters, and likewise index directly into an array literal. A quick example of this is the following:

```php
echo "Hello World"[1];           //this line will echo out 'e'
echo ["one", "two", "three"][2]; //this echoes "three"
```

Next, we have the finally keyword. It's appended to the end of a try/catch block, and it instructs PHP that, whether or not the try succeeded or the catch was triggered, you want the finally section to run. This is good for situations where you want to handle the outcome of a try/catch statement: instead of repeating code in both branches, you can put just the "risky" part in the try/catch block, and all the common processing in the finally block.

Another usage that was suggested by the creator as a best practice is to put all cleanup code in the finally block. This will ensure that you don't, for instance, try to close the same stream multiple times (e.g. your code crashed and went into the catch block after closing already, and you try closing it again).
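A minimal sketch of that advice (the function and file handling here are my own invention, not from the PHP 5.5 release itself):

```php
function readFirstKilobyte($path){
    $handle = fopen($path, "r");

    try {
        return fread($handle, 1024); // the "risky" part
    } finally {
        // Runs whether the read succeeded or threw an exception,
        // so the stream is closed exactly once - never twice.
        fclose($handle);
    }
}
```

Note that, as of 5.5, a try block with only a finally (and no catch) is also valid; the cleanup runs and the exception still propagates.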

The last thing worth mentioning is that the old MySQL extension is deprecated in this new release. You should convert your code to either the mysqli or PDO extension instead. Though the extension has long been considered an anti-pattern, it's nice to see the PHP team officially deprecate it.
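As a rough before-and-after (using an in-memory SQLite database so the snippet is self-contained; swap the DSN for your MySQL credentials):

```php
// Old, deprecated style - raises E_DEPRECATED notices in PHP 5.5:
//   $link = mysql_connect('localhost', 'user', 'pass');
//   $result = mysql_query("SELECT name FROM users");

// PDO equivalent, with a prepared statement:
$pdo = new PDO('sqlite::memory:');
$pdo->exec('CREATE TABLE users (name TEXT)');

$stmt = $pdo->prepare('INSERT INTO users (name) VALUES (?)');
$stmt->execute(array('John'));

$names = $pdo->query('SELECT name FROM users')->fetchAll(PDO::FETCH_COLUMN);
print_r($names); // Array ( [0] => John )
```

Beyond deprecation, prepared statements like the one above also protect against SQL injection, which mysql_query never did on its own.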

While there are certainly more updates to dig into, the items in this article represent what I feel are the most important and exciting.

## Conclusion

Thanks for reading; I hope you've learned a bit! As always, if you have any comments or questions, jump into the conversation below, and let's talk!