New Neural Networks Coming to PA

Last Friday I got the crazy idea to update the neural networks used by the AI. Why, you ask? Well, during work Thursday I watched a live stream by Alex Champandard that went over modern neural network techniques. A couple of things mentioned during the live stream caught my attention. One was that neural networks are making a comeback. The second was a new (at least to me) activation function that has gained popularity.

The first point made me laugh because not very long ago I would get weird looks and questioning faces when I mentioned using neural networks for Supreme Commander 2. The second point highlighted the fact that my already limited knowledge of neural networks might be getting out of date. So, I took last Friday off work and spent the majority of the weekend reading research papers and fiddling with the neural networks in PA to see if I could improve them.

The first step was to update my neural network class to be more robust. The class I had in place was pretty rigid and unsuitable for what I had planned. The first thing I did was add support for having more than one hidden layer. It seems like a simple change, but one that was, up until now, completely unnecessary. The research I read backs that up as well. The second thing I added was support for specifying different activation functions per layer type. Sure, I could make them per layer, but for now per layer type is fine. These two changes opened up a lot of new possibilities.
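To make the two changes concrete, here is a minimal sketch of that kind of class: a perceptron that accepts any number of hidden layers and a separate activation function per layer type (hidden vs. output). All names, the weight initialisation, and the structure here are my own assumptions for illustration, not PA's actual code.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class MLP:
    """A small multi-layer perceptron with configurable depth and
    per-layer-type activation functions (hypothetical sketch)."""

    def __init__(self, layer_sizes, hidden_act=sigmoid, output_act=sigmoid):
        # layer_sizes, e.g. [3, 5, 5, 2] -> input, two hidden layers, output
        self.layer_sizes = layer_sizes
        self.hidden_act = hidden_act
        self.output_act = output_act
        # One weight matrix and bias vector per connection between layers,
        # initialised to small fixed values purely for this illustration.
        self.weights = [
            [[0.1] * layer_sizes[i] for _ in range(layer_sizes[i + 1])]
            for i in range(len(layer_sizes) - 1)
        ]
        self.biases = [[0.0] * n for n in layer_sizes[1:]]

    def forward(self, inputs):
        values = inputs
        last = len(self.weights) - 1
        for i, (layer_w, layer_b) in enumerate(zip(self.weights, self.biases)):
            # Pick the activation by layer type: output vs. hidden.
            act = self.output_act if i == last else self.hidden_act
            values = [
                act(sum(w * v for w, v in zip(node_w, values)) + b)
                for node_w, b in zip(layer_w, layer_b)
            ]
        return values
```

With this shape, adding a hidden layer is just one more entry in `layer_sizes`, and swapping the hidden activation is one argument, e.g. `MLP([3, 5, 5, 2]).forward([1.0, 0.5, -0.5])`.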

Currently the live build of PA uses a trio of three-layer multi-layer perceptrons, one each for land, fighter, and bomber/gunship platoons. The hidden and output layers of these neural networks use a sigmoid activation function to squash the input values they receive from the previous layer. The reason simply adding more hidden layers isn't an immediate win is, as I understand it, the sigmoid activation function itself, or rather its derivative. To train a neural network you need to be able to calculate an error amount for each output in the neural network; PA does this by backpropagating the error and applying gradient descent. The problem this poses is that as you propagate the error value further up the stack of layers, the gradient gets tinier and tinier, so the layers nearer to the input layer (top) have a harder time learning than the layers closer to the output layer (bottom).
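A few lines of Python show why the sigmoid's derivative causes this. The derivative s'(x) = s(x)(1 - s(x)) never exceeds 0.25, so each sigmoid layer the error signal passes back through scales it by at most 0.25, shrinking it geometrically on its way toward the input layer:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_derivative(x):
    s = sigmoid(x)
    return s * (1.0 - s)

# Best case: the derivative at its peak (x = 0) is exactly 0.25.
peak = sigmoid_derivative(0.0)

# Even in this best case, an error signal passing back through
# successive sigmoid layers is scaled down geometrically.
signal = 1.0
for layer in range(1, 6):
    signal *= peak
    print(f"after {layer} layer(s): {signal}")
```

After only five layers the best-case gradient is already below 0.001, which is the vanishing-gradient problem in miniature.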

To get around this, people are using what is called a rectified linear activation function (ReLU). The benefits of using a rectified linear activation function are three-fold. One, the rectified linear activation function is cheaper to calculate (as is its derivative), which allows you to have more hidden nodes for the same cost. Two, it allows for sparsity in the neural network, where a portion of the nodes in each layer will be inactive for a given set of inputs. Three, because a neuron's full input value is passed through (as long as it is positive), rather than squashed, the gradient survives, allowing for effective gradient descent in the upper layers.
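All three properties are visible in a few lines: both the function and its derivative are a single comparison, negative inputs produce exactly zero (sparsity), and positive inputs pass through with a gradient of exactly 1 rather than something capped at 0.25.

```python
def relu(x):
    # Rectified linear activation: pass positive inputs through unchanged,
    # clamp negative inputs to zero.
    return x if x > 0.0 else 0.0

def relu_derivative(x):
    # Gradient is exactly 1 for active (positive) inputs, 0 otherwise.
    return 1.0 if x > 0.0 else 0.0

inputs = [-2.0, -0.5, 0.3, 1.7]
activations = [relu(x) for x in inputs]           # [0.0, 0.0, 0.3, 1.7]
gradients = [relu_derivative(x) for x in inputs]  # [0.0, 0.0, 1.0, 1.0]
```

The two negative inputs yield inactive nodes, while the active nodes backpropagate their error at full strength no matter how many layers deep the network is.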

All in all, this allows for faster training times and potentially more effective neural networks. It also allows neural networks to grow upwards (deeper) instead of outwards (wider), which is more efficient to calculate. So far, I have been quite happy with the results. Once the patch that includes them goes live, I hope you will be too.


Steviepunk said... / February 13, 2015 at 4:43 AM  

I'll look out for the new AI appearing in PA.
It's certainly a good day as a programmer when you find a new technique that is not only better at the required job, but also more efficient!

Been a long time since you last posted, good to see you back :)


Richard Keene said... / June 9, 2015 at 10:22 AM  

I have found that charge-up-based neuron models result in more realistic behavior. The various 'desires' (e.g. attack, flee, patrol) charge up the way neurons accumulate excitation, and then fire, toggling to that behavior. This gives a semi-random twitchiness to the AI and very animalistic behavior. In TheVOID I did this for the spiders and drones and they appear to be 'thinking'.
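The charge-up idea the commenter describes could be sketched like this, under my own assumptions about the details: each desire accumulates noisy excitation every tick, and the first one to cross its threshold fires and becomes the active behavior. The names, thresholds, and rates below are hypothetical, not taken from TheVOID.

```python
import random

class Desire:
    """One accumulate-and-fire 'desire' (hypothetical sketch)."""

    def __init__(self, name, threshold, rate):
        self.name = name
        self.threshold = threshold
        self.rate = rate       # base excitation gained per tick
        self.charge = 0.0

    def tick(self, rng):
        # Accumulate excitation with a little noise, like a neuron
        # charging toward its firing threshold; the noise is what gives
        # the semi-random twitchiness.
        self.charge += self.rate * (0.5 + rng.random())
        return self.charge >= self.threshold

def run_brain(desires, max_ticks=100, seed=42):
    rng = random.Random(seed)
    for tick in range(max_ticks):
        for desire in desires:
            if desire.tick(rng):
                desire.charge = 0.0   # fire and reset
                return desire.name, tick
    return None, max_ticks

behavior, fired_at = run_brain([
    Desire("attack", threshold=5.0, rate=1.0),
    Desire("flee", threshold=8.0, rate=0.8),
    Desire("patrol", threshold=3.0, rate=0.5),
])
```

Because the noise varies which desire charges fastest, repeated runs with different seeds can toggle to different behaviors at different times.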


