CodeNewbie Community 🌱

Tristan

Neural networks in C++ from scratch. Definitions

Introduction

  • This series is going to be my attempt to build a very basic neural network in C++ from scratch. All the information from this series can be found in the book, C++ Neural Networks and Fuzzy Logic by Valluru B. Rao.

A Quick word

  • This first post in the series is on chapter one and there will be no actual creation of a neural network. Instead we will try to get a solid understanding of basic terminology used when creating a neural network. With that being said, let us move on.

Neural network

  • A neural network is a computational structure inspired by the study of biological neural processing. There are many types of neural networks, but we are going to start with the layered feed-forward type, which is actually one of the simplest forms of neural network.

Layered feed-forward

  • Not surprisingly, a layered feed-forward network has layers, or rather subgroups of processing elements. A layer of processing elements makes independent computations on the data it receives and passes the results to another layer, and so on, until a subgroup of one or more processing elements determines the output for the network. Each processing element makes its computation based upon a weighted sum of its inputs.
  • So basically, there are layers, and each layer is full of processing elements that each make independent computations on data and pass the results to the next layer, until the data goes through the last layer, the output layer, which outputs our result.

Image of feed-forward network

  • The classic image of a feed-forward network can be found HERE, and it is usually what comes to mind when we think of neural nets.

Output of a Neuron

  • The output of a neuron (processing element) in a neural network is a weighted sum of its inputs.

Example: Price is Right cash register game

  • If you are unfamiliar with the Price is Right, no worries, the example will still make sense. In the cash register game on the Price is Right, a few products are described, their prices are unknown to the contestant, and the contestant has to decide how many units of each item they would like to buy. The price of each product is then revealed and the amount is totalled up on the cash register. The contestant must be careful that the total price does not exceed some nominal value; if the total stays under that amount, the contestant wins a prize.

Example: explained

  • The products are the neurons in the input layer, each with its input being the unit price of that product. The cash register is a single neuron in the output layer. The only connections in the network are between each neuron (product) in the input layer and the output neuron.
  • The contestant actually determines these connections: when the contestant says they want 5 of a particular product, they are assigning a weight (more on weights later) of 5 to the connection between that product and the cash register. The contestant's bill is nothing but the weighted sum of the unit prices of the different products offered. For the items the contestant does not choose to purchase, the implicit weight assigned is 0.

Weights

  • The weights used on connections between different layers have much significance in the working of the neural network and in the characterization of a network. The following actions are possible in a neural network:

1) Start with one set of weights and run the network (no training).

2) Start with one set of weights, run the network, modify some or all of the weights, and run the network again with the new set of weights. Repeat this process until some predetermined goal is met (training).

Training

  • Since the output may not be what we expected, the weights may need to be altered. Some rule then needs to be applied to determine how to update the weights; this is called training. Adjusting the weights on each connection between nodes until we get a desired result is how we train our neural network.

Feedback

  • If you wish to train a network so it can recognize or identify some predetermined pattern, or evaluate some function values for given arguments, it is important to have information fed back from the output neurons to neurons in some earlier layer, to enable further processing and adjusting of the weights on the connections. Such feedback can go to the input layer or to a layer between the input and output layers, called the hidden layer. What is fed back is usually the error in the output, modified appropriately according to some specification. This feedback process continues through subsequent cycles of operation of the neural network and ceases when training is completed.

Noise

  • Noise is deviation from the actual. A dataset used to train a neural network may have some noise in it, or an image may have random specks in it. The response of the neural network to noise is an important factor in determining its suitability to a given application. We may want to introduce noise intentionally in training to find out if the network can learn in the presence of noise.

Neural Network Construction

  • There are 3 aspects to the construction of a neural network:

1) Structure
2) Encoding
3) Recall

Structure

  • This relates to how many layers the network should contain. Structure also encompasses how the interconnections between neurons in the network are made and what their functions are.

Encoding

  • Encoding refers to the paradigm used for determining and changing the weights on the connections between neurons. In the case of the layered feed-forward network, we might initially define the weights at random, and then through the process of training we update them. Once we are finished training, we are finished with encoding.

Recall

  • Recall refers to getting an expected output for a given input. If the same input as before is presented to the network, the same corresponding output as before should result.

  • Structure, Encoding and Recall are very important because they essentially distinguish between different neural networks.

Conclusion

  • Thank you for taking the time out of your day to read this blog post of mine. If you have any questions or concerns please comment below or reach out to me on Twitter.
