Example Code

Artificial Neural Network Multilayer Perceptron with Back Propagation

Code and Documents

Attachment

Overview

This code is a 3-layer MLP, complete with a back propagation option for training the network on datasets (I also have a version with a variable number of hidden layers set up if there is demand for it).

Description

The MLP in this code can be run with any number of input and output nodes, but the structure is fixed at three layers: one input layer, one hidden layer, and one output layer. It therefore requires two 2-dimensional arrays for the two sets of weights. Run Demo.vi to get an idea of how the code works. The demo example uses two inputs (x, y), any number of hidden neurons (10 by default), and two outputs (red, green).
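The two weight arrays correspond to the input-to-hidden and hidden-to-output connections. The actual implementation is a set of LabVIEW VIs; purely as a text-based illustration of the same 3-layer structure and back-propagation update, a minimal Python/NumPy sketch could look like the following (the class name MLP3, the sigmoid activations, and the squared-error cost are assumptions made for the sketch, not details taken from the VI):

# Minimal sketch of a 3-layer MLP with back propagation (illustrative only;
# the actual example is implemented as LabVIEW VIs).
# Assumptions: sigmoid activations on both layers, squared-error loss.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class MLP3:
    def __init__(self, n_in, n_hidden, n_out, seed=0):
        rng = np.random.default_rng(seed)
        # Two 2-D weight arrays, matching the two weight arrays described above.
        self.W1 = rng.normal(0.0, 0.5, size=(n_hidden, n_in + 1))   # input -> hidden (+ bias)
        self.W2 = rng.normal(0.0, 0.5, size=(n_out, n_hidden + 1))  # hidden -> output (+ bias)

    def forward(self, x):
        x_b = np.append(x, 1.0)            # append bias input
        h = sigmoid(self.W1 @ x_b)         # hidden activations
        h_b = np.append(h, 1.0)            # append bias for hidden layer
        y = sigmoid(self.W2 @ h_b)         # output activations
        return x_b, h_b, y

    def train_step(self, x, target, lr=0.5):
        x_b, h_b, y = self.forward(x)
        # Output-layer delta for squared error with sigmoid outputs
        delta_out = (y - target) * y * (1.0 - y)
        # Hidden-layer delta, back-propagated through W2 (bias column dropped)
        delta_hid = (self.W2[:, :-1].T @ delta_out) * h_b[:-1] * (1.0 - h_b[:-1])
        # Gradient-descent weight updates
        self.W2 -= lr * np.outer(delta_out, h_b)
        self.W1 -= lr * np.outer(delta_hid, x_b)

# Example: 2 inputs (x, y), 10 hidden neurons, 2 outputs (red, green),
# mirroring Demo.vi's default configuration.
net = MLP3(n_in=2, n_hidden=10, n_out=2)
for _ in range(1000):
    net.train_step(np.array([0.2, 0.7]), np.array([1.0, 0.0]))  # a "red" point
    net.train_step(np.array([0.8, 0.3]), np.array([0.0, 1.0]))  # a "green" point
print(net.forward(np.array([0.2, 0.7]))[2])  # approaches [1, 0] after training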


Steps to Implement or Execute Code

  1. Download the attachment
  2. Run Demo.vi

Requirements

Software

     LabVIEW 2009 or higher

Hardware

     Any computer that can run LabVIEW 2009 or higher

Additional Images or Video



Example code from the Example Code Exchange in the NI Community is licensed with the MIT license.

Comments
Jok_Garces
Member
on

I reviewed this example, but it would also be useful to explain how to use the Machine Learning toolkit and how to use its neural network icons.

asdfvar
Member
on

To provide more detail on the example: the code that drives the neural network is ANN.vi, and Demo.vi uses the ANN.vi routine to demonstrate how to use it. ANN.vi is a general-purpose 3-layer ANN that can be used for any number of inputs and outputs. In this example, the user sets the data points for training (x, y positions for two colored categories) and the ANN trains on that data. During and after training, the user can have the network evaluate additional data to observe where the classification boundaries fall on the graph.
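As a rough, hypothetical illustration of that boundary visualization (again in Python rather than LabVIEW, reusing the MLP3 sketch earlier on this page): evaluate the trained network over a grid of (x, y) points and label each point by the larger of the two outputs.

# Illustrative only; in the example this is done by Demo.vi calling ANN.vi.
# Assumes the MLP3 class from the sketch above is defined.
import numpy as np

def classify_grid(net, n=50):
    labels = np.empty((n, n), dtype=int)
    for i, x in enumerate(np.linspace(0.0, 1.0, n)):
        for j, y in enumerate(np.linspace(0.0, 1.0, n)):
            out = net.forward(np.array([x, y]))[2]
            labels[j, i] = int(np.argmax(out))  # 0 = red, 1 = green
    return labels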

Contributors