Backpropagation Programming

Highly skilled engineer with 14 years of experience in academia, R&D, and commercial product development, supporting the full software life cycle from idea to implementation and ongoing support. During my academic career I succeeded in the MIT Computers in Cardiology 2006 international challenge; as an R&D and SW engineer I earned CodeProject MVP and found algorithmic solutions that quickly resolved tough customer problems and met product requirements under tight deadlines. My key areas of expertise include Object-Oriented Analysis and Design (OOAD), OOP, machine learning, natural language processing, face recognition, computer vision and image processing, wavelet analysis, and digital signal processing in cardiology. Note that the console app on CodeProject is limited to training on one output neuron only.

My ANN lib supports any number of output neurons (3, 10, 100, ...). You just need to modify the main console file to feed the ANN lib the desired outputs, for example: 0.9 0.1 0.1 for the 1st class, 0.1 0.9 0.1 for the 2nd class, 0.1 0.1 0.9 for the 3rd class, and so on. For any number of output neurons, put the cls1, cls2, cls3, ... entries, as you presented in your post, into one file (cls), provide an empty file (void), and do the training: >>ann1dn.exe t net.nn cls void 1000. I have developed before a similar-looking console app that gets the ANN to train and run on any number of outputs as I described; I can give it to you, but not for free.
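To make the desired-output encoding above concrete, here is a minimal C++ sketch of how such target vectors could be built for N output neurons: 0.9 for the target class and 0.1 everywhere else. The function and file names are illustrative assumptions, not the actual ANN lib or ann1dn.exe API.

```cpp
// Sketch only: builds the desired-output vectors described above.
// MakeTarget is a hypothetical helper, not part of the ANN lib.
#include <cstdio>
#include <vector>

// Target vector for `numClasses` output neurons where `cls` (0-based) is the
// desired class, e.g. cls = 0, numClasses = 3 -> {0.9, 0.1, 0.1}.
std::vector<double> MakeTarget(int cls, int numClasses)
{
    std::vector<double> target(numClasses, 0.1);
    target[cls] = 0.9;
    return target;
}

int main()
{
    const int numClasses = 3;
    for (int cls = 0; cls < numClasses; ++cls)
    {
        std::vector<double> t = MakeTarget(cls, numClasses);
        std::printf("class %d targets:", cls + 1);
        for (double v : t)
            std::printf(" %.1f", v);
        std::printf("\n");
    }
    return 0;
}
```

The same scheme extends to 10 or 100 classes; only numClasses changes, and each line written to the cls training file carries one such target vector alongside its input features.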

Yes, another neural network breakthrough. This one provides a way of training with reinforcement learning, but without the need for the biologically implausible and expensive backpropagation method. The latest work from OpenAI is quite a shock, or perhaps not if you have been a supporter of such alternatives. See also: A Step by Step Backpropagation Example (mattmazur.com). Neural network programming looks like a whole different world for someone like me.
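For readers in that position, here is a minimal sketch of plain backpropagation in the spirit of the step-by-step example linked above: a 2-3-1 sigmoid network trained on XOR with squared error and vanilla gradient descent. The network size, learning rate, and epoch count are illustrative choices, not taken from any of the libraries mentioned here.

```cpp
// Minimal backpropagation sketch: 2 inputs, 3 hidden sigmoid neurons, 1 output.
#include <cmath>
#include <cstdio>
#include <cstdlib>

static double sigmoid(double x) { return 1.0 / (1.0 + std::exp(-x)); }

int main()
{
    const int IN = 2, HID = 3;
    const double lr = 0.5;
    const double X[4][2] = {{0, 0}, {0, 1}, {1, 0}, {1, 1}};  // XOR inputs
    const double Y[4]    = {0, 1, 1, 0};                      // XOR targets

    // Weights and biases with small random initialization.
    double w1[HID][IN], b1[HID], w2[HID], b2 = 0.0;
    std::srand(42);
    for (int h = 0; h < HID; ++h) {
        b1[h] = 0.0;
        w2[h] = (double)std::rand() / RAND_MAX - 0.5;
        for (int i = 0; i < IN; ++i)
            w1[h][i] = (double)std::rand() / RAND_MAX - 0.5;
    }

    for (int epoch = 0; epoch < 20000; ++epoch) {
        for (int n = 0; n < 4; ++n) {
            // Forward pass.
            double hidden[HID];
            for (int h = 0; h < HID; ++h) {
                double z = b1[h];
                for (int i = 0; i < IN; ++i) z += w1[h][i] * X[n][i];
                hidden[h] = sigmoid(z);
            }
            double zo = b2;
            for (int h = 0; h < HID; ++h) zo += w2[h] * hidden[h];
            double out = sigmoid(zo);

            // Backward pass: error signal at the output, then at each hidden neuron.
            double dOut = (out - Y[n]) * out * (1.0 - out);
            double dHid[HID];
            for (int h = 0; h < HID; ++h)
                dHid[h] = dOut * w2[h] * hidden[h] * (1.0 - hidden[h]);

            // Gradient-descent weight updates.
            for (int h = 0; h < HID; ++h) {
                w2[h] -= lr * dOut * hidden[h];
                b1[h] -= lr * dHid[h];
                for (int i = 0; i < IN; ++i)
                    w1[h][i] -= lr * dHid[h] * X[n][i];
            }
            b2 -= lr * dOut;
        }
    }

    // Print the learned mapping.
    for (int n = 0; n < 4; ++n) {
        double hidden[HID];
        for (int h = 0; h < HID; ++h) {
            double z = b1[h];
            for (int i = 0; i < IN; ++i) z += w1[h][i] * X[n][i];
            hidden[h] = sigmoid(z);
        }
        double zo = b2;
        for (int h = 0; h < HID; ++h) zo += w2[h] * hidden[h];
        std::printf("%g XOR %g -> %.3f (target %g)\n",
                    X[n][0], X[n][1], sigmoid(zo), Y[n]);
    }
    return 0;
}
```

The reinforcement-learning work referenced above replaces exactly this backward pass with gradient-free parameter perturbations; the forward pass stays the same.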
