Random Number Generator for the Weights of the Conjugate Gradient Neural Network Method

Authors

  • Yudistira Arya Sapoetra STMIK ASIA Malang, Indonesia
  • Azwar Riza Habibi STMIK ASIA Malang, Indonesia
  • Lukman Hakim STMIK ASIA Malang, Indonesia

DOI:

https://doi.org/10.31316/j.derivat.v4i1.161

Abstract

This research develops neural network (NN) theory by using the conjugate gradient (CG) method to speed up convergence of an NN. The CG algorithm is an iterative method for solving large-scale systems of simultaneous linear equations, and here it is used to optimize the training of a backpropagation network. During training, a neural network assigns random values to the weights v and w, and these initial weights affect the convergence speed of the NN algorithm under the CG method. The random numbers used as the weight generator in this research are drawn from a uniform (0,1) distribution. Therefore, the aim of this research is to improve NN convergence by initializing the weights with numbers produced by the random generator, which are then corrected with the CG method.

Keywords: neural network, backpropagation, weighting, conjugate gradient
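The weight-initialization and correction scheme described in the abstract can be illustrated with a short sketch. The code below is not the authors' implementation; the layer sizes, toy data, step size, and the Fletcher-Reeves form of the CG coefficient are assumptions made here for illustration. It draws the weights v and w from a uniform (0,1) generator and then corrects them with conjugate-gradient search directions built from backpropagation gradients.

```python
import numpy as np

# Minimal sketch (assumed details, not the authors' code): weights v (input->hidden)
# and w (hidden->output) are drawn from a uniform(0, 1) generator, then updated
# along Fletcher-Reeves conjugate-gradient directions instead of plain gradient descent.
rng = np.random.default_rng(0)          # seed fixed only for reproducibility

n_in, n_hid, n_out = 2, 3, 1            # illustrative layer sizes
v = rng.uniform(0.0, 1.0, size=(n_in, n_hid))    # weights v ~ U(0, 1)
w = rng.uniform(0.0, 1.0, size=(n_hid, n_out))   # weights w ~ U(0, 1)

X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])  # toy inputs (assumed)
t = np.array([[0.], [1.], [1.], [0.]])                  # toy targets (assumed)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gradients(v, w):
    """Backpropagation gradients of the squared error with respect to v and w."""
    z = sigmoid(X @ v)                  # hidden activations
    y = sigmoid(z @ w)                  # network output
    delta_out = (y - t) * y * (1 - y)   # output-layer delta
    delta_hid = (delta_out @ w.T) * z * (1 - z)
    return X.T @ delta_hid, z.T @ delta_out

# Conjugate-gradient correction of the randomly initialised weights,
# flattening both weight matrices into one parameter vector.
g_v, g_w = gradients(v, w)
g = np.concatenate([g_v.ravel(), g_w.ravel()])
d = -g                                  # first direction = steepest descent
eta = 0.5                               # assumed fixed step size (a line search is usual)

for _ in range(50):
    step = eta * d
    v += step[:v.size].reshape(v.shape)
    w += step[v.size:].reshape(w.shape)
    g_v, g_w = gradients(v, w)
    g_new = np.concatenate([g_v.ravel(), g_w.ravel()])
    beta = (g_new @ g_new) / (g @ g)    # Fletcher-Reeves coefficient
    d = -g_new + beta * d               # conjugate search direction
    g = g_new
```

In this sketch both weight matrices are flattened into a single vector so that one CG direction updates v and w together; replacing the fixed step size with a line search would give the standard form of the method.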

Published

2017-07-20
