Neural Network Utility is a neural network library for the C++ programmer. It is entirely object oriented and focuses on reducing the tedious and confusing parts of programming neural networks: network layers are easy to define, an entire multi-layer network can be created in a few lines and trained with two function calls, and layers can be connected to one another easily and painlessly.
Tags: Programmer Networks entirely network
Upload time: 2013-12-24
Uploader: liuchee
neural network: a demo of the basic principles, written so that beginners can learn from it.
Upload time: 2013-12-25
Uploader: 181992417
Load forecasting based on neural networks, implemented in MATLAB.
Tags: the forecast Networks neural
Upload time: 2016-12-21
Uploader: iswlkje
k-step ahead predictions determined by simulation of the one-step ahead neural network predictor. For NNARMAX models the residuals are set to zero when calculating the predictions. The predictions are compared to the observed output. (A sketch of the simulation recursion follows this entry.)
Tags: ahead predictions determined simulation
Upload time: 2016-12-27
Uploader: busterman
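A minimal Python sketch of that simulation, assuming a hypothetical one-step predictor predict_one_step(y_past, u_past) in place of the trained network (this is not the uploaded MATLAB code): earlier predictions are fed back in as output regressors, and for an NNARMAX-style model the residual regressors would simply be held at zero.

```python
import numpy as np

def k_step_ahead(predict_one_step, y, u, k, na=2, nb=2):
    """Simulate a one-step ahead predictor k steps into the future.

    predict_one_step(y_past, u_past) -> next output (hypothetical interface);
    y and u are the observed output/input histories, and na/nb are the numbers
    of past outputs/inputs used as regressors (NNARX-style).
    """
    y_sim = list(y)                                        # start from observed outputs
    for _ in range(k):
        y_past = np.array(y_sim[-na:])                     # most recent (possibly predicted) outputs
        u_past = np.asarray(u[len(y_sim) - nb:len(y_sim)]) # aligned known inputs
        y_sim.append(predict_one_step(y_past, u_past))     # feed the prediction back in
    return np.array(y_sim[len(y):])                        # the k predictions beyond the data


# Toy usage with a linear stand-in for the neural one-step predictor.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    u = rng.normal(size=30)
    y = np.zeros(30)
    for t in range(2, 30):
        y[t] = 0.6 * y[t - 1] - 0.2 * y[t - 2] + 0.5 * u[t - 1]

    def predict_one_step(y_past, u_past):
        return 0.6 * y_past[-1] - 0.2 * y_past[-2] + 0.5 * u_past[-1]

    print(k_step_ahead(predict_one_step, y[:20], u, k=5))
```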
Train a two-layer neural network with the Levenberg-Marquardt method. If desired, regularization by weight decay can be used, and pruned (i.e., not fully connected) networks can also be trained. Given a set of corresponding input-output pairs and an initial network, [W1,W2,critvec,iteration,lambda] = marq(NetDef,W1,W2,PHI,Y,trparms) trains the network with the Levenberg-Marquardt method. The activation functions can be either linear or tanh. The network architecture is defined by the matrix NetDef, which has two rows: the first row specifies the hidden layer and the second row specifies the output layer. (A sketch of the Levenberg-Marquardt step follows this entry.)
Tags: Levenberg-Marquardt desired network neural
Upload time: 2016-12-27
Uploader: jcljkh
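The following Python sketch shows the core of a Levenberg-Marquardt iteration of the kind marq implements; it is not the toolbox code itself. Each step solves (J'J + lambda*I) step = -J'e and adjusts lambda depending on whether the step reduced the squared error. The finite-difference Jacobian and the tiny 1-2-1 tanh network in the usage section are illustrative assumptions.

```python
import numpy as np

def levenberg_marquardt(residual, theta, n_iter=50, lam=1.0):
    """Sketch of a Levenberg-Marquardt loop for least-squares training.

    residual(theta) -> vector of prediction errors e = y - yhat.
    lam is raised when a step is rejected and lowered when it is accepted,
    which controls the effective step length.
    """
    def jacobian(th, eps=1e-6):
        e0 = residual(th)
        J = np.zeros((e0.size, th.size))
        for i in range(th.size):
            t = th.copy()
            t[i] += eps
            J[:, i] = (residual(t) - e0) / eps    # finite-difference column
        return J, e0

    for _ in range(n_iter):
        J, e = jacobian(theta)
        while True:
            step = -np.linalg.solve(J.T @ J + lam * np.eye(theta.size), J.T @ e)
            if np.sum(residual(theta + step) ** 2) < np.sum(e ** 2):
                theta, lam = theta + step, lam * 0.5   # accept step, trust the model more
                break
            lam *= 2.0                                 # reject step, damp harder
            if lam > 1e10:
                return theta
    return theta


# Toy usage: fit a 1-2-1 network (tanh hidden layer, linear output) to a curve.
if __name__ == "__main__":
    x = np.linspace(-2.0, 2.0, 40)
    y = np.sin(x)

    def residual(theta):
        W1 = theta[:4].reshape(2, 2)     # hidden weights and biases (2 units)
        W2 = theta[4:].reshape(1, 3)     # output weights and bias
        h = np.tanh(W1 @ np.vstack([x, np.ones_like(x)]))
        yhat = (W2 @ np.vstack([h, np.ones_like(x)])).ravel()
        return y - yhat

    theta0 = np.random.default_rng(1).normal(scale=0.5, size=7)
    print("final SSE:", np.sum(residual(levenberg_marquardt(residual, theta0)) ** 2))
```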
This function applies the Optimal Brain Surgeon (OBS) strategy for pruning neural network models of dynamic systems, i.e., networks trained by NNARX, NNOE, NNARMAX1, NNARMAX2, or their recursive counterparts. (A sketch of the OBS saliency rule follows this entry.)
Tags: function strategy Optimal Surgeon
Upload time: 2013-12-19
Uploader: ma1301115706
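As a reminder of what the OBS strategy does, here is a Python sketch of the underlying Hassibi-Stork rule: the saliency of weight q is L_q = w_q^2 / (2 [H^-1]_qq), the least salient weight is removed, and the remaining weights are adjusted to compensate. The toolbox function additionally handles the dynamic-system model structures and retraining, which this sketch does not attempt.

```python
import numpy as np

def obs_prune_one(w, H_inv):
    """One Optimal Brain Surgeon step: a sketch of the core rule only.

    w     : current weight vector (1-D).
    H_inv : inverse Hessian of the training criterion at w.
    Returns the index of the pruned weight and the adjusted weight vector.
    """
    # Saliency of removing weight q: L_q = w_q^2 / (2 * [H^-1]_qq).
    saliency = w ** 2 / (2.0 * np.diag(H_inv))
    q = int(np.argmin(saliency))              # cheapest weight to remove
    # Compensating update of the remaining weights:
    # delta_w = -(w_q / [H^-1]_qq) * H^-1 e_q, which also drives w_q to zero.
    w_new = w - (w[q] / H_inv[q, q]) * H_inv[:, q]
    w_new[q] = 0.0                            # exactly zero after pruning
    return q, w_new


# Toy usage with a quadratic criterion, so the Hessian is known exactly.
if __name__ == "__main__":
    H = np.array([[4.0, 1.0, 0.0],
                  [1.0, 3.0, 0.5],
                  [0.0, 0.5, 2.0]])
    w = np.array([0.8, -0.05, 0.3])
    q, w_new = obs_prune_one(w, np.linalg.inv(H))
    print("pruned weight index:", q)
    print("adjusted weights   :", w_new)
```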
Train a two-layer neural network with a recursive prediction error algorithm ("recursive Gauss-Newton"). Pruned (i.e., not fully connected) networks can also be trained. The activation functions can be either linear or tanh. The network architecture is defined by the matrix NetDef, which has two rows: the first row specifies the hidden layer while the second specifies the output layer. (A sketch of the recursive update follows this entry.)
Tags: recursive prediction algorithm Gauss-Ne
Upload time: 2016-12-27
Uploader: ljt101007
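A generic Python sketch of one recursive prediction error ("recursive Gauss-Newton") update with exponential forgetting; the gradient vector psi and the forgetting factor lam are generic placeholders, and the toolbox routine's handling of NetDef-style architectures and pruned networks is not reproduced here.

```python
import numpy as np

def rpe_update(theta, P, psi, err, lam=0.98):
    """One recursive prediction error (recursive Gauss-Newton) update.

    theta : parameter (weight) vector.
    P     : current Gauss-Newton covariance-like matrix.
    psi   : gradient of the network output w.r.t. theta at this sample.
    err   : prediction error y - yhat for this sample.
    lam   : exponential forgetting factor.
    """
    denom = lam + psi @ P @ psi
    K = P @ psi / denom                      # Gauss-Newton gain
    theta = theta + K * err                  # parameter update
    P = (P - np.outer(K, psi @ P)) / lam     # covariance update with forgetting
    return theta, P


# Toy usage: track the two weights of a linear-in-parameters predictor,
# for which psi is simply the regressor vector.
if __name__ == "__main__":
    rng = np.random.default_rng(2)
    theta = np.zeros(2)
    P = np.eye(2) * 100.0
    true_theta = np.array([0.7, -0.3])
    for _ in range(200):
        psi = rng.normal(size=2)                       # regressors / gradient
        y = true_theta @ psi + 0.01 * rng.normal()
        theta, P = rpe_update(theta, P, psi, y - theta @ psi)
    print("estimated parameters:", theta)
```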
Hopfield neural network. (A generic sketch of the principle follows this entry.)
Tags: hopfield network neural
Upload time: 2014-01-20
Uploader: 釣鰲牧馬
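Since the entry gives no further detail, here is a minimal generic Python sketch of a Hopfield network (Hebbian storage of +/-1 patterns and asynchronous recall), purely as a reminder of the principle the uploaded demo presumably illustrates; it is not the uploaded code.

```python
import numpy as np

def hopfield_train(patterns):
    """Hebbian storage of +/-1 patterns: W = (sum of outer products) / n, zero diagonal."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)
    return W

def hopfield_recall(W, x, n_sweeps=10, rng=None):
    """Asynchronous recall: repeatedly set one randomly chosen unit to sign(W x)."""
    if rng is None:
        rng = np.random.default_rng(0)
    x = x.copy()
    for _ in range(n_sweeps * x.size):
        i = rng.integers(x.size)
        x[i] = 1 if W[i] @ x >= 0 else -1
    return x


# Toy usage: store one pattern and recover it from a corrupted copy.
if __name__ == "__main__":
    pattern = np.array([1, -1, 1, 1, -1, -1, 1, -1])
    W = hopfield_train(pattern[None, :])
    noisy = pattern.copy()
    noisy[:2] *= -1                      # flip two bits
    print("recalled:", hopfield_recall(W, noisy))
    print("original:", pattern)
```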
Source code accompanying the book Neural Network Design (《神經(jīng)網(wǎng)絡(luò)設(shè)計(jì)》).
Tags: networks neural design neural network
Upload time: 2013-12-18
Uploader: wfeel
Kalman Filtering and Neural Networks - Simon Haykin. The copy previously circulating online was missing Chapter 5; this is the complete version, so feel free to download it. If your English is strong, you are welcome to translate it and upload the translation. Thanks!
Tags: Filtering Networks Kalman Haykin
Upload time: 2017-02-01
Uploader: 標(biāo)點(diǎn)符號