\documentclass[12pt]{book}
\usepackage{amssymb}

\newcommand{\toolboxname}{NetPack}
\title{\toolboxname \\ Userguide (command line)}
%\parindent 0cm

\begin{document}

\newcommand{\file}[1]{\texttt{#1}}
\newcommand{\field}[1]{\texttt{#1}}
\newcommand{\matlab}{Matlab}
\newcommand{\netpackroot}{\texttt{NETPACKROOT}}

% begin definition \tbcmd
{\catcode`\}=12\catcode`\]=2\global\def\tbcmd#{\verb}\let\troep]]
% end definition \tbcmd

% begin definition \mlcmd
{\catcode`\}=12\catcode`\]=2\global\def\mlcmd#{\verb}\let\troep]]
% end definition \mlcmd

% begin definition \seealso
{\catcode`\}=12\catcode`\]=2\global\def\seealso#{\par See also: \verb}\let\troep]]
% end definition \seealso

% begin definition environment example
\newenvironment{example}{\redefxverbatim\begin{verbatim}}{\end{verbatim}}
{\makeatletter\catcode`\/=0\catcode`\\=12/catcode`/{=12/catcode`/}=12/catcode`/[=1/catcode`/]=2/global/def/redefxverbatim[/def/@xverbatim##1\end{example}[##1/end[example]]]]
% end definition environment example

\newcommand{\MU}{MU}
\newcommand{\veci}[1]{\vec{#1}}
\newcommand{\veco}[1]{#1}
\newcommand{\veca}[1]{#1}
\newcommand{\vw}{\vec{w}}
\newcommand{\vdw}{\vec{\Delta w}}
\newcommand{\vx}{\vec{x}}
\newcommand{\vxmu}{\vec{x}_{\mu}}
\newcommand{\vxa}[1]{\vec{x}_{#1}}
\newcommand{\vt}{\vec{t}}
\newcommand{\vtmu}{\vec{t}_{\mu}}
\newcommand{\vta}[1]{\vec{t}_{#1}}
\newcommand{\vy}{\vec{y}}
\newcommand{\vymu}{\vec{y}_{\mu}}
\newcommand{\vxi}{\vec{\xi}}
\newcommand{\vf}{\vec{f}}
\newcommand{\vfhat}{\vec{\hat{f}}}
\newcommand{\vecay}{Y}
\newcommand{\vecat}{T}
\newcommand{\gradE}{\triangledown E}

\maketitle

%---------------------------------------------------------------------
%---------------------------------------------------------------------
\chapter{Overview}
%---------------------------------------------------------------------

The functions in SNN's \toolboxname\ toolbox for \matlab\ can be used
from the \matlab\ command line or from a graphical user interface.
This manual mainly describes the command line interface.
The next chapter describes how to install the toolbox. Chapter
\ref{nn} is a short introduction to feedforward neural networks and
ensembles, and chapter \ref{netpacktoolbox} describes how to use the
toolbox.

\section{Toolbox functions and algorithms}

The \toolboxname\ toolbox can be used to train an ensemble of
feedforward neural networks on regression tasks.

For training you can choose from three different algorithms, known as
\emph{gradient descent}, \emph{conjugate gradient} and
\emph{Levenberg-Marquardt}, all of which try to minimize a cost
function. For this cost function, you can write your own or you can
use \tbcmd{wcf_snn}, which is a very flexible cost function. It
computes a weighted average of output errors, where different weights
can be set for each training pattern and for each output. The output
errors are calculated from output error functions, which can be set
for each output separately. For the output error functions you can
choose from squared error, relative error, log likelihood, cross
entropy and cross logistic, or you can write your own. Additionally,
for each pattern/output pair you can specify not to include the output
error in the average, thus allowing for multi-task learning.
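As an aside, the weighted average of output errors described above can be sketched in a few lines of Python. This is an illustration only, not toolbox code (the toolbox itself is Matlab, and all names here are hypothetical): per-output weights \texttt{a}, per-pattern weights \texttt{g}, per-output error functions, and an inclusion mask \texttt{delta} for multi-task learning.

```python
# Illustrative sketch of a weighted cost function (hypothetical names;
# not the actual wcf_snn implementation).

def weighted_cost(Y, T, a, g, delta, err_fns):
    """Weighted average of output errors.

    Y, T    : Y[i][mu], T[i][mu] -- prediction/target for output i, pattern mu
    a       : a[i]  -- weight per output
    g       : g[mu] -- weight per pattern
    delta   : delta[i][mu] -- 1 to include this pair in the average, 0 to skip
    err_fns : err_fns[i] -- output error function for output i
    """
    total, Z = 0.0, 0.0
    for i in range(len(Y)):
        for mu in range(len(Y[i])):
            if delta[i][mu]:
                w = a[i] * g[mu]
                total += w * err_fns[i](Y[i][mu], T[i][mu])
                Z += w                      # normalization factor
    return total / Z

squared = lambda y, t: (y - t) ** 2

# Two outputs, two patterns; output 1 / pattern 1 is excluded from the
# average (multi-task style):
E = weighted_cost(Y=[[1.0, 2.0], [0.0, 1.0]],
                  T=[[1.0, 1.0], [0.5, 9.9]],
                  a=[1.0, 2.0], g=[1.0, 1.0],
                  delta=[[1, 1], [1, 0]],
                  err_fns=[squared, squared])
```

The excluded pair contributes neither to the sum nor to the normalization factor, which is what makes multi-task learning possible with this cost.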
%Mathematically this cost function is:
%\begin{equation}
% E(y, t) = \frac{1}{Z} \sum_{ \{i, \mu | \Delta_{i\mu} = 1\} }
%             a_i g_{\mu} f^i(y_{i\mu}, t_{i\mu})
%\end{equation}
%with $Z$ a normalization factor:
%\begin{equation}
%Z = \sum_{ \{i, \mu | \Delta_{i\mu} = 1\} } a_i g_{\mu}
%\end{equation}
%
%\begin{tabular}{c l}
%$i$ & output index \\
%$\mu$ & pattern index \\
%$a_i$ & output weight \\
%$g_{\mu}$ & pattern weight \\
%$f^i$ & output error function for output $i$ \\
%$y_{i\mu}$ & prediction for output $i$, pattern $\mu$ \\
%$t_{i\mu}$ & target for output $i$, pattern $\mu$ \\
%$\Delta_{i\mu}$ & parameter controlling over which outputs for
%which patterns the summation is done \\
%\end{tabular}

Instead of using just one network, a more robust estimator can be
obtained by training an ensemble of networks. To create the ensemble,
the networks can be trained on different subdivisions of the input
data set into a training and a validation set. This subdivision can be
done in several ways, of which this toolbox supports
\emph{bootstrapping} and \emph{leaving half out}.

Each network in the ensemble gives an estimate of the regression. By
combining the networks, a more robust estimate is obtained. The
networks can be combined in several ways. This toolbox provides a
technique called \emph{balancing}, which computes a weighted average
of the networks in the ensemble. Standard bagging, i.e.\ equal
averaging, is also available. After balancing, you can compute
\emph{confidence} and \emph{prediction intervals}, which give you an
idea about the accuracy of the estimates. Using \emph{backward
elimination}, the relevance of (groups of) inputs can be determined.

%---------------------------------------------------------------------
%---------------------------------------------------------------------
\chapter{Installation}
\label{install}
%---------------------------------------------------------------------

\section{Requirements}

The toolbox can be installed on Unix and Windows.
To use it you must have \matlab\ installed (tested on version 5.3.1).
\emph{Note:} For the graphical user interface (GUI) a stand-alone
version is available, which does not require \matlab.

\section{Unpacking}

This toolbox is distributed as a zip file, named
\file{netpack-toolbox-1.1.zip}, and is installed by unpacking it. To
unpack, follow the instructions below.

\subsection{Unix}

The toolbox can be installed in any directory you want. We advise you
to install it in your home directory. To install it in the directory
\file{/home/user/}, just unzip the file \file{netpack-toolbox-1.1.zip}:
\begin{example}
> unzip -d /home/user netpack-toolbox-1.1.zip
\end{example}
This will create a directory \file{/home/user/netpack-toolbox-1.1},
which contains the toolbox and documentation. From now on, we will
refer to this directory as \netpackroot.

\subsection{Windows}

Use a standard unzip utility to extract all files from the file
\file{netpack-toolbox-1.1.zip}. You can use any folder you like to
extract the files to. All files in the toolbox are installed in a
subfolder called \file{netpack-toolbox-1.1}. So, for example, when you
install the toolbox in \file{C:}, all files will be in
\file{C:$\backslash$netpack-toolbox-1.1}.
From now on, we will refer to this subfolder as \netpackroot.

%---------------------------------------------------------------------
\chapter[Feedforward neural networks]{Feedforward neural networks for
regression and classification tasks}
\label{nn}
%---------------------------------------------------------------------
%---------------------------------------------------------------------

\section{Regression tasks}

In a regression task, we try to estimate an underlying mathematical
function between input variables $\vx$ and output variables $\vt$,
based on a finite number of data points possibly corrupted by noise
\cite{bis95c}. We are given a data set of $\MU$ pairs $\{ \vxmu, \vtmu \}$
of inputs and output targets (also known as patterns), which are
assumed to be generated according to
\begin{equation}
\vtmu = \vf(\vxmu) + \vxi(\vxmu),
\end{equation}
where $\vxi(\vx)$ denotes noise with zero mean. The regression task is
to find an estimator $\vfhat(\vx)$ of the regression $\vf(\vx)$.

\section{Classification tasks}

In a classification task, the target is restricted to
$t_{\mu} \in \{-1,1\}$. The targets are assumed to be generated from a
probability distribution given by
\begin{equation}
p(t_{\mu} | \vx) = \frac{1}{1 + e^{- t_{\mu} f(\vx)}}.
\end{equation}
The classification task is to find an estimator $\vfhat(\vx)$ of the
classifier $\vf(\vx)$ \cite{bis95c}.

\section{Outputs and cost functions}

A feedforward neural network can be understood as a function producing
some output $\vy(\vx, \vw)$ given some input $\vx$ and network
parameters $\vw$. With the network parameters $\vw$ fixed, we can
interpret the output $\vy(\vx)$ of the network as an estimator of the
regression or classifier $\vf(\vx)$.

Training a network means adjusting the network parameters $\vw$ in
such a way that the network output is a good estimator. This is done
by minimizing a cost function $E(\vecay, \vecat)$ (also known as a
performance function) with respect to $\vw$.
This cost function depends on all network outputs
$\vecay = \{\vy(\vxa{1}),\ldots,\vy(\vxa{\MU})\}$ for the given
inputs, and all target outputs $\vecat = \{\vta{1},\ldots,\vta{\MU}\}$.
The cost function must have a global minimum for $\vecay = \vecat$. It
is a measure of the error of the network outputs. Both regression and
classification tasks can be implemented in this way, each with a
particular cost function.

\section{Network architecture}

The functional form of $\vy(\vx, \vw)$ depends on the architecture of
the network. A feedforward network consists of a number of layers,
each with a number of units, where the value of each unit depends on
the values of the units in the previous layer. The inputs $\vx$ form
the zeroth layer. For each subsequent layer, the value $v^l_i$ of the
$i$th unit in the $l$th layer is computed from
\begin{equation}
v^l_i = g_l\Bigl(\sum_j w^l_{ij} v^{l-1}_j + b^l_i\Bigr),
\end{equation}
where $g_l$ is a transfer function. All the $w^l_{ij}$ and $b^l_i$
together form the network parameters $\vw$ that need to be optimized.
For the input and output layers, we have
\begin{equation}
v^0_i = x_i
\end{equation}
\begin{equation}
y_i = v^L_i
\end{equation}
where $L$ is the total number of layers in the network. Thus the
functional form of $\vy$ is specified by the number of layers, the
number of units in each layer and the transfer function for each
layer.

\section{Training algorithms}

Finding the (global) minimum of the cost function is a difficult task.
In training multi-layered feedforward networks, one tries to reach a
minimum by iteratively adjusting the network parameters $\vw$. In
so-called back-propagation algorithms \cite{pdp86}, $\vw$ changes in
each iteration depending on the gradient of the cost function. This
gradient is determined using the back-propagation technique, which
involves performing computations backwards through the network.
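One iteration of such a gradient-based update can be sketched in a few lines of Python. This is an illustration only, not toolbox code (the toolbox itself is Matlab): a finite-difference gradient stands in for back-propagation, and the "network" is reduced to a single linear unit $y = wx + b$ with a squared-error cost.

```python
# Illustrative sketch of gradient descent on a squared-error cost,
# using a finite-difference gradient instead of back-propagation.

def cost(w, b, data):
    # squared-error cost summed over all patterns (x, t)
    return sum((w * x + b - t) ** 2 for x, t in data)

def gd_step(w, b, data, eta=0.1, h=1e-6):
    # approximate dE/dw and dE/db by central differences,
    # then step in the direction of the negative gradient
    dE_dw = (cost(w + h, b, data) - cost(w - h, b, data)) / (2 * h)
    dE_db = (cost(w, b + h, data) - cost(w, b - h, data)) / (2 * h)
    return w - eta * dE_dw, b - eta * dE_db

data = [(0.0, 1.0), (1.0, 3.0)]   # targets generated by t = 2x + 1
w, b = 0.0, 0.0
for _ in range(200):
    w, b = gd_step(w, b, data)
```

After enough iterations the parameters approach the minimum of the cost, here $w = 2$, $b = 1$; the algorithms discussed below differ only in how this update is computed, not in this basic iterate-until-converged structure.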
Many variations of back-propagation algorithms exist, which vary in
the way they use the gradient to compute the adjustment of the network
parameters. The simplest algorithm, \emph{gradient descent}, updates
the parameters in the direction in which the cost function decreases
most rapidly, i.e.\ the negative of the gradient. Although the
decrease of the cost function is largest in this direction, this
algorithm does not necessarily produce the fastest convergence to a
minimum. Other algorithms, which also use information about previous
adjustments or second order derivatives, generally give faster
convergence. Often-used algorithms of this kind are \emph{conjugate
gradient} algorithms and the \emph{Levenberg-Marquardt} algorithm
\cite{lue84}.

\section{Early stopping}

When you train a network on a data set, it will try to fit the network
output $\vy$ as well as possible to the target outputs $\vtmu$. As a
consequence, the network will get biased on
