Neural Networks: A Comprehensive Foundation
Prentice Hall, 1999 - 842 pages

Introduction; Learning processes; Single layer perceptrons; Multilayer perceptrons; Radial-basis function networks; Support vector machines; Committee machines; Principal components analysis; Self-organizing maps; Information-theoretic models; Stochastic machines and their approximates rooted in statistical mechanics; Neurodynamic programming; Temporal processing using feedforward networks; Neurodynamics; Dynamically driven recurrent networks; Epilogue; Bibliography; Index.
Contents
Introduction    1
Learning Processes    50
Committee Machines    351
Copyright
Common terms and phrases
activation function, algorithm, applied, approximation, back-propagation, back-propagation algorithm, back-propagation learning, bias, Boltzmann machine, Chapter, classification, computation, convergence, cost function, defined, denote, derivative, described in Eq., desired response, dimensionality, distribution, eigenvalue, entropy, equation, error signal, error surface, estimate, example, feature map, FIGURE, follows, function f(x), Gaussian, gradient, Green's function, Hebbian, Hessian matrix, hidden layer, hidden neurons, induced local field, input layer, input patterns, input space, input vector, input-output mapping, iteration, learning algorithm, learning machine, learning process, learning-rate parameter, linear, LMS algorithm, m₁, memory, method, minimization, multilayer perceptron, neural network, neuron, nodes, nonlinear, operator, optimal, output layer, output neuron, performance, probability density function, problem, radial-basis functions, random variable, RBF network, regression, respect, result, risk functional, Section, self-organizing, sigmoid, sigmoid function, signal-flow graph, statistical, stochastic, supervised learning, support vector machine, theorem, training data, training sample, training set, VC dimension, weight vector, X₁, zero