2008-01-17
"An Introduction to Neural Networks"

CONTENTS

Preface

I FUNDAMENTALS
1 Introduction
2 Fundamentals
 2.1 A framework for distributed representation
  2.1.1 Processing units
  2.1.2 Connections between units
  2.1.3 Activation and output rules
 2.2 Network topologies
 2.3 Training of artificial neural networks
  2.3.1 Paradigms of learning
  2.3.2 Modifying patterns of connectivity
 2.4 Notation and terminology
  2.4.1 Notation
  2.4.2 Terminology

II THEORY
3 Perceptron and Adaline
 3.1 Networks with threshold activation functions
 3.2 Perceptron learning rule and convergence theorem
  3.2.1 Example of the Perceptron learning rule
  3.2.2 Convergence theorem
  3.2.3 The original Perceptron
 3.3 The adaptive linear element (Adaline)
 3.4 Networks with linear activation functions: the delta rule
 3.5 Exclusive-OR problem
 3.6 Multi-layer perceptrons can do everything
 3.7 Conclusions
4 Back-Propagation
 4.1 Multi-layer feed-forward networks
 4.2 The generalised delta rule
  4.2.1 Understanding back-propagation
 4.3 Working with back-propagation
 4.4 An example
 4.5 Other activation functions
 4.6 Deficiencies of back-propagation
 4.7 Advanced algorithms
 4.8 How good are multi-layer feed-forward networks?
  4.8.1 The effect of the number of learning samples
  4.8.2 The effect of the number of hidden units
 4.9 Applications
5 Recurrent Networks
 5.1 The generalised delta-rule in recurrent networks
  5.1.1 The Jordan network
  5.1.2 The Elman network
  5.1.3 Back-propagation in fully recurrent networks
 5.2 The Hopfield network
  5.2.1 Description
  5.2.2 Hopfield network as associative memory
  5.2.3 Neurons with graded response
 5.3 Boltzmann machines
6 Self-Organising Networks
 6.1 Competitive learning
  6.1.1 Clustering
  6.1.2 Vector quantisation
 6.2 Kohonen network
 6.3 Principal component networks
  6.3.1 Introduction
  6.3.2 Normalised Hebbian rule
  6.3.3 Principal component extractor
  6.3.4 More eigenvectors
 6.4 Adaptive resonance theory
  6.4.1 Background: Adaptive resonance theory
  6.4.2 ART1: The simplified neural network model
  6.4.3 ART1: The original model
7 Reinforcement Learning
 7.1 The critic
 7.2 The controller network
 7.3 Barto's approach: the ASE-ACE combination
  7.3.1 Associative search
  7.3.2 Adaptive critic
  7.3.3 The cart-pole system
 7.4 Reinforcement learning versus optimal control

III APPLICATIONS
8 Robot Control
 8.1 End-effector positioning
  8.1.1 Camera-robot coordination is function approximation
 8.2 Robot arm dynamics
 8.3 Mobile robots
  8.3.1 Model based navigation
  8.3.2 Sensor based control
9 Vision
 9.1 Introduction
 9.2 Feed-forward types of networks
 9.3 Self-organising networks for image compression
  9.3.1 Back-propagation
  9.3.2 Linear networks
  9.3.3 Principal components as features
 9.4 The cognitron and neocognitron
  9.4.1 Description of the cells
  9.4.2 Structure of the cognitron
  9.4.3 Simulation results
 9.5 Relaxation types of networks
  9.5.1 Depth from stereo
  9.5.2 Image restoration and image segmentation
  9.5.3 Silicon retina

IV IMPLEMENTATIONS
10 General Purpose Hardware
 10.1 The Connection Machine
  10.1.1 Architecture
  10.1.2 Applicability to neural networks
 10.2 Systolic arrays
11 Dedicated Neuro-Hardware
 11.1 General issues
  11.1.1 Connectivity constraints
  11.1.2 Analogue vs. digital
  11.1.3 Optics
  11.1.4 Learning vs. non-learning
 11.2 Implementation examples
  11.2.1 Carver Mead's silicon retina
  11.2.2 LEP's LNeuro chip

References

Index

http://down8931.pinggu.org/UploadFile_20082009/2008-10/200810120505383402.pdf









All replies
2008-4-11 15:33:00
Post the actual book, then.
2008-7-6 00:21:00

What a scam.

I don't have much money to begin with.

2009-11-9 22:18:16
The OP isn't playing fair. There's no book at all, just a link. What a waste of time.
2010-3-6 21:00:50
This is too much material. Better to use a Chinese version instead.