Faithful representations and topographic maps : from distortion- to information-based self-organization / Marc M. van Hulle.

Bibliographic Details
Online Access: Full Text (via EBSCO)
Main Author: Van Hulle, Marc M.
Format: eBook
Language: English
Published: New York, N.Y. : Wiley, ©2000.
Series: Adaptive and learning systems for signal processing, communications, and control.
Table of Contents:
  • 1 Topographic Maps in Sensory Cortices 1
  • 1.2 Role of Topographic Maps 2
  • 1.3 Topographic Map Development 6
  • 1.4 Self-Organization 7
  • 2 Topographic Map Models and Algorithms 9
  • 2.2 Gradient-based Learning 11
  • 2.3 Competitive Learning 13
  • 2.3.1 Willshaw and von der Malsburg Model 13
  • 2.3.2 Amari Model 15
  • 2.3.3 Kohonen's Self-Organizing Map 15
  • 2.4 Basic Properties of SOM 20
  • 2.4.1 Topographic Ordering 20
  • 2.4.2 Weight Convergence, Energy Function 23
  • 2.4.3 Role Played by Neighborhood Function 24
  • 2.5 Biological Interpretation of SOM 28
  • 2.5.1 Formalizing Laterally Connected Networks 28
  • 2.5.2 Computational Shortcut 30
  • 2.5.3 Physiological Interpretation 31
  • 2.6 Extensions of SOM 34
  • 2.6.1 Different Matching and Optimization Criteria 34
  • 2.6.2 Different Neighborhood Definitions 35
  • 2.6.3 Feature Maps 36
  • 2.7 Other Types of Topographic Map Algorithms 36
  • 2.7.1 Durbin and Willshaw Model 36
  • 2.7.2 Van Velzen Model 36
  • 2.7.3 Maximum Local Correlation Model 37
  • 2.7.4 Information-Preservation Model 38
  • 2.7.5 Generative Topographic Map 38
  • 3 SOM Data-Modeling Properties and Statistical Applications 41
  • 3.2 Vector Quantization and Neighborhood Function 43
  • 3.2.2 Quantizer Optimality 44
  • 3.2.3 Quantizer Design 48
  • 3.2.4 Standard UCL and SOM 49
  • 3.2.5 SOM and Phase Transitions 53
  • 3.2.6 SOM and Numerical Integration 53
  • 3.3 Non-Parametric Regression and Topographic Ordering 56
  • 3.3.1 Principal Axes and Principal Curves 56
  • 3.3.2 Tangled Lattices and Monitoring 61
  • 3.3.3 Effective Dimensionality 67
  • 3.3.4 Discrete Input/Output Mapping and Regression 68
  • 3.3.5 Continuous Input/Output Mapping 71
  • 3.4 Non-Parametric Density Estimation and Magnification Factor 72
  • 3.4.1 Magnification Factor 72
  • 3.4.2 Density Estimation 74
  • 3.4.3 Gray Level Clustering 75
  • 4 Equiprobabilistic Topographic Maps 77
  • 4.1.1 Avoiding Dead Units 78
  • 4.1.2 Equiprobabilistic Map Formation 79
  • 4.2 Distortion-based Learning 82
  • 4.2.1 Activation Monitoring Rules 82
  • 4.2.2 Local Distortion-Monitoring Rules 89
  • 4.2.3 Combined Rules 89
  • 4.2.5 Constructive Algorithms 93
  • 4.2.6 Mean Absolute Error Minimization Rules 94
  • 4.3 Information-based Learning 96
  • 4.3.1 Mutual Information Maximization 96
  • 4.3.2 Redundancy Minimization and Sparse Coding 98
  • 4.3.3 Entropy Maximization 100
  • 4.4 Maximum Entropy Learning 101
  • 4.4.1 Equiprobable Quantization 102
  • 4.4.2 Equiprobabilistic Topographic Map Formation 110
  • 4.4.3 Distinction from SOM Algorithm 115
  • 4.4.4 Maximum Entropy Learning Rule 116
  • 4.4.5 Extension with Neighborhood Function 118
  • 4.4.6 Lattice-Disentangling Dynamics 118
  • 4.5 Biological Interpretation 124
  • 4.5.1 Sensory Representation 124
  • 4.5.2 Model for Topographic Map Formation 126
  • 5 Kernel-based Equiprobabilistic Topographic Maps 129
  • 5.1.1 Topographic Subspace Maps 129
  • 5.1.2 Topographic Feature Maps 130
  • 5.1.3 Outlook 131
  • 5.2 Kernel-based Maximum Entropy Learning 132
  • 5.2.1 kMER 134
  • 5.2.2 Convergence 135
  • 5.2.3 Equiprobable Quantization 135
  • 5.2.4 Relation with MAE Minimization 135
  • 5.2.5 Relation with Fuzzy Clustering 136
  • 5.2.6 Relation with Maximum Local Correlation Model 136
  • 5.2.7 Relation with Generative Topographic Map 137
  • 5.2.8 Optimized Algorithm 138
  • 5.2.9 Choice of Parameters 140
  • 5.3 Lattice-Disentangling Dynamics 141
  • 5.3.1 Monitoring 143
  • 5.4 Non-Parametric Density Estimation 149
  • 5.4.1 Fixed Kernel Estimate 149
  • 5.4.2 Variable Kernel Estimate 150
  • 5.4.3 Automatic Choice of Smoothing Parameter 152
  • 5.4.4 Simulations 156
  • 5.4.5 Alternative Interpretation 158
  • 5.5 Density-based Clustering 162
  • 5.5.1 Clustering and Competitive Learning 163
  • 5.5.2 Density-based Clustering with SKIZ 164
  • 5.5.3 Density-based Clustering with Hill-climbing 172
  • 5.5.4 Bayesian Classification 178
  • 5.6 Blind Source Separation 182
  • 5.6.1 Sub-Gaussian BSS 184
  • 5.6.2 Super-Gaussian BSS 186
  • 5.6.3 Algorithm 187
  • 5.6.4 Results 190
  • 5.6.5 BSS from Fewer Mixtures 190
  • 5.6.6 Distinction from Branch Networks 193
  • 5.7 Topographic Feature Maps 193
  • 5.7.1 Feature Map kMER 194
  • 5.7.2 Speech Encoding Example 196
  • 5.7.3 Image Encoding Example 199
  • 5.7.4 Comparison with ASSOM 200
  • 5.8 Adaptive Subspace Maps 203
  • 5.8.1 Adaptive Signal Transformation 204
  • 5.8.2 Subspace Method 205
  • 5.8.3 Optimally Integrated Adaptive Learning 206
  • 5.8.4 Subspace kMER 208
  • 5.9 Music Application 216
  • 5.9.1 Music Signal Generation 217
  • 5.9.2 System Overview 217
  • 5.9.3 Detailed Description and Simulations 220
  • 5.9.4 System Performance 227