Neural Networks: Unsupervised Learning
13 important questions on Neural Networks: Unsupervised Learning
What is the third principle of self-organisation?
What is the fourth principle of self-organisation?
What does the Generalised Hebbian Learning algorithm do?
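The answer is not spelled out above; as background (a standard statement of the rule, not quoted from this summary), the Generalised Hebbian Algorithm (Sanger's rule) trains a single layer of linear neurons so that their weight vectors converge to the leading principal components of the input. A sketch of the update for neuron j and input component i:

```latex
y_j(n) = \sum_{i=1}^{m} w_{ji}(n)\, x_i(n), \qquad
\Delta w_{ji}(n) = \eta \left[\, y_j(n)\, x_i(n) \;-\; y_j(n) \sum_{k=1}^{j} w_{ki}(n)\, y_k(n) \right]
```

The first term is plain Hebbian learning; the subtractive term removes the components already captured by neurons 1 to j, so the weight vectors converge to the leading eigenvectors of the input correlation matrix, ordered by decreasing eigenvalue.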
What is the goal of self-organising maps?
What are features of self-organising maps?
- The ordering of neurons in the lattice acts as a coordinate system for different input features.
- A topographic map is created in which the spatial locations of the neurons in the lattice are indicative of the intrinsic statistical features contained in the input patterns.
What are the three processes in which the self-organising maps are achieved?
- Competition
- Cooperation
- Synaptic adaptation
What does the process competition for creating self-organising maps consist of?
What does the process cooperation for creating self-organising maps consist of?
- The definition of the neighbourhood stays close to neurobiological notions of lateral interaction between excited neurons.
- hj,i is the topological neighbourhood centred on the winning neuron i, with j indexing a cooperating (excited) neuron.
- dj,i is the lateral distance between the winning neuron i and the excited neuron j.
What is the activation spread?
The topological neighbourhood can shrink over time.
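A common choice, consistent with the sigma0 and T1 parameters mentioned later in this summary, is a Gaussian neighbourhood whose width sigma(n) decays exponentially with the iteration number n (a sketch of the standard form):

```latex
h_{j,i(x)}(n) = \exp\!\left( -\frac{d_{j,i}^{2}}{2\,\sigma^{2}(n)} \right), \qquad
\sigma(n) = \sigma_{0}\, \exp\!\left( -\frac{n}{T_{1}} \right)
```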
What does the process synaptic adaptation for creating self-organising maps consist of?
The weights are moved in the direction of the input vector. Neurons that are close to each other in the lattice tend to have similar synaptic weights. The learning rate eta can be made dynamic, decaying over time.
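Written out (the standard SOM update, given here as a sketch rather than quoted from the summary), the adaptation of the weight vector of neuron j at iteration n is:

```latex
\mathbf{w}_{j}(n+1) = \mathbf{w}_{j}(n) + \eta(n)\, h_{j,i(x)}(n)\, \bigl( \mathbf{x} - \mathbf{w}_{j}(n) \bigr)
```

Because h is largest for neurons close to the winner in the lattice, nearby neurons receive similar updates, which is why they end up with similar synaptic weights.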
What does the algorithm for self-organising maps overall look like?
1. Initialisation. Choose random values for the initial weight vectors wj(0), different for every neuron j.
2. Sampling. Draw a sample x from the input space with a certain probability. The vector x represents the activation pattern that is applied to the lattice; its dimension is equal to m.
3. Similarity Matching. Find the best-matching (winning) neuron i(x) at time step n by using the minimum-distance Euclidean criterion.
4. Updating. Adjust the synaptic weight vectors of all neurons by using the update formula.
5. Continuation. Continue with step 2 until no noticeable changes in the feature map are observed.
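A minimal, self-contained sketch of this loop in Python/NumPy. The lattice size, the exponential schedules, and the function name are illustrative assumptions rather than values taken from the summary:

```python
import numpy as np

def train_som(data, grid_h=10, grid_w=10, n_iter=1000,
              eta0=0.1, sigma0=5.0, t1=None, t2=1000, seed=0):
    """Sketch of SOM training: competition, cooperation, synaptic adaptation."""
    rng = np.random.default_rng(seed)
    m = data.shape[1]                      # input dimension
    if t1 is None:
        t1 = n_iter / np.log(sigma0)       # time constant for the neighbourhood width
    # 1. Initialisation: random weight vectors, one per lattice neuron
    weights = rng.uniform(size=(grid_h * grid_w, m))
    # Lattice coordinates of every neuron, used for the lateral distances d_{j,i}
    coords = np.array([(r, c) for r in range(grid_h) for c in range(grid_w)], dtype=float)

    for n in range(n_iter):
        # 2. Sampling: draw one input pattern x
        x = data[rng.integers(len(data))]
        # 3. Similarity matching (competition): winner by minimum Euclidean distance
        winner = np.argmin(np.linalg.norm(weights - x, axis=1))
        # Cooperation: Gaussian topological neighbourhood that shrinks over time
        sigma = sigma0 * np.exp(-n / t1)
        d2 = np.sum((coords - coords[winner]) ** 2, axis=1)
        h = np.exp(-d2 / (2.0 * sigma ** 2))
        # 4. Updating (synaptic adaptation): move weights towards x
        eta = eta0 * np.exp(-n / t2)
        weights += eta * h[:, None] * (x - weights)
    return weights.reshape(grid_h, grid_w, m)

# Example usage on random 3-D data (e.g. RGB colours)
if __name__ == "__main__":
    data = np.random.default_rng(1).uniform(size=(500, 3))
    som = train_som(data)
    print(som.shape)  # (10, 10, 3)
```

For the convergence phase described next, the same loop can be continued with eta held near 0.01 and sigma small enough that only the nearest neighbours of the winner are still updated.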
What are two phases of self-organising maps?
- Self-organising or ordering phase: roughly 1000 iterations; desirable values are eta0 = 0.1 and T2 = 1000; set sigma0 equal to the radius of the lattice and T1 = 1000 / log(sigma0).
- Convergence phase: the number of iterations is about 500 times the number of neurons; eta is fixed at about 0.01, and the neighbourhood function should contain only the nearest neighbours of the winning neuron, eventually shrinking to one or zero neighbours.
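These values correspond to the usual exponential decay schedules for the learning rate and the neighbourhood width (a sketch; only eta0, T2, sigma0 and T1 are taken from the summary):

```latex
\eta(n) = \eta_{0} \exp\!\left( -\frac{n}{T_{2}} \right), \quad \eta_{0} = 0.1,\ T_{2} = 1000;
\qquad
\sigma(n) = \sigma_{0} \exp\!\left( -\frac{n}{T_{1}} \right), \quad T_{1} = \frac{1000}{\log \sigma_{0}}
```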
What is a contextual map?