Lecture: Distributed Representations & Neural Network Dynamics

What are the 3 principles of neural network processing?

1. Distributed Representations
2. Connectivity
3. Excitation/Inhibition

What are benefits of Distributed Representations?

- Robust against noise: the whole ensemble won't fire by chance
- Robust against damage: 1 damaged cell =/= bye-bye grandma
- Increased capacity: overlap --> more representations possible
- Dynamic/flexible behaviour: neural networks show competitive behavior

What is the function of long-term potentiation (LTP)?

Neurons that fire in synchrony -->
- Stronger connections
- Synapses strengthened
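
The LTP idea above ("neurons that fire in synchrony get stronger connections") can be sketched as a toy Hebbian weight update. This is a minimal sketch; the learning rate `eta`, the network size, and the activity vectors are illustrative assumptions, not values from the lecture:

```python
import numpy as np

# Toy Hebbian LTP: synapses between co-active neurons are strengthened.
eta = 0.1                          # learning rate (assumed)
pre = np.array([1.0, 0.0, 1.0])    # presynaptic firing (1 = active)
post = np.array([1.0, 1.0, 0.0])   # postsynaptic firing

W = np.zeros((3, 3))               # synaptic weights, rows = post, cols = pre
# LTP: strengthen synapse (i <- j) only where post i and pre j fire together
W += eta * np.outer(post, pre)
print(W)
```

Only the synapses where both cells were active get the `eta` boost; synapses onto silent cells stay at zero.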

What is synaptic plasticity?

Synapses can be changed (stronger/less strong)

What is long term depression?

Cell B gets input from Cell A
- Cell B fires before Cell A gives input
Reasons B fired:
- Accident
- B belongs to other representation by another stimulus

B might belong to a stimulus related to A, but not to A itself:
- LTD: decrease in synapse strength (dissociation)
- LTD function: keeps neural patterns distinct when stimuli/patterns are similar
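
The LTP/LTD contrast can be sketched as a toy spike-timing rule: if the presynaptic cell (A) fires just before the postsynaptic cell (B), the synapse strengthens; if B fired before A's input arrived (B belonged to another pattern), it weakens. The function name, `eta`, and the spike times are illustrative assumptions:

```python
# Toy spike-timing rule (illustrative, not the lecture's exact model).
def update_weight(w, t_pre, t_post, eta=0.05):
    if t_pre < t_post:   # A's input helped drive B: strengthen (LTP)
        return w + eta
    else:                # B fired before A's input: weaken (LTD)
        return w - eta

w = 0.5
w = update_weight(w, t_pre=10, t_post=12)  # LTP step
w = update_weight(w, t_pre=20, t_post=18)  # LTD step cancels it
print(w)
```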

What makes a neural representation (anatomy wise)?

Strong synaptic connections among the neurons in the representation, compared to connections with neurons outside of it

How does retrieval/remembering work (neural representations)?

There is a neural representation (strong synapses), it's just inactive

Cue activates some neurons in the representation --> Representation is activated

What is pattern completion in memory retrieval?

Part of neural ensemble activates the rest of the ensemble

What is autoassociation? What is heteroassociation?

Autoassociation: synaptic strengthening within a layer (local)
Heteroassociation: connections between neural patterns in different layers

What types of heteroassociations can there be in neural representations?

- Association with other representations
- Association with other aspects of the stimulus

These have to be coded in a different layer

How is visual processing organized (retina --> LGN --> V1)?

Retina: light --> signal --> ganglion cells
- Each ganglion cell gets input from a round patch of sensory cells (circular receptive field)
- V1 responds to stripes/edges, so how is that built up?

Ganglion cells --> LGN --> V1
- LGN cells are lined up with the retina (retinotopic)
- LGN receptive fields are still circular
- The LGN RFs feeding one V1 cell lie along a line in visual space

V1:
- Elongated RFs are constructed from the circular RFs of these LGN cells
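
This LGN-to-V1 construction can be sketched numerically: summing circular (Gaussian) LGN receptive fields whose centres lie along a vertical line yields an elongated, oriented field. The grid size, `sigma`, and centre positions are illustrative assumptions:

```python
import numpy as np

size, sigma = 21, 1.5
y, x = np.mgrid[0:size, 0:size]

def lgn_rf(cx, cy):
    """Circular (Gaussian) LGN receptive field centred at (cx, cy)."""
    return np.exp(-((x - cx) ** 2 + (y - cy) ** 2) / (2 * sigma ** 2))

# Three LGN cells aligned vertically --> one elongated V1 subfield
v1_rf = lgn_rf(10, 6) + lgn_rf(10, 10) + lgn_rf(10, 14)

# Measure the field's extent above a threshold: taller than it is wide
height = (v1_rf.max(axis=1) > 0.5).sum()
width = (v1_rf.max(axis=0) > 0.5).sum()
print(height, width)
```

The summed field responds best to a vertical stripe at x = 10, which is exactly the convergence story: circular inputs along a line build an oriented detector.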

What is convergence (connectivity)?

Compression: multiple source cells --> 1 target cell

What is point-to-point (connectivity)?

Source & target have the same number of cells (copying)
Also named: topological connectivity

What are the 2 types of divergence (connectivity)?(functions)

Source area --> multiple target areas
- Parallel processing (what/where pathways)
Source area --> 1 target area (multiple cells)
- Denser projection
- Mostly in association areas (MTL): combines input from many other areas in this one area

What is density (connectivity)?

The source area sends a lot of projections to the target area.

More projections --> more dense

What types of convergence are there (connectivity)?

Within 1 modality
- Within the visual hierarchy

Across modalities:
- Larger scale: Episodic memory, combining information from multiple systems

What is the function of convergence (connectivity)?

Many cells --> 1 cell:
Simple features --> combined into complex environmental aspects

What is topological connectivity (examples?)?

Organization of the input is clear (mostly low-level processes)

- Retinotopic maps
- Tonotopic maps
- Homunculus

Where does topological connectivity happen?

At all levels of the NS, but mostly in lower/simpler layers

What is the difference in plasticity between high levels & low levels?

More plastic in higher levels

V1 = contrast
Higher = Specific objects

What receptors cause fast inhibition?

Weird question, I know

GABA-A receptors

What is feedforward inhibition? (& function)

Source --> Target
Source --> Interneuron --> target

Function:
- Networks can handle input at variable strengths & different cue sizes
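
That variable-strength function can be sketched as divisive normalization: the interneuron's inhibition grows with total input, so the target's response stays in a usable range for weak and strong cues alike. The function name, `k`, and the input values are illustrative assumptions:

```python
import numpy as np

def target_response(inputs, k=1.0):
    drive = inputs.sum()               # excitation onto the target
    inhibition = k * drive             # interneuron tracks input strength
    return drive / (1.0 + inhibition)  # normalized (divisive) response

weak = target_response(np.array([0.2, 0.1]))   # weak cue
strong = target_response(np.array([5.0, 4.0])) # ~30x stronger cue
print(weak, strong)
```

Both responses stay bounded below 1.0 even though the raw input differs by a factor of ~30; without the feedforward inhibition term, the strong cue would drive the target 30x harder.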

What is feedback inhibition? (& Function)

Target layer --> Interneuron --> Back to target layer

Function:
- Proportional activity in target layer
- Pattern size limits
- No other patterns get activated
- Gamma oscillations
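
The "pattern size limits" function can be sketched as a k-winners-take-all toy model: inhibition fed back from the target layer rises until only the k most active cells survive. The function name, `k`, and the activity values are illustrative assumptions:

```python
import numpy as np

def feedback_inhibition(activity, k=2):
    # Inhibition level settles so only the k most active cells stay above it
    threshold = np.sort(activity)[-k]
    return np.where(activity >= threshold, activity, 0.0)

layer = np.array([0.9, 0.2, 0.7, 0.4, 0.1])
out = feedback_inhibition(layer, k=2)
print(out)
```

However many cells the input excites, the feedback loop caps the active pattern at k cells, which is how it keeps pattern size proportional and stops other patterns from co-activating.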

What does feedforward and feedback inhibition control?

Level of firing in a layer: more inhibition = smaller patterns

Level of propagation (spreading to other regions):
- Too much inhibition: spreading dies out
- Too little inhibition: runaway excitation & spreading of noise
