CSE/NB 528 Homework 3: Neural Dynamics and Dynamic Networks

Due Date: Friday, May 20, 2011 midnight

 

Submission Procedure:
Create a Zip file called "528-hw3-lastname-firstname" containing the following:
(1) Document containing your answers to any questions asked in each exercise, 
     as well as any figures, plots, or graphs supporting your answers,
(2) Your Matlab program files,
(3) Any other supporting material needed to understand/run your solutions in Matlab.
 

Upload your Zip file to this dropbox by 11:59pm on Friday, May 20, 2011.
Late submission policy is here.
 

 

 

1. Simplified Neuron Models (40 points): Answer the questions here about the FitzHugh-Nagumo neuron model discussed by Adrienne in class. More information about the FitzHugh-Nagumo model can be found here and here.
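(As a starting point for your simulations, here is a minimal sketch of Euler integration of the standard FitzHugh-Nagumo equations. The parameter values a, b, eps, and I below are illustrative defaults, not values taken from this exercise; use the ones specified in the questions.)

```matlab
% Sketch: Euler integration of the FitzHugh-Nagumo model
%   dv/dt = v - v^3/3 - w + I,  dw/dt = eps*(v + a - b*w)
% Parameters are illustrative, not this exercise's values.
a = 0.7; b = 0.8; eps = 0.08; I = 0.5;   % model parameters, input current
dt = 0.1; T = 200; nsteps = round(T/dt); % time step and duration
v = -1; w = 1;                           % initial state
vtrace = zeros(1, nsteps);
for t = 1:nsteps
    dv = v - v^3/3 - w + I;              % fast voltage-like variable
    dw = eps*(v + a - b*w);              % slow recovery variable
    v = v + dt*dv;
    w = w + dt*dw;
    vtrace(t) = v;
end
plot((1:nsteps)*dt, vtrace); xlabel('t'); ylabel('v');
```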
2. Nonlinear Recurrent Networks (60 points): Write Matlab code and answer the questions in Exercise 4 from Chapter 7 in the textbook, as described in the file c7.pdf.

Create figures reproducing Figures 7.18 and 7.19 in the textbook using your code, and include additional example figures to illustrate the effects of varying the value of tauI.

(The following files implement a nonlinear recurrent network in Matlab: c7p5.m and c7p5sub.m. These files are for Exercise 5 in c7.pdf, but you can modify them and use them for Exercise 4. For an analytical derivation of the stability matrix, see Mathematical Appendix Section A.3 in the text.)
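(If you prefer to build your own simulation rather than modify c7p5.m, the sketch below shows the general structure: Euler integration of a two-population excitatory/inhibitory firing-rate network. The weights, thresholds, and time constants here are illustrative placeholders, not the textbook's values; take yours from Exercise 4 in c7.pdf, and vary tauI to see its effect on stability and oscillations.)

```matlab
% Sketch: two-population (E/I) firing-rate network, Euler integration.
% All parameter values are illustrative placeholders -- use the ones
% given in Exercise 4 of c7.pdf.
mEE = 1.25; mEI = -1; mIE = 1; mII = 0;  % connection weights
gamE = -10; gamI = 10;                   % thresholds (Hz)
tauE = 10; tauI = 30;                    % time constants (ms); vary tauI
dt = 0.1; nsteps = 10000;
vE = 0; vI = 0; vEtrace = zeros(1, nsteps);
for t = 1:nsteps
    fE = max(mEE*vE + mEI*vI - gamE, 0); % rectified input to E population
    fI = max(mIE*vE + mII*vI - gamI, 0); % rectified input to I population
    vE = vE + dt*(-vE + fE)/tauE;
    vI = vI + dt*(-vI + fI)/tauI;
    vEtrace(t) = vE;
end
plot((1:nsteps)*dt, vEtrace); xlabel('t (ms)'); ylabel('v_E (Hz)');
```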
 
 

Extra Credit Problem (Hopfield Networks, 20 points)

The Hopfield network is a famous type of recurrent network with the property that, starting from any initial state, it always converges to a local minimum of an “energy” function (or Lyapunov function) that can be defined for the network. The local minima correspond to particular patterns stored in the network by choosing appropriate synaptic weights. Thus, if the initial input is an incomplete pattern, the network will converge to the closest stored pattern, giving rise to the useful property of pattern completion. Read about Hopfield networks in this Scholarpedia article by Hopfield.

In this problem, we will consider Hopfield networks with binary neurons (each neuron's output is 0 or 1).

We assume the network has symmetric weights and no self-connections.

(a)  The Lyapunov function of our network is given in the Scholarpedia article in the section “Binary neurons.” Suppose we pick a neuron at random and use the update rule given in the article, i.e., the neuron’s output becomes 1 if its overall input is above the threshold of 0, and becomes 0 otherwise. Show that this update rule necessarily decreases the value of the Lyapunov function (or leaves it unchanged).

(b) Suppose the update procedure is repeated, picking a neuron at random at each time step and setting its output according to the update rule. Show that the network will eventually converge to a stable state. (Hint: Use your result from (a) and ask yourself whether the value of the Lyapunov function can decrease forever.)
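(You are asked for a proof above, but it can be instructive to check the property numerically first. The sketch below builds a small random Hopfield network with 0/1 neurons, symmetric weights, no self-connections, and threshold 0, then applies random asynchronous updates while tracking the energy E = -(1/2) s'Ws; per part (a), E should never increase. The network size and number of steps are arbitrary choices for illustration.)

```matlab
% Sketch: numerical check that asynchronous updates never raise the energy
% of a binary (0/1) Hopfield network with symmetric weights, no
% self-connections, and threshold 0. Sizes here are arbitrary.
rng(0); N = 20;
W = randn(N); W = (W + W')/2;            % symmetric random weights
W(1:N+1:end) = 0;                        % zero the diagonal (no self-connections)
s = double(rand(N,1) > 0.5);             % random initial 0/1 state
E = -0.5 * (s' * W * s);                 % Lyapunov ("energy") function
for step = 1:2000
    i = randi(N);                        % pick a neuron at random
    s(i) = double(W(i,:) * s > 0);       % update rule: output 1 iff input > 0
    Enew = -0.5 * (s' * W * s);
    % Per part (a), Enew <= E should hold at every step.
    E = Enew;
end
```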