Here you will find the latest news and announcements relating to this course.

Final Exam

Wednesday, August 1

(5:00 – 7:00 pm)

Closed book, closed notes

You are allowed to bring a calculator and an index card (standard size) with your notes.

No student will be allowed to leave the classroom for the duration of the two-hour exam, so plan accordingly.

Cell phones, smart watches, and all other devices must be powered off and placed in your backpack. You will have no access to your backpack once the exam starts. No food or beverages; no water or other bottles.


Reduced-Size MNIST Data

 

The following links point to documents that you will need in Chapter 1:
Click here to download the LTG Synthesis Document (.pdf)

Click here to download Matlab example for LTG synthesis (.doc)

Click here to download the derivation for the upper bound on C(m,n) (.pdf)

 

Minimal PTG(r) Realization of Switching Functions (Example): P1 P2 P3 P4 P5

 

The following are video clips (.avi format) that animate some important neural network results:

 

Probability of linear separability of m points in general position in R^n (Figure 1.3.3, p. 19)

 

Polynomial regression (Figure 5.2.2, p. 222)

 

Multilayer Perceptron function approximation (Figure 5.2.5, p. 228)


Midterm Exam

Wednesday, June 13

Closed book, closed notes

You are allowed to bring a calculator and an index card (standard size) with your notes.

No student will be allowed to leave the classroom for the duration of the two-hour exam, so plan accordingly.

Cell phones, smart watches, and all other devices must be powered off and placed in your backpack. You will have no access to your backpack once the exam starts. No food or beverages; no water or other bottles.

Bonus Project (Extra Credit: 20 points toward the Midterm Exam)

You are asked to design a linear classifier with 10 linear units followed by a max selector to decide on the class of handwritten digits (0-9) from the MNIST data set.

 

The MNIST training data (60,000 samples of handwritten digits 0-9) and its associated labels can be downloaded here.

 

Save it in the Matlab directory and add it to your path. Then, at the Matlab prompt, type load MNIST_data to import the data into the workspace.

This will load the 784x60,000 images_train array and its corresponding 60,000x1 array of labels. The images are 28x28 each, but they are scanned and represented as 784-dimensional columns in the images_train array.
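
For example, a minimal sketch of loading the data and inspecting one sample (images_train is as described above; the label array name labels and the sample index k are assumptions for illustration):

    % Load the MNIST data (assumes MNIST_data.mat is on the Matlab path)
    load MNIST_data                              % defines images_train (784x60,000) and the label array
    % Inspect one sample: reshape the k-th 784-dimensional column back to a 28x28 image
    k = 1;                                       % sample index (illustration only)
    digit = reshape(images_train(:,k), 28, 28);
    imagesc(digit); colormap(gray); axis image;
    title(sprintf('Training sample %d, label = %d', k, labels(k)));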

 

The weights of the classifier should be saved in a matrix W. This is a matrix of size 10x785, where the first row is the weight vector for the "zero" digit, the second row is the weight vector for the "one" digit, etc. Each weight vector occupies one row of the weight matrix, organized as [w0  w1  w2  w3  … w784], where w0 is the weight associated with the input bias of 1.
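
As an illustration of this convention (a sketch only; the function name classify_digit is hypothetical), the retrieval step for a single 784-dimensional image column x would be:

    function class = classify_digit(W, x)
    % W : 10x785 weight matrix, one row per digit (row 1 = digit 0), bias weight w0 first
    % x : 784x1 image column taken from images_train
    scores = W * [1; x];      % outputs of the 10 linear units (input augmented with the bias of 1)
    [~, idx] = max(scores);   % max selector picks the winning unit
    class = idx - 1;          % rows 1..10 correspond to digits 0..9
    end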

 

The program/script is to be written from scratch, preferably in Matlab (no NN toolbox is to be used). After training the classifier, use it in retrieval mode to classify the training set and report the classification error in percent. Turn in your program, weight matrix (see below for the format), and report via email to your instructor by the deadline.
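
The choice of training rule is left to you. Purely as an illustration of the retrieval and error-reporting step (with one possible baseline least-squares fit, not the required training method), a sketch using images_train, the label array, and the W convention above might look like:

    % One possible baseline: least-squares fit of W (a sketch, not the required training method)
    N = size(images_train, 2);
    X = [ones(1, N); images_train];               % 785xN augmented inputs (bias of 1 in row 1)
    T = zeros(10, N);
    T(sub2ind(size(T), labels' + 1, 1:N)) = 1;    % one-hot targets, row d+1 for digit d
    W = (T * X') / (X * X');                      % 10x785 weight matrix

    % Retrieval mode on the training set: classify and report the error in percent
    [~, idx] = max(W * X, [], 1);                 % max selector over the 10 linear units
    predicted = (idx - 1)';                       % back to digit labels 0..9
    error_percent = 100 * mean(predicted ~= labels);
    fprintf('Training-set classification error: %.2f%%\n', error_percent);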

 

The weight matrix should be saved in a .mat file and the file emailed to your professor. Use the instruction save('your_last_name.mat','W') to save the matrix in Matlab-compatible data format.

 

I expect you to see me during office hours (or by appointment) so I can answer your questions regarding this midterm exam bonus project.

The due date is (before) 5:00 pm on June 13 (the day of the midterm exam).