Programming Assignment 2
Due Tuesday, March 27 at the beginning of lecture
- Please prepare a report that is neatly written or typed.
- Please also submit printouts of the Matlab code you write, along with any tables or graphs.
- Your report must provide an analysis of each method's performance and the reasoning behind that analysis.
- Also be sure to compare/contrast the relative strengths and weaknesses of the methods, given the experimental results and what you know of each algorithm.
Download and save the data set of handwritten digit images: MNIST_data.mat, prog2.txt. The data represents a subsampling of the full MNIST data set available here.
- (70 points) Develop code for training and testing an SVM classifier with a nonlinear kernel. You are welcome to use either formulation described in the textbook, Chapter 7. You cannot use an SVM library to complete this assignment, but you may use a quadratic programming library if you like (a sketch of how the dual problem can be handed to a QP solver appears after this list). Using your implementation of the SVM classifier, compare the multi-class classification performance of two different voting schemes: (a) "one versus the rest" and (b) "one versus one." Be sure to specify your voting scheme using a method described in the book (and describe it in the report). To analyze accuracy, you will find it helpful to produce and analyze the multiclass confusion matrix, in addition to examining the overall error rate.
- (30 points) Use the same "one versus one" classifiers from the previous problem in a DAGSVM approach; a sketch of the DAG decision procedure also appears after this list. A paper describing the approach is available here. Compare its multi-class classification performance with that of the other two voting schemes.
- (10 points extra credit) A baseline implementation of the DAGSVM with 6th-degree polynomial kernels achieves 95% accuracy on the test set. See if you can do better than this baseline, using the DAGSVM approach. Here is the confusion matrix for the baseline implementation:
 84   0   0   0   0   1   2   0   1   0
  0 121   0   0   0   0   0   0   0   0
  0   0 110   0   1   0   0   0   1   0
  0   0   0 106   0   2   0   1   2   0
  0   0   0   0 102   1   0   3   1   1
  1   0   0   5   0  86   2   0   0   0
  1   0   0   1   1   1  82   0   0   0
  0   0   1   2   0   0   0  93   2   0
  0   1   1   1   0   1   1   1  78   2
  0   0   1   0   4   0   0   1   1  89
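For the SVM problem above, here is a minimal sketch (not a complete solution) of how one binary soft-margin subproblem with a polynomial kernel might be handed to a quadratic programming routine such as Matlab's quadprog. It assumes the training examples are stored one per row in X with labels y in {-1,+1}; the names trainBinarySVM, svmDecision, and polyKernel, and the parameters C and degree, are illustrative placeholders rather than anything the assignment specifies.

% Sketch: solve the dual of the soft-margin SVM with quadprog.
% (In Matlab these functions would live in their own .m files or as local functions.)
function model = trainBinarySVM(X, y, C, degree)
    K = polyKernel(X, X, degree);           % n-by-n Gram matrix
    n = size(X, 1);
    H = (y * y') .* K;                      % quadratic term of the dual objective
    f = -ones(n, 1);                        % linear term (we maximize the sum of alphas)
    Aeq = y';  beq = 0;                     % equality constraint: sum_i alpha_i * y_i = 0
    lb = zeros(n, 1);  ub = C * ones(n, 1); % box constraints 0 <= alpha_i <= C
    opts = optimoptions('quadprog', 'Display', 'off');
    alpha = quadprog(H, f, [], [], Aeq, beq, lb, ub, [], opts);

    sv = alpha > 1e-6;                      % support vectors
    onMargin = sv & (alpha < C - 1e-6);     % on-margin SVs determine the bias
    b = mean(y(onMargin) - K(onMargin, sv) * (alpha(sv) .* y(sv)));

    model.Xsv    = X(sv, :);
    model.coef   = alpha(sv) .* y(sv);
    model.b      = b;
    model.degree = degree;
end

function f = svmDecision(model, Xtest)
    % Real-valued decision values; sign(f) gives the predicted binary label.
    f = polyKernel(Xtest, model.Xsv, model.degree) * model.coef + model.b;
end

function K = polyKernel(A, B, degree)
    % Inhomogeneous polynomial kernel (1 + <a,b>)^degree.
    K = (1 + A * B') .^ degree;
end

Once every binary classifier's predictions have been combined by a voting scheme and mapped to class indices 1 through 10, the confusion matrix can be tallied with, for example, confMat = accumarray([trueLabels(:) predLabels(:)], 1, [10 10]);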
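The DAG decision procedure for the second problem is likewise a short elimination loop over the one-versus-one classifiers. The sketch below assumes models{i,j} (for i < j) holds the binary SVM trained on classes i and j, with the convention that a positive decision value favors class i, and that svmDecision is the binary decision function from the previous sketch; these names are assumptions, not part of the assignment.

function label = dagsvmPredict(models, x, numClasses)
    % Walk the DAG: at each node compare the smallest and largest remaining
    % classes and eliminate the loser; x is a single 1-by-d test example.
    remaining = 1:numClasses;               % e.g., digits 0-9 stored as classes 1-10
    while numel(remaining) > 1
        i = remaining(1);                   % list stays sorted, so i < j and models{i,j} exists
        j = remaining(end);
        if svmDecision(models{i, j}, x) > 0
            remaining(end) = [];            % positive value favors class i: eliminate j
        else
            remaining(1) = [];              % otherwise eliminate i
        end
    end
    label = remaining;                      % the one class never eliminated
end

Note that each test example passes through exactly numClasses - 1 of the 45 binary classifiers, which is what makes the DAG cheaper at test time than full one-versus-one voting.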