
 Optimization in Machine Learning (2020 Winter)

Assignment 2
Instructions: For the code parts, submit the completed files on Canvas (both .py and
.ipynb files are accepted). For the free response parts, type your solutions into a separate
electronic document (.pdf file). Physical submissions will NOT be accepted.
To submit, compress all your files into a single compressed file (.zip file). E-mail a soft copy
of your code and answers to rkwon@mie.utoronto.ca.
If you have any questions about this assignment, please e-mail yhe@mie.utoronto.ca.
1. Linear Support Vector Machine
Download the dataset ‘prob1data.csv’. The dataset consists of two features in the first two
columns and the class label in the third column (0 and 1 refer to two distinct classes). A
plot of the dataset is given below:
[Scatter plot of the two classes in feature space]
In this problem, you will solve both the primal and the dual optimization problems of the linear
soft-margin support vector machine using CVXPY and analyze the results.
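Before (1a), the data need to be loaded into a feature matrix and label vector. The following is only a minimal sketch; whether ‘prob1data.csv’ contains a header row is an assumption here, and the 0/1 labels are mapped to ±1 because the SVM formulations sketched below use that convention.

```python
# Minimal loading sketch (assumes prob1data.csv has no header row; adjust if it does).
import numpy as np
import pandas as pd

data = pd.read_csv('prob1data.csv', header=None).to_numpy()
X = data[:, :2]                        # first two columns: features
y = np.where(data[:, 2] == 1, 1, -1)   # third column: map labels {0, 1} -> {-1, +1}
```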
(1a) (Code + Free Response) Complete the function ‘LinearSVM Primal’ that solves the
primal optimization problem of the linear SVM. Using the entire dataset and C = 1, solve the
optimization problem and report:
(1) The optimal decision boundary.
(2) The optimal support vectors.
(3) The solution time.
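For reference, one possible CVXPY formulation of the primal soft-margin problem is sketched below. The function name LinearSVM_Primal, its signature, and its return values are assumptions; the starter code may define them differently. X and y are assumed to be as loaded above (labels in {-1, +1}).

```python
# Sketch of the primal soft-margin linear SVM:
#   min_{w,b,xi}  0.5*||w||^2 + C * sum(xi)
#   s.t.          y_i (w^T x_i + b) >= 1 - xi_i,  xi_i >= 0
import cvxpy as cp
import numpy as np

def LinearSVM_Primal(X, y, C=1.0):
    n, d = X.shape
    w = cp.Variable(d)   # weight vector
    b = cp.Variable()    # intercept
    xi = cp.Variable(n)  # slack variables
    objective = cp.Minimize(0.5 * cp.sum_squares(w) + C * cp.sum(xi))
    constraints = [cp.multiply(y, X @ w + b) >= 1 - xi, xi >= 0]
    prob = cp.Problem(objective, constraints)
    prob.solve()
    # Support vectors: points on or inside the margin, y_i (w^T x_i + b) <= 1 + tol.
    margins = y * (X @ w.value + b.value)
    support = np.where(margins <= 1 + 1e-5)[0]
    return w.value, b.value, support, prob.solver_stats.solve_time
```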
(1b) (Code + Free Response) Complete the function ‘LinearSVM Dual’ that solves the dual
optimization problem of the linear SVM. Using the entire dataset and C = 1, solve the
optimization problem and report:
(1) The optimal dual solution.
(2) The optimal decision boundary.
(3) The optimal support vectors.
(4) The solution time.
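Similarly, a sketch of the dual problem in CVXPY, under the same assumptions as the primal sketch above. The quadratic term is written as a squared norm of the weighted sample sum, which is algebraically equal to the usual alpha^T Q alpha form and keeps the problem explicitly convex.

```python
# Sketch of the dual soft-margin linear SVM:
#   max_alpha  sum(alpha) - 0.5 * || sum_i alpha_i y_i x_i ||^2
#   s.t.       0 <= alpha_i <= C,  sum_i alpha_i y_i = 0
import cvxpy as cp
import numpy as np

def LinearSVM_Dual(X, y, C=1.0):
    n = X.shape[0]
    alpha = cp.Variable(n)
    objective = cp.Maximize(cp.sum(alpha)
                            - 0.5 * cp.sum_squares(X.T @ cp.multiply(alpha, y)))
    constraints = [alpha >= 0, alpha <= C, y @ alpha == 0]
    prob = cp.Problem(objective, constraints)
    prob.solve()
    a = alpha.value
    # Recover the decision boundary: w from the KKT conditions, b from a margin
    # support vector (one with 0 < alpha_i < C).
    w = X.T @ (a * y)
    on_margin = np.where((a > 1e-5) & (a < C - 1e-5))[0]
    b = float(np.mean(y[on_margin] - X[on_margin] @ w)) if len(on_margin) else 0.0
    support = np.where(a > 1e-5)[0]
    return a, w, b, support, prob.solver_stats.solve_time
```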
(1c) (Free Response) Discuss whether the decision boundary of the linear SVM will change as
the value of C is increased or decreased. If the decision boundary changes, briefly discuss how it
changes with C and why. If the decision boundary does not change with C, discuss the reason.
(1d) (Code + Free Response) Complete the function ‘Linearly separable’ that outputs 1 if
the dataset is linearly separable and 0 otherwise. Determine whether the given dataset is linearly
separable. For any given dataset with multiple features, how can one conclude whether the dataset
is linearly separable based on the optimal solution (optimal decision boundary) and the optimal
objective function value? (Hint: consider varying the value of C.)
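One possible way to implement the check, consistent with the hint, is to test whether the hard-margin constraints are feasible (equivalently, whether the soft-margin slacks all vanish as C grows large). The sketch below is only one such approach and again assumes labels in {-1, +1}.

```python
# Feasibility check: the data are linearly separable iff there exist w, b with
# y_i (w^T x_i + b) >= 1 for all i (the hard-margin constraints).
import cvxpy as cp

def Linearly_separable(X, y):
    n, d = X.shape
    w, b = cp.Variable(d), cp.Variable()
    prob = cp.Problem(cp.Minimize(cp.sum_squares(w)),
                      [cp.multiply(y, X @ w + b) >= 1])
    prob.solve()
    return 1 if prob.status in ("optimal", "optimal_inaccurate") else 0
```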
In the following problems, we will consider an alternative soft-margin method, known as the
l2-norm soft-margin SVM. This new algorithm is given by the following primal optimization
problem (notice that the slack penalties are now squared, n is the total number of data points,
and ξ_i are the slack variables):

    min_{w, b, ξ}   (1/2) ||w||^2 + C Σ_{i=1}^{n} ξ_i^2
    s.t.            y_i (w^T x_i + b) ≥ 1 − ξ_i,   i = 1, …, n.
(2a) (Code) Complete the function ‘gaussian kernel sigma’ that returns a function ‘gaussian kernel’
with the specified σ value.
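A minimal sketch of such a factory, written for sklearn's SVC, which accepts a callable kernel(X, Y) that returns the (n_X, n_Y) Gram matrix. The exact σ convention (2σ² versus σ² in the denominator) should follow the definition given in the assignment; 2σ² is assumed here.

```python
import numpy as np

def gaussian_kernel_sigma(sigma):
    """Return a Gaussian (RBF) kernel function with the given sigma."""
    def gaussian_kernel(X, Y):
        # Pairwise squared Euclidean distances between rows of X and rows of Y.
        sq_dists = (np.sum(X**2, axis=1)[:, None]
                    + np.sum(Y**2, axis=1)[None, :]
                    - 2 * X @ Y.T)
        return np.exp(-sq_dists / (2 * sigma**2))
    return gaussian_kernel
```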
(2b) (Code + Free Response) Use ‘SVC’ from ‘sklearn.svm’ and the ‘gaussian kernel sigma’
function coded in (2a) to build a kernel SVM to classify the training data X train. Use C = 1 and
σ = 0.1. Report:
(1) Number of support vectors.
(2) Prediction error (ratio) in test set X test.
(3) Plot decision boundary approximately.
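A usage sketch, under the assumption that X_train, y_train, X_test, y_test already exist (e.g. from the provided notebook) and that X_train has two features so the boundary can be drawn on a grid:

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.svm import SVC

clf = SVC(C=1.0, kernel=gaussian_kernel_sigma(0.1))   # custom kernel from (2a)
clf.fit(X_train, y_train)

n_sv = len(clf.support_)                   # (1) number of support vectors (indices)
test_err = 1 - clf.score(X_test, y_test)   # (2) prediction error ratio on X_test

# (3) Approximate decision boundary: evaluate the classifier on a grid and contour it.
xx, yy = np.meshgrid(
    np.linspace(X_train[:, 0].min() - 0.5, X_train[:, 0].max() + 0.5, 200),
    np.linspace(X_train[:, 1].min() - 0.5, X_train[:, 1].max() + 0.5, 200))
Z = clf.predict(np.c_[xx.ravel(), yy.ravel()]).reshape(xx.shape)
plt.contourf(xx, yy, Z, alpha=0.3)
plt.scatter(X_train[:, 0], X_train[:, 1], c=y_train, s=15)
plt.show()
```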
Download the file ‘votes.csv’. This dataset, votes, consists of over 3000 counties in the
United States along with their socioeconomic and demographic information and voting
records in the 2016 US election; each row corresponds to a single county.
(2c) (Code) The response variable will be prefer trump, which is 0 or 1, indicating whether
the percentage of people who voted for Trump in that county is greater than the percentage
who voted for Clinton. Compute the response variable y.
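A hypothetical sketch for computing y; the actual column names in votes.csv are not given here, so ‘trump’ and ‘clinton’ below are placeholders for whichever columns hold the two vote shares.

```python
import pandas as pd

votes = pd.read_csv('votes.csv')
# prefer_trump: 1 if the Trump vote share exceeds the Clinton vote share, else 0.
# ('trump' and 'clinton' are placeholder column names.)
y = (votes['trump'] > votes['clinton']).astype(int)
```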
(2d) (Code + Free Response) Use ‘SVC’ from ‘sklearn.svm’ to implement a polynomial kernel
SVM with C = 10.0 and max iter = 1e6. Implement the SVM with the kernel degree set to 1, then
2, 3, 4, and 5. For each model, report:
(1) Number of support vectors.
(2) Prediction error (ratio) in the training set X train.
(3) Prediction error (ratio) in test set X test.
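A sketch of the loop over kernel degrees, assuming X_train, y_train, X_test, y_test have already been constructed from votes.csv:

```python
from sklearn.svm import SVC

for degree in [1, 2, 3, 4, 5]:
    clf = SVC(kernel='poly', degree=degree, C=10.0, max_iter=int(1e6))
    clf.fit(X_train, y_train)
    n_sv = clf.n_support_.sum()                  # (1) number of support vectors
    train_err = 1 - clf.score(X_train, y_train)  # (2) error ratio on the training set
    test_err = 1 - clf.score(X_test, y_test)     # (3) error ratio on the test set
    print(f"degree={degree}: SVs={n_sv}, train_err={train_err:.3f}, test_err={test_err:.3f}")
```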
(2e) (Free Response) Based on the 5 models trained in (2d), how does the prediction error
change with the degree of the polynomial kernel? Explain why. How does the number of
support vectors change with the degree of the polynomial kernel? Explain why.
 