CSci 5521 Homework 5
1. Table 1 shows data collected on a person’s decision to wait for a place at a restaurant. Answer
the following questions:
(a) We wish to build a decision tree to predict if the person will wait for a place at the current
restaurant, based on type, price, & hunger. Draw the decision tree that fits this data and
show how to calculate each node split using entropy as the impurity measure.
Note: If the entropy is the same for two or more features, you can select any of the
features to split.
(b) Based on the decision tree, if the restaurant sells Pizza, the average price is $, and the
person is Hungry, will he/she wait for a place?
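The entropy calculations in part (a) can be sketched in code. Since Table 1 is not reproduced here, the rows below are a hypothetical stand-in (NOT the actual table data) just to show how entropy and information gain are computed for each candidate split:

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, feature):
    """Entropy reduction from splitting on one feature (a column index)."""
    total = entropy(labels)
    n = len(labels)
    remainder = 0.0
    for value in set(row[feature] for row in rows):
        subset = [lab for row, lab in zip(rows, labels) if row[feature] == value]
        remainder += len(subset) / n * entropy(subset)
    return total - remainder

# Hypothetical (type, price, hungry) rows and Wait labels -- not Table 1.
rows = [("Pizza", "$", "Yes"), ("Thai", "$$", "Yes"),
        ("Pizza", "$$", "No"), ("Thai", "$", "No")]
labels = ["Yes", "No", "Yes", "No"]
for i, name in enumerate(["type", "price", "hungry"]):
    print(name, "gain =", information_gain(rows, labels, i))
```

The feature with the largest information gain becomes the split at that node; the same computation is repeated recursively on each child subset.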
2. Build a Perceptron [multilayer or single layer as the case may be] to recognize a certain area of
the plane. That is, the Perceptron should output a “1” if the input vector lies in the shaded region.
(a) Determine the vector of coefficients W for a single-layer perceptron of the form in Figure 1
to recognize the area in Figure 2, and again for the area shaded blue in Figure 3. Use a
step function as the activation.
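Since Figures 1–3 are not reproduced here, the region below is a hypothetical half-plane (x1 + x2 > 1) used only to illustrate the mechanics: a single-layer perceptron with a step activation, where the first weight plays the role of the threshold/bias on a constant input of 1:

```python
def step(z):
    """Step activation: fires 1 on or above the decision boundary."""
    return 1 if z >= 0 else 0

def perceptron(x, w):
    """Single-layer perceptron: step(w . [1, x1, x2]); w[0] is the bias weight."""
    augmented = [1.0] + list(x)
    return step(sum(wi * xi for wi, xi in zip(w, augmented)))

# Hypothetical region x1 + x2 > 1, i.e. x1 + x2 - 1 >= 0 -> w = (-1, 1, 1).
w = (-1.0, 1.0, 1.0)
print(perceptron((2, 0), w))  # point inside the half-plane -> 1
print(perceptron((0, 0), w))  # point outside -> 0
```

For a non-convex or bounded region, a single linear unit is not enough; a hidden layer of such units (one per boundary line) combined by an AND/OR output unit is the standard multilayer construction.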
3. Suppose we use a linear SVM classifier for a binary classification problem with a set of data
points X shown in Figure 6: samples with positive labels +1 are (0.2, 0.8), (2, -1), (0.8, 2), (2,
1), (1.5, 1), and samples with negative labels −1 are (-0.5, -0.5), (-0.5, -2), (-2, -1), (-1, -1.5).
(a) Find the support vectors for this data when using a hard-margin SVM with a linear kernel.
(b) Find the weight vector w and bias b resulting from a hard-margin Support Vector Machine
with a linear kernel. Express the vector w as a convex combination of the support vectors.
What is the decision function based on w and bias b that will be positive for the +1 class
and negative for the −1 class?
(c) Pick three samples and calculate their distances to the decision boundary.
(d) If the sample (-0.5, -0.5) is removed, will the decision boundary change? What if we
remove the sample (0.8, 2) instead?
(e) If a new sample (1, 0) arrives as a negative sample, will the decision boundary change? If
so, what SVM method would you use in this case?
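A candidate hard-margin solution can be checked numerically against the data listed above. Here w = (1, 1), b = 0 is taken as an assumption to verify (it can be derived from the closest points of the two classes, not shown here): the code confirms every sample satisfies y_i(w·x_i + b) ≥ 1, identifies the samples that achieve equality as the support vectors, and computes geometric distances to the boundary for part (c):

```python
from math import isclose, sqrt

# Samples from Figure 6, in the order listed in the problem statement.
X = [(0.2, 0.8), (2, -1), (0.8, 2), (2, 1), (1.5, 1),
     (-0.5, -0.5), (-0.5, -2), (-2, -1), (-1, -1.5)]
y = [1, 1, 1, 1, 1, -1, -1, -1, -1]

def functional_margin(x, yi, w, b):
    """y_i * (w . x_i + b); must be >= 1 for every sample in a hard margin."""
    return yi * (w[0] * x[0] + w[1] * x[1] + b)

w, b = (1.0, 1.0), 0.0  # candidate solution to verify, not derived here
margins = [functional_margin(x, yi, w, b) for x, yi in zip(X, y)]
assert all(m >= 1 - 1e-9 for m in margins)  # hard-margin constraints hold

# Support vectors are the samples with functional margin exactly 1.
support = [x for x, m in zip(X, margins) if isclose(m, 1.0)]
print("support vectors:", support)

# Part (c): geometric distance |w . x + b| / ||w|| to the boundary.
norm_w = sqrt(w[0] ** 2 + w[1] ** 2)
for x in [(2, 1), (-2, -1), (0.2, 0.8)]:
    print(x, abs(w[0] * x[0] + w[1] * x[1] + b) / norm_w)
```

Note this only verifies feasibility of the candidate; optimality still has to be argued from the geometry (the margin equals the distance between the closest points of the two convex hulls).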
4. Modify the sample multi-layer perceptron code NN2pruned.py: Add one more layer with three
nodes to the network, after the existing hidden layer. There is no need to include a bias term
in any layer. Submit your modified Python code. The code should print out the final values
of the weights, plus the final outputs on the training set.
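Since NN2pruned.py is not reproduced here, the following is a minimal NumPy sketch of the target architecture: a second hidden layer with three nodes inserted after the first hidden layer, no bias terms anywhere, trained by backpropagation. The input size, first hidden-layer size, toy data, learning rate, and iteration count are all assumptions; only the added 3-node layer comes from the problem statement:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Assumed shapes: input 2 -> hidden1 4 -> hidden2 3 (added layer) -> output 1.
W1 = rng.normal(size=(2, 4))
W2 = rng.normal(size=(4, 3))   # the added layer with three nodes
W3 = rng.normal(size=(3, 1))

X = np.array([[0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])  # toy training inputs
t = np.array([[1.0], [1.0], [0.0]])                 # toy targets

lr = 0.5
for _ in range(1000):
    h1 = sigmoid(X @ W1)
    h2 = sigmoid(h1 @ W2)      # forward pass through the new layer
    out = sigmoid(h2 @ W3)
    # Backpropagate squared error through all three weight matrices.
    d3 = (out - t) * out * (1 - out)
    d2 = (d3 @ W3.T) * h2 * (1 - h2)
    d1 = (d2 @ W2.T) * h1 * (1 - h1)
    W3 -= lr * h2.T @ d3
    W2 -= lr * h1.T @ d2
    W1 -= lr * X.T @ d1

print("final weights:", W1, W2, W3, sep="\n")
print("final outputs:", out.ravel())
```

The key change when adding a layer is symmetric: one extra matrix in the forward pass and one extra delta term chained through that matrix's transpose in the backward pass.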