# CMT311 Exercise Sheet 1 – Foundations


Assignment
Answer all parts of the four questions below. The number of marks available for each question (part) is indicated in the Criteria for Assessment section below. There are 100 marks
available in total for this assignment.
Question 1.
a) Consider the class of hypotheses consisting of all Boolean functions that can be represented as a list of if-then rules of the form:

If l1 then b1, else if l2 then b2, . . . , else bk

where li (for all 1 ≤ i ≤ k) is a literal, that is, a variable xj or its negation ¬xj, and bi ∈ {1, 0} is the assigned label. For example, “If x2 then 1, else if x1 then 0, else 1” is a decision list.

Assuming that only n variables can be used, is the class of decision lists learnable in the PAC model? Justify your answer with a formal argument (in maximum 2 pages).
b) For the class H of axis-aligned n-dimensional rectangles in R^n, that is,

H = {[a1, b1] × · · · × [an, bn] : ai, bi ∈ R}

(1) Give a PAC-learning algorithm.
(2) Prove that your algorithm satisfies the conditions for PAC learnability.
(3) How does the sample complexity vary as a function of n?
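For intuition about part (1), one natural candidate is the “tightest fit” rule: output the smallest axis-aligned rectangle containing all positive examples. The sketch below is illustrative only (the function names are my own), not necessarily the intended solution:

```python
import numpy as np

def tightest_fit_rectangle(X, y):
    """Smallest axis-aligned rectangle containing all positive examples.

    X: (m, n) array of points in R^n; y: (m,) array of {0, 1} labels.
    Returns (lower, upper) corner vectors, or None if there are no
    positive examples (in which case we predict 0 everywhere).
    """
    pos = X[y == 1]
    if len(pos) == 0:
        return None
    return pos.min(axis=0), pos.max(axis=0)

def predict(rect, X):
    """Label 1 exactly the points inside the learned rectangle."""
    if rect is None:
        return np.zeros(len(X), dtype=int)
    lo, hi = rect
    return np.all((X >= lo) & (X <= hi), axis=1).astype(int)
```

In the realizable case the learned rectangle is always contained in the target rectangle, so it can only make false-negative errors; that containment is the kind of fact a proof for part (2) typically exploits.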
Question 2.
For the hypothesis class H defined by the following family of subsets of the real line:

[x, x + 1] ∪ [x + 2, ∞), with x ∈ R

Determine the VC-dimension of H. Justify your answer by giving a proof (in maximum 1 page).
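Before writing the proof, a small brute-force check can help build intuition. The sketch below (my own helper names) searches a finite grid of candidate x values, so it only provides evidence, never a proof — a shattering witness may fall between grid points, and a proof must argue over all real x:

```python
def labels(x, points):
    """Labeling that the hypothesis h_x = [x, x+1] ∪ [x+2, ∞) induces."""
    return tuple(int(x <= p <= x + 1 or p >= x + 2) for p in points)

def maybe_shattered(points, grid):
    """True if every one of the 2^|points| labelings is realised by
    some x in the finite grid (evidence only, not a proof)."""
    achieved = {labels(x, points) for x in grid}
    return len(achieved) == 2 ** len(points)

# A fairly fine grid of candidate x values over [-10, 10).
grid = [i / 100 for i in range(-1000, 1000)]
```

Running `maybe_shattered` on small point sets of varying spacing suggests which set sizes are worth attempting to shatter in the written proof.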

Question 3.
Let H1, . . . , Hk be hypothesis classes such that H1 ⊂ H2 ⊂ · · · ⊂ Hk and |Hi| = 2^i, for every i ∈ {1, . . . , k}. Suppose you are given a sample of size m (with each element chosen i.i.d.), and you want to learn the union of all these classes, that is, you would like to learn the hypothesis class

H = H1 ∪ H2 ∪ · · · ∪ Hk
Consider two alternative approaches:
1. Learn on the sample using the ERM rule.
2. Divide the sample into a training set of size (1 − α)m and a validation set of size αm, for some α ∈ (0, 1). Then apply the approach of model selection using validation, i.e.:
• First, train on the (1 − α)m training examples using the ERM rule w.r.t. each class Hi, and let h1, . . . , hk be the resulting hypotheses.
• Second, apply the ERM rule w.r.t. the finite class {h1, . . . , hk} on the αm validation examples.
Compare the two approaches formally (using maximum 2 pages).
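The mechanics of approach 2 can be sketched in Python, modelling hypotheses as callables (`erm` and `select_with_validation` are my own illustrative names, not part of any library):

```python
import numpy as np

def erm(hypotheses, X, y):
    """ERM over a finite set of hypotheses (each a callable x -> {0, 1})."""
    errors = [np.mean([h(x) != t for x, t in zip(X, y)]) for h in hypotheses]
    return hypotheses[int(np.argmin(errors))]

def select_with_validation(classes, X, y, alpha):
    """Approach 2: ERM within each class H_i on the training split, then
    ERM over the k resulting hypotheses on the validation split."""
    split = int((1 - alpha) * len(X))
    X_tr, y_tr = X[:split], y[:split]
    X_val, y_val = X[split:], y[split:]
    candidates = [erm(H_i, X_tr, y_tr) for H_i in classes]
    return erm(candidates, X_val, y_val)
```

Note that the second ERM step is over a class of only k hypotheses, which is the quantity the validation-based generalisation bound depends on.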
Question 4.
Consider the following hypothetical scenario. A car company would like to use a Bayesian
Network model to better predict whether a certain customer will buy a specific car, so they
can focus their efforts on developing certain car models. Specifically, they want to label pairs
of customers and car models according to whether they belong to the target class ‘buys’.
The manufacturer has selected eight attributes, each taking values from {yes, no}, namely
• Basic features of the car:
– ‘5-star safety rating’: whether the car model has been awarded the highest safety rating (5 stars in this case)
– ‘side-airbags’: whether the car model includes side airbags
– ‘large engine capacity’: whether the car has an engine capacity of at least 2 litres
– ‘expensive’: whether the car is expensive
• Characteristics of the client:
– ‘young’: whether the client is young
– ‘rich’: whether the client is rich
– ‘family’: whether the client is a family
– ‘interested’: whether the client is interested in the car
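For concreteness, one labelled customer–car pair could be encoded as follows (an illustrative sketch only; the field names are my own ASCII renderings of the attributes above, with True standing for ‘yes’):

```python
from dataclasses import dataclass

@dataclass
class CustomerCarPair:
    """One instance: the eight binary attributes plus the target class."""
    five_star_safety_rating: bool
    side_airbags: bool
    large_engine_capacity: bool
    expensive: bool
    young: bool
    rich: bool
    family: bool
    interested: bool
    buys: bool  # the target class label

example = CustomerCarPair(
    five_star_safety_rating=True, side_airbags=True,
    large_engine_capacity=False, expensive=False,
    young=True, rich=False, family=False, interested=True,
    buys=True,
)
```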
