STOR 767 Homework Assignments Fall 2023

Homework problems will be assigned from the reading material, the lecture material, and the homework file below, whose section titles indicate the subject matter of the problems they contain.

Homework file

Homework problems from the file will be assigned in the format [section][number]: for example, B3 refers to problem 3 of section B in the homework file, and C2-4 refers to problems 2-4 of section C in the homework file.

Recall that all assignments are due before class.


Homework 1: Due Friday 8 September

Reading: The Elements of Statistical Learning, Sections 2.1-2.4, 3.1-3.2, and 3.4.  See also 3.3 if you are interested.

Problems:

From Lecture Slides on Linear Regression:
Fact 2 on p.21
Fact on p.27
Fact C on p.30
Fact A on p.37
Fact B on p.49
Corollary on p.50
Corollary on p.52
Formulate and prove an extension of Fact 3 on p.21 in a setting where the squared norm of beta is replaced by a general non-negative penalty function pen(beta).
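As a numerical companion to the penalty-function problem above (a sanity check, not a proof, and built on our own toy setup rather than the slides): in the special ridge case pen(beta) = lambda*||beta||^2, the penalized least-squares criterion has the well-known closed-form minimizer (X^T X + lambda I)^{-1} X^T y, which can be checked against direct numerical minimization.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n, p, lam = 50, 3, 2.0
X = rng.normal(size=(n, p))
beta_true = np.array([1.0, -2.0, 0.5])
y = X @ beta_true + rng.normal(size=n)

# Closed-form ridge solution: minimizer of ||y - X b||^2 + lam * ||b||^2
beta_ridge = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

# Direct numerical minimization of the same penalized criterion
obj = lambda b: np.sum((y - X @ b) ** 2) + lam * np.sum(b ** 2)
beta_num = minimize(obj, np.zeros(p)).x

assert np.allclose(beta_ridge, beta_num, atol=1e-4)
```

The general problem replaces lam*||b||^2 above with an arbitrary non-negative pen(b); the closed form is then lost, which is what makes the extension worth proving.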

From HW File: A4, B1, B8


Homework 2: Due Friday 15 September

Reading:

  • Assumptionless consistency of the Lasso, by S. Chatterjee
  • Chapter 2 of A Probabilistic Theory of Pattern Recognition, by Devroye, Györfi, and Lugosi
  • Chapters 1 and 2 of Foundations of Machine Learning, by Mohri, Rostamizadeh, and Talwalkar

Problems:

From Lecture Slides on Linear Regression:
Restate and establish the properties of the set C appearing in the proof of the LASSO Theorem on p.52

From HW File: A9; B3,5,19; C1,9; D5; E1-4


Homework 3: Due Monday 2 October

Reading:

  • Chapter 12 (Sections 1, 3, 4, 5, and 7) and Chapter 14 (proofs of Theorems 14.1 and 14.5) in A Probabilistic Theory of Pattern Recognition, by Devroye, Györfi, and Lugosi
  • Chapters 1 and 2 of Foundations of Machine Learning, Mohri, Rostamizadeh, and Talwalkar

Problems:

From HW File: A13; C4; D4; E7; F2; H7-9; I2,3,12

From lecture slides on ERM (please download the new file):
Establish that the histogram rule on slide 6 is indeed the ERM estimate in this case
Establish the corollaries on slide 14
On slide 16, study the family G in the case that H contains binary classification rules and l(y',y) is the 0-1 loss.  Show that G corresponds to a collection of subsets of X x {0,1}
Establish the Fact on slide 19
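The histogram-rule problem above can be explored numerically before you prove it. A minimal 1-d sketch (the setup here, bins on [0,1) with synthetic labels, is ours, not taken from the slides): within each cell, the majority vote minimizes the number of training errors among cell-wise constant labelings, which is the ERM property the problem asks you to establish.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy 1-d data on [0, 1): label tends to be 1 on the right half
x = rng.uniform(size=200)
y = (rng.uniform(size=200) < np.where(x > 0.5, 0.8, 0.2)).astype(int)

n_bins = 10
bins = np.minimum((x * n_bins).astype(int), n_bins - 1)

# Histogram rule: majority vote of training labels within each cell
votes = np.zeros(n_bins)
counts = np.zeros(n_bins)
np.add.at(votes, bins, y)
np.add.at(counts, bins, 1)
majority = (votes >= counts - votes).astype(int)  # ties broken toward 1

def predict(t):
    cell = min(int(t * n_bins), n_bins - 1)
    return majority[cell]

# Empirical 0-1 risk of the histogram rule on the training sample
train_err = np.mean([predict(t) != yi for t, yi in zip(x, y)])

# No cell-wise constant classifier can do better: flipping any cell's
# label never decreases the number of training errors
for c in range(n_bins):
    alt = majority.copy()
    alt[c] = 1 - alt[c]
    alt_err = np.mean([alt[min(int(t * n_bins), n_bins - 1)] != yi
                       for t, yi in zip(x, y)])
    assert alt_err >= train_err
```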


Homework 4: Due Wednesday 11 October

Reading:

Problems:

From HW File: H10,11,14

From lecture slides on ERM (please download the new file):
Establish the first Fact on slide 23
Establish the stated inequality in the Fact on slide 24
Establish the shatter coefficient inequality on slide 25, and the Corollary that follows
Establish parts 1-3 of the Corollary on slide 29
Establish Facts 1 and 2 on slide 32
Establish the Fact on slide 33
Slide 34: Establish the (in)equalities (a)-(d) in the proof of the Proposition.  Establish monotonicity and convexity of L(*,epsilon).  Establish the Fact.
Slide 36, proof of lower bound: Establish equation (a) for R_sigma(h) and equation (b) for R_sigma(h) - R_sigma(h*).


Homework 5: Due Wednesday 18 October

Reading:

  • Chapter 1 and Sections 2.2, 2.9-2.10, 3.7, and 4.1-4.2 of Prediction, Learning, and Games, by Cesa-Bianchi and Lugosi

Problems:

From lecture slides on Multi-Armed Bandits:
Establish the Fact on slide 9
Establish the properties of U_{k,t} on slide 10
Slide 13: Use Jensen’s inequality to show that D(P,Q) is non-negative.  Establish the tensorization inequality.
Slide 14: Establish upper and lower bounds on D(p,q), and prove the Fact
Slide 15, proof of lower bound: Establish the change-of-measure Claim 1 (fill in the details). Establish Claim 3, that P(T_2(n)) = o(1) (fill in the details).
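The KL-divergence problems on slides 13-14 can be sanity-checked numerically on small discrete distributions (a check, not a substitute for the proofs; the function name kl and the distributions are ours): non-negativity follows from Jensen's inequality, and tensorization for product measures gives D(P x P, Q x Q) = 2 D(P, Q).

```python
import numpy as np

def kl(p, q):
    """KL divergence D(P, Q) between discrete distributions p and q."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

p = np.array([0.2, 0.5, 0.3])
q = np.array([0.4, 0.4, 0.2])

# Non-negativity, with equality when the distributions coincide
assert kl(p, q) >= 0.0 and kl(p, p) == 0.0

# Tensorization for the two-fold product measure
p2 = np.outer(p, p).ravel()
q2 = np.outer(q, q).ravel()
assert np.isclose(kl(p2, q2), 2 * kl(p, q))
```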


Homework 6: Due Friday 3 November [Preliminary]

Reading:

Problems:

From lecture slides on Sequential Prediction (please download the most recent version of the slides):
Verify that each of the losses on slide 4 is convex in its first argument when the second is fixed.
*Establish exp-concavity (with the specified values of eta) in Examples 1, 3, and 4 on slide 13.
*Write out the argument for the proof of part 2 of the Theorem on slide 15, using part 1, carefully supporting the bound involving the exponential potential.
*Rigorously establish the first limit relation in the Fact on slide 17.  You will need a truncation argument to extend the convergence of expectations of bounded continuous functions, which is guaranteed by weak convergence.
*Verify the example on slide 20.
*Establish part 2 of the Fact on slide 35
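The sequential-prediction problems above revolve around the exponentially weighted average forecaster. A minimal sketch of the standard algorithm, following Chapter 2 of Cesa-Bianchi and Lugosi rather than the slides themselves (the function name and the random loss sequence are ours): for losses in [0, 1] and eta = sqrt(8 ln(N) / T), the regret against the best expert is at most sqrt(T ln(N) / 2), which the code checks on a random instance.

```python
import numpy as np

def exp_weights(losses, eta):
    """Exponentially weighted average forecaster on a T x N loss matrix
    (losses in [0, 1]); returns (forecaster's cumulative expected loss,
    cumulative loss of each expert)."""
    T, N = losses.shape
    cum = np.zeros(N)   # cumulative loss of each expert
    total = 0.0
    for t in range(T):
        w = np.exp(-eta * (cum - cum.min()))  # shift for numerical stability
        prob = w / w.sum()
        total += prob @ losses[t]
        cum += losses[t]
    return total, cum

rng = np.random.default_rng(3)
T, N = 500, 10
losses = rng.uniform(size=(T, N))
eta = np.sqrt(8 * np.log(N) / T)

forecaster_loss, expert_losses = exp_weights(losses, eta)
regret = forecaster_loss - expert_losses.min()

# Regret bound for bounded losses: ln(N)/eta + eta*T/8 = sqrt(T ln(N) / 2)
assert regret <= np.sqrt(T * np.log(N) / 2) + 1e-9
```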