STOR 767 Homework Assignments Fall 2025

Homework problems will be assigned from the reading material, the lecture material, and the file below; its subsection titles indicate the subject matter of the problems they contain.

Homework file

Homework problems from the file will be assigned in the format [section.number]: for example, 1.3 refers to problem 3 of section 1 of the homework file, and 3.2-4 refers to problems 2-4 of section 3.

Recall that all assignments are due before class.


Homework 1: Due Tuesday 26 August [Final]

Reading: Elements of Statistical Learning 2.1-2.4, 3.1-3.2, 3.4.  See also 3.3 if you are interested.

HW File: 1.1-3, 2.1-5, 2.9, 3.1, 3.5.


Homework 2: Due Thursday 4 September [Final]

Reading: Assumptionless consistency of the Lasso, S. Chatterjee; Chapter 2 of A Probabilistic Theory of Pattern Recognition, Devroye, Györfi, and Lugosi; Chapters 1 and 2 of Foundations of Machine Learning, Mohri, Rostamizadeh, and Talwalkar.

HW File: 1.4-6, 2.8, 3.3-4, 5.1.  Regression slides: prove Fact A.2 on slide 26, Fact B on slide 37, and the Corollary on slide 38.


Homework 3: Due Tuesday 9 September

Reading: Concentration Inequalities by Boucheron, Lugosi, and Massart. Background reading: Ch. 1.1-1.3 and 2.1-2.3; material covered in lecture: 4.8-4.9 and 6.0-6.1.

HW File: Download the most recent (September 5) version of the HW file.  Do problems 2.6, 2.7, 8.4, 9.2, 9.3, 9.6, 9.8, and 9.9.

From Boucheron, Lugosi, and Massart: Prove the corollary in Remark 4.4.  Write out the proof of Theorem 4.22 on the sub-additivity of entropy, taking care to work out the details of how and why the expression in Remark 4.4 applies to the conditional entropy.
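
For reference, a minimal LaTeX sketch of the standard statement at issue, under the assumption that the book's usual notation Ent(Z) = E[Z log Z] - E[Z] log E[Z] for nonnegative Z is in force (check the text's own definitions; this is not copied from it):

    % Sub-additivity (tensorization) of entropy, as standardly stated:
    % for nonnegative f(X_1, \dots, X_n) with independent X_i,
    \[
      \mathrm{Ent}(f) \;\le\; \sum_{i=1}^{n} \mathbb{E}\big[\mathrm{Ent}^{(i)}(f)\big],
    \]
    % where \mathrm{Ent}^{(i)}(f) is the entropy of f viewed as a function of
    % X_i alone, with the remaining coordinates held fixed (conditional entropy).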


Homework 4: Due Tuesday 16 September

Reading: Concentration Inequalities by Boucheron, Lugosi, and Massart. Background reading: Ch. 1.1-1.3 and 2.1-2.3; material covered in lecture: 4.8-4.9 and 6.0-6.1.

HW File: September 5 version.  5.2, 6.1-2, 9.4, 9.7, 9.10, 9.11, 9.12, 9.17, 9.18.  Prove the claim from lecture about the process X(θ).


Homework 5: Due Thursday 25 September

Reading: Chapter 12 (see sections 1, 3, 4, 5, and 7) and Chapter 14 (see the proofs of Theorems 14.1 and 14.5) in A Probabilistic Theory of Pattern Recognition, Devroye, Györfi, and Lugosi; Chapters 1 and 2 of Foundations of Machine Learning, Mohri, Rostamizadeh, and Talwalkar.

HW File: September 5 version. 1.7, 8.1, 8.4, 8.5, 9.15, 9.16, 9.19.  Also, from the Empirical Risk Minimization slides: prove the Fact on slide 10, and the Fact and Corollary on slide 12.


Homework 6: Due Thursday 9 October

Reading: Sections 1-2 of Regret Analysis of Stochastic and Nonstochastic Multi-armed Bandit Problems by Bubeck and Cesa-Bianchi

HW File: September 5 version. 1.8, 9.5, 9.13, 9.14, 10.1.

Empirical Risk Minimization slides: Prove the Fact on slide 15, the Fact on slide 18, the Fact about shatter coefficients and the Corollary on slide 24, and parts 1 and 2 of the Corollary on slide 28.  Multi-Armed Bandit slides: Prove the Fact on slide 8.


Homework 7: Due Friday 24 October

Reading: Chapter 1 of Statistical Optimal Transport by Chewi, Niles-Weed, and Rigollet

HW File: September 5 version. 8.2, 8.3, 8.7.

From the lecture slides on Multi-Armed Bandits: Sketch an argument for the Fact on slide 9, and establish the Fact on slide 10.  Slide 13: Use Jensen's inequality to show that D(P,Q) is non-negative, and establish the tensorization inequality.  Slide 14: Establish the upper and lower bounds on D(p,q), and prove the Fact.  Proof of the lower bound: Establish Claim 1, concerning E(hat{D}_t), and Claim 2, which expresses P'(A) in terms of hat{D}_{T_2(n)}.
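
For reference, and as an assumption since the slides are not reproduced here: D(P,Q) in these problems presumably denotes the Kullback-Leibler divergence, with D(p,q) its specialization to Bernoulli(p) and Bernoulli(q), as is standard in bandit lower-bound arguments.  A minimal LaTeX sketch of that assumed notation:

    % Assumed notation (not taken from the slides):
    % KL divergence and its Bernoulli specialization.
    \[
      D(P,Q) = \int \log\!\Big(\frac{dP}{dQ}\Big)\, dP,
      \qquad
      D(p,q) = p\log\frac{p}{q} + (1-p)\log\frac{1-p}{1-q}.
    \]
    % Jensen step: with the convex function \varphi(x) = x \log x,
    % D(P,Q) = E_Q[\varphi(dP/dQ)] \ge \varphi(E_Q[dP/dQ]) = \varphi(1) = 0.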