Exercises

ex-ch08-01

Easy

For coded convolution with $(p, q) = (3, 5)$, compute the recovery threshold $K$ for both the standard polynomial code and the entangled polynomial code.
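A quick sanity check, assuming the thresholds standard in the coded-computing literature: the straightforward polynomial code must recover all $pq$ partial products, while the entangled construction interpolates a single degree-$(p+q-2)$ product polynomial.

```python
# Recovery thresholds for coded convolution with p and q input chunks.
# Assumed formulas (standard in the coded-computing literature):
#   standard polynomial code:  K_rec = p * q
#   entangled polynomial code: K_rec = p + q - 1

def poly_code_threshold(p, q):
    return p * q

def entangled_threshold(p, q):
    return p + q - 1

p, q = 3, 5
print(poly_code_threshold(p, q))  # 15
print(entangled_threshold(p, q))  # 7
```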

ex-ch08-02

Easy

For a quadratic function $f(\mathbf{x}_1, \ldots, \mathbf{x}_6) = \sum_i \mathbf{x}_i^T \mathbf{A}_i \mathbf{x}_i$ (a diagonal quadratic form in each input), compute the LCC recovery threshold.

ex-ch08-03

Easy

State why ReLU cannot be exactly computed via LCC with bounded per-worker storage and a bounded recovery threshold.

ex-ch08-04

Easy

Compare the LCC recovery thresholds for bilinear ($d_f = 2$) and cubic ($d_f = 3$) functions with $K = 5$ inputs.
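A one-line check of the two cases, assuming the LCC recovery-threshold formula $K_{\text{rec}} = d_f(K - 1) + 1$ from this chapter:

```python
# Assumed LCC recovery-threshold formula: K_rec = d_f * (K - 1) + 1,
# where K is the number of inputs and d_f the total degree of f.
def lcc_threshold(K, d_f):
    return d_f * (K - 1) + 1

K = 5
print(lcc_threshold(K, 2))  # bilinear: 9
print(lcc_threshold(K, 3))  # cubic: 13
```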

ex-ch08-05

Medium

Construct an entangled polynomial code for convolution with $\mathbf{a}$ having $p = 3$ chunks and $\mathbf{b}$ having $q = 4$ chunks. Specify the encoding polynomials and verify the recovery threshold.
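A toy numerical check of the underlying identity (with scalar "chunks" for illustration): chunk-level convolution equals polynomial multiplication, so the product $a(x)b(x)$ has degree $p + q - 2 = 5$ and can be interpolated from any $p + q - 1 = 6$ evaluations.

```python
import numpy as np

# Chunk-level convolution equals polynomial multiplication.
# With a split into p = 3 chunks and b into q = 4 chunks, a(x) * b(x)
# has p + q - 1 = 6 coefficients, matching the assumed threshold.
p, q = 3, 4
a = np.array([1, 2, 3])       # scalar stand-ins for the chunks of a
b = np.array([4, 5, 6, 7])    # scalar stand-ins for the chunks of b

conv = np.convolve(a, b)                       # chunk-level convolution
prod = np.polynomial.polynomial.polymul(a, b)  # coefficients of a(x) * b(x)
print(np.array_equal(conv, prod))  # True
print(len(prod))                   # p + q - 1 = 6
```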

ex-ch08-06

Medium

Why is LCC a constant factor worse than specialized codes (polynomial, entangled) for common operations? Give an example where LCC is attractive despite this penalty.

ex-ch08-07

Medium

Derive the degree of the composed polynomial $g(z) = f(u(z))$ in the LCC construction, where $u$ is the Lagrange encoding polynomial of the $K$ inputs and $f$ has total degree $d_f$.
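The degree claim can be checked numerically on a toy scalar instance: $u$ interpolates $K$ inputs, so $\deg u = K - 1$, and composing with a degree-$d_f$ polynomial multiplies the degrees.

```python
import numpy as np
from numpy.polynomial import Polynomial

# Toy check (scalar inputs for illustration): u(z) interpolates K
# values, so deg u = K - 1, and g = f(u(z)) has degree d_f * (K - 1).
K, d_f = 4, 3
inputs = [2.0, -1.0, 5.0, 3.0]                 # K scalar "data blocks"
u = Polynomial.fit(range(K), inputs, deg=K - 1).convert()

f = Polynomial([0, 0, 0, 1])                   # f(x) = x^3, degree 3
g = f(u)                                       # composition g(z) = f(u(z))
print(u.degree())                              # K - 1 = 3
print(g.degree())                              # d_f * (K - 1) = 9
```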

ex-ch08-08

Medium

For $K = 16$ workers and a matrix multiplication with $p = q = 4$ (so $K = pq = 16$), compare the recovery thresholds of: (a) the standard polynomial code, (b) the entangled polynomial code (MatDot variant), (c) LCC.

ex-ch08-09

Medium

Propose a polynomial approximation of the sigmoid function $\sigma(x) = 1/(1 + e^{-x})$ on $[-5, 5]$ with degree $\leq 5$. Discuss the tradeoff between approximation error and the resulting recovery threshold.
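One candidate answer, as a sketch: a degree-5 Chebyshev least-squares fit on a dense grid. This is an illustration, not the unique (or minimax-optimal) choice.

```python
import numpy as np
from numpy.polynomial.chebyshev import Chebyshev

# Degree-5 Chebyshev least-squares fit to sigma(x) = 1 / (1 + e^{-x})
# on [-5, 5]; the grid maximum error gauges the approximation quality.
x = np.linspace(-5, 5, 1001)
sigma = 1.0 / (1.0 + np.exp(-x))

fit = Chebyshev.fit(x, sigma, deg=5)
max_err = np.max(np.abs(fit(x) - sigma))
print(max_err)  # worst-case error on the grid
```

Raising the degree shrinks this error but inflates $d_f$, and hence the LCC recovery threshold $d_f(K-1)+1$, linearly.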

ex-ch08-10

Medium

For a 4-way tensor contraction $T_{ijk\ell} = \sum_s A_{ijs} B_{sk\ell}$ (a matrix-like contraction of two order-3 tensors), propose an entangled polynomial code achieving recovery threshold $K = p_1 + p_2 + p_3 + p_4 - 3$ under $(p_1, p_2, p_3, p_4)$-partitioning.
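To make the "matrix-like" structure concrete, the contraction can be written with `einsum` and checked against an ordinary matrix product after flattening the free index pairs (shapes here are illustrative):

```python
import numpy as np

# The contraction T_{ijkl} = sum_s A_{ijs} B_{skl}, written with einsum.
rng = np.random.default_rng(1)
A = rng.standard_normal((2, 3, 4))   # order-3 tensor, contracted on s
B = rng.standard_normal((4, 5, 6))

T = np.einsum('ijs,skl->ijkl', A, B)
print(T.shape)  # (2, 3, 5, 6)

# Same contraction as a matrix product after flattening (ij) and (kl),
# which is why matrix-multiplication codes apply.
T2 = (A.reshape(6, 4) @ B.reshape(4, 30)).reshape(2, 3, 5, 6)
print(np.allclose(T, T2))  # True
```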

ex-ch08-11

Hard

Prove (sketch) the LCC lower bound $K_{\text{rec}} \geq d_f(K - 1) + 1$ via an entropy argument on polynomial values.

ex-ch08-12

Hard

Describe how LCC composes with $T$-privacy (hiding the inputs from any $T$ colluding workers). What is the recovery-threshold cost?

ex-ch08-13

Hard

Discuss the "polynomial approximation + LCC" strategy for deep learning with non-polynomial layers: identify the tradeoffs between approximation degree, error accumulated across layers, and total recovery-threshold overhead.

ex-ch08-14

Hard

Sketch the composition of coded convolution (Chapter 8, §8.2) with gradient coding (Chapter 6) for distributed CNN training. What is the per-layer recovery threshold?

ex-ch08-15

Challenge

Open problem. Propose an approximate coded-computing framework that handles general non-polynomial functions with bounded error. What information-theoretic quantities would characterize the optimal rate-accuracy tradeoff?