Prerequisites & Notation

Before You Begin

Chapter 8 closes Part II by generalizing the polynomial-code framework of Chapter 5 to richer computational primitives: convolutions, matrix-vector products at scale, and general multivariate-polynomial tensor operations. The prerequisites are the polynomial-code background of Chapter 5 and comfort with tensor notation from basic linear algebra.

  • Polynomial codes (Chapter 5) (Review ch05)

    Self-check: Write the encoding polynomials $p_A, p_B$ and state the recovery threshold $K = pq$.

  • Gradient coding (Chapter 6) (Review ch06)

    Self-check: Why is gradient coding a specialization of polynomial codes to $q = 1$?

  • Tensor notation and the Kronecker product (Review ch01)

    Self-check: What is the relationship between $(\mathbf{A} \otimes \mathbf{B})\,\text{vec}(\mathbf{X})$ and $\text{vec}(\mathbf{A}\mathbf{X}\mathbf{B}^T)$?

  • Convolution: 1D, 2D, and their polynomial representations

    Self-check: If $c = a * b$ (linear convolution), show that the generating polynomials satisfy $p_a(x)\, p_b(x) = p_c(x)$.

  • Multivariate polynomials and Lagrange interpolation (Review ch03)

    Self-check: For $f(x) = \sum_i c_i L_i(x)$ (Lagrange form), what does the collection $\{f(\alpha_k)\}$ tell you about the coefficients $c_i$?

Notation for This Chapter

Tensor operations introduce new notation. We write tensors in bold upper-case ($\mathbf{A}, \mathbf{B}, \mathbf{T}$) and convolution as $*$. The generating polynomial of a tensor along one axis is denoted $p_{\mathbf{A}}(x)$. The Lagrange Coded Computing framework uses interpolation points $\beta_1, \ldots, \beta_K$ and evaluation points $\alpha_1, \ldots, \alpha_N$.
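To make the roles of the $\beta_i$ and $\alpha_k$ concrete before the chapter proper, here is a minimal Lagrange-interpolation sketch over the reals (the chapter works over a finite field $\mathbb{F}_q$). The specific points, data values, and target function $f(x) = x^2$ are illustrative assumptions, not fixed by the framework:

```python
import numpy as np

# Illustrative choices (hypothetical): K = 2 data blocks, target
# f(x) = x**2 with degree d_f = 2, N = 3 workers.
X = np.array([3.0, 5.0])            # data blocks X_1, X_2
beta = np.array([0.0, 1.0])         # interpolation points beta_i
alpha = np.array([2.0, 4.0, 7.0])   # evaluation points alpha_k

def lagrange_eval(xs, ys, t):
    """Evaluate at t the Lagrange interpolant through (xs[i], ys[i])."""
    total = 0.0
    for i, (xi, yi) in enumerate(zip(xs, ys)):
        li = np.prod([(t - xj) / (xi - xj)
                      for j, xj in enumerate(xs) if j != i])
        total += yi * li
    return total

# Encoding polynomial u has degree K-1 and satisfies u(beta_i) = X_i.
# Worker k computes f(u(alpha_k)); the composition f∘u has degree
# d_f * (K-1) = 2, so any d_f * (K-1) + 1 = 3 responses determine it.
responses = np.array([lagrange_eval(beta, X, a) ** 2 for a in alpha])

# The master interpolates f∘u from the responses, then reads off
# f(X_i) = (f∘u)(beta_i).
recovered = [lagrange_eval(alpha, responses, b) for b in beta]
assert np.allclose(recovered, X ** 2)   # f(X_1) = 9, f(X_2) = 25
```

The point of the two sets of points: the $\beta_i$ carry the data (and hence the desired results), while the $\alpha_k$ spread evaluations across workers so that any sufficiently large subset of responses suffices.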

| Symbol | Meaning | Introduced |
| --- | --- | --- |
| $\mathbf{A} \in \mathbb{F}_q^{m \times d}$ | Input matrix (same as Chapter 5) | s01 |
| $\mathbf{T} \in \mathbb{F}_q^{d_1 \times \cdots \times d_r}$ | Tensor of order $r$ (generalizes matrices) | s01 |
| $*$ | Linear convolution (1D or 2D) | s01 |
| $p, q$ | Partition counts (same role as Chapter 5) | s02 |
| $K$ | Recovery threshold (minimum responses) | s02 |
| $\beta_1, \ldots, \beta_K$ | Lagrange interpolation points at which the master evaluates the target function | s03 |
| $\alpha_1, \ldots, \alpha_N$ | Evaluation points assigned to workers | s03 |
| $f: (\mathbb{F}_q^d)^K \to \mathbb{F}_q^{d'}$ | Target function: a $K$-variate polynomial over vectors | s03 |
| $d_f$ | Degree of the target function $f$ | s03 |