Predictive Caching via Sensing
Closing the Sensing → Caching Loop
Section 19.2 showed how caching frees resources for sensing. The reverse direction is equally powerful: sensing outputs drive caching decisions.
Sensing produces vehicle locations, mobility patterns, and target identity information. These are correlated with content demand: a vehicle approaching an intersection will likely request intersection-specific map tiles; a user in a stadium will likely request replay clips of the ongoing game; a surveillance feed identifying a truck will trigger routing requests. In each case, sensing provides demand predictors with accuracy $\alpha$.
Predictive caching uses the predicted demand distribution to skew cache placement toward the content most likely to be requested. The resulting hit rate and rate reduction scale with $\alpha$: perfect prediction ($\alpha = 1$) yields zero miss rate; no prediction ($\alpha = 0$) recovers the MAN baseline.
Definition: Predictive Cache Placement
A predictive cache placement policy takes as input:
- Sensing outputs (e.g., user positions, velocities, target IDs).
- A prediction function mapping sensing observations to a probability distribution over files.
and outputs cache contents that depend on the sensing observation.
Prediction accuracy $\alpha$ is the probability that the true demand is correctly ranked within the top-$m$ files selected for caching.
The policy is dynamic: caches are updated as sensing information arrives. This differs from the static MAN placement (which assumes no prior knowledge of demands).
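The definition above can be sketched in a few lines. This is a minimal illustration, not the chapter's algorithm: `predictive_placement` and the `toy_predict` model are hypothetical names, and the "cache the top-ranked files" rule stands in for whatever placement the prediction distribution drives.

```python
import numpy as np

def predictive_placement(predict, observation, num_files, cache_slots):
    """Rank files by predicted demand probability; cache the top slots.

    `predict` is a (hypothetical) sensing-to-demand model returning a
    probability distribution over the `num_files` files.
    """
    probs = predict(observation)          # one probability per file
    assert len(probs) == num_files
    order = np.argsort(probs)[::-1]       # most likely file first
    return set(order[:cache_slots].tolist())

# Toy predictor: a vehicle near position x mostly needs nearby map tiles.
def toy_predict(x, num_files=10):
    dist = np.abs(np.arange(num_files) - x)
    w = np.exp(-dist)                     # demand decays with distance
    return w / w.sum()

cache = predictive_placement(toy_predict, observation=3,
                             num_files=10, cache_slots=3)
print(sorted(cache))  # → [2, 3, 4]: the tiles nearest position 3
```

As sensing updates arrive, the policy simply reruns the placement with the new observation, which is what makes it dynamic rather than static.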
Theorem: Predictive Caching Rate Improvement
For an ISAC-cached system with prediction accuracy $\alpha$, a predictive cache placement achieves expected delivery rate $\mathbb{E}[R] = (1 - \alpha)\, R_{\text{MAN}}(K, M)$: a linear interpolation between the MAN rate (no prediction, $\alpha = 0$) and zero rate (perfect prediction, $\alpha = 1$).
With accuracy $\alpha$, the cache holds the right content for a fraction $\alpha$ of users, so no delivery is needed for them. The remaining fraction $1 - \alpha$ uses standard MAN delivery. Hence the linear reduction.
Correct predictions
For the $\alpha K$ users with correctly predicted demands, the cache already contains the requested file. Zero delivery needed.
Incorrect predictions
Remaining $(1 - \alpha) K$ users need MAN-style delivery. Treat them as a smaller MAN system with $(1 - \alpha) K$ users and the same cache size $M$.
Effective rate
Delivery rate $\approx (1 - \alpha)\, R_{\text{MAN}}(K, M)$ (the MAN scheme's rate is essentially linear in the number of "active" users at fixed cache ratio $M/N$).
Total rate
$\mathbb{E}[R] = \alpha \cdot 0 + (1 - \alpha)\, R_{\text{MAN}}(K, M) = (1 - \alpha)\, R_{\text{MAN}}(K, M)$.
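The interpolation can be checked numerically. The sketch below uses the standard MAN rate formula $R_{\text{MAN}} = K(1 - M/N)/(1 + KM/N)$; the parameter values $K = 20$, $M = 25$, $N = 100$ are assumed for illustration only.

```python
def man_rate(K, M, N):
    """Standard Maddah-Ali--Niesen delivery rate: K users, cache M, N files."""
    t = K * M / N
    return K * (1 - M / N) / (1 + t)

def predictive_rate(alpha, K, M, N):
    """Expected rate from the theorem: (1 - alpha) * R_MAN(K, M)."""
    return (1 - alpha) * man_rate(K, M, N)

K, M, N = 20, 25, 100                    # assumed toy parameters
for alpha in (0.0, 0.5, 0.9, 1.0):
    print(alpha, round(predictive_rate(alpha, K, M, N), 3))
# alpha = 0 recovers R_MAN = 2.5; alpha = 1 gives rate 0
```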
[Figure: Predictive caching hit rate vs sensing accuracy. Cache hit rate as a function of sensing-driven prediction accuracy $\alpha$: the base hit rate (no prediction, $\alpha = 0$) equals the MAN baseline; perfect prediction ($\alpha = 1$) gives a 100% hit rate.]
[Figure: Sensing CRB vs cache size. Sensing CRB as a function of cache size $M$: coded caching reduces communication overhead, freeing resources for sensing and lowering the CRB relative to the uncached baseline.]
Example: V2X Map-Tile Predictive Caching
Autonomous vehicle approaching a merge junction. Sensing reports position, velocity, heading. Map-tile demand is highly predictable from these: upcoming segments have probability 0.9 of being requested within 2 seconds. Design a predictive caching strategy.
Sensing inputs
Position (GPS + lidar): ~1 m accuracy. Velocity (radar): m/s. Heading (IMU + vision): .
Prediction
From position + velocity + heading, project a 2-second horizon. Highly likely segments: 3 tiles (merge, exit, ramp). Accuracy $\alpha \approx 0.9$.
Cache allocation
Allocate 70% of cache to predicted tiles; 30% to popular static content (near-home, work).
Performance
Expected hit rate: $\approx 0.9$ (predicted tiles cached when needed). Delivery rate: $(1 - \alpha)\, R_{\text{MAN}} = 0.1\, R_{\text{MAN}}$. Network load in the merge critical zone reduced $\approx 10\times$.
Safety implication
Freed bandwidth supports additional sensing channels or higher sensing refresh rate — enhancing collision avoidance precisely at high-risk locations.
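The example's load figures follow directly from the theorem. A small sanity check, using the example's accuracy $\alpha = 0.9$ (the helper name below is illustrative):

```python
def load_reduction(alpha):
    """Delivery-load reduction factor implied by R = (1 - alpha) * R_MAN."""
    if alpha >= 1.0:
        return float('inf')               # perfect prediction: no delivery
    return 1.0 / (1.0 - alpha)

alpha = 0.9                               # accuracy from the V2X example
print(f"remaining rate fraction: {1 - alpha:.1f}")    # 0.1 of R_MAN
print(f"load reduction: {load_reduction(alpha):.0f}x")  # ~10x in merge zone
```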
Predictive Coded Caching with ISAC
The CommIT group introduced predictive coded caching driven by ISAC outputs. Key contributions:
- Prediction framework. Formalized the mapping from sensing observations (position, velocity, trajectory) to per-file demand probabilities $p_n$.
- Rate characterization. Showed expected delivery rate decreases linearly in prediction accuracy: $\mathbb{E}[R] = (1 - \alpha)\, R_{\text{MAN}}$.
- Adaptive placement algorithm. An online algorithm that updates cache content as sensing arrives, approaching the offline optimum within a constant factor.
- V2X and smart-city case studies. Demonstrated 3-10× bandwidth reduction in realistic traffic scenarios.
This work is among the first to rigorously couple ISAC outputs with coded caching, enabling the sensing → caching feedback loop central to 6G.
Common Mistake: Prediction Errors Can Degrade Below MAN
Mistake:
Assuming predictive caching always outperforms static MAN.
Correction:
If the sensing predictor is miscalibrated (effective accuracy below the no-information baseline, i.e., systematically wrong), the predictive cache may perform worse than the static MAN baseline, since the cache holds rare content while common demands miss.
Robust predictive schemes mix static MAN and predictive placement: reserve a fraction $\beta$ of the cache for MAN's combinatorial pattern (preserving worst-case guarantees) and the remaining $1 - \beta$ for predictive content. The combined scheme is at least as good as MAN.
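The failure mode and the hedge can be seen in a short Monte Carlo sketch. This is an illustration under assumed conditions: Zipf(1) demands, a "most popular files" cache standing in for the static baseline, a systematically wrong predictor that caches the rarest files, and an even split for the hybrid.

```python
import numpy as np

rng = np.random.default_rng(0)
N, M, requests = 100, 10, 20000

pop = 1.0 / np.arange(1, N + 1)           # Zipf(1) popularity (assumed)
pop /= pop.sum()
demands = rng.choice(N, size=requests, p=pop)

static_cache = set(range(M))              # static: most popular files
bad_pred = set(range(N - M, N))           # miscalibrated: rarest files
hybrid = set(range(M // 2)) | set(range(N - M // 2, N))  # half-and-half mix

def hit_rate(cache):
    return np.mean([d in cache for d in demands])

print(hit_rate(static_cache), hit_rate(bad_pred), hit_rate(hybrid))
```

The miscalibrated cache's hit rate collapses, while the hybrid keeps most of the static baseline's performance: exactly the worst-case hedge the mixed scheme provides.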
Key Takeaway
Sensing-driven predictive caching linearly reduces delivery rate with prediction accuracy $\alpha$. At $\alpha = 1$: zero delivery. At $\alpha = 0$: MAN baseline. The CommIT Zhou-Caire 2023 framework formalizes the sensing → caching direction, closing the ISAC-caching loop. Robust deployments mix static MAN and predictive placement for worst-case safety.