# ElastiCache Strategies

## Two Caching Implementation Strategies

* Lazy Loading / Cache-Aside / Lazy Population
* Write-Through

## Lazy Loading

![](https://3344504418-files.gitbook.io/~/files/v0/b/gitbook-legacy-files/o/assets%2F-MVv3Zs5vmzXFH_qSrJw%2F-MVvDTzj43Ntb8h5W_Ce%2F-MVvFoUuTm_nvjCqpqzi%2Fimage.png?alt=media\&token=2eb563ab-923b-4077-8367-3d36a87b9f19)

Pros:

* Only requested data is cached; no useless data
* Node failures are not fatal (just increased latency to warm cache)

Cons:

* Cache miss penalty: a miss results in three round trips (check the cache, read from the database, write to the cache), causing a noticeable delay on the initial request
* Stale data: data can be updated in database and outdated in the cache
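
The read path above can be sketched in a few lines. This is a minimal illustration, not ElastiCache client code: plain dicts stand in for the cache node and the primary database, and the key/value names are made up for the example.

```python
database = {"user:1": "Alice"}  # stand-in for the primary database
cache = {}                      # stand-in for the ElastiCache node

def get_user(key):
    """Cache-aside read: try the cache first, fall back to the DB on a miss."""
    if key in cache:              # round trip 1: cache hit ends here
        return cache[key]
    value = database.get(key)     # round trip 2: read from the DB on a miss
    if value is not None:
        cache[key] = value        # round trip 3: lazily populate the cache
    return value

print(get_user("user:1"))  # miss: DB read, then cached
print(get_user("user:1"))  # hit: served from the cache
```

Note the stale-data con: if `database["user:1"]` changes after the first read, the cache keeps returning the old value until the entry is evicted or expires.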

## Write-Through

![](https://3344504418-files.gitbook.io/~/files/v0/b/gitbook-legacy-files/o/assets%2F-MVv3Zs5vmzXFH_qSrJw%2F-MVvDTzj43Ntb8h5W_Ce%2F-MVvG5yAvXmjDNxHtyfu%2Fimage.png?alt=media\&token=1991201e-198f-41be-801a-7e5e88b41061)

Pros:

* Data in the cache is never stale; reads are quick
* Write penalty instead of read penalty (each write requires two calls)

Cons:

* Missing data until it is added/updated in the DB. **Mitigation**: implement the **Lazy Loading** strategy in conjunction
* **Cache churn** - a lot of the data written to the cache will never be read
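
The write path can be sketched the same way as the lazy-loading example. Again this is a hedged illustration with in-memory dicts standing in for the database and the cache node; the function names are made up:

```python
database = {}  # stand-in for the primary database
cache = {}     # stand-in for the ElastiCache node

def put_user(key, value):
    """Write-through: every write updates the DB and the cache together."""
    database[key] = value  # call 1: write to the database
    cache[key] = value     # call 2: write to the cache (write penalty)

def get_user(key):
    # Reads are always fresh because every write also updated the cache.
    # Keys written before the cache existed are still missing, which is why
    # write-through is usually combined with lazy loading.
    return cache.get(key)

put_user("user:1", "Alice")
print(get_user("user:1"))  # fresh read straight from the cache
```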

## Managing Cache with Cache Evictions and Time-To-Live (TTL)

Eviction can occur in three ways:

1. Delete item explicitly in the cache
2. Item evicted because memory is full and it has not been recently used (LRU)
3. Set item time-to-live (TTL)

TTL can range from a few seconds to hours and days.
