From the course: Designing Highly Scalable and Highly Available SQL Databases

Reducing latency with caching

- [Instructor] Caching is a common technique for improving database performance, particularly for reads. It takes advantage of two things. First, we can read from memory far faster than we can read from persistent disk, like hard disk drives. Second, we often need the same data more than once. So if we run a query, pull up particular blocks of data, and use them for one operation, there's a nontrivial probability that we'll need that same data again in the near future. We can take advantage of that by storing the data, once we've read it, in a low-latency, fast-read storage system, in this case, a cache. Now, as always, we're making a trade-off. There is always a trade-off. And in this case, what we gain is lower-latency reads, but what we…
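The read path described here is often implemented as a cache-aside (lazy-loading) pattern: check the cache first, and on a miss read from the database and populate the cache so the likely repeat read is served from memory. Below is a minimal sketch in Python, assuming an in-process dictionary standing in for the cache and a SQLite database as the persistent store; the table name, schema, key format, and TTL value are illustrative assumptions, not from the course.

```python
import sqlite3
import time

CACHE_TTL_SECONDS = 60          # assumed time-to-live for cached entries
_cache = {}                     # key -> (value, expiry_timestamp)

# Stand-in persistent store; schema and data are illustrative.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
db.execute("INSERT INTO users VALUES (1, 'Ada'), (2, 'Grace')")

def get_user_name(user_id):
    """Cache-aside read: check the in-memory cache, fall back to the database."""
    key = f"user:{user_id}"
    entry = _cache.get(key)
    if entry is not None and entry[1] > time.monotonic():
        return entry[0]                      # cache hit: fast, in-memory read

    # Cache miss: read from the slower persistent store.
    row = db.execute(
        "SELECT name FROM users WHERE id = ?", (user_id,)
    ).fetchone()
    value = row[0] if row else None

    # Store the result so a repeat read in the near future hits memory.
    _cache[key] = (value, time.monotonic() + CACHE_TTL_SECONDS)
    return value

print(get_user_name(1))   # miss: reads the database, then populates the cache
print(get_user_name(1))   # hit: served from memory
```

The TTL here is one knob in the trade-off: a shorter TTL keeps cached values fresher at the cost of more misses, while a longer TTL favors hit rate over freshness.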
