There are theoretical models of random-access machines with fixed-size caches, on which data structures like B-trees (and others) can be, and are, analysed for asymptotic performance. It usually isn't taught until late undergrad or graduate level, though.
(You may have heard the terms "cache aware" and "cache oblivious" used to describe algorithms. These terms come out of theoretical CS, describing algorithms with good performance on these "RAM with cache" machines when the cache size is known or unknown ahead of time, respectively.)
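To make the distinction concrete, here's a minimal sketch (my own illustration, not from any particular textbook) of a classic cache-oblivious algorithm: recursive matrix transpose. By repeatedly halving the larger dimension, the recursion eventually operates on sub-blocks small enough to fit in cache at *every* level of the memory hierarchy, without the code ever knowing the cache size. A cache-aware version would instead loop over tiles of a hard-coded size tuned to a known cache. The `8` in the base case is just a recursion cutoff, not a cache parameter.

```python
def transpose(src, dst, r0=0, c0=0, rows=None, cols=None):
    """Cache-obliviously transpose src (list of lists) into dst."""
    if rows is None:
        rows, cols = len(src), len(src[0])
    if rows <= 8 and cols <= 8:
        # Base case: block is small, transpose it directly.
        for i in range(r0, r0 + rows):
            for j in range(c0, c0 + cols):
                dst[j][i] = src[i][j]
    elif rows >= cols:
        # Split along the larger dimension; each half eventually
        # fits in cache at every level of the hierarchy.
        h = rows // 2
        transpose(src, dst, r0, c0, h, cols)
        transpose(src, dst, r0 + h, c0, rows - h, cols)
    else:
        h = cols // 2
        transpose(src, dst, r0, c0, rows, h)
        transpose(src, dst, r0, c0 + h, rows, cols - h)

m = [[i * 100 + j for j in range(10)] for i in range(16)]  # 16x10 input
t = [[0] * 16 for _ in range(10)]                          # 10x16 output
transpose(m, t)
```

In Python the constant factors swamp any real cache effect, of course; the point is the access pattern, which carries over directly to C or Fortran where it does matter.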