Jul 10th 2013, 1:09:32
Any EE or CS majors out there remember how to calculate this and wouldn't mind giving me a brief explanation of how it's done? Something about it is just not clicking for me:
I have an application whose memory access pattern is a stream and its entire data set is 128kB. The data cache in your machine has a capacity of 64kB. It is word-addressable and direct-mapped with 32B lines. It is able to fetch one line at a time.
Given the access pattern: 0, 4, 8, 12, 16, 20, 24, 28, 32, 36, 40, …, where each access is a 4B word.
a. What is the miss rate? (4)
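One way to sanity-check the answer is to just simulate the cache. This is a sketch under my reading of the problem (byte-style addresses as given in the access pattern, 64kB direct-mapped cache with 32B lines, streaming through the full 128kB data set once), not necessarily how the course wants it worked out:

```python
CACHE_SIZE = 64 * 1024   # 64kB data cache
LINE_SIZE = 32           # 32B lines
NUM_LINES = CACHE_SIZE // LINE_SIZE
DATA_SET = 128 * 1024    # 128kB streamed once
WORD_SIZE = 4            # each access is a 4B word

tags = [None] * NUM_LINES  # one tag slot per line (direct-mapped)
misses = accesses = 0

for addr in range(0, DATA_SET, WORD_SIZE):  # 0, 4, 8, 12, ...
    accesses += 1
    line_addr = addr // LINE_SIZE       # which memory line this word is in
    index = line_addr % NUM_LINES       # direct-mapped: line -> one cache slot
    tag = line_addr // NUM_LINES        # remaining high bits
    if tags[index] != tag:
        misses += 1                     # miss: fetch the whole 32B line
        tags[index] = tag

print(misses / accesses)  # 0.125
```

Intuitively: each 32B line holds 8 of the 4B words, and a streaming pattern never revisits a line, so only the first word of each line misses. That's 1 miss per 8 accesses, i.e. a 12.5% miss rate, and the cache capacity doesn't matter here because nothing is ever reused.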
Smarter than your average bear.