Google Interview Question
Software Engineer / Developer
Use a simple LRU cache, or use memcached.
For a read operation, check the cache first:
if it hits the cache, return the value directly;
if it misses the cache, read from the database, then set the key-value pair back into the cache.
For a write operation, delete the key-value pair from the cache first, so that no one can hit stale data, then write the key-value pair to the database.
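The read/write flow above is the cache-aside pattern. A minimal sketch in Python, where the `db` and `cache` dicts are stand-ins for the real database and cache server (both names are illustrative, not part of the original answer):

```python
db = {}     # stand-in for the database
cache = {}  # stand-in for the cache server

def read(key):
    # Check the cache first; on a hit, return the value directly.
    if key in cache:
        return cache[key]
    # On a miss, read from the database and repopulate the cache.
    value = db.get(key)
    if value is not None:
        cache[key] = value
    return value

def write(key, value):
    # Invalidate the cache entry first so no reader sees stale data,
    # then write the new value to the database.
    cache.pop(key, None)
    db[key] = value
```

Note the ordering in `write`: invalidating before the database write keeps readers from caching a value that is about to change.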
Hope this helps.
You can design the cache server around a dynamic data structure such as a doubly linked list. While many threads should be able to read the cache concurrently, only one thread or process may write to it at a time; this can be achieved with the read-write locks in the POSIX threads API (pthread_rwlock). For the cache eviction policy, LRU is a good choice, and it can be implemented using counters.
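A hedged sketch of the LRU structure this answer describes, in Python. `OrderedDict` is backed by a doubly linked list internally, so moving a key to the end on access gives an O(1) recency update. Python's standard library has no reader-writer lock, so a single mutex guards both reads and writes here; in C you would take `pthread_rwlock_rdlock` for reads and `pthread_rwlock_wrlock` for writes instead. The class name and capacity parameter are illustrative:

```python
import threading
from collections import OrderedDict

class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self._data = OrderedDict()      # doubly linked list under the hood
        self._lock = threading.Lock()   # stand-in for a POSIX rwlock

    def get(self, key):
        with self._lock:
            if key not in self._data:
                return None
            self._data.move_to_end(key)  # mark as most recently used
            return self._data[key]

    def put(self, key, value):
        with self._lock:
            if key in self._data:
                self._data.move_to_end(key)
            self._data[key] = value
            if len(self._data) > self.capacity:
                self._data.popitem(last=False)  # evict least recently used
```

Usage: `get` and `put` both refresh a key's recency, and `put` evicts the oldest entry once capacity is exceeded.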
- jimit May 08, 2009