Caching Levels


No caching

Local Cache

  • Pros
    • No network latency
  • Cons
    • Requires RAM for the cache on the same machine, which is limited
      • A leak in the cache can kill the application (bounding the cache size helps; see the sketch below)
    • In most cases requires more CPU for garbage collection
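
One common mitigation for the RAM limit and the leak risk is to bound the cache size. A minimal sketch in Java - the class name and capacity are illustrative, and since LinkedHashMap is not thread-safe, a real server would wrap it with Collections.synchronizedMap or use a caching library:

    import java.util.LinkedHashMap;
    import java.util.Map;

    // Minimal sketch: a size-bounded LRU cache, one way to keep an in-process
    // cache from eating all the RAM on the machine. The 10_000 cap is an
    // arbitrary illustration value.
    public class BoundedLruCache<K, V> extends LinkedHashMap<K, V> {
        private final int maxEntries;

        public BoundedLruCache(int maxEntries) {
            super(16, 0.75f, true); // accessOrder = true -> LRU eviction order
            this.maxEntries = maxEntries;
        }

        @Override
        protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
            // Evict the least recently used entry once the cap is exceeded,
            // so the cache cannot grow without bound.
            return size() > maxEntries;
        }

        public static void main(String[] args) {
            BoundedLruCache<String, String> cache = new BoundedLruCache<>(10_000);
            cache.put("user:42", "{\"name\":\"Alice\"}");
            System.out.println(cache.get("user:42"));
        }
    }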

Local Cache: In-Process Cache for Immutable Objects

  • Pros
    • No (de)serialization needed.
    • No latency - the fastest of all possible caches.
  • Cons
    • Requires the app to be designed to work with immutable objects, which allows sharing them across multiple threads (see the sketch below).
    • Servers: possible only in thread-per-request servers (Java EE, IIS ASP.NET) and not possible in process-per-request servers (Apache, CGI).
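
A minimal sketch of such a cache in Java - names are illustrative: all request threads share one ConcurrentHashMap, and because the cached objects are immutable, get() can hand out the same instance to every thread with no copying and no (de)serialization:

    import java.util.concurrent.ConcurrentHashMap;

    // An immutable value object: all fields are final, no setters, so one
    // instance can be safely shared by any number of threads.
    public final class ImmutableUser {
        private final long id;
        private final String name;

        public ImmutableUser(long id, String name) {
            this.id = id;
            this.name = name;
        }

        public long getId() { return id; }
        public String getName() { return name; }
    }

    class UserCache {
        private static final ConcurrentHashMap<Long, ImmutableUser> CACHE =
                new ConcurrentHashMap<>();

        static ImmutableUser get(long id) {
            // computeIfAbsent is atomic, so the loader runs once per missing key.
            return CACHE.computeIfAbsent(id, UserCache::loadFromDb);
        }

        private static ImmutableUser loadFromDb(long id) {
            return new ImmutableUser(id, "user-" + id); // placeholder for a real DB lookup
        }

        public static void main(String[] args) {
            // Both calls return a reference to the SAME shared instance.
            System.out.println(UserCache.get(42L) == UserCache.get(42L)); // true
        }
    }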

Local Cache: In-Process Cache for Mutable Objects

  • Pros
    • Minimal latency - only deserialization and allocation time
  • Cons
    • (De)serialization penalty (see the sketch below)
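
A minimal sketch of where the penalty comes from, assuming plain Java serialization (names are illustrative): values are stored as bytes, and every read deserializes a fresh private copy, so a thread mutating its copy cannot corrupt another thread's view:

    import java.io.*;
    import java.util.concurrent.ConcurrentHashMap;

    class SerializingCache<K> {
        private final ConcurrentHashMap<K, byte[]> store = new ConcurrentHashMap<>();

        void put(K key, Serializable value) throws IOException {
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
                oos.writeObject(value); // pay the serialization cost once, on put
            }
            store.put(key, bos.toByteArray());
        }

        Object get(K key) throws IOException, ClassNotFoundException {
            byte[] bytes = store.get(key);
            if (bytes == null) return null;
            try (ObjectInputStream ois =
                     new ObjectInputStream(new ByteArrayInputStream(bytes))) {
                return ois.readObject(); // pay deserialization + allocation on every get
            }
        }

        public static void main(String[] args) throws Exception {
            SerializingCache<String> cache = new SerializingCache<>();
            cache.put("tags", new java.util.ArrayList<>(java.util.List.of("a", "b")));
            // Two reads return two distinct copies of the cached list.
            System.out.println(cache.get("tags") != cache.get("tags")); // true
        }
    }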

Local Cache: Out-of-Process Cache

  • Latency between processes (IPC/loopback round trip)
  • (De)serialization penalty (see the sketch below)
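
A typical example is a Redis (or Memcached) instance running on the same machine, reached over the loopback interface. A minimal sketch, assuming the Jedis client is on the classpath; the key, TTL, and payload are illustrative:

    import redis.clients.jedis.Jedis;

    public class LocalRedisCacheExample {
        public static void main(String[] args) {
            // Redis running on the SAME machine: a loopback round trip instead
            // of an in-process call, plus (de)serialization to strings/bytes.
            try (Jedis jedis = new Jedis("localhost", 6379)) {
                jedis.setex("user:42", 300, "{\"name\":\"Alice\"}"); // cache for 300s
                String cached = jedis.get("user:42");
                System.out.println(cached);
            }
        }
    }

In exchange for the IPC latency, the cached data no longer counts against the application's heap (no GC pressure) and survives application restarts.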

Distributed Cache

  • HUGE network latency
  • (De)serialization penalty
  • Scalable - capacity and throughput grow by adding nodes (see the consistent-hashing sketch below)
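
The scalability comes from partitioning keys across nodes, typically with consistent hashing so that adding or removing a node remaps only a small fraction of keys. A minimal sketch of the idea - node names, virtual-node count, and the hash mixing are illustrative, not any particular client's algorithm:

    import java.util.SortedMap;
    import java.util.TreeMap;

    public class ConsistentHashRing {
        private final SortedMap<Integer, String> ring = new TreeMap<>();
        private static final int VIRTUAL_NODES = 100; // replicas per node, for balance

        public void addNode(String node) {
            for (int i = 0; i < VIRTUAL_NODES; i++) {
                ring.put(hash(node + "#" + i), node);
            }
        }

        public String nodeFor(String key) {
            int h = hash(key);
            // The key is served by the first ring position at or after its
            // hash, wrapping around to the start of the ring if necessary.
            SortedMap<Integer, String> tail = ring.tailMap(h);
            return tail.isEmpty() ? ring.get(ring.firstKey()) : tail.get(tail.firstKey());
        }

        private static int hash(String s) {
            return s.hashCode() * 0x9E3779B9; // any reasonably uniform mix will do here
        }

        public static void main(String[] args) {
            ConsistentHashRing ring = new ConsistentHashRing();
            ring.addNode("cache-1:11211");
            ring.addNode("cache-2:11211");
            ring.addNode("cache-3:11211");
            System.out.println("user:42 -> " + ring.nodeFor("user:42"));
        }
    }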


http://syntx.io/in-process-caching-vs-distributed-caching/
http://openmymind.net/Do-More-In-Process-Caching/
http://java.dzone.com/articles/process-caching-vs-distributed

Still exploring this topic.
I don't really understand why many enterprise projects use Redis for DB query caching:
  • The same HUGE latency over the network (x10-x100 compared to RAM; rough arithmetic below)
  • The speedup is also not significant (x2-x5). This is because only simple DB queries can be put into the cache (otherwise you would have to solve a lot of invalidation issues), and simple DB queries are usually already in the DB's own cache.
  • On the other hand, it means fewer locks inside the DB.
  • On the other hand, local memory has limits, plus GC pressure.
  • On the other hand, Redis has other sweet functional benefits (see SO: Redis Cache vs Memory), but these are not related to performance.
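
Rough arithmetic behind the x10-x100 and x2-x5 figures above. All latency numbers below are order-of-magnitude assumptions, not measurements:

    public class LatencyNapkin {
        public static void main(String[] args) {
            double localCacheGetNs = 5_000;     // ~5 us: local cache hit incl. deserializing a small object (assumed)
            double redisNetworkNs  = 500_000;   // ~0.5 ms: round trip to Redis within a datacenter (assumed)
            double simpleDbQueryNs = 1_000_000; // ~1 ms: simple query served from the DB's own cache (assumed)

            System.out.printf("Redis over network vs local cache: ~x%.0f slower%n",
                    redisNetworkNs / localCacheGetNs);   // ~x100
            System.out.printf("Redis over network vs cached DB query: ~x%.1f faster%n",
                    simpleDbQueryNs / redisNetworkNs);   // ~x2
        }
    }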
