Gzhu at Ironplanet.com
Wed Jun 11 18:29:01 UTC 2008
In most cases, LRU should be good enough. If the 'important' items are used very often, they will naturally be kept in the cache. As for locking some rarely-used items in the cache, I cannot see how that justifies the cost of discarding other often-used items and constantly re-creating them as they continue to be referenced.
But if the feature is really wanted (identifying items not subject to LRU, discarded only after they expire), I agree with Dustin that it should not override the expiry parameter.
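A minimal sketch of the idea discussed here (hypothetical code, not memcached's actual implementation): pinned items are skipped by LRU eviction, but the expiry parameter still wins, as argued above.

```python
from time import monotonic
from collections import OrderedDict

class PinnedLRUCache:
    """Toy LRU cache where 'pinned' items survive eviction but not expiry."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.items = OrderedDict()  # key -> (value, expires_at, pinned)

    def set(self, key, value, ttl, pinned=False):
        self.items.pop(key, None)
        self.items[key] = (value, monotonic() + ttl, pinned)
        self._evict()

    def get(self, key):
        entry = self.items.get(key)
        if entry is None:
            return None
        value, expires_at, pinned = entry
        if monotonic() >= expires_at:   # expiry overrides pinning
            del self.items[key]
            return None
        self.items.move_to_end(key)     # mark as most recently used
        return value

    def _evict(self):
        # Evict the least-recently-used unpinned entry first; pinned
        # entries are only removed once their TTL has passed.
        while len(self.items) > self.capacity:
            for key, (_, expires_at, pinned) in self.items.items():
                if not pinned or monotonic() >= expires_at:
                    del self.items[key]
                    break
            else:
                break  # everything live is pinned; tolerate overflow
```

Note that `_evict` deliberately allows the cache to exceed capacity when every entry is pinned and unexpired, which illustrates the cost of pinning raised above: pinned items can crowd out frequently-used ones.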
From: memcached-bounces at lists.danga.com [mailto:memcached-bounces at lists.danga.com] On Behalf Of Dustin Sallings
Sent: Wednesday, June 11, 2008 9:31 AM
To: Reinis Rozitis
Cc: <memcached at lists.danga.com>
Subject: Re: item expiration
IMO, the idea does conflict a bit with the idea of a cache, at
least as I understand it.
Implementation-wise, it seems like another command to flag an
existing record with a particular LRU priority.
I can imagine that if you have items you don't want removed by LRU,
you may also have items you consider cheaper than others and would
prefer those discarded before the more expensive ones.
Likewise, it doesn't seem unreasonable to have items you want to
expire at a particular point in time, but not before it.
If you can imagine communicating a cost of invalidation to
memcached, you can see how careful use of such a thing could
potentially make for more efficient systems. For a finite list of
priorities, it could be implemented quite easily and efficiently.
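The "finite list of priorities" idea could be sketched as one LRU queue per priority level, draining the cheapest level first (a hypothetical illustration, not a proposed memcached patch):

```python
from collections import OrderedDict

class PriorityLRU:
    """Toy cache with a fixed set of priority levels; eviction removes
    the LRU entry from the lowest non-empty priority level first."""

    def __init__(self, capacity, levels=3):
        self.capacity = capacity
        self.queues = [OrderedDict() for _ in range(levels)]

    def _find(self, key):
        for q in self.queues:
            if key in q:
                return q
        return None

    def set(self, key, value, priority=0):
        q = self._find(key)
        if q is not None:
            del q[key]                 # allow re-prioritizing a key
        self.queues[priority][key] = value
        self._evict()

    def get(self, key):
        q = self._find(key)
        if q is None:
            return None
        q.move_to_end(key)             # recently used within its level
        return q[key]

    def _evict(self):
        size = sum(len(q) for q in self.queues)
        while size > self.capacity:
            for q in self.queues:      # lowest (cheapest) level first
                if q:
                    q.popitem(last=False)  # LRU within that level
                    size -= 1
                    break
```

With a small fixed number of levels, both lookup and eviction stay O(levels), which is why a finite priority list is cheap to implement.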
Dustin Sallings (mobile)
On Jun 11, 2008, at 5:48, "Reinis Rozitis" <roze at roze.lv> wrote:
>> RR the patch you pointed to sounds exactly what I'm looking for.
>> If you would not mind posting your latest version patch please.
> This is not my patch (all credits go to Paul G) but just a bit
> tweaked to apply for current (1.2.5) release.
> As to answering a few of Brad's comments:
> I think Paul just wanted to show a quick working proof of concept of
> the feature, which in real-world situations is pretty useful, rather
> than push the patch directly into the source. RAM is cheap, but in
> some cases you still can't deploy enough to satisfy all the web
> coders' needs, so instead of cycling the cache all the time, give the
> (client) software some logic and decision possibilities about which
> data is more important.
> It sure needs some love to match the current code style, but the
> question is whether the current developers are fine with the idea/
> option at all.