How to delete lots of related keys at once

Jarom Severson j.severson at yahoo.com
Wed Sep 19 17:43:06 UTC 2007


For what it's worth, here is the solution I came up
with and am now using successfully on our site (which
gets about 300k page views per day, so it's had its
trial by fire, so to speak).

Sorry there are so few comments in this file; I
haven't had time to really clean it up yet.

Once you've read through the file, here is an example
of how I use it in the ORM to cache a query's results,
and another of how I bust the cache when a new
article is inserted:

// Create our memcache key based on our ORM function name and parameters
$strCacheKey = MyMemcache::QueryToKey(
    "Article->LoadAllEnabled(" . serialize($objOptionalClauses) . ")");

// Check memcache
if (false === ($objArticleArray = MyMemcache::G()->GetQuery($strCacheKey))) {
    // If it's not in memcache, do the work
    $objArticleArray = Article::QueryArray(
        QQ::AndCondition(
            QQ::Equal(QQN::Article()->User->IsEnabled, true),
            QQ::Equal(QQN::Article()->IsBuried, false),
            QQ::Equal(QQN::Article()->IsLive, true)
        ),
        $objOptionalClauses
    );
    // Stick the result set into memcache and record the key in the "article" set
    MyMemcache::G()->SetQuery($strCacheKey, $objArticleArray, 'article');
}


// When a new article is posted, or one is deleted, etc.
MyMemcache::G()->DeleteBySet(
    array('article', MyMemcache::GetUserSetName($intArticleUserId)));
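
In case the attachment gets scrubbed along the way, here is a rough
sketch of the idea behind SetQuery()/DeleteBySet(): every cached key is
also recorded in an index entry for its set, and deleting a set just
walks that index. This is only an illustration against the stock pecl
Memcache extension (the class name, host, and TTLs here are made up);
the attached MyMemcache class may differ in detail.

<?php
// Illustrative sketch only: one way to group cache keys into named "sets"
// so DeleteBySet() can drop a whole group at once.  Uses the stock pecl
// Memcache extension; host, TTLs, and the class name are made up here.
class MyMemcacheSketch {
    protected $objMemcache;

    public function __construct() {
        $this->objMemcache = new Memcache();
        $this->objMemcache->connect('127.0.0.1', 11211);
    }

    // Turn a long query description into a short, safe cache key
    public static function QueryToKey($strQuery) {
        return 'qry_' . md5($strQuery);
    }

    public function GetQuery($strKey) {
        return $this->objMemcache->get($strKey);
    }

    public function SetQuery($strKey, $mixValue, $strSet) {
        // Store the value, then record the key in the set's index entry
        $this->objMemcache->set($strKey, $mixValue, 0, 3600);
        $arrKeys = $this->objMemcache->get('set_' . $strSet);
        if (!is_array($arrKeys)) {
            $arrKeys = array();
        }
        $arrKeys[$strKey] = true;
        $this->objMemcache->set('set_' . $strSet, $arrKeys, 0, 3600);
    }

    public function DeleteBySet($arrSets) {
        // Delete every key recorded under each set, then the index itself
        foreach ($arrSets as $strSet) {
            $arrKeys = $this->objMemcache->get('set_' . $strSet);
            if (is_array($arrKeys)) {
                foreach (array_keys($arrKeys) as $strKey) {
                    $this->objMemcache->delete($strKey);
                }
            }
            $this->objMemcache->delete('set_' . $strSet);
        }
    }
}
?>

One caveat with this approach: the read-modify-write on the set index
isn't atomic, so under heavy concurrent writes a key can occasionally
miss the index and will only fall out of the cache when its own TTL
expires.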

Hopefully that helps.

Jay


--- Dustin Sallings <dustin at spy.net> wrote:

> 
> On Sep 19, 2007, at 7:02, K J wrote:
> 
> > I'm facing the same problem here.  From the responses I'm guessing
> > that most of you think it's better to cache each individual
> > "article" and perhaps the article titles for display.  Listings
> > still hit the database directly.  The only savings would be that
> > the app wouldn't have to do another SQL query to fetch the article
> > titles/links/data, as it can get them from the cache.
> >
> > However, is there no way to cache these pages effectively?  For
> > instance, what if, say, I set a default expiration time for
> > anything that's not on page 1?  Then whenever an article gets
> > inserted, the first 10 pages are purged, while pages 11 onwards
> > are left to expire on their own.
> >
> > What do you guys think of this solution?
> 
> 	Have you looked at the proposed solutions (i.e. tags and regex)?
> 
> 	In general, having multiple copies of the same data seems like a
> bad idea.  When you cache the article, you're avoiding the DB hit.
> When you cache the page, you're memoizing a template render.  These
> are solving different problems.
> 
> On Sep 19, 2007, at 7:07, K J wrote:
> 
> > For instance, I'd like to cache my search pages, as they are really
> > database/CPU intensive.  On a social networking site, for example,
> > someone could be searching based on sex, height, religion, hobbies,
> > location, and other combined criteria.  This sort of query hits the
> > database hard, and it wouldn't take too many of those to crash the
> > db server.
> >
> > In this case wouldn't it be great to cache every single search that
> > comes in, so that on a duplicate search the app wouldn't have to
> > hit the database again?
> 
> 	What do you expect the hit ratio to be on this?  What aspect of
> your search is intensive?  Are you perhaps pulling too much data out
> with your search results, such that you're pulling out information on
> each user that you could look up from the cache?
> 
> -- 
> Dustin Sallings
> 
> 
> 
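
Following up on Dustin's last question above: one way to cache the
search itself without duplicating the per-user data is to cache only
the list of matching user IDs under a key built from the normalized
search criteria, and then pull each user record from its own cache
entry. The sketch below is purely illustrative; RunSearchQueryForIds()
and User::LoadById() are stand-ins, not anything from this thread or
from MyMemcache.

<?php
// Purely illustrative: cache a heavy search as a list of user IDs keyed by
// the normalized criteria, then fetch each user record from its own cache
// entry.  RunSearchQueryForIds() and User::LoadById() are stand-ins.
function SearchUsers(Memcache $objMemcache, $arrCriteria) {
    ksort($arrCriteria);  // normalize so equivalent searches share a key
    $strSearchKey = 'search_' . md5(serialize($arrCriteria));

    $arrUserIds = $objMemcache->get($strSearchKey);
    if (!is_array($arrUserIds)) {
        // The expensive query runs only on a cache miss and returns IDs only
        $arrUserIds = RunSearchQueryForIds($arrCriteria);  // stand-in
        $objMemcache->set($strSearchKey, $arrUserIds, 0, 600);
    }

    // Per-user records are cached once and shared by every search
    $arrUsers = array();
    foreach ($arrUserIds as $intId) {
        $objUser = $objMemcache->get('user_' . $intId);
        if (false === $objUser) {
            $objUser = User::LoadById($intId);  // stand-in ORM call
            $objMemcache->set('user_' . $intId, $objUser, 0, 3600);
        }
        $arrUsers[] = $objUser;
    }
    return $arrUsers;
}
?>

The ID-list entries are the part that would benefit from the same set
trick as the articles (drop them all when search-relevant fields
change), and the per-user lookups are, I think, what Dustin is getting
at about not dragging all that data through the search query itself.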
-------------- next part --------------
A non-text attachment was scrubbed...
Name: MyMemcache.zip
Type: application/x-zip-compressed
Size: 1713 bytes
Desc: 506604995-MyMemcache.zip
Url : http://lists.danga.com/pipermail/memcached/attachments/20070919/93870816/MyMemcache.bin

