Sanity check please! PHP/memcached strategy

mike mike503 at gmail.com
Fri Oct 26 23:03:16 UTC 2007


I have the code below, and this has been my "scholastic" approach to
adding in a cache layer seamlessly. By caching each row I have
granular information that I can invalidate: an asset_set($entry_id)
function invalidates (and re-caches) the cache entry for that row.

I know that people use this strategy (or at least part of it) with
success, and I've seen some code samples/explanations that roughly map
to it. Still, am I insane here? If a page has 50 assets, that is 50
memcached calls.

Now, I know some clients do request pipelining. In this case, would
the PECL/memcache module issue 50 individual requests, or could it
pipeline some of them into a few multigets grouped by server?

Any information is helpful. I'm trying to build this to scale from the
start and want an appropriate caching strategy...

function asset_get($entry_id) {
        $key = "assets:{$entry_id}";
        if (!($r = cache_get($key))) {
                // cache miss: fall back to the database
                $q = db_query("SELECT individual_info FROM entries
WHERE entry_id=" . intval($entry_id));
                if (db_numrows($q) == 1) {
                        $r = db_rows_assoc($q);
                        cache_set($key, $r);
                }
                db_free($q);
        }
        return $r; // false if the row does not exist
}

function asset_list() {
        $return = array();

        $q = db_query("SELECT entry_id FROM entries ORDER BY submitted DESC");
        while (list($entry_id) = db_rows($q)) {
                $return[] = asset_get($entry_id);
        }
        db_free($q);

        return $return;
}
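
For comparison, here is the batched variant I have in mind, assuming
the PECL/memcache Memcache::get() form that accepts an array of keys
and returns only the keys it found. cache_get_multi() is a
hypothetical wrapper around that call, in the same style as my other
cache_* helpers — just a sketch, not tested:

function asset_list_batched() {
        $return = array();

        // Collect all entry_ids first instead of fetching one at a time.
        $q = db_query("SELECT entry_id FROM entries ORDER BY submitted DESC");
        $ids = array();
        while (list($entry_id) = db_rows($q)) {
                $ids[] = $entry_id;
        }
        db_free($q);

        // Build the "assets:<id>" keys and fetch them in one multiget
        // (one request per server) rather than 50 individual gets.
        $keys = array();
        foreach ($ids as $id) {
                $keys[$id] = "assets:{$id}";
        }
        $cached = cache_get_multi(array_values($keys)); // wraps Memcache::get(array)

        foreach ($ids as $id) {
                if (isset($cached[$keys[$id]])) {
                        $return[] = $cached[$keys[$id]];
                } else {
                        // miss: falls back to the DB and re-caches the row
                        $return[] = asset_get($id);
                }
        }

        return $return;
}

That would turn the common case (everything cached) into one memcached
round trip per server, with the per-row asset_get() only covering the
misses.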

function asset_set($entry_id, $stuff) {
        $key = "assets:{$entry_id}";
        cache_del($key);
        // ... do the updates and stuff here. Maybe issue an
        // asset_get($entry_id) after, to refill the cache with the
        // updated info ...
}


More information about the memcached mailing list