sanbat at gmail.com
Fri Sep 21 06:42:09 UTC 2007
> There is one thing computers do really well, and that is iterating through
> lists. I would suggest that getting the list from memcache and then
> processing it is not going to be any slower than getting a list of results
> from a heavy database query and iterating through that. Especially if you
> use get_multi instead of thousands of individual gets (that takes a little
> extra programming to detect an element that wasn't in the cache, but you
> might get better overall response times).
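A minimal sketch of the get_multi-with-miss-detection pattern described above, using a plain dict as a stand-in for a memcached client (a real client exposes a similar get_multi; the names fetch_from_db and the "row:N" key scheme are hypothetical):

```python
cache = {}  # stand-in for the memcached server

def get_multi(keys):
    """Return only the keys that were found, like memcached's get_multi."""
    return {k: cache[k] for k in keys if k in cache}

def fetch_from_db(ids):
    # hypothetical database lookup for just the missing rows
    return {i: {"id": i, "name": "row-%d" % i} for i in ids}

def get_rows(ids):
    keys = ["row:%d" % i for i in ids]
    found = get_multi(keys)
    # detect the elements that were not in the cache
    missing = [i for i in ids if "row:%d" % i not in found]
    if missing:
        for i, row in fetch_from_db(missing).items():
            cache["row:%d" % i] = row   # warm the cache for next time
            found["row:%d" % i] = row
    return [found["row:%d" % i] for i in ids]

rows = get_rows([1, 2, 3])  # one multi-get, one DB hit for the misses
```

The extra bookkeeping is the `missing` list; after the first call, subsequent calls for the same ids are served entirely from the cache.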
> Of course, it's always useful to store your PROCESSED data in memcache.
> So after you've iterated through your list and formatted or ordered the
> data into some form that is useful to you, cache it. Then next time, use
> the cached version and you don't have to iterate again. If your data
> changes frequently, figure out the delta and use it as your expiry time.
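The cache-the-processed-result idea above might look like this sketch, again with a dict standing in for memcached (the report key, the 300-second delta, and get_sorted_report are all assumed for illustration):

```python
import time

cache = {}  # maps key -> (value, expires_at); stand-in for memcached

def cache_set(key, value, expiry):
    cache[key] = (value, time.time() + expiry)

def cache_get(key):
    entry = cache.get(key)
    if entry and entry[1] > time.time():
        return entry[0]
    return None  # missing or expired

def get_sorted_report(raw_rows):
    key = "report:sorted"
    processed = cache_get(key)
    if processed is None:
        # do the expensive iteration/ordering once...
        processed = sorted(raw_rows, key=lambda r: r["name"])
        # ...and if the data changes roughly every 5 minutes,
        # use that delta as the expiry
        cache_set(key, processed, expiry=300)
    return processed

report = get_sorted_report([{"name": "b"}, {"name": "a"}])
```

Until the expiry passes, every caller gets the already-sorted list without re-iterating the raw data.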
I suppose the main problem is this: if I wanted to store the entire list, I
would have to fetch the entire dataset from the DB, whereas if I were doing
it via SQL queries, I would use paging.
Does this mean that the first time a user logs in and interacts with this
list, I should fetch the entire set instead of just page 1, and then use
that full set for paging and other organizing?
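That approach can be sketched as follows: run the heavy query once on a cache miss, store the full id list, and serve each page by slicing it (the key name, page size, and fetch_all_ids_from_db are hypothetical):

```python
PAGE_SIZE = 25
cache = {}  # stand-in for memcached

def fetch_all_ids_from_db():
    # hypothetical heavy query, run only when the list isn't cached
    return list(range(1000))

def get_page(page):
    ids = cache.get("user:123:list")
    if ids is None:
        ids = fetch_all_ids_from_db()
        cache["user:123:list"] = ids  # with memcached, set with an expiry
    start = (page - 1) * PAGE_SIZE
    return ids[start:start + PAGE_SIZE]

first_page = get_page(1)  # triggers the one full fetch
second_page = get_page(2)  # served by slicing the cached list
```

The trade-off is exactly the one raised in the question: the first request pays for the whole dataset, but every later page (and any reordering) works on the cached set without touching the database.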