<blockquote class="gmail_quote" style="PADDING-LEFT: 1ex; MARGIN: 0px 0px 0px 0.8ex; BORDER-LEFT: #ccc 1px solid">There is one thing computers do really well, and that is iterating through lists.<br>I would suggest that getting the list from memcache and then processing it is not going to be any slower than getting a list of results from a heavy database query and iterating through that, especially if you use get_multi instead of thousands of individual gets (that takes a little extra programming to detect an element that wasn't in the cache, but you may get better overall response times).
<br><br>Of course, it's always useful to then store your PROCESSED data in memcache. So after you've iterated through your list and formatted or ordered the data in whatever shape is useful for you, cache it. Then next time, use the cached version and you don't have to iterate it again. If your data changes frequently, figure out the typical interval between changes and use that as your expiry time.
</blockquote>
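<div> </div>
<div>To make sure I'm reading that right, the get_multi part would look roughly like this? (Just a sketch using App Engine's Python memcache API; the key scheme and the fetch_from_db helper are placeholders of mine, not from your post.)</div>
<pre>from google.appengine.api import memcache

def get_items(item_ids):
    keys = ['item:%d' % item_id for item_id in item_ids]
    cached = memcache.get_multi(keys)  # one round trip for all keys

    items, misses = {}, {}
    for item_id, key in zip(item_ids, keys):
        if key in cached:
            items[item_id] = cached[key]
        else:
            # This element wasn't in the cache: fall back to the
            # datastore and remember it so we can backfill below.
            value = fetch_from_db(item_id)  # placeholder DB lookup
            items[item_id] = value
            misses[key] = value

    if misses:
        memcache.set_multi(misses)  # backfill all misses in one call
    return items
</pre>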
<div> </div>
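<div>And for caching the processed data, I assume you mean something like this (the key name, the build_report helper, and the 600-second interval are made-up examples):</div>
<pre>from google.appengine.api import memcache

REPORT_KEY = 'report:frontpage'
REFRESH_SECONDS = 600  # roughly how often the underlying data changes

def get_report():
    report = memcache.get(REPORT_KEY)
    if report is None:
        report = build_report()  # placeholder: iterate, format, order
        # Expire after the expected change interval, so staleness
        # is bounded by REFRESH_SECONDS.
        memcache.set(REPORT_KEY, report, time=REFRESH_SECONDS)
    return report
</pre>
<div> </div>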
<div>I suppose the main problem is this: if I wanted to store the entire list in memcache, I would have to fetch the entire dataset from the DB, whereas if I were doing it via SQL queries, I would only fetch one page at a time.</div>
<div> </div>
<div>Does this mean that, the first time a user logs in and interacts with this list, I should fetch the entire set instead of, say, just page 1, and then use that cached set for paging and other organizing?</div>
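<div> </div>
<div>In other words, something like the following? (Again just a sketch; LIST_KEY, PAGE_SIZE, and the fetch_all_items helper are placeholders.)</div>
<pre>from google.appengine.api import memcache

LIST_KEY = 'items:all'
PAGE_SIZE = 20

def get_page(page_number):
    items = memcache.get(LIST_KEY)
    if items is None:
        # The first request pays for the full fetch; every later
        # page is just a slice of the cached copy.
        items = fetch_all_items()  # placeholder: full datastore query
        memcache.set(LIST_KEY, items, time=600)
    start = page_number * PAGE_SIZE
    return items[start:start + PAGE_SIZE]
</pre>
<div>One thing I'm wary of here is that memcache caps a single value at 1 MB, so a very large list might not fit under one key.</div>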