Multiget/intelligent generic PHP wrapper function... thoughts/advice wanted

mike mike503 at gmail.com
Fri Nov 2 22:11:58 UTC 2007


On 11/2/07, Dustin Sallings <dustin at spy.net> wrote:

>        I can somewhat understand what you're saying, but what you call a
> rule of thumb, many would call ``premature optimization.''  You're
> wanting to rewrite small loops between network service calls in C and
> add complexity to an API because you are afraid that it *might* be
> too slow in some of the least likely cases.
>
>        Could you at least provide more than a rule of thumb before
> suggesting API changes?

Fair enough.

>        I'd strongly recommend you to use bindings *before* you find out
> you're wrong about the control you're exercising.  Someday you'll be
> entering data for Little Bobby Tables and everything will go terribly
> wrong.  The price of protection is *really* low, and in many cases
> can make things much faster.

By parameter binding, are you talking about using prepared statements
in the SQL queries?

I can't remember any of my applications ever suffering from SQL
injection. I always use mysql_escape_string, the PHP filter functions
on POST and GET data, and type checking and bounds checking for
numeric input.

That's not to say it couldn't happen, but it's lower on my priority
list and something the language already provides for. I was looking to
address the cache stuff here.
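For what it's worth, here's roughly the distinction as I understand it
(table names, column names and connection details are made up):

// Roughly what I do now: filter/validate, then escape or cast before building SQL.
$user_id = filter_input(INPUT_GET, 'user_id', FILTER_VALIDATE_INT);
if ($user_id === false || $user_id === null) {
    exit('bad user_id');
}
$sql = "SELECT * FROM users WHERE user_id = " . (int)$user_id;

$name = mysql_escape_string($_POST['name']);
$sql2 = "SELECT * FROM users WHERE name = '$name'";

// The binding approach, if I'm reading you right (PDO prepared statement):
$pdo  = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');
$stmt = $pdo->prepare('SELECT * FROM users WHERE user_id = ?');
$stmt->execute(array($user_id));
$row  = $stmt->fetch(PDO::FETCH_ASSOC);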

>        You mean to say you have to copy and paste that entire thing for
> every different type of object you'll want to cache?  Could you not
> achieve something closer to what I wrote?  My function will work for
> all variations of cache objects without modification.

I do have a generic cache_get() wrapper; I just plan on having _set,
_get, etc. functions for each type of data interaction. In theory I
could pass the function name into cache_get() as a callback, but I do
like having control: sometimes a function like user_get needs some
additional work, not just a raw SELECT * FROM table WHERE user_id=$id.
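Stripped down, the current non-pipelined shape looks roughly like this
($mc is a pecl/memcache object; user_load() and db_fetch_user() are
stand-ins for the real query code):

// Check memcached first, fall back to a loader callback on a miss,
// then repopulate the cache.
function cache_get($mc, $key, $loader, $ttl = 300) {
    $value = $mc->get($key);
    if ($value !== false) {
        return $value;                          // cache hit
    }
    $value = call_user_func($loader, $key);     // runs the SELECT, etc.
    if ($value !== false) {
        $mc->set($key, $value, 0, $ttl);
    }
    return $value;
}

// Per-type wrapper, so user_get() can still do its extra massaging.
function user_load($key) {
    $user_id = (int)substr($key, strlen('user:'));
    // SELECT * FROM table WHERE user_id=$user_id, plus whatever extra work
    return db_fetch_user($user_id);             // placeholder
}

function user_get($mc, $user_id) {
    return cache_get($mc, 'user:' . (int)$user_id, 'user_load');
}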

>        ...but you're working really hard to optimize for a case that you
> haven't proven causes any issues in your application.
>
>        That is to say, if PHP is so incredibly bad at iterating a small
> array that you would work this hard to avoid it, why would anyone
> ever use it?

I think PHP is already fast, but obviously no matter what language it
is, if the code is trying to work with too much data or is written
like crap, it's going to be slow.

I already have a very nice generic, non-pipelining function set
written. This thread was aimed at moving it toward
pipelineable/multiget-compatible code, and it seemed like having to do
multiple iterations and create two or more arrays was overkill, once I
noticed the key prefixes were complicating things.
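The direction I'm aiming for is roughly this (again $mc is
pecl/memcache, and the 'user:' prefix and db_fetch_users() are
placeholders): build the prefixed key list once, do a single multi-get,
map hits back to the original ids, and only hit the database for the
misses.

function users_get($mc, array $user_ids) {
    $keys = array();
    foreach ($user_ids as $id) {
        $keys['user:' . (int)$id] = (int)$id;   // prefixed key => original id
    }

    $cached = $mc->get(array_keys($keys));      // pecl/memcache multi-get
    $result = array();
    $misses = array();

    foreach ($keys as $key => $id) {
        if (is_array($cached) && isset($cached[$key])) {
            $result[$id] = $cached[$key];
        } else {
            $misses[] = $id;
        }
    }

    if ($misses) {
        // one SELECT ... WHERE user_id IN (...) for all the misses
        foreach (db_fetch_users($misses) as $id => $row) {
            $mc->set('user:' . $id, $row, 0, 300);
            $result[$id] = $row;
        }
    }

    return $result;
}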

