binary protocol time representation

Dan Farina drfarina at
Thu Jul 12 12:22:54 UTC 2007

On Thu, 2007-07-12 at 14:11 +0200, Antonello Provenzano wrote:
> I can't speak for other environments, but converting a .NET "tick"
> representation of dates into UNIX epoch time is quite easy and
> requires just 2 lines of code.
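
(For reference, I'd guess the conversion he means looks roughly like
the C sketch below: .NET ticks are 100-nanosecond intervals counted
from 0001-01-01, so you subtract the tick count of the UNIX epoch and
rescale. The function name is mine.)

    #include <stdint.h>

    /* .NET ticks: 100 ns units since 0001-01-01 00:00:00.
     * 621355968000000000 ticks separate that origin from the
     * UNIX epoch (1970-01-01 00:00:00 UTC). */
    static int64_t ticks_to_unix_seconds(int64_t ticks)
    {
        return (ticks - INT64_C(621355968000000000))
               / INT64_C(10000000);
    }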

It may cost two lines, but the "N seconds in the future"
representation costs about as much. I suggest we just use that one
(sketched below) and ditch UNIX epoch time, for two reasons:

1) Protocol aesthetics
2) Avoiding timezone woes
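
Sketching what I mean (the names here are mine, not anything in the
protocol draft): the client puts a bare offset on the wire, and the
server anchors it against its own clock when the request arrives, so
no shared epoch, timezone, or clock agreement is needed.

    #include <stdint.h>
    #include <time.h>

    /* Server side of an "N seconds in the future" field. The
     * wire carries only the offset; the server's clock supplies
     * the anchor at the moment of processing. */
    static time_t expiry_deadline(uint32_t seconds_from_now)
    {
        return time(NULL) + (time_t)seconds_from_now;
    }

An epoch-based field would instead require both ends to agree on an
absolute clock, which is where the timezone and skew headaches come
from.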

> The main problem with the binary protocol could be another one
> instead: you should mainly focus on byte alignment, which under .NET
> is quite different from Java or C.
> To interoperate with systems that are not running under .NET (or COM
> using Interop), many times I need to represent bytes using integers,
> and in many cases the conversion from byte buffers to integers is not
> correct (.NET uses integers at 4, 8, 16 bytes) and the result corrupts
> the whole system.

This I don't understand. Do you mean issues of reading stuff off the
wire and getting strange numbers out of byte buffers? Perhaps endian
problems? Or something more sinister?
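
(To illustrate the sort of endian problem I'm guessing at, a
contrived C sketch: the same four wire bytes read back two ways.
Nothing .NET-specific here; it's the classic mistake of copying a
big-endian wire field into a little-endian host integer without
ntohl().)

    #include <stdint.h>
    #include <stdio.h>
    #include <string.h>
    #include <arpa/inet.h>

    int main(void)
    {
        /* Four bytes as they'd appear on the wire in network
         * (big-endian) order, representing the value 1. */
        const unsigned char wire[4] = { 0x00, 0x00, 0x00, 0x01 };
        uint32_t raw, fixed;

        memcpy(&raw, wire, sizeof raw);  /* host order: wrong on LE */
        fixed = ntohl(raw);              /* swapped back as needed  */

        /* A little-endian host prints naive=16777216 ntohl=1. */
        printf("naive=%u ntohl=%u\n", raw, fixed);
        return 0;
    }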

