Large file problem/Split file

Eric Lambrecht eml at guba.com
Wed Jul 26 23:48:12 UTC 2006


komtanoo.pinpimai at livetext.com wrote:
> I got an "out of memory!" error when inserting files of several hundred
> megabytes into MogileFS, so I googled for a while and found the issue had
> already been discussed:
> http://lists.danga.com/pipermail/mogilefs/2005-October/000199.html

Was mogstored dying on you or was it your client?

> There are many files in my system that are between 50MB and 700MB, so I'm
> thinking about splitting each of them into chunks of at most 5MB to avoid
> the injecting/replicating problems. My system has Perlbal on the front
> end and a mod_perl server at the back that calls get_paths for files and
> reproxies to mogstored via Perlbal. My question is: if I split those
> files into many parts, do I have to write a special webserver to assemble
> those files before sending them to Perlbal, so that Perlbal reproxies to
> my special webserver that does the assembly instead of redirecting to
> mogstored as it does for small files? Or does mogstored support
> assembling files?

Mogstored currently doesn't support assembling files, so, yeah, Perlbal 
will have to reproxy to your special webserver that does the assembly.
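For what it's worth, here's a rough sketch of what the two halves could 
look like. This is just an illustration, not tested code: the domain, 
tracker host, and the "key,N" chunk-naming scheme are all made up, and 
you'd need to record the chunk count somewhere yourself.

    #!/usr/bin/perl
    use strict;
    use warnings;
    use MogileFS::Client;
    use LWP::UserAgent;

    # Hypothetical setup -- domain, host, and key scheme are examples only.
    my $mogc = MogileFS::Client->new(
        domain => 'example.com',
        hosts  => [ '127.0.0.1:7001' ],
    );

    # Store a big file as 5MB chunks under keys "key,1", "key,2", ...
    sub store_chunked {
        my ($key, $path) = @_;
        open my $in, '<', $path or die "open $path: $!";
        binmode $in;
        my ($n, $buf) = (0, '');
        while (my $got = read($in, $buf, 5 * 1024 * 1024)) {
            $n++;
            my $fh = $mogc->new_file("$key,$n") or die $mogc->errstr;
            print $fh $buf;                  # streams this chunk out
            $fh->close or die "close of chunk $n failed";
        }
        close $in;
        return $n;   # caller should record the chunk count somewhere
    }

    # Assembly side: fetch each chunk's paths and stream it out in order.
    sub stream_chunks {
        my ($key, $count, $out) = @_;
        my $ua = LWP::UserAgent->new;
        for my $n (1 .. $count) {
            my @paths = $mogc->get_paths("$key,$n")
                or die "no paths for $key,$n";
            # :content_cb hands the body over in pieces instead of
            # slurping the whole chunk into memory.
            my $res = $ua->get($paths[0],
                               ':content_cb' => sub { print {$out} $_[0] });
            die "fetch of $key,$n failed" unless $res->is_success;
        }
    }

Your mod_perl assembly server (the reproxy target) could call something 
like stream_chunks with the request's output handle; the chunk count has 
to live somewhere it can find it, e.g. a database row or a small index key.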

> Sounds confusing?... How do you solve this problem?

Since the patch you reference in your email, we've had no problems 
storing or serving files up to 2GB in size. You just have to watch out 
for some of the Perl client API functions that want to read the entire 
file into memory - they'll kill your client pretty quickly.
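Concretely - and this is my own illustration, with placeholder keys and 
class names - in MogileFS::Client, get_file_data and store_content both 
hold the entire file in one Perl scalar, while store_file streams from 
disk and new_file gives you a filehandle you can print to incrementally:

    use strict;
    use warnings;
    use MogileFS::Client;

    # Hypothetical domain/host -- substitute your own.
    my $mogc = MogileFS::Client->new(
        domain => 'example.com',
        hosts  => [ '127.0.0.1:7001' ],
    );

    # Memory-hungry: both of these keep the whole file in one scalar.
    my $dataref = $mogc->get_file_data('bigfile');    # slurps into RAM
    $mogc->store_content('bigfile', 'normal', $$dataref);

    # Memory-friendly: store_file reads the file from disk in small
    # blocks rather than loading it all at once.
    $mogc->store_file('bigfile', 'normal', '/tmp/bigfile');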

Eric...
