inject script for big(?) files
komtanoo.pinpimai at livetext.com
Fri Jul 28 14:55:52 UTC 2006
Well, I've found out that it's not just a matter of changing the sysread in
Danga::Socket to 8k: replication eats a lot of memory when facing 700M
files, and roughly 90% of replications fail. Here is what I needed to fix
to get it working (MogileFS from CVS):
1. Install the new Perlbal from svn and patch it with $self->{alive_time}
= time; for PUT requests (sketched below).
2. Fix Danga::Socket (the 8k read change mentioned above; also sketched below).
3. Quick-fix the http_copy subroutine in mogilefsd; it looks like it reads
the whole content of a file into memory before writing it to another
mogstored (see the streaming sketch below).
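
For reference, the Perlbal change in step 1 boils down to refreshing the
connection's alive_time whenever PUT body data arrives, presumably so a
long upload isn't treated as idle and reaped. The snippet below is only a
sketch of that idea; the handler name and surrounding code are my own
placeholders, not the actual Perlbal source:

    # Sketch only: handler name and structure are placeholders, not
    # Perlbal's real code.  The one-line patch is the alive_time refresh.
    sub event_read_put_body {            # hypothetical read handler for a PUT
        my $self = shift;
        $self->{alive_time} = time;      # keep the connection marked alive
        # ... read the next chunk of the request body and pass it on ...
    }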
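
The Danga::Socket change (step 2, and the 5MB-to-8k change Delfim mentions
below) is just about shrinking the per-call read size so the event loop
consumes a big upload in small pieces. A sketch of the pattern, with the
constant and the chunk consumer invented for illustration:

    # Illustration only: the constant name and handle_chunk() are invented.
    # The point is to read ~8k per call instead of a multi-megabyte block.
    use constant READ_CHUNK => 8 * 1024;     # was roughly 5MB before

    sub event_read {
        my $self = shift;
        my $bref = $self->read(READ_CHUNK);  # read() returns a scalar ref
        return $self->close('read_error') unless defined $bref;
        $self->handle_chunk($$bref);         # hypothetical per-chunk consumer
    }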
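
And for step 3, the fix amounts to streaming the file between mogstored
hosts in small chunks instead of slurping the whole body into memory
first. Below is a minimal, self-contained sketch of that pattern with
plain sockets and 8k reads; the hosts and paths are placeholders and this
is not the real http_copy code:

    # Minimal chunked GET -> PUT copy.  Placeholder hosts/paths, minimal
    # error handling, no status-line checking; a sketch, not mogilefsd.
    use IO::Socket::INET;

    my ($src_host, $src_path) = ('src-mogstored:7500', '/dev1/0/000/000/0000000123.fid');
    my ($dst_host, $dst_path) = ('dst-mogstored:7500', '/dev2/0/000/000/0000000123.fid');

    my $src = IO::Socket::INET->new(PeerAddr => $src_host) or die "src: $!";
    print $src "GET $src_path HTTP/1.0\r\n\r\n";

    # Skip the response headers, remembering Content-Length.
    my $clen;
    while (my $line = <$src>) {
        last if $line =~ /^\r?\n$/;
        $clen = $1 if $line =~ /^Content-Length:\s*(\d+)/i;
    }
    die "no Content-Length from source" unless defined $clen;

    my $dst = IO::Socket::INET->new(PeerAddr => $dst_host) or die "dst: $!";
    print $dst "PUT $dst_path HTTP/1.0\r\nContent-Length: $clen\r\n\r\n";

    # Copy 8k at a time so a 700M file never has to fit in memory.
    my $left = $clen;
    while ($left > 0) {
        my $n = read($src, my $buf, $left < 8192 ? $left : 8192);
        die "short read from source" unless $n;
        print $dst $buf;
        $left -= $n;
    }
    close $dst;
    close $src;

A real fix would also check both HTTP status lines and handle write
errors, but the bounded memory use is the part that matters here.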
On Fri, July 28, 2006 6:10 am, Delfim Machado wrote:
> Hi,
> this script lets you inject big files without splitting them.
>
> The out-of-memory problem is resolved by changing the 5MB block to an
> 8k block in Danga::Socket; I think this was already discussed here.
>
> Brad, when do you plan to release a new version?
>
>
> --
> Delfim Machado
>
>
> SMTP: delfim.c.machado at co.sapo.pt
> XMPP: delfim.c.machado at sapo.pt
>
>
> SPAM: ******.*.*******@**.****.**
-------------- next part --------------
A non-text attachment was scrubbed...
Name: mogilefsdpatch
Type: application/octet-stream
Size: 3196 bytes
Desc: not available
Url : http://lists.danga.com/pipermail/mogilefs/attachments/20060728/576d63a5/mogilefsdpatch.obj