mogilefs installation on sarge

Justin Azoff JAzoff at
Wed May 4 20:05:48 PDT 2005

So I installed mogilefs from cvs on a few sarge systems.  Took about an
hour from start to finish.  I took some notes :-)

perlbal does not build (a test fails).  Change the expected test count
from 16 to 18:
RCS file: /home/cvspub/wcmtools/perlbal/t/00use.t,v
-use Test::More tests => 16;
+use Test::More tests => 18;

Some of the required libraries are missing from the Perl Makefile
prerequisites.  Trivial to fix, but having them specified can't hurt:

mogilefs server requires DBD/
utils requires Compress/
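A quick way to probe whether the modules are actually installed (the
Debian package names in the comment are my guesses for sarge):

```shell
# Check for the Perl modules the server and utils want; prints ok/missing.
# On sarge the packages are probably libdbd-mysql-perl and libcompress-zlib-perl.
for mod in DBD::mysql Compress::Zlib; do
    if perl -M"$mod" -e1 2>/dev/null; then
        echo "$mod: ok"
    else
        echo "$mod: missing"
    fi
done
```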

Something in the debian/ dirs for mogilefs-utils and mogilefs-perl causes
the resulting .deb to contain only /usr/share/doc.  Looks like something
is up with PREFIX...

tracker needs port 7001 open
storage nodes need port 7500 open
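If you run a packet filter, the equivalent iptables rules would look
something like this (interface and policy details are up to you):

```shell
# Tracker (mogilefsd) listens on 7001, storage nodes (mogstored) on 7500
iptables -A INPUT -p tcp --dport 7001 -j ACCEPT
iptables -A INPUT -p tcp --dport 7500 -j ACCEPT
```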

Only the tracker needs to connect to mysql.

In mogtool I changed some "print" calls to "print STDERR" so that I can
pipe "extract foo - " to other programs without the status messages
corrupting the data stream.

All in all it seems to work nicely.  My only real gripe is that I'm not
sure why mogilefs should be in charge of the directory hashing (instead
of the application), and why the file keys can't be full path names...
(well, you can store the path name as the key, but it doesn't store it
that way on disk.)

My thinking is: say you have a few-node cluster, and your tracker
database breaks completely.  If the files were stored under their actual
names, you would have /var/mogdata/dev1/domain/path/to/file instead
of /var/mogdata/dev1/0/000/000/....  Then it would be possible to write
a script that loads the filenames from each host back into the tracker.
I think the 'big' files may complicate this.. There might be a good
reason for doing things this way that I just don't understand yet :-)
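A sketch of the recovery script I have in mind, assuming (contrary to how
MogileFS actually lays out files today) that keys were full path names
under each device directory:

```shell
# Hypothetical: walk a device directory and print the key each file would
# map back to; a real version would feed these to a tracker re-insert tool.
rebuild_keys() {
    dev=$1                       # e.g. /var/mogdata/dev1
    find "$dev" -type f | while read -r f; do
        echo "key: ${f#"$dev"}"  # -> /domain/path/to/file
    done
}
# rebuild_keys /var/mogdata/dev1
```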

Though I suppose you could have it do both... say:
if a key starts with '/' treat it as the filename
otherwise hash it and store it wherever...
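That rule could be as simple as the dispatch below (the paths and the
hashed layout are illustrative only, not what MogileFS actually does):

```shell
# '/'-prefixed keys map to a literal path; everything else gets hashed
# into a numbered bucket (cksum stands in for the real fid/hash scheme)
key_to_path() {
    case $1 in
        /*) echo "/var/mogdata/dev1$1" ;;
        *)  echo "/var/mogdata/dev1/0/000/000/$(printf %s "$1" | cksum | cut -d' ' -f1).fid" ;;
    esac
}
```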

-- Justin Azoff
-- Network Performance Analyst
