>From: Juhana Sadeharju <kouhia@email-addr-hidden>
>>
>>Definitely take a look at mediawiki: http://www.mediawiki.org/
>
>Many wikis serve both the actual wiki pages and the control pages
>from the same directory. Wget may then download multiple GBs(!!) of
>control pages instead of the 200 MB of actual wiki pages.
Because wget recently got trapped by exactly such a wiki, I have
real figures:
- The documentation itself is about 20 MB (including duplicate files, I'm sure)
- The docu/index.php/ directory, with Special:Recentchangeslinked/ and
  similar subdirectories, is 63 MB
- The whole wiki at docu/ is 13.5 GB, spread over 1,600,000 files
20 MB vs 13.5 GB makes me think the wiki is not built on the best
technology. The download software could be better as well; I will mail
my suggestions to the wget list.
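For what it's worth, wget can already be told to skip most MediaWiki
control URLs with --reject-regex (available in wget 1.14 and later).
The sketch below is untested against the wiki in question; the URL is
a placeholder and the regex is only a starting point:

```shell
# Assumption: control pages are index.php query variants, Special: pages,
# edit/history views, and old revisions. Adjust the pattern per wiki.
REJECT='index\.php\?|Special:|action=(edit|history)|oldid='

# Mirror only the article pages (placeholder URL, not run here):
#   wget --recursive --level=inf --no-parent \
#        --reject-regex "$REJECT" http://example.org/docu/

# Sanity-check the filter against sample URLs:
echo 'http://example.org/docu/index.php?title=X&action=edit' \
  | grep -Eq "$REJECT" && echo filtered
echo 'http://example.org/docu/Main_Page' \
  | grep -Eq "$REJECT" || echo kept
```

This would have kept the 63 MB of Special:Recentchangeslinked/ pages
(and most of the 13.5 GB) from being fetched at all.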
Juhana
--
http://music.columbia.edu/mailman/listinfo/linux-graphics-dev for developers of open source graphics software

Received on Sat Jul 8 20:15:03 2006
This archive was generated by hypermail 2.1.8 : Sat Jul 08 2006 - 20:15:03 EEST