

Thread: Wiki site PDF dump?

  1. #1


    Hi All,

Does anyone know how I can make an offline version of the TOH wiki, so that I can use the wiki information without an internet connection?

PS: Yes, there are homes in the UK that have no internet ...

  2. #2
    will look into that as part of updating and improving BIKI ... no ETA atm.

    but thanks for pointing it out ...

  3. #3
    You can get the lite version of the COMREF which is pretty much the same thing.

    http://www.ofpec.com/ed_depot/index....=391&game=ArmA

  4. #4
    konyo
    On the wiki page, just press Ctrl+S and save it as an HTML webpage for viewing offline.



  5. #5
    fboes
    There is a tool called HTTrack that downloads whole sites for offline browsing. But you should be careful with this tool, or you might download… THE WHOLE INTERNETZ!!!!1

    Just kidding.
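For reference, a polite HTTrack invocation along these lines is possible; the depth, connection, and rate values below are illustrative assumptions, not recommendations from the thread. The sketch only prints the command (a dry run) rather than actually crawling:

```shell
# Sketch only: a throttled HTTrack mirror (all numbers are assumptions).
# -r4        limits recursion depth so you don't download "the whole internetz"
# -c2        limits simultaneous connections
# --max-rate caps bandwidth in bytes per second
# -O         sets the local output directory
CMD='httrack "http://community.bistudio.com/wiki/Take_On_Helicopters" \
  -O ./toh-wiki -r4 -c2 --max-rate=50000'
echo "$CMD"   # dry run: print the command instead of hitting the server
```

Dropping the `echo` and running the command directly would start the actual mirror, which, as noted below, you should only do with permission and during low-traffic hours.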

  6. #6
    Dwarden, the fact is that such a system seems (I'm not sure how good it is) to already exist, but it has to run server-side. See: http://www.mediawiki.org/wiki/Extension:Pdf_Export

    The only downside of this method is that it "only" goes down one link level; however, that would be fine, as it limits the bandwidth / server CPU and memory required, and it would be more than enough to capture the massive: http://community.bistudio.com/wiki/C...On_Helicopters

    Anyway, as usual, thanks for taking our feedback into account.

    Tankbuster, I know that PDF file, and part of the reason I made that post is that I was wondering how those guys did it... I did some pretty extensive research and found many ways to do this on Wikipedia (apparently they dump their XML database regularly for mirroring purposes), but not a single one that works on a plain MediaWiki install. Thanks anyway for your spot-on answer.

    konyo, while you are correct, you would need many hours and a lot of patience to rework the (roughly) 1,300 links that relate to TOH, as well as to download every single page... your proposal doesn't look feasible to me, but thanks.

    fboes, I know this tool, as well as wget and some others. Have you had any experience trying to download a wiki site? (This is not a rhetorical question.) On the other hand, I'm not sure this is an acceptable method (it is worse than a robot/web crawler); yes, I know you can limit the assigned bandwidth, but it is still a systematic dump... chances are that a firewall gets upset with so many repeated requests from the same IP... Thanks

    Isaac
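The wget route Isaac mentions can be made crawler-friendly in exactly the way fboes describes later: cap the recursion depth, wait between requests, and throttle the transfer rate. A sketch follows; the depth and rate numbers are illustrative assumptions, and it only prints the command (a dry run) rather than crawling:

```shell
# Sketch only: a throttled recursive wget mirror (all numbers are assumptions).
# --level=2       limits recursion depth
# --wait=2 + --random-wait  space requests out so the server isn't hammered
# --limit-rate    caps download bandwidth
# --convert-links rewrites links so the local copy is browsable offline
CMD='wget --recursive --level=2 --convert-links --page-requisites \
  --wait=2 --random-wait --limit-rate=50k --no-parent \
  "http://community.bistudio.com/wiki/Take_On_Helicopters"'
echo "$CMD"   # dry run: show the command instead of crawling
```

Even with these limits it is still a systematic dump, so asking the site operators for permission first, as suggested in this thread, remains the sensible route.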

  7. #7
    fboes
    We were working with HTTrack to make an offline version of a whole internet magazine (you had to submit your site as an offline CD-ROM for a contest… don't ask…). We set it up to play nice with the servers (because there was normal traffic at the same time), only allowed it to go down to a certain link depth, and told it to be really slow.

    So it can be done - but you have to be really careful with your settings. The best way to do this is to ask for permission first, and run it during low-traffic hours (like 2 am).

    And it is far less brutal than a Google bot swamping you with 10,000 requests in a matter of seconds.

  8. #8
    Isaac, that extension is obsolete and in beta; I aim first for stable extensions, or ones that are actively developed and tested as working ...

    see http://community.bistudio.com/wiki/Planned_Extensions

  9. #9
    fboes, well, it is always good to learn new things; if I finally decide to go ahead with this, I will ask first ... Oh, wait a second, now I can foresee some extra issues I didn't realize at the beginning. Let's say I finally get an HTML render dump of the wiki database... OK, so now I have an offline copy of the same information formatted in the very same way; in other words, I solved my lack of connection at home, but I just made an offline port of the contents... (OK, I have to give you that, you solved 50% of the problem).
    The reason I started this thread is to get a kind of structured PDF, so that I can print the wiki and have that information, black on white, when I'm in the editor, because I don't think it is practical to have more applications running in parallel...
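One way to bridge the gap described above, from an offline HTML dump to a printable, structured PDF, is to feed the saved pages to an HTML-to-PDF converter. A hypothetical sketch using wkhtmltopdf (the input file names are assumptions standing in for pages saved from the wiki; `toc` asks the tool to prepend a generated table of contents). As above, the sketch only prints the command:

```shell
# Sketch only: concatenate saved wiki pages into one PDF with wkhtmltopdf.
# The .html file names are placeholders for pages saved from the wiki.
CMD='wkhtmltopdf toc Take_On_Helicopters.html Missions.html toh-wiki.pdf'
echo "$CMD"   # dry run; running it for real requires wkhtmltopdf installed
```

This doesn't solve the crawling problem, but it turns whatever HTML dump you do obtain into the single printable document this thread is after.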

    Dwarden, thanks for the link, I didn't realize it was in beta... Sorry about that.

    [Me thinking out loud] How the heck did those guys make that wonderful PDF? Going link by link?
    Last edited by Isaac; Nov 9 2011 at 17:10.
