    Mediawiki Upgrade Tutorial with Spamblacklist patch


    Revision as of 18:33, 18 December 2005

    Please drop a line on the talk page with feedback or comments on this page.

    ===What for?===

    Maybe the Wikipedia/MediaWiki developers are taking care of this in the next version of MediaWiki, but in the meantime smaller wikis are getting totally blasted with spam. This wears the wiki maintainers down and stifles creativity and constructiveness, since most of their energy is spent on repetitive rollbacks to non-spammed versions and on IP blocking, which is ineffective because the attacks come from multiple IPs. So this wikipage has been created to help those folks who are looking to control the amount of spam on their wiki.

    ===Requirements===

    Please improve this section.

    ===SpamBlacklist===

    Please improve this section.

    ===Backup before you Upgrade===

    Making a backup of a MediaWiki installation is basically a three-step process: copying the regular files, making a database backup, and sending both to a remote backup location.

    Copy "w" directory

    If you follow the standard wikipedia way to hide "index.php" in URLs and your webserver's document root is /var/www/, you will have the wiki physically installed in /var/www/w and an alias to /var/www/wiki in your apache config. Hence, something like:

    cp -r /var/www/w /home/backup/w_20051223
    

    would be sufficient.
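
    The alias mentioned above is normally a single directive in the Apache configuration. As a rough sketch (the exact file and paths depend on your distribution, so adjust as needed):

    # in httpd.conf or an included vhost file:
    # maps /wiki/Page_title URLs onto the real installation in /var/www/w
    Alias /wiki "/var/www/w/index.php"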

    ==== Dump Database ====

    To make a dump of the MySQL database, use the "mysqldump" command on a console:

    mysqldump -u root -p wikidb > wikidb_20051223.sql
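
    Should you ever need to restore from such a dump, the reverse direction looks like this (assuming the wikidb database already exists on the server you are restoring to):

    mysql -u root -p wikidb < wikidb_20051223.sql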
    

    To save disk space and bandwidth you can now compress the dump file, e.g. into a .tar.gz archive:

    tar zcvf wikidb_20051223.sql.tar.gz wikidb_20051223.sql
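
    To unpack the archive again later (for example before restoring the dump on another machine), the matching command is:

    tar zxvf wikidb_20051223.sql.tar.gz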
    

    ==== Copy to remote location ====

    Finally, copy the files to a remote server, e.g. via scp:

    scp wikidb_20051223.sql.tar.gz user@backupserver.com:/home/user/backups/
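
    Putting the three steps together, a small shell script along these lines ties the whole backup into one run. It has to be run interactively, since mysqldump -p prompts for the password, and the paths, database name and backup host are just the placeholder values from the examples above:

    #!/bin/sh
    # Rough sketch of a MediaWiki backup run: copy the files, dump the database, ship both off-site.
    DATE=`date +%Y%m%d`
    BACKUPDIR=/home/backup
    # 1. copy the wiki files
    cp -r /var/www/w $BACKUPDIR/w_$DATE
    # 2. dump the database and compress the dump
    mysqldump -u root -p wikidb > $BACKUPDIR/wikidb_$DATE.sql
    tar zcvf $BACKUPDIR/wikidb_$DATE.sql.tar.gz -C $BACKUPDIR wikidb_$DATE.sql
    # 3. send everything to the remote backup server
    scp -r $BACKUPDIR/w_$DATE $BACKUPDIR/wikidb_$DATE.sql.tar.gz user@backupserver.com:/home/user/backups/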
    
    Please improve this section.