Manual:Backing up a wiki

Backing up your MediaWiki wiki is important if you do not want to lose data. This page gives a general overview of how to back up your wiki's data; for more specific needs, such as periodic backups, you will need to adapt the process yourself.

Overview

MediaWiki stores data in two places:

Database 
Pages and their content, users and their preferences, metadata, the search index, etc.
File system 
Software configuration files, customised skins, extensions, images (including deleted images), etc.

Consider making the wiki read-only before creating the backup - see $wgReadOnly. This makes sure all parts of your backup are consistent (some of your installed extensions may write data nonetheless).

File transfer

You will have to choose a method for transferring files from the server where they are:

  • Non-private data you can simply publish on archive.org and/or in a dumps/ directory of your webserver.
  • SCP (or WinSCP), SFTP/FTP, or whatever other transfer protocol you are used to or that is available (see the scp sketch below).
  • The hosting company might provide a file manager interface via a web browser; check with your provider.
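
For example, a minimal scp sketch (host, user, and paths are placeholders to adapt):

# Copy a compressed database dump from the server to a local machine
scp user@wikiserver:~/backup/wiki-db-backup.sql.gz ./wiki-backups/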

Database

Most of a wiki's critical data is stored in the database, which means the backup process is generally quite simple. When using MySQL, a variety of tools are available to help dump the database to a file. If needed, a script can be written to perform backups periodically.

MySQL

Mysqldump from the command line

One example is mysqldump, a command-line tool that produces a dump file from a database. Its output can be customised with options, for example the character encoding of the output file.

Before dumping, consider making the wiki read-only by adding the following to LocalSettings.php:

$wgReadOnly = 'Dumping Database, Access will be restored shortly';

This can be removed as soon as the dump is completed.

Example of the command to run on the Linux/UNIX shell:

mysqldump -h hostname -u userid --password --default-character-set=whatever dbname > backup.sql

Substitute your own values for hostname, userid, dbname, and the character set in place of whatever. Passing --password without a value makes mysqldump prompt for the password interactively.

See mysqldump for a full list of command line parameters.

The output from mysqldump can instead be piped to gzip, for a smaller output file, as follows:

mysqldump -h hostname -u userid --password dbname | gzip > backup.sql.gz

A similar mysqldump command can be used to produce XML output instead, by including the --xml parameter.

mysqldump -h hostname -u userid --password --xml dbname > backup.xml

and to compress the file with a pipe to gzip:

mysqldump -h hostname -u userid --password --xml dbname | gzip > backup.xml.gz

Remember to also back up the file system components of the wiki that might be required, e.g. images, the site logo, and extensions.

Running mysqldump with Cron

Cron is the time-based job scheduler in Unix-like computer operating systems. Cron enables users to schedule jobs (commands or shell scripts) to run periodically at certain times or dates.

A sample command that you may run from a crontab may look like this:

nice -n 19 mysqldump -u $USER --password=$PASSWORD $DATABASE -c | nice -n 19 gzip -9 > ~/backup/wiki-$DATABASE-$(date '+%Y%m%d').sql.gz

The nice -n 19 lowers the priority of the process.

Use valid values for $USER, $PASSWORD, and $DATABASE. This writes a backup file with the date in the filename, so you end up with a rolling set of backups.
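
As an illustration only, a full crontab entry wrapping the command above might look like this (the daily 03:00 schedule and paths are assumptions to adapt; note the escaped % characters, which cron would otherwise treat as newlines):

# m h dom mon dow  command
0 3 * * * nice -n 19 mysqldump -u $USER --password=$PASSWORD $DATABASE -c | nice -n 19 gzip -9 > ~/backup/wiki-$DATABASE-$(date '+\%Y\%m\%d').sql.gz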

Warning: Do not attempt to back up a MediaWiki database using mysqlhotcopy. The table format used by MediaWiki cannot be backed up with this tool, and it fails without any notification!

If you add this task to cron through cPanel, you must escape the character "%" (cron treats an unescaped "%" as a newline):

/usr/bin/mysqldump -u $USER --password=$PASSWORD $DATABASE -c | /bin/gzip > ~/backup/wiki-$DATABASE-$(date '+\%Y\%m\%d').sql.gz

or you will get an error:

/bin/sh: -c: line 0: unexpected EOF while looking for matching `''
/bin/sh: -c: line 1: syntax error: unexpected end of file

Tables

On close examination, some of the tables in a dump turn out to be temporary to varying degrees. So, to save disk space (beyond just gzipping), those tables need to be present in a proper dump, but their data does not. However, in certain circumstances the disadvantage of having to rebuild all this data may outweigh the saving in disk space (for example, on a large wiki where restoration speed is paramount).

See the archived mailing list thread about this topic.

Latin-1 to UTF-8 conversion

See the relevant section of the upgrading page for information about this process. Also see the talk page for more information about working with character sets in general.
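
If all you need is to convert the encoding of a dump file itself, a minimal sketch with iconv (file names are placeholders; verify the converted dump before relying on it):

# Convert a Latin-1 (ISO-8859-1) SQL dump to UTF-8
iconv -f ISO-8859-1 -t UTF-8 backup-latin1.sql > backup-utf8.sql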

PostgreSQL

You can use the pg_dump tool to back up a MediaWiki PostgreSQL database. For example:

pg_dump mywiki > mywikidump.sql

will dump the mywiki database to mywikidump.sql.

To restore the dump:

psql mywiki -f mywikidump.sql

You may also want to dump the global information, e.g. the database users:

pg_dumpall --globals > postgres_globals.sql
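
Not MediaWiki-specific, but as a sketch of an alternative: pg_dump's custom format produces a compressed dump that is restored with pg_restore (database and file names follow the example above):

# Compressed custom-format dump; restore into an existing database with pg_restore
pg_dump -Fc mywiki > mywikidump.dump
pg_restore -d mywiki mywikidump.dump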

SQLite

See Manual:SQLite#Backing up.
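
That page covers the details; as a minimal sketch (file paths are placeholders; the actual database file lives in the directory set by $wgSQLiteDataDir), the sqlite3 shell can take a consistent copy with its .backup command:

# Take a consistent copy of the SQLite database file while the wiki is running
sqlite3 /path/to/data/my_wiki.sqlite ".backup '/path/to/backup/my_wiki-backup.sqlite'"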

phpMyAdmin

Make your wiki read-only by adding $wgReadOnly = 'Site Maintenance'; to LocalSettings.php.

Open your phpMyAdmin URL in a browser, log in, and choose the wiki database (check LocalSettings.php if you are not sure of its name). Select Export. Make sure all tables are highlighted, and make sure Structure is highlighted (it is important to maintain the table structure). Optionally check Add DROP TABLE to delete existing references when importing. Make sure Data is checked. Select zipped. Then click Go and save the backup file.[1]

When the export is complete, remove $wgReadOnly = 'Site Maintenance'; from LocalSettings.php.

Remember to also back up the file system components of the wiki that might be required, e.g. images, the site logo, and extensions.

External links

For a well-written tutorial, see: MySQL Export: How to back up a MySQL database with phpMyAdmin

HeidiSQL

HeidiSQL is similar to phpMyAdmin, but without the restrictions of phpMyAdmin's free version.

File system

MediaWiki stores other components of the wiki in the file system where this is more appropriate than insertion into the database, for example, site configuration files (LocalSettings.php, AdminSettings.php), image files (including deleted images, thumbnails and rendered math and SVG images, if applicable), skin customisations, extension files, etc.

The best method to back up these files is to put them into an archive file, such as a .tar file. Windows users can use applications such as WinZip or 7-Zip.

If using XAMPP, back up the entire "wiki" folder inside "htdocs". For example:

  tar zcvhf wikidata.tgz /srv/www/htdocs/wiki

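Alternatively, a minimal rsync sketch (paths are placeholders) keeps a mirror of the wiki directory and only copies changed files on later runs:

# Mirror the wiki directory; -a preserves permissions, ownership and timestamps
rsync -a /srv/www/htdocs/wiki/ /backup/wiki-files/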

Back up the content of the wiki (XML dump)

It is also a good idea to create an XML dump in addition to the database dump. XML dumps contain the content of the wiki (wiki pages with all their revisions), without the site-related data (they do not contain user accounts, image metadata, logs, etc).[2]

XML dumps are less likely to cause problems with character encoding, are a good means of transferring large amounts of content quickly, and are easily used by third-party tools, which makes them a good fallback should your main database dump become unusable.

To create an XML dump, use the command-line tool dumpBackup.php, located in the maintenance directory of your MediaWiki installation. See Manual:dumpBackup.php for more details.
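
For example, run from the wiki's installation directory (the output file name is arbitrary):

# Dump every page with its complete revision history
php maintenance/dumpBackup.php --full > dump.xml
# Or dump only the latest revision of each page
php maintenance/dumpBackup.php --current > dump.xml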

You can also create an XML dump for a specific set of pages online, using Special:Export, although attempting to dump large quantities of pages through this interface will usually time out.

To import an XML dump into a wiki, use the command-line tool importDump.php. For a small set of pages, you can also use the Special:Import page via your browser (by default, this is restricted to the sysop group). As an alternative to dumpBackup.php and importDump.php, you can use MWDumper, which is faster, but requires a Java runtime environment.
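
A minimal sketch, run from the wiki's installation directory (the dump file name is arbitrary):

# Import the XML dump into the wiki
php maintenance/importDump.php dump.xml
# Rebuild the recent changes data after the import
php maintenance/rebuildrecentchanges.php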

See Manual:Importing XML dumps for more information.

Without shell access to the server

If you have no shell access, then use the WikiTeam Python script dumpgenerator.py from a DOS, Unix or Linux command-line. To run the script see the WikiTeam tutorial.
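
A minimal sketch of an invocation (the wiki URL is a placeholder; check the WikiTeam tutorial for the currently supported options):

# Generate an XML dump (page histories) and an image dump of a remote wiki
python dumpgenerator.py --api=http://wiki.example.org/w/api.php --xml --images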

See also Meta:Data dumps.

Script

Warning: Use these scripts at your own risk.
  • Unofficial web-based backup script, mw_tools, by Wanglong (allwiki.com); you can use it to back up your database or to restore the database from the backup files; the operation is very easy.
  • WikiTeam tools - if you do not have server access (e.g. your wiki is in a free wikifarm), you can generate an XML dump and an image dump using WikiTeam tools (see some saved wikis)
  • Another backup script that: dumps DB, files, and XML; puts the site into read-only mode; timestamps backups; and reads the charset from LocalSettings. Script does not need to be modified for each site to be backed up. Does not (yet) rotate old backups. Usage: backup.sh -d backup/directory -w installation/directory
  • Script to make periodical backups mw_backup. This script will make daily, weekly and monthly backups of your database and images directory when run as a daily cron job.

References

  1. Manual_talk:Backing_up_a_wiki#Ubuntu_10.10_-_Step_by_Step_Instructions
  2. XML dumps are independent of the database structure, and can be imported into future (and even past) versions of MediaWiki.