Automatically copy MediaWiki articles from wiki1 to wiki2
$30-250 USD
Closed
Posted over 13 years ago
Paid on delivery
Hello,
I need a script that copies selected MediaWiki pages from sourceWiki to destinationWiki.
A script on destinationServer (a LAN server) will be started by cron and should fetch the data from sourceWiki via an HTTPS request; it must provide a login and password to authenticate.
SourceServer has database tables (in a different database than the wiki itself) that specify the login credentials (SHA-1 hashed) and the names of the wiki articles to be fetched to destinationWiki by the respective user; these can be in any namespace (articles, categories, talk pages, etc.).
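As a starting point, the source-side tables could look roughly like the following sketch (all table and column names are my placeholders, not a prescribed schema):

```sql
-- Hypothetical schema for the credentials and article-list tables.
CREATE TABLE sync_users (
    user_id   INT AUTO_INCREMENT PRIMARY KEY,
    login     VARCHAR(255) NOT NULL UNIQUE,
    pass_sha1 CHAR(40)     NOT NULL           -- SHA-1 hex digest of the password
) ENGINE=InnoDB;

CREATE TABLE sync_articles (
    user_id INT          NOT NULL,
    title   VARCHAR(255) NOT NULL,            -- full title incl. namespace, e.g. 'Talk:Foo'
    PRIMARY KEY (user_id, title),
    FOREIGN KEY (user_id) REFERENCES sync_users(user_id)
) ENGINE=InnoDB;
```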
Existing articles in destinationWiki will be overwritten.
The sourceScripts will live in a different directory than the wiki itself but will run as the same user as the wiki spaces. Both wikis are .htaccess protected; direct HTTP/S access to the wiki itself is generally out of the question.
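A rough sketch of the destination-side fetch step, assuming a hypothetical source-side endpoint (here called `export.php`) that checks the SHA-1 credentials against the database and returns the raw wikitext. The URL, parameter names, and endpoint are all my assumptions:

```python
import hashlib
import urllib.parse
import urllib.request

# Placeholder URL for the source-side export script.
SOURCE_ENDPOINT = "https://source.example/scripts/export.php"

def build_request(login: str, password: str, title: str) -> urllib.request.Request:
    """Build the HTTPS POST asking the source-side script for one article.

    The password is sent as its SHA-1 hex digest, matching the hashes
    stored in the source database tables.
    """
    body = urllib.parse.urlencode({
        "login": login,
        "pass_sha1": hashlib.sha1(password.encode("utf-8")).hexdigest(),
        "title": title,  # may carry a namespace prefix, e.g. "Talk:Foo"
    }).encode("ascii")
    return urllib.request.Request(SOURCE_ENDPOINT, data=body, method="POST")

def fetch_article(login: str, password: str, title: str) -> str:
    """Fetch the raw wikitext of one article (network call, not exercised here)."""
    with urllib.request.urlopen(build_request(login, password, title)) as resp:
        return resp.read().decode("utf-8")
```

The cron job on destinationServer would loop over the titles listed for its user and call `fetch_article` for each one.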
The script will send an email on errors.
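For the error mail, a minimal sketch with Python's standard library (sender, recipient, and subject line are placeholders; actual sending would go through smtplib on the destination server):

```python
from email.message import EmailMessage

def build_error_mail(failed_title: str, error: str,
                     recipient: str = "admin@example.org") -> EmailMessage:
    """Compose the notification mail sent when copying a page fails."""
    msg = EmailMessage()
    msg["From"] = "wikisync@localhost"   # placeholder sender
    msg["To"] = recipient
    msg["Subject"] = f"wiki sync failed: {failed_title}"
    msg.set_content(f"Copying '{failed_title}' failed:\n{error}\n")
    return msg

# Sending (not executed here):
#   import smtplib
#   with smtplib.SMTP("localhost") as s:
#       s.send_message(build_error_mail("Talk:Foo", "HTTP 403"))
```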
Configuration parameters must be easily accessible to be changed by me, please use some ini or config file.
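For example, the configuration could be a plain ini file read with Python's configparser; every section and key name below is only a suggestion:

```python
import configparser

# Example contents of a wikisync.ini; adjust keys as needed.
EXAMPLE_INI = """
[source]
endpoint = https://source.example/scripts/export.php
login = syncuser

[destination]
wiki_dir = /var/www/destwiki

[mail]
error_recipient = admin@example.org
"""

config = configparser.ConfigParser()
# In production: config.read("/etc/wikisync.ini")
config.read_string(EXAMPLE_INI)
endpoint = config["source"]["endpoint"]
```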
Both wiki systems are Semantic MediaWiki installations, but the script should also work for standard MediaWiki (I do not know whether that makes any difference at all).
You will NOT get access to any of the servers, so I expect you to test in your local environment and submit completed, working scripts. I do not want to be a beta tester; the project will fail if you waste my time by repeatedly submitting non-working scripts.
The script sources must be open to us.
DB: mysql
Mediawiki version: 1.15.1
Systems: Debian/Ubuntu
May help: [login to view URL]:Maintenance_scripts
If anything is unclear, please ask me before you bid; I am not responsible for wrong estimates caused by unclear details.
Happy bidding