
Wikitravel:Approved scripts


The following scripts have passed through Wikitravel:Script nominations.


These scripts are still actively running.


So, I'd like to cook up a little botlet to update Wikitravel:Discover and Template:Discover automatically. It'll be simple enough to implement (and will probably reuse the relevant bits of StatScript) — just delete the first item in the upcoming queue, place it into the top slot in the template and autoarchive the last one. I'll try to add some logic to make it stop and complain if for whatever reason it can't find the bot-readable tags it's looking for. —The preceding unsigned comment was added by Jpatokal (talk • contribs)
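The queue rotation described above could be sketched roughly like this. This is a hypothetical Python outline, not DiscoverBot's actual code; the function name, the list shapes, and the three-slot live window are all illustrative:

```python
def rotate_discover(upcoming, live, archive, live_size=3):
    """Rotate the Discover queue: pop the first upcoming entry into the
    top live slot, and autoarchive the oldest live entry if the live
    list overflows. Returns new (upcoming, live, archive) lists."""
    if not upcoming:
        # the "stop and complain" case from the proposal
        raise ValueError("upcoming queue is empty -- stop and complain")
    entry, rest = upcoming[0], upcoming[1:]
    new_live = [entry] + live
    new_archive = archive
    if len(new_live) > live_size:
        new_archive = [new_live.pop()] + new_archive
    return rest, new_live, new_archive
```

If entries really are chewed up at one per 24 hours, as discussed below this was originally proposed, running this once a day keeps the template in sync with the queue.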

What would it do about adding DotM/CotW trivium that BotH and I categorized? - Andrew Haggard (Sapphire) 01:47, 23 June 2006 (EDT)
It would merrily ignore them. However, if entries are chewed up at a precise rate of 1 per 24 hours, elementary differential calculus will let you synchronize DOTMage and related discoveries. We could change the list type from * to # to make this even easier. Jpatokal 01:51, 23 June 2006 (EDT)
I'm wooed. Let me ask one more question so that I can make sure I do understand. The bot will ignore CotW/DotM trivium, unless we change the script to take listings from "#DOTM trivia"? - Andrew Haggard (Sapphire) 01:57, 23 June 2006 (EDT)
Yup. Basically, I'll just stick in a tag that says <!-- Eat me --> or equivalent, and the bot will munch on the next line (unless it says "don't eat me"). Jpatokal 02:34, 23 June 2006 (EDT)
Bump. I need another admin's support for this... Jpatokal 21:28, 30 June 2006 (EDT)
Support. Sorry, I didn't realize it was you asking for this, so I didn't realize you were ready. -- Colin 21:41, 30 June 2006 (EDT)
Support. Also didn't realize you were ready - I was waiting to see if there was going to be a test version or some such first. -- Ryan 21:47, 30 June 2006 (EDT)
Oh. It's not ready, but I thought the first step was to get approval -- not much point in writing a bot only to have it shot down, no? Will try to squeeze in a few hours tomorrow to write it up, shouldn't be too hard. Jpatokal 00:15, 1 July 2006 (EDT)
Support. Sounds like a good one to me. --Evan 01:24, 1 July 2006 (EDT)
DiscoverBot has been unleashed on the unsuspecting citizens of Wikitravel to wreak wild crazy robot havoc every night at UTC+01. Jpatokal 04:03, 11 July 2006 (EDT)


Runs once a week to update Wikitravel:Multilingual statistics automatically, once working can be extended to all language versions. I plan to deploy in stages:

  1. Fetch the stats data automatically
  2. Format into table row automatically
  3. Place new row into table automatically
  4. Place new row into table for all language versions

This doesn't exist either but I'm working on it. I will also need to create a Wikitravel:NumberOfArticles page in each language version that will simply contain {{NUMBEROFARTICLES}} as content, so I don't need to deal with customized Special:Statistics pages. Jpatokal 11:10, 27 Oct 2004 (EDT)
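Stage 2 of the plan might look something like this sketch. The column order, date format, and table syntax are assumptions for illustration, not StatScript's actual output:

```python
import datetime

def format_stats_row(counts, langs=("en", "de", "fr", "ro", "sv")):
    """Turn per-language article counts (as read from each version's
    Wikitravel:NumberOfArticles page) into one MediaWiki table row."""
    date = datetime.date.today().isoformat()
    cells = " || ".join(str(counts.get(lang, "?")) for lang in langs)
    return "|-\n| %s || %s" % (date, cells)
```

Stage 3 would then just splice this row into the existing table on Wikitravel:Multilingual statistics.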

Stages 1 and 2 exist now → User:Jpatokal/StatScript. Stages 3 & 4 shall await admin approval because they actually edit pages instead of just reading them. Jpatokal 12:43, 27 Oct 2004 (EDT)
I think this sounds like a good idea for a script. How often do you plan to run it? --Evan 16:35, 27 Oct 2004 (EDT)
Once a week, early Friday morning UTC. Jpatokal 22:41, 27 Oct 2004 (EDT)
I think this sounds like a good idea, so I vote yes. Do I need to express that elsewhere as well? -- Mark 19:16, 28 Oct 2004 (EDT)
I vote yes. -phma 21:58, 28 Oct 2004 (EDT)

FYI, after being moribund for a long time (I was only running the first half manually), I've finally written up the last chunk with the pywikipedia framework and set it on autopilot. So it should now run automatically every Friday at 00:00 UTC. Jpatokal 21:47, 18 Aug 2005 (EDT)


So I have this evil hotelmaker script which goes and scrapes various hotel websites to generate entries. Some people have populated various USA articles using this data source. Each time I run the script, my formatting improves, some hotels go away, and some hotels appear. I'd like to write a script to help keep the entries up to date. It will

  1. edit all US articles for which a hotel is found in the hotelmaker database
  2. if the Sleep section contains an out-of-date set of hotelmaker entries, the Sleep section will be replaced with a modernized version of the entries.
  3. the new entries will use the new <sleep> tags

In the future I might get fancier about things (note: this nomination does not give permission for any fancier versions), but for right now I want to update old entries and nothing else, so my script will not make an edit if any of the following are true:

  • a hotel was manually added to the list
  • a hotel was removed from the list
  • any formatting changes were made that might need to be preserved.
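The "don't touch human edits" guard above amounts to something like this sketch, under the assumption that the bot keeps copies of its own earlier Sleep-section output to compare against:

```python
def safe_to_update(current_sleep_section, known_outputs):
    """Only allow a bot rewrite when the current Sleep section is,
    up to surrounding whitespace, one of the bot's own earlier
    outputs; any manual addition, removal, or reformat blocks it."""
    normalized = current_sleep_section.strip()
    return any(normalized == prev.strip() for prev in known_outputs)
```

Anything the check rejects would be left for a human to reconcile by hand.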

My main motivation here is to jump ahead of Jani and get this done before he makes a bot which converts sleep entries to the new listings tag style because that will make it hard for me to write an updater bot in the future. -- Colin 03:00, 6 November 2006 (EST)

  • Support. -- Andrew H. (Sapphire) 03:02, 6 November 2006 (EST)
  • Support. It will save OldPine a lot of cut & pastin' :) -- Ryan 03:12, 6 November 2006 (EST)
  • Support much. Sounds like a great plan. --Evan 10:17, 13 November 2006 (EST)

DorganBot (was: InterLangBot)

On the German site, most articles have an inter-language link only to the English version. On the other hand, the corresponding English articles don't know about their German counterparts. I'd like to write a little robot that does the following:

  1. It runs through a list of German articles and reads out their inter-language link to the English site.
  2. It puts an appropriate inter-language link to the German site on every English article it finds this way.
  3. At the same time, it looks for other inter-language links in the English article and adds them to the German article.
This way, both the English and the German versions (and other language versions, too) would stay in good contact and be kept well linked. You may say all that could also be done manually. Well, basically, you are right. But who is going to do all that boring work? -- Hansm 16:33, 2004 Oct 21 (EDT)
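Steps 1–3 boil down to a symmetric link exchange. A minimal sketch, with the data shapes (dicts of titles) assumed for illustration:

```python
def propagate(de_to_en, en_langlinks):
    """For each German article linking to an English one, compute
    (1) the de: backlink to add on the English page and (2) the
    English page's other language links to copy onto the German page."""
    add_on_en = {}   # en_title -> de_title to link back to
    add_on_de = {}   # de_title -> {lang: title} links to copy over
    for de_title, en_title in de_to_en.items():
        add_on_en[en_title] = de_title
        others = {lang: t for lang, t in en_langlinks.get(en_title, {}).items()
                  if lang != "de"}
        if others:
            add_on_de[de_title] = others
    return add_on_en, add_on_de
```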
    • This sounds good. It would need to exclude the twin pages links that aren't interlanguage links (WikiPedia, Dmoz). I'd be willing to make the informal rule that interlanguage links always start with a lowercase letter (en:, sv:, fr:, de:, ro:) and are three letters or longer, and that other sites start with an uppercase letter and are four letters or more. Sound fair? --Evan 17:02, 21 Oct 2004 (EDT)
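Evan's proposed convention could be checked with something as small as this (treating the rule as "lowercase prefix = interlanguage link, anything else = twin page link", and glossing over the exact length cutoffs):

```python
import re

def is_interlanguage(prefix):
    """True for lowercase language prefixes (en, sv, fr, de, ro);
    False for twin-page prefixes like WikiPedia or Dmoz."""
    return re.fullmatch(r"[a-z]{2,}", prefix) is not None
```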
    • For its first run, I'd prefer to hardcode the 5 country codes. I want to keep the bot as simple as possible in order to avoid disastrous malfunctions. When the bot runs fine, more advanced configuration features could be built in later. In a first investigation, I have found out that some existing inter-language links point to empty pages. The reason for that is either a broken link or a link to a planned, but still empty, article. The question is what to do with these links. Delete them or keep them? -- Hansm 14:24, 2004 Oct 24 (EDT)
      • Delete. Broken links are useless, and the links will be recreated once somebody notices the problem. Of course, you could just record the list of broken links and post it somewhere for manual correction... Jpatokal 11:02, 27 Oct 2004 (EDT)
    • Just as a point of order -- I'm not sure that we've achieved the two-administrator rule here, so I've temporarily set the InterLangBot's run to "no". We have me giving support, and Hansm gives support, but he's a) the originator of the idea and b) an admin on a different language version. This policy was created before we had multiple language versions, so it's not clear if non-en: admins count. And for Wikitravel:administrator nominations I think we ended up saying that you needed two other admins to give support, if the nominator was an admin him/herself. Anyways, I'm just soliciting opinions. If another en: admin could pop in and give their thumbs-up, though, it's a moot point. --Evan 17:01, 28 Oct 2004 (EDT)
      • Evan, I understand your point, and on the other hand I don't. The script has been nominated for one week. I have asked in the pub for some other admin to give a comment. If admins don't care, what should I do? Either there are too few admins here, or the rules shouldn't be taken too seriously. Anyway, this error that produced edits on the Main Page is really painful and I'm sorry about it. I hope I can fix it before it happens again. -- Hansm 18:10, 2004 Oct 28 (EDT)
        • What you should do is wait. Sorry, but it's not so urgent that it can't stand another 24 hours of talking and decision making. --Evan 13:50, 29 Oct 2004 (EDT)
    • Oops.. sorry. I haven't been paying attention to this page. Here's your second admin vote: Yes. Sounds like a good script. Meanwhile use the perl Module for this if it's written in perl... and make me keep it up to date.. ;) -- Mark 19:19, 28 Oct 2004 (EDT)
    • Ditto... it gets my vote... will it just be for German at first or all languages? Majnoona 20:35, 28 Oct 2004 (EDT)
    • I think it's a good idea. It needs to be tested more than StatScript, as it modifies many pages, and it's not clear how to test it on the test wiki, since it needs two language versions to do anything useful. So test it carefully. -phma 21:58, 28 Oct 2004 (EDT)
    • Thanks to all of you for your pro votes. Mark, the script is already written and I must confess that I didn't remember your perl module. Be sure I will consult it next time. It's better to use some well tested standard than always to begin from scratch, as I did. I know the InterLangBot is a critical mission since testing is only possible in its real environment. What Evan blocked was not an attempt to run the bot, but only my testing. Some more will follow before I really start it. -- Hansm 03:01, 2004 Oct 29 (EDT)

The InterLangBot is running now. -- Hansm 04:11, 2004 Oct 29 (EDT)
Run of InterLangBot completed. About 400 en: articles should have got new links to their de: counterparts; of course, many of them are still stubs. Some broken links to fr: and ro: have been deleted and a few inconsistent links have been encountered, but left untouched. If you should find some mistakes made by the bot, please let me know by reporting them on the InterLangBot's talk page. -- Hansm 03:48, 2004 Oct 30 (EDT)

Instead of reinventing the wheel (and promptly forgetting it), Pywikipediabot has an advanced, well-used bot for doing this already. I'll try to set this up to run automatically at some point. Jpatokal 09:57, 14 Jul 2005 (EDT)


My old InterLangBot has a serious problem: it cannot handle UTF-8 characters that cannot be mapped onto iso-latin-1 encoding. Since we have a ja: language version, it is no longer of much use.

I have tried to customize the bot from the Pywikipediabot framework in order to comply with our needs on Wikitravel. The advantage would be to build on a well maintained program that is kept up to date when version changes of the MediaWiki software are made. But there are some problems, too:

  • I don't see any way to keep the WikiPedia links at the bottom of the list without doing massive changes in the source. Does it matter if they get shifted to the top?
  • Handling our very special script blocking policy, which says that scripts have to stop when certain pages have content other than 'yes', doesn't make things easier, but there could be a way by making only minor changes to the bot's code. Jpatokal's approach could be generalized to all of our 6 languages. I see a way without changing one of the core files of the framework; instead, only a slight modification would need to be done.
  • Another problem is that the MediaWiki messages are switched off at Wikitravel, but the framework's scripts rely on them. There seems to be only one possibility to get rid of that problem, which is changing one line in the source. The trade-off would be that edit conflicts and other irregular problems could not be handled any more. Nevertheless, this might be a minor problem.

Generally, changes to the sources of Pywikipediabot are problematic since they get lost with updates.

Now the most problematic question to all of you: How important is it that the WikiPedia link is at the bottom? -- Hansm 19:06, 16 Oct 2005 (EDT)

You mean that the WikiPedia link is still kept, but it just changes position in the list? Is it because capital W goes before lowercase a-z in an alphabetical sort? Jpatokal 20:35, 16 Oct 2005 (EDT)
Yes, the WikiPedia link is still kept at the top of the "other languages" list. But it's not because of the capital W. The WikiPedia link cannot be treated as a normal interwiki link like a language version. So, the WikiPedia link is the last part of the article that remains after all interwiki links have first been removed and then added at the end of the page. -- Hansm 00:57, 17 Oct 2005 (EDT)
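One way around the position problem, sketched: sort only the lowercase interlanguage links and pin everything else (WikiPedia, Dmoz) after them, so the twin links survive a rewrite in a stable spot. Purely illustrative, not Pywikipediabot code:

```python
def order_links(links):
    """Alphabetize interlanguage links (lowercase prefix) but keep
    WikiPedia/Dmoz-style twin links (uppercase prefix) at the end."""
    inter = sorted(l for l in links if l.split(":", 1)[0].islower())
    twins = [l for l in links if not l.split(":", 1)[0].islower()]
    return inter + twins
```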

Maybe a really dirty hack could be a solution for the WikiPedia link problem. See User talk:InterLangBot for more. -- Hansm 15:31, 17 Oct 2005 (EDT)


I've started to operate this bot on the Japanese version[1], and I can contribute to the English version as a substitute for DorganBot.

  • Operator: ja:User:Tatata(Tatata)
  • Function:
    • maintaining interlanguage links
    • replacing text (templates / internal links)
  • Operation: manually
  • Software: Python
  • Bot flags: ja[2]

On the English version, I'll use only the function of maintaining interlanguage links myself. As to the function of replacing text, it will be used on request if someone needs help.

By the way, I would like to use this bot on the Chinese version too. But there are no documents about scripts or a script nomination process there, because there is only one admin. ;-) Can I apply for a bot flag for the Chinese version here? -- Tatata 22:03, 27 April 2009 (EDT)

  • Support -- this is long overdue! And yes, I'll be happy to enable the bot flag on fi and zh. Jpatokal 01:09, 28 April 2009 (EDT)
Thank you Jani-san. I created some accounts (en, fi, zh) and /Run page (zh:Wikitravel:腳本方針/Run). How can I spell "Script policy" in Suomi? "Komentosarja käytännöt", right? -- Tatata 21:16, 29 April 2009 (EDT)
Bot flags enabled on fi, zh. There's no good Finnish word for "script" in this sense (komentosarja is a manual command sequence), so the policy page is now at fi:Wikitravel:Botit ("Bots") and the flag at fi:Wikitravel:Botit/Run. Jpatokal 02:48, 30 April 2009 (EDT)
I moved the Chinese page from "zh:Wikitravel:腳本方針/Run" to "zh:Wikitravel:機器人方針/Run" for the same reason, though I cannot understand Chinese well ;-)
The bot has started working on fi, zh! -- Tatata 22:14, 30 April 2009 (EDT)
  • Support. I'm so tired of marking interwiki links as patrolled :-) -- Colin 02:51, 30 April 2009 (EDT)
Thank you Colin-san. -- Tatata 22:14, 30 April 2009 (EDT)

The bot got a flag[3] and started working on en:[4]. Thank you!

Now I'm requesting a bot flag on de: and fr:. Unfortunately, no one comes to the script nomination pages of either language version. ;-) -- Tatata 22:08, 6 May 2009 (EDT)

Nice work, Tatata! Gorilla Jones 22:36, 6 May 2009 (EDT)


This interwiki bot uses the standard pywikipedia framework and is operated by wikipedia:ru:User:Volkov (sysop on many WikiMedia projects). I think it will be helpful to keep an eye on interwikis on Wikitravel as well and update them when needed. The bot is supposed to be run periodically. --Volkov 03:13, 19 August 2009 (EDT)

How is this different from the already-operational Tatatabot? - Dguillaime 03:26, 19 August 2009 (EDT)
I don't think it's much different, but my guess is that having a backup is not a bad idea. Up-to-date interwiki links will be helpful for all wikitravellers ;) --Volkov 03:32, 19 August 2009 (EDT)
Support. Seems to be doing a good job already, and redundancy is good. Volkov, if you're up for a challenge, how about also automating Wikipedia<->Wikitravel links? We use [[WikiPedia:XXX]] (note: case may vary), while Wikipedia usually uses the {{wikitravel}} template (cf Wikipedia:Template:Wikitravel). Jpatokal 04:46, 19 August 2009 (EDT)
Support. A backup to Tatatabot won't hurt, and it would be awesome if there was (additionally) a way to automate Wikipedia links, although that isn't a prerequisite for supporting this bot. -- Ryan • (talk) • 10:48, 19 August 2009 (EDT)
    • Approved, bit flipped. Jpatokal 02:22, 4 September 2009 (EDT)


These scripts were approved, but are no longer active or never materialized.


Kunibotto will be used for automated uploads of CIA flags, maps and article stubs for all countries into the Japanese Wikitravel. The Pywikipediabot framework will be used. This is a one-time run, but I expect to use variants of this to enter great big chunks of Japan later on. Jpatokal 09:57, 14 Jul 2005 (EDT)

I'll vote yes. That said, maybe we should set things up so that images can be shared between the language versions... -- Mark 10:47, 14 Jul 2005 (EDT)
A Wikitravel Commons? I'm all for it, now to find the hardware, the software and the maintainer... Jpatokal 13:27, 15 Jul 2005 (EDT)
Huh? Why wouldn't we just host it on the server, with the same software and technology as Wikitravel? MediaWiki 1.4.x has a lot of support for a media commons. --Evan 13:30, 15 Jul 2005 (EDT)
You tell me, boss, it's not the first time this has been proposed! Jpatokal 13:47, 15 Jul 2005 (EDT)
The answer, then, is no reason. I've been looking at the code for MediaWiki's commons system and I think it would be a pretty straightforward thing to build without even taking down the running server. --Evan 13:52, 15 Jul 2005 (EDT)
Yes, but please follow the policy. Slow updates, use a flag to turn the bot on/off, etc. It's not that hard, I don't think, and it's easier on the Wikitravel servers. It's also easier to stop a bot that's got a bug in it, and to clean up after it. Thanks. --Evan 13:34, 15 Jul 2005 (EDT)
Uploads are being done once per minute, which is in line with policy, no? The flag-double-check thing would, if anything, only increase the load, although I'll admit that the real reason I was too lazy to implement it is that I'm not very familiar with Python and would need to spend an hour or two peering at the uncommented entrails of pywikipediabot to see how to do this. But I'll be wilfully evil and let the flag uploader run its course first; the data set is sufficiently well-defined that the risk of running amok is rather minimal. Jpatokal 13:47, 15 Jul 2005 (EDT)
Please, don't make a habit of it. Neither you nor I nor any other admin are above the policies and guidelines of Wikitravel. In fact, we have a responsibility to set a good example. I don't think anything in the policy is too difficult to implement in any modern scripting language.
If you'd like to change the policy, let's work that out on Wikitravel talk:Script policy. If you can't be arsed to make changes to the script to work with local requirements, or you're simply not experienced enough with the code to do so, then you probably shouldn't be running the script in the first place. I'd be happy to help with the pywikipediabot changes to make the framework work with the double-check. --Evan 15:32, 17 Jul 2005 (EDT)
For the record, Kunibotto's now cleaned up its act and dutifully checks the Run pages like a good 'bot should. Jpatokal 06:38, 2 Aug 2005 (EDT)

So for the record, Kunibotto is now doing its last run (?) to pump all CIA flags and maps into Wikitravel Shared. One upload every 60+ seconds, so it'll take most of the day... Jpatokal 01:45, 7 May 2006 (EDT)


I'd like to create a script that does automatic spelling corrections. It would use the Spelling maintenance page to find pages with spelling mistakes listed in the list of common misspellings, then change those pages to use the correct spelling. It would not change a) anything capitalized (probably a proper name) b) anything in the foreign-language part of a phrasebook, c) anything in italics (probably a foreign word), and d) any page linked from a special page (maybe User:SpellingScript/Ignore), so people can override it. This would improve the readability and apparent quality of the site, and take away one of the more tedious tasks for users. The script would probably be run from a cron job on the Wikitravel server -- maybe once per day. Note: this script isn't written yet. --Evan 19:24, 10 Jan 2004 (EST)
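Skip rules (a) and (c) might look roughly like this sketch. The misspelling table is illustrative only, and rules (b) (phrasebook sections) and (d) (the ignore list) are left out for brevity:

```python
import re

# illustrative entries only, not the real list of common misspellings
COMMON_MISSPELLINGS = {"recieve": "receive", "seperate": "separate"}

def correct_line(line):
    """Fix known misspellings, skipping capitalized words (likely
    proper names) and anything inside ''italics'' (likely foreign)."""
    def fix(match):
        word = match.group(0)
        if word[0].isupper():
            return word
        return COMMON_MISSPELLINGS.get(word, word)
    # split out italic spans first and leave them untouched
    parts = re.split(r"(''.*?'')", line)
    return "".join(p if p.startswith("''") else re.sub(r"[A-Za-z]+", fix, p)
                   for p in parts)
```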

I'd like to second the nomination. -- Mark 09:50, 6 Feb 2004 (EST)
I third the nomination. It's worth a try. -phma 21:44, 17 Feb 2004 (EST)


This script would read a list of pages and a list of currencies, then edit each of those pages, looking for comments that mark a number as an exchange rate, then updating the exchange rate. This would improve the usefulness of the site (go to any country page and get the current exchange rate) without burdening anyone with updating them. The script could be run once a day or once a week from my laptop. This script isn't written yet, but I do have a shell script that gets the exchange rate of any other currency against the USD. -phma 21:44, 17 Feb 2004 (EST)

Seems like an interesting idea. What would the base currency be? USD? Since I live in Switzerland I guess I would have to vote for CHF... of course Evan and MAJ are in Canada, so they might have a different point of view... ;) Really though, if all the script did was to put Xcurrency = USD at the bottom that would probably be fine, especially if it was always marked as a minor edit. -- Mark 01:35, 23 Feb 2004 (EST)
The comment would look like <!--wb:chf/cad-->. The base is USD, because that's what the rate source uses as a base; after gathering all the exchange rates, the bot computes the ratios as needed. -phma 08:01, 23 Feb 2004 (EST)
OK, so just to be sure I've got this the script would look for a number after (or is it before?) a comment like the one above. It also sounds like you want to have a page of exchange rates per currency which we could link to from the pages for places where that currency is used. So it isn't like the bot would look for these tags all over the place right? In that case I think it's a great idea! -- Mark 09:58, 23 Feb 2004 (EST)

Wechselbot would:

  1. Read a list of currency codes from a page.
  2. Get the current exchange rates from the source. This takes one web access; that's why I want it to read the list first.
  3. Read the list of pages to edit.
  4. Read a page, look for comments followed by numbers, and change the numbers to the current exchange rates.
  5. Repeat until it has done all the pages.

The comment could also contain the number of significant digits and the Unicode offset of the digit zero (in case on the Hindi, Arabic, or Thai Wikitravel we want to write numbers in the native script). -phma 15:45, 23 Feb 2004 (EST)
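Step 4 might be sketched like so, assuming the comment marks the number that follows it and that `aaa/bbb` means "aaa per bbb" (that direction is a guess, as is the fixed four-digit formatting; the significant-digits and Unicode-zero extensions are omitted):

```python
import re

def update_rates(text, per_usd):
    """Rewrite the number after each <!--wb:aaa/bbb--> comment with
    the current aaa-per-bbb cross rate. `per_usd` maps a currency
    code to its units per 1 USD, matching the USD-based rate table."""
    pattern = re.compile(r"(<!--wb:([a-z]{3})/([a-z]{3})-->)\s*[\d.]+")
    def repl(m):
        rate = per_usd[m.group(2)] / per_usd[m.group(3)]
        return "%s %.4f" % (m.group(1), rate)
    return pattern.sub(repl, text)
```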

Better late than never, but you do get an answer -- As far as I understand it this is not a bad idea at all. I was thinking a step further: as I'm using the Euro, an amount in USD gives me an idea of the value, but if I want to know the precise amount I still have to convert it manually. How about a way to indicate in my user preferences that I want to use the Euro as a base? You would set your preferences to the USD, somebody else might set it to GBP, etc. Of course, this would involve a change to the code. DhDh 17:02, 5 Mar 2004 (EST)
If I understand you right, an arbitrary value (such as the price of a ticket from Györ to Debrecen in Magyar forintok) would be preceded by a tag saying that it's in some currency (HUF), and the PHP script would then display it in EUR, GBP, USD, AUD, CAD, ZAR, or NZD, depending on your preference. (Anonymous users and people with no preference would get the forint amount.) This would require an exchange rate table. The table can be maintained by the bot as described previously; the PHP script would have to know where it is. Any change to the code would require getting Evan involved. -phma 19:24, 5 Mar 2004 (EST)
That is indeed what I mean. But in your example the HUF amount would not disappear. The amount in your chosen currency would be added to that. Like this:
  • no preference/anonym: 456 HUF
  • preference = EUR: 456 HUF (45 EUR)
  • preference = USD: 456 HUF (56 USD)
(I have no idea of the HUF exchange rate...) DhDh 19:51, 5 Mar 2004 (EST)
It sounded to me like the original idea was not to try to do automatic conversions on actual destination pages, but rather to make special pages with conversion tables which travelers could print out and carry around. I frankly don't like the idea of a script like this mucking with prices on the destination pages at all, for a couple of reasons.
  • A bug would cause waaaaaayyyy more damage, which somebody would have to take time to clean up after.
  • At the actual destination prices will be listed in the local currency, so that's the way they should be listed on the destination page too.
  • Changing all of the pages with a script all the time will make a great big mess out of the Special:RecentChanges.
  • Having a user preference for their own local currency as above would make the feature work better, but would probably require some intensive hacking of the MediaWiki software, where a simple script which deals with a limited set of currency table pages would be a lot quicker.
On the other hand, I am in full support of the idea of doing a set of currency conversion pages with a script, even though currencies are not destinations. Maybe phma should make a demonstration page in his workspace to show us what he had in mind. -- Mark 14:22, 6 Mar 2004 (EST)
May I make a user page for Wechselbot and start writing the script? -phma 17:44, 7 Mar 2004 (EST)
Of course you can. Of course the safest thing to do (IMO) is to do this in your own namespace. For instance you can make some pages like User:PierreAbbat/Wechselbot to play with. That said, if you think the script needs its own login, then OK, go for it (again IMO) -- Mark 18:20, 7 Mar 2004 (EST)
The script needs its own login; this is required by script policy. -phma
I don't know PHP, but I'm not sure that my idea would require intensive hacking. The feature should be able to search currencies in a 2-dimensional lookup table (the currency used in the article and the one in your preferences) and take the corresponding value to your screen. Somebody who knows PHP should contradict me if I'm wrong. DhDh 03:14, 7 Mar 2004 (EST)

Once the script is written, can I upload it to the arch server so that others can use it as a basis for their own scripts? -phma 22:57, 7 Mar 2004 (EST)


  • Username: ???
  • Maintainer: User:Maj
  • Runs: ad hoc

Per the request for interwiki links for World66, I'd like to run a bot to jump start this process. Since their file names are irregular and use a hierarchy (i.e. /texas/paris and /france/paris) it's going to be doing some fuzzy matching/guessing, but I'll start with the easy/exact matches first (I've already got ~2k or so done). I'll set up a Sandbox with examples. Majnoona 14:26, 1 July 2006 (EDT)
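The easy/exact half of that matching could be as simple as this sketch. The slug convention (lowercase, underscores) is a guess, and anything ambiguous is deliberately skipped for hand review rather than fuzzy-matched:

```python
def match_world66(wt_title, world66_paths):
    """Pair a Wikitravel article title with a World66 path by exact
    match on the final path component. Ambiguous names (e.g.
    /texas/paris vs /france/paris) return None for manual review."""
    slug = wt_title.lower().replace(" ", "_")
    hits = [p for p in world66_paths
            if p.rstrip("/").rsplit("/", 1)[-1] == slug]
    if len(hits) == 1:
        return hits[0]
    return None  # ambiguous or missing: skip rather than guess
```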

  • Support — Ravikiran 11:27, 13 July 2006 (EDT)
  • Support. --Evan 12:31, 13 July 2006 (EDT)
The bot, she is ready to run. I'm going to start with some small batches this evening. Majnoona 17:26, 19 July 2006 (EDT)
Sorry my bot is being a spaz... I swear it works perfectly under my User:Maj/Sandbox! I will tinker offline until I get the bug (it's stripping all the returns) worked out... sorry! Majnoona 22:23, 19 July 2006 (EDT)
We're back! I'll be running it in small batches so as not to flood recent changes... Maj 13:20, 27 July 2006 (EDT)


  • Maintainer: Dorgan
  • Job: Interwiki
  • Program: Pywikipedia

-- Dorgan 09:48, 10 July 2007 (EDT)

Support. The functionality is identical to the deceased and much missed InterLangBot, and the cough test run just now shows that it seems to work fine, so I propose we waive the 7-day waiting period if Evan is willing to toggle the bot switches (this being one of those few bureaucrat-only tasks). One question though: does the bot handle UTF8 properly? Can it run in Japanese, Hindi, Hebrew etc as well? Jpatokal 10:00, 10 July 2007 (EDT)
Yes, it's UTF-8 and working fine, and of course it works in other languages too. The source is here! Dorgan 10:16, 10 July 2007 (EDT)
You also need to hack the framework to check the bot flags -- see [5]. Jpatokal 12:15, 10 July 2007 (EDT)
I understand. My problem is: I took that code, I copied it in, and it doesn't work. I think the hack is out of date and unfortunately I can't get it to check the two pages. And with interwiki there is another problem too: if my main language is Hungarian, and I set the two pages on hu wt (if it's working :)), how can you stop me if you don't know the language? Or if you do know it, how can I know the stop isn't malicious (if anonymous)? Could you review the script policy? The STOP is for the admins, I think (like on the wp's too)! Dorgan 03:57, 11 July 2007 (EDT)
The bot is supposed to check the flags on the language version it's currently editing. The code above is very simple, all you need to do is change the URLs to point to User:DorganBot/Run and Wikitravel:Script policy/Run. Jpatokal 04:35, 11 July 2007 (EDT)
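The check described here is essentially: fetch each Run page and refuse to edit unless it says "yes". A sketch of that logic; the page names come from the discussion, but the `action=raw` URL form is an assumption about the wiki setup:

```python
from urllib.request import urlopen
from urllib.parse import quote

def flag_says_yes(page_text):
    """A Run page enables the bot only when its body is 'yes'."""
    return page_text.strip().lower() == "yes"

def may_run(base_url):
    """Check both Run pages named in the policy before editing."""
    for page in ("User:DorganBot/Run", "Wikitravel:Script policy/Run"):
        url = "%s/wiki/%s?action=raw" % (base_url, quote(page))
        with urlopen(url) as resp:
            if not flag_says_yes(resp.read().decode("utf-8")):
                return False
    return True
```

Running the parsing half (`flag_says_yes`) against the editing loop is what the policy's "stop" switch amounts to.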
Yes, I know, I changed that, and nothing... Dorgan 04:49, 11 July 2007 (EDT)
Ok, my last question: yes, I know that's not wp, but I think an interwiki bot isn't dangerous. You see I have had a bot since November 2006 (on the en, it, ro, ru, nl, fi, fr, de, hu Wikipedias) with more than 13,000 edits just on hu (most of the edits are interwiki!). So I couldn't sort out the two check pages, but can I have the bot flag? :) Dorgan 04:09, 11 July 2007 (EDT)
It's not a question of is it dangerous. The question is, does it follow Wikitravel policy? If not, then we either 1) don't let it run on Wikitravel, or we 2) ignore policy in this case, or 3) we change the policy. My opinion: maybe we should drop the onerous article-checking requirements of our current script policy, so that useful tools like the pywikipediabot suite will be compliant. I think that we've got enough admins around that blocking ill-behaved bots is practical. --Evan 19:41, 15 July 2007 (EDT)
Not yet. I'd like to know that it's working according to our Wikitravel:script policy. I'm going to look over the code. --Evan 10:43, 10 July 2007 (EDT)
Since the bot has previously seemed to work OK, and Evan's not around too much these days, and the comment above is pretty ambiguous anyway, and this discussion has been dragging on for way too long, I'm going to tentatively say that the bot is approved — we'll just block it if there are problems. Jpatokal 13:09, 1 November 2007 (EDT)
Does anybody have an idea what has happened to this bot? It did a really good job; unfortunately no edits any more since March 2008. --Flip666 writeme! • 05:13, 10 February 2009 (EST)