After the crucial two-admin support level has been reached, whoever has sysadmin access to the database should flag the bot's account as a bot, and announce back on the nominations page that this has been done.
The bot may be run only after this has been done; otherwise even a perfectly functioning script makes a big mess of [[Special:Recentchanges]]. -- Mark 04:53, 30 Oct 2004 (EDT)
I cannot find anywhere to request that someone write a script, or to propose a suggestion for a useful script - should there be such a page or place to do this? If so, I would like to suggest that someone consider writing the following scripts:
Links to Wikipedia - Script for linking to the associated Wikipedia article. It should only link to a Wikipedia article of (effectively) the same name. As confirmation, the bot should first check that the subject article is linked from the Region, City or Other destinations sections of a referring page that also has a Wikipedia link, and that the referring page's partner Wikipedia page also links to the target page. It should also check for page uniqueness, and that the Wikipedia page is not a stub, a disambiguation page, on Votes for Deletion, or an otherwise suspect article.
Links to Open Directory - Script for linking to the associated Open Directory regional category page. It should only link to an Open Directory regional page where the Open Directory category has effectively the same name. As confirmation, the bot should first check that the subject article is linked from the Region, City or Other destinations sections of a referring page that also has an Open Directory category link, that the referring page's partner Open Directory category page also links to the target page (either directly or via a Localities list), and that the target Open Directory category is in the same regional hierarchy but one level lower than the referring page's category.
Disambiguation Page Identifier - Script for identifying disambiguation pages. It looks at all main namespace pages linked to the Template:Disamb page and confirms they are disambiguation pages that have a disambiguation notice on them. Any that do not are added to the Wikitravel:Links to disambiguating pages article.
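The mutual-link confirmation that the first two proposals share could be sketched roughly as follows. This is only an illustrative guess at the logic; `links_from` is a made-up helper standing in for however the bot would actually fetch a page's outgoing links:

```python
def confirmed_pair(wt_title, wp_title, links_from):
    """Trust a Wikitravel<->Wikipedia pairing only if each side's
    partner page links back to the other."""
    return (wp_title in links_from('wikitravel', wt_title)
            and wt_title in links_from('wikipedia', wp_title))

# toy link graph, purely for illustration
graph = {
    ('wikitravel', 'Paris'): {'Paris'},
    ('wikipedia', 'Paris'): {'Paris'},
    ('wikipedia', 'Lyon'): set(),
}

def links_from(site, title):
    return graph.get((site, title), set())
```

The real script would of course also need the referring-page and stub/disambiguation checks described above.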
The "link to Wikipedia" bot is on my TODO list, along with its evil twin the "link from Wikipedia" bot. But I'll be away for the next month so don't expect to see this until late Sep at the earliest. Jpatokal 21:55, 18 Aug 2005 (EDT)
We now have Wikitravel available in 6 languages, and new languages will appear soon. It is becoming hard and tedious to maintain inter-language links between all versions, and I think a bot would be great for that job. InterLangBot already exists, but it doesn't quite match this goal, as it runs only between the German and English versions. InterLangBot should read all articles available in English, check all inter-language links in each article, and make links between all these versions. The problem is with pages that don't exist in the English version yet. What do you think about extending InterLangBot? In which way? --Quirk 08:49, 6 Jul 2005 (EDT)
As far as I know the InterLangBot is not running even for the German version anymore... and yes, it should definitely be extended to cover all of Wikitravel.
Wikipedia has some fairly advanced bots doing this, maybe we should try to port one of them here? Jpatokal 21:02, 6 Jul 2005 (EDT)
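For reference, the core of any such bot - reading the inter-language links out of an article's wikitext - is straightforward. A hedged sketch (the `[[xx:Title]]` syntax is standard MediaWiki, but the function name here is made up):

```python
import re

# matches standard MediaWiki inter-language links like [[de:Berlin]] or [[ja:Tokyo]]
INTERLANG_RE = re.compile(r'\[\[([a-z]{2,3}):([^\]\[]+)\]\]')

def interlang_links(wikitext):
    """Return a {language_code: title} dict for every inter-language
    link found in the given wikitext."""
    return {lang: title for lang, title in INTERLANG_RE.findall(wikitext)}
```

The hard part, as noted above, is deciding what to do when the link graph is incomplete or inconsistent between versions, not the parsing itself.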
Sample checking code for pywikipedia framework
I'm happy to report that Kunibotto, the bad boy of the bot world, has joined the light side of the force. Here's the pywikipediabot code added to wikipedia.py needed to make it happen:
def allowed(self):
    # URL to your bot's Run page goes here
    path = self.get_address('%E5%88%A9%E7%94%A8%E8%80%85:Kunibotto/Run')
    txt = getUrl(self, path)
    ok1 = 'yes' in txt
    # URL to the systemwide Run page goes here
    path = self.get_address('Wikitravel:%E3%82%B9%E3%82%AF%E3%83%AA%E3%83%97%E3%83%88%E3%81%AE%E5%9F%BA%E6%9C%AC%E6%96%B9%E9%87%9D/Run')
    txt = getUrl(self, path)
    ok2 = 'yes' in txt
    output(u'User-run: %s System-run: %s' % (ok1, ok2))
    return (ok1 and ok2)
Note that the bizarre URLs above are needed because this runs on the Japanese Wikitravel and has to deal with Unicode; for English you can replace them with "User" and "Script_policy" respectively. The actual check can then be done inside your robot like so:
# check if blocked
allowed = self.site.allowed()
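In isolation, the semantics of the check are simply that both the bot's own Run page and the systemwide Run page must currently contain "yes". A standalone sketch of the same logic, with the page texts passed in directly for illustration:

```python
def run_allowed(user_run_text, system_run_text):
    """Mirror of the allowed() check above: the bot may run only if
    both Run pages currently contain the word 'yes'."""
    ok1 = 'yes' in user_run_text
    ok2 = 'yes' in system_run_text
    return ok1 and ok2
```

Either page being edited to anything without "yes" in it stops the bot on its next check.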
For posterity, here's what I needed to do to automatically upload country stubs to the Japanese Wikitravel.
Write up a table of all countries in Japanese and English.
Write bot number one to snarf maps and flags off the CIA World Factbook and upload them all.
Write bot number two to scrape the map/flag code for each country from the English Wikitravel.
Write bot number three to scrape country data from the Japanese Wikipedia. After a couple of abortive attempts with regexps, I gave up and went with a real HTML parser, namely Beautiful Soup, which does a good job. I also needed to write a converter to swap between raw UTF-8 and the HTML-escaped flavor used by MediaWiki in URLs.
Write bot number four to take the data compiled by the previous bots, inject it into a template, and upload that template to the Japanese Wikitravel.
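The UTF-8 / URL-escaping conversion mentioned in bot number three can be done with the standard library. This is only an illustrative guess at what the converter did, assuming modern Python's urllib rather than whatever the original bot used:

```python
from urllib.parse import quote, unquote

def to_url_form(title):
    """Convert a raw page title to the percent-encoded, underscore
    form MediaWiki uses in URLs."""
    return quote(title.replace(' ', '_'), safe='/:')

def from_url_form(encoded):
    """Convert a percent-encoded URL title back to a raw page title."""
    return unquote(encoded).replace('_', ' ')
```

Round-tripping a Japanese title such as the Kunibotto Run page above reproduces the escaped form seen in the earlier code snippet.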
And oh yes, as bonus challenges, the remote terminal I used was incapable of displaying Japanese, and I didn't know Python at all before I embarked on this.
The last of these bots is the most reusable of the bunch, as it can be used for generating pretty much any type of automated content, so I plan to clean it up and submit it to the pywikipedia CVS. Jpatokal 21:25, 16 Aug 2005 (EDT)
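The template-injection idea in bot number four reduces to filling a wikitext skeleton from a data record. A minimal sketch with invented field names - the real template and its parameters are not shown in this discussion:

```python
# hypothetical country-stub skeleton; the template name and fields are
# invented for illustration only
STUB_TEMPLATE = u"""{{Country
| name = %(name)s
| capital = %(capital)s
| flag = %(flag)s
}}"""

def render_stub(record):
    """Inject one country's data record into the wikitext skeleton."""
    return STUB_TEMPLATE % record
```

A bot built this way stays reusable because only the skeleton and the field names change between content types.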
Should this process be moved to Shared? Eg. StatScript actually runs on Shared now, not en. Jpatokal 02:11, 1 November 2006 (EST)
I'd rather it did. Most bots are useful across multiple language versions, and Shared is where all language versions (are supposed to) coordinate activities. For most language versions, I think approval on Shared, and the participation of a bureaucrat from the individual language version, should be enough to allow the bot flags to be put in place. --PeterTalk 15:51, 19 August 2009 (EDT)
Any more opinions on this? If not, I'm going to plunge forward with this. Jpatokal 02:24, 4 September 2009 (EDT)