Wikitravel:Script nominations
VolkovBot

This interwiki bot uses the standard pywikipedia framework and is operated by wikipedia:ru:User:Volkov (sysop @ ru.wiki) on many Wikimedia projects. I think it will be helpful to keep an eye on interwikis on Wikitravel as well and update them when needed. The bot is supposed to be run periodically. --Volkov 03:13, 19 August 2009 (EDT)

How is this different from the already-operational Tatatabot? - Dguillaime 03:26, 19 August 2009 (EDT)
I don't think it's much different, but my guess is that having a backup is not a bad idea. Up-to-date interwiki links will be helpful for all wikitravellers ;) --Volkov 03:32, 19 August 2009 (EDT)
Support. Seems to be doing a good job already, and redundancy is good. Volkov, if you're up for a challenge, how about also automating Wikipedia<->Wikitravel links? We use [[WikiPedia:XXX]] (note: case may vary), while Wikipedia usually uses the {{wikitravel}} template (cf. Template:Wikitravel). Jpatokal 04:46, 19 August 2009 (EDT)
Support. A backup to Tatatabot won't hurt, and it would be awesome if there were (additionally) a way to automate Wikipedia links, although that isn't a prerequisite for supporting this bot. -- Ryan • (talk) • 10:48, 19 August 2009 (EDT)
Support 76.87.85.16 16:20, 22 August 2009 (EDT)
Revision as of 12:25, 24 October 2012

According to the Wikitravel script policy, scripts have to be approved by the Wikitravel administrators. To create a script that runs against Wikitravel, post the name and reason for the script beneath the line below.

Explain why we need the script, why it can't be done by hand, and what the script will do. If 2 administrators voice their support for the script within 7 days, and none oppose it in that period, the script can be run.

Scripts that have passed through this process can be found in Wikitravel:Approved scripts.

NOTE: you must apply for approval on each language version of Wikitravel. Approval on this page only allows you to run a bot on Wikitravel in English.



YggdrasilBot

I'd like to create a script to add "isIn" information for as many articles as possible. We've done a lot by hand already, but I think it might make sense to do a pass automatically. I'm calling it "YggdrasilBot", since its intention is to build a Tree of the World. I think it'll be tricky, but here's how it will work:

  1. It will start with the main 7 continents as input, plus other top-level areas like Central America and the Caribbean.
  2. For each input article, it will read the article, and look for sections of the article with particular names ("Sections", "Regions", "Countries", "Other destinations", "Districts", "States", "Provinces", "Counties", etc.). This will be configurable so that we can use the same bot for different language versions.
  3. In each of those article sections, it will look for links of the form "* [[Link]]" -- that is, only links where the link is the first item in a top-level list. This is the Wikitravel:Manual of style format for listing regions, cities, etc. It will ignore second- or higher-level links in hierarchical lists, and links embedded in text.
  4. It will add the articles referenced by the links to its queue of input articles.
  5. It will note that the input article is a parent for each of the linked articles -- making a two-way relationship between the article being read and the linked articles.
  6. Once it has finished with the input, it will do some calculations on the graph it's built. For each node (destination), it will look at each parent. It will discard any parent that is an ancestor of one of its other parents. Then, it will choose the first parent out of the remaining list.
  7. After the calculations are done, for each node, it will check to see if the article for the node exists, and if the node already has a parent defined (by reading the RDF data). If it exists and doesn't already have a parent, the bot will append an {{isIn|Parent}} template to the bottom of the article.

I'm going to write this over the next few days (either in Pywikipediabottish or Perl + WWW::Mediawiki::Client), and run it next week (pending approval). The code will be available here and from my arch repository.

I'm trying to be as conservative as possible with this, but I'm going to try it out on a test repository first. --Evan 15:37, 9 Dec 2005 (EST)

Support. As soon as it can run on even the most conservative and obvious cases please get it running. -- Colin 00:23, 11 Dec 2005 (EST)
Good idea = Support. A point though. While regions are able to be parts of the branches, the cities can only be the leaves on the end. The regions should have city destinations listed on them, the cities (except huge ones with districts) should not. Also, if a city has an appearance in multiple region articles then the most distant or deepest region should be chosen as the IsIn link. There should also be some consistency, with all city articles off the same region, including the existing ones, getting the same IsIn link. Thus if an IsIn link already exists on a sister page then the validity of that IsIn should be checked for consistency with all the other sister pages. This could also mean not adding an IsIn link if there are too many sister links off one page - possibly because there need to be more regions. Finally, in situations where regions are not built out, but cities are, there is the potential for the IsIn link to be misassigned too early in the tree. This could be overcome by the Bot changing the IsIn link once a region is built out or else reporting misplaced IsIn's, excessive IsIn's and missing region pages. The assumption here is that a human added IsIn is more reliable than a Bot one. -- Huttite 00:59, 11 Dec 2005 (EST)
First: the point of step 6 is to make sure that only the deepest region is used. Second, what about adding a lint comment to the talk page of articles that have weird conditions? That would give us the benefit of the automated tool without writing to the actual article when there's a strange case. --Evan 13:04, 12 Dec 2005 (EST)
Support. This is similar to User:Elgaard#Wikitravel_as_book (which I hope to replace by a script that use isIn). --elgaard 16:35, 11 Dec 2005 (EST)
I'd support this as a maintenance tool, but would it be possible to first create a list of articles that still need isIn information, and then wait a week or two to see how many people can manually set up? At the moment it seems like isIn is doing wonders for getting people to create parent regions and more clearly decide what belongs where, so there seems to be some value in holding off for a bit longer. -- Ryan 16:43, 11 Dec 2005 (EST)
Support. Ryan has made a good point though..... Why don't we hold off on running the bot until after New Year....? Paul James Cowie 18:35, 11 Dec 2005 (EST)
Me too. I don't think this is terribly urgent, so let's let humans do the hard part first. Jpatokal 01:32, 12 Dec 2005 (EST)
Just for info's sake: I just did a query, and of 6164 non-redirect, non-disambiguation pages in the main namespace, 1536 have a link to the "isin" template. How about this: I'll get the script ready to go and run it against a test database, and then let's wait till January to run it. I'd like to actually have it go on a scheduled basis once every few days to keep the entire site well-stitched. --Evan 12:55, 12 Dec 2005 (EST)
I'm for the script, especially if it uses the perl library, and if you give me some feedback on its design and any bugs you find. -- Mark 04:58, 12 Dec 2005 (EST)
How would you handle the case of one place belonging to two regions, such as Lake Tahoe or Niagara Falls? Also some links that this bot should look at have "The" or some other short word before them. -phma 09:35, 21 Dec 2005 (EST)
I support the script as well. Having nearly 5000 articles to do by hand would take an awfully long time, so automation is good. Weekly sounds nice, too. -- Ilkirk 10:30, 30 Dec 2005 (EST)
Can the bot update a Special:Yggdrasil or a Wikitravel:Yggdrasil page with the actual tree of the world? I am thinking of a page where the world's continents, countries, regions, states and cities are shown in the form of a hierarchical tree, which can be expanded and collapsed when needed? It would be great if the article's status were also shown, so if I want to find all the stub articles under India, I could do it at one go. --Ravikiran 05:25, 4 Jan 2006 (EST)

AutotranslateBot

As part of the autotranslation feature I'm going to start testing a bot and adding autotranslations to subpages of article Talk pages (i.e. /Parigi:Discussione/Translation for an Italian version of the English article, or /Talk:Cologne/Translation for an English version of the German article). I'm going to start out with half-a-dozen or so, for feedback, and then work on small batches where it would be useful. Majnoona 14:26, 1 July 2006 (EDT)

AntiVandalBot

WikiPedia currently runs a bot named WikiPedia:User:AntiVandalBot which uses pattern matching to revert obvious vandalism. The vandalism reversions it makes seem to be remarkably accurate and include everything from the "Bob is gay" edits to page blanking (see WikiPedia:Special:Contributions/AntiVandalBot). It also leaves a note on the user page for an author whose change was reverted, and will not revert a change twice when in "calm" mode, which is the default - there is also an "angry" mode. They have a WikiPedia:User:AntiVandalBot/requests page for getting it to run on Wikis other than Wikipedia, and I'd like to try to get it running on English Wikitravel. Should there be any problems with the bot the failsafe is for an admin to block the user the bot runs under, so it seems like it would be safe enough. -- Ryan 17:43, 18 July 2006 (EDT)
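The pattern-matching idea can be illustrated with a toy scorer like the one below. These patterns and weights are invented for illustration; they are not AntiVandalBot's actual rules.

```python
import re

# Hypothetical vandalism patterns with weights (not the real rule set).
VANDAL_PATTERNS = [
    (re.compile(r"\b(\w+) is gay\b", re.I), 5),   # "Bob is gay" edits
    (re.compile(r"(.)\1{9,}"), 3),                # long repeated-character runs
    (re.compile(r"\b[A-Z]{10,}\b"), 2),           # shouting
]

def vandalism_score(old_text, new_text):
    """Score an edit; blanking a non-empty page gets the maximum score."""
    if old_text.strip() and not new_text.strip():
        return 10
    score = 0
    for pattern, weight in VANDAL_PATTERNS:
        if pattern.search(new_text):
            score += weight
    return score

def should_revert(old_text, new_text, threshold=5):
    """Revert only above a threshold, to keep false positives rare."""
    return vandalism_score(old_text, new_text) >= threshold
```

A "calm" mode like the real bot's could be layered on top by refusing to revert the same page twice.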

  • Support. Tawkerbot2 does fine work on Wikipedia. For the few pages I have on my watchlist, it usually gets the silly vandalism first before anyone else has a chance to take a whack at it. I've reviewed Tawkerbot2's contribution log a few times, and it looks perfect to me. -- Colin 17:53, 18 July 2006 (EDT)
  • Question: WikiPedia:User:Tawkerbot2/requests mentions a cost for the software; is this an issue? - Todd VerBeek 19:24, 20 July 2006 (EDT)
    • If that is an issue I don't see why we can't handle vandalism manually. Of course it'd be nice not to have to worry about a cleverly hidden "I want to eat your children", "Bob is gay", or wwww.crazyfarmsexffffffffffffff.com. Aside from the aforementioned concern of cost, the Tawkerbot page also says something about needing a separate server to run off of; would that be a problem? -- Andrew Haggard (Sapphire) 19:32, 20 July 2006 (EDT)
    • The comment about a fee has been added recently, and it wasn't added by the script's author so it's not clear to me if it's a fee that's going to be charged, or if it was just a suggestion by someone else to start charging. If I make a request to get the bot running here and they say there's a fee then all bets are off - this is just something that would be "nice to have", it's not a necessity. As to a separate server, I think that's the script's author saying he wants a separate server to run it from, although I guess he's also looking at a P2P solution. -- Ryan 19:35, 20 July 2006 (EDT)
      • We can run it off the old wikitravel.org server (it just runs the mailing lists, now... and my blog, and Maj's home page, and a couple of other miscellaneous sites); I'm not sure how I feel about any additional fees for use, but I can ask IB about it. --Evan 19:46, 20 July 2006 (EDT)
  • Support, provided everything pans out. -- Andrew Haggard (Sapphire) 19:43, 20 July 2006 (EDT)
  • A couple of notes on the fee thing: mostly that was mentioned as we were in a major server crunch at the time, and neither of us could afford the $150 or so a month for a server capable of running it on all the wikis. In short, the script right now can be grabbed from an svn; do you have an IRC recent changes feed? That's the one other major thing that it needs. -- 207.216.238.134 12:23, 21 July 2006 (EDT)
    • A feed can be set up if needed, just let us know any next steps by leaving a comment on my talk page or on User_talk:Evan#Tawkerbot. -- Ryan 16:02, 21 July 2006 (EDT)
      • I gotta say, though: these notices are generated in MediaWiki, then sent to an IRC server, and read by a bot, then sent through a Web interface back to the MediaWiki server... it sure seems like the long way around the block! --Evan 16:10, 21 July 2006 (EDT)
        • So should we just do manual reverts or click the rollback button instead? - Andrew
          • No, I'm just wondering if we could short-circuit the process by taking the pipe where warnings come out and twist it around so it fits into the hole where reverts go in. --Evan 17:08, 21 July 2006 (EDT)
            • Update: the bot has been renamed and as of 24 October 2006 is running on a new machine, so with any luck we might see some action on this one soon. -- Ryan 03:09, 2 November 2006 (EST)

SpamFilterBot 2.0

So Evan and I were discussing various ideas about Wikitravel and I think it would be great if we could get a bot up and running which updates the spam filter across all language versions, Shared, and Review, and keeps all of the spam filters consistent so as to prevent EN spammers from spamming ES, IT, etc. Whoever writes the script gets a barnstar. On the count of three... 1... 2... GO! -- Andrew Haggard (Sapphire) 23:25, 6 August 2006 (EDT)

Would it be easier to just have a single spam filter on shared? -- Ryan 22:11, 15 August 2006 (EDT)
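Either way, the core of such a bot would just be a union-merge of the per-wiki blacklists. A minimal sketch (illustrative only; nothing here is from an actual implementation):

```python
def merged_blacklist(per_wiki_filters):
    """Union the spam-filter lines from every language version (en, es, it,
    shared, ...), dropping duplicates but keeping first-seen order, so one
    merged filter can be written back to all of them."""
    seen, merged = set(), []
    for lines in per_wiki_filters:
        for line in lines:
            line = line.strip()
            if line and line not in seen:
                seen.add(line)
                merged.append(line)
    return merged
```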

Dotmodroid

As updating DotMs/OtBPs by hand is annoying (update front page, place into index, remove from current discussion, slot into archive), I'd like to build a bot along the lines of DiscoverBot to automate the process. (Wikipedia might have something snarfable.) Ideally, it could maintain all Wikitravel language versions that have active DotM processes, and it would be smart enough to do nothing if the queue is empty. Jpatokal 02:14, 1 November 2006 (EST)

  • Support. Dotmodroid is a good name, but when will we see the JaniBot? -- Ryan 02:44, 1 November 2006 (EST)
What would JaniBot do? Travel the world, apply for Indian visas for people, speak eight languages semi-decently and write new scripts for more bots? -- Andrew H. (Sapphire) 03:05, 6 November 2006 (EST)
"Would"? Ha! The useless bloated sac of protoplasm known as User:Jpatokal was long since terminated and replaced by JaniBot. Prepare to be assimilated. -- JaniBot 03:08, 6 November 2006 (EST)
 :) -- Andrew H. (Sapphire) 03:14, 6 November 2006 (EST)
This came to mind after reading this. -- Andrew H. (Sapphire) 04:19, 25 November 2006 (EST)
  • Support. Sounds like a good way to deal with a tedious task. --Evan 10:43, 2 November 2006 (EST)
  • Support. Have I made life easier or tougher for the bot by modifying the format of the DotM candidates page? I am guessing easier, as it can just snarf the stuff between the div tags, but you never know with bots... — Ravikiran 21:23, 13 November 2006 (EST)

CommaBot

I noted this morning that someone had started Newark, New Jersey when we already had a Newark (New Jersey). I wonder if it would be useful to have a bot that added redirects for US and Canadian towns where a comma-separated city-and-state is a common naming convention. I'd probably just use the US and Canadian cities from the UN/LOCODE list, which would leave out the dots-on-the-map, but would cover most of the countries. It would make redirects of the form City, State and City, ST to City (State) or just City (depending on if the article exists). --Evan 14:22, 22 November 2006 (EST)
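The proposed mapping might look like the sketch below. The abbreviation table excerpt and function names are invented for illustration; a real run would derive the full city/state list from UN/LOCODE.

```python
# Excerpt of a state -> postal-abbreviation table (illustrative only).
STATE_ABBREV = {"New Jersey": "NJ", "California": "CA", "Ontario": "ON"}

def comma_redirects(city, state, disambiguated):
    """Return (redirect_titles, target). `disambiguated` is True when the
    article lives at "City (State)" rather than at plain "City"."""
    target = f"{city} ({state})" if disambiguated else city
    titles = [f"{city}, {state}"]          # "City, State" form
    if state in STATE_ABBREV:
        titles.append(f"{city}, {STATE_ABBREV[state]}")  # "City, ST" form
    return titles, target
```

For example, Newark would get redirects from "Newark, New Jersey" and "Newark, NJ" to the existing "Newark (New Jersey)" article.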

When the article already exists as "City", how do you plan to ensure that it is the right city? Walk the isIn tree until you get to a state? Check for a Wikipedia link? (I mention this half as clarification, and half as "I may want to do something similar") Do you plan on making these redirects only for existing articles? -- Colin 15:29, 22 November 2006 (EST)

MapmakerBot

I'm very close to being able to fully automate the process of turning a Wikitravel article combined with OSM data into a mostly-usable map. I would like to propose creating a bot account on shared for uploading the results of this process. When complete the MapmakerBot would do this:

  1. Watch pages in a given category, like say "Has Map"
  2. When anything changes on one of those pages, regenerate the map from configuration found in the map image on the page and OSM data
  3. Upload the SVG result of that map generation to shared
  4. Wait for someone to tweak the SVG file uploaded
  5. Retrieve the SVG
  6. Convert it into a properly-sized PNG
  7. Upload the PNG

To see examples of the sort of configuration the map generation requires, have a look at the Paris articles. At some point I'll write up some documentation on how to use the thing. -- Mark 11:42, 21 May 2009 (EDT)
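One polling pass over the seven steps could be structured as below. The entire interface here is invented for illustration (it is not the osmatravel code); in particular, `svg_tweaked` stands in for however the bot detects that a human has edited the uploaded SVG.

```python
def mapmaker_pass(wiki, renderer, category="Has Map"):
    """One polling pass. `wiki` and `renderer` are any objects providing
    the small (hypothetical) interface used below."""
    uploaded = []
    for page in wiki.changed_pages(category):             # steps 1-2
        config = wiki.map_config(page)                    # config from map image
        svg_name = page + " map.svg"
        if wiki.svg_tweaked(svg_name):                    # steps 4-5
            png = renderer.svg_to_png(wiki.fetch(svg_name))  # step 6
            wiki.upload(page + " map.png", png)           # step 7
            uploaded.append(page + " map.png")
        else:
            svg = renderer.render_svg(config)             # regenerate from OSM
            wiki.upload(svg_name, svg)                    # step 3
            uploaded.append(svg_name)
    return uploaded
```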

Support. -- Colin 16:15, 21 May 2009 (EDT)
Support --Stefan (sertmann) Talk 16:19, 21 May 2009 (EDT)
Support. Jpatokal 05:43, 22 May 2009 (EDT)
Just in case you're curious, I've pushed the source code to http://gitorious.org/osmatravel -- Mark 11:11, 22 May 2009 (EDT)
I've created the account on shared. I don't remember, is there a process to request the bot bit be flipped? -- Mark 11:22, 22 May 2009 (EDT)

weTravelWrite.com

iTravelWrite is an iPhone app (currently available, and free, in the App Store) intended to make it easy for people to update Wikitravel from their phones - with GPS location information, in particular. This will add to the lat/long data available on Wikitravel, making mapping much easier, and will hopefully increase accuracy too, by allowing people to make edits while literally on the spot.

The app is built to funnel updates through a central service, a Python script running on Google App Engine, at www.wetravelwrite.com. This reduces the bandwidth cost for the phone app considerably, provides a central point of logging and control, and lets users jot down quick notes that they can later expand into edits when they're at a real computer. To be clear, this is a script, but not a bot: it only acts in response to specific user requests from the iPhone app.
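The single-point-of-control idea can be sketched as a tiny relay like this. All names here are invented for illustration; this is not the wetravelwrite.com code.

```python
class EditRelay:
    """Central relay between the phone app and the wiki: one switch
    disables all app edits at once, and every request is logged."""

    def __init__(self, post_edit):
        self.post_edit = post_edit   # callable that performs the wiki edit
        self.edits_enabled = True
        self.log = []

    def handle(self, page, text, summary):
        self.log.append((page, summary))          # central logging
        if not self.edits_enabled:
            return "edits temporarily disabled"   # single point of control
        # Tag the edit so it can be identified in page histories.
        return self.post_edit(page, text, summary + " (via weTravelWrite)")
```

Flipping `edits_enabled` off is what "disabling edits from the app" amounts to, without touching the phones at all.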

I plunged forward and built the app and service without previously asking for an OK here - sorry about that! I've disabled edits from the app for the moment, until (/if) the script is approved, and also until I fix a bug in how it handles non-ASCII characters. (Which in turn is an example of why a central point of control is useful.)

Rezendi 15:18, 25 August 2009 (EDT)

  • Support as long as the character bug is resolved, I'm a happy camper. --Stefan (sertmann) Talk 15:31, 25 August 2009 (EDT)
  • As an aside, I've noticed that some coordinates that have been added so far are enormously incorrect - for example, this edit [1] to Brisbane actually has coordinates for Ohio [2]. I don't think we can deal with errors like that automatically, and it would be easy for more to slip past... does the app default to adding current coordinates? If so, should it? - Dguillaime 15:54, 25 August 2009 (EDT)
    • The app has an "I'm There!" button that plugs in the current coordinates (or at least what the phone believes to be the current coordinates); people can also use it to enter lat/long coordinates by hand, if they are truly hardcore. Rezendi 21:10, 29 August 2009 (EDT)
  • Support. Can someone provide a test server for Rezendi to debug his app against? If no test server is available, I support testing it here. -- Colin 17:25, 25 August 2009 (EDT)
  • Oppose - for now. There has to be some way of turning it off, and giving the user a message from WT. Having to block IP addresses if it malfunctions is unacceptable, and may even be impractical if it is installed widely. If the Wikitravel API changes, or we change sections, or definitions, there has to be some way of telling people that they need to update, and to prevent them from making changes that will damage the system without having to block them entirely. I would suggest that the program use a particular user account, or the existence of a page, to refuse to update and deliver a user message. Possibly this file could be based on the installed version. --inas 17:39, 25 August 2009 (EDT)
Actually, I don't think iTravelWrite is a script at all. In effect, it's a very specialist browser not too different from (say) a Firefox mod, operated by real live humans, and technically it doesn't need our permission.
That said, I do see the potential for damage here, and I applaud User:Rezendi for letting us know about this. Careful testing is necessary, and there should be simple safeguards built in, eg. not changing any existing coordinates and a check at startup to see if the program is allowed to run. Jpatokal 23:48, 25 August 2009 (EDT)
Don't worry - while the iPhone app can be widely distributed, it was deliberately designed so that it doesn't connect to Wikitravel directly; instead, it connects to the Web server at weTravelWrite.com, and hence all edits from the iTravelWrite app go through there. Meaning that there is a single point of control at which edits from the app can be turned off for everyone. (As is the case at the moment, until I fix the special-character bug, which will hopefully be very soon after my paid work becomes less time-intensive than it is at the moment.)
I'm happy to set it up so that all edits use an "iTravelWrite" user account, or, alternately, to poll a page on Wikitravel to see if edits are OK (although the first solution seems more elegant.) Rezendi 17:22, 26 August 2009 (EDT)
Agreed — the first option is better. Gorilla Jones 18:45, 26 August 2009 (EDT)
I'd like to add that I think the overall idea is great. We need more original research from travellers for WT to progress, and giving more people the ability to do updates on the road is great. However, I also see an essential part of a wiki being able to attempt communication between editors, and to have some identification that a group of edits were done by the same person. It's part of the collaborative approach. It's why talk pages exist. It's why we try to insist every message is signed. It is why the IP or account is logged on every edit. If everything is coming from one IP, and the users are anonymous, then we don't really have any reverse path to communicate with the user about their edit. We can't point them at the mos, to correct a mistake they are making. If someone is a vandal, we have no granularity of control. Has this sort of thing been done for other wikis, so that we can see how they manage these issues? Has anyone written an iPhone WP editing application? Would it be too hard for the application to allow users to create accounts on WT? --inas 22:12, 26 August 2009 (EDT)
I do see your point. Lumping all edits together under the same user would limit accountability and communication. It is possible to have users enter their Wikitravel usernames and passwords into the app, and forward them onwards, although there might be security concerns, as the passwords would be in plaintext. (Although, um, now that I look at it, it seems the main Wikitravel login page uses http not https, so it wouldn't be any less secure than the status quo.) You would then be able to distinguish edits from the app via a) the edit summary comments, which will always include "weTravelWrite"; b) the user-agent, I think; c) the IP number (though c) is less reliable as it depends on Google App Engine.)
I could also log the iPhones' unique device IDs, and pass them in as part of the edit summary, which might be helpful in terms of identifying edits from a specific user. I could even pass on the IP addresses they're using ... but really, that wouldn't help - they'll change frequently, both over time and as users move from network to network to Wi-Fi site. Ultimately, as people move to using their smartphones, IP numbers are just going to be less and less meaningful; it's a problem many sites are going to have to deal with.
Would that be OK? If the app requests a Wikitravel login from its users, so you can identify them individually, and if I pass in the iPhone's unique device ID regardless? Then you should be able to distinguish iTravelWrite edits from other edits, and also group edits by unique phone IDs, even if they don't choose to log in. It won't be as convenient as it has been via IP numbers, but, well, that's the mobile Web for you. Rezendi 13:22, 29 August 2009 (EDT)
Also, it seems that I have fixed the non-ASCII-characters bug (though I won't be completely certain until I can test against real data) so I'm ready to proceed as and when you guys are. Rezendi 21:07, 29 August 2009 (EDT)
I like the idea of each user using their own Wikitravel account. They can be fully fledged community members then. Is there something we can do to make it easier using OpenID? I mean, could you run a server on your site? That way you could sort of insert their ID seamlessly, and then use it as their Wikitravel ID. I'm not trying to make things more complicated, just thinking out loud. If it all sounds too hard, I'm happy with the users creating a real Wikitravel account. --inas 22:20, 30 August 2009 (EDT)
Hmm. That's an interesting notion. App Engine uses Google Accounts, so once users authorize themselves on my server, I think I can then seamlessly log them into Wikitravel using OpenID. The problem is the authorization between their phones and my server -- I'm not comfortable with transmitting Gmail/Google Account passwords in plaintext, and https is hard to do (and slow) on an iPhone. Tell you what: I'll let them use their existing Wikitravel username and password, if any, and I'll look into doing a Google Account / OpenID login if that's available and a Wikitravel ID isn't provided. (No guarantees until I've analyzed the problem in detail, though...) Rezendi 16:58, 31 August 2009 (EDT)

Update: I have submitted a new version of the iTravelWrite app that fixes bugs and lets users enter their Wikitravel username and password, and won't allow edits unless they either do that or provide an email address (which will be added to the edit comments.) Once it's accepted, I'll withdraw the old version of the app and turn editing back on. Rezendi 15:23, 8 September 2009 (EDT)

PoI File

This is more of a testing-the-waters idea at the moment, and comes from a question I asked in the Travellers' Pub. Given that many articles are starting to get latitude and longitude tags in listings, and given the popularity of GPS (not just SatNav devices - many mobile phones are starting to have GPS built in), would there be any use in creating a script that can scour a page for listings containing lat/long tags and create a PoI file for that page?

I would propose to use the GPX standard [3], since it seems to be the best attempt at a common format (and I know Garmin MapSource will open these files). It's then just a case of how to present it to the reader - do we have a /PoI page for each article that shows the raw XML of the GPX file, which the reader then has to copy and paste into a text file (not my preferred way), or is there perhaps some way to generate a file that can be downloaded straight away as a GPX file?
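Generating the per-page GPX could be as simple as the sketch below. The `(name, lat, lon)` tuple format is an assumption for illustration; a real script would first parse the lat/long attributes out of the listing tags.

```python
import xml.etree.ElementTree as ET

def listings_to_gpx(listings):
    """Build a GPX 1.1 document (waypoints only) from an iterable of
    (name, lat, lon) tuples scraped from a page's listings."""
    gpx = ET.Element("gpx", version="1.1", creator="PoI script",
                     xmlns="http://www.topografix.com/GPX/1/1")
    for name, lat, lon in listings:
        wpt = ET.SubElement(gpx, "wpt", lat=str(lat), lon=str(lon))
        ET.SubElement(wpt, "name").text = name
    return ET.tostring(gpx, encoding="unicode")
```

The resulting string could either be shown on a /PoI page or served with a `Content-Disposition` header so it downloads directly as a .gpx file.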

Anyway, if people think it would be of use and would fit in with policy (I suspect it should, since it's generally speaking a read-only script where the article pages are concerned), and can also point me to some useful articles or examples on how to create scripts for MediaWiki, I would be interested in giving this a go. Nrms 03:07, 8 March 2010 (EST)

Cluebot on wikitravel

ClueBot [4] and ClueBot NG are bots that run on Wikipedia and serve the purpose of reverting vandalism. Both bots have made millions of edits each, with very few false positives. I see there is a discussion here about porting the now-inactive AntiVandalBot over to Wikitravel, but it stalled because of the whole $150 thing. ClueBot, however, is open source [5] and much more effective than the old AVB. If it is at all possible, porting ClueBot or ClueBot NG to Wikitravel will send vandalism rates through the floor. 116.49.154.12 08:15, 24 October 2012 (EDT)
