Wikipedia talk:Bots/Archive 22

WikiBot

I have recently come across an App Store app for iPhone and iPad named WikiBot. It is just a regular wiki reader, one of many available on the App Store for reading (but not editing, I notice) articles. I am concerned the name implies something to do with bots - just wanted to see the community's thoughts on this. The link is at [1]. Perhaps we could convince them to rename it? Rcsprinter (talkin' to me?) @ 23:00, 7 September 2012 (UTC)

Bot

I need a bot to create mass categories (~400), on other wikiproject. What solution can you suggest? XXN (talk) 13:56, 19 November 2013 (UTC)

Can you be a little more specific? or any more vague? Werieth (talk) 14:03, 19 November 2013 (UTC)

Bot that fixes hyphens

I recall seeing a bot making changes such as changing a hyphen to an en dash in a range, for example changing 1998-99 to 1998–99. But I can't remember which bot it was. Does anyone know? — Preceding unsigned comment added by Jc3s5h (talkcontribs) 00:16, 16 December 2013 (UTC)

When citation style guides change

Since citations are ever-popular items for bots to operate upon, I am providing this link to a discussion about what to do when citation template documentation, or editions of external printed style guides, change: WT:CITE#When citation style guides are updated.

Obviously it is important that bot operators who change citations follow the current rules for the citation style chosen for a particular article, or the rules as they existed at the time the article was written. Jc3s5h (talk) 15:25, 8 May 2014 (UTC)

Create a BOT to alphabetize and organize categories automatically

As someone who has been doing this manually for years, I hereby dutifully beg of anyone who is technically proficient and knows how to create and run a bot that will:

  1. Automatically sort all Categories on each article and category page alphabetically;
  2. Create a uniform system for where to place categories on each article and category page that commence with numbers, such as years of birth/death, centuries, and any category that starts with a number/numeral.

Please see the centralized discussion at Wikipedia:Bot requests/Archive 61#Create a BOT to alphabetize and organize categories automatically. Thank you, IZAK (talk) 09:31, 4 August 2014 (UTC)

Discussion re-opened at VPP

Please see Wikipedia:Village pump (policy)/Archive 114#Create a BOT to alphabetize and organize categories automatically. Thank you, IZAK (talk) 22:53, 5 August 2014 (UTC)

Tech help required to improve categories

Please see Wikipedia:Village pump (policy)#CatVisor and User:Paradoctor/CatVisor#Planned features. If you are willing and able to assist this innovative WP project move along, it would be greatly appreciated. Thank you, IZAK (talk) 23:38, 12 August 2014 (UTC)

Bot code issue

This may or may not be a problem with a bot; it could be an AWB issue, according to something I read. But this bot made another 1000 edits after this problem, and it is a problem: bad code should be fixed.

I wrote an article with a heading typo: instead of two equals signs on the left and two on the right, I put one on the left and two on the right. A bot changed it to two on the left and three on the right.

I told the bot operator what happened, and his attitude is that since I made a typo and his bot's edit left a similar typo – the heading markup is still unbalanced – there is nothing wrong with the bot. But the bot is supposed to correct heading levels; it identified this as a levels problem and made a pointless edit, which is a coding deficiency.

Can an experienced coder explain this error to the bot operator and that it should be fixed? He does not understand. (Someone making 1000s of bot edits a day should get this.) Please don't ping me, if you control bots on Wikipedia, you, too, should get this. MicroPaLeo (talk) 23:20, 26 February 2015 (UTC)

I've tried to explain to MicroPaLeo, but he does not understand. Per WP:ACCESSIBILITY, section headers do not start with just one "=". This is CheckWiki error #19. The bot has approval to fix this problem and did fix it. The bot does not have approval to fix unbalanced section headers (more "=" on one side than the other). This is only something that can be done manually in most cases. (Do you add or subtract a "=" from ==Mid-life=== ?) There was an unbalanced section header before the bot arrived and still was one after the bot left. The bot didn't cause any errors, didn't fix any errors that it wasn't supposed to, and did fix the error it arrived there to fix. How is that broken? The bot is not there to fix every error... That's like saying the bot fixed the spelling of "practic", but didn't fix the spelling of "thier" in the same sentence, so the bot should be deactivated. Bgwhite (talk) 23:38, 26 February 2015 (UTC)
"A heading starts with "= XY =". It should be "== XY ==" This is not what your bot did. MicroPaLeo (talk) 00:14, 27 February 2015 (UTC)
Technically it did. The value of " XY " in your statement was "External links=" in the edit. Anomie 00:32, 27 February 2015 (UTC)
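For the curious, the mechanical behavior Anomie describes can be reproduced with a one-line pattern. This is a hypothetical sketch of a CheckWiki error #19 style fix, not the bot's actual code:

```python
import re

def fix_single_equals_heading(line):
    # Hypothetical sketch: a heading line that opens with exactly one "="
    # and closes with "=" gets one more "=" added on each side. The pattern
    # does not check that the heading was balanced to begin with, so the
    # value of " XY " can itself contain a stray "=".
    return re.sub(r'^=(?!=)(.*)=$', r'==\1==', line)

print(fix_single_equals_heading('=Mid-life='))         # well-formed input is fixed
print(fix_single_equals_heading('=External links=='))  # unbalanced input stays unbalanced
```

With the well-formed typo `=Mid-life=` this yields `==Mid-life==`, while the unbalanced `=External links==` yields `==External links===`, exactly the edit under discussion.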
So the value of XY can be anything and the bot will edit? You are going with that? And now, you agree that the problem is fixed with "==External links==="?
Why is there so much hostility towards new editors participating on Wikipedia? There is a category, one category alone, that has 60,000 bad articles from as far back as 2008, there are articles online about Wikipedia losing editors all the time, yet you have a bot that makes pointless edits with a rude edit summary, a bot operator who can't communicate what he is doing, and open hostility towards someone trying to figure out why anyone would waste time writing code that "technically" fixes something by leaving it a mess. I have an editor stalking me because I disagreed with him on the content of his article. You have no instructions on how to communicate with a bot operator except to turn off the bot, and now you want to be smug, too? "Technically" I see why Wikipedia is losing editors. MicroPaLeo (talk) 00:56, 27 February 2015 (UTC)
You came here being hostile, throwing around factually incorrect accusations. You're continuing to be hostile here while accusing everyone else of bad faith. The simple fact is that sometimes a bot will fix one error while missing a different error that a human would see was causing the first error. Accusing the bot operator of bad faith because of this is not helping anything, nor is doing the same to anyone else who disagrees with your assessment. Anomie 21:31, 1 March 2015 (UTC)

MicroPaLeo, surely the bot's logic could be better to check for these cases. But my experience shows that a case like that is not very common. One of Wikipedia's principles is WP:BEBOLD. Feel free to fix articles and improve edits. Bots are good but not good enough. Human editors are always better. Most of the backlogs need human attention, and Wikipedia is based on active editors, not bots. -- Magioladitis (talk) 13:11, 27 February 2015 (UTC)

Although I am not an admin, I have a suggestion about bots on Wikipedia.

There should be a reliability history report generator that generates the reliability of the bot based on how many errors it has gone through since the beginning of its operation. There should also be a bot error log of all the errors the bots have gone through since the beginning of their operation. Doorknob747 (talk) 03:51, 24 March 2015 (UTC)

@Doorknob747: How would you define an error? Not all bot errors are reverted, and not all reversions are bot errors. GoingBatty (talk) 16:31, 24 March 2015 (UTC)
@GoingBatty: I am thinking of something similar to how the Windows Action Center reliability report works. The logs would contain any mistakes the bots have made, any vandalism, or things the bots have done that they were not programmed for. Doorknob747 (talk) 17:04, 24 March 2015 (UTC)
@Doorknob747: Could you be more specific? For example, how would a report generator determine if this bot edit is an error or not? GoingBatty (talk) 02:50, 25 March 2015 (UTC)
Such things are generally tracked on the bot's talk page (or its owner's, if that is preferred). This means only edits that editors think are errors are reported, but that's probably the only way to do it; there's no objective way, certainly no automated tool. It also means editors can take into account the seriousness of the error - a bot incorrectly deleting pages, for example, is far worse than one making changes with no visible effect, even though both are technically errors.--JohnBlackburnewordsdeeds 17:31, 24 March 2015 (UTC)

List of bots

Ok so there's List of bots, which was tantalizing to me but basically useless. Is there any list anywhere that has all the bots and what they do? I was trying to quickly find the name of the bot that handles tagging of new articles when they are found to be copied from other sites. --Hammersoft (talk) 16:49, 17 July 2015 (UTC)

CorenSearchBot (BRFA · contribs · actions log · block log · flag log · user rights) iirc. –xenotalk 17:25, 17 July 2015 (UTC)
Why is the list useless? Because it is not updated? Oops, I thought you were referring to: Wikipedia:Bots/Status (redirect from "Wikipedia:List of bots") –xenotalk 20:04, 17 July 2015 (UTC)
You could try the search engine, like: https://en.wikipedia.org/w/index.php?title=Special%3ASearch&profile=default&fulltext=Search&search=foo+prefix%3AWikipedia%3ABots%2FRequests+for+approval%2F – xenotalk 20:02, 17 July 2015 (UTC)

Semi-protected edit request on 27 November 2015

Could someone add User:JeffGBot to the list of Wikipedia bots? He checks pages for dead links. Here is an example. Thanks, 73.223.175.207 (talk) 00:47, 27 November 2015 (UTC)

Hasn't edited in four years, so it doesn't seem like a great example. There's more than enough already, I think. — Earwig talk 04:28, 27 November 2015 (UTC)

Your Input on Archive Bot Behavior

The following discussion is closed. Please do not modify it. Subsequent comments should be made on the appropriate discussion page. No further edits should be made to this discussion.


Moved to WP:BOWN

Hi there,

What do you think about this edit (and the corresponding edit to the article's talk page)? I find access dates by finding the earliest occurrence of the URL in the article. Is that a good idea? What are the standards for bots that add archives? --Tim1357 talk|poke 02:45, 23 September 2016 (UTC)

Tim1357 I've moved this to WP:BOWN for a better audience. — xaosflux Talk 03:53, 23 September 2016 (UTC)
The discussion above is closed. Please do not modify it. Subsequent comments should be made on the appropriate discussion page. No further edits should be made to this discussion.

Village pump discussion: Bot content with updates

See Wikipedia:Village pump (idea lab)/Archive 21#Bot content with updates. The idea is a category of bot that pulls data from a reliable external source, formats it as text and places it in files that can be transcluded into articles. Obvious types of data for a settlement are terrain, temperatures, rainfall, census data, election results. Abuko is a mock-up of the result, using data generated by User:Lsj's Swedish bot. Edit the location section to see the transclusion. Where this differs from other content generators is that it can be rerun any time to refresh the data, perhaps quite frequently. That may imply a need for rules or processes specific to this type of bot. Any input welcome. Aymatth2 (talk) 16:56, 24 September 2016 (UTC)

Bot is confusing me

The bot? SummerPhD is confusing me. Sausagea1000 (talk) 19:44, 30 December 2016 (UTC)

@Sausagea1000:, SummerPhDv2.0 is not a bot, but a person. You can reach them at User talk:SummerPhDv2.0. — xaosflux Talk 21:44, 30 December 2016 (UTC)

Setting the bot flag on other wikis

The "bot" flag setting option is not obvious, how is it done if the bureaucrat roles don't exist. - 180.149.192.139 (talk) 02:41, 13 February 2017 (UTC)

If a WMF wiki does not have local bureaucrats, bot flags can be set by WMF stewards here: meta:Steward_requests/Bot_status. — xaosflux Talk 04:26, 13 February 2017 (UTC)

I put a new section on how to hide AWB here, since this is often a WP:MEATBOT related issue, but we could move it to a section of WP:AWB if you feel it's out of place here. Headbomb {talk / contribs / physics / books} 15:18, 24 February 2017 (UTC)

Hi, not sure how to proceed with this situation.

It concerns User:Dispenser's Checklinks tool.

Facts as I know them:

  • User:Dispenser is mainly available on IRC (linked on his page)
  • Checklinks has many problems that need to be fixed. Examples:
  • I spoke with Dispenser on IRC a few months ago. He said Checklinks needs to be rewritten. Acknowledges problems. Says he does not have time due to IRL commitments. Says the source is closed due to possible copyvio.
  • Dispenser said it was being used about 6-12 times a day by a relatively small number of editors (going from memory maybe a little less).
  • A new tool is available as of today. Created by User:cyberpower678, it is available here. It accomplishes fundamentally the same thing but is, IMO, a better design (links are monitored over a period of days before being determined dead), has more features (everything IABot does), and has an active developer. It marks links dead but also saves them.

How should we proceed? I suspect Dispenser would be OK with turning it off if requested, but I don't know what kind of community discussion is needed or where. -- GreenC 20:29, 6 March 2017 (UTC)

Bot-on-Bot Editing Wars

  • "Study: Bot-on-Bot Editing Wars Raging on Wikipedia's pages". Sci-tech-today.com. Retrieved March 13, 2017.--Moxy (talk) 05:30, 13 March 2017 (UTC)

Robotic editing on Siri

Not an issue for Bots. Referred elsewhere
 – This is not related to the WP:Bots page. Talk:Siri is the appropriate location to discuss the article.

Is there a page where one can raise complaints as to robotic editing of article pages? I have had a number of improvements to the Siri page – contextualizing or removing unreliable sources – removed by a username who does not seem to do anything but undo edits. 203.109.212.42 (talk) 09:47, 24 March 2017 (UTC)

@Gilliam: This user appears to be talking about your reverts on Siri. -Newyorkadam (talk) 09:50, 24 March 2017 (UTC)Newyorkadam
I have restored the citation needed tags.– Gilliam (talk) 09:54, 24 March 2017 (UTC)

Bot Setup Help (Wikia, not Wikipedia)

Hi! I'm trying to set up some anti-vandal bots and such over on my wiki. I am the owner of the wiki, and right now I use AWB for basic fixes, but vandalism was a big problem in the past before I adopted the wiki and reverted all of it. Can someone help me learn to use a bot for my wiki? I currently run on Windows 10. Thanks! FiveCraft (talk) 23:06, 1 April 2017 (UTC)

Bot account questions

I am new to bots and eventually plan to develop a bot for Wikidata which will compare information published by the US Bureau of the Census to information about New England towns and make appropriate edits. It's possible that this might be adapted to edit the corresponding Wikipedia articles later.

For now I'm just following the tutorial at wikidata:Wikidata:Pywikibot - Python 3 Tutorial, which will not involve making any edits except to sandbox items, but it will involve reading data from Wikipedia. I'm wondering when the best time to create a bot account is: now, or when the nature of the edits will require approval? I am aware that separate approvals would be needed for each project where non-sandbox edits are to be made.

I am also wondering about the preferred method for logging in, or equivalent, and whether the same credentials will work on both Wikidata and Wikipedia. Jc3s5h (talk) 18:01, 30 July 2017 (UTC)

@Jc3s5h: You can create bot accounts whenever you want, the restrictions for non-approved bots can be found at WP:BOTPOL#Valid operations without approval. For logging in, we also require that new bots make use of WP:API#Assert. Others might have more specific advice with Pywikibot / Python 3. Headbomb {t · c · p · b} 19:23, 30 July 2017 (UTC)
See also WP:BOTACC. Headbomb {t · c · p · b} 19:24, 30 July 2017 (UTC)
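As a concrete sketch of what WP:API#Assert involves: the bot appends assert=bot (or assert=user) to its write requests, so the action API rejects the call with assertbotfailed/assertuserfailed instead of editing while silently logged out. The assert parameter itself is the API's; the helper function here is hypothetical:

```python
def with_assert(params, level="bot"):
    # Hypothetical helper: copy the request parameters and add the
    # MediaWiki action API "assert" parameter. If the session no longer
    # has the asserted right, the API fails the request rather than
    # performing the edit anonymously.
    out = dict(params)
    out["assert"] = level
    return out

edit = {"action": "edit", "title": "Wikipedia:Sandbox",
        "text": "test", "format": "json"}
print(with_assert(edit)["assert"])
```

The same parameter dict works on both Wikidata and Wikipedia, since both run the same action API.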

Regarding my page, can bots just change it?

My web page was tampered with. Zhanlang1975 (talk) 23:24, 1 November 2017 (UTC)

What page are you referring to? — xaosflux Talk 23:46, 1 November 2017 (UTC)

Guidance on disruptive use of semi-automated tools?

Do we have anything on this already? We're contemplating adding a footnote to WP:Manual of Style about not using AWB, etc., to "enforce" the "rules" of MoS across a zillion pages (see WT:Manual of Style#Proposed footnote to discourage mass changes – the actual proposal, not the joke thread under it, though you may find that amusing on the side). While we've identified a relevant ArbCom ruling, it seems like something that should be in the behavioral or editing guidelines somewhere.  — SMcCandlish ¢ 😼  22:08, 4 July 2018 (UTC)

@SMcCandlish: See WP:CONSENSUS, WP:DE, WP:AWB#Rules, WP:MEATBOT. However, there is nothing wrong with using AWB to enforce the majority of MOS recommendations. The efforts should be on identifying where AWB is applying unwanted fixes, and boot those out of WP:GENFIXES (or demote them to minor genfixes). Headbomb {t · c · p · b} 02:04, 5 July 2018 (UTC)
@Headbomb: I already checked CONSENSUS, the EDITING policies, and DE; they don't cover it that I can see. I'm searching the pages for key strings like "mass", "multi", "automat", "many", "a lot", "large number", etc. You'll probably want to see that WT:MOS discussion and suggest any needed wording changes so that it's clear that the kind of stuff GENFIXES does isn't snared. I think I have that covered now in the latest draft. It should be tied to changes of one acceptable option to another being unproductive, per MOS:RETAIN. Integrated material from BOTPOL and AWB rules.  — SMcCandlish ¢ 😼  03:07, 5 July 2018 (UTC)

In fact we need the rules to encourage editors and bot owners to enforce the rules of MoS. Not enforcing rules allows custom styles across Wikipedia. I would be glad to participate in a discussion where bots run daily to enforce rules. In an ideal situation, editors using VE or any other editor get a heads-up on how to comply with MoS. -- Magioladitis (talk) 16:10, 5 July 2018 (UTC)

I suggest that we leave mass messages for AWB editors to turn general fixes on to help enforce MoS rules. -- Magioladitis (talk) 16:13, 5 July 2018 (UTC)

MoS is not a rule; it's a guideline. According to WP:GUIDES it is a "best practice" and "Editors should attempt to follow guidelines, though they are best treated with common sense, and occasional exceptions may apply." Not everyone agrees with the MoS on every point, in every case, nor should it be enforced by a bot/AWB that can't exercise common sense and/or exceptions. Indeed the MoS is widely ignored in some cases (even in FA) and enforcing it for the sake of it is a recipe for conflict. Finally, the consensus behind many MoS guidelines is weak or non-existent. -- GreenC 16:27, 5 July 2018 (UTC)
GreenC, yes this is true. But in order to form a stronger consensus we should try to apply it. Then we can see whether the MoS rules have to change. AWB does not implement every single rule, and it is not the only tool that tries to implement MoS rules. Some years ago there were even conflicts between semi-automated tools and scripts, and there may be some conflicts still today. -- Magioladitis (talk) 16:43, 5 July 2018 (UTC)
If that's the intention start a WP:BOTREQ like everyone else using automated tools to achieve some result they believe is important. There isn't preexisting consensus to use automated tools to test the strength of MoS consensus. -- GreenC 17:21, 5 July 2018 (UTC)
I've done similar requests in the past. I think a wider opinion may be needed. I am not the one who started this discussion anyway; I am just typing my opinion. -- Magioladitis (talk) 18:24, 5 July 2018 (UTC)

MfD notice for an old bot report page.

See Wikipedia:Miscellany for deletion/Wikipedia:WikiProject Spam/Report, where it is proposed to delete Wikipedia:WikiProject Spam/Report, an old Betacommand bot report page. —SmokeyJoe (talk) 00:48, 27 January 2019 (UTC)

Registration

  • If a bot uses a template, it should register on the template talk page, for template discussion purposes.
  • A bot function template may lie dormant for a million years and still be a bot function.
  • It will get deleted in two minutes, eventually, if you don't register your bot's interest. That's not a suggestion. It's a report.
  • If you want to delete a robot's mechanical appendage (a template), you should have to check and tidy, or delete the whole bot. If you are going to mess with the electrics, you must do a qualified finish. Prejudice should be given to keeping pages which provide things like bot function (like bot function?) ~ R.T.G 07:09, 10 March 2019 (UTC)

How to use Special:ApiFeatureUsage

Looking at a query by one of my bots, I see the message:

Use Special:ApiFeatureUsage to see usage of deprecated features by your application.

So I go to Special:ApiFeatureUsage and it wants me to fill in a "User agent" field. What is a "user agent" in this context? What do I enter in this field to specify that the application whose deprecated-feature usage I want to see is one of my bots (i.e. RMCD bot or Merge bot)? Thanks, wbm1058 (talk) 15:30, 13 May 2019 (UTC)

"php wikibot classes" if you haven't changed it assuming your code is still this using botclasses.php, assuming the value wasn't changed since last time you updated this. —  HELLKNOWZ   ▎TALK 15:52, 13 May 2019 (UTC)
Thanks again. I see curl_setopt($this->ch,CURLOPT_USERAGENT,'php wikibot classes'); used by the bot's get and post functions in User:RMCD bot/botclasses.php – so can I change that to something unique in my copy of that function library to ensure that only my applications are reported, e.g. 'php wikibot classes wbm1058'?
So looking at the report for the last two days, it doesn't exactly say which are the "deprecated features". Why is it only showing logins and queries, but no posts? wbm1058 (talk) 16:16, 13 May 2019 (UTC)
I can't really advise you on how to change the code since I'm not familiar with it at all. I imagine there must be some way to set a custom user agent string without actually editing the class.
I really don't know much about Special:ApiFeatureUsage. I understand that everything logged there from enWP there is deprecated in some manner, probably some param for those calls. I see pretty much the same ones for my bot except logins, which I updated a couple years back. —  HELLKNOWZ   ▎TALK 18:27, 13 May 2019 (UTC)
So you take it that this only shows deprecated features. Which would explain why I don't see any POST stuff (if none of my posts are deprecated). So then why isn't it called Special:DeprecatedApiFeatureUsage? I found mw:Extension:ApiFeatureUsage... "fetch summaries of data logged by ApiBase::logFeatureUsage(), which is typically an indication of use of deprecated API features", but the most helpful thing about that page is that it documents the author... @Anomie:
I was kind of aware that my logins were "deprecated"... see User talk:RMCD bot/Archive 2#Bot is down... I did try implementing Special:BotPasswords, but found that it was too much trouble to try to figure it out (and I'm not sure that that's not deprecated too!) – wbm1058 (talk) 19:20, 13 May 2019 (UTC)
@Wbm1058: You're right, it only shows usage of deprecated features. Trying to log everything for everyone would be far too much data when Wikimedia sites get around 6000 API requests per second.
Special:BotPasswords isn't deprecated, and it's specifically intended to be used with minimal changes to your existing code. You set up the BotPassword on the special page (while logged in to the bot account), and then use the username and password it gives you (rather than the normal username+password for your bot account) in your bot's configuration.
As for the other things on there,
  • action=query&prop=revisions&!rvslots: This is about the fact that you're using action=query&prop=revisions without specifying the rvslots parameter. The plan is that MediaWiki will someday allow more than one "slot" of content in a page, so for example it might be possible to have the template and its TemplateStyles stylesheet at the same title instead of having to use a subpage. Right now though, the only slot is rvslots=main.
  • action=login&!lgtoken: You're using action=login without supplying the lgtoken parameter to get the login token from the NeedToken response. The new way to do it is to use action=query&meta=tokens, supplying type=login.
  • action=query&prop=info&intoken: You're using action=query&prop=info&intoken=... to fetch a token, probably an edit token with intoken=edit. Again, the new way to do it is to use action=query&meta=tokens, most likely supplying type=csrf. The help for whichever token-needing module you're using will tell you for sure what value of type you need, e.g. at Special:ApiHelp/edit it says token: A "csrf" token retrieved from action=query&meta=tokens.
Of course, where I say "you" it's possible it's actually someone else using the same library with the same default agent. Hope this helps, feel free to ask if you have any more questions. Anomie 10:40, 14 May 2019 (UTC)
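To make the three migrations concrete, the old and new parameter sets can be written side by side. The parameter names below are the ones Anomie describes; the dicts themselves are only an illustration, not any particular library's API:

```python
# Old style (deprecated): tokens fetched via action=login and
# action=query&prop=info&intoken=...
old_login_token = {"action": "login", "lgname": "MyBot", "lgpassword": "secret"}
old_edit_token = {"action": "query", "prop": "info",
                  "intoken": "edit", "titles": "Wikipedia:Sandbox"}

# New style: every token comes from action=query&meta=tokens.
new_login_token = {"action": "query", "meta": "tokens", "type": "login"}
new_edit_token = {"action": "query", "meta": "tokens", "type": "csrf"}

# And prop=revisions calls should name the slot explicitly.
revisions = {"action": "query", "prop": "revisions", "rvslots": "main",
             "rvprop": "content", "titles": "Wikipedia:Sandbox"}
```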

Editable instruction summaries for Special pages, located in the MediaWiki namespace

Examples (which I've edited):

BotPasswords

@Anomie: is there a similar MediaWiki:-summary page for Special:ApiFeatureUsage? MediaWiki:ApiFeatureUsage-summary? I don't want to create it before I can confirm the pagename. wbm1058 (talk) 15:49, 14 May 2019 (UTC)

@Wbm1058: it is MediaWiki:Apifeatureusage-text. — xaosflux Talk 16:00, 14 May 2019 (UTC)
@Wbm1058: FYI, you can often find the message by adding uselang=qqx to the URL, e.g. https://en.wikipedia.org/wiki/Special:ApiFeatureUsage?uselang=qqx. It's a little trickier for pages that only show as the result of a form post; for those I normally use the developer console to find the <form> and adjust the action attribute. Anomie 23:01, 14 May 2019 (UTC)

Implementing a secure and not-deprecated login using BotPasswords

OK, I'll pick up this dropped ball and work my way through it, documenting what I do here so that (1) others with similar needs can perhaps follow this for guidance, and (2) anyone watching this page can correct me on anything I get wrong.

The first point I want to make is on the need for application ("front-end" in modern lingo) developers to maintain their own libraries, rather than the systems (back-end) guys doing it. Neither my copy nor Sam Reed's copy nor Kunal Mehta's copy of botclasses.php checks for [warnings] and, if found, passes them on to the end user. So don't expect your end users to be immediately aware of these warnings passed through the API, as they weren't showing up on my bot's console.

The library has a helpful comment with a link to a page explaining the two-step login procedure: /* This is now required - see https://bugzilla.wikimedia.org/show_bug.cgi?id=23076 */

After inserting a couple print_r($ret); lines in the appropriate places, the library printed the warnings (something it only did when there was an error) so I could see them on my console:

[main] => Subscribe to the mediawiki-api-announce mailing list at <https://lists.wikimedia.org/mailman/listinfo/mediawiki-api-announce> for notice of API deprecations and breaking changes. Use Special:ApiFeatureUsage to see usage of deprecated features by your application.

This message is returned by step 1 of the login process:

[login] => Fetching a token via "action=login" is deprecated. Use "action=query&meta=tokens&type=login" instead.

This message is returned by step 2 of the login process:

[login] => Main-account login via "action=login" is deprecated and may stop working without warning. To continue login with "action=login", see Special:BotPasswords. To safely continue using main-account login, see "action=clientlogin".

Relevant advice from above:

  • action=login&!lgtoken: You're using action=login without supplying the lgtoken parameter to get the login token from the NeedToken response. The new way to do it is to use action=query&meta=tokens, supplying type=login.

So I replaced "action=login" with "action=query&meta=tokens&type=login" for step 1. But it's not that simple. I got a new warning:

[main] => [*] => Unrecognized parameters: lgname, lgpassword.

So I don't need to send my login name and password in step 1 anymore? The query did return a [logintoken], similar to the [token] returned by the old method. Apparently not – see mw:API:Tokens. So, I simply removed the $post array that passed in the ID & pw from the query in the library. There is no longer a need to check if ($ret['login']['result'] == 'NeedToken'). For step 2, I replace ['login']['token'] with ['query']['tokens']['logintoken'].

(endorphin rush) It worked! I'm still getting the warning about the need to use BotPasswords in step 2, but time to take a break and have a beer to celebrate success in step 1. Meanwhile, you can tell Sam and Kunal to make THIS change in their copy of botclasses.php – wbm1058 (talk) 20:18, 15 May 2019 (UTC)

I just observed that while the old login consisted of two POST: operations, the new login is a GET: followed by a POST: – hence no login credentials passed in step 1, since it's a GET:! – wbm1058 (talk) 21:51, 15 May 2019 (UTC)
Help:Creating a bot § Logging in needs to be updated. "For security, login data must be passed using the HTTP POST method. Because parameters of HTTP GET requests are easily visible in URL, logins via GET are disabled." Well, I understand this well enough now (though not an expert yet) that I might finally be able to update that section. – wbm1058 (talk) 22:24, 15 May 2019 (UTC)
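The working flow described above can be sketched as two (method, parameters) pairs. This is a hypothetical Python rendering of the botclasses.php change, assuming a BotPasswords-style username of the form BotName@AppName; the parameter names are the API's, but the helper itself is illustrative:

```python
def login_steps(lgname, lgpassword, logintoken):
    # Step 1 is a GET with no credentials at all: it only fetches a login
    # token via action=query&meta=tokens&type=login.
    step1 = ("GET", {"action": "query", "meta": "tokens",
                     "type": "login", "format": "json"})
    # Step 2 is a POST carrying the BotPasswords credentials plus the
    # token returned in ['query']['tokens']['logintoken'].
    step2 = ("POST", {"action": "login", "lgname": lgname,
                      "lgpassword": lgpassword, "lgtoken": logintoken,
                      "format": "json"})
    return step1, step2

(m1, p1), (m2, p2) = login_steps("MyBot@MyApp", "botpassword", "sometoken+\\")
print(m1, "lgname" in p1)  # the GET carries no login credentials
```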

Success! I implemented BotPasswords for all three of my bot accounts, and updated Help:Creating a bot § Logging in for both the new GET/POST login method and BotPasswords. I created MediaWiki:Botpasswords-text and MediaWiki:Botpasswords-label-appid, which should make the BotPasswords interface much more intuitive. – wbm1058 (talk) 14:39, 16 May 2019 (UTC)

Welcome messages

Would a bot that automatically checks Special:Log/newusers and posts a welcome message overload the project? TheEditster (talk) 09:52, 11 July 2019 (UTC)

See Wikipedia:Bot_requests/Frequently_denied_bots#Bots_to_welcome_users. —  HELLKNOWZ   ▎TALK 11:08, 11 July 2019 (UTC)

Wikidata items for the bots?

Hi all, I did some searching around and found this page and the list of bots. Of course, the bots are users, but they are special. Which is why this project page exists in the first place. Now, last Monday we were hacking on federated SPARQL queries against Wikidata and a few external end points, and the question came up what data sources are used by Wikidata. So, I explained the general idea, and the bots came up. And I was wondering if we could use the WDQS to list all databases that bots use as input to synchronize with Wikidata. But it seems that the bots, or just the longer running bots, do not have Wikidata items themselves. I could imagine this semantic bot page would list the operating user, which databases it imports CC0 data from, which properties it uses (in the population), maybe relevant ShEx for those imports, etc. Hoping I did not just overlook it, I was wondering if there is something equivalent that I could use instead? If not, what are your thoughts on actually making items in Wikidata for bots with information like given above? --Egon Willighagen (talk) 10:38, 23 August 2019 (UTC)

@Egon Willighagen: "bots" are "users" and wikidata doesn't want items for user accounts. — xaosflux Talk 11:15, 23 August 2019 (UTC)
As said, bots are not regular users. Where is it stated that Wikidata does not want items for bots? --Egon Willighagen (talk) 12:27, 23 August 2019 (UTC)
You should probably propose the addition on Wikidata itself, not here. That's where it would be stated what they want. Anomie 12:41, 23 August 2019 (UTC)
Overall, I think it would be useful for us - just really if it is going to be in the mission of wikidata. For example, bots should have an "operator" property - which would be their operator (which means adding "users" to wikidata as well then....) — xaosflux Talk 12:51, 23 August 2019 (UTC)

Found old IABot problem - where to report?

Moved to WP:BOTN#Found old IABot problem - where to report?. Headbomb {t · c · p · b} 10:49, 20 May 2020 (UTC)

BotPasswords permission for merge history

Which box do I check in BotPasswords to give my bot permission to merge page histories? I couldn't find anything in the list that looked relevant. @Anomie:? wbm1058 (talk) 03:07, 4 July 2020 (UTC)

The needed right is mergehistory. According to Special:ListGrants, there is no grant that provides that right, so you'd have to request that MediaWiki (or Wikimedia's configuration) be changed to add that to one of the existing grants or to create a new grant for it. Also of note (for anyone else reading this) is that normal bots don't have that right anyway, but User:Merge bot is an adminbot. Anomie 03:15, 4 July 2020 (UTC)
Hmm –
* https://en.wikipedia.org/wiki/Special:ListGrants?uselang=qqx
* MediaWiki:Right-delete
* MediaWiki:Right-mergehistory
* MediaWiki:Grant-delete
* MediaWiki:Grant-mergehistory
Who can change the MediaWiki configuration? Administrators? Bureaucrats? Developers? wbm1058 (talk) 11:50, 4 July 2020 (UTC)
It could either be changed locally, by adding it to the WMF equivalent of LocalSettings.php, or it could be changed in MediaWiki more broadly. It strikes me that it'd be a good idea to add this to MediaWiki itself, and it should be a fairly trivial change to make in terms of the software; the only hold-up would be getting it internationalised. I've created phab:T257110 on this issue. Turns out someone is already on this - see phab:T257097. Naypta ☺ | ✉ talk page | 12:49, 4 July 2020 (UTC)
Thanks. I see that LocalSettings.php is a redirect to MediaWiki, but Spanish Wikipedia does have such a file: es:LocalSettings.php (though that's not actually written in PHP). <!--NOTE: This is NOT THE LocalSettings.php file for en.wikipedia.org's install; the settings file is not editable via the wiki interface-->
LocalSettings.php is the name commonly given to a php script file used to define local settings for a software installation. This allows for settings to be standardized, but still customizable as needed by the installer. For info about the MediaWiki settings file please see mw:Manual:LocalSettings.php. – wbm1058 (talk) 13:52, 4 July 2020 (UTC)
@Wbm1058: In case you didn't notice, I added a new grant for this to MediaWiki. You can now add it your BotPassword. — JJMC89(T·C) 06:01, 15 July 2020 (UTC)

 You are invited to join the discussion at User talk:MajavahBot/Bot status report § Making this table more useful for identifying bots that have failed. {{u|Sdkb}}talk 02:04, 7 August 2020 (UTC)

Important: maintenance operation on September 1st

User:Trizek (WMF) (talk) 10:30, 31 August 2020 (UTC)

Important: maintenance operation on October 27

This is a reminder of a message already sent to your wiki.

On Tuesday, October 27 2020, all wikis will be in read-only mode for a short period of time.

You will not be able to edit for up to an hour on Tuesday, October 27. The test will start at 14:00 UTC (14:00 WET, 15:00 CET, 10:00 EDT, 19:30 IST, 07:00 PDT, 23:00 JST, and in New Zealand at 03:00 NZDT on Wednesday October 28).

Background jobs will be slower and some may be dropped. This may have an impact on some bots' work.

Learn more about this operation.

-- User:Trizek (WMF) (talk) 09:25, 26 October 2020 (UTC)

Jog my memory ...

First, can anyone tell me which bot it is that goes around replacing potentially confusing template shortcuts in mainspace with actual template names (e.g. replacing {{cn|...}} with {{citation needed|...}})?

Second, where is this stuff listed? I.e., how can I answer this question for myself? There are a lot of bot-related talk pages, that all have notices basically saying "this is probably not the talk page you want", but this is also obviously not a question that needs bots-noticeboard attention.  — SMcCandlish ¢ 😼  00:28, 19 February 2021 (UTC)

There is no bot that replaces template shortcuts, as it is a cosmetic edit. AWB has a list of shortcuts that it will replace when genfixes are on, so if you see bots making those changes, chances are good it's an AWB-run bot using genfixes.
As to your second question... it isn't? Honestly, the various bot talk pages get so little action I think I'm going to propose we redirect everything (except maybe WT:BOTPOL) to WP:BOTN. Primefac (talk) 00:47, 19 February 2021 (UTC)
User:AnomieBOT expands them (and also dates them, so it's not a cosmetic task). Headbomb {t · c · p · b} 00:50, 19 February 2021 (UTC)
Right, AnomieBOT wouldn't expand {{cn|date=February 2020}} because it is dated. Primefac (talk) 00:51, 19 February 2021 (UTC)
AnomieBOT doesn't expand maintenance templates, it only dates. SmackBot/Helpful Pixie Bot used to do both, and I think there were a few others at various times too. AnomieBOT has on occasion expanded WikiProject templates when making other changes to them, as the names there were more in need of standardization. Anomie 13:34, 19 February 2021 (UTC)
iirc I proposed that last year. I continue to support it, if you want to propose it. ProcrastinatingReader (talk) 03:59, 19 February 2021 (UTC)
This question is fine for the bot noticeboard, and other talk pages wouldn't help you answer it yourself any more than this would. What you'd need would be a list of all current and past bot tasks, which I don't recall anyone ever maintaining. You could try searching subpages of Wikipedia:Bots/Requests for approval for relevant keywords, although this sort of cosmetic thing is reasonably likely to not be mentioned (particularly in ones from a decade or so ago). Anomie 13:34, 19 February 2021 (UTC)
We do have such a list, incidentally, though the bot that maintains it is currently not active. In re-reading that BRFA, though, it's only tracking scheduled tasks and not one-offs like mine. Primefac (talk) 13:52, 19 February 2021 (UTC)
Looks like that one never got out of the testing stage. There have been attempts at various points, but the maintenance has always been the problem. Anomie 13:59, 19 February 2021 (UTC)
While we may have "only hundreds" of current bots, many of them have lots of tasks - we're probably approaching 1000 approved tasks (some of which are very generic tasks) - a central index of all of these would be nice, but labor intensive to make. — xaosflux Talk 14:09, 19 February 2021 (UTC)
It wouldn't be too difficult to maintain: every time a BRFA is approved, the operator or approver could add an entry for it in the central index. As for how to start the central index with the data for tasks approved thus far, this could be scripted. Use the RecentChanges or EventStreams API to listen to all bot edits being made; use some heuristics on the edit summary to tell apart different tasks. Dump the bot usernames and the edit summaries they used in a table – that should give a good picture of all bot editing taking place. Something good to begin with. – SD0001 (talk) 11:18, 21 February 2021 (UTC)
@SD0001: yea ongoing wouldn't be too hard - easy enough to make BAG enforce it - the bootstrapping is the hard part. Ideally the bot user pages should list their tasks as well (e.g. User:DannyS712 bot and User:Fluxbot do) - but we don't require this exactly, and certainly not in a specific format. If anyone wants to start building though, I can't see any downsides besides your time. — xaosflux Talk 12:22, 21 February 2021 (UTC)
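A rough sketch of that bootstrapping heuristic (illustrative JavaScript; the summary patterns matched here are invented examples, not a survey of real bot edit summaries):

```javascript
// Rough sketch: collapse bot edit summaries into task "signatures" so
// distinct tasks can be told apart when bootstrapping a central index.
function taskKey(summary) {
  // Many bots embed an explicit task number, e.g. "Task 12: ..." -
  // prefer that when present.
  const m = summary.match(/task\s*#?\s*(\d+)/i);
  if (m) return 'task ' + m[1];
  // Otherwise strip the parts that vary per edit (page names, counts,
  // dates) and keep the stable wording as the signature.
  return summary
    .replace(/\[\[[^\]]*\]\]/g, '[[...]]') // wikilinks vary per edit
    .replace(/\d+/g, 'N')                  // numbers vary per edit
    .trim()
    .toLowerCase();
}

// Tally edits per (bot, task signature) pair from a stream of edits.
function groupSummaries(botEdits) {
  const groups = {};
  for (const { user, summary } of botEdits) {
    const key = user + ' :: ' + taskKey(summary);
    groups[key] = (groups[key] || 0) + 1;
  }
  return groups;
}
```

Two edits that differ only in page name and count collapse to the same signature, which is the property the index bootstrap would rely on.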

Split thread

Note: This thread follows on from the "yea ongoing wouldn't be too hard..." post above at 12:22, 21 February 2021 (UTC) by xaosflux

This is only tangentially related, but it might be enough for splitting into its own discussion: I think we should mandate links to BRFAs in edit summaries (and possibly on bot user pages as well). I know, historically, this was avoided in some instances due to the edit summary length limitations, but those effectively don't exist any more (at least as far as adding a link to a BRFA with "Task X" goes). Primefac (talk) 13:39, 21 February 2021 (UTC)

I would fully support this - I would probably say that only approved and active BRFAs need to be listed on bot user pages (some bots have lots of one-off tasks that will never run again, little point cluttering up the page with them). ƒirefly ( t · c ) 13:41, 21 February 2021 (UTC)
I'd be against mandating BRFAs in the edit summary. Some bots have multiple BRFAs, and it's impossible to tie why specific edits were made to specific tasks. For example, a bot might be approved to change X→Y (task 1), and also to change A→B (task 2). Both tasks then get combined, so that any of X→Y and A→B gets done when it can be. You can see how that complexity scales up as tasks get added. Having links to BRFA 1/2/3/4/5/6/.../12 to cover all possible edits which might get made is not useful. Likewise if a task evolves, and the BRFAs no longer reflect the most up-to-date bot logic/consensus.
This is what bot userpages are for, not edit summaries. Headbomb {t · c · p · b} 14:12, 21 February 2021 (UTC)
I think some sort of medium ground can be had; if a bot (such as Danny's or NicoV's) has a ton of active tasks, I don't want to guess which task(s) is/are being run when I see a bot edit.
Also, regarding BRFAs no longer reflect the most up-to-date bot logic/consensus: if a bot task has been updated or changed, it should be reflected somewhere on the BRFA, either as an extra note or on its talk page. Primefac (talk) 14:23, 21 February 2021 (UTC)
That'd be affecting hundreds of bot tasks with near-trivial logic/bot tweaks. Again, the bot page is a more than adequate page to tackle this. Headbomb {t · c · p · b} 16:23, 21 February 2021 (UTC)
IMO, if the change is trivial it wouldn't be worth a note on the BRFA, and therefore this is moot. If the change is not trivial, then the change should be noted. ƒirefly ( t · c ) 16:25, 21 February 2021 (UTC)
Or, you know, the bot's page. Or talk page. We (BAG) don't need to micromanage routine updates, tweaks, and reasonable extensions to bot logic. Headbomb {t · c · p · b} 16:30, 21 February 2021 (UTC)
Personally, I've filed follow-up BRFAs in the cases where I wanted to add new functionality that wasn't covered in the original BRFA, but I know others are less cautious about expanding their tasks. Anomie 22:59, 21 February 2021 (UTC)
I'd rather not have to update all AnomieBOT's tasks to add a link that most people aren't likely to care about. Anomie 22:59, 21 February 2021 (UTC)
That's a fair point, since I think your bot's edit summaries and userpage are one of the best around (accurate, concise, etc). Primefac (talk) 23:53, 21 February 2021 (UTC)

Critique of my bot's architecture

First off, apologies if this is the wrong talk page, feel free to direct me to the correct one. I want to write a simple bot for practice and I want to get feedback on my design decisions.

  • I and others have userscripts that highlight usernames based on perms. This bot could update the lists of users that it uses. Example: User:Bellezzasolo Bot/userhighlighter.js/excon.js. Some of these lists don't get updated often. The list I linked there is 3 months old, for example.
  • It would be in my userspace, so probably wouldn't need a BRFA.
  • I'm thinking of writing it in PHP, my best language. Maybe I'll use mediawiki-api-base, which seems to be the only PHP library that is maintained (supports PHP 7). I'm open to feedback on the language or framework though.
  • I want to automate it so it runs regularly. Maybe I'll do a cron job on a web server. Although other ideas come to mind such as AWS. Open to feedback on this as well.

Workflow:

  1. Cron job executes (once a week? once a day?)
  2. Use api.php?action=query&list=users&usprop=groups to get all enwiki usernames who possess X permission.
  3. Process the data a bit.
  4. Use api.php?action=edit to write the data to a page in the bot's userspace.

Thoughts? Thanks in advance. –Novem Linguae (talk) 09:09, 11 April 2021 (UTC)
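Sketched concretely, step 2 might look like the following (illustrative JavaScript, not the planned PHP; note that list=users&usprop=groups looks up groups for already-named users, so the usual route for enumerating everyone holding a permission is list=allusers&augroup, used below; the group name in the example is arbitrary):

```javascript
// Sketch: build the paginated query for "all users in group X".
// list=allusers with augroup enumerates group members directly, whereas
// list=users&usprop=groups needs the usernames up front.
function buildAllUsersQuery(group, continueFrom) {
  const params = {
    action: 'query',
    list: 'allusers',
    augroup: group,   // e.g. 'extendedconfirmed' (example group)
    aulimit: 'max',   // request the largest page the client is allowed
    format: 'json'
  };
  if (continueFrom) params.aufrom = continueFrom; // continuation point
  return params;
}

// Collect usernames from one response page and return where to
// continue from, if the listing isn't finished yet.
function collectUsers(response) {
  return {
    users: response.query.allusers.map(u => u.name),
    next: response.continue ? response.continue.aufrom : null
  };
}
```

The cron job would loop: build a query, fetch, collect, and repeat while `next` is non-null, then post the merged list with action=edit as in step 4.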

First off, let me share my unsolicited opinion that personally I hate the idea of user highlighting scripts and I'm surprised so many users have invested their energies in writing them. Technically though, most of what you say sounds fine. As for a place for running the cron job, you may want to use Toolforge – it's free – rather than AWS. I'm not familiar with PHP libraries – for some reason people have been gradually moving away from using PHP itself. – SD0001 (talk) 10:40, 11 April 2021 (UTC)
Help:Creating a bot#PHP. I maintain the online botclasses.php and am currently running it under PHP version 7.3 – once every several moons I get around to bothering to update to a newer version and fix any issues found in the old code upon upgrading. I just haven't bothered to update the table. I suppose version 5 is what was current when this framework was originally developed. I've taken it over from the original developers who all seem to have abandoned it. – wbm1058 (talk) 14:48, 11 April 2021 (UTC)
Thanks for the feedback everybody. I'll use Toolforge and botclasses.php as suggested. I also updated botclasses.php in the PHP frameworks table. –Novem Linguae (talk) 04:55, 12 April 2021 (UTC)
Wbm1058, I started using botclasses.php today. Any interest in starting a GitHub? I have some ideas for issues and pull requests. If not, no worries. –Novem Linguae (talk) 04:31, 22 April 2021 (UTC)
You've seen that there already is a GitHub version here. That's the version supported by Legoktm, who works for the Wikimedia Foundation, though I believe all his botclasses.php work is done as a volunteer. I've not attempted to play with that because (1) I have no real experience with using GitHub, (2) I'm comfortable with maintaining the publicly-posted copy on-wiki, and (3) I didn't want to step on Legoktm's toes. We've briefly met at a Wiki-conference but we don't have a working relationship. I've coded some additional functions myself, which so far I've kept for myself in my private, extended version of the library, er "framework". I'm a retired programmer who worked professionally on mainframes, back in the day. I guess I'm a little uncomfortable with the idea that the Foundation raises tons of money ostensibly for the purpose of supporting the volunteers, and then spends next to nothing on actually supporting the volunteers. OK, they (or someone) did pay expenses for my trip to the 2019 Boston conference, but, nothing for actual work done.
I guess I hesitated about updating the table to say "7.3" because there are many functions in the library I've never used, and I haven't verified that they still work in PHP 7.3. That would be a job for someone providing, ahem, "professional" support. – wbm1058 (talk) 14:00, 22 April 2021 (UTC)
Please, step on my toes... or at least don't let me stand in the way of making progress/improvements. I'm not really a good maintainer of botclasses.php, I've done just about the bare minimum to keep it working over the years. I never really thought of my version as the canonical version, but I'm more than happy to pass that torch onto someone else. Personally I'm more interested in Python/Rust bots these days and have been trying to rewrite the PHP ones I inherited, but it's always more complicated and time consuming than it seems. HTH, Legoktm (talk) 06:18, 23 April 2021 (UTC)

Page size issues

Alright, I finished my bot's code tonight. It generates output that looks like this. The page size is about 1,000,000 bytes. I'm running into some kind of perm issue... when I use the credentials for my main account, I can post the data just fine, but when I try to post it with User:NovemBot, who has no perms, the edit quietly fails. Two questions:

  1. What perm is needed to make large posts, autoconfirmed or extended confirmed?
  2. Is there any issue with making a 1 MB edit daily? Will sysadmins get mad that the page is so big and has 365 revisions a year? If it's a problem, maybe I'll switch the cadence to weekly.

There are other bots that post similar data of similar size. For example, User:Bellezzasolo Bot/userhighlighter.js/excon.js. So there is precedent for doing this. Thanks for your thoughts. –Novem Linguae (talk) 11:49, 22 April 2021 (UTC)

@Novem Linguae: Regarding "the edit quietly fails" – the API will always give an error response with the reason it failed, though it might be the case that the bot library you're using doesn't expose this error.
The page will only get 365 revs a year if the data changes every day, which is unlikely. On the server only the deltas are stored, so sysadmins are unlikely to get mad (though ask on wikitech-l if you want to be sure). If it's a concern, you can also use Toolforge to serve static assets without any setup required (see wikitech:Help:Toolforge/Web#Serving_static_files).
However, what I'd be concerned about is that if your userscript loads this huge page on every single page load, that costs a lot of bandwidth for users. A $.getJSON call, which the script is currently using, results in a response with Cache-Control header private, s-maxage=0, max-age=0, must-revalidate, which clearly means there's no caching taking place. You should look into ways of getting the client to cache the result (I guess using the API to fetch the page with uselang=content and appropriate maxage & smaxage values should do the trick). – SD0001 (talk) 13:27, 22 April 2021 (UTC)
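Concretely, the cached fetch suggested here might be built like this (illustrative JavaScript; the day-long cache lifetimes are example values, and the title is a placeholder):

```javascript
// Sketch: an api.php request that clients and proxies are allowed to
// cache. maxage/smaxage only help when the response doesn't vary per
// user, hence uselang=content to pin the response language.
function buildCachedFetchUrl(apiUrl, title) {
  const params = new URLSearchParams({
    action: 'query',
    prop: 'revisions',
    rvprop: 'content',
    titles: title,
    formatversion: '2',
    format: 'json',
    uselang: 'content', // don't vary the response on the user's language
    maxage: '86400',    // browser cache: one day (example value)
    smaxage: '86400'    // shared (proxy) cache: one day (example value)
  });
  return apiUrl + '?' + params.toString();
}
```

Unlike index.php?action=raw, this goes through api.php, which is where the maxage/smaxage parameters are honoured.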
SD0001, thanks for the detailed response.
  • It'd be 365 days a year because I'd have my bot run and update the data page every 24 hours.
  • MediaWiki doesn't store page text for every revision, and reconstructs them from deltas?!? That sounds complex, I didn't expect that. Feel free to elaborate. Unless I'm missing it, they don't talk about deltas at mw:Manual:page table and mw:Manual:text table.
  • Great advice on the caching, thank you. I forked the user script from somebody else, and I had a feeling it might not be caching efficiently. I can certainly adjust the API call thanks to the tips you gave me.
  • Another optimization that I plan to do now that I have NovemBot is reduce the # of getJSON queries by combining some of the data onto one page.
  –Novem Linguae (talk) 14:01, 22 April 2021 (UTC)
Yes but many of those edits will be null edits, right? As for delta storage see mw:Manual:MediaWiki_architecture#Database_and_text_storage paragraph 6. – SD0001 (talk) 14:22, 22 April 2021 (UTC)
I think the edits will change every day. The list of extended confirmed users probably adds one or two every day, I would imagine. Thank you for elaborating on the database deltas, very good information. –Novem Linguae (talk) 14:40, 22 April 2021 (UTC)
SD0001 advised me here that I could improve the efficiency of my goodarticlenominators.php by combining queries, but I haven't yet taken time to look into how to do that. – wbm1058 (talk) 17:07, 22 April 2021 (UTC)
That uses some of my "private functions" that I mentioned in the section above. – wbm1058 (talk) 17:08, 22 April 2021 (UTC)
SD0001, thanks again for recommending the user script api.php caching idea. I tried to implement it today but I don't think it's working. Does anything obvious come to mind? Thank you. Code: User:Novem Linguae/Scripts/UserHighlighterSimple.js. API query: /index.php?action=raw&ctype=application/json&maxage=86400&smaxage=86400&uselang=content&title=User:Novem_Linguae/User_lists/Staff_and_sysadmins.js. Response headers: cache-control: private, s-maxage=0, max-age=0, must-revalidateNovem Linguae (talk) 04:01, 26 April 2021 (UTC)
@Novem Linguae: You're using index.php instead of api.php. I think index.php calls are never cached for logged-in users. You'll have to use mw:API:Revisions, get the text using apiResponse.query.pages[0].revisions[0].content and apply JSON.parse on that. – SD0001 (talk) 06:31, 26 April 2021 (UTC)
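For later readers, the fix described here can be sketched as follows (illustrative JavaScript; assumes a formatversion=2 response, which is what gives the apiResponse.query.pages[0].revisions[0].content shape mentioned above):

```javascript
// Sketch: pull stored JSON back out of an API:Revisions response
// (formatversion=2 shape) and parse it for the userscript to use.
function extractPageJSON(apiResponse) {
  const page = apiResponse.query.pages[0];
  if (!page || !page.revisions) {
    throw new Error('page missing or has no revisions');
  }
  return JSON.parse(page.revisions[0].content);
}
```

The userscript would call this on the JSON returned by the cached api.php request instead of relying on index.php?action=raw, which is not cached for logged-in users.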