All editors are encouraged to participate in the requests below – your comments are appreciated more than you may think!
New to bots on Wikipedia? Read these primers!
- Approval process – How these discussions work
- Overview/Policy – What bots are/What they can (or can't) do
- Dictionary – Explains bot-related jargon
To run a bot on the English Wikipedia, you must first get it approved. Follow the instructions below to add a request. If you are not familiar with programming, consider asking someone else to run a bot for you.
Bot Name | Status | Created | Last editor | Date/Time | Last BAG editor | Date/Time |
---|---|---|---|---|---|---|
IUCNStatusBot (T|C|B|F) | Open | 2025-03-25, 05:46:08 | AidenD | 2025-07-03, 06:17:58 | TheSandDoctor | 2025-07-02, 22:46:23 |
AussieBot 2 (T|C|B|F) | In trial | 2025-06-18, 11:00:52 | Hawkeye7 | 2025-07-11, 20:59:49 | DreamRimmer | 2025-07-08, 10:37:27 |
GraphBot 2 (T|C|B|F) | In trial | 2025-07-02, 21:00:07 | DreamRimmer | 2025-07-08, 06:35:57 | DreamRimmer | 2025-07-08, 06:35:07 |
CX Zoom AWB 2 (T|C|B|F) | In trial | 2025-06-24, 12:55:26 | DreamRimmer | 2025-07-03, 16:54:49 | DreamRimmer | 2025-07-03, 16:54:49 |
SodiumBot 3 (T|C|B|F) | In trial | 2025-06-18, 18:16:08 | Sohom Datta | 2025-07-02, 15:18:13 | ProcrastinatingReader | 2025-06-29, 10:48:39 |
VWF bot 2 (T|C|B|F) | In trial | 2025-05-28, 16:11:04 | Primefac | 2025-06-08, 13:32:41 | Primefac | 2025-06-08, 13:32:24 |
CFA (bot) (T|C|B|F) | In trial: User response needed! | 2024-12-31, 05:00:34 | DreamRimmer | 2025-07-08, 06:53:40 | DreamRimmer | 2025-07-08, 06:53:40 |
SodiumBot 2 (T|C|B|F) | In trial: User response needed! | 2024-07-16, 20:03:26 | Sohom Datta | 2025-07-13, 14:57:22 | DreamRimmer | 2025-07-13, 08:17:29 |
DeadbeefBot II (T|C|B|F) | Trial complete | 2025-05-23, 02:10:50 | SD0001 | 2025-07-12, 19:11:54 | SD0001 | 2025-07-12, 19:11:54 |
CanonNiBot 1 (T|C|B|F) | Trial complete: User response needed! | 2024-12-17, 12:50:01 | Primefac | 2025-07-12, 12:23:32 | Primefac | 2025-07-12, 12:23:32 |
Current requests for approval
Operator: AidenD (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)
Time filed: 05:45, Tuesday, March 25, 2025 (UTC)
Function overview:
A bot that updates the IUCN Red List status of Wikipedia pages.
Automatic, Supervised, or Manual:
Automatic
Programming language(s): Python
Source code available: https://github.com/DartAiden/IUCN_Status_Bot/tree/main
Links to relevant discussions (where appropriate): No discussion really applicable. I briefly made a request to see if anything already existed, but received little reply. https://en.wikipedia.org/wiki/Wikipedia:Bot_requests#IUCN_Status_Bot
Edit period(s):
Run once
Estimated number of pages affected:
Around 11,000.
Namespace(s): Wikipedia pages for species.
Exclusion compliant (Yes/No): Not really relevant.
Function details:
This is a simple script to update the IUCN Red List status of Wikipedia pages for birds, though the functionality can most likely be extended elsewhere. The IUCN Red List status is a measure of conservation status included in the species box of most birds. This bot extracts the name of the citation to ensure contiguity of editing, updates the status itself, and then updates the reference. This bot would be run once now, to standardize the citations, and once every time the Red List is updated, though it is necessary only to run it with those species that are being updated. I may add other functionality in the future, such as updating the actual pages that list statuses (List of critically endangered birds).
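As a rough illustration of the status-update step only (this is not the operator's actual code; the template and parameter names are assumptions about how species articles commonly store the status), a Python sketch might look like:

```python
import re

# Hypothetical sketch: rewrite the |status= parameter of a species infobox
# to a new IUCN category. The real bot also rewrites the accompanying IUCN
# citation, which is omitted here.
def update_status(wikitext: str, new_status: str) -> str:
    # Assumes the parameter appears inside the infobox as "| status = XX".
    return re.sub(
        r"(\|\s*status\s*=\s*)[A-Za-z]+",
        lambda m: m.group(1) + new_status,
        wikitext,
        count=1,
    )

print(update_status("{{Speciesbox\n| status = EN\n| status_system = IUCN3.1\n}}", "CR"))
```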
Discussion
- Would it be worth doing this on Wikidata instead, and then have the template pull from there? — Qwerfjkltalk 12:41, 27 May 2025 (UTC)
- I wrote a bot once that made somewhat similar updates by querying the IUCN API. There was sufficient adverse editor pushback that I retired the bot.
- —Trappist the monk (talk) 13:49, 27 May 2025 (UTC)
- What was the nature of the pushback? ꧁Zanahary꧂ 21:16, 27 May 2025 (UTC)
- Convenience links: Wikipedia:Bots/Requests for approval/Monkbot 19, User talk:Trappist the monk/Archive 23#Monkbot Task 19 - IUCN status –Novem Linguae (talk) 21:23, 27 May 2025 (UTC)
- Thanks. Looks like it was really a content dispute about whether "Critically Endangered (Possibly Extinct)" is a valid IUCN designation. ꧁Zanahary꧂ 21:34, 27 May 2025 (UTC)
- AidenD, Edit period: "Runce on"? Primefac (talk) 13:29, 8 June 2025 (UTC)
- What? AidenD (talk) 20:15, 8 June 2025 (UTC)
- That is what you have listed in the edit period for this bot. I was looking for clarification because I have never heard of that sort of time frame. Primefac (talk) 20:36, 8 June 2025 (UTC)
- I think they mean "Run once". – SD0001 (talk) 21:28, 8 June 2025 (UTC)
- That would make sense. Incidentally, that's about the only option I didn't think of. Primefac (talk) 22:50, 8 June 2025 (UTC)
- Whoops, that is what I meant. My apologies. AidenD (talk) 04:42, 9 June 2025 (UTC)
- {{BAG assistance needed}} * Pppery * it has begun... 16:37, 1 July 2025 (UTC)
- AFAICS, this bot would run into the same concern raised with Monkbot 19 - that it doesn't consider "PE" as a valid designation, although Template:Taxobox/species does, and therefore would overwrite the classifications of species currently denoted with a status of "PE". Is that a correct understanding? If so, probably that should be settled as a content decision in some other venue first. ProcrastinatingReader (talk) 19:02, 1 July 2025 (UTC)
- It might also be possible to work around the content dispute. For example, programming the bot not to touch anything that is already PE onwiki. –Novem Linguae (talk) 23:19, 1 July 2025 (UTC)
- It may also comment out new PE designations, which could then be automatically removed or uncommented on resolution of the dispute. ꧁Zanahary꧂ 02:29, 2 July 2025 (UTC)
- That is a very good point. @AidenD: Would it be possible to integrate that from above before we consider any sort of trial? TheSandDoctor Talk 22:46, 2 July 2025 (UTC)
- They can be manually added; the unfortunate problem is that the IUCN API does not discern between those. I can always add them manually, however. AidenD (talk) 06:17, 3 July 2025 (UTC)
Bots in a trial period
Operator: Hawkeye7 (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)
Time filed: 11:00, Wednesday, June 18, 2025 (UTC)
Function overview: The Bot will go through:
- Category:Start-Class articles conflicting with C-Class project-independent quality rating (10,568),
- Category:Stub-Class articles conflicting with C-Class project-independent quality rating (168),
- Category:C-Class articles conflicting with Start-Class project-independent quality rating (3,188),
- Category:Stub-Class articles conflicting with Start-Class project-independent quality rating (10,529),
- Category:Start-Class articles conflicting with Stub-Class project-independent quality rating (11,190),
- Category:C-Class articles conflicting with Stub-Class project-independent quality rating (373)
and resolve conflicting quality ratings, setting it to stub or start depending on its rating in Liftwing.
Automatic, Supervised, or Manual: Automatic
Source code available: [1]
Links to relevant discussions (where appropriate): Template talk:WikiProject banner shell#Conflicting quality ratings
Edit period(s): Daily until all are resolved.
Estimated number of pages affected: ~ 35,000
Namespace(s): Talk
Exclusion compliant (Yes/No): Yes
Function details: Bot goes through each article in the category, clears the current ratings and applies the Liftwing rating of start or stub as appropriate.
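For reference, a minimal sketch of querying LiftWing for an article-quality prediction (the endpoint and response shape shown follow the publicly documented ORES-compatible format; this is not the operator's code and should be checked against current LiftWing documentation):

```python
import requests

# Sketch: ask LiftWing's enwiki-articlequality model for a prediction
# ("Stub", "Start", "C", "B", ...) for a single revision ID.
LIFTWING_URL = (
    "https://api.wikimedia.org/service/lw/inference/v1/models/"
    "enwiki-articlequality:predict"
)

def predicted_class(rev_id: int) -> str:
    resp = requests.post(LIFTWING_URL, json={"rev_id": rev_id}, timeout=30)
    resp.raise_for_status()
    score = resp.json()["enwiki"]["scores"][str(rev_id)]["articlequality"]["score"]
    return score["prediction"]

print(predicted_class(1234567890))  # hypothetical revision ID
```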
Discussion
Thanks for filing this. I think we discussed running this for all Stub/Start/C classes? So that would include the following categories:
- Category:Start-Class articles conflicting with C-Class project-independent quality rating (10,568)
- Category:Stub-Class articles conflicting with C-Class project-independent quality rating (168)
- Category:C-Class articles conflicting with Start-Class project-independent quality rating (3,188)
- Category:Stub-Class articles conflicting with Start-Class project-independent quality rating (10,529)
- Category:Start-Class articles conflicting with Stub-Class project-independent quality rating (11,190)
- Category:C-Class articles conflicting with Stub-Class project-independent quality rating (373)
— Martin (MSGJ · talk) 12:06, 18 June 2025 (UTC)
- Yes, the bot can handle this. Hawkeye7 (discuss) 12:33, 18 June 2025 (UTC)
- @MSGJ: In testing, I have found that some of the C-class articles would be eligible for a B-class rating. Should they be rated B or C? Hawkeye7 (discuss) 20:28, 22 June 2025 (UTC)
- I think the bot should rate according to its algorithm. Could it produce a log so that we can spot check a sample manually? — Martin (MSGJ · talk) 07:58, 23 June 2025 (UTC)
- Yes, it can. Normally on WikiProject:Military History we double-check all the articles that have been Bot-rated as B-class. Hawkeye7 (discuss) 19:24, 25 June 2025 (UTC)
- Hmmm. Maybe I cannot. The Bot requires the createpage right to be able to do this. Hawkeye7 (discuss) 04:28, 2 July 2025 (UTC)
- All registered users have the right. You likely need to adjust the BotPassword grants to include it. – SD0001 (talk) 15:48, 4 July 2025 (UTC)
- All users should have this right, but AussieBot does not and I do not know how to grant it. Hawkeye7 (discuss) 20:11, 4 July 2025 (UTC)
- Go to Special:BotPasswords, open the botpassword you're using in the bot, tick the "Create, edit, and move pages" grant. Update. – SD0001 (talk) 07:14, 5 July 2025 (UTC)
If you plan to cover all the mentioned categories, please update the function details to reflect them, and adjust the estimated number of pages affected and the function overview accordingly. – DreamRimmer ■ 06:05, 8 July 2025 (UTC)
Done Hawkeye7 (discuss) 07:12, 8 July 2025 (UTC)
- Since LiftWing may rate some of them as B or C class, please update the task details to reflect that. The lines that say "rating of start or stub as appropriate" and "stub or start depending on its rating in LiftWing" should be updated to include B and C as well. – DreamRimmer ■ 10:32, 8 July 2025 (UTC)
Approved for trial (120 edits). Please provide a link to the relevant contributions and/or diffs when the trial is complete. Please process 20 pages from each category. – DreamRimmer ■ 10:37, 8 July 2025 (UTC)
Done. The report can be found here. Hawkeye7 (discuss) 02:34, 9 July 2025 (UTC)
- Looks great to me. I spot-checked about 15-20 articles and could not disagree with any of the assessments given — Martin (MSGJ · talk) 08:14, 9 July 2025 (UTC)
- @Hawkeye7 the bot seems to be continuing with its edits today. Has an additional trial been authorised? — Martin (MSGJ · talk) 11:58, 11 July 2025 (UTC)
- Oops. My apologies. After trial I have to move the bots from my server to the Toolforge, and I activated AussieBot 2 instead of AussieBot 1. Hawkeye7 (discuss) 20:59, 11 July 2025 (UTC)
Operator: GalStar (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)
Time filed: 21:00, Wednesday, July 2, 2025 (UTC)
Function overview:
Automatic, Supervised, or Manual: Automatic
Programming language(s): Rust/Python
Source code available: Uses mwbot
Links to relevant discussions (where appropriate): Wikipedia:Bot_requests#Redirects_related_to_those_nominated_at_RfD and Wikipedia talk:Redirects for discussion#Avoided double redirects of nominated redirects
Edit period(s): Continuous
Page: Wikipedia:Redirects_for_discussion
Exclusion compliant (Yes/No): Yes (but N/A)
Adminbot (Yes/No): No
Function details:
- Look at each RfD on each RfD page
- Determine whether there are any other redirects, in any namespace, that meet one or more of the following criteria:
- Are marked as an avoided-double redirect of a nominated redirect
- Are redirects to the nominated redirect
- Redirect to the same target as the nominated redirect and differ only in the presence or absence of non-alphanumeric characters, and/or differ only in case
- If the bot finds any redirects that match and which are not currently nominated at RfD, then it posts a message in the discussion (final details about the message are TBD, but the bot request outlines the general point). The bot limits the length of its message, ensuring that the RfD is not over-cluttered. A rough sketch of the title comparison is shown below.
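The following is a rough, illustrative sketch of the case/non-alphanumeric comparison described above (the bot's actual normalisation, written in Rust/Python, may differ):

```python
import unicodedata

# Normalise a title by lowercasing, decomposing diacritics, and keeping only
# ASCII alphanumerics, so titles that differ only in case, punctuation, or
# diacritics compare equal.
def normalise(title: str) -> str:
    decomposed = unicodedata.normalize("NFKD", title.lower())
    return "".join(c for c in decomposed if c.isascii() and c.isalnum())

def effectively_same(a: str, b: str) -> bool:
    return normalise(a) == normalise(b)

print(effectively_same("Fianc\u00e9e (film)", "Fiancee film"))  # True
```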
Discussion
Thanks for working on this GalStar, but it's not clear whether it is checking for redirects that differ only in the presence/absence of diacritics? Thryduulf (talk) 23:41, 2 July 2025 (UTC)
- Diacritics fall under non-alphanumeric characters. GalStar (talk) (contribs) 16:48, 3 July 2025 (UTC)
Approved for trial (30 days). Please provide a link to the relevant contributions and/or diffs when the trial is complete. – DreamRimmer ■ 06:35, 8 July 2025 (UTC)
Operator: CX Zoom (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)
Time filed: 12:55, Tuesday, June 24, 2025 (UTC)
Function overview: Implementing behavior change for {{PresFoot}}, compliance with MOS:CHRONO
Automatic, Supervised, or Manual: Manual
Programming language(s): AWB, Python
Source code available: Yes, see User:CX Zoom AWB/Task 2
Links to relevant discussions (where appropriate): Special:Permalink/1296419942#Behavior change for {{PresFoot}}
Edit period(s): One-time run
Estimated number of pages affected: 2950
Namespace(s): Mainspace (mostly), few Userspace and Projectspace pages too
Exclusion compliant (Yes/No): No; templates will have erroneous output if all uses are not resolved
Function details: This task will find uses of {{PresFoot}} and replace them with {{PresRow}} followed by {{PresFoot}} (with no parameters; new behaviour for this template). It will then reorder all the {{PresRow}} transclusions in the article in chronological order. It will also check for potential stray table closers (|}) and remove them.
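As an illustration of the reordering step only (not the actual task code; it assumes each {{PresRow}} carries a recognisable four-digit year somewhere in its parameters and contains no nested templates, which may not match the real template):

```python
import re

# Hypothetical sketch: sort {{PresRow}} transclusions chronologically by the
# first four-digit year found in each row, then write them back in order.
ROW_RE = re.compile(r"\{\{PresRow\b.*?\}\}", re.DOTALL)

def reorder_rows(wikitext: str) -> str:
    rows = ROW_RE.findall(wikitext)
    def year_of(row: str) -> int:
        years = re.findall(r"\b(1[89]\d{2}|20\d{2})\b", row)
        return int(years[0]) if years else 0
    ordered = iter(sorted(rows, key=year_of))
    # Substitute the rows back in chronological order, one per original match.
    return ROW_RE.sub(lambda _m: next(ordered), wikitext)
```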
Discussion
Approved for trial (50 edits). Please provide a link to the relevant contributions and/or diffs when the trial is complete. – DreamRimmer ■ 16:54, 3 July 2025 (UTC)
Operator: Sohom Datta (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)
Time filed: 18:15, Wednesday, June 18, 2025 (UTC)
Function overview: Taking over the functions of User:Yapperbot (frs, pruner, uncurrenter; not wikidatatable). The maintainer has been inactive for 3 years, and API changes have caused the bot to become brittle: it keeps stopping and requires frequent restarts that need upkeep and/or Toolforge admin intervention. I intend to keep the same code and actions but include maintenance fixes to make it less brittle. The primary reason to take over the task is so that we don't need to bother Toolforge admins for every outage.
Automatic, Supervised, or Manual: Automatic
Programming language(s): Golang
Source code available: Yes, (forked repos: [2], [3], [4])
Links to relevant discussions (where appropriate): User talk:Yapperbot (the three discussions on the page are all about failures of the bot)
Edit period(s): Continuous
Estimated number of pages affected: FRS and Pruner are opt-in, so it depends on how many users and pages opt in; uncurrenter depends on the number of {{current}} templates placed
Namespace(s): Mainspace, Talk, Userspace
Exclusion compliant (Yes/No): No, uncurrenter is article-facing, and frs and pruner are explicitly opt-in
Function details:
- FRS - Running the Feedback request service
- Pruner - Removes inactive members from WP:FRS and other Wikiproject lists
- Uncurrenter - Removes {{current}} from pages where it has been around for a long time.
For everything except FRS, the original unmodified code will be used; for FRS, minor maintenance fixes will be made. If this bot is trialed or approved, Yapperbot should be blocked to only operate in userspace.
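For the uncurrenter piece, one plausible shape of the check is sketched below (illustrative only; the threshold and exact rule are assumptions, not taken from Yapperbot's code):

```python
import re
from datetime import datetime, timezone, timedelta

# Hypothetical sketch: drop {{current}} when a page has not been edited for
# longer than some threshold. The original Yapperbot logic may use a
# different rule and threshold.
THRESHOLD = timedelta(hours=5)  # assumed value
CURRENT_RE = re.compile(r"\{\{\s*[Cc]urrent\s*(\|[^{}]*)?\}\}\n?")

def maybe_remove_current(wikitext: str, last_edit_utc: datetime) -> str:
    if datetime.now(timezone.utc) - last_edit_utc > THRESHOLD:
        return CURRENT_RE.sub("", wikitext)
    return wikitext
```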
Discussion
- Original BRFAs for reference:
- – SD0001 (talk) 14:50, 19 June 2025 (UTC)
- Can you email the botop to stop the tasks being taken over? We could partial-block if necessary but that's more of a last resort. – SD0001 (talk) 14:51, 19 June 2025 (UTC)
- Sent! Sohom (talk) 15:34, 19 June 2025 (UTC)
- SD0001, do you have any thoughts about this being three tasks in one? Obviously FRS and pruner are related so I could see those combined, but uncurrenter seems rather unrelated. I'd rather split that off into a separate task. Primefac (talk) 01:23, 21 June 2025 (UTC)
- I think it's fine as these are previously approved tasks and the same code is being reused, so there's probably not much chance of things breaking. – SD0001 (talk) 06:46, 21 June 2025 (UTC)
Approved for trial (30 days). Please provide a link to the relevant contributions and/or diffs when the trial is complete. The FRS job is stuck again, and since the current operator has not responded, we should consider other ways to stop it. There is a switch page at User:Yapperbot/kill/FRS that can be used to stop the job, which is a better option than blocking it. I am approving a 30-day trial so that we can monitor for any errors and give you time to adjust the code for long-term stability. This trial is limited to the FRS and pruner tasks. Since there is no control page for the pruner task, partially blocking it from the Wikipedia and User talk namespaces should be a suitable workaround. – DreamRimmer ■ 17:15, 28 June 2025 (UTC)
Do we still need uncurrenter? Does Wikipedia:Bots/Requests for approval/ProcBot 10 not cover that area sufficiently? * Pppery * it has begun... 04:09, 29 June 2025 (UTC)
- I did some looking through the code, there was/is a kill switch in the form of User:Yapperbot/kill/Pruner for pruner. I've gone and activated it alongside the FRS kill switch. Sohom (talk) 19:59, 28 June 2025 (UTC)
- (For the period of 30 days) Sohom (talk) 19:59, 28 June 2025 (UTC)
- Noting that the pruner job appears to be broken, I'll throw some time into fixing it and run a manual run (the typical schedule appears to be once a month). Sohom (talk) 13:47, 1 July 2025 (UTC)
- Pruner's manual run is complete; this is a list of all edits made for the task. Sohom (talk) 15:18, 2 July 2025 (UTC)
- @Pppery ProcBot says that the task is inactive and [5] says that the uncurrenter job is active. I'm honestly not sure what is going on and who is doing it. (cc @ProcrastinatingReader who might be able to answer this more cleanly -- I'm happy to drop it completely if the task has already been superseded -- though we should kill the task from User:Yapperbot's side then) Sohom (talk) 04:24, 29 June 2025 (UTC)
- Yapperbot was intermittently crashing when I filed ProcBot 10, but was working again sometime after approval, so I run ProcBot 10 (which is active atm on Toolforge) to deal with current-related templates (like {{current related}}, which Yapperbot didn't cover). I don't know if Yapperbot has broken on {{current}} since then. I can extend ProcBot 10 to cover regular {{current}} in any case.
- I'm not opposed to the approval of this task as-is, including uncurrenter, though. Doesn't hurt to improve reliability (especially as I am around a bit less these days). ProcrastinatingReader (talk) 10:48, 29 June 2025 (UTC)
Operator: Vanderwaalforces (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)
Time filed: 16:11, Wednesday, May 28, 2025 (UTC)
Automatic, Supervised, or Manual: automatic
Programming language(s): Python
Source code available:
Function overview: Automatically remove transclusions of closed RfD discussions (see example) for convenient regular upkeep.
Links to relevant discussions (where appropriate): Wikipedia:Bot_requests#Removing_fully_completed_daily_RfD_list_pages_from_the_main_RfD_page
Edit period(s): Continuous
Estimated number of pages affected: 1
Exclusion compliant (Yes/No): No
Already has a bot flag (Yes/No): Yes
Function details: The bot will remove transclusions of closed RfD discussions (see example) for convenient regular upkeep. More examples at Special:History/Wikipedia:Redirects for discussion.
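A hedged sketch of the untransclusion step follows. The all_discussions_closed() predicate is a hypothetical stand-in for however the bot actually decides that every discussion on a daily log page has been closed or relisted:

```python
import re

# Sketch of pruning the main RfD page: drop the transclusion line for any
# daily log whose discussions are all closed or relisted.
def all_discussions_closed(log_title: str) -> bool:
    raise NotImplementedError  # e.g. inspect the log page for open sections

def prune_main_page(rfd_wikitext: str) -> str:
    kept = []
    for line in rfd_wikitext.splitlines(keepends=True):
        m = re.match(r"\{\{(Wikipedia:Redirects for discussion/Log/[^}]+)\}\}",
                     line.strip())
        if m and all_discussions_closed(m.group(1)):
            continue  # drop the transclusion of a fully closed day
        kept.append(line)
    return "".join(kept)
```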
Discussion
What is being proposed is not quite what the description implies. The bot would actually remove transclusions of daily log pages on the main RfD when all discussions on that day have been closed or relisted, rather than removing the transclusion of individual discussions (which are not typically transcluded anywhere). Thryduulf (talk) 16:46, 28 May 2025 (UTC)
- @Thryduulf from my overview and details above, I clearly mentioned “closed RfD discussions” and from what you just said, it’s only “relist” I didn’t mention (which this process will obviously cover), so I do not think I clearly get what you mean by “not quite what the description implies”. Vanderwaalforces (talk) 18:47, 28 May 2025 (UTC)
- The description implies the bot will be working at the level of individual RfD discussions, rather than what it will be doing, which is working at the level of a day's worth of RfD discussions. For example, the bot will take no action in relation to Wikipedia:Redirects for discussion/Log/2025 May 27 until all the discussions on that page have been closed or relisted, at which point it will untransclude the page from Wikipedia:Redirects for discussion. I should clarify that I do support the proposal, I just think it's important that the description of what it does clearly matches what it will actually do. Thryduulf (talk) 20:18, 28 May 2025 (UTC)
It says that the source is available; where can it be viewed? GalStar (talk) 05:31, 6 June 2025 (UTC)
Approved for trial (20 edits). Please provide a link to the relevant contributions and/or diffs when the trial is complete. It's a single page being edited, but I'd like to make sure (since it's a very visible one) that everything's working as intended. Primefac (talk) 13:32, 8 June 2025 (UTC)
Operator: CFA (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)
Time filed: 19:59, Tuesday, December 31, 2024 (UTC)
Function overview: Removes articles from Category:Wikipedia requested images of biota if they have an image
Automatic, Supervised, or Manual: Automatic
Programming language(s): Python
Source code available: No, but it can be if necessary
Links to relevant discussions (where appropriate): Uncontroversial
Edit period(s): Weekly
Estimated number of pages affected: ~3-6k first run; likely no more than 10/week afterwards
Namespace(s): Talk
Exclusion compliant (Yes/No): Yes
Function details:
- Removes talk pages of articles with images from Category:Wikipedia requested images of biota and its subcategories
- Removes {{image requested}} or the "needs-image" banner parameter if an extant image is present in the taxobox
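A rough sketch of the per-article logic described above (parameter and template names are the common ones and may not cover every taxobox variant; this is not the operator's code):

```python
import re

# Sketch: if the article's taxobox/speciesbox already has a non-empty
# |image= parameter, drop the image request from the talk page.
def has_taxobox_image(article_wikitext: str) -> bool:
    m = re.search(r"\|\s*image\s*=\s*([^\n|]+)", article_wikitext)
    return bool(m and m.group(1).strip())

def strip_image_request(talk_wikitext: str) -> str:
    talk_wikitext = re.sub(
        r"\{\{\s*[Ii]mage requested\s*(\|[^{}]*)?\}\}\n?", "", talk_wikitext)
    # Also clear a |needs-image=yes banner parameter if present.
    return re.sub(r"\|\s*needs-image\s*=\s*yes\s*", "", talk_wikitext)
```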
Discussion
Approved for trial (100 edits). Please provide a link to the relevant contributions and/or diffs when the trial is complete. Primefac (talk) 13:24, 1 January 2025 (UTC)
A user has requested the attention of the operator. Once the operator has seen this message and replied, please deactivate this tag. (user notified) Status of trial? – DreamRimmer (talk) 17:06, 14 February 2025 (UTC)
- It would be nice to have a generalized bot that can do this for all projects (just a comment, not against this specific bot). --Gonnym (talk) 21:15, 26 March 2025 (UTC)
- Sorry, I forgot about this. The code has already been written. I'll see if I have time to deploy it this weekend. C F A 02:24, 27 March 2025 (UTC)
A user has requested the attention of the operator. Once the operator has seen this message and replied, please deactivate this tag. (user notified) Anything happening here? * Pppery * it has begun... 16:03, 20 May 2025 (UTC)
- @CFA: Any update? – DreamRimmer ■ 06:53, 8 July 2025 (UTC)
Operator: Sohom Datta (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)
Time filed: 20:03, Tuesday, July 16, 2024 (UTC)
Automatic, Supervised, or Manual: automatic
Programming language(s): Python
Source code available: https://github.com/DreamRimmer/SodiumBot
Function overview: Notify previous reviewers of an article at AfD about the nomination
Links to relevant discussions (where appropriate): Initial discussions on NPP Discord + previous BRFAs surrounding AFD notifications
Edit period(s): Continuous
Estimated number of pages affected: 1-2 per day (guesstimate?)
Exclusion compliant (Yes/No): Yes
Already has a bot flag (Yes/No): No on enwiki; yes on other wikis for other tasks
Function details:
- Use the eventstream API to listen for new AfDs
- Extract page name by parsing the AfD wikitext
- Identify previous reviewers of page at AFD
- Notify said reviewers on their talk pages with a customised version of the existing AfD notification message
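A minimal sketch of step 1, using the Wikimedia EventStreams page-create stream (field names follow the published schema but should be verified; this is not the bot's actual code and the SSE parsing is simplified to single-line events):

```python
import json
import requests

# Sketch: yield titles of newly created AfD subpages on enwiki.
STREAM = "https://stream.wikimedia.org/v2/stream/page-create"

def watch_afds():
    with requests.get(STREAM, stream=True, timeout=None) as resp:
        for raw in resp.iter_lines():
            if not raw or not raw.startswith(b"data: "):
                continue
            event = json.loads(raw[len(b"data: "):])
            if (event.get("database") == "enwiki"
                    and event.get("page_title", "").startswith(
                        "Wikipedia:Articles_for_deletion/")):
                yield event["page_title"]

for title in watch_afds():
    print("New AfD:", title)
```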
Discussion
- I like this concept in general. I tried to make a user script that does this (User:Novem Linguae/Scripts/WatchlistAFD.js#L-89--L-105), but it doesn't work (I probably need to rewrite it to use MutationObserver). Would this bot be automatic for everyone, or opt in? Opt in may be better and easier to move forward in a BRFA. If not opt in, may want to start a poll somewhere to make sure there's some support for "on by default". –Novem Linguae (talk) 07:58, 17 July 2024 (UTC)
- I think it would be better to be on by default with the option for reviewers to disable. (t · c) buidhe 14:28, 17 July 2024 (UTC)
- Ah yes. "Opt out" might be a good way to describe this third option. –Novem Linguae (talk) 22:13, 17 July 2024 (UTC)
- I think it would be better to be on by default with the option for reviewers to disable. (t · c) buidhe 14:28, 17 July 2024 (UTC)
- Support - seems like a good idea. I've reviewed several articles that I've tagged for notability or other concerns, only to just happen to notice them by chance a few days later get AfD'ed by someone else. A bot seems like a good idea, and I can't see a downside. BastunĖġáḍβáś₮ŭŃ! 16:31, 17 July 2024 (UTC)
- This is the sort of thing that would be really good for some people (e.g., new/infrequent reviewers) and really frustrating for others (e.g., people who have reviewed tens of thousands of articles). If it does end up being opt-out, each message needs to have very clear instructions on how to opt out. It would also be worth thinking about a time limit: most people aren't going to get any value out of hearing about an article they reviewed a decade ago. Maybe a year or two would be a good threshold. Extraordinary Writ (talk) 18:48, 17 July 2024 (UTC)
- The PREVIOUS_NOTIF regex should also account for notifications left via page curation tool ("Deletion discussion about xxx"). The notification also needs to be skipped if the previous reviewer themself is nominating. In addition, I would suggest adding a delay of at least several minutes instead of acting immediately on AfD creation – as it can lead to race conditions where Twinkle/PageTriage and this bot simultaneously deliver notifications to the same user. – SD0001 (talk) 13:41, 19 July 2024 (UTC)
- {{Operator assistance needed}} Thoughts on the above comments/suggestions? Also, do you have the notice ready to go or is that still in the works? If it's ready, please link to it (or copy it here if it's hard-coded elsewhere). Primefac (talk) 12:48, 21 July 2024 (UTC)
- @Primefac I've implemented a few of the suggestions. I've reworked the code to exclude pages containing {{User:SodiumBot/NoNPPDelivery}}, which should serve as an opt-out mechanism :) I've also reworked the code to include SD0001's suggestion of adding a significant delay by making the bot wait at least an hour, and modified the regex to account for the messages sent by PageTriage.
- Wrt Extraordinary Writ's suggestions, I have restricted the lookup to the last 3 years as well and created a draft User:SodiumBot/ReviewerAfdNotification which has instructions on how to opt out. Sohom (talk) 16:02, 21 July 2024 (UTC)
- Thanks, I'll leave this open for a few days for comment before going to trial. Primefac (talk) 16:07, 21 July 2024 (UTC)
Approved for trial (50 edits). Please provide a link to the relevant contributions and/or diffs when the trial is complete. Please make sure this BRFA is linked in the edit summary. Primefac (talk) 23:50, 4 August 2024 (UTC)
- {{Operator assistance needed}} Any progress on this? Primefac (talk) 12:44, 23 December 2024 (UTC)
- I had left the bot running, but it hasn't picked up a single article by the looks of the logs. I'm gonna try to do some debugging on what the issue is/was. Sohom (talk) 14:22, 26 December 2024 (UTC)
- I've pushed some fixes, gonna see how that does. Sohom (talk) 15:24, 7 January 2025 (UTC)
- I ran across Wikipedia:Bots/Requests for approval/SDZeroBot 6 today, which is a very similar task, and uses an "opt out" strategy. This suggests that the community may be OK with having AFD notifications be on by default for a bot task like this. –Novem Linguae (talk) 07:10, 8 August 2024 (UTC)
A user has requested the attention of the operator. Once the operator has seen this message and replied, please deactivate this tag. (user notified) What is the status of this? * Pppery * it has begun... 16:05, 20 May 2025 (UTC)
- I remember @DreamRimmer helping me out a bit with this, I'm not sure what is going on with the bot atm/why it is still stuck :( Sohom (talk) 16:11, 20 May 2025 (UTC)
- Lemme take a deeper look and get back to you by ETOW Sohom (talk) 16:11, 20 May 2025 (UTC)
- @Sohom Datta: What is the status of this? Requests shouldn't linger for almost a year. * Pppery * it has begun... 16:35, 1 July 2025 (UTC)
- Noting that I'm aware -- I'll try some stuff over the weekend and report back -- If it doesn't work out, I'll close this as declined. Sohom (talk) 15:09, 2 July 2025 (UTC)
- Job is now working (thanks to a lot of work by DreamRimmer!) Sohom (talk) 14:57, 13 July 2025 (UTC)
Bots that have completed the trial period
Operator: Dbeef (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)
Time filed: 02:10, Friday, May 23, 2025 (UTC)
Automatic, Supervised, or Manual: automatic
Programming language(s): Rust
Source code available: https://github.com/fee1-dead/usync
Function overview: Sync userscripts from Git(Hub/Lab) to Wikipedia.
Links to relevant discussions (where appropriate): Wikipedia:Village pump (technical)/Archive 220#Syncing user scripts from an external Git repository to Wikipedia, Wikipedia:Bots/Requests for approval/DeltaQuadBot 9, User:Novem Linguae/Essays/Linking GitHub to MediaWiki
Edit period(s): Continuous
Estimated number of pages affected: All pages that transclude Wikipedia:USync
Exclusion compliant (Yes/No): No
Already has a bot flag (Yes/No): No
Function details: The bot scans and parses the list of user scripts at this list; each script must start with the following header format:
// {{Wikipedia:USync|repo=REPO_LINK|ref=REPO_REF|path=FILE_PATH}}
so for example:
// {{Wikipedia:USync |repo=https://github.com/fee1-dead/cplus |ref=refs/heads/production |path=main.js}}
And will start syncing from the Git file to the on-wiki script.
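For illustration, parsing the header shown above could look like this (a Python sketch; the bot itself is written in Rust and may parse the template differently):

```python
import re

# Sketch: extract repo/ref/path from a USync header comment line.
HEADER_RE = re.compile(
    r"//\s*\{\{Wikipedia:USync\s*"
    r"\|\s*repo\s*=\s*(?P<repo>[^|}]+?)\s*"
    r"\|\s*ref\s*=\s*(?P<ref>[^|}]+?)\s*"
    r"\|\s*path\s*=\s*(?P<path>[^|}]+?)\s*\}\}"
)

line = ("// {{Wikipedia:USync |repo=https://github.com/fee1-dead/cplus "
        "|ref=refs/heads/production |path=main.js}}")
m = HEADER_RE.match(line)
print(m.group("repo"), m.group("ref"), m.group("path"))
```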
Any user script author intending to use the bot must (1) insert the header both on-wiki and in the Git file themselves, serving as an authorization for the bot to operate, and (2) create an application/json webhook in their Git repository pointing to https://deadbeefbot-two.toolforge.org/webhook to notify the bot of new commits that have occurred on the file.
The bot will then make edits using the commit message and author information to update the user scripts.
Currently, it only supports js files in the User namespace, but its scope could be trivially expanded to cover more formats (CSS/plain wikitext) depending on usage.
This is an improvement upon the previous DeltaQuadBot task: Auditability is achieved through linking on-wiki edits to GitHub/GitLab URLs that tell you who made what changes. Webhooks are used instead of a periodic sync. Authorization must be given on-wiki to allow syncs to happen.
The code is currently a working demo. I'm planning on expanding its functionality to allow Wikimedia GitLab's webhooks, and actually deploying it. I will also apply for Interface Administrator perms as this bot requires IA permissions. Will also request 2FA on the bot when I get to it.
Discussion
- Just so we are aware of the alternatives here: bd808 suggested on Discord an alternative solution to this problem which does not involve an IntAdmin bot, where script developers can create OAuth tokens and submit those tokens to a Toolforge service, and the Toolforge service would use those OAuth tokens to make edits as the script author (0xDeadbeef/GeneralNotability/etc.) instead of having the edits coming from a single bot account. There are different trade-offs. I think if we're okay with a bot having IA permissions, then this solution is more convenient to set up, as the OAuth one requires going through the extra steps of creating a token. This bot also makes those edits in a centralized place when people want to inspect which scripts are maintained this way. beef [talk] 02:34, 23 May 2025 (UTC)
I see a risk here in having a bot blindly copy from GitHub without any human verification. Interface editor rights are restricted for very good reason, as editing the site's js would be very valuable to a potential attacker. By introducing this bot, we now also have to be concerned about the security of the GitHub repos the bot is copying from, something which is external to Wikipedia. We have no control over who might be granted access to those repos, and what they might do.
In fact, it may actually hinder development of tools/scripts. Currently, as a maintainer, one can be fairly liberal in who you add to your github repo, knowing that you can review any changes when you manually move them from the GitHub to on-wiki. With this change, anyone you add to the repo, realistically should be someone the community would trust with interface admin rights. --Chris 09:49, 23 May 2025 (UTC)
- I think the bot task is more aimed at user scripts than gadgets. You don't need to be an interface admin to edit your own scripts. Being an opt-in system, script maintainers who don't wish to take on the risk can choose not to use the system. As for security, it should be the responsibility of the script author to ensure that they, and others who have been added to the repo, have taken adequate measures (like enabling 2FA) to secure their github/gitlab accounts. – SD0001 (talk) 10:14, 23 May 2025 (UTC)
- For what it's worth, there are already people doing this kind of thing to their own userscripts, such as User:GeneralNotability/spihelper-dev.js. However, this was never done with a bot because the bot would need to be an interface admin. So they just store BotPasswords/OAuth tokens in GitHub and write a CI job that uses that to edit on-wiki.
- As someone with a fair bit of experience with the open source process, I don't see why someone who wants to personally review any changes themselves would choose to add people liberally to the GitHub repo and then choose to use this bot if it gets approved. They should try to move the development/approval cycle onto GitHub, appropriately using pull requests and protected branches, or just keep doing what they are doing. beef [talk] 10:22, 23 May 2025 (UTC)
- Script maintainers might be happy to take the risk of automatically copying scripts from an external site to become active client-side scripts at Wikipedia, and they might be happy with the increased vulnerability surface area. The question here is whether the Wikipedia community thinks the risk–benefit ratio means the procedure should be adopted. Johnuniq (talk) 10:36, 23 May 2025 (UTC)
- User scripts are an "install at your own risk" already, so feel free to avoid installing user scripts that do any automatic syncing. If the community doesn't like a bot that does this for whatever reason, I can also be fine with a "store OAuth tokens that give a toolforge service access to my account" approach which requires no community approval and no bots to run, just slightly less convenient to setup.
- All I am saying is that the "increased vulnerability surface area" remains to be proven. WP:ULTRAVIOLET and WP:REDWARN have been doing this for years. Whether approval for code occurs on-wiki or off-wiki shouldn't matter. beef [talk] 11:00, 23 May 2025 (UTC)
- The bot as proposed crosses a pretty major security boundary by taking arbitrary untrusted user input into something that can theoretically change common.js for all users on Wikipedia.
- Has anyone looked at the security of the bot itself? Chess (talk) (please mention me on reply) 01:44, 9 June 2025 (UTC)
- @Chess: "theoretically change common.js for all users on Wikipedia" - no, only common.js that link to the specified page/transclude the specified page would be in scope for the bot. dbeef [talk] 01:47, 9 June 2025 (UTC)
- @Dbeef: I understand what's in scope, but is the authorization token actually that granular? If there's a vulnerability in the bot, I could exploit that to edit anything. Chess (talk) (please mention me on reply) 02:04, 9 June 2025 (UTC)
- @Chess: I'm not sure what you mean.
- I had thought about the security implications long before this BRFA:
- The only public facing API of the bot is a webhook endpoint. While anyone can send in data that looks plausible, the bot will only update based on source code returned from api.github.com. So malicious actors have to be able to modify the contents of api.github.com to attack that.
- The credentials are stored on Toolforge, standard for a majority of Wikipedia bots. Root access is only given to highly trusted users and I don't think it will be abused to obtain the bot's OAuth credentials. If you think otherwise, I can move the bot deployment to my personal server provided by Oracle.
- The public facing part uses Actix Web, a popular and well-tested Web Framework. Toolforge provides the reverse proxy. Don't think there's anything exploitable to get RCE.
- The bot always checks the original page for the template with the configured parameters before editing. If the sync template is removed by the original owner or any interface administrator, the bot will not edit the page.
- dbeef [talk] 04:51, 9 June 2025 (UTC)
- @Dbeef: To answer Chess about BotPasswords, there is just one checkbox for "Edit sitewide and user CSS/JS" that encompasses both. ~ Amory (u • t • c) 01:06, 10 June 2025 (UTC)
- "While anyone can send in data that looks plausible, the bot will only update based on source code returned from api.github.com. So malicious actors have to be able to modify the contents of api.github.com to attack that." How does the bot verify the contents_url field in a request made to the webhook is hosted on api.github.com in the same repository as the .js file it is synching to?
- I'd be reassured by OAuth, mainly because it avoids taking untrusted user input into a bot with the permissions to edit MediaWiki:Common.js on one of the top ten most visited websites on Earth. Chess (talk) (please mention me on reply) 01:58, 10 June 2025 (UTC)
- "How does the bot verify the contents_url field in a request made to the webhook is hosted on api.github.com in the same repository as the .js file it is synching to?" That's a really good point. I need to fix that. dbeef [talk] 02:10, 10 June 2025 (UTC)
- @Dbeef: I'm uncomfortable with interface admin being granted to a bot that hasn't had anyone else do a serious code review.
- Not verifying contents_url would've allowed me to modify any of the scripts managed by dbeef onwiki, to give an example.
- OAuth limits the impact of any flaws to just making edits under certain user accounts. Chess (talk) (please mention me on reply) 14:34, 13 June 2025 (UTC)
- @Chess: That is a valid concern and an oversight. It was originally not there when I queried raw.githubusercontent, but I noticed that that updated slowly. I then decided to use api.github.com but hadn't realized contents_url was user input.
- That was quickly fixed two days ago.
- I won't be of much help reviewing my own code, but maybe other people can take a look as well? Maybe we can ping some rust developers.. dbeef [talk] 15:17, 13 June 2025 (UTC)
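For readers following along, the kind of contents_url check being discussed looks roughly like this (a Python sketch, not the bot's Rust code; it assumes the documented /repos/{owner}/{repo}/contents/{path} path shape, and the function name is hypothetical):

```python
from urllib.parse import urlparse

# Sketch: reject a webhook payload whose contents_url does not point at
# api.github.com for the repository configured on-wiki.
def contents_url_ok(contents_url: str, expected_repo: str) -> bool:
    u = urlparse(contents_url)
    return (
        u.scheme == "https"
        and u.hostname == "api.github.com"
        and u.path.startswith(f"/repos/{expected_repo}/contents/")
    )

print(contents_url_ok(
    "https://api.github.com/repos/fee1-dead/cplus/contents/main.js",
    "fee1-dead/cplus",
))  # True
```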
- I'm a C++ developer unfortunately. I know nothing about Rust and can't even compile the bot right now. Chess (talk) (please mention me on reply) 04:37, 14 June 2025 (UTC)
- @Dbeef: I understand what's in scope, but is the authorization token actually that granular? If there's a vulnerability in the bot, I could exploit that to edit anything. Chess (talk) (please mention me on reply) 02:04, 9 June 2025 (UTC)
- @Chess:
- Script maintainers might be happy to take the risk of automatically copying scripts from an external site to become active client-side scripts at Wikipedia, and they might be happy with the increased vulnerability surface area. The question here is whether the Wikipedia community thinks the risk–benefit ratio means the procedure should be adopted. Johnuniq (talk) 10:36, 23 May 2025 (UTC)
Has there been a discussion establishing community consensus for this task, per WP:ADMINBOT? I don't see one linked here, nor one from Wikipedia:Bots/Requests for approval/DeltaQuadBot 9. The community might also decide whether the OAuth route is preferable to the interface-admin route. Anomie⚔ 11:13, 23 May 2025 (UTC)
- Good idea, I'll post a summary to WP:VPT soon. beef [talk] 11:16, 23 May 2025 (UTC)
- See Wikipedia:Village pump (technical)#Syncing user scripts from an external Git repository to Wikipedia beef [talk] 12:18, 23 May 2025 (UTC)
- {{BotOnHold}} This is just until the discussion concludes (feel free to comment out when it has). Primefac (talk) 23:48, 25 May 2025 (UTC)
- The discussion was archived at Wikipedia:Village pump (technical)/Archive 220#Syncing user scripts from an external Git repository to Wikipedia with a rough consensus to implement the bot. dbeef [talk] 03:31, 8 June 2025 (UTC)
Approved for trial (30 edits or 30 days, whichever happens first). Please provide a link to the relevant contributions and/or diffs when the trial is complete. I will be cross-posting this to both WP:AN and WP:BN for more eyes. Primefac (talk) 13:23, 8 June 2025 (UTC)
- I will be deploying the bot in a few days and do some deliberate test edits to get this started. If any user script authors are willing to try this for trial please let me know :) dbeef [talk] 13:37, 8 June 2025 (UTC)
- The linked discussion seemed to settle pretty quickly on using OAuth rather than interface editor permissions. Is that still the plan? Anomie⚔ 03:07, 9 June 2025 (UTC)
- That's not how I read it. It was explored as an alternative but to me it looks like more editors expressed support for the interface editor bot. dbeef [talk] 03:37, 9 June 2025 (UTC)
- On reviewing again, it looks like I misremembered and misread. The subdiscussion that concluded in OAuth was about the possible alternative to interface editor. OTOH I'm not seeing much support for the conclusion that interface editor was preferred over (normal) OAuth either; the few supporting statements may have been considering only interface editor versus password sharing. Anomie⚔ 11:31, 9 June 2025 (UTC)
- It isn't necessarily an either/or thing. Both solutions can co-exist. If some people prefer the OAuth-based approach, they can of course implement that – it doesn't even need a BRFA. What's relevant is whether the discussion had a consensus against the interface editor approach – I don't think it does. – SD0001 (talk) 11:39, 9 June 2025 (UTC)
- "What's relevant is whether the discussion had a consensus against the interface editor approach – I don't think it does." As I said, I misremembered and misread. OTOH, dbeef claimed "but to me it looks like more editors expressed support for the interface editor bot", which I don't unambiguously see in the discussion either. "If some people prefer the OAuth-based approach, they can of course implement that – it doesn't even need a BRFA." I don't see any exception in WP:BOTPOL for fully automated bots using OAuth from the requirement for a BRFA. WP:BOTEXEMPT applies to the owner's userspace, not anyone who authorizes the bot via OAuth. WP:ASSISTED requires human interaction for each edit. WP:BOTMULTIOP does not contain any exemption from a BRFA. Anomie⚔ 12:00, 9 June 2025 (UTC)
- That's a fair observation. I do see support for an interface admin bot and I believe there are no substantial concerns that would make a blocker. I continue to think of interface admin bot as the easier solution but I am not opposed to figuring out the OAuth piece also at a later time. It is just that I don't have truckloads of time to focus on stuff that seems on its surface a bit redundant. dbeef [talk] 12:46, 9 June 2025 (UTC)
- With OAuth, the edits would be from the users' own accounts. No bot account is involved as edits are WP:SEMIAUTOMATED with each push/merge to the external repo being the required human interaction. – SD0001 (talk) 13:43, 9 June 2025 (UTC)
- I look at WP:SEMIAUTOMATED as having the user approve the actual edit, not just do something external to Wikipedia that results in an edit that they've not looked at. But this discussion is getting offtopic for this BRFA; if you think this is worth pursuing, WP:BON or WT:BOTPOL would probably be better places. Anomie⚔ 12:01, 10 June 2025 (UTC)
- Instead of requiring it to be linked and followed by some text in an arbitrary sequence, I'd suggest to use a transclusion for clarity, like {{Wikipedia:AutoScriptSync|repo=<>|branch=<>|path=<>}} (perhaps also better to put the page in project space). – SD0001 (talk) 15:52, 8 June 2025 (UTC)
- that's a little harder to parse but I suppose not too hard to implement, if parsoid can do it (hand-parsing is an option too). I'll take a look in the next few days. dbeef [talk] 16:02, 8 June 2025 (UTC)
After reading comments here, I'm unsure. (1) Why do we need a bot for this? Is there a need to perform this task repeatedly over a significant period of time? (Probably this is answered in the VPT discussion linked above, but it's more technical than I can understand.) (2) Imagine that a normal bot copies content from github to a normal userspace page, and then a human moves it to the appropriate page, e.g. first the bot puts a script at User:Nyttend/pagefordumping, and then I move it to User:Nyttend/script. This should avoid the security issue, since there's no need for the bot to have any rights beyond autoconfirmed. Would this work, or is this bot's point to avoid the work involved in all those pagemoves? (3) On the other hand, before interface admin rights were created, and normal admins could handle this kind of thing, do we know of any adminbots that were working with scripts of any sort, and if so, how did the security process work out? Nyttend (talk) 10:17, 9 June 2025 (UTC)
- (1) Yes, because platforms like GitHub give better experiences when developing user scripts, instead of having people copy from their local code editor and paste to Wikipedia each time. This includes CI and allowing transpiled languages such as TypeScript to work. (2) "is this bot's point to avoid the work involved in all those pagemoves" - Yeah. (3) I don't think there was any bot that did this. dbeef [talk] 10:34, 9 June 2025 (UTC)
- How are you handling licensing? When you, via your bot, publish a revision here you are doing so under CCBYSA4 and GFDL. What are you doing to ensure that the source content you are publishing is available under those licenses? — xaosflux Talk 10:07, 13 June 2025 (UTC)
- I think I could put a section on WP:USync that says "by inserting the header you assert that any code you submit through the Git repository is licensed under CCBYSA4/GFDL or another compatible license", but that's the best I can do.
- Would you want me to parse SPDX licenses or something? I think the responsibility is largely on the people who use the bot and not the bot itself when it comes to introducing potential copyvios. dbeef [talk] 15:09, 13 June 2025 (UTC)
- Is a compatible license even common on that upstream? You can't delegate authority, whoever publishes a revision is the one issuing the license on the derivative work. — xaosflux Talk 18:31, 13 June 2025 (UTC)
- It appears this may end up whitewashing licenses. Anyone that reads any page from our project should be able to confidently trust the CCBYSA license we present, including required elements such as the list of authors. — xaosflux Talk 00:56, 14 June 2025 (UTC)
- SPDX is an industry standard and is meant for automatically verifying the licence of a source file. Would that be inappropriate here? Chess (talk) (please mention me on reply) 04:36, 14 June 2025 (UTC)
- I was just wondering how exactly we should be doing it.
- For example, we can require that one must use something like {{Wikipedia:USync|authors=Foo|license=MIT}}, with license being a manually approved list. dbeef [talk] 04:39, 14 June 2025 (UTC)
- Including it in a template in the userscript makes sense, since then the list of authors' preferred attribution can be maintained on the repo instead of onwiki, while still being replicated onwiki.
- The "license" field should probably be SPDX if that makes it easier to parse.
- Specifically, the "licence" field should contain
CC-BY-SA-4.0 OR GFDL-1.3-or-later
since that matches the requirements for contributing to Wikipedia, which is that all content must be available under both licences. I don't think allowing MIT-only (or other arbitrary permissive licences) makes sense right now under the assumption it's compatible with CC-BY-SA/GFDL. We might have to maintain the MIT licence text, and the only people using this bot would be those writing userscripts specifically for Wikipedia. Multiply that by the many variants of licences that exist. - I think it's a good idea to keep the amount of parsing in the bot as small as possible given its permissions and impact. Chess (talk) (please mention me on reply) 02:30, 15 June 2025 (UTC)
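A minimal sketch of the allow-list idea just described (assuming a hypothetical |license= field holding an SPDX-style string; purely illustrative):

```python
# Sketch: accept only the licence combination Wikipedia itself requires,
# rather than parsing arbitrary SPDX expressions.
ALLOWED_LICENSES = {"CC-BY-SA-4.0 OR GFDL-1.3-or-later"}

def license_ok(license_field: str) -> bool:
    return license_field.strip() in ALLOWED_LICENSES

print(license_ok("CC-BY-SA-4.0 OR GFDL-1.3-or-later"))  # True
print(license_ok("MIT"))  # False
```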
- We can't take content in incompatible licenses and copy them to Wikipedia. Any page published needs to be available to those reading it CCBYSA and GFDL. Additionally, if the remote site uses a -BY- license, we need to ensure that the remote authors continue to be properly attributed when republishing here. — xaosflux Talk 12:59, 17 June 2025 (UTC)
- I don't see a problem. All edits are implicitly released under those licenses, whether done from the UI or through some script. All you've to do is to declare in the bot documentation that "you agree to release all code you deploy to the wiki via the bot under CC-BY-SA and GFDL". – SD0001 (talk) 13:45, 17 June 2025 (UTC)
- this scheme is what i had in mind as well. I am not entirely sure whether a mandatory license field is needed. The assertion that content is compatibly licensed should ideally come from the very edit that inserts the WP:USync header (and we should assume such), and it's not that inserting a license field will be any different from it. dbeef [talk] 10:58, 18 June 2025 (UTC)
- I've added a note for licensing at Wikipedia:USync#Licensing note. dbeef [talk] 05:54, 4 July 2025 (UTC)
- I don't see a problem. All edits are implicitly released under those licenses, whether done from the UI or through some script. All you've to do is to declare in the bot documentation that "you agree to release all code you deploy to the wiki via the bot under CC-BY-SA and GFDL". – SD0001 (talk) 13:45, 17 June 2025 (UTC)
- We can't take content in incompatible licenses and copy them to Wikipedia. Any page published needs to be available to those reading it CCBYSA and GFDL. Additionally, if the remote site uses a -BY- license, we need to ensure that the remote authors continue to be properly attributed when republishing here. — xaosflux Talk 12:59, 17 June 2025 (UTC)
- Is a compatible license even common on that upstream? You can't delegate authority; whoever publishes a revision is the one issuing the license on the derivative work. — xaosflux Talk 18:31, 13 June 2025 (UTC)
- If it matters, I can vouch that CI/CD is a basic requirement now for much of software development, so I'm generally supportive of the intent of this proposal. It's better because it creates a single source of truth for what is currently deployed to the Wiki. Chess (talk) (please mention me on reply) 16:35, 13 June 2025 (UTC)
- Someone may also want to modify the edit filter 960 (hist · log) to not log this bot's edits if it gets approved. – PharyngealImplosive7 (talk) 18:42, 19 June 2025 (UTC)
- Small suggestion, not a blocker: consider removing "(https://github.com/NovemLinguae/UserScripts/compare/a2f0328d4361...9b5e44e3be32)" from the edit summary. It doubles the size of the edit summary, making each line on the history page about 3 lines long for me instead of 1 or 2; it might be more readable with less clutter. Example history page. Thanks for this bot. I'm really liking it so far. –Novem Linguae (talk) 00:01, 29 June 2025 (UTC)
- I'm not sure if I want to do that. Using the name of the committer and linking to the original commits is helpful for attribution. It's unfortunate that GitHub isn't in our Special:Interwiki map (and that edit summaries don't support external links), but once I add support for Wikimedia GitLab, a shorter link will be possible (example: (diff)). dbeef [talk] 03:06, 29 June 2025 (UTC)
- I was going to suggest that GitHub be added to the interwiki map, but it looks like that idea was rejected in 2017: m:Talk:Interwiki_map/Archives/2017#GitHub * Pppery * it has begun... 16:34, 1 July 2025 (UTC)
- I agree. That seems like a rather opinionated rejection from 2017 that could be revisited. – SD0001 (talk) 04:36, 2 July 2025 (UTC)
- We do now have a use case that external links don't fulfill. Anomie⚔ 12:14, 2 July 2025 (UTC)
- The bot's intadmin permission is going to expire in a week and may need extending. –Novem Linguae (talk) 23:24, 1 July 2025 (UTC)
- We should probably let it expire, as that is when the trial ends. dbeef [talk] 03:41, 2 July 2025 (UTC)
Trial complete. See Special:Contribs/DeadbeefBot II, WP:USync, and the source code. dbeef [talk] 09:12, 9 July 2025 (UTC)
- @BAG: and Novem Linguae (sorry for the mass-ping, but IA for a bot is kind of a "big deal"), do any of you see issue with this bot getting indefinite IA (i.e. "task approved")? I'm not seeing any issues but I'd like at least 1-2 other folk to sign off on this. Primefac (talk) 12:20, 12 July 2025 (UTC)
- I don't see any issues with this. – SD0001 (talk) 13:35, 12 July 2025 (UTC)
- I have fairly major security concerns with this. What prevents me from going to GitHub and maliciously replacing the userscript with a "Find Image.img --> Replace with Very_Gross_Image.img" type of script instead? This bot would then sync Wikipedia and GitHub, uploading my malicious script to Wikipedia, all without anyone doing any review at any point.
- I'm open to being convinced I'm not understanding the situation clearly, but an OAuth "upon request by the script maintainer" type of solution seems better to me. Headbomb {t · c · p · b} 13:57, 12 July 2025 (UTC)
- You'd need collaborator or member access to that GitHub repo. I imagine the user script creator would be aware and in control of who they add, and would only add people they trust. –Novem Linguae (talk) 14:00, 12 July 2025 (UTC)
- Hi @Headbomb: the bot will only use content from the GitHub repo that you specify when inserting the header. Nothing will work unless you first insert the header containing the link to the GitHub repo on the Wikipedia script page.
- If you mean that you yourself insert the malware into your own user script, but do it through the bot - I don't see how you'd evade scrutiny or shift responsibility onto the bot in that situation. dbeef [talk] 15:19, 12 July 2025 (UTC)
- Testing - I was one of the testers. Looked good in testing. Was useful and not buggy. The bot responded very fast (like 5 seconds).
- For security reasons, we should probably also scrutinize the security algorithm and the security code. Dbeef, please correct me if I get anything wrong, or feel free to expand on this below.
- Security algorithm - Detailed at Wikipedia:USync. The bot checks for 1) an authorization string on the onwiki .js page, and 2) an authorization string in the GitHub repo. The authorization string contains the exact GitHub repo, and must be present in both places. So you need access to edit both 1 and 2 in order to set up and authorize the bot. Access to edit 1 is obtained by a) that .js file being in your userspace, or b) being an interface administrator. Access to edit 2 is obtained by being a collaborator or member on that GitHub repo. The user script owner picks which GitHub repo. The assumption is that the user script owner will own the repo, and will only grant access to the repo to people that they trust. The repo is specifically spelled out in the onwiki edit (1).
- Everything in this chain of security checks starts with the edit to the .js page, and the edit contains the specific GitHub repo to link. So anyone who can edit that .js page has control over all this. The folks that can edit a .js page are the user if it's one of their subpages, and interface administrators. Those users are all trusted, so this should be fine.
- Security code - Here's one of the more important lines of code for security. It takes the string it found onwiki specifying which GitHub repo is to be linked, and compares it to the GitHub page to make sure they're identical. parse_js_header() also looks important. Other eyes encouraged to make sure I didn't miss anything. –Novem Linguae (talk) 14:17, 12 July 2025 (UTC)
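To make the shape of that check concrete, here is a minimal sketch of what parsing the onwiki header and requiring an exact match with the Git-side header could look like. The header field names (repo, path, ref) and the exact syntax are invented for illustration; the actual format is defined at WP:USync and implemented in parser.rs, not here.
<syntaxhighlight lang="rust">
// Sketch only: a hypothetical header of the form
//   // {{Wikipedia:USync|repo=owner/name|path=dist/script.js|ref=main}}
// The real syntax lives at WP:USync / parser.rs; this just illustrates the
// "parse both sides, then require an exact match" idea.

#[derive(Debug, PartialEq, Eq)]
struct SyncHeader {
    repo: String,
    path: String,
    git_ref: String,
}

fn parse_header(line: &str) -> Option<SyncHeader> {
    let inner = line
        .trim()
        .strip_prefix("// {{Wikipedia:USync|")?
        .strip_suffix("}}")?;
    let (mut repo, mut path, mut git_ref) = (None, None, None);
    for part in inner.split('|') {
        match part.split_once('=')? {
            ("repo", v) => repo = Some(v.to_string()),
            ("path", v) => path = Some(v.to_string()),
            ("ref", v) => git_ref = Some(v.to_string()),
            _ => return None, // unknown parameters are rejected outright
        }
    }
    Some(SyncHeader { repo: repo?, path: path?, git_ref: git_ref? })
}

fn main() {
    let onwiki = parse_header("// {{Wikipedia:USync|repo=owner/name|path=dist/script.js|ref=main}}");
    let from_git = parse_header("// {{Wikipedia:USync|repo=owner/name|path=dist/script.js|ref=main}}");
    // The bot refuses to edit unless both headers parse and are identical.
    assert!(onwiki.is_some() && onwiki == from_git);
}
</syntaxhighlight>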
- Attack vector 1 - Social engineering attacks. If a user script writer can be convinced to add a [[Wikipedia:USync]] header that points to a repo they don't own, that could be an issue. However, I don't see that as a deal breaker; I can think of a worse social engineering vector that involves user scripts, and we still allow that.
- Attack vector 2 - Adding someone to your GitHub repo who later goes rogue or gets hacked. It's a risk, but in my opinion not worth blocking this over. Multiple people being able to collaborate on a repo that has continuous deployment set up, which is what this bot enables, is worth it, in my opinion. –Novem Linguae (talk) 14:24, 12 July 2025 (UTC)
- Attack vector 3 - Can this bot be tricked into updating a non-js page? If so, someone could trick it into spamming mainspace or something. Dbeef, can you talk a bit more about the bot's page whitelist algorithm? Things to think about... Could someone get the bot to edit a page that doesn't end in .js? Could someone get the bot to edit a page that ends in .js but isn't content model = javascript? –Novem Linguae (talk) 14:28, 12 July 2025 (UTC)
- Thanks for the great summary, @Novem Linguae. I plan to elaborate a bit more with a walkthrough of the code, but to answer your question first: the bot's page allowlisting happens in parser.rs, in the search function in particular. Only pages that (1) transclude the WP:USync page and (2) have the javascript content model will be stored in an in-memory map.
- This in-memory map is the source of truth for any incoming webhooks, and only pages stored in it (that is, already-known transclusions) will be considered for further processing. dbeef [talk] 14:47, 12 July 2025 (UTC)
- (note that CSS support may be added in the future, which will result in a check for the css content model when that has been implemented) dbeef [talk] 14:48, 12 July 2025 (UTC)
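For readers who want a concrete picture, the allowlist described above could be modelled roughly as below. This is a simplified sketch, not the actual parser.rs code; the type and field names are invented for illustration.
<syntaxhighlight lang="rust">
// Sketch only: an in-memory allowlist of already-known transclusions,
// keyed by on-wiki title. Pages enter the map only if they transclude
// WP:USync and have the javascript content model; webhook events for
// anything else are ignored.
use std::collections::HashMap;

#[derive(Debug)]
struct ScriptPage {
    repo: String, // e.g. "owner/name", taken from the on-wiki header
    path: String, // path of the file inside the repo
}

struct Allowlist {
    pages: HashMap<String, ScriptPage>, // title -> linked repo/path
}

impl Allowlist {
    /// Called when (re)scanning transclusions of WP:USync.
    fn insert_if_eligible(&mut self, title: &str, content_model: &str, page: ScriptPage) {
        if content_model == "javascript" {
            self.pages.insert(title.to_string(), page);
        }
    }

    /// Called when a webhook notification arrives: unknown titles are dropped.
    fn lookup(&self, title: &str) -> Option<&ScriptPage> {
        self.pages.get(title)
    }
}

fn main() {
    let mut allow = Allowlist { pages: HashMap::new() };
    allow.insert_if_eligible(
        "User:Example/script.js",
        "javascript",
        ScriptPage { repo: "owner/name".into(), path: "dist/script.js".into() },
    );
    println!("{:?}", allow.lookup("User:Example/script.js")); // known transclusion
    assert!(allow.lookup("Main Page").is_none()); // never eligible
}
</syntaxhighlight>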
- "Access to edit 2 is obtained by being a collaborator or member on that GitHub repo."
- This is my fear/contention. On Wikipedia, we tightly control who can edit userscripts: the user themselves, or an IA. On GitHub, it's whoever the script maintainer decides. You might say "but that's the same as trusting the script coder on wiki", but it really is not. If, on Wikipedia, ScriptCoder31's account gets taken over, we block them. On GitHub... now we're depending on a third party deciding to get involved in a credentials fight. Or maybe I trust ScriptCoder31 to be a sane coder, but they have poor judgment on granting codebase access and grant their high school sibling access because they think it'll be a good learning experience, and said high school sibling decides that replacing all images on Wikipedia with VERY_GROSS_IMAGE.IMG would be very funny. Or they run into an issue, ask for help on Stack Exchange, and a rando asks for access because they want to optimize the code / can fix a problem / make up whatever excuse to gain code access.
- Headbomb {t · c · p · b} 17:52, 12 July 2025 (UTC)
- Security Overview. To make sure that we can trust this bot enough, here are three points to cover:
- Make sure that the account itself is secure.
- Make sure that the bot does not edit outside of the pages that we want it to edit.
- Make sure that the bot does not insert bad content (only the content we want it to insert).
- And here's my summary:
- The bot is run on Toolforge. The bot's OAuth secret is stored in a file:
-rw------- 1 tools.deadbeefbot-two tools.deadbeefbot-two 1312 Jun 9 04:34 secrets.toml
This means only people with access to the deadbeefbot-two tool (only me), plus people with root access (the WMCS team plus a few trusted volunteers), have access to the account. The account is additionally enrolled in 2FA and has a strong password (note that using the OAuth token does not require 2FA).
- All titles that end up being edited have to (1) transclude the WP:USync page, (2) have the javascript content model (this may change in the future to include CSS), and (3) have the header described at WP:USync parse successfully, before the bot will even consider editing them. This makes it extremely difficult for an attacker to direct the bot to a different page than expected.
- We use GitHub webhooks as a trigger but not as a source of truth. The webserver that accepts GitHub webhooks is open to the public, so all sorts of requests can come through. It is hard to validate whether the webhook content is actually coming from GitHub, but we don't need to: the URL we use to fetch the content from GitHub is hardcoded to the format https://api.github.com/repos/{repo}/contents/{path} (this may change in the future to allow Wikimedia GitLab as an alternative option). So any request received by the webserver only acts as a notification for the bot to check the content from the actual authoritative source. The webhook content affects nothing except the edit summary that the bot uses, which is pointless to attack - an attacker would need to race GitHub itself, for scripts that are set up to use GitHub webhooks properly, to get their malicious version to our server. (We could also just stop using the webhook content entirely, but I thought the current scheme was better.) You also can't change the header from Git - the bot will error if the header (the parameters containing the repo, path, and refspec for Git) on the Git side has different content than the header on Wikipedia - so you can only change the header from Wikipedia.
- Hopefully this resolves concerns people may have, let me know if people have additional questions or concerns. dbeef [talk] 15:15, 12 July 2025 (UTC)
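As a rough illustration of the "webhook is only a trigger" design described above, the flow might look like the sketch below: the webhook body contributes nothing except (optionally) the edit summary, and the deployed content is always re-fetched from a URL built from the on-wiki header. The function and type names here are invented for illustration and are not the bot's actual code.
<syntaxhighlight lang="rust">
// Sketch only: a webhook event merely names a page; the content that gets
// deployed is always fetched from the GitHub contents API, using the repo
// and path recorded in the on-wiki header (never from the webhook body).

struct SyncTarget {
    repo: String, // e.g. "owner/name", from the on-wiki header
    path: String, // e.g. "dist/script.js", from the on-wiki header
}

/// The only place content is fetched from: a hardcoded URL shape.
fn authoritative_url(target: &SyncTarget) -> String {
    format!(
        "https://api.github.com/repos/{}/contents/{}",
        target.repo, target.path
    )
}

fn main() {
    let target = SyncTarget {
        repo: "owner/name".into(),
        path: "dist/script.js".into(),
    };
    // A spoofed webhook can at worst cause an extra fetch of this URL;
    // it cannot change where the content comes from.
    println!("{}", authoritative_url(&target));
}
</syntaxhighlight>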
- Good callout on setting the Toolforge password file to 0600. That is easy to forget for someone used to non-Linux systems, or someone used to mainstream PaaS web hosting where your files are all kept private for you. –Novem Linguae (talk) 15:23, 12 July 2025 (UTC)
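For anyone setting up something similar on Toolforge, a start-up sanity check along these lines can catch an overly permissive secrets file before the bot runs. This is a generic standard-library sketch, not code from the bot; the filename is simply the one shown in the listing above.
<syntaxhighlight lang="rust">
// Sketch only: refuse to start if the secrets file is readable by anyone
// other than the owning tool account (i.e. not mode 0600). Unix-only.
use std::fs;
use std::os::unix::fs::PermissionsExt;

fn check_secrets_mode(path: &str) -> std::io::Result<()> {
    let mode = fs::metadata(path)?.permissions().mode() & 0o777;
    if mode != 0o600 {
        // Fail loudly rather than running with a world-readable OAuth secret.
        panic!("{path} has mode {mode:o}, expected 600");
    }
    Ok(())
}

fn main() -> std::io::Result<()> {
    check_secrets_mode("secrets.toml")
}
</syntaxhighlight>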
- Okay, it turns out we can actually bulletproof this bad boy. GitHub's "ooOh let's validate your webhooks" suggestion is absolutely bonkers. If it were a simple secret parameter attached to every request, we could at least store the hash of that secret, which would let it be configurable per person on-wiki without any middleperson/database/web interface shenanigans. But GitHub chose to hash the entire request for whatever reason - something GitLab has not decided to do (FOSS projects keep winning).
- But it turns out we can use a way simpler method - only allow GitHub's IPs. I'll debug this a bit to see if there are any interactions with Toolforge proxying requests to us and write up a CIDR check tomorrow. dbeef [talk] 15:39, 12 July 2025 (UTC)
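An IP range check of that sort can be done with the standard library alone; the sketch below is a generic illustration. The range shown is a documentation placeholder, not GitHub's actual block (the real hook ranges would have to be loaded from GitHub's published metadata), and, as the next reply notes, any reverse proxy in front of the tool complicates relying on the source IP at all.
<syntaxhighlight lang="rust">
// Sketch only: IPv4 CIDR containment check using only the standard library.
// The allowlist here is a documentation-range placeholder; a real deployment
// would load GitHub's current hook ranges and account for any proxy layer.
use std::net::Ipv4Addr;

fn in_cidr(ip: Ipv4Addr, network: Ipv4Addr, prefix: u8) -> bool {
    assert!(prefix <= 32);
    let mask = if prefix == 0 { 0 } else { u32::MAX << (32 - prefix) };
    (u32::from(ip) & mask) == (u32::from(network) & mask)
}

fn main() {
    let allowed = [(Ipv4Addr::new(203, 0, 113, 0), 24u8)]; // placeholder range
    let source: Ipv4Addr = "203.0.113.42".parse().unwrap();
    let ok = allowed.iter().any(|&(net, prefix)| in_cidr(source, net, prefix));
    assert!(ok);
}
</syntaxhighlight>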
- Toolforge uses an anonymizing reverse proxy that scrubs out IPs from the request. – SD0001 (talk) 19:11, 12 July 2025 (UTC)
Operator: CanonNi (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)
Time filed: 12:49, Tuesday, December 17, 2024 (UTC)
Function overview: A replacement for tasks 1, 2, 7, 8, 9, and 15 of FastilyBot (talk · contribs), whose operator has retired
Automatic, Supervised, or Manual: Automatic
Programming language(s): Rust (mwbot-rs crate)
Source code available: Will push to GitLab later
Links to relevant discussions (where appropriate): See this
Edit period(s): Daily
Estimated number of pages affected: A couple dozen every day
Namespace(s): File:
Exclusion compliant (Yes/No): Yes
Function details: Near-identical functionality to the previous bot, just rewritten in a different (and better) language. All of these tasks modify templates on File description pages, so I'm merging them into one request.
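As a rough sketch of the kind of edit these tasks make (for example, removing {{Copy to Wikimedia Commons}} from a file description page once it no longer applies), template removal can be approximated as below. This is a generic illustration using the regex crate, not the actual mwbot-rs implementation, and it ignores nested-template edge cases that real code would need to handle.
<syntaxhighlight lang="rust">
// Sketch only: strip a simple maintenance template from page wikitext.
// Requires the regex crate; the actual bot uses mwbot-rs and proper
// wikitext handling rather than a regex.
use regex::Regex;

fn remove_template(wikitext: &str, template: &str) -> String {
    // NOTE: assumes `template` contains no regex metacharacters.
    // Matches {{Template}} or {{Template|param=...}} with no nested braces.
    let pattern = format!(r"\{{\{{\s*{template}\s*(\|[^{{}}]*)?\}}\}}\n?");
    let re = Regex::new(&pattern).expect("valid pattern");
    re.replace_all(wikitext, "").into_owned()
}

fn main() {
    let before = "{{Copy to Wikimedia Commons|human=yes}}\n== Summary ==\nA file.";
    let after = remove_template(before, "Copy to Wikimedia Commons");
    assert!(!after.contains("Copy to Wikimedia Commons"));
}
</syntaxhighlight>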
Task details (copied from WP:BOTREQ)
Discussion
- Thanks for stepping up to help! For easier review and tracking, could you please list all these tasks and their descriptions in the "Function details" section? You can use a wikitable for this. – DreamRimmer (talk) 13:51, 17 December 2024 (UTC)
Added above. '''[[User:CanonNi]]''' (talk • contribs) 13:58, 17 December 2024 (UTC)
Approved for trial (120 edits). Please provide a link to the relevant contributions and/or diffs when the trial is complete. Please perform 20 edits for each task. Primefac (talk) 12:35, 23 December 2024 (UTC)
- {{Operator assistance needed}} Any update? – DreamRimmer (talk) 17:07, 14 February 2025 (UTC)
- Sorry about that - I've been inactive for the past three months or so. (probably should've got this thing done before disappearing, my bad) Will get those trial edits done soon! '''[[User:CanonNi]]''' (talk • contribs) 10:14, 22 May 2025 (UTC)
Trial complete. @Primefac and DreamRimmer: alright, I've finally completed the trial edits after procrastinating for uh... five months? See Special:Contributions/CanonNiBot. Some edits (60 or so according to xTools) were performed on pages that have since been deleted. No major issues, and the code has been pushed to GitHub. '''[[User:CanonNi]]''' (talk • contribs) 13:20, 8 June 2025 (UTC)
- This seems to have only made edits removing {{Copy to Wikimedia Commons}}. Do the other tasks not require any edits at this time? Primefac (talk) 13:56, 8 June 2025 (UTC)
- @Primefac very sorry for the late reply - I uh, accidentally removed this page from my watchlist a while back. (Aaron Liu thanks for the reminder) I've performed several trial edits for all tasks, some of which were to pages that are now deleted, which is why they're not showing up on Special:Contribs. The tasks that required the most edits were 1, 7, 8, and 9. [[User:CanonNi]] (💬 • ✍️) 09:51, 23 June 2025 (UTC)
- Please do at least a few edits for each of the other tasks. Feel free to post individual diff links, I'll keep an eye on this page (and of course, I can see deleted revisions). Primefac (talk) 23:24, 23 June 2025 (UTC)
A user has requested the attention of the operator. Once the operator has seen this message and replied, please deactivate this tag. (user notified) Primefac (talk) 00:16, 21 June 2025 (UTC) Re-enabling, asked a question and need a reply. Primefac (talk) 12:23, 12 July 2025 (UTC)
Approved requests
Bots that have been approved for operations after a successful BRFA will be listed here for informational purposes. No other approval action is required for these bots. Recently approved requests can be found here, while old requests can be found in the archives.
- TenshiBot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 3) Approved 16:20, 13 July 2025 (UTC) (bot has flag)
- GalliumBot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 4) Approved 19:06, 12 July 2025 (UTC) (bot has flag)
- Bot1058 (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 10) Approved 12:27, 12 July 2025 (UTC) (bot has flag)
- C1MM-bot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 3) Approved 06:34, 8 July 2025 (UTC) (bot has flag)
- AussieBot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 1) Approved 17:04, 4 July 2025 (UTC) (bot has flag)
- TenshiBot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 2) Approved 15:51, 2 July 2025 (UTC) (bot has flag)
- GraphBot (BRFA · contribs · actions log · block log · flag log · user rights) Approved 00:20, 21 June 2025 (UTC) (bot has flag)
- CitationCleanerBot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 6) Approved 13:55, 8 June 2025 (UTC) (bot has flag)
- DreamRimmer bot III (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 9) Approved 12:56, 8 June 2025 (UTC) (bot has flag)
- RustyBot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 3) Approved 12:51, 8 June 2025 (UTC) (bot has flag)
- DeadbeefBot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 4) Approved 12:51, 5 June 2025 (UTC) (bot has flag)
- AnomieBOT (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 84) Approved 13:11, 31 May 2025 (UTC) (bot has flag)
- TenshiBot (BRFA · contribs · actions log · block log · flag log · user rights) Approved 15:34, 15 May 2025 (UTC) (bot has flag)
- PharyngealBOT (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 2) Approved 13:03, 12 May 2025 (UTC) (bot has flag)
- Jlwoodbot (BRFA · contribs · actions log · block log · flag log · user rights) Approved 03:41, 11 May 2025 (UTC) (bot has flag)
- SchlurcherBot (BRFA · contribs · actions log · block log · flag log · user rights) Approved 23:53, 6 April 2025 (UTC) (bot has flag)
- PrimeBOT (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 47) Approved 11:40, 19 March 2025 (UTC) (bot has flag)
- Qwerfjkl (bot) (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 32) Approved 07:08, 15 March 2025 (UTC) (bot has flag)
- BunnysBot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 6) Approved 14:23, 11 March 2025 (UTC) (bot has flag)
- HilstBot (BRFA · contribs · actions log · block log · flag log · user rights) Approved 20:30, 28 February 2025 (UTC) (bot to run unflagged)
- MinusBot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 2) Approved 14:20, 11 February 2025 (UTC) (bot has flag)
- Bot1058 (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 9) Approved 12:35, 3 February 2025 (UTC) (bot has flag)
- AnomieBOT (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 83) Approved 15:34, 2 February 2025 (UTC) (bot has flag)
- BunnysBot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 5) Approved 11:18, 2 February 2025 (UTC) (bot has flag)
- VWF bot (BRFA · contribs · actions log · block log · flag log · user rights) Approved 14:17, 1 February 2025 (UTC) (bot has flag)
- DreamRimmer bot II (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 3) Approved 11:56, 1 February 2025 (UTC) (bot has flag)
- BunnysBot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 4) Approved 15:50, 26 January 2025 (UTC) (bot has flag)
- KiranBOT (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 12) Approved 12:33, 19 January 2025 (UTC) (bot has flag)
- DreamRimmer bot II (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 2) Approved 16:47, 12 January 2025 (UTC) (bot has flag)
- PrimeBOT (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 46) Approved 12:22, 3 January 2025 (UTC) (bot has flag)
Denied requests
Bots that have been denied for operations will be listed here for informational purposes for at least 7 days before being archived. No other action is required for these bots. Older requests can be found in the Archive.
- PageLinkScraper (BRFA · contribs · actions log · block log · flag log · user rights) Bot denied 07:55, 10 March 2025 (UTC)
- CiteHelperBot (BRFA · contribs · actions log · block log · flag log · user rights) Bot denied 12:36, 8 January 2025 (UTC)
- MolecularBot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 3) Bot denied 13:26, 1 January 2025 (UTC)
- Raph65BOT (BRFA · contribs · actions log · block log · flag log · user rights) Bot denied 00:37, 23 December 2024 (UTC)
- Silksam bot (BRFA · contribs · actions log · block log · flag log · user rights) Bot denied 12:54, 2 December 2024 (UTC)
- MdWikiBot (BRFA · contribs · actions log · block log · flag log · user rights) Bot denied 12:04, 3 August 2024 (UTC)
- Arjunaraocbot (BRFA · contribs · actions log · block log · flag log · user rights) Bot denied 07:35, 23 March 2024 (UTC)
Expired/withdrawn requests
These requests have either expired, as information required by the operator was not provided, or been withdrawn. These tasks are not authorized to run, but such lack of authorization does not necessarily follow from a finding as to merit. A bot that, having been approved for testing, was not tested by an editor, or one for which the results of testing were not posted, for example, would appear here. Bot requests should not be placed here if there is an active discussion ongoing above. Operators whose requests have expired may reactivate their requests at any time. The following list shows recent requests (if any) that have expired, listed here for informational purposes for at least 7 days before being archived. Older requests can be found in the respective archives: Expired, Withdrawn.
- Ow0castBot (BRFA · contribs · actions log · block log · flag log · user rights) Expired 06:50, 8 July 2025 (UTC)
- KiranBOT (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 10) Withdrawn by operator 02:59, 28 April 2025 (UTC)
- KiranBOT (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 14) Withdrawn by operator 02:54, 28 April 2025 (UTC)
- PharyngealBOT (BRFA · contribs · actions log · block log · flag log · user rights) Withdrawn by operator 03:43, 26 April 2025 (UTC)
- UrbanBot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 3) Withdrawn by operator 02:21, 5 February 2025 (UTC)
- Tom.Bot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 8) Withdrawn by operator 11:37, 4 February 2025 (UTC)
- RustyBot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 2) Withdrawn by operator 15:31, 1 February 2025 (UTC)
- JJPMaster (bot) (BRFA · contribs · actions log · block log · flag log · user rights) Expired 15:50, 26 January 2025 (UTC)
- BunnysBot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 2) Withdrawn by operator 12:31, 13 January 2025 (UTC)
- Platybot (BRFA · contribs · actions log · block log · flag log · user rights) Expired 16:58, 12 January 2025 (UTC)
- PonoRoboT (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 2) Expired 16:53, 12 January 2025 (UTC)
- MacaroniPizzaHotDog Bot (BRFA · contribs · actions log · block log · flag log · user rights) Withdrawn by operator 16:15, 12 January 2025 (UTC)
- DannyS712 bot III (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 74) Expired 12:47, 23 December 2024 (UTC)
- JJPMachine (BRFA · contribs · actions log · block log · flag log · user rights) Withdrawn by operator 04:28, 26 November 2024 (UTC)
- FrostlySnowman (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 10) Withdrawn by operator 04:41, 4 November 2024 (UTC)