Wikiversity:Colloquium
"Knowledge grows when shared." — Bhartrihari (discuss)
Requested update to Wikiversity:Interface administrators
Currently, Wikiversity:Interface administrators is a policy that includes a caveat that interface admins are not required long-term and that the user right can only be added for a period of up to two weeks. I am proposing that we remove this qualification and allow for indefinite interface admin status. I think this is useful because there are reasons for tweaking the site CSS or JavaScript (e.g. to comply with dark mode), adding gadgets (e.g. importing Cat-a-Lot, which I would like to do), or otherwise modifying the site that could plausibly come up on an irregular basis, and requiring the overhead of a bureaucrat to add the user rights each time is inefficient. In particular, I am also going to request this right if the community accepts indefinite interface admins. Thoughts? —Justin (koavf)❤T☮C☺M☯ 23:23, 17 August 2025 (UTC)
- And who will then monitor them to make sure they don't damage the project in any way, or abuse the rights acquired this way? For large projects, this might not be a problem, but for smaller projects like the English Wikiversity, I'm not sure there are enough users who would notice that something is happening that shouldn't be. Juandev (discuss • contribs) 10:28, 20 August 2025 (UTC)
- Anyone would. This argument applies to any person with any advanced rights here. —Justin (koavf)❤T☮C☺M☯ 10:46, 20 August 2025 (UTC)
- I think it is reasonable to allow interface admin access for longer than two weeks, and I support adjusting the policy to allow for this flexibility. -- Jtneill - Talk - c 04:57, 2 December 2025 (UTC)
- @Koavf I agree that the two-week requirement could be revised, but wouldn’t people just request access for a specific purpose anyway? Instead of granting indefinite access, they should request the specific time frame they need the rights for—until the planned fixes are completed—and then request an extension if more time is required. We could remove the two-week criterion while still keeping the access explicitly temporary. PieWriter (discuss • contribs) 02:48, 25 January 2026 (UTC)
- I just don't see why this wiki needs to be different than all of the others. —Justin (koavf)❤T☮C☺M☯ 07:18, 25 January 2026 (UTC)
- There isn’t really much of a need for a permanent one at this point in time PieWriter (discuss • contribs) 09:53, 25 January 2026 (UTC)
- I quite agree with this proposal, so long as they perform the suggested changes as mentioned here. Codename Noreste (discuss • contribs) 04:06, 26 January 2026 (UTC)
Ambitious projects on Wikiversity
Greetings,
I have found a project that I am thinking of reviving, but I may need a bit of help and support from the community:
Would any contributors like to help or support me in these efforts? I might be able to make it a reality.
—RailwayEnthusiast2025 (Talk page - Contributions) 20:41, 4 September 2025 (UTC)
- Can you (or someone else reading this) make a list/page of ideas for what help activities you can think of?
- This makes it easier for willing people to then pick up tasks.
Thanks for the idea, @RailwayEnthusiast2025! --Erkan Yilmaz 10:25, 13 January 2026 (UTC)
Curators and curators policy
How come Wikiversity has curators, but the Curators policy is still only being proposed? How do curators exist and act if the policy about them hasn't been approved yet? Juandev (discuss • contribs) 18:33, 16 October 2025 (UTC)
- It looks as if it is not just curators. The policy on Bureaucratship is still being proposed as well. See Wikiversity:Bureaucratship. —RailwayEnthusiast2025 talk with me! 18:33, 27 October 2025 (UTC)
- I think it's just the nature of a small WMF sister project that there are lots of drafts, gaps, and potential improvements. In this case, the community would need to vote on those proposed Wikiversity staff policies if we think they're ready. -- Jtneill - Talk - c 02:08, 3 December 2025 (UTC)
- What? I thought you were getting it approved, Juandev... :) I'm Mr. Chris (discuss • contribs) 14:20, 12 February 2026 (UTC)
- Yeah, I think this one is important too and we need to approve it too, @I'm Mr. Chris. Juandev (discuss • contribs) 15:56, 12 February 2026 (UTC)
- I think it's ready to be made into a policy; it seems to be complete and informative about what the right does and how to get it. PieWriter (discuss • contribs) 03:08, 15 February 2026 (UTC)
Proposed addition to Wikiversity:Artificial intelligence
After going through the plethora of ChatGPT-generated pages made by Lbeaumont (with many more pages to go), I'd like community input on this proposed addition to Wikiversity:Artificial intelligence that I think would be beneficial for the community:
- Resources generated by AI must be indicated as such through the project box, Template:AI-generated, on either the page or the main resource (if the page is part of a project).
I do not believe including a small note/reference that a page is AI-generated is sufficient, and I take my cue from Wikiversity's policy on original research: "Within Wikiversity, all original research should be clearly identified as such." I believe resources created with AI should also be clearly identified, especially since we are still working out whether or not AI-generated resources should be allowed on the website (discussion is here, for reference). This makes things easier for organizational purposes, and in the event that we ban AI-generated work.
I left a message on Lee's talk page over a week ago and did not get a response or acknowledgement, so I'd like the community's input on this addition to the policy. —Atcovi (Talk - Contribs) 15:53, 26 January 2026 (UTC)
- I believe that existing Wikiversity policies are sufficient. Authors are responsible for the accuracy and usefulness of the content that is published. This policy covers AI-generated content that is: 1) carefully reviewed by the author publishing it, and 2) attributed to its source. Lbeaumont (discuss • contribs) 19:38, 27 January 2026 (UTC)
- A small reference on pages that are substantially filled with ChatGPT entries, like Real Good Religion, Attributing Blame, and Fostering Curiosity, is not sufficient IMO, and a project box would be the best indicator that a page is AI-generated (especially when there is a mixture of human-created content AND AI-generated content, as is present in a lot of your pages). This is useful, especially considering the notable issues with AI (including hallucinations and fabrication of details), so viewers and support staff are aware. These small notes left on the pages are not as easily viewable as a project box or banner would be. I really don't see the issue with a clear-label guideline. —Atcovi (Talk - Contribs) 22:34, 27 January 2026 (UTC)
- @Lbeaumont: I noticed your reversions here & here. I'd prefer to have a clean conversation regarding this proposition. Please voice your concerns here. —Atcovi (Talk - Contribs) 15:53, 28 January 2026 (UTC)
- Regarding Subjective Awareness, I distinctly recall the effort I went to in writing that the old-fashioned way. It is true that ChatGPT assisted me in augmenting the list of words suggested as candidate subjective states. This is a small section of the course, is clearly marked, and makes no factual claim. Marking the entire course as AI-generated is misleading. I would have made these comments when I reverted your edit; however, the revert button does not provide that opportunity.
- Regarding the Exploring Existential Concerns course, please note this was adapted from my EmotionalCompetency.com website, which predates the availability of LLMs. The course does include two links, clearly labeled as ChatGPT-generated. Again, marking the entire course as AI-generated is misleading.
- On a broader issue, I don't consider your opinions to have established a carefully debated and adopted Wikiversity policy. You went ahead and modified many of my courses over my clearly stated objections. Please let this issue play out more completely before editing my courses further. Thanks. Lbeaumont (discuss • contribs) 15:11, 29 January 2026 (UTC)
- Understood, and I respect your position. I apologize if my edits were seen as overreaching. We could change the project box to "a portion of this resource was generated by AI", or something along those lines. Feel free to revert my changes where you see fit, and I encourage more users to provide their input. EDIT: I've made changes to the template to indicate that a portion of the content has been generated from an LLM. —Atcovi (Talk - Contribs) 15:50, 29 January 2026 (UTC)
- Thanks for this reply. The new banner is unduly large and alarming. There is no need for alarm here. The use of AI is not harmful per se. Like any technology, it can be used to help or to harm. I take care to craft prompts carefully, to point the LLM to reliable source materials, and to carefully read and verify the generated text before I publish it. This is all in keeping with long-established Wikiversity policy. We don't want to use a one-drop rule here or cause a satanic panic. We can learn our lessons from history here. I don't see any pedagogical reason for establishing a classification of "AI generated", but if there is a consensus that it is needed, perhaps it can be handled as just another category that learning resources can be assigned to. I would rather focus on identifying any errors in factual claims than on casting pejorative bias toward AI-generated content. An essay on the best practices for using LLMs on Wikiversity would be welcome. Lbeaumont (discuss • contribs) 15:58, 30 January 2026 (UTC)
- The new banner mimics the banner that is available on the English Wikibooks (see b:Template:AI-generated & b:Template:Uses AI), so my revisions aren't unique in this aspect. At this point, I'd welcome other peoples' inputs. —Atcovi (Talk - Contribs) 19:40, 30 January 2026 (UTC)
Adopt the standard bot policy or only allow global bots?
I would like to introduce the following proposals related to bots:
- 1. We adopt the standard bot policy, which will include allowing global bots, as well as allowing automatic approval of certain types of bots. Other bots would still have to apply at Wikiversity:Bots/Status.
- Or 2. We opt in to global bots, but otherwise we will not utilize the standard bot policy. Regarding automatic approval, consensus should decide whether it should be allowed here or not.
You can choose only one proposal, or comment here. If there is consensus to implement one of these proposals, it should be ready in two weeks. Thoughts? Codename Noreste (discuss • contribs) 16:27, 26 January 2026 (UTC)
- Seems like a great idea. I lean slightly more towards the first proposal PieWriter (discuss • contribs) 08:04, 27 January 2026 (UTC)
- The first proposal, since getting a global standard would be best. Do you know anything about the Auto archive bot? Harold Foppele (discuss • contribs) 17:10, 3 February 2026 (UTC)
- @Harold Foppele An auto archive bot would require someone to code it and request that it be approved at WV:Bots/Status (a rough sketch of what such a bot could look like follows at the end of this thread). PieWriter (discuss • contribs) 07:27, 13 February 2026 (UTC)
Changes have been requested from the stewards. Codename Noreste (discuss • contribs) 19:33, 12 February 2026 (UTC)
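For anyone curious what an auto-archiving bot for this page could involve, here is a minimal, illustrative sketch using pywikibot. It is a sketch under stated assumptions, not a working proposal: the page titles, the 90-day threshold, the signature-timestamp parsing, and the level-2 section splitting are all placeholder assumptions; it presumes a pywikibot installation already configured (user-config.py) with a bot account on en.wikiversity; and any real bot would still need approval at Wikiversity:Bots/Status.

# Illustrative sketch only: a minimal Colloquium auto-archiver using pywikibot.
# Assumes pywikibot is installed and configured with a bot account on
# en.wikiversity. Titles, threshold, and parsing below are placeholders.
import re
from datetime import datetime, timedelta, timezone

import pywikibot

SOURCE_TITLE = 'Wikiversity:Colloquium'            # page to archive (assumption)
ARCHIVE_TITLE = 'Wikiversity:Colloquium/Archive'   # archive target (assumption)
MAX_AGE = timedelta(days=90)                       # archive threads idle longer than this

# Matches timestamps in standard signatures, e.g. "19:33, 12 February 2026 (UTC)".
SIGNATURE_TS = re.compile(r'(\d{2}:\d{2}, \d{1,2} \w+ \d{4}) \(UTC\)')


def last_activity(thread_text):
    """Return the most recent signature timestamp in a thread, or None."""
    stamps = []
    for match in SIGNATURE_TS.finditer(thread_text):
        try:
            stamps.append(
                datetime.strptime(match.group(1), '%H:%M, %d %B %Y')
                .replace(tzinfo=timezone.utc)
            )
        except ValueError:
            continue
    return max(stamps) if stamps else None


def main():
    site = pywikibot.Site('en', 'wikiversity')
    page = pywikibot.Page(site, SOURCE_TITLE)
    archive = pywikibot.Page(site, ARCHIVE_TITLE)

    # Split into level-2 threads; the chunk before the first "== " heading
    # is the page header and always stays in place.
    chunks = re.split(r'(?m)^(?=== )', page.text)
    header, threads = chunks[0], chunks[1:]

    now = datetime.now(timezone.utc)
    keep, move = [], []
    for thread in threads:
        stamp = last_activity(thread)
        (move if stamp and now - stamp > MAX_AGE else keep).append(thread)

    if move:
        archive.text = archive.text + '\n' + ''.join(move)
        archive.save(summary='Bot (sketch): archiving stale Colloquium threads')
        page.text = header + ''.join(keep)
        page.save(summary='Bot (sketch): archiving stale Colloquium threads')


if __name__ == '__main__':
    main()

A real deployment would also need safeguards this sketch omits, such as edit throttling, a kill switch, and respecting any per-thread "do not archive" markers the community agrees on.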
How do I start making pages?
Is there a notability guideline for Wikiversity? What is the sourcing policy for information? What is the Manual of Style? What kind of educational content qualifies for Wikiversity? All the introduction pages are a bit unclear. VidanaliK (discuss • contribs) 02:25, 28 January 2026 (UTC)
- @VidanaliK: Welcome to Wikiversity! I've left you a welcome message on your talk page. That should help you out. Make sure to especially look at Wikiversity:Introduction. —Atcovi (Talk - Contribs) 03:11, 28 January 2026 (UTC)
- It says that I can't post more pages because I have apparently exceeded the new page limit. How long does it take before that new page limit expires? VidanaliK (discuss • contribs) 16:57, 28 January 2026 (UTC)
- This is a restriction for new users so that Wikiversity is not hit with massive spam. As for when this limit will expire, it should be a few days or after a certain number of edits. It's easy to overcome, though I do not have the exact numbers atm. —Atcovi (Talk - Contribs) 15:08, 29 January 2026 (UTC)
- OK, I think I got past the limit. VidanaliK (discuss • contribs) 17:21, 29 January 2026 (UTC)
Why does it feel like Wikiversity is no longer really active anymore?
I've been looking at recent changes, and both today and yesterday there haven't been many changes other than the ones I've made; it feels like walking through a ghost town. Is this just me, or is Wikiversity not really active anymore? VidanaliK (discuss • contribs) 03:54, 30 January 2026 (UTC)
- There are fewer people editing these days compared to the past. Many newcomers tend to edit Wikipedia instead. PieWriter (discuss • contribs) 06:39, 30 January 2026 (UTC)
IMPORTANT: Admin activity review
Hello. A policy regarding the removal of "advanced rights" (administrator, bureaucrat, interface administrator, etc.) was adopted by global community consensus in 2013. According to this policy, the stewards are reviewing administrators' activity on all Wikimedia Foundation wikis with no inactivity policy. To the best of our knowledge, your wiki does not have a formal process for removing "advanced rights" from inactive accounts. This means that the stewards will take care of this according to the admin activity review.
We have determined that the following users meet the inactivity criteria (no edits and no logged actions for more than 2 years):
- User:MaintenanceBot (administrator)
These users will receive a notification soon, asking them to start a community discussion if they want to retain some or all of their rights. If the users do not respond, then their advanced rights will be removed by the stewards.
However, if you as a community would like to create your own activity review process superseding the global one, want to make another decision about these inactive rights holders, or already have a policy that we missed, then please notify the stewards on Meta-Wiki so that we know not to proceed with the rights review on your wiki. Thanks, EPIC (discuss • contribs) 17:32, 14 February 2026 (UTC)
- Seems like a request was made here PieWriter (discuss • contribs) 03:06, 15 February 2026 (UTC)
Inactivity policy for Curators
I was wondering if there is a specific inactivity policy for curators (semi-admins), as I am pretty sure the global policy does not apply to them since they are not full sysops. PieWriter (discuss • contribs) 03:20, 15 February 2026 (UTC)
- Unfortunately, I don't see an inactivity policy, but if we were to create such a new policy for curators, it should be the same for custodians (administrators). Codename Noreste (discuss • contribs) 18:45, 15 February 2026 (UTC)


