Copilot stops working on gender related subjects
#72603
Replies: 38 comments · 23 replies
This is ridiculous. I literally have all the genders written out in the same file, so there isn't even any 'assumption' involved, and it still refuses to work. It is infuriating.
Good luck debugging! Sometimes you can add an x_ at the front or something to get around bugs like that, or basically create codewords/encryptions.
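For what it's worth, here is a minimal sketch of that renaming trick in Python, assuming the filter keys off literal identifiers; the x_ prefix and the alias map are purely illustrative, not any documented Copilot behavior:

```python
# Sketch of the workaround described above: keep the trigger word out of the
# identifiers you type, and restore the real names at one boundary.
# "x_gender", "x_sex", and FIELD_ALIASES are invented names for illustration.

FIELD_ALIASES = {"x_gender": "gender", "x_sex": "sex"}

def to_payload(record: dict) -> dict:
    """Map aliased keys back to their real names before storage or export."""
    return {FIELD_ALIASES.get(key, key): value for key, value in record.items()}

patient = {"name": "Alex", "x_gender": "nonbinary", "x_sex": "F"}
print(to_payload(patient))  # {'name': 'Alex', 'gender': 'nonbinary', 'sex': 'F'}
```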
Wow, that's awful. Devs, please fix this!
Also frustrated by this issue. I work in the fashion industry, where demographic classifications like age and gender are integral to the domain model and entirely apolitical. This idea of "banned words" is essentially biased - an American bias. It has long been established that ban-list content filtering is terrible. This artifact of the propagandized American political environment isn't necessary. Funding schools and improving public literacy is a better way to keep AI-generated propaganda in its place than naive measures like this...
This is idiotic. Technology should not be mixed with ideologies. We don't care about this type of thing when we are working, and we shouldn't have to!
This behavior is still present today, and it seems like it's not going away anytime soon. So frustrating, and completely unnecessary.
It also stops with […]
I'm working on a project that has fields defined for Fish Biology. This has been an interesting thread to find.
WOW
I am bumping this discussion just to add a voice - the hospital I work at is attempting to harmonize how old and new systems store things like sex and gender identity. This is in an effort to model the social complexities of the topic in our database, as health outcomes are demonstrably better when doctors honor a patient's preferred name and gender expression. I am completely unable to get Copilot to assist me in this task. I can, and will, work around the issue - but I must pose this question: if a rule prevents an engineer from improving patient outcomes, is that an acceptable sacrifice in order to curb misuse?
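For readers curious what that kind of harmonization can look like, here is a hedged sketch in Python. The field names and the legacy mapping are hypothetical, not the commenter's actual hospital schema; real clinical systems typically follow standards such as HL7 FHIR, which model these concepts in far more detail:

```python
from dataclasses import dataclass
from enum import Enum

# Illustrative enum only; real systems distinguish assay results, recorded
# values, and unknowns with more nuance than shown here.
class SexAssignedAtBirth(Enum):
    FEMALE = "female"
    MALE = "male"
    UNKNOWN = "unknown"

@dataclass
class PatientDemographics:
    legal_name: str
    preferred_name: str                      # honored in patient-facing use
    sex_assigned_at_birth: SexAssignedAtBirth
    gender_identity: str                     # self-described, so free text
    pronouns: str

def legacy_to_harmonized(row: dict) -> PatientDemographics:
    """Map a hypothetical legacy record (single 'sex' column) onto the new model."""
    sex = SexAssignedAtBirth(row.get("sex", "unknown"))
    return PatientDemographics(
        legal_name=row["name"],
        preferred_name=row.get("preferred_name", row["name"]),
        sex_assigned_at_birth=sex,
        gender_identity=row.get("gender_identity", ""),
        pronouns=row.get("pronouns", ""),
    )

legacy_row = {"name": "J. Doe", "sex": "female", "pronouns": "she/her"}
print(legacy_to_harmonized(legacy_row))
```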
Really! I thought I was the one who had a problem with auto-complete!! 😅 (attached: Copilot.stops.working.on.gender.related.subjects.mp4)
I thought AI would be the Terminator timeline... but it seems we are in some sort of stand-up comedy AI timeline.
If this is a way to reliably turn off any AI "help", I'd consider this a feature, not a bug. Figuring out how to add enough spiciness to turn off Copilot, yet not get fired from your job, may be difficult, though.
Not able to replicate this with PowerShell, with either the GPT or Claude models. I created a Gender variable and asked Copilot to fill in sample data. Is it language-specific?
I work at a DNA forensics lab writing software to track the cases and evidence we analyze. It is reasonable to expect me to interact with these words, and when we say sex we are talking about the sex chromosomes. At least now I know why I sometimes don't get completions, but a setting would be better...
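To illustrate the commenter's point that sex here is a chromosomal, assay-level concept, a small hypothetical sketch; the identifiers and case number are invented for illustration:

```python
from enum import Enum

# Illustrative only: in forensic genetics, "sex" typically refers to the
# sex-chromosome result of a typing assay (e.g. the amelogenin marker),
# not to identity or demographics.
class ChromosomalSex(Enum):
    XX = "XX"
    XY = "XY"
    INCONCLUSIVE = "inconclusive"

evidence_profile = {
    "case_id": "2024-0117",                  # hypothetical identifier
    "sex_chromosomes": ChromosomalSex.XY,    # assay result, not demographics
}
print(evidence_profile)
```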
I just found out that it doesn't autocomplete the word "immigration" either. The banned words are weird. It completes to "Imm" first; then, after typing that part, it completes the rest of the word, "igration". It consistently does this in both Dart files and .arb files.
This is ridiculous. It stops working on […]
February 2025: using Copilot completion (in Neovim, doing Python programming), and Copilot simply refuses to autocomplete gender =
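Here is a minimal file one could use to try to reproduce that report; it assumes Copilot inline completions are enabled in the editor, and the variable contents are invented:

```python
# Hypothetical reproduction file: open in an editor with Copilot inline
# completions and retype the two assignments below. The thread reports that
# ghost text stalls on the "gender" line, while a prefixed name completes.

person = {"name": "Sam", "age": 34}

gender = "unknown"   # reportedly no inline suggestion is offered here
x_gender = "female"  # reportedly completes normally once renamed
```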
Artificial Stupidity
I have to write each one of these "trans" (short for "translation") methods by hand because Copilot refuses to autocomplete them. This is beyond ridiculous.
Simple solution: stop giving them money and use a competitor - that'll teach them a lesson. The only realistic way you can actually force the change and get the bug fixed is to go elsewhere, until they are forced to fix it.
Thumbs down for this, GitHub.
Any news on when this issue will be resolved?
Hi everyone,

Thank you for continuing to share your feedback and report issues — it really helps us improve. 🙏

As part of our ongoing efforts to enhance the experience around content filtering and responsible AI, we've put together a new resource:

📘 GitHub Copilot Responsible AI: Frequently Asked Questions

This FAQ addresses common questions and provides guidance on how we're approaching responsible AI with Copilot. We'd love for you to continue sharing your experiences. If you're encountering issues or have suggestions, please use the information and links in the FAQ to help us support you better.

Regards,
Hi @Akash1134,

Let me be honest — the way content filtering is handled in GitHub Copilot is infuriating. You're blocking developers from writing perfectly valid code in 2025 — while the entire internet remains wide open and uncensored. What's the point? Do you really think someone with bad intent needs Copilot to cause harm?

Blocking the word gender, for example, halts completions mid-thought. In what world is that helping developers? This isn't protecting anyone — it's breaking the tool for everyone else.

You say you're working on responsible AI. But this isn't responsibility — it's overreach. It's treating everyone like a potential threat instead of trusting professionals to use a tool as intended: to build, learn, and move fast. It's honestly disappointing to see this much time and engineering effort being spent on censorship, while so many real issues and improvements are left untouched.

And if you insist on doubling down on this path, let's be clear — it's not going to create some "Responsible AI workspace." What it will do is push developers away. Just like I left Copilot for better alternatives. You're not fostering safety — you're alienating the very people who made Copilot successful in the first place.

It's honestly irritating that something as ridiculous as "Responsible AI" is even being used to justify this. We're not trying to misuse your tool. We're just trying to code. Let us code.

Note: This message was drafted with the help of ChatGPT (a responsible AI) to help keep my frustration in check and avoid going too far in expressing just how absurd and counterproductive this situation really is.

Note 2: You can't make a weapon "responsible." Responsibility lies with the one who uses it.
I'm sorry to be so blunt, but please: I'm not sure if it's your PR department, some Christian Ethics Committee, or your bad conscience, but what exactly is the point of these idiotic decisions? Why implement censorship in Copilot and hinder even minor parts of its functionality? What responsible AI are we talking about? Can you stop wrapping everything in stupid protective plastic wrap?

Copilot Is Not A Chat Bot

The developer isn't going to "trick" it into saying s*!t or fv*k or revealing the recipe for MDMA, and then write an article about how their innocence was taken away by evil Copilot. It's a text completion tool... I'm the one guiding it wherever I want it to go. If I guide Copilot towards autocompleting something, I probably have a reason, and that's that; I'm not interested in any kind of moral hand-holding. And if I'm still in elementary school mentally and my reason is that I want to trick Copilot into autocompleting a bad word, no one's going to take me seriously if I start crying online that I guided my autocompletion bot to write a bad word by... writing the first letters of the bad word.

I'm not really sure what imaginary PR catastrophe you're trying to avoid, but by trying to avoid some unlikely fringe case, you're actively making your product worse for some of your paying customers. Honestly, I don't understand the decisions you've been making for Copilot lately; they're so baffling, and your justifications for them feel at best like completely hollow PR statements, and at worst like sad attempts at gaslighting.

I've had a lot of fun with Copilot. I've been supporting it financially since before day 1 (I was in the beta) and defending it whenever it was being unjustly attacked for being the "mainstream" option... and now I'm at the point where I'm trying to move as far away from it as possible, seriously considering which of the alternatives to choose, and I've even accepted the fact that I'll need to pay more each month.
2026, and this major bug is still ongoing. Please do something about it. I manage multiple projects where a patient's gender is meaningful data. USA, please stop invading the rest of the world with your stupid opinions. We're dealing with health/governmental/etc. data here, not your bulls*t ideologies. We use variable names and properties like gender, sexe, trans (for translation or transaction), dead, cause_suicide, crime names, etc. That's our JOB, AND WE'RE PAYING FOR YOUR AI TO WORK PROPERLY.
As some people have already mentioned here or here, Copilot purposely stops working on code containing words hardcoded as banned by GitHub, such as gender or sex. I am labelling this as a bug because this behavior is unexpected and undocumented.

I guess you might be embarrassed by what your AI says when it autocompletes gender-related topics. However, disabling autocompletion is not a great solution, since gender can be very present in many kinds of files and domains, and Copilot shutting down on most of those files is disappointing and pretty annoying.

Your competitors don't have this problem, which means solutions exist. I hope this will be taken into account someday.