Copilot purposely stops working on code that contains words from GitHub's hardcoded ban list, such as gender or sex. And if you prefix your transactional data with trans_, Copilot will refuse to help you. 😑
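To make that concrete, here is a minimal sketch of the kind of code in question (the field and function names are made up for illustration): perfectly ordinary transactional data modeling, where the trans_ prefix alone is reportedly enough to trip the filter.

```python
# Hypothetical example of ordinary transactional code. The identifiers below
# are illustrative, but the report is that trans_-prefixed names like these
# are enough to make Copilot stop offering completions.
from dataclasses import dataclass
from datetime import datetime


@dataclass
class Transaction:
    trans_id: str          # unique transaction identifier
    trans_amount: float    # amount charged in the transaction
    trans_date: datetime   # when the transaction was posted


def total_trans_amount(transactions: list[Transaction]) -> float:
    """Sum the amounts of a list of transactions."""
    return sum(t.trans_amount for t in transactions)
```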
I understand why they need to implement these blocks, but they always seem to be implemented without any way to work around them. I hit a similar breakage using Cody (another AI assistant), which made a couple of my repositories unusable with it. https://jackson.dev/post/cody-hates-reset/
Because AI is a black box, there will always be a “jailbreak” unless a hardcoded filter is applied after the fact.
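For what “a hardcoded filter applied after the fact” might look like, here is a minimal sketch (the blocklist contents and function names are assumptions for illustration, not GitHub's actual implementation): the generated completion is checked against a fixed list of substrings and discarded on a match, regardless of what the model produced.

```python
# Sketch of a post-hoc blocklist filter: scan the model's output against a
# fixed list and suppress it on a match. The list and names are hypothetical.
BLOCKED_SUBSTRINGS = {"gender", "sex", "trans_"}  # illustrative only


def passes_output_filter(completion: str) -> bool:
    """Return True if the generated completion contains no blocked substring."""
    lowered = completion.lower()
    return not any(term in lowered for term in BLOCKED_SUBSTRINGS)


def filter_completion(completion: str) -> str | None:
    """Drop the completion entirely if it trips the blocklist."""
    return completion if passes_output_filter(completion) else None
```

The trade-off is exactly the one being complained about: a filter this blunt is easy to enforce, but it cannot distinguish trans_amount from anything genuinely objectionable, so legitimate code gets blocked with no workaround.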