cross-posted from: https://lemmy.ml/post/15741608
They offer a thing they’re calling an “opt-out.”
The opt-out (a) is only available to companies that are Slack customers, not end users, and (b) doesn't actually opt you out.
When a company account holder tries to opt out, Slack says their data will still be used to train LLMs, but the results won't be shared with other companies.
LOL no. That's not an opt-out. The way to opt out is to stop using Slack.
https://slack.com/intl/en-gb/trust/data-management/privacy-principles
Customers own their own Customer Data.
Okay, that’s good.
Immediately after that:
Slack […] will never identify any of our customers or individuals as the source of any of these improvements to any third party, other than to Slack’s affiliates or sub-processors.
You’d hope the owner would get a say in that.
So you sign up to confirm that your IP is yours, while simultaneously agreeing to sell it off, but the source will be anonymous except to whoever it's sold to or anyone else Slack decides they want to know. These tech contracts and TOS should just say "we will (try to) not do bad, but you agree to let us do bad, and if bad happens it's your fault".
Instead of working on their platform to get Discord users to jump ship, they decide to go in the same direction. Also pretty sure training LLMs after someone opts out is illegal?
Also pretty sure training LLMs after someone opts out is illegal?
Why? There have been a couple of lawsuits launched in various jurisdictions claiming LLM training is a copyright violation, but IMO they're pretty weak and none of them have reached a conclusion. The "opting" status of the writer doesn't seem relevant if copyright doesn't apply in the first place.
but IMO they’re pretty weak
Well, thankfully, it’s not up to you.
Nor is it up to you. But the fact remains: it's not illegal until there are actually laws against it. The court cases that might determine whether current laws already cover it are still ongoing.
If copyrights apply, only you and Slack own the data. You can opt out, but 99% of users don't. No users get any money. Google or Microsoft buys Slack so only they can use the data. We only get subscription-based AI, and open source dies.
If copyrights don't apply, everyone owns the data. The users still don't get any money, but they get free, open-source AI built off their work instead of closed-source AI built off their work.
Having the website hold copyright over the content, in the context of AI training, would be a fucking disaster.
- It’s not illegal. 2. “Law” isn’t a real thing in an oligarchy, except insofar as it can be used by those with capital and resources to oppress and subjugate those they consider their lessors and to further perpetuate the system for self gain
fuck's sake
These techbros are some grade-A geniuses. Goodness, training LLMs on private data from Slack!? One of the most popular corporate messaging apps? Surely only good things can come from feeding corporate secrets into a braindead auto-complete model that can leak its oversized brain if you prod it just right.
Of course, no private data has ever been leaked from an LLM, right?
Is there a self-hosted Slack-like that has a mobile app?
Is there a self-hosted Slack-like?
Is there self-hosting?
Is there self?
Rocket.Chat, I think, checks those boxes
Subscription to self host? lol. Am I missing something?
Your wallet?
I'll pay a flat fee, but not a per-user fee.
A subscription for software is not mutually exclusive with self-hosting. Developers deserve to earn money, especially those who don't rely on collecting data, showing ads and enshittifying their cloud platform.
I’ll pay a flat fee.
I’m self hosting rocket and I never gave them a dime
I have 400 users though.
Sounds like rocket won’t be a good fit for you then.
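For anyone weighing Rocket.Chat: a self-hosted instance exposes its REST API out of the box, so basic bots and automation don't need a paid plan. A minimal sketch in Python, assuming a hypothetical server at rocketchat.example.com and a bot account you created yourself (all values are placeholders, not a definitive setup):

```python
import requests

# Hypothetical self-hosted Rocket.Chat server and bot credentials (placeholders only).
BASE = "https://rocketchat.example.com"

# Log in via the REST API to get an auth token and user id.
login = requests.post(
    f"{BASE}/api/v1/login",
    json={"user": "bot-account", "password": "change-me"},
)
login.raise_for_status()
data = login.json()["data"]

headers = {"X-Auth-Token": data["authToken"], "X-User-Id": data["userId"]}

# Post a message into a channel on our own server.
requests.post(
    f"{BASE}/api/v1/chat.postMessage",
    headers=headers,
    json={"channel": "#general", "text": "Hello from a server we actually control."},
).raise_for_status()
```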
- Nextcloud Talk
- Mattermost
- Matrix
Maybe Matrix
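If Matrix is the route: the client-server API is plain HTTP, so scripting against your own homeserver is straightforward. A rough sketch, assuming a hypothetical homeserver at matrix.example.com, an access token from your client, and an internal room id (all placeholders):

```python
import time
import requests

# Hypothetical self-hosted homeserver (e.g. Synapse); token and room id are placeholders.
HOMESERVER = "https://matrix.example.com"
ACCESS_TOKEN = "REPLACE_ME"        # copy from your client's settings or a login call
ROOM_ID = "!abcdef:example.com"    # the internal room id, not the human-readable #alias

# Each message needs a client-generated transaction id that is unique per sender.
txn_id = str(int(time.time() * 1000))

# Send a plain-text message via the Matrix client-server API.
resp = requests.put(
    f"{HOMESERVER}/_matrix/client/v3/rooms/{ROOM_ID}/send/m.room.message/{txn_id}",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json={"msgtype": "m.text", "body": "Hello from a homeserver we run ourselves."},
)
resp.raise_for_status()
print(resp.json()["event_id"])  # event id of the message we just sent
```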
I use Slack at work every day. I suppose this does feel off in some way, but I'm not sure I'm the right amount of upset about this? I don't really mind if they use my data if it improves my user experience, as long as the platform doesn't reveal anything sensitive or personal in a way that can be traced back to me.
Slack already allows your admin to view all of your conversations, which is more alarming to me.
The problem is where you said “as long as” because we already know companies AND the AI itself can’t be trusted to not expose sensitive info inadvertently. At absolute best, it’s another vector to be breached.
It's obvious when you say it like that. I don't like the idea of some prompt hacker looking at memes I sent to my coworker.
what a shitshow. when is this gonna be made illegal?
Mattermost, anyone?
An issue with Mattermost is that some useful features are behind a paywall, like group calls.
I'd go with Nextcloud, since all of its features are included whether you pay or not.
Ain't everything open-source?
It's used by NASA and the US Air Force, so security-wise it's probably pretty solid.
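On the Mattermost paywall point: the basic plumbing like incoming webhooks is available in the free, self-hosted edition, so simple notifications don't need a license. A minimal sketch, assuming a hypothetical webhook you created on your own instance (URL is a placeholder):

```python
import requests

# Hypothetical incoming webhook on a self-hosted Mattermost instance (placeholder URL).
WEBHOOK_URL = "https://mattermost.example.com/hooks/REPLACE_WITH_HOOK_ID"

# Incoming webhooks accept a simple JSON payload; "text" supports Markdown.
resp = requests.post(WEBHOOK_URL, json={"text": "Deploy finished :tada:"})
resp.raise_for_status()
```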
Remember when every platform renamed PMs to DMs, and everyone who pointed out that they were trying to remove the expectation of privacy was called "paranoid"?
I don’t, but it sounds exactly like every other time this exact thing happens.
Well, that’s just great. Another reason for corporations to force everyone into the miserable, counterproductive dumpster fire that is MS Teams.
Is MS Teams the only viable alternative?
At my old job we primarily used Rocket.Chat. I miss it.
Also
- Nextcloud Talk
- Self-hosted Matrix server
No, but it’s the only one that big corporations know/care about… primarily because of MS’s aggressive “look, it’s basically free and TOTALLY the same thing, we promise” marketing strategy.
The customer is never just a customer… even if you are paying. You are always the product now. Always.