All fun and games until it happens with nazism instead of slime.
Unless it’s how to get your 100 Nazi scalps.
Ew gross.
I’m not going to keep the scalps of any Nazi I kill while defending my home and loved ones.
I’ll just use pen and paper to keep track.
(I’m not bothered by your comment at all, but am attempting to humorously “yes, and” with it.
I am attempting a humorous misdirect where the reader thinks I’m disgusted by threatening to kill Nazis, but then I’m actually just offended by inefficient, messy ways of keeping track of any killed Nazis.)
Yeah, I like a clean, minimalist space. No plants, not many picture frames. Definitely would need an excel spreadsheet lol
The DuoScalpy streak
It’s not getting updated anymore, unfortunately, but this is a cool webpage to get a feel for that: http://www.their.tube/
The phrase “ok slimers” legitimately made me laugh out loud
I’m certain my YouTube feed is trying to radicalize me into some kind of culture warrior. It’s really annoying. I deleted all of my watch history to try and reset it and it just got way worse real quick. I watch one stupid video, now all I see are angry tubers upset that people don’t think exactly like they do and enjoy things they don’t. Then they convince themselves they’re more enlightened than anyone else because they make this content and ban anyone who makes fun of them, all while claiming to be “free speech advocates” of course.
YouTube got bad so fast it’s left my head spinning.
Have you tried clicking the 3 dots on these outrage videos and selecting “don’t recommend channel” or a mix of that and “not interested?” I started to see a bunch of right wing political trash in my feed a while back since a lot of my watched videos could be considered adjacent (cars/trucks/offroading/home improvement/dash cam vids/etc) to what these people like and I haven’t really had this issue again.
It’s wild that right wingers are always complaining about big tech censoring them when YouTube and Facebook are pushing far-right content so much
> It’s wild that right wingers are always complaining about big tech censoring them when YouTube and Facebook are pushing far-right content so much
I’ve got a conspiracy theory about this:
- Everyone likes kittens.
- Some of us who like kittens think about how to act decently to each other, some of the time.
Leading to:
- Right wingers who like kittens will sometimes see something “woke” in their algorithm feed, and they feel attacked.
They still think that YouTube and Facebook are representative of the average person. They don’t understand how incredibly curated those feeds are. I think that’s where some of the “silent majority” mythos comes from. Everything they see is people agreeing with them, therefore it’s impossible that Joe Biden got more votes in 2020.
I started with a clean profile: I never log in to YT so it’s just using a local cookie you can always clear to start over.
Anyways, I just searched a few sciencey things to feed the algorithm and now I’m getting loads of crazy fake “science” and conspiracies and the rest is all extremist right wing bullshit.
YouTube is getting useless.
All I get recommended are craft and history videos.
deleted by creator
If you want it to just not recommend things, you might prefer switching to an RSS feed, or to something like NewPipe.
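For what it’s worth, YouTube still publishes a plain Atom feed for every channel, so a feed reader gets new uploads with zero recommendations. A minimal sketch of building that URL (the channel ID shown is a placeholder, not a real channel):

```python
# YouTube exposes an Atom feed per channel at a fixed URL, keyed by the
# channel ID (the "UC..." string in the channel's URL). Point any RSS
# reader at this and you get uploads only, no recommendation algorithm.

def channel_feed_url(channel_id: str) -> str:
    """Build the Atom feed URL for a given YouTube channel ID."""
    return f"https://www.youtube.com/feeds/videos.xml?channel_id={channel_id}"

# Placeholder channel ID, not a real channel:
print(channel_feed_url("UCxxxxxxxxxxxxxxxxxxxxxx"))
```

Drop the resulting URL into any feed reader and you’re subscribed without ever touching the homepage.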
The only reference I have for this is someone I knew who rubbed said slime on herself for YouTube when she was 17 to build a following for when she turned 18 and started camming.
That’s fucked up
Things to get into this week:
- newpipe
- freetube
Then you decide where to go from there.
deleted by creator
And that is why I only open videos about topics I am only mildly interested in, or from controversial channels, in incognito mode, even though I actually pay for ad-free YouTube Premium.
In my defense, that is the only streaming service I pay for.
TFW you forget to wear protection when clicking on a weird video and you permanently scar your algorithm. You try to heal it, but days or weeks later, you are showing your boss a video on marine grade industrial sealant and Chappell Roan Pink Pony Club shows up in your recommended videos and you have to lie and say you have no idea what it is. When he is gone, you play it again.
> clicking on a weird video and you permanently scar your algorithm.
It’s trivial to delete individual videos from your watch history, even moreso if you just saw it. Doing so makes it as if you never clicked on it in the first place.
TIL
At least premium has the benefit of paying creators more for your watch time, which is nice.
Tags: Slime Girl
As recent advances in AI have shown, humans are really quite predictable when you throw enough data and compute at the problem. At some point the algorithm will be sophisticated enough to know you better than you know yourself, and will provide you with things you had no idea you really wanted.
Interesting times.
Isn’t this true for many years now?
Yes, but recent advances have really rubbed it in our faces in ways that are a lot harder to deny. Humans haven’t become fundamentally more or less predictable over time; we’ve just been shown how predictable we always were.
Yep. I learned from an algorithm that I might enjoy music by “The Beatles”. The algorithm was quite correct, but I think my having simple tastes, and the Beatles having amazing music is due most of the credit.
I had this exact experience with music algorithm recommendations:
The algorithm analyzed all the songs I asked it to play, and concluded (correctly) that I might enjoy listening to the Beatles. (True story.)
(Now a bit of sarcasm:) I look forward to future insights, in other art forms, such as perhaps the writings of Shakespeare or the paintings of Leonardo Da Vinci.
Yeah, doubtful. I think it finds something you’ll engage with and pushes on it over and over again until you get normalized to it.
I think it’s more like cold reading from a psychic. It’s gonna use generalized data about the big identifiers for you, like age and gender, and as you respond, adjust its answers based on what you gave it.
That’s not new or magical in any way. And it can be really wrong about the broad stuff if you don’t fit in with generic identifying groups related to you.
It really just feels like a sales pitch for the middle class to buy more stuff.
That is not what happened.
Humans aren’t static. You don’t actually have these secret hidden likes AI can discover; instead, you grow to like the stuff that becomes familiar. You’re being trained.
Problem is that none of the algorithms actually care about showing you things you like.
Ads try to sell you on things that you wouldn’t otherwise buy. Occasionally, they may just inform you about a good product you simply didn’t know about, but there’s more money in manipulating you into buying bad products that have a brand behind them.
And content recommendation algorithms don’t care about you either. They care about keeping you on the platform for longer, to look at more ads.
To some degree, that may mean showing you things you like. But it also means showing you things that aggravate you, that shock you. And the latter is considered more effective at keeping users engaged.