  • Well, yeah, that’s what I’m talking about here, specifically. There was an application of technology that bypassed regulations put in place to manage a previous iteration of that technology, and there was a period of lawlessness that then needed new regulation. The solutions differed from place to place: some banned the practice, some classified the workers as employees, some as contractors, some wrote custom legislation.

    But ultimately the new framework needed regulation just like the old one did. The notion that the old version was inherently more protected is an illusion created by the fact that we were born after common-sense guardrails were built for that version of things.

    AI is the same. It changes some things, we’re gonna need new tools to deal with the things it changes. Not because it’s worse, but because it’s the same thing in a new wrapper.


  • Every industrial transition generates that, though. Forget the Industrial Revolution (these people love being compared to that); think of the first transition to data-driven businesses, or the gig economy. Yeah, there’s a chunk of people caught in the middle who struggle to shift to the new model in time. That’s why you need strong safety nets to help people transition to new industries, or at least to give them a dignified retirement out of the workforce. That’s neither here nor there; if it’s not AI it’ll be the next thing.

    About the linear increase path, that reasoning is the same old Moore’s law trap. Every line going up keeps going up if you keep drawing it with the same slope forever. In nature and economics lines going up tend to flatten again at some point. The uncertainty is whether this line flattens out at “passable chatbots you can’t really trust” or it goes to the next step after that. Given what is out there about the pace of improvement and so on, I’d say we’re probably close to progress becoming incremental, but I don’t think anybody knows for sure yet.
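
    To make the “lines flatten” point concrete: the textbook shape for this kind of growth is a logistic curve, not an exponential. Purely as an illustration (not a model of AI progress):

    $$ f(t) = \frac{L}{1 + e^{-k(t - t_0)}} $$

    Early on that’s indistinguishable from exponential growth, but it saturates at the ceiling L. The real disagreement is about where that ceiling sits, not about the early slope.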

    And to be perfectly clear, this is not the same as saying that all tech disruption is good. Honestly, I don’t think tech disruption has any morality of any kind. Tech is tech. It defines a framework for enterprise, labor and economics. Every framework needs regulation and support to make it work acceptably because every framework has inequalities and misbehaviors. You can’t regulate data capitalism the way you did commodities capitalism and that needed a different framework than agrarian societies and so on. Genies don’t get put back in bottles, you just learn to regulate and manage the world they leave behind when they come out. And, if you catch it soon enough, maybe you get to it in time to ask for one wish that isn’t just some rich guy’s wet dream.


  • OK, so one caveat and one outright disagreement there.

    The caveat is that she herself points out that nobody knows whether the jobs created will outnumber the jobs destroyed, or perhaps just break even and result in higher-quality jobs. She points out there is no rigorous research on this, and she’s not wrong. There’s mostly either panic or giddy, greedy excitement.

    The disagreement is that no, AI won’t destroy the jobs it’s learning from. Absolutely no way. It’s nowhere near good enough for that. Weirdly, Murati is way more realistic about this than the average critic, who seems to have bought into the average techbro’s hype almost completely.

    Murati’s point is you can only replace jobs that are entirely repetitive. You can perhaps retopologize a mesh, code a loop, marginally improve on the current customer service bots.

    The moment there is a decision to be made, an aesthetic choice, or a bit of nuance, you need a human. We have no proof that you won’t need a human, or that AI will get better and fill that gap. Technology doesn’t scale linearly.

    Now, I concede that only applies if you want the quality of the product to stay consistent. We’ve all seen places where they don’t give a crap about that, so listicle peddlers now have one guy proofreading reams of AI-generated garbage. And we’ve all noticed how bad that output is. And you’re not wrong that the poor guy churning those out before AI did need that paycheck and will need a new job. But if anything that’s a good argument for consuming media that is… you know, good? From that perspective I almost see the “that job shouldn’t have existed” point, honestly.


  • I think there’s plenty of rightful criticism of the things she actually says, and plenty of things she says that I wouldn’t take at face value because they’re effectively token corporate actions meant to dismiss genuine concerns.

    She actually gets asked in the Q&A about the IP rights of creators included in training data, and she talks about some ideas to calculate contributions from people and compensate for them, but it’s all clearly not a priority and not a full solution. I’m not gonna get into my personal proposals for any of that, but I certainly don’t think they’re thinking about it the right way.

    Also, if you REALLY want a chilling thing she says, go find the part where she says they may eventually allow people to customize the moral and political views of their chatbots on top of a standard framework, and she specifically mentions allowing churches to do that. That may be the most actually dystopian concept I’ve heard come out of this corner of the techbrosphere so far, even with all the caveats about locking down a common baseline of values she mentions.



  • Thanks. I hate deliberately out-of-context quotes. Watching the entire interview is actually very interesting. Lots to agree and disagree with here without having to… you know, make things up.

    On the jobs situation she later mentions that “the weight of how many jobs are created, how many jobs are changed, how many jobs are destroyed, I don’t know. I don’t think anybody knows(…), because it’s not been rigorously studied, and I really think it should be”. That also comes after a comment about how jobs that are “entirely repetitive” (she repeats that multiple times) may be removed, but she clarifies that she means jobs where the human element “isn’t advancing anything”, which I think puts the creative jobs quote in context as well. I like how the interviewer immediately goes to “maybe we can cut QA” and you can see in her face that she goes “yeah, no, I’m gonna need those” before going for a compromise answer.

    I don’t agree with the perspective she puts forward about how the tools are used. I think she’s being disingenuous about the long-term impact, and especially about the regulations and what they do to their competitors. But latching onto this out of context is missing the point.


  • I guess that depends on the use case and how frequently both machines are running simultaneously. Like I said, that reasoning makes a lot of sense if you have a bunch of users coming and going, but the OP is saying it’s two instances at most, so… I don’t know if the math makes virtualization more efficient. It’d probably be more efficient by the dollar, if the server is constantly rendering something in the background and you’re only sapping whatever performance you need to run games when you’re playing.
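
    Just to put toy numbers on “the math” (everything here is hypothetical, purely to illustrate the utilization argument for pooling):

    ```python
    # Toy model: how often do two users actually contend for one shared box?
    # All numbers are made up for illustration; adjust for your own schedules.
    p_user1_gaming = 0.20  # fraction of the day user 1 wants the GPUs
    p_user2_gaming = 0.20  # fraction of the day user 2 wants the GPUs

    # If the two schedules are independent, both users demand peak
    # performance at the same time only rarely:
    p_contention = p_user1_gaming * p_user2_gaming
    print(f"Both active simultaneously: {p_contention:.0%} of the day")  # -> 4%
    ```

    Low contention is what makes pooling attractive, but with only two users the absolute savings are small, which is exactly why I’m not sure the math works out here.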

    But the physical space thing is debatable, I think. This sounds like a chonker of a setup either way, and nothing is keeping you from stacking or rack-mounting two PCs. Plus if that’s the concern you can go with very space-efficient alternatives, including gaming laptops. I’ve done that before for that reason.

    I suppose that’s why PC building as a hobbyist is fun: there are a lot of balance points, and you can tweak a lot of knobs to trade off price/performance/power consumption/whatever else.


  • OK, yeah, that makes sense. And it IS pretty unique to have a multi-GPU system available at home but just idling when not at work. I think I’d still try to build a standalone second machine for that second user, though. You can then focus on making the big boy accessible from wherever you want to use it for gaming, which seems like a much more manageable, much less finicky challenge. That second computer would probably end up being relatively inexpensive, since it only has to match the average use case of one half of the big server. Definitely much less of a hassle. I’ve even had a gaming laptop serve that kind of purpose, just because I needed a portable workstation with a GPU anyway, so it could double as a desktop replacement for gaming with someone else at home; but of course that depends on your needs.

    And in that scenario you could also just run all that LLM/SD stuff in the background and make it accessible across your network; I think that’s pretty trivial whether it’s inside a VM or running as a background process in the same environment as everything else. Trivial compared to a fully virtualized gaming computer sharing a pool of GPUs, anyway.
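
    For what it’s worth, “accessible across your network” can be as simple as pointing clients at the server’s LAN address. Here’s a minimal client-side sketch, assuming the box exposes an OpenAI-compatible endpoint (local servers like llama.cpp’s server or Ollama can do that); the IP, port and model name are placeholders:

    ```python
    import requests

    # Placeholder LAN address of the home server running the LLM backend.
    API_URL = "http://192.168.1.50:8080/v1/chat/completions"

    def ask(prompt: str) -> str:
        """Send one chat request to the self-hosted model and return its reply."""
        payload = {
            "model": "local-model",  # placeholder; many local servers ignore this
            "messages": [{"role": "user", "content": prompt}],
        }
        r = requests.post(API_URL, json=payload, timeout=120)
        r.raise_for_status()
        return r.json()["choices"][0]["message"]["content"]

    print(ask("Hello from the other machine on the LAN."))
    ```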

    Feel free to tell us where you land, it certainly seems like a fun, quirky setup either way.


  • Yeah, but if you’re this deep into the self-hosting rabbit hole, what circumstances lead to having an extra GPU lying around without an extra everything else, even if it’s relatively underpowered? You’ll probably be able to upgrade it later by recycling whatever is in your nice PC next time you upgrade something.

    At this point most of my household is running some Frankenstein of phased-out parts just to justify my main build. It’s a bit of a problem, actually.


  • OK, but why?

    Well, for fun and as a cool hobby project, I get that. That is enough to justify it, like any other crazy hobbyist project. Don’t let me stop you.

    But in the spirit of practicality and speaking hypothetically: Why set it up that way?

    For self-hosting, why not build a few standalone machines and run off those instead? The reason to do this at large scale is optimizing resources so you can assign a smaller pool of hardware to users as they need it, right? For a home set of two or three users you’d probably notice the fluctuations in performance caused by sharing resources on the gaming VMs, and it would cost you the same or more than building a couple of reasonable gaming systems plus a home server/NAS for the rest. Way less, I bet, if you’re smart about upgrades and hand-me-downs.




  • OK, look, I don’t like the online auth requirement for Windows 11, I think it’s dumb and finicky. I’m not trying to defend it here, I was just trying to correct the record on a slightly misleading summary…

    …but come on, any user with those needs can work around the login in like five minutes.

    Retro gaming in 20 years will either work just fine on the next version of Windows or work on a Win11 install supporting an offline account. Heavy machinery shipping with Windows will presumably ship in a state where it can be authenticated, so it should have some way to get online, or to update to a version of Windows that still has auth servers if Win11 stops having those for some reason. Bad drivers, or simply not having connectivity hardware, just require using a USB device. Your phone will USB-tether long enough to log in to Windows on a first install just fine; I’ve done it before.

    Don’t get me wrong, it shouldn’t be needed, and it’s a stupid annoyance. The real answer to all those use cases is the known workarounds for setting up offline accounts on first boot, which MS should continue to surface and offer as a supported option. But let’s not be disingenuously obtuse about how the software actually works. I’ve done way worse to keep a legacy OS running on an old machine.


  • Was that a work computer? I know on a work laptop I did have some time restrictions set by IT because they had some authentication policies, but my understanding is that on a Windows Home account you control there should be no time limit, although it may complain about your MS apps or treat it as an unactivated install after a while; I’m not sure. I admit that I have never put that to the test on a Win11 PC. I definitely did on MS-account-enabled Win10, since I’ve stashed older PCs and then turned them back on offline later, but I don’t think I’ve had an idle Win11 machine for more than three months yet.