• grue@lemmy.world · 13 days ago

    “To enable the massive 256GB/s memory bandwidth that Ryzen AI Max delivers, the LPDDR5x is soldered,” writes Framework CEO Nirav Patel in a post about today’s announcements. “We spent months working with AMD to explore ways around this but ultimately determined that it wasn’t technically feasible to land modular memory at high throughput with the 256-bit memory bus. Because the memory is non-upgradeable, we’re being deliberate in making memory pricing more reasonable than you might find with other brands.”
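The 256GB/s figure follows directly from the bus width once you assume a transfer rate; a quick back-of-envelope check, assuming the LPDDR5x runs at 8000 MT/s (the exact memory grade is an assumption here, not stated in the quote):

```python
# Back-of-envelope: peak bandwidth = bus width in bytes * transfer rate
bus_width_bits = 256       # 256-bit memory bus, per the quote
transfer_rate_mts = 8000   # assumed LPDDR5x-8000 (mega-transfers/s)

bandwidth_gbs = (bus_width_bits / 8) * transfer_rate_mts / 1000
print(bandwidth_gbs)  # 256.0 GB/s
```

A narrower or socketed bus that can't hold that transfer rate drops this number proportionally, which is why the bus width was non-negotiable.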

    😒🍎

      • grue@lemmy.world · 11 days ago

        From what I understand, they did try, but AMD couldn’t get it to work because of signal integrity issues.

      • enumerator4829@sh.itjust.works · 11 days ago

        Because you’d get something like half the memory bandwidth on a product whose performance is most likely bandwidth-limited. Signal integrity is a bitch.

        • Acters@lemmy.world · 11 days ago

          Many LLM operations rely on fast memory, and GPUs have that — even though their memory is soldered too, and the VBIOS is practically a black box that’s tightly controlled. Nothing on a GPU is modular or repairable without soldering skills (and tools).
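For context on why the bandwidth matters so much here: single-stream LLM inference has to read every active weight per generated token, so memory bandwidth sets a hard ceiling on tokens per second. A rough sketch (the model size and quantization are illustrative assumptions, not figures from the thread):

```python
# Rough upper bound: tokens/s <= bandwidth / bytes read per token
bandwidth_gbs = 256     # Ryzen AI Max peak, from the article
params_billion = 70     # hypothetical dense 70B-parameter model
bytes_per_param = 0.5   # assumes 4-bit quantization

model_gb = params_billion * bytes_per_param   # 35 GB of weights
max_tokens_per_s = bandwidth_gbs / model_gb
print(round(max_tokens_per_s, 1))  # ~7.3 tokens/s ceiling
```

Real throughput lands below this bound (compute, KV-cache reads, and overhead all eat into it), but it shows why halving the bandwidth would roughly halve generation speed.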

    • simple@lemm.ee · edited · 13 days ago

      To be fair, it starts with 32GB of RAM, which should be enough for most people. I know it’s a bit ironic that Framework has a non-upgradeable part, but I can’t see myself buying a 128GB machine and expecting to upgrade it any time in the future.

      If you really need an upgradeable machine you wouldn’t be buying a mini-PC anyways, seems like they’re trying to capture a different market entirely.

      • Ulrich@feddit.org · 12 days ago

        seems like they’re trying to capture a different market entirely.

        Yes that’s the problem.

              • Ulrich@feddit.org · 12 days ago

                The answer is that they’re abandoning their principles to pursue some other market segment.

                Although I guess it could be said to be like Porsche and Lamborghini selling SUVs to support the development of their sports cars…

                  • Ulrich@feddit.org · 11 days ago

                    To be fair, you didn’t ask a question. You made a statement and ended it with a question mark, so I don’t really understand exactly what it is that you were asking.

      • Vinstaal0@feddit.nl · 12 days ago

        According to the CEO in the LTT video about this machine, it was a design choice made by AMD — otherwise they couldn’t get the RAM speed they advertise.

        • unexposedhazard@discuss.tchncs.de · 12 days ago

          Yeah exactly, it’s worthless… Even the big players already admit the AI hype is over. This is the worst possible thing for them to launch; it’s like they have no idea who their customers are.

            • Rexios@lemm.ee · 12 days ago

            The AI hype being over doesn’t mean no one is working on AI anymore. LLMs and other trained models are here to stay whether you like it or not.

      • 4am@lemm.ee · 12 days ago

        They still could; this seems aimed at the AI/ML research space, TBH.