• abhibeckert@lemmy.world
    6 months ago

    > the google cars few years ago had the boot occupied by big computers

    But those were prototypes. These days you can get an NVIDIA H100: a card several inches long, a few inches wide, and about an inch thick. It has 80 GB of memory running at 3.5 TB/s and 26 teraflops of compute (for comparison, Tesla’s autopilot runs on a 2-teraflop GPU).
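
    To put that in perspective, here’s the back-of-envelope comparison in Python (using the rough figures above, not exact spec-sheet numbers):

        # Rough headroom comparison using the ballpark figures quoted above.
        h100_tflops = 26          # FP64 teraflops
        autopilot_tflops = 2      # the Tesla autopilot GPU figure
        h100_memory_gb = 80
        h100_bandwidth_tb_s = 3.5

        print(f"Compute headroom: ~{h100_tflops / autopilot_tflops:.0f}x")
        print(f"Memory: {h100_memory_gb} GB at {h100_bandwidth_tb_s} TB/s")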

    The H100 is designed to run in clusters of eight GPUs per server, but I don’t think you’d need that much compute. You’d have two or maybe three servers with one GPU each, all running the same workload (for redundancy).
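
    To sketch what I mean by running the same workload for redundancy, here’s a toy Python version (run_inference is a made-up placeholder, not how any real autonomy stack does it):

        import numpy as np

        def run_inference(frame, device_id):
            """Placeholder for the real perception model running on one GPU/server."""
            # A real system would dispatch to the model on `device_id`; here we
            # just return a deterministic dummy so the sketch runs end to end.
            return frame.mean(axis=0)

        def redundant_inference(frame, device_ids=(0, 1, 2), atol=1e-3):
            """Run the identical workload on each device and cross-check the outputs."""
            outputs = [run_inference(frame, d) for d in device_ids]
            if all(np.allclose(outputs[0], out, atol=atol) for out in outputs[1:]):
                return outputs[0]
            # Any disagreement means a unit is faulty: fail over / command a safe stop.
            raise RuntimeError("redundant GPUs disagree")

        frame = np.random.rand(4, 8)      # stand-in for a camera frame
        result = redundant_inference(frame)

    With two units you can only detect a fault and stop safely; with a third you could keep driving on the units that still agree.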

    They’re not cheap… you couldn’t afford to put one in a Tesla that only drives 1 or 2 hours a day. But a car/truck that drives 20 hours a day? Yeah, that’s affordable.
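
    Rough numbers (Python again; the ~$30k price and 5-year service life are my assumptions, only the 2 vs 20 hours a day comes from the point above):

        # Back-of-envelope amortisation of one H100 per vehicle.
        gpu_price_usd = 30_000    # assumed purchase price
        service_years = 5         # assumed service life

        for hours_per_day in (2, 20):
            total_hours = hours_per_day * 365 * service_years
            print(f"{hours_per_day:>2} h/day -> ${gpu_price_usd / total_hours:.2f} per driving hour")

    At 20 hours a day the GPU works out to well under a dollar per driving hour.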