• Mossy Feathers (She/They)@pawb.social · 1 month ago

    I’m kinda looking forward to seeing how this pans out. Personally, I’d want to use it to make small, local hobby networks; kinda like how it used to be that basically anyone with a phone line could start an ISP.

  • henfredemars@infosec.pub · 1 month ago

    Ensuring that the system complies with industry standards and integrating security measures for cross-technology communication are also necessary steps, Gao adds.

    This is a huge factor that could make or break the technology if they don’t get it exactly right. It may well be the single most important part of the tech.

    2.4 GHz is super saturated. The last thing we need is long-range (i.e., large-footprint) signals in spectrum that’s already saturated. This technology should be deployed either very carefully or not at all, to prevent widespread interference with existing WiFi devices. The band is already on the verge of being complete trash. Please, please don’t deploy more stuff on 2.4 GHz spanning an entire “smart city.”

    • shortwavesurfer@lemmy.zip · 1 month ago

      I actually ditched 2.4 GHz Wi-Fi on my home network entirely for this exact reason. If a device is not compatible with 5 GHz Wi-Fi, it doesn’t get purchased.

      • henfredemars@infosec.pub · 1 month ago

        It doesn’t just benefit you. You’re also benefiting the current users of that spectrum who, for one reason or another, might not be able to switch.

        I suspect most users, though, couldn’t tell you what frequency their network uses, let alone their devices.

      • circuscritic@lemmy.ca · 1 month ago

        Do you live in a high-density urban environment?

        Because if so, that totally makes sense, and the other benefit, that 5 GHz/6 GHz doesn’t travel far beyond your apartment or condo walls, is pretty nifty as well.

        But if you live in a house in the suburbs, man, that is commitment well beyond necessity or convenience. Not saying it’s a bad choice per se; it just seems unnecessarily burdensome IMO.

        • Mossy Feathers (She/They)@pawb.social · 1 month ago

          In my experience, a VR setup with Vive body trackers eats up the 2.4 GHz band really fast; so there are still reasons to swap in the suburbs, but they’re more niche.

          Source: my PC is too far away from the router for wired, so it uses Wi-Fi. I had to switch to 5 GHz because my internet would drop out on 2.4 GHz whenever I played VRChat.

        • shortwavesurfer@lemmy.zip · 1 month ago

          I live in a single-family house, but the area has quite a few of them packed pretty close together, so there’s still a lot of traffic on 2.4 GHz.

    • Windex007@lemmy.world · 1 month ago

      Sounds like they basically crafted special messages that are nonsense at 2.4 GHz but smooth out into a LoRa message on a much, much lower frequency band (sub-GHz).

      • towerful@programming.dev · 1 month ago

        It’s LoRa on 2.4 GHz.
        It’s just that chirp signals are easy to decode out of a lot of noise.
        And they don’t really affect most other modulation techniques. I think you can even have multiple CSS-coded signals on the same frequency, as long as they are configured slightly differently.

        LoRa is incredibly resilient.
        It’s just really, really slow.
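
        Rough numpy sketch of why chirps pull out of the noise so cleanly (parameters are just illustrative, not from the article): the receiver multiplies by a conjugate copy of the base chirp, which collapses the whole spread-out symbol into a single FFT bin, so the peak pops out even when noise buries the raw signal.

        ```python
        import numpy as np

        def css_symbol(value, sf=7):
            """One CSS symbol: an up-chirp cyclically shifted by `value` (0..2**sf - 1)."""
            n = 2 ** sf
            # instantaneous frequency (in units of the bandwidth) ramps up, wrapping once
            f = ((value + np.arange(n)) % n) / n - 0.5
            return np.exp(2j * np.pi * np.cumsum(f))

        def css_decode(sig, sf=7):
            """De-chirp; the symbol value then shows up as the strongest FFT bin."""
            dechirped = sig * np.conj(css_symbol(0, sf))
            return int(np.argmax(np.abs(np.fft.fft(dechirped))))

        sym = css_symbol(42)
        noise = 2.0 * (np.random.randn(sym.size) + 1j * np.random.randn(sym.size))
        print(css_decode(sym + noise))  # recovers 42 on most runs despite ~ -9 dB SNR
        ```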

        • Windex007@lemmy.world · 1 month ago (edited)

          I don’t think it’s “just” LoRa on 2.4 GHz, because if it were, existing LoRa devices wouldn’t be able to decode the signals off the shelf, which the article claims they can. From the receiver’s perspective, the messages must “appear” to be in a LoRa band, right?

          How do you make a device whose hardware operates in one frequency band emulate messages in a different band? I think that’s the nature of this research.

          And like, we already know how to do that in the general sense. For all intents and purposes, that’s what AM radio does. Just hacking a specific piece of consumer hardware to do it entirely software-side becomes the research paper.
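
          To make the AM point concrete, here’s a quick numpy check (all numbers made up for illustration): multiplying a carrier by a message tone puts energy at fc ± fm, so the information ends up at frequencies the oscillator itself isn’t generating.

          ```python
          import numpy as np

          fs, fc, fm = 100_000, 10_000, 1_000   # sample rate, carrier, message tone (Hz)
          t = np.arange(fs) / fs                # one second of samples -> 1 Hz FFT bins
          am = (1 + 0.5 * np.cos(2 * np.pi * fm * t)) * np.cos(2 * np.pi * fc * t)

          spectrum = np.abs(np.fft.rfft(am))
          print(np.sort(np.argsort(spectrum)[-3:]))  # [ 9000 10000 11000]: carrier + sidebands
          ```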

          • towerful@programming.dev · 1 month ago

            WiFi uses BPSK/QPSK/OFDM/OFDMA modulation.
            LoRa uses CSS modulation.

            This is about hacking WiFi hardware to make a WiFi-modulated signal intelligible to a receiver expecting CSS modulation, and to have the WiFi hardware demodulate a CSS signal. Thus making WiFi chips work with LoRa chips.

            LoRa doesn’t care about the carrier frequency.
            So the fact that it’s LoRa at 2.4 GHz doesn’t matter. It’s still LoRa.
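
            For a feel of what that kind of emulation involves, here’s a toy numpy sketch (this is the generic “craft the payload” trick from earlier cross-technology work like WEBee, not necessarily this paper’s exact method; the QPSK snapping and every parameter here are my assumptions): choose the subcarrier values so each OFDM symbol comes out as close as the constellation allows to the chirp you want on the air.

            ```python
            import numpy as np

            def chirp(v=0, sf=7):
                """Baseband LoRa-style up-chirp, cyclically shifted by symbol value v."""
                n = 2 ** sf
                f = ((v + np.arange(n)) % n) / n - 0.5   # instantaneous freq / bandwidth
                return np.exp(2j * np.pi * np.cumsum(f))

            N_FFT = 64                                    # 802.11a/g-style OFDM symbol size
            QPSK = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)

            def emulated_ofdm_symbol(target_chunk):
                """Snap each subcarrier to the nearest QPSK point the WiFi chip can emit."""
                wanted = np.fft.fft(target_chunk, N_FFT)  # what each subcarrier would need
                snapped = QPSK[np.argmin(np.abs(wanted[:, None] - QPSK[None, :]), axis=1)]
                return np.fft.ifft(snapped)               # waveform the front end would send

            target = chirp(0)                             # 128 samples of the chirp we want
            emitted = np.concatenate([emulated_ofdm_symbol(target[i:i + N_FFT])
                                      for i in range(0, target.size, N_FFT)])

            # Would a CSS receiver still get it? De-chirp and take the strongest FFT bin.
            print(np.argmax(np.abs(np.fft.fft(emitted * np.conj(chirp(0))))))  # 0, ideally
            ```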

            I’m sure there will be a use for this at some point.
            It’s certainly useful for directly interfacing with LoRa devices from a laptop.
            I feel like anyone actually deploying LoRa IoT would be working at a lower level than the “throw a laptop at it” kinda thing.

            • Windex007@lemmy.world · 1 month ago

              I didn’t realize that LoRa doesn’t care about the carrier frequency; that’s for sure the root of my faulty assumption. Thanks for taking the time to explain!