• 🇰 🌀 🇱 🇦 🇳 🇦 🇰 ℹ️
    15 points · 6 months ago

    Radios receiving signals don’t just siphon the signal off lol

    What you’re asking would only really happen with wireless Internet service and it’s not because of the wireless signal, but because the overall bandwidth diminishes the more people connect to it.

      • @YourAvgMortal@lemmy.world
        14 points · 6 months ago

        It’s like solar energy. You either absorb it with a panel, or it goes to “waste”. You’re not really stealing it from someone else, as long as you’re not getting too much in the way

        • @VirtualOdour@sh.itjust.works
          2 points · 6 months ago

          Using your analogy, I think OP's question was really: if you have a stack of transparent solar panels, will the panel below get less power? And the answer is, of course, it will. If one antenna is behind another, there will be a small reduction in the power of the signal reaching it; probably very small, but with enough of them you could theoretically construct a Faraday cage of sorts.
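          A back-of-the-envelope sketch of that "enough of them" point, assuming (purely for illustration) that each antenna in line soaks up the same tiny fraction of the passing wave:

```python
def remaining_fraction(per_antenna_loss: float, n_antennas: int) -> float:
    # Each antenna in line absorbs/scatters the same small fraction,
    # so the surviving power falls off geometrically with depth.
    return (1 - per_antenna_loss) ** n_antennas

# Assumed numbers: 0.1% loss per antenna, 1000 antennas deep
f = remaining_fraction(0.001, 1000)
print(f"{f:.2%} of the signal power survives")  # roughly 37%
```

          So any single antenna barely matters, but a deep enough stack really would behave like a lossy screen.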

      • Actually, the waves emitted by the radio tower are enough for a receiving device to generate a small electrical current just through the oscillations of the propagating signal.
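        As a rough sketch of how small that induced signal is: for a half-wave dipole, the open-circuit voltage is about the field strength times the antenna's effective length (λ/π). The field strength and frequency below are made-up illustrative values, not anything from the thread:

```python
import math

C = 3e8  # speed of light, m/s

def dipole_open_circuit_voltage(e_field_v_per_m: float, freq_hz: float) -> float:
    # Effective length of a half-wave dipole is wavelength / pi;
    # the induced open-circuit voltage is E-field times effective length.
    wavelength = C / freq_hz
    return e_field_v_per_m * (wavelength / math.pi)

# Assumed: a 1 mV/m field (a decent broadcast signal) at 100 MHz
v = dipole_open_circuit_voltage(1e-3, 100e6)  # just under a millivolt
```

        That sub-millivolt signal is the reason crystal radios can work with no power source at all.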

        • @CanadaPlus@lemmy.sdf.org
          5 points · 6 months ago

          The current produced in the antenna does (induce a field which goes on to) cancel the wave out a bit. Not enough to be noticeable in the far field, for a normal-sized antenna, but some. Conservation of energy, right?

      • @CanadaPlus@lemmy.sdf.org
        1 point · 6 months ago

        Yup. The signal is typically amplified quite a lot in the receiver, and the vast majority of the transmitted power is never received by anyone, so it doesn’t usually matter, but it’s not a dumb question.
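        To put a number on "the vast majority is never received": the Friis free-space formula gives the fraction of power one antenna captures from another. A sketch with assumed values (unity-gain antennas, FM-broadcast-ish numbers):

```python
import math

def friis_received_power_w(p_tx_w, gain_tx, gain_rx, freq_hz, dist_m):
    # Friis transmission equation:
    #   P_rx = P_tx * G_tx * G_rx * (lambda / (4 * pi * d))**2
    wavelength = 3e8 / freq_hz
    return p_tx_w * gain_tx * gain_rx * (wavelength / (4 * math.pi * dist_m)) ** 2

# Assumed: 50 kW transmitter at 100 MHz, unity-gain antennas, 10 km apart
p_rx = friis_received_power_w(50e3, 1.0, 1.0, 100e6, 10e3)
print(p_rx)         # tens of microwatts
print(p_rx / 50e3)  # under a billionth of the transmitted power
```

        So one receiver pulling tens of microwatts out of a 50 kW broadcast is never going to be noticed by the receiver next door.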