Microblog Memes @lemmy.world · The Picard Maneuver @lemmy.world

Why does my treadmill want my email address?



332 comments
  • I can see the point of needing an account for a smart bulb: if you're away from home and want to turn on the lights before you arrive, it's needed.

    • want to turn on the lights before you arrive

      But... Why?

    • I get the account for a hub, but if you buy just the one Bluetooth bulb and it has to have an account, that would feel kind of dumb.

    • The problem is, unless you lock yourself into a single ecosystem, like Hue, you need multiple apps to manage your fucking lights.

      .... Or you can get home assistant or something.

      • Sure, you have a very valid point.

        Also, I can see a method of setting up remote access to the system without an account.

        Simply have the Hue bridge report a UUID and set a token in the app when you press the button to authorize the phone.

        The Hue server accepts and forwards the request to the specified UUID as long as it is signed with an approved token.

        There is a local admin password to remove individual tokens, and a nice reset button on the bridge that will clear any config and let you start again.
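
        A minimal sketch of how that could look, assuming HMAC-signed requests; the class names (Bridge, RelayServer) and the signing scheme are my own guesses, not Philips' actual implementation:

        ```python
        import hashlib
        import hmac
        import secrets

        class Bridge:
            """Hypothetical bridge that reports a UUID and issues tokens on button press."""

            def __init__(self):
                self.uuid = secrets.token_hex(16)  # reported to the relay server
                self.tokens = {}                   # phone_id -> shared secret

            def pair(self, phone_id):
                # Called when the physical button is pressed to authorize a phone.
                token = secrets.token_bytes(32)
                self.tokens[phone_id] = token
                return token  # the app stores this

        class RelayServer:
            """Accepts a request only if it's signed with an approved token,
            then forwards it to the bridge with the matching UUID."""

            def __init__(self, bridges):
                self.bridges = {b.uuid: b for b in bridges}

            def forward(self, bridge_uuid, phone_id, payload, signature):
                bridge = self.bridges.get(bridge_uuid)
                token = bridge.tokens.get(phone_id) if bridge else None
                if token is None:
                    return "rejected"
                expected = hmac.new(token, payload, hashlib.sha256).digest()
                if not hmac.compare_digest(expected, signature):
                    return "rejected"
                return f"forwarded {payload!r} to bridge {bridge_uuid[:8]}..."

        # Usage: press the button once to pair, then sign each command.
        bridge = Bridge()
        relay = RelayServer([bridge])
        token = bridge.pair("my-phone")
        command = b"lights/on"
        signature = hmac.new(token, command, hashlib.sha256).digest()
        print(relay.forward(bridge.uuid, "my-phone", command, signature))
        ```

        Revoking a phone is then just deleting its entry from the bridge's token table, which lines up with the "remove individual tokens" idea above.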

        Sure, you can use a VPN, but even though I'm an IT guy, I don't have the energy to deal with this stuff in my free time; I'd rather be out walking with my camera.

        • I'm also an IT guy. I'm trying to make most of my stuff at home "smart" and had to go down the home assistant rabbit hole just to get everything managed under a single app. All so that my family doesn't have to deal with it (I have to suffer so they don't).

          I started a long time ago with Hue, back when they were just about the only name in home automation. Luckily it integrates with home assistant, but I'm buying generic Z-Wave bulbs now and planning to swap them in as the Hue bulbs die off, so I don't have to overhaul the system and throw out a bunch of stuff that still works.

          My only real problem is that I picked Z-Wave because it's primarily 900 MHz and ZigBee is 2.4 GHz; I'm trying to keep the home automation on a separate wireless band from my WiFi. But the majority of home automation stuff coming out is ZigBee, or based on protocols that use the same 2.4 GHz band (Thread is built on the same 802.15.4 radio layer as ZigBee, and Matter runs on top of Thread or WiFi).

          It's frustrating because it's very rare that some cool new home automation thing hits the market with a Z-Wave variant available.

          Anyways. I'm just saying, I've been on a journey, and it's been frustrating. I understand why you wouldn't want to screw around with this stuff in your off time. My advice: don't change. Go for that walk with your camera. Enjoy.

          • My only real problem is that I picked Z-Wave because it's primarily 900 MHz and ZigBee is 2.4 GHz; I'm trying to keep the home automation on a separate wireless band from my WiFi.

            Yeah, I'm a little worried about where we're going with all the stuff hitting 2.4 GHz. I mean, a lot of these devices are going to be spewing radio-frequency emissions for a long time to come, and if you saturate the airwaves too heavily in an area, nobody can use anything reliably.

            • Me too. I work in IT with a specialization in networking (and a further specialization in wireless), and I'm also a qualified amateur radio operator (ham radio).

              To say I know wireless bands and their constraints (available frequencies, contention, interference, scattering, attenuation, free-space path loss) is an understatement.

              Z-Wave and ZigBee, at the time I was making the decision to go one way or the other (about two-ish years ago, maybe a bit more), were fairly comparable, and the set of available devices was roughly equivalent. This was back when Matter/Thread were barely a concept, and long before anything Thread/Matter compliant was on the market. So I weighed the options on a few factors, and one of the most important was the 900 MHz band Z-Wave uses. (Z-Wave apparently has 2.4 GHz now; I don't remember seeing any 2.4 GHz support on Z-Wave at the time.)

              That mattered because of the interference ZigBee would have created, and suffered from, with the WiFi in the house. We have 7 access points in the house and plans to add a couple more. Not all of them broadcast on 2.4 GHz, for the same reasons, but it's still a lot of activity crammed into a fairly small band.

              Bluetooth is already on 2.4ghz, so we're already going to hit some interference, plus all the problems we are likely going to experience from neighbors.

              2.4 GHz is a really small band, around 72 MHz wide in total (2401 MHz to 2473 MHz, the span of WiFi channels 1-11), while 5 GHz is more like 745 MHz (5150 to 5895 MHz), with some caveats due to regulations. That's nearly, if not more than, 10x the available spectrum, depending on the regulatory domain.
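
              Sanity-checking that arithmetic (with the figures above):

              ```python
              # Band widths as given above; regulatory caveats ignored.
              band_24 = 2473 - 2401  # MHz spanned by 2.4 GHz WiFi channels 1-11
              band_5 = 5895 - 5150   # MHz of 5 GHz spectrum
              print(band_24, band_5, round(band_5 / band_24, 1))  # 72 745 10.3
              ```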

              I have band steering on, but we have some older IoT stuff, mostly smart speakers, which are 2.4ghz only, so we still need it.

              To say our house's 2.4 GHz band is occupied is an understatement. We need to keep that band as free as possible, and Z-Wave had the right specs to make it happen. Then the entire home automation community seemed to pivot almost entirely to ZigBee, Thread, and Matter, running on 2.4 GHz. Sigh.

              Not to mention that microwaves run at 1000 W of power or more, at 2.45 GHz, with only a poorly built Faraday cage to protect the airspace. I try to make sure that the microwaves in the house have good isolation, for safety and communication integrity, but still, that doesn't matter if the neighbor uses their microwave often and doesn't care whether the signals are properly isolated. Even 1/100th of the power leaking out (around 10 W) is 100x more powerful than most wifi access points (which usually sit around 100 mW, or 0.1 W, of transmit power)...
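
              Worked out with those ballpark figures (all assumed, not measured):

              ```python
              microwave_w = 1000.0        # magnetron output power
              leak_w = microwave_w / 100  # 1% leakage -> 10 W escaping
              ap_w = 0.1                  # typical AP transmit power, 100 mW
              print(leak_w / ap_w)        # 100.0 -> leakage ~100x an AP's output
              ```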

              2.4ghz is a mess. I don't want to use it, but I can't avoid it.

              And everyone seems to be dogpiling stuff onto the band for no discernible reason. 900 MHz is pretty "slow" in terms of bandwidth, but how much bandwidth do you need to tell a lightbulb to turn on, or to have a device report that a button was pressed (a light switch, for example)?

              It doesn't make sense. Everyone seems hellbent on making 2.4 GHz their go-to, without understanding why that's a terrible idea. 900 MHz has better penetration and more than enough bandwidth for the task. Use it, FFS.
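
              To put rough numbers on the lightbulb question above (the payload size is a made-up round figure, not a real Z-Wave frame layout):

              ```python
              # Even Z-Wave's slowest rate moves a lightbulb command almost instantly.
              payload_bits = 10 * 8  # ~10 bytes for an on/off command plus addressing
              rate_bps = 9_600       # Z-Wave's slowest data rate
              print(f"{payload_bits / rate_bps * 1000:.1f} ms")  # ~8.3 ms on air
              ```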

              • Like, you can add frequency-hopping spread-spectrum stuff, but that isn't a magic wand; it means that yeah, maybe the FHSS device is more resistant to interference on any one frequency, but it also means it's edging into more spectrum space.

                And the problem is if the only way you can reliably get a signal through is by ramming the power up, that creates bad incentives.

                I used to have a Logitech gamepad (an F710) that ran on a proprietary 2.4 GHz wireless protocol. I used it happily for years, but I can't comfortably use it now, because over the past several years some device has shown up that every now and then briefly disrupts the connection. And that's with the receiver's antenna and the transmitter's antenna just a few feet apart, with a clear line of sight. Bluetooth gamepads still work okay; I believe that protocol has more reliability built into it.

                Now, okay, gamepads are maybe a worst-case scenario. They have hard real-time constraints; you really notice it in the middle of a fast-paced video game if your gamepad stops responding, and just delaying and retransmitting is problematic. Something like, say, a baby monitor briefly dropping out doesn't matter so much.

                But by the same token, they're also the canary in the coal mine.

                I have wondered if the end game is going to have to be taking the really high-bandwidth things, stuff like WiFi, and shifting them to requiring line of sight and a mechanically-aimed laser or something like that.

                I try to make sure that the microwaves in the house have good isolation, for safety and communication integrity,

                Hmm. How do you do that? Like, go to a brick-and-mortar store that has plugged-in microwaves, with some kind of spectrum analyzer? Just keep buying microwaves until you find one that you like?

                I haven't paid attention to microwaves, but I have been a little concerned about what LED bulb power supplies do; they're apparently a rather significant and growing source of noise as everyone is replacing their (silent) incandescent bulbs with LED bulbs. I've actively tried to find low-RF-emission bulbs, and it's a pain.

                As I understand it, the basic problem is a combination of the facts that:

                • They are using a hefty amount of juice.

                • The power line is unshielded, and so can act as an antenna as a PWM power supply flips on and off (see the toy harmonic calculation after this list).

                • Lamps designed for incandescent A19 bulbs were never designed with LEDs in mind, so the LED's power supply isn't built into the lamp; instead, a small, high-power supply has to be crammed into a very small space, in a product where buyers are very price-sensitive: inside the bulb.
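
                A toy calculation for that second point, assuming a square-wave supply switching at 50 kHz (a made-up but plausible figure): a square wave's odd harmonics fall off only as 1/n, so the emissions extend far above the fundamental.

                ```python
                f_sw = 50e3  # assumed PWM switching frequency, Hz
                # A square wave has odd harmonics at n*f_sw with amplitude ~1/n.
                for n in range(1, 12, 2):
                    print(f"harmonic {n}: {n * f_sw / 1e3:.0f} kHz, amplitude ~{1 / n:.2f}")
                ```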

                Even if there were a low-RF-emission rating, which there isn't, it's not as if someone can do something about other people using them.

                I suppose that in the long term, this problem will probably slowly solve itself if people just wind up moving in the direction of lamps designed specifically for LEDs (usually with non-removable LEDs); maybe lamp-integrated power supplies will perform better. But even an LED bulb will hopefully last a long time, not to mention a lamp. So that's not happening any time soon.

                • FHSS is not magic. In some ways it makes things worse for other protocols while avoiding problems for itself.

                  Which leads me into the next comments you made about interference sources like microwaves and LED bulbs. While I do have an SDR, I don't use it for wireless cleanliness. My access points, mainly the Cisco Aironet 2802i series, have a feature called Clean Air, which isn't new for Cisco, though other vendors are starting to add similar features to their access points. I believe it's been included in most mid-range Aironet access points since wireless N (around the 2600 series, maybe before). Anyways, the built-in radios will listen for and analyse interference and provide information about it.

                  Clean Air reports pretty much everything that can be an interference source, with decent accuracy. I've personally seen the following flagged: radar, Bluetooth, microwave (oven), and "non-wifi" interference sources. I believe "non-wifi" is the catch-all for anything it can't identify.

                  Clean Air also reports which channels are impacted by the interference, and I can get reports on nearby wifi networks, what channels they're on, and the channel width configured on foreign access points. On top of that, it reports how busy the configured channels are on my access points, classified into my wifi traffic, others' wifi traffic, noise, and interference.

                  With microwaves, I mainly watch the Clean Air report. If I see microwave (oven) interference, I note the time of the interference and figure out whether the microwave was in use then. If it lines up consistently, it's time to replace the microwave.
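
                  A rough sketch of that correlation check, with made-up timestamps and an arbitrarily chosen matching window:

                  ```python
                  from datetime import datetime, timedelta

                  # Hypothetical event logs: when Clean Air flagged microwave (oven)
                  # interference, and when we know the microwave was running.
                  cleanair_events = [datetime(2024, 5, 1, 18, 2), datetime(2024, 5, 2, 18, 5)]
                  microwave_use = [datetime(2024, 5, 1, 18, 0), datetime(2024, 5, 2, 18, 4)]

                  window = timedelta(minutes=5)
                  matches = sum(
                      any(abs(event - use) <= window for use in microwave_use)
                      for event in cleanair_events
                  )
                  if matches == len(cleanair_events):
                      print("Interference consistently lines up with microwave use; replace it.")
                  ```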

                  In my experience, new microwaves rarely have an isolation problem. The mark of manufacturing quality is how long it takes before one develops. Some last a long time; others lose their isolation fairly quickly. Pre-testing isn't very useful, since the isolation is usually fine when it's new.

                  By the same token, it'll pick up interference from other sources, like lightbulbs. So if that gets picked up at all, I'll have to correlate which lights are on and when to figure out which ones are the problem. To date, that interference has been either off-band or not significant enough to trigger Clean Air.

                  I know CFLs put off way more RF interference than LED bulbs; the high-frequency drive required for fluorescent lamps is far worse than the RF put out by most LED bulbs.

                  I've considered getting an Ekahau Sidekick for better wireless spectrum analysis, but there's no way I could afford one right now. If I had more of a purpose for it beyond curiosity, then maybe. As it stands, no way; it's in the neighborhood of $2000+. Unless I can use it to help pay the rent, I won't be picking one up.
