Ensuring that the system complies with industry standards and integrating security measures for cross-technology communication are also necessary steps, Gao adds.
This is a huge factor that could make or break the technology if they don’t get it right. It could be the single most important part of the tech.
2.4 GHz is super saturated. The last thing we need is long-range (i.e., large-footprint) signals in already saturated spectrum. This technology should be deployed either very carefully or not at all, to prevent widespread interference with existing WiFi devices. This spectrum is already on the verge of being complete trash. Please, please do not deploy more stuff on 2.4 GHz spanning an entire “smart city.”
I actually ditched 2.4 gigahertz Wi-Fi on my home network entirely for this exact reason. If a device is not compatible with 5 gigahertz Wi-Fi, it doesn’t get purchased.
It doesn’t just benefit you. You’re benefiting the current users of that spectrum that for one reason or another might not be able to switch.
I suspect most users though couldn’t tell you what frequency their network uses let alone the devices on it.
Anyone with a NAS will immediately notice that they are on 2.4GHz because it will take several times longer to transfer files.
I think users who know what a NAS is probably know that information already. But true, yes!
Some of us know what a NAS is, but aren’t fortunate enough to afford one
Indeed. Hello poorish brother
Yup, I have one device that’s stuck on 2.4GHz, my Brother laser printer. It works fantastically otherwise and it has an Ethernet port, but I haven’t bothered to run cable yet to it. I suspect a lot of people have that one device they’d rather not replace, which is still on an old wifi standard.
So I just make sure to have a simultaneous dual-band setup. Everything else uses 5GHz, and the 2.4GHz band exists for that one device, or if I’m on the opposite side of the house or something. I use fancy networking stuff though (Ubiquiti APs), your average person would just be confused at why the internet is sometimes slow (i.e. when the printer wakes up).
While my printer only supports 2.4GHz, it’s always been on Ethernet
But I still have too many smart home devices and media streamers on 2.4 GHz, even after making an effort to stick with local IoT meshes.
Do you live in a high density urban environment?
Because if so, that totally makes sense, and the other benefit of 5GHz/6GHz not traveling too far outside your apartment or condo wall, is pretty nifty as well.
But if you live in a house in the suburbs, man, that is commitment well outside of necessity or convenience. Not saying it’s a bad choice per se, just seems unnecessarily burdensome IMO.
I live in a single family house, but the area has quite a few single family houses packed pretty close together. So there’s still a lot of traffic on 2.4 GHz.
I’m not OP, but I also live in a single family house in the suburbs and actively avoid 2.4-only gear. I do have one stubborn device on 2.4GHz though, my laser printer, so I have to keep buying simultaneous dual-band gear until I get around to running Ethernet cable to it.
I wish I could but too many devices still require it
This spectrum is already on the verge of being complete trash.
Radio shouldn’t be used when avoidable. It’s for emergencies, aviation, hiking, maybe short-range communication for convenience. Phones, yes.
But providing internet connectivity via radio when you can lay cable is just stupid.
I mostly agree with you. I find it really weird how I live in a world where all my Internet is being run through 5G cellular for political and social reasons and not for technical ones. Due to the monopoly on the cables, it’s actually much cheaper here to buy 5G home internet. It seems unnecessarily complicated and choosing to use a shared medium for no reason. It’s just the politics.
In case you’re not from the States, we have a monopoly pretty much everywhere for Internet services.
With my 5G I have unlimited data, and it’s 300 down 44 up on a good day. It’s perfectly serviceable if you can live with increased latency.
we have a monopoly pretty much everywhere for Internet services
Fortunately, that’s not true everywhere, and municipal fiber is becoming more and more common.
5G home internet
The problem here is latency. It’s entirely sufficient for most web browsing and video streaming use-cases, but it sucks for multiplayer gaming and other interactive use-cases (e.g. video calls). So while it’s probably a solution for a lot of people, it’s not really a replacement for physical cables.
Sounds like they basically crafted some special messages such that they’re nonsense at 2.4 GHz but smooth out to a LoRa message on a much, much lower frequency band (sub-GHz).
It’s LoRa on 2.4 GHz.
It’s just that chirp signals are easy to decode from a lot of noise.
And they don’t really affect most other modulation techniques. I think you can even have multiple CSS-coded signals on the same frequency, as long as they are configured slightly differently. LoRa is incredibly resilient.
It’s just really really slow.

I don’t think it’s “just” LoRa on 2.4 GHz, because if it were, existing LoRa devices wouldn’t be able to decode the signals off the shelf, as the article claims. From the perspective of the receiver, the messages must “appear” to be in a LoRa band, right?
How do you make a device whose hardware operates in one frequency band emulate messages in a different band? I think that’s the nature of this research.
And like, we already know how to do that in the general sense. For all intents and purposes, that’s what AM radio does. Just hacking a specific piece of consumer hardware to do it entirely software-side becomes the research paper.
WiFi uses BPSK/QPSK/OFDM/OFDMA modulation.
LoRa uses CSS modulation.
This is about hacking WiFi hardware to make a WiFi-modulated signal intelligible to a receiver expecting CSS modulation, and to have the WiFi hardware demodulate a CSS signal.
Thus making WiFi chips work with LoRa chips.
LoRa doesn’t care about the carrier frequency. So the fact that it’s LoRa at 2.4 GHz doesn’t matter. It’s still LoRa.

I’m sure there will be a use for this at some point.
Certainly useful for directly interfacing with LoRa devices from a laptop.
I feel that anyone actually deploying LoRa IoT would be working at a lower level than “throw a laptop at it” kinda thing.

I didn’t realize that LoRa didn’t care about carrier frequency; that’s for sure the root of my faulty assumption! Thanks for taking the time to explain.
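The “chirps are easy to decode from a lot of noise” claim above can be sketched numerically: a CSS symbol is a cyclically shifted up-chirp, and multiplying by a conjugate reference chirp (“dechirping”) collapses it into a single FFT bin. This is a toy model, not the real LoRa PHY; the spreading factor and noise level here are illustrative choices.

```python
import numpy as np

# Toy CSS (chirp spread spectrum) demodulation -- illustrative only,
# not the actual LoRa PHY.

SF = 7                                 # spreading factor: 2**SF chips/symbol
N = 2 ** SF                            # samples per symbol (one per chip)
n = np.arange(N)
base = np.exp(1j * np.pi * n**2 / N)   # base linear up-chirp

def encode(sym):
    """A symbol is the base up-chirp cyclically shifted by `sym` (0..N-1)."""
    return base[(n + sym) % N]

def decode(rx):
    """Dechirp (multiply by the conjugate base chirp); the FFT then
    concentrates the symbol into one frequency bin whose index is the data."""
    return int(np.argmax(np.abs(np.fft.fft(rx * base.conj()))))

rng = np.random.default_rng(0)
noise = rng.standard_normal(N) + 1j * rng.standard_normal(N)  # ~2x signal power
rx = encode(42) + noise

print(decode(rx))  # → 42, recovered despite noise stronger than the signal
```

The dechirp/FFT step is also why differently configured CSS signals barely interfere with each other: a chirp with a different slope doesn’t collapse into a single bin, so it just looks like more noise floor.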
It’s pretty serendipitous, actually.
The past month I’ve done a somewhat deep dive into LoRa for a project.
I ultimately dismissed it due to the data rates, but for simple remote controls or for sensors - things that report a couple bytes - it seems awesome.
I’m sure you can squeeze higher data rates out of it, but when I evaluated it I decided to go with a hardwired network link (I had to have stability, dropped info wasn’t an option. But the client had a strong preference for wireless)
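For a sense of those data rates: LoRa’s raw PHY bit rate follows from the spreading factor, bandwidth, and coding rate as Rb = SF · (BW / 2^SF) · CR. A back-of-envelope sketch with the common 125 kHz configurations (general LoRa figures, not from the article; real throughput is lower still after preamble, headers, and duty-cycle limits):

```python
def lora_bitrate(sf: int, bw_hz: float, cr: float = 4 / 5) -> float:
    """Raw LoRa PHY bit rate in bits/s: Rb = SF * (BW / 2**SF) * CR.
    cr=4/5 is the lightest standard coding rate."""
    return sf * (bw_hz / 2**sf) * cr

print(round(lora_bitrate(7, 125_000)))   # ~5469 bps: fast end of common configs
print(round(lora_bitrate(12, 125_000)))  # ~293 bps: longest range, slowest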
WiHi is already taken in Japanese for WiFi. They write WiFi but can’t pronounce WiFi (there is no Fi syllable), so they say WiHi. Source: I lived in Japan for a while.
Wireless Hidelity?
HIDEO
Kinda weird that their only syllable with F sound is Fu, which goes in the Ha-He-Hi-Ho column
Technically there is no Hi syllable in Japanese either. There is ひ, which phonologically is neither “Hi” nor “Fi”, but somewhere in between. The exact pronunciation varies depending on surrounding sounds, as well as the speaker’s regional accent.
So I wouldn’t say they really use WiHi. They write WiFi and they say “ワイハイ” which is the closest you can get to WiFi using Japanese sounds. It will kinda sound like WiHi to an English speaker.
I don’t know where you got that information, though; it’s factually false.
Japanese has the /h/, /ç/, /ɸ/ consonants in the ハ行 (written as ha-hi-fu (or hu)-he-ho but pronounced differently). The consonant /ɸ/ is generally transcribed as f in the Latin alphabet.
フ (f+u) is the only kana pronounced with /ɸ/ in the regular ハ行, but the ファ行 (f + other vowels) indicates sounds with /ɸ/.
Transcription of wifi in Japanese is ワイファイ, not ワイハイ.
They have plash speed wifi
This will still not be accessible in my room!
I mean, if you insist on sitting in the fridge what do you expect!
Nothing a hammer drill, CAT.7 cable and an access point couldn’t fix.
Maybe I’m being overly paranoid (this is Lemmy, after all), but doesn’t this seem like a step toward something troubling?
- Almost all of our devices are designed to use WiFi. Just try finding a laptop with an ethernet port, or a phone or tablet with wired connectivity. You can get adapters, sure, but they’re not standard anymore. I wouldn’t be surprised if game consoles eventually drop wired options altogether, or charge extra for them—like Sony does with the PS5 disc drive.
- ISPs have a track record of trying to control our internet experience—remember the fight over net neutrality? They’re always looking for ways to monetize data and restrict what we can access online.
- With long-range WiFi on the horizon, ISPs might find it cheaper to install one powerful broadcast device per neighborhood, similar to how 5G towers are deployed.
- And when that happens, it’s not that features like fiber to the home or port forwarding are gone, but they could be locked behind an extra fee. Want direct access to your own network settings? That might come at a premium. Even access to certain websites could become conditional on paying more, or worse, dictated by someone else’s agenda.
What are you talking about?
This is not about streaming to a laptop or Internet access. This is about a long range, low power, low bandwidth network using 2.4GHz. It’s using 2.4GHz, like everyone else likes to, because it’s the “free” signal band that you don’t have to pay to license. It’s for sending the message “Sprinkler head 1039A is leaking” from a solar panel powered transmitter without having to run a data cable or network repeaters.
It’s competition for Zigbee/Z-Wave/Matter. Not the herald of the ISP crackdown Armageddon.
It may also allow some sort of meshnet-based Usenet (no binary groups), just saying.
Just like wifi already does?
Yes, just further, due to the particular kind of misuse. Distance matters for mesh nets.
Laptops with Ethernet are still pretty common. I just bought one recently. At work, we buy a lot of them. But I don’t think smartphones ever had integrated wired networking.
But that aside, what you’re describing is already happening. Wireless network deployments are much, much cheaper than running wire to each building. In semi-rural areas, WiSPs are pretty common. And 5G for home Internet access is pretty common in high-coverage areas. And as time goes on, the ISP-provided equipment is more locked down.
But I don’t think those things are related.
I’m not in the US, but every LTE home internet provider I’ve dealt with just gives you a SIM that you can use in any off-the-shelf LTE/5G-to-WiFi router. Mikrotik options are cheap and can handle simple one-device setups as well as multi-device mesh setups (which require more effort to configure).
And when that happens, it’s not that features like fiber to the home or port forwarding are gone, but they could be locked behind an extra fee. Want direct access to your own network settings? That might come at a premium. Even access to certain websites could become conditional on paying more, or worse, dictated by someone else’s agenda.
They can do that right now. If this new wireless option is standardized, it would seem less prone to ISP shenanigans to me. It’s just a question of what functionality makes it into the standard in the first place.
I think the article is explaining that this is really just modifying wifi protocols to work over LoRa, to reduce LoRa costs.
This will probably only be beneficial to people currently using LoRa.
I thought that LoRa was optimized for low data throughput? Running WiFi over such a link would suck.
Also, isn’t LoRa proprietary/patent-encumbered?
I can find this believable in the US maybe (only stayed there for a few months and I heard nothing good, data caps on broadband is wild) but not a chance in countries with stricter regulations and guidelines on what the ISPs are allowed to do.
I don’t understand this (haven’t read the article, which probably explains that) but this thread is all over the place with different interpretations.
Mine is that this is another local IoT network like Zigbee, Z-Wave, Thread, or even Bluetooth. (The latest Z-Wave standard includes longer-range functionality.) LoRa is another such local IoT network. The use case is smart home or business devices, like a light switch or a thermostat. They are much lower power, to suit battery-powered devices, and generally need very little bandwidth. You may have a central hub that can remotely control all your devices or manage automations. However, a key feature is that they’re local: these are not directly connected to the internet, but may have a bridge to your network for connectivity.
But LoRa is expensive and requires a lot of configuration. If this means that WiFi gear can come with some of the LoRa connectivity and bridge configuration already done, then you’re making it much easier to set up such a local IoT network.
Just tell them you have a low packet loss tolerance. The wire will never be cut. It can’t replace a wire for many use cases
Looking at how long it took for fiber to be “allowed” in my country, I don’t worry too much
This article leaves me with more questions than answers
Why not Willow
No thanks! 👍 Is there any way we could make a standard where unknown servers trying to run apps from Google locally on my devices could be auto-blocked? I’d love a standard like that.
“Hey, we removed Google Photos since they turned on the camera without your consent. We did that remotely! While you were out.”
That WiFi, I would like.
How can an app turn on the camera without your consent?
Duct tape fell off
I mean, you have to explicitly give permission before apps can access the camera.
Can’t work for 3-letter agencies with that attitude