Just another voice yelling in the void.

I’ve probably protested for your rights. I’m definitely on at least one list.

I believe firmly that everyone should have a fair shake and as much freedom as they can be afforded - so long as it does not encroach on the freedoms of others.

Occasionally a wordy cunt who will type a book when a sentence or two will suffice.

  • 0 Posts
  • 293 Comments
Joined 2 years ago
Cake day: July 7th, 2023




  • We are talking theoretical here, of course. For enterprise to even give it a realistic look, it needs to outperform very time-tested equipment, so… We're probably looking at needing to beat the existing options on cost, capacity, and speed… Or, to put it simply, on actual value versus cost of implementation. Currently there are a few different research-grade projects at various stages of lab testing… And this, like those, needs to fundamentally provide noteworthy gains over the existing tech and also prove consistent outside of the lab. We're a fair bit away from that yet.

    I mentioned earlier that we are in dire need of meaningful, long-term, non-magnetic storage… And I genuinely believe that. But while I can be interested in the tech, it still needs to be viewed with a critical eye until it produces results.



  • You need to put the capacity into perspective against the storage speed. The comment I made simply highlighted the issue with an extreme example, for the reasoning provided. And as someone who’s worked with emerging tech before… 30 Mbps is their ideal lap time in a lab environment. Do remember that 100 Mbps is considered absurdly slow for networking. 1 Gbps sounds fast, but even those transfer rates stretch into hours and days for larger file transfers.
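
    A rough back-of-the-envelope Python sketch of what those rates mean in practice. The 1 TB payload and the three line rates are my own illustrative numbers (the 30 Mbps comes from the lab figure above), not anything from a spec:

        # Back-of-the-envelope transfer times for a hypothetical 1 TB payload.
        # Rates are raw line rates in megabits per second; real throughput is lower.
        payload_bits = 1e12 * 8  # 1 TB expressed in bits

        for label, mbps in [("30 Mbps (lab figure)", 30),
                            ("100 Mbps", 100),
                            ("1 Gbps", 1000)]:
            seconds = payload_bits / (mbps * 1e6)
            print(f"{label}: {seconds / 3600:.1f} hours")

        # 30 Mbps  -> ~74 hours (days, once overhead is added)
        # 100 Mbps -> ~22 hours
        # 1 Gbps   -> ~2.2 hours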






  • Seems someone said it before me… But you missed the point.

    I’ll respond to your statement generally though.

    Basic survival on 56k was doable. Shoutcast or Pandora could even be streamed with occasional buffering while browsing lighter sites. On the topic of video, low-quality 240p would be “manageable”, again thanks to modern compression.

    Was it a good experience? Rarely. Was it passable? Certainly; and if a site optimised for load time and reduced bandwidth, it could even get near a broadband “experience” with some caching tricks.
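
    Rough numbers behind the streaming claim, as a quick Python sketch. The ~45 kbps of usable dial-up throughput and the ~32 kbps stream bitrate are my own assumptions for illustration, not measured figures:

        # Dial-up headroom check: low-bitrate audio stream plus light browsing on 56k.
        # Both figures below are assumptions, not measurements.
        usable_throughput_kbps = 45   # what a "56k" line realistically delivers
        stream_kbps = 32              # a common low-bitrate Shoutcast/MP3 stream

        headroom_kbps = usable_throughput_kbps - stream_kbps
        print(f"Headroom for browsing: {headroom_kbps} kbps "
              f"(~{headroom_kbps / 8:.1f} KB/s)")  # enough for lightweight pages, slowly

        # A 128 kbps stream, by contrast, exceeds the line entirely -> constant buffering.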

    I’m not saying everyone needs to be a code god and build a 96k FPS… But optimizing comes from understanding what you are writing and how it works. All this bloat is the result of laziness and a looser grasp of the fundamentals. As to why we should take a harder look at optimization:

    • Datacenter / cloud costs are rising… Smaller footprint - smaller bill.

    • Worldwide hardware costs are rising… Fewer people will be building fire-breathing monsters. Better optimization means a better user experience, which means more users. Recent examples of poor optimization: Fallout and Cyberpunk 2077 at launch.







  • You’d be surprised how many industries probably have some sort of backup power in place… But it’s typically more costly to run, and they may not have plans for extended outages. At the end of the day it comes down to money.

    What’s frustrating about the current situation with the power companies is that people are just unaware they’re getting bled, or have no real options for recourse… Whereas monopolies and large companies are getting (fuck if I know why) white-glove treatment and discounts. It makes little sense to be deferential to these massive companies, as while they promise jobs, economic benefits, and the moon itself… Data shows this rarely materializes. It’s baffling.