I have a PC I built a year and a half ago and apparently it “doesn’t meet the requirements” for windows 11…
Ryzen 5 5600x and a 3060 TI.
TFW you’re storing 2D and 3D data structures and you read this 😂
For most users, JMeter is difficult to approach.
Something like autocannon or ddosify may be nicer.
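For instance (the URL and numbers here are just an illustrative sketch), autocannon can run a quick load test straight from the command line:

```shell
# 100 concurrent connections for 10 seconds against a local server
npx autocannon -c 100 -d 10 http://localhost:3000
```

It prints latency and throughput percentiles when the run finishes, which covers most casual benchmarking needs without JMeter's test-plan GUI.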
Jack… Fortnite? This hurts. RIP the original franchises.
The wrong lesson?
I’m not sure how reducing your attack surface area is the wrong lesson here.
Definitely dystopian; corporate power and entrenchment grow every year.
How are they just plain mean? Are you just trying to play the clickbait game?
Many of these are just normal faces with context, a chunk are pretty accurate stereotypes and not insulting, and the rest could be considered mean.
Also from Oregon, with a bun as well 😂
It would be neat for Lemmy to have something like it. It’s a great way to self fund in an engaging way.
There probably needs to be a foundation or something that can distribute funds to instances though to prevent heavy consolidation by way of simple popularity based funding 🤔
Obsidian's really good with lots of notes, linking them together, and adding metadata to them.
It really depends on your use case. The plug-in ecosystem is also quite rich.
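For example (tag and note names made up), metadata goes in a note's YAML frontmatter, and linking uses wikilink syntax:

```markdown
---
tags: [project, idea]
created: 2024-01-15
---

This note links to [[Another Note]], and Obsidian's graph view
picks the connection up automatically.
```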
Lemmy is… Not distributed computing.
If each instance is a separate application that must scale on its own, then no distributed computing is occurring.
There is one database, and you can have the instance itself behind a load balancer.
Lemmy is not a distributed program; you can’t scale it linearly by adding more nodes. It’s severely limited by its database access patterns, to a single DB, and is not capable of being distributed in its current state. You can put more web servers behind a load balancer, but that’s not really “distributed computing”, that’s just “distributing a workload”, which has a lot of limitations that defeat it being truly distributed.
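As a sketch of that “distributing a workload” pattern (hostnames invented for illustration), several identical web servers sit behind one load balancer but still all hit the same single database, so the DB stays the bottleneck:

```nginx
# nginx: spread requests across identical web servers
upstream lemmy_web {
    server web1.internal:8536;
    server web2.internal:8536;
    server web3.internal:8536;
}

server {
    listen 80;
    location / {
        proxy_pass http://lemmy_web;
    }
}
```

Adding a fourth `server` line adds web capacity, but does nothing for database throughput, which is the point being made above.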
Actual distributed applications are incredibly difficult to create at scale, with many faux-distributed applications being made (Lemmy being n-tier on a per-instance basis).
Think of Kafka. Kafka is an actual distributed application.
Cloud computing is… Not distributed computing.
We’re talking about pushing compute workloads across a distributed set of devices where that workload is linearly scalable by the number of devices involved, compute, storage, failovers…etc scale elegantly. Cloud computing can give you the tools to make such a thing a reality within the scope of the cloud provider, but it most definitely is not distributed computing just by existing.
Also the fediverse is NOT distributed computing either, at least for Lemmy. There is no distributed compute available for Lemmy. You can’t have a few hundred users toss up their own compute to handle loads for an instance. Each instance is limited to a single database, and can have webservers behind a load balancer to spread out the compute. And that’s about the best you’ve got. Not distributed, you can’t just spin up 100 nodes for a Lemmy instance to handle more load and everything “just works”. It’s a very “classic” architecture in a way.
A K8s cluster isn’t distributed computing until you build a distributed application that can elegantly scale with more and more nodes, and is fault tolerant to nodes straight up dying.
Kafka, for example, is an actual distributed application, one which you could run on a K8s cluster; it self-manages storage, duplication, load balancing, failovers, rebalancing…etc elegantly as you add more nodes. It doesn’t rely on a central DB, it IS the DB, every node. Lemmy is not.
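To make that concrete (topic name and broker address are placeholders), replication in Kafka is declared per topic, and the brokers themselves place the copies and re-replicate them as nodes join or die:

```shell
# each of the 12 partitions is stored on 3 brokers;
# if a broker dies, Kafka re-elects partition leaders and keeps serving
kafka-topics.sh --create \
  --topic events \
  --partitions 12 \
  --replication-factor 3 \
  --bootstrap-server broker1:9092
```

No external database is involved; the partition replicas on the brokers are the storage, which is exactly the property Lemmy lacks.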
Or Obsidian? Take actual control over them including rendering if you want to customize that.
Maybe it’s a different use case 🤔
This is a pretty disappointing and anemic article.
I thought this was going to dive into some of the practical, pragmatic, and scientific ways to measure information.
This is quite literally “What is a bit and a byte” 🫤
I’m sure this won’t have unfortunate knock on effects 😬
Yes it is nowhere near it. But the basis of the argument that today’s limitations mean tomorrow’s AI is just as limited is a clear logical fallacy.
AI can’t replace a person yet*
Stating that AI’s limitations today mean those limitations will exist in the future, despite the accelerated growth of AI complexity and capabilities, is plain wrong.
History is full of examples just like this, from computers, to the internet, to automation…etc “Robots will never replace my job because my job is complicated”, it’s not a matter of if, but when. Would you rather be on the side of history that considered the impacts and tried to mitigate them, or the side that stuck their head in the sand?
Also, on the point of invalid logic: “AI is not the problem, it’s the abuse” assumes AI exists in a void, which it doesn’t. The same logic: biological weapons aren’t bad, it’s how they’re used that’s the problem. Misinformation isn’t bad, it’s how it’s spread that’s the problem. Guns aren’t bad, it’s the people shooting them that’s the problem. …etc for everything else in the world that is a real problem because humans use and abuse it.
Current-gen AI is a problem because it’s a catalyst for abuse. Not because of the nature of AI itself; you’re right about that, but it’s an argument detached from the reality of the situation.
Note: General Super Intelligence is a problem purely by its nature. The same goes for partial intelligence, due to alignment issues which are currently paradoxical in nature. There are entire fields of study for this.
I would suggest learning how current models function. They have a lot of limitations and they are nowhere near actual AI like movies and media suggest.
Despite this you will find while learning this that the rate of advancement is such that the future dangers posed by AI are real, and must be considered. Ignorantly ignoring the writing on the wall doesn’t do us any good.
Except that Apple was intentionally doing this to drive consumers towards buying more of their product.
Lack of serviceability is a big one.
Walled gardens are another.
I have major issues with both. I bought the device, I should be able to repair it. It shouldn’t intentionally not work with other ecosystems that use standard protocols either. I should be able to integrate my device with standards the rest of the world uses.
Pretty much: if you buy Apple devices, you are essentially an expensive renter, renting a really strong internal ecosystem that purposefully forces you into buying more of that ecosystem and not working outside of it.
For me?
I try to be civil and professional on the internet, and when I use this username I am speaking as myself. And this is largely the only username I use. And it can be tracked back to the real me.
Pretty sure it’s my ADHD that causes me to accumulate tabs like this…
I’ll have dozens and dozens of windows full of tabs.
I recently did a tab clean-out before moving, and had tabs open from ideas, to-dos, or items that interested me from 4+ years ago.
Every time I restart my computer or close Firefox I always restore my previous session and get all those tabs back.