Every single day, on every device I own, I find myself constantly fighting with shitty input and input recognition. Things I swore I clicked once that I end up having to click twice or sometimes three times. The lag between the moment I click and the moment the thing I clicked actually fucking does its job.
With phones it's obviously worse: finger input is either too sensitive or too dull to register, requiring extra touches just to get anywhere or type anything, on top of the separate frustration of trying to type on awful keyboard interfaces.
Edit:
For clarification's sake: people are bringing up old computers and how you had to take extra steps to make them work. That's not what I'm talking about, and I thought I had made that as clear as possible.
I'm talking about the way technology has been over the past 15 years. You would think that with all the millions and billions invested in making things snazzy, crisp and shiny, they would function just as well as they look. Except no: somewhere in the design phase, wrenches got thrown in and everything ended up laggy, draggy and otherwise shitty.
Phones, tablets, site interfaces, etc.


Honestly, I think the difference is how much software is in these things now. Everything is a computer. And software is very cheap to do half-assed, but expensive to do well (and reliably).
TVs are a perfect example of this. The TV of 40 years ago had an analog tuner directly attached to a CRT. It did only one thing, and did it well. Today’s TVs are basically embedded computers with large screens. And the embedded software was probably written by the lowest bidder.
It's not just software, it's online updates too. Even things that were computerized used to get a lot more QA effort back when fixing a bug meant physically shipping a new product revision, or at least a new disk.