I have the opposite problem on my Windows PC: it takes so long to boot that the monitor goes into standby before the system is up again.
Windows is bloated, especially if there are updates involved. However, how old is the hard drive it’s on? Not just its age in years: perhaps there are read errors occurring that cause rereads you aren’t seeing because it eventually works. Also, if it is a hard drive, upgrading to an SSD is huge as well.
Y’all actually use Gentoo? I thought it was just a joke.
I did for like a week 15 years ago.
For the last 6 years though I’m slowly switching to Arch
btw
I sometimes forget to turn on my monitor in time so my computer doesn’t recognize it and I have to reboot :(
I have a different problem where if I don’t turn on my monitor in time, it will still work just fine, but my motherboard turns on this annoying little white light that won’t turn off again, so I have to reboot to make it go off.
If I don’t turn my monitor on in time, my OS assumes I have two monitors when I do turn it on. Then all new windows open on the ghost second monitor.
(Yours is hilarious though)
Oh, that’s weird lol. If I remember correctly from when I looked it up, the white light on my motherboard stays on as a warning that no display is connected, which is nice and all, but it would have been great if it turned off again once a display is detected; instead it just stays on forever. It’s only a very small light but it’s bright as hell, so it’s really annoying and it hits your eye even though the PC is off to the side lol.
I’ve found that if you’ve got a VGA analog output on your motherboard or GPU, it’ll output to that by default; so any digital interface (HDMI, DP, DVI) that is powered on or plugged in after the fact has to be toggled with a hotkey to mirror or extend the monitors.
Monitors are a crutch for people who don’t pay close enough attention to their inputs.
We should just feed in one tape strip to receive the hole punches after we put in all relevant inputs to perform calculations.
Loads mario, lots of arrows
Tape roll: o
Fuck
Hole punch/Tape strip Rocket League, please!
Eh… if the input is 128937182964213/1283971293871237129 even if I pay perfect attention I still don’t have the output I expect. Did I miss something?
My thing with Gentoo is that the devices that would benefit the most from it are also the ones that would struggle the most with compiling everything. If it weren’t for that, I would sure give it a try on my lower-end devices.
You can always just compile for your lower end systems on your higher end ones. I don’t remember exactly the way to do this but I know it’s possible
You could use the computing power of your better computer(s) to compile.
https://wiki.gentoo.org/wiki/Binary_package_guide#Setting_up_a_binary_package_host
(or)
https://wiki.gentoo.org/wiki/Distcc#To_bootstrap
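For what it’s worth, the binhost route from the first link boils down to a couple of lines in `/etc/portage/make.conf` on each machine. Rough sketch below; the hostname and path are placeholders, and newer Portage versions prefer `/etc/portage/binrepos.conf` over `PORTAGE_BINHOST`, so check the wiki for your setup:

```shell
# On the fast machine (the binhost): also build binary packages whenever you emerge
FEATURES="buildpkg"

# On the slow machine: fetch prebuilt packages from the binhost instead of compiling
FEATURES="getbinpkg"
PORTAGE_BINHOST="https://binhost.example.lan/packages"   # placeholder URL
```

Both machines need compatible CFLAGS/USE settings for the packages to be reusable, which the binary package guide covers.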
https://wiki.gentoo.org/wiki/DistccHmm that’s really interesting. Thanks for the links.
RIP my free time…
I don’t know about Gentoo, but as a serial dual booter I know this pain well.
I swear about two thirds of the time that going through GRUB adds to every boot is spent waiting for my monitor to figure itself out. Half the time it doesn’t get there in time at all.
If your mobo has an EFI bootloader, which nowadays almost all do, make sure GRUB is also an EFI image and don’t allow the early boot to take control of the framebuffer.
Setting these flags for the bootloader, grub in your case, should make sure the monitor only does a single initialize.
GRUB_TERMINAL_OUTPUT=gfxterm
GRUB_GFXPAYLOAD_LINUX=keep
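In case it helps anyone else trying this: those settings go in `/etc/default/grub`, and you have to regenerate the config afterwards (paths below assume a typical layout; they vary by distro):

```shell
# /etc/default/grub -- keep a graphics-mode terminal and carry the
# video mode through into Linux instead of re-initializing the display
GRUB_TERMINAL_OUTPUT=gfxterm
GRUB_GFXPAYLOAD_LINUX=keep

# Then regenerate grub.cfg, e.g.:
#   grub-mkconfig -o /boot/grub/grub.cfg
# (some distros wrap this as update-grub)
```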
Source: just went through something similar and was annoyed that the monitor would take forever to start.
I can give that a whirl if it’s not set up like that already, but the monitor is VERY slow on its own. It basically never wakes up in time for the BIOS bootscreen and any signal interruption sends it on a wild goose chase of signal searching around its inputs that can take ten seconds at a time. It’s not a cheap monitor, either, which I assume is part of the problem, as it wants to be super smart about a bunch of things and has to contend with a bunch of options and alternatives that maybe a simpler setup wouldn’t.
Still, worth a shot to try to tune GRUB and double check if it’s swapping modes unnecessarily between the BIOS image and the menu. I hadn’t considered it. Like so many Linux features and apps, there’s a bunch of stuff you can configure that I keep not looking into because it’s only surfaced in documentation, if that.
EDIT: Tried, didn’t help. The motherboard rebooting gives the monitor just enough time to search its DisplayPort input, decide it’s been unplugged, and shut down, so by the time another monitor picks up the slack it’s too late and the timeout has expired unless you’re mashing down to stop it. The changes do make the second monitor come up at its native resolution instead of changing modes, but the mistake happens elsewhere.
I could just set a longer timeout, but I’d rather have a faster boot when I’m sticking to the default than wait for the whole mess to sort itself out every time. Been mashing BIOS entry buttons and bootloader menus since the ’90s, what’s a couple decades more.
Still dumb, though.
Sorry that didn’t help! Thanks for trying though.
I use an old Sony TV from around 2008 as my monitor. I can turn on the TV, shut the laptop down from its own screen, boot it again manually, and it’ll be fully booted and showing the desktop before the TV is ready. Got it from my neighbours when they tried to throw it out; it’s not amazing, but I’m very happy to have it, as otherwise I wouldn’t have anything other than the even shittier 720p laptop screen.
Now hang on, 2008 ain’t that old…
Realizes it was 17 years ago
Slowly walks into the sea.
I updated my resume recently and…yeah.
yeah.
Yup, my Sony Bravia is great for movies except some quirks:
- takes over 10 seconds to sync to HDMI
- panel is 1366×768 but only 1360×768 is accessible over HDMI (it can be shifted up to 3 pixels left/right though)
- its LUT for color brightness is all messed up with RGB HDMI signals: the lowest 30 or so brightness steps map to full black, and then the brightness takes off steeply. A YCbCr-capable GPU is needed to correct this (an inverse LUT is technically possible but will not compensate for the awfully giant steps in dark areas unless the GPU also adds dithering).
panel is 1366×768 but only 1360×768 is accessible over HDMI
Weird. Did they decide that resolution wasn’t cursed enough to begin with?!
I wonder if HDMI requires resolutions to be evenly divisible by 8. 1366 was always strange. I’m not sure I’ve ever seen it on an external monitor, mostly just cheap laptops.
1366 was always strange.
1366 is 768 divided by 9 and multiplied by 16 (well, 1365.33 rounded up), to get an image with a 16:9 (widescreen) aspect ratio. The 768 part comes from 1024×768, which was a very common screen resolution back in the day.
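The multiple-of-8 guess upthread at least lines up with the 1360-over-HDMI number; here’s the back-of-the-envelope arithmetic (the divisibility requirement itself is a guess, not a spec citation):

```python
# Sanity-checking the 1366x768 numbers from this thread.
height = 768
ideal_width = height * 16 / 9                # 1365.33..., a true 16:9 width for 768 lines
panel_width = 1366                           # what manufacturers rounded up to
hdmi_width = panel_width - panel_width % 8   # largest multiple of 8 that fits

print(ideal_width)   # ~1365.33 -> panels rounded up to 1366
print(hdmi_width)    # 1360 -> matches the usable-over-HDMI width reported above
```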
We have several crusty 1366×768 monitors kicking around my workplace, and none of them are in regular use because they’re awful. I’m reasonably certain part of why they’re so awful is that they are indeed repurposed cheap laptop panels slapped into even cheaper shells. Supporting evidence: they’re all significantly and suspiciously lighter than our other monitors. At least this helps with their usual use case, which is being carted around as portable temporary setups for diagnosis and troubleshooting. Every time we need a spare monitor, the boss inevitably winds up ordering whatever the first option is on Amazon when sorting by price, and that’s how we wind up with these.
I notice several of them also run off of wall warts with really weird voltages. I think one of them we have is 8.5 volts.
I used to have a 1366x768 monitor. They’re actually pretty common, at least here in Brazil
And it accepts 1080p, but downsamples it to the resolution you mentioned.
Nah, just 1080i. And this will fill the screen (in fact, with slight overscan) but obviously native resolution is better.
Some Bravia models had 6 analog inputs (not counting VGA+3.5mm), at least one of which was a full-featured SCART port with RGB support and AV output to the VCR. And interlaced content worked seamlessly, and probably looked better than on modern TVs.
I believe my Bravia was showing 1080p when connected to PS3 via HDMI, but I might be misremembering. But yes, it had inputs galore on the back.
I believe that, it is a wide range of LCD TVs from pocket models to projection monsters.
All that ricing, and what did that get you?
But why do they always need that long? Can’t they check inputs in parallel or what?