Same. I can't even be in the same room anymore.
Somehow made gaming there feel like a chore as well...
So bored of games that I'll go work on some other stuff (home servers) and just take care of day-to-day life stuff. Maybe I'm just growing up, but I never thought I'd be bored of games... Ever...
I solve this by putting an Xbox downstairs and only playing games where short sessions are possible (rocket league, racing, indie games). I'd love for them to port WoW so I could grind/level up on the Xbox and go to the PC for raids though realistically even like that I probably still don't have enough time anymore.
Working from home has empowered PC gaming for me. I just play a little when I'm stuck, and closing the work laptop screen is enough to dissociate from the working environment. I even have a KVM and reuse the same screens, same keyboard, same mouse.
I work on my own PC, but keep two separate desktops going. One for personal, one for work. Ctrl + Alt + arrow key to switch between them. We don't have any dystopian monitoring software to install, so it works great for my use case.
I won't do that during the work day, but having a KVM switch could be helpful. Every time I want to use my personal machine I need to unplug and replug the USB hub and monitor. It's only a couple of cables, but it's annoying enough that I usually don't bother.
Actually, I just saw this post, I really should get a KVM switch...
The irony is that nowadays the monitors would be swapped. The "good PC" would have a CRT (because most CRTs nowadays are probably in enthusiast rigs), while the "bad PC" would have the common 1080p Dell IPS display.
On a semi-related note, why are Dell's IPS panel monitors so ridiculously common? VA and TN panels are a lot cheaper, so I'd think companies wanting the most bang for their buck would use those instead. Is it the fact that IPS panels have a decent horizontal viewing angle, so Mr. Micromanager can look over your shoulder and see what you're doing more easily?
Dell produces monitors in much larger numbers, so most distributors like CDW will have them in every warehouse in the country. That makes it much easier to standardize equipment across a large organization when you can always order the exact same SKU for several years in a row.
Where are you getting this information about CRTs from? I know they get used for old school emulation, but pretty sure for modern systems a high refresh rate and freesync/gsync is where it's at.
People who are into older games tend to have a CRT plus a retro rig or a digital-to-analog converter. A lot of older PC games legitimately look nicer on CRTs. CRTs can also have ludicrously high refresh rates and resolutions; don't let the 4:3 aspect ratio fool you. High-end CRT computer monitors (not TVs) tended to max out at 1600x1200, giving them a slightly larger vertical resolution than 1920x1080 at the cost of a lower horizontal resolution, and some went as high as 2048x1536, comparable to 1440p. And yes, that's a true 1440+ lines: CRT computer monitors were mostly progressive scan, not interlaced like TVs. The refresh rates on later CRTs also tended to start at 75Hz (vs 60Hz on LCDs) and could max out at 200Hz on high-end monitors. You'd sacrifice resolution to do so, though I think you could mitigate some of that with a BNC cable if your monitor supported it (not that most rigs could run anything close to 200fps without dropping resolution anyway). Finally, CRTs tend to have extremely low response times, very good color depth, and true blacks.
That said, CRTs are heavy, fragile, and nowadays expensive. Before the pandemic you could get a high-end 20" Sony Trinitron PVM (professional video monitor) for like $300-$400, with shipping costing more than the monitor itself; nowadays you're easily talking $1000 or more. Most LCD panels can beat CRTs in resolution and refresh rate these days (though even high-end LCDs tend to struggle to beat CRT response times), and OLEDs outclass CRTs in almost every way.
Edit: oh, another weakness of CRTs is that they can burn in. That's where the term originated: if you left an image on the screen too long, it'd burn into the display and persist even after the monitor was turned off and unplugged. Since no one's making CRTs anymore, there's a smaller and smaller pool of them in good condition, which means they'll keep getting more expensive until someone decides it's worth the money to start making the tubes again.
Edit 2: that's also why screensavers were a thing! Screensavers were there to stop you from accidentally burning in your monitor. I wonder why they haven't made a comeback with OLEDs.