I've been rocking a 1080ti since launch. Upgraded my 4th gen i7 to a 9th gen i9 on a sale a few years back. SSD upgraded when I got some that were going to be recycled.
Eventually I want to move to team red for Linux compatibility. Other than that, I am sticking with what I have. (Doesn't help that I have 2 small children that all my money goes to.)
I genuinely don't understand this. One time my friend bought an RTX 3060 (she was using an RX 580).
I asked, "oh cool, what new games are you gonna play?" She said, "none, I'm just gonna play the same ones." I asked, "what was wrong with the old card?" And she said, "idk, just felt like I needed a new one." We play games like TF2...
I just don't get this type of behaviour. She also has like 14 pairs of sneakers.
I just upgraded this year, to an R9 5900X from my old R5 2600; still running a 1070 though.
I do video editing and more generally CPU intensive stuff on the side, as well as a lot of multitasking, so it's worth the money, in the long run at least.
I also mostly play Minecraft and Factorio, so.
Ryzen 5000 is a great upgrade path for those who don't want to buy into AM5 yet. Very affordable. 7000 is not worth the money unless you get a good deal; same for 9000, though you could justify it with a new motherboard and RAM.
I showed this to my penultimate daughter, who co-opted my (literal 2014) Dell PC. The only thing I'd ever done to it was add memory; it is a beast still. I said "look, your 4chan twin" and she cracked up. But if she does not steal it when she moves out, I will probably be able to get ten more years out of it.
I'm the one person who people go to for PC part advice, but I actually try to talk them down. Like, do you need more RAM because your experience is negatively impacted by not having enough, or do you just think you should have more just because?
Ha, I had this exact conversation with a friend of mine a few days ago, he wants to upgrade from 16GB to 32GB and when I asked why, he just blanked out for a while and went "...because more is better, right?"
He spends most of his time playing RPG Maker porn games and Raid: Shadow Legends, so he's really taxing that RTX 3070 he bought right in the middle of the pandemic.
If you want to stay with Windows for whatever reason, even 11, I can recommend Revision Playbook. It locks your installation and scrapes out the crap like unwanted updates and features like AI bullshit, Edge, Telemetry and whatnot. You can even manually install Apps from the Store without the Store if you like to. Security patches and selective updates come only via manual download from MS catalogue in my case, but you can automate this too with some tools.
I built a PC in 2011 with an AMD Phenom II. Can't remember which one, it may have been a 740. And I'm pretty sure a Radeon HD 5450 until FO4 came out in 2015 and I needed a new graphics card. Upgraded to a Radeon R7 240, and some other AM3 socketed CPU I found for like, $40 on eBay. By no means was I high end gaming over here. And it stayed that way until 2020, when I finally gutted the whole thing and started over. It ran everything I wanted to play. So I got like, 9 years out of about $600 in parts. That's including disc drives, power supply, case, and RAM. And I'm still using the case. I got my money's worth out of it, for sure. The whole time we were in our apartment, it was hooked up to our dumb TV. So, it was our only source of Netflix, YouTube, DVDs, and Blu-rays. It was running all the time. Then, I gave all the innards to my buddy to make his dad a PC for web browsing. It could still be going in some form, as far as I know.
I remember the 5450! I got one when wrath of the lich king dropped because my Dell integrated graphics couldn't handle strand of the ancients. That baby got me from 2 FPS to 15. Served me until I left for school.
I barely remember it, which I think is a compliment because it just worked! Never had any driver issues or temperature problems, and it didn't demand too much power. It just did its job until I needed something more.
People want shiny new things. I've had relatives say stuff like "I bought this computer 2 years ago and it's getting slower, it's awful how you have to buy a new one so quickly." I suggest things to improve it, most of which are free or very cheap and I'd happily do for them. But they just go out and buy a brand new one because that's secretly what they wanted to do in the first place, they just don't want to admit they're that materialistic.
I have heard that Windows underclocks your CPU over time, to make you buy a new computer, and so Microsoft can get money from the new PC's preinstalled Windows license.
I have heard that Windows underclocks your CPU over time
I would say this is half true. Microsoft is known for pushing lots of software updates with unwanted features, so it's likely that a computer will feel slower over time.
However, that's not an underclock; it's just that the CPU can't keep up with that much bloatware.
People live in times of historic standstill. Society barely develops in a meaningful and hopeful way. Social relationships stagnate or decline. So they look for a feeling of progress and agency in participation in the market and consuming.
They don't realize this because they aren't materialistic enough, in the sense that they don't analyse their condition as a result of the political and cultural configuration of their lives, so real agency seems unavailable.
I originally built my current PC back in 2016 and only just "upgraded" it last year. I put upgrade in quotes because it was literally a free motherboard and GPU my buddy no longer needed. I went from a Core i5 6600K to a Ryzen 5 5500GT and a GTX960 4GB to a GTX1070. Still plays all the games I want it to, so I have no desire to upgrade it further right now. I think part of it is I'm still using 1080P 60Hz monitors.
I was running one from 2011 up until 2 years ago, when I finally hit a wall in a game I was trying to play and had to upgrade the processor (which meant a new motherboard, which meant new everything). Prior to that I had only upgraded the GPU a couple of years earlier, which I really didn't need, but it was a present to myself and I was able to give the old one to my brother. By the time this one is outdated I might not even be interested in computers anymore, with the way things are going with technology.
I thought anon was the normie? The average person doesn't upgrade their PC every two years. The average person buys a PC and replaces it when nothing works anymore. Anon is the normie; they are the enthusiasts. Anon is just not hanging out with a group of people with matching ideologies.
The word is incredibly vague and fails to reflect the diversity of viewpoints and opinions. Everyone has their own perception of what is most common, so the definition varies wildly.
They're invested in PC gaming as social capital where the performance of your rig contributes to your social value. They're mad because you're not invested in the same way. People often get defensive when others don't care about the hobbies they care about because there's a false perception that the not caring implies what they care about is somehow less than, which feels insulting.
Don't yuck others' yum, but also don't expect everyone to yum the same thing.
Very well put! I'd also add that most people aren't even really conscious that that's the reason they're mad. There are ways to express your negative opinion without stating it as a fact or downplaying the other person's taste.
If you had a top-of-the-line PC in 2014 you'd be talking about a 290X/970/980, which would probably work really well for most games now. For CPU that'd be like a 4th gen Intel or AMD Bulldozer, which despite its terrible reputation probably runs better nowadays thanks to better multi-threading.
A lot of the trending tech inflating minimum requirements nowadays are stuff like raytracing (99% of games don't even need it) and higher FPS/resolution monitors that aren't that relevant if you're still pushing 1080p/60. Let's not even begin with Windows playing forced obsolescence every few years.
Hell, most games that push the envelope of minimum specs like Indiana Jones are IMO just unoptimised messes built on UE5 rather than legitimately out of scope of hardware from the last decade. Stuff like Nanite hasn't delivered on enabling photorealistic asset optimisation but HAS enabled studios to cut back on artist labour in favour of throwing money at marketing.
One upside of AAA games turning into unimaginative shameless cash-grabs is that the biggest reason to upgrade is now gone. My computer is around 8 years old now. I still play games, including new games - but not the latest fancy massively marketed online rubbish games. (I bet there's a funner backronym, but this is good enough for now.)
I'm still using the i7 I built up back in 2017 or so... Upgraded to an SSD some years ago, and will be upping the RAM to 64 gigs (the max the mobo can handle) in a few days when it arrives...
The experience of playing modern games on a modern AAA "high end" PC is obviously going to be better if you care about things like ray-tracing and high framerates or resolution. You can't really dispute that.
But it would be stupid to say you're wrong if you just want to play that same game on your system if it actually runs. If the game is playable and you're having fun, you're doing it correctly.
I only upgrade when I start to see multiple games a year that just straight up don't work on my computer.
Absolutely, it totally depends on what you got originally. If you only got an okay-ish PC in 2018 then it definitely won't still be fit for purpose in 2025, but if you got a good gaming PC in 2018 it probably will still work in another 5 years, although at that point you'll probably be on minimum settings for most new releases.
I would say 5 to 10 years is probably the lifespan of a gaming PC without an upgrade.
However my crappy work laptop needs replacing after just 3 years because it was rubbish to start with.
And even then, a few strategic upgrades of key components could boost things again. New gfx card, a better SSD, more/faster RAM, any of those will do a lot.
I built an overkill PC in February 2016. It was rocking a GTX 980 Ti a little before the 1080 came out, and it was probably the best GPU out there, factory overclocked and water cooled by EVGA. My CPU was an i5-4690K, which was solidly mid range then, but I overclocked it myself from 3.5GHz to 5.3GHz with no issue, and only stopped there because I was so suspicious of how well it was handling that massive increase. I had 2TB of SSD space, like 8TB of regular hard drives, and 16GB of RAM.
Because I have never needed to think about space, and so many of my parts were really overpowered for their generation, I have always been hesitant to upgrade. I don't play the newest games either; I still get max settings on Doom Eternal and Red Dead 2, which I forget are half a decade old. The only game where it's struggled on low settings is Baldur's Gate 3, unfortunately, which made me realise it's ready for an upgrade.
The computer I built in 2011 lasted until last summer. I smiled widely when I came to tell my wife and my friend, and my friend then asked why I was smiling when my computer no longer worked.
"Because now he can buy a new one," my wife quickly replied.
I've been rocking a Ryzen 2700X since 2018, or early 2019, and it's still working like a champ. Granted, Cities: Skylines 2 is a bit much for it, but I've been playing Baldur's Gate and Helldivers with about a 100 FPS average.
Mine is from 2011 and still going strong. It had some upgrades like extra RAM, an SSD and a new GPU a couple of years ago, and I had to replace the front fan. It starts making a horrible noise about 4 hours into a gaming session with a graphically demanding game, but apart from that it runs perfectly fine. I don't really play demanding games usually, so I don't really care. When it finally dies, I might just swap out the motherboard and CPU and keep the rest. It's my personal Ship of Theseus.
That CPU started as a development Linux workstation, then as a Windows gaming rig, then served a couple of years as an unRaid server, and now runs a Windows 10 workstation for my mother-in-law. Still fast enough for everyday use.
My i7-920 lasted a lot longer than I ever thought it would. I still have it, but I don't need the power anymore since I don't have time to PC game. Actually, it was in a P6T V2, and I think I replaced it with a Xeon processor.
IDK I have 200+ games and they all work. In terms of AAA I played all the recent Fallout, Doom, Tomb Raider and many others. I even played Hellblade in VR. Definitely good enough for me.
What sorta stuff do you play? I built an i5-2500K system a couple years back (2020-ish) and it struggled a fair bit, but was on the cusp of 1080p60 in the few games I tested, like Fortnite, F1 2019, Warzone etc.
I just don't play online games, never have.
I can play pretty much any single player/coop game at medium/1080. Maybe most recent titles like Elden ring would struggle, but I have hundreds of games in my library and they all work fine.
I even made a small VR project with it, although every manufacturer said it wouldn't work. The GPU is a 1060.
Overall, I've spent around $600 on this computer over 15 years, and it's still a perfectly capable PC. I have another PC and a MacBook for work, but the i5 has been our streaming/gaming PC for years.
Undervolting (when done correctly) won't damage PC parts.
Yes, it reduces the voltage supplied to the components, but CPUs and GPUs are designed to operate within a specific voltage range, and you keep the voltage within this range. Even if you reduce the voltage below the recommended range, the system may become unstable, but this doesn't cause damage; it simply results in crashes.
If not playing competitive, there's very little reason to go latest and greatest. Just buy something with software support, or use Linux where support is practically guaranteed for at least a decade
Linux is actually a problem area here, because various crucial libraries for running games have limited support for hardware that old. I tried for a long time to get it working with stuff from 2012; my problems disappeared after upgrading my CPU recently. Something with Vulkan compatibility, I think.
That is only really a problem for CPUs one would consider truly ancient today, like a Pentium 3 from 1999, because it doesn't have e.g. SSE2 support, which Wine (and AFAIK Vulkan) needs. Everything after that should work without any problems.
With older or slower CPUs performance may suffer, of course, but that is not a compatibility question.
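If you're not sure whether an old CPU has the instruction-set extensions this kind of tooling expects, here's a minimal, Linux-only sketch that just reads /proc/cpuinfo; the feature list is an illustrative pick on my part, not an authoritative requirements list for Wine or Vulkan:

```python
# Minimal sketch (Linux/x86 only): check /proc/cpuinfo for instruction-set
# extensions that modern game tooling tends to assume.
# The feature list below is illustrative, not an official requirements list.
FEATURES = ("sse2", "sse4_2", "avx")

def cpu_flags(path="/proc/cpuinfo"):
    # The "flags" line lists every feature the kernel detected on this CPU.
    with open(path) as f:
        for line in f:
            if line.startswith("flags"):
                return set(line.split(":", 1)[1].split())
    return set()

if __name__ == "__main__":
    flags = cpu_flags()
    for feat in FEATURES:
        print(f"{feat}: {'present' if feat in flags else 'MISSING'}")
```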
We reached the physical limits of silicon transistors. Speed is determined by transistor size (to a first approximation) and we just can't make them any smaller without running into problems we're essentially unable to solve thanks to physics. The next time computers get faster will involve some sort of fundamental material or architecture change. We've actually made fundamental changes to chip design a couple of times already, but they were "hidden" by the smooth improvement in speed/power/efficiency that they slotted into at the time.
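For anyone who wants the textbook version of why shrinking transistors used to mean faster chips, classic Dennard scaling (a general relationship, not anything specific to the hardware discussed here) roughly says that shrinking all dimensions by a factor $\kappa > 1$ gives:

```latex
% Classic Dennard scaling with linear dimensions shrunk by a factor \kappa > 1:
% voltage and capacitance fall with size, delay drops so frequency rises,
% and power density stays constant -- until leakage ended voltage scaling.
\begin{aligned}
V &\to V/\kappa, \qquad C \to C/\kappa, \qquad f \to \kappa f,\\
P_{\text{transistor}} &\propto C V^{2} f \;\to\; P_{\text{transistor}}/\kappa^{2},\\
\text{power density} &= \frac{P_{\text{transistor}}}{\text{area}} \;\to\; \text{constant (area also shrinks by } 1/\kappa^{2}\text{)}.
\end{aligned}
```

Once supply voltage could no longer keep dropping (leakage), that free ride on frequency ended, which is the "physical limits" the comment above is describing.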
Also... there were significant improvements with Intel Sandy Bridge (2xxx series), and the parent is using an equivalent to that. Sandy Bridge and later (OP seems to be on Haswell or Ivy Bridge) is truly the mark of "does everything"... I've only bothered to upgrade because of CPU-hungry sim games that eat cores.
Yeah, I'm with you anon. Here's my rough upgrade path (dates are approximate):
2009 - built PC w/o GPU for $500, only onboard graphics; worked fine for Minecraft and Factorio
2014 - added GPU to play newer games (~$250)
2017 - build new PC (~$800; kept old GPU) because I need to compile stuff (WFH gig); old PC becomes NAS
2023 - new CPU, mobo, and GPU (~$600) because NAS uses way too much power since I'm now running it 24/7, and it's just as expensive to upgrade the NAS as to upgrade the PC and downcycle
So for ~$2200, I got a PC for ~15 years and a NAS (drive costs excluded) for ~7 years. That's less than most prebuilts, and similar to buying a console each gen. If I didn't have a NAS, the 2023 upgrade wouldn't have had a mobo, so it would've been $400 (just CPU and GPU), and the CPU would've been an extreme luxury (1700 -> 5600 is nice for sim games, but hardly necessary). I'm not planning any upgrades for a few years.
Yeah it's not top of the line, but I can play every game I want to on medium or high. Current specs: Ryzen 5600, RX 6650 XT, 16GB RAM.
People say PC gaming is expensive. I say hobbies are expensive, PC gaming can be inexpensive. This is ~$150/year, that's pretty affordable... And honestly, I could be running that OG PC from 2009 with just a second GPU upgrade for grand total of $800 over 15 years if all I wanted was to play games.
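Just to sanity-check that per-year math, here's the arithmetic using the approximate figures quoted above (ballpark numbers, not receipts):

```python
# Ballpark sanity check of the cost-per-year figures quoted above.
build_2009, gpu_2014, build_2017, refresh_2023 = 500, 250, 800, 600
total = build_2009 + gpu_2014 + build_2017 + refresh_2023
print(f"~${total} over 15 years is about ${total / 15:.0f}/year")
# -> ~$2150 over 15 years is about $143/year, i.e. roughly the $150 quoted

print(f"~$800 (games-only path) over 15 years is about ${800 / 15:.0f}/year")
# -> about $53/year if all you wanted was the 2009 build plus GPU swaps
```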
Almost exact same timeline, prices and specs here. Just went with the RX 6600 instead, after hardware became somewhat affordable again following the crypto hype and COVID. I always bought the mid-to-low-end tier of then-current hardware whenever an upgrade was wanted/needed. It's good to read about non-high-end stuff all the time, though.
I only got the 6650 because it was on sale for $200 or something, I was actually looking for the 6600 but couldn't find a reasonable deal.
I make enough now that I don't need to be stingy on hardware, but I honestly don't max the hardware I have so it just seems wasteful. I probably won't upgrade until either my NAS dies or the next AMD socket comes out (or there's a really good deal). I don't care about RTX, VR kinda sucks on Linux AFAIK, and I think newer AAA games kinda suck.
I'll upgrade if I can't play something, but my midrange system is still fine. I'm expecting no upgrades for 3-5 more years.
Still on a 1060 here. Sure, it's too slow for anything from the PS5 era, but that's what my PS5 is for.
It does have a 1 in 4 chance of bluescreening when I quit FFXIV, but I don't know what's causing that. Running it at 100% doesn't seem to crash it, possibly something about the drivers not freeing shit properly, I dunno.
I upgraded last year from an i7-4700k to an i7-12700K and from a GTX 750 Ti to an RTX 3060 Ti, because 8 threads and 2GB of VRAM were finally not enough for modern games. And my old machine still runs as a home server.
The jump was huge and I hope I'll have money to upgrade sooner this time, but if needed I can totally see that my current machine will work just fine in 6-8 years.
I had an i5-2500K from when they came out (I think 2011? Around that era) until 2020, overclocked to 4.5GHz, and it ran solid the whole time. Upgraded graphics card, drives, memory, etc., but that was incremental as needed. Now on an i7-10700K. The other PC has been sat to the side and may become my daughter's or wife's at some point.
I just installed Linux on my old 2500K @ 4.5GHz system a few days ago! I haven't actually done much with it yet because I also upgraded the OS on a newer system that is taking over server responsibilities. But you are correct on the year with 2011. I built mine to coincide with the original release of Skyrim.
The install went quickly (Linux Mint, so as expected) and the resulting system is snappy yet full featured. It's ready for another decade of use. Maybe it will be a starter desktop for teaching my second grader (educational stuff, as well as trying a mouse for games compared with a controller).
I got screwed over with the motherboard, as it had to go back because of bimetallic contacts in the SATA ports that could wear out and stop it working, so there was a big recall of all the boards... It was an amazing system though, and if I hadn't seen the computer I'm currently running for an absolute steal, I'd probably still be running it with a 3060 as a pretty potent machine.
Of course, then I'd never have the experience of just HOW FAST NVME IS! :-D
I was rocking an i7-4790K and a GTX 970 until about 2 years ago; now I'm rocking an i5-10400F and one of Nvidia's chip-shortage-era RTX 2060s. My wife is still on an i5-4560 (from memory) and an RX 560, and that's really getting long in the tooth with only 4 threads, and the budget GPU doesn't help matters much.
Later this year, when Windows 10 gets closer to EOL, I figure I'll refresh her machine and upgrade the SSD in mine.
Phones suffer a lot from forced obsolescence. More often than not, the hardware is fine, but the OEM abandons it because "lol fuck you, buy new shit". Anyone that says that a Samsung S7 "can't handle current apps" is out of their mind
Other than camera and software, there's hardly any reason to buy new phones over flagships from some years ago.
This. My mobile is over 6 years old. Security updates only ran till 2022, but I don't even mind the missing sec updates. What concerns me more is the buy-a-new-phone-every-year-because-reasons mentality, because "buy new shit" and spy-bloatware. Skynet is the virus. My old one runs perfectly fine and I'll buy a new one when it breaks. Even critical apps like banking are doing fine. It's not like the whole architecture of the OS changes yearly, right?
Similar story. The only upgrades I made to my 2014 desktop were a 1TB SSD and a used RTX 2070 to play BG3 in 2023. I don't care much for the latest multiplayer shoot-em-ups with simulated leg hair growth, but I can play most other titles from the past year at the highest graphical settings.
I've never had a desktop PC. I have always wanted to build one but never had the money/time for it. I'd been using the same laptop since 2016 until it recently started breaking (mostly due to toddlers). I casually mentioned this to a friend and they offered me their kid's old PC for free. It was some 4th gen i3 with a 1050 Ti, so pretty old spec. I've upgraded it with an SSD, an i7-4790K and a 980 Ti, all for around $130. It is pretty decent for gaming and I've never had a game run poorly as of yet. Very happy with my 10 year old hardware!
Like, as in upgrading a component every few years? Sure.
Updating the entire rig every few years? For the average user, very little point. It used to be that you literally had to in order to play the newest games, around the '00s I'd say. Games are way more backward compatible nowadays. I had a rig from 2012, in which I upgraded the GPU to a 1060 6GB in 2016. Then I updated the entire rig last year, except for the GPU, which I plan to update in a few months when the 5070/5060 Ti comes out.
For the average gamer I don't think there's really much need to update more than every five years, and even that's still pretty fresh. I can still play on the 1060; even Marvels runs, although... eh. My GPU is clearly the bottleneck currently.
With a 5060/5070 I hope to manage till 2030 at least.
My old builds go to the wife, and her old PCs upgrade my NAS. By the time I'm done using the hardware it's 10-12 years old. The wife only plays sims anyways. Her upgrade is a 2080 Super on an AM4 3900X with 64GB of RAM and two 2TB NVMe drives. She will get plenty of life from that, and then in 5 years she'll get my current rig. The cycle continues.
My current PC used for gaming is a self built one from 2014. I have upgraded a few things during the years, most notably GPU and memory, but it did an excellent job for over a decade.
Recently it started to show its age with various weird glitches and also some performance issues in several newer games and so I've just ordered a new one. But I'm pretty proud of my sustainable computing achievement.
I've upgraded pretty much everything in my 2009 PC and only just finally bought a new CPU. I just need a new case for everything. The last straws were Elden Ring being CPU bottlenecked at 20 FPS and Helldivers 2 requiring some instruction that wasn't on my CPU.
I only stopped using my Bulldozer-era box because it started crashing and freezing, and a BIOS fix Asus support suggested nuked my board. I had the thing maxed out... 12 SSDs in soft RAID, GTX 570s in SLI. It was a monster. I still have most of the parts and I'm sure it would run a lot of stuff just fine, at the cost of heat and noise :]
My $90 US AWOW mini with a Celeron J4125, 8 gigs of shared memory and a 128-gig SSD seems to run FreeDoom as well as any of the other potatoes them GamerBoi fancy water-cooled custom boxes have...
That's where I was a couple years ago. Originally, I had an R9 290. Amazing card circa 2014, but its 4 GB of VRAM aged it pretty badly by 2020. Now I've got a 4070, which is way more than good enough for the 1080p60 that I run at. I'll upgrade the rest of the PC to match the GPU a little better in the future, but for right now, I don't need to. Except maybe for Stellaris.
But I just ripped a bunch of my old PS2 games to my PC because I felt like revisiting them. And my PS2 is toast. RIP, old friend. :(
Maybe it's just my CPU or something wrong with my setup, but I feel like new games (especially ones that run on Unreal Engine 5) really kick my computer's ass at 1440p. Just got the 7900 XTX last year, and I'm using a Ryzen 9 3900XT I got in 2020, for reference. I remember getting new cards like 10 years ago and being able to crank the settings up to max with no worries, but nowadays I feel like I gotta worry about lowering settings or having to resort to upscaling or frame generation.
Games don't feel very optimized anymore, so I can see why people might be upgrading more frequently, thinking it's just their PC being weak. I miss the days where we could just play games at native resolution.
This is on purpose. Game studios decided that instead of bothering with all sorts of complex graphics hacks to get games to run fast they can just crank ray tracing and use temporal anti-aliasing. The result being that you need one of the latest generation cards to run these games at all since they don't degrade gracefully to lower specs.
Until very recently I was still running a 1080, which runs pretty much any game (even recent ones) at high graphics settings. As soon as a game uses ray tracing or temporal anti-aliasing it won't even run at the lowest potato settings possible.
The Invincible should not look like this at <15 FPS, and be a blurry mess when moving, on minimum settings, while Halo Infinite looks way better while rendering way more things on the same machine at high settings at 60 FPS.
Yeah, I'm daily-ing a laptop from 2019 with an i7-9750, a GTX 1650, and 16 GB of RAM. No upgrades except storage. The GPU is the only thing that sometimes makes me go "hm."
I'm daily driving a laptop with an i7-9750H and a 1660 Ti. Unfortunately I had to convert it to desk-only, as the battery is dead and removed, and the touchpad seems to have also broken. Still, the CPU and GPU work fine. I still wonder if I will upgrade and if I can even afford it anymore. I bought this laptop for 800 new. Idk, I want a Framework just because of its repairable nature, but I would need to spend close to 2k to match the current 64GB RAM and 2TB of storage.
My old guy's battery is still fine (for reference, fine means about 2-4 hours of screen time, which is about the same as new), but that's only because I keep it locked to 60% and use it as a desktop. I would also want to upgrade to a Framework, but god, they're pricey. Especially the dGPU ones.
I still have my 2014 machine. I've upgraded it with an M.2 drive and more RAM. Everything else is perfectly fine and I wouldn't see the difference with a newer machine. I'll keep it for as long as I can, because the longer I wait, the better the machine I replace it with will be.
Also, I just wouldn't know what to do with it afterwards. I can't bring myself to throw away a perfectly good machine, but keeping it would be hoarding.
Upgrading my Ryzen 7 1700 and GTX 1080 to a 5800X3D and RX 7900 XT this weekend. Waiting for the CPU, but it's cool to be able to go from the first to the last gen that this motherboard can support.
Lol, this feels like exactly what the original greentext was saying. I posted them under a different comment; I also didn't say I was playing everything on high. Playing HL Alyx on medium rn. Sure, not as crisp an experience, but still really good.
It's easy to go too far in either direction instead of just doing what fits your needs (which in fairness, can sometimes be difficult to precisely pin down). Blindly going "it's old, I need to upgrade" or "it still runs, it's not worth upgrading" will sometimes be right but it's not exactly tailored advice.
Someone I know was holding out for ages on a 4790K (2014), and upgraded a year or two ago to a then-current-gen system and said the difference it made to their workflow was huge - enough that they actually used that experience to tell their boss at work that the work systems (similar to what they had had themselves) should get upgraded.
At the end of 2022 I had had my current monitor(s) for about 10 years and had spent years hearing everyone say "wow, upgrading my monitor was huge", either that 1440p was such an upgrade over 1080p and/or that a high refresh rate (120+Hz) was such an upgrade over 60Hz. I am (or at least was in the past) a pretty competitive player in games, so you'd think I'd be a prime candidate for it, but after swapping from a 60Hz 1200p screen to a 144Hz 1440p screen for my primary monitor I... honestly could barely notice the difference in games (yes, the higher refresh rate is definitely enabled, and ironically I can tell the difference easily outside of games lol).
I'm sensitive to input latency, so I can (or at least could, don't know if I still can) easily tell the difference between the responsiveness of ~90 FPS and ~150 FPS in games, so it's extra ironic that pumping the refresh rate of the screen itself didn't do much for me.
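A quick back-of-envelope on frame intervals helps explain why the jump can feel underwhelming: the absolute millisecond gap between rates shrinks fast as the numbers climb. This is only the per-frame interval and ignores the rest of the input/display latency chain entirely:

```python
# Back-of-envelope frame times: the absolute millisecond gap shrinks quickly
# as the rate goes up, which is one plausible reason a refresh-rate jump can
# feel smaller than the raw Hz/FPS numbers suggest.
for hz in (60, 90, 120, 144, 150):
    print(f"{hz:>3} Hz/FPS -> {1000 / hz:.1f} ms per frame")
# 60 -> 16.7 ms, 90 -> 11.1 ms, 120 -> 8.3 ms, 144 -> 6.9 ms, 150 -> 6.7 ms
```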
I noticed a night-and-day difference myself going from 60Hz to 120Hz, after waiting for years to do so. I noticed it immediately in first person games because things went buttery smooth.
I was with them until my girlfriend gifted me a 180Hz monitor last year and now I can't deal with less than 90 FPS so I had to finally upgrade my RX580 (I just found out it stopped getting driver updates in January 2024 so I guess it was about time). High refresh rates ruin you.
I'll do you two better: my computer's from 2012. I can even play modern games on high settings sometimes. It wasn't even a high-specced one at the time. I think I put about $1200 into the actual components AND monitor/keyboard.
Everyone's different. Maybe for you playing a game on "high settings" in 1080p@30 is enough but others might prefer 4k@60 or 1440p@100 or more fps. Also, define "modern".
My 2008 librebooted T440p ThinkPad
says hold my beer.
Browses the web like it's a 2025 desktop.
It's amazing.
Except for the compile times (it runs Gentoo :D)
Yea, for general computing a lot of older PCs are very manageable. I have an old 2008 unibody MacBook lying around that I had to use for a little while a few months ago, and it was perfectly usable on Mint. It even felt a lot better than a lot of newer machines, since Apple built them like tanks and their trackpads back then were so ahead of their time they easily beat out a lot of brand new machines.
What generation? People keep saying AMD is worthless for RT, but that's really only true compared to the RTX 4000 series and games like Cyberpunk.
Games with RT that I've played on AMD that ran well compared to Nvidia (because it's still a performance hog) are Metro Exodus Enhanced and Avatar: Frontiers of Pandora.
I thoroughly dislike upscaling and fake frames though, so other than games like that it's not for me, even if I had a 4080 Super instead of the 7900 XTX.
I recently repurposed a Xeon CPU/motherboard from 2012 to run my Proxmox server. Bought a rack mount case, Noctua fans, new RAM and a CPU cooler, and gave it a good thorough cleaning. Not blazing fast, but it does the job.
My current PC is an Asus ROG with a GTX 1070 (and a piece of shit screen that gets all fucky if it heats up) that I bought used back in late 2019. The old hard drive failed some time ago and I had to change it. Sometimes the main SSD seems to get strangely fucky too (BSODs followed by disk scans), as does the memory (BSODs about "corrupted_page_memory", plus complete freezes under Linux Mint where not even Ctrl+Alt+F1 worked), which makes me think the components aren't exactly high quality (considering how shitty the screen is, and Asus in general in the past years, that's no surprise).
Still, I fully intend to keep this bad boy as my main workhorse for at least another 2 years, possibly longer. After that, I'll probably relegate it to being the party game machine.
I built my current rig like a month before the COVID lockdowns. Still runs everything on high/ultra even without DLSS (because my 1660 Super is too old to have it) or FSR (in fact, turning FSR on usually makes things worse).
Really, the only recently released game that hasn't given me a consistent 60FPS@1080p is Starfield. But it does run; it runs at 30-40 most of the time and can hit 60 in interior cells, and I never had it crash on me the whole way through my one, solitary playthrough. Which says a lot considering the track record of stability and performance of Bethesda's games, and the fact that my hardware isn't even supported; it's technically below the minimum requirements.
I could say I still run my 2014 (or '15, I don't remember) PC, but it's been Ship of Theseus'd at this point; the only OG parts left are the CPU, PSU, case, and mobo.
My PC is still largely the same, in general spirit, as when I built it (c. 2014-2015), but I have had to upgrade some key components over time. First was the move from a 1TB WD Blue HDD to a Samsung 860 Pro 128GB SSD (for my OS drive), and, related to that, at some point soon after I moved my games drive from an HDD to an SSD. Next, I upgraded my GPU from an Nvidia GeForce GTX 760 to an Nvidia GeForce GTX 1080. This build state lasted a decently long time, until I switched from Windows to Linux, at which point I swapped my Nvidia GPU for an AMD Radeon RX 6600 (not exactly an upgrade, more of a side-grade) to improve the user experience. The most recent change (last year, iirc?) was upgrading my RAM from 8GB DDR3 to 16GB DDR3. My CPU (Intel Core i5-4690K) is really starting to show its age, though, so I've been wanting to upgrade it, but that would likely entail a near rebuild of my entire system, so I've been avoiding it; unfortunately, it's increasingly becoming more of an issue.
Same, same. Except I don't really play games, but use the computer for other hobbies. It's still plenty fast and does everything I need it to do. So why buy something that does exactly the same, just is newer and looks different?
When talking about hardware: if it works for you, keep using it till it doesn't. But when talking about desktop operating systems, you should be aware of when yours loses security update support and try to upgrade to a different one that works for you but has better security updates.
I actually wanted to upgrade my old PC (GTX 970) 2 years ago, but ended up buying a PS5 and a used MacBook for less than the PC upgrade would have cost. The PC still runs fine and is still in use. Also: the M1 MacBook Air is an emulation beast.
The hardware might've been cheaper, but a console will end up being more expensive than a PC in the long run. Not to mention that the only input method available is a controller. There are quite a few games I would only play with a kb+mouse, so for some people a console is not even an option.
If you don't play online and have the patience to wait until you can get games used or on a deal, consoles can still be worth it.
I'm a PC gamer, but the only games that I play online are in the realm of Minecraft, Fall Guys, Raft or Stardew Valley, which would run on almost any machine. I also don't really play shooters or strategy games, so there's basically nothing I'd need a mouse and keyboard for either.
I have even thought about just getting a PS or Xbox, but I ended up upgrading my PC a little to near-PS5 performance with a cheap used 5700 XT for a little more than half the price of a new PS5. But if you can't do that and would have to build something from scratch, keeping your old PC and getting a console might be worth it. Even more so now that you can get good deals on used current gen consoles.
My only concern would be whether anon maintains his PC. Sure, anon bought the PC in 2014 and never upgraded... but does anon at least open it up once in a while to clean it out or swap the thermal paste?
I always keep my PCs for about 8 years. Usually it is necessary to update the HDD/SSD and the GPU during that time, that is all. Mine will be 4 years old by the end of this year. I am now actively checking out 4TB SSDs in order to replace my current 1TB SSD.
This strategy may unfortunately stop working. With the advent of ARM in desktop PCs, PCs seem to be becoming more monolithic: RAM and GPU not swappable. I think MACs don't even allow you to plop in more RAM. I don't like this development.
What? What PCs are you talking about that don't have swappable components? And how are those relevant? Is "MACs" referring to something different than Apple?
This is just proving the OP's point. I have a PC from around 2013 with a 970, and I can still play all the games I'm interested in just fine (BG3, AW2, etc). If you absolutely need 4K 120 FPS at ultra in order to have fun, that's on you.
If you want to play at 15 FPS, go ahead. I never said you needed 120 FPS, only that the comment about performance is bullshit. The vast majority of gamers would laugh you out of the room.