My favorite example of this sort of nonsense was an advertising image I saw when I was looking for a digital microscope. Had some very tiny wires to solder and wanted to get a feel for prices.
I've done it twice actually... But I come from an embedded engineering background.
Replaced some dead caps on an expensive GPU once, and the other time it was a laptop where some of the GPU memory connections had broken (IDK really how it happened; it was my boss's personal machine, so few questions were asked).
In the latter case we desoldered all the tantalum caps, put the motherboard in our reflow oven, then resoldered the tantalums. The fear was that the tantalums wouldn't survive the oven we used for prototypes in the R&D department I was in at the time (I count this as IT, as the admin was also an R&D developer).
Both times it worked.
With that said, I don't think that I've even seen a soldering station in an IT department since the mid 00s.
There are a few tricks you can do in overclocking where you replace shunt resistors. They bypass power limit protections by making the board think it's drawing less power than it actually is.
That and replacing dead caps is about the only reason to touch a soldering iron to a GPU.
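For anyone curious why a shunt swap works: the board infers current from the voltage drop across a shunt of known resistance, so halving the shunt halves the reported power. A rough sketch of the math (all the resistance and current figures below are illustrative assumptions, not values from any real card):

```python
# How a shunt-resistor swap fools power limits (illustrative numbers).
# The board infers current from the voltage drop across a known shunt:
#   I = V_drop / R_shunt, so P = V_rail * I
actual_current_a = 30.0
rail_v = 12.0
stock_shunt_ohm = 0.005      # assumed 5 mOhm stock shunt
modded_shunt_ohm = 0.0025    # assumed half-value replacement

v_drop = actual_current_a * modded_shunt_ohm  # drop across the new shunt
# The firmware still divides by the *stock* value it was calibrated for:
reported_current_a = v_drop / stock_shunt_ohm
reported_power_w = rail_v * reported_current_a
actual_power_w = rail_v * actual_current_a

print(f"Actual:   {actual_power_w:.0f} W")    # 360 W
print(f"Reported: {reported_power_w:.0f} W")  # 180 W
```

The card then happily keeps boosting, since from its perspective it's well under the limit.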
When I worked at a computer shop, if I thought a power supply was iffy, I would plug my backup in outside of the case just like that. It was great for diagnosis.
I would also solder some laptops, DC jacks mostly. But not like that.
I’ve used an extra PSU to power a graphics card. Things weren’t properly compatible, so I had to improvise a bit.
When you start the computer, you also need to use a paper clip to start the second PSU, because otherwise the graphics card will scream in terror until you give it the power it demands. It was probably the most ghetto style computer I’ve ever had.
There was a point around the early 2000s where having a second PSU was a possibility for overclocking. I've still got my modified case with a second PSU in the optical drive bays.
From memory, the Pentium 4 would draw something like 120W, the hard drives would add a bit more, then the graphics card on top of that, and if you were pushing your limits, you'd have loads of fans and maybe a peltier cooler. Peltiers are now known to be massively inefficient, but we thought they were great at the time.
On top of that, you could usually only get low-powered PSUs at the time. 350W and 500W were the norm, and you could get 650W if you were lucky. 800W units were seen in magazines, but you'd have to remortgage your parents' house to get one.
And those 350W ones might catch fire if you tried to pull more than 200W. There's an old video of a 300W supply being pulled at its max rating, and it was taking 900W from the wall. That's 600W it's turning into heat.
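The arithmetic on that claim (using the figures quoted above, which are from the video, not my own measurements) works out like this:

```python
# Rough efficiency math for the old 300W supply mentioned above.
# Figures are the ones quoted in the thread, not measured by me.
rated_output_w = 300   # what the PSU delivered at its max rating
wall_draw_w = 900      # what it pulled from the mains

efficiency = rated_output_w / wall_draw_w
waste_heat_w = wall_draw_w - rated_output_w

print(f"Efficiency: {efficiency:.0%}")   # 33%
print(f"Waste heat: {waste_heat_w} W")   # 600 W
```

For comparison, even a baseline 80 PLUS certified unit has to hit at least 80% efficiency at typical loads, so roughly 33% is spectacularly bad.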
Johnnyguru may have singlehandedly fixed the whole PSU market. There was so much garbage back then, and few other places were giving them the tests they needed.
Spot on. I used to run a second PSU for my peltier cooling back in those days when overclocking. PSUs were also not nearly as powerful as they are now. 300w was average.
Excellent form, your non-soldering hand must have good contact with the PCB. Then you wave the iron like a wand, and incant "solderus fluxus", pay special attention to your pronunciation, and try not to blink as you envision the tiny components rearranging before you.
Tbf, I once ran a PC with two PSUs at the same time, because I suspected one was overloading but had no second one powerful enough to run the whole system. It kinda worked, but the system broke down due to other reasons...
I bought a Cooler Master Stacker 810 back in '07 almost exclusively because it could fit two PSUs. All the cool kids over at XtremeSystems were doing it, so teenage me thought I should as well.
I never got around to needing another PSU, but I did learn to jump start an ATX PSU, and I still have the case.
Yeah, my two main servers use redundant power supplies, my AI GPU server has no fewer than five non-redundant power supplies, and my partner's and my gaming rigs have two each, one for the damn GPUs and one for the rest of the system.
I have daisy-chained two PSUs to power a motherboard with 4 GPUs. Would recommend buying appropriate gear unless you like the smell of melted electrics.
Doesn't need it, the CPU is only 350W TDP for the 96-core variant, but a rig like that tends to also be loaded with 2-4 GPUs for compute workloads and a fuckton of ECC memory, which tends to use (far) more power than standard DIMMs.
I'm installing a (1+1 redundant) 1200W PSU for now, as I initially will only have a single GPU and a single DIMM per memory channel to do the platform validation.
In your typical gaming setup, a single PSU would be perfectly fine; even an 850W unit would probably do, as no game will max out all the cores anyway, and Threadripper cores are ludicrously well optimized for power, especially compared to anything Intel offers in the desktop, workstation or server market.
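For anyone sizing a PSU for a build like the ones discussed above, a quick back-of-the-envelope budget is usually enough. The component wattages below are illustrative assumptions (only the 350W CPU TDP comes from the thread); check your actual parts:

```python
# Rough power-budget sketch for sizing a PSU on a workstation build.
# All wattages except the CPU TDP are illustrative assumptions.
components_w = {
    "cpu": 350,               # 96-core Threadripper TDP, per the thread
    "gpu": 300,               # one compute GPU under load (assumed)
    "memory": 60,             # a handful of ECC DIMMs (assumed)
    "drives_fans_board": 80,  # everything else (assumed)
}

total_w = sum(components_w.values())
headroom = 1.3  # ~30% margin for transients and PSU derating
recommended_psu_w = total_w * headroom

print(f"Estimated load: {total_w} W")
print(f"Suggested PSU:  {recommended_psu_w:.0f} W")
```

With those numbers you land just over 1000W, which is consistent with the 1200W (1+1 redundant) unit mentioned above being comfortable for a single-GPU validation setup.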