It happens to the best of us. We power up our project and immediately run into issues. Be it spotty communication, a microcontroller reset, or any number of bugs that have us mystified and picking through our code… only to find that it's a power supply issue. Anyone who has tried doing Raspberry Pi work while depending on the USB power from their PC has certainly been bitten by this. It's the same with larger, more power-hungry projects as well. [Nerd Ralph] has been running a mining rig for a few years now and has learned just how important proper power supply management can be. His strategy involves using interlocks to ensure everything powers up at the same time to avoid feedback problems, running a separate ground wire between all GPU cards and the PSU, and running the supplies at 220 V (for the North American folks).

For the voltage setting, I thought that wattage equaled volts times amps? Shouldn't the waste heat be the same regardless of whether it's drawing ten amps at 120 V or five amps at 240 V? Or is it more complex and nonlinear than that? I'm curious.

Five amps versus ten amps is a big difference in heating in the power factor correction circuit of his power supply: the delivered power is the same either way, but the conduction losses in the input stage go roughly as I²R, so halving the input current cuts that heat to about a quarter. If you find a data sheet for simila...
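To put rough numbers on that reply, here's a minimal sketch comparing the I²R conduction loss in a supply's input stage when it is fed from 120 V versus 240 V mains. The 1200 W draw and the 50 mΩ effective series resistance are made-up illustrative values, not figures from [Nerd Ralph]'s rig or any particular PSU datasheet.

```python
# Rough comparison of input-stage conduction (I^2 * R) losses at 120 V vs 240 V mains.
# The load power and effective series resistance are illustrative assumptions only.

LOAD_WATTS = 1200.0    # assumed power drawn by the rig through the PSU
R_INPUT_OHMS = 0.05    # assumed effective resistance of the rectifier/PFC/wiring path

def conduction_loss(mains_volts: float) -> float:
    """Return the I^2 * R heat dissipated in the input stage at a given mains voltage."""
    current = LOAD_WATTS / mains_volts   # P = V * I, so I = P / V
    return current ** 2 * R_INPUT_OHMS

loss_120 = conduction_loss(120.0)
loss_240 = conduction_loss(240.0)

print(f"120 V mains: {LOAD_WATTS / 120.0:.1f} A, ~{loss_120:.2f} W lost as heat")
print(f"240 V mains: {LOAD_WATTS / 240.0:.1f} A, ~{loss_240:.2f} W lost as heat")
print(f"Loss ratio: {loss_120 / loss_240:.1f}x")  # halving the current quarters the resistive loss
```

Switching and magnetics losses don't split quite as cleanly, but the resistive part is the main reason a supply runs cooler and a bit more efficient on the higher mains voltage.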