So, the PowerEdge R820 has a PCIe x16 slot, but there is no auxiliary power cable for graphics cards that require one, and no place to plug in such a cable either. At first glance, it also does not appear that there is any convenient place to tap +12V power within the system.
However, along the front edge of the motherboard (behind the front drive bays) is a set of 6 connectors for the drive bay backplane. 3 of the connectors carry backplane control signals, and the other 3 carry backplane power. On my system, only 2 of the control and 2 of the power connectors are used. The rightmost power connector, labelled "BP3", is available.
The backplane power connectors are 8-pin Molex 43025-series connectors. By visually inspecting the cables going to BP1 and BP2, and probing with a voltmeter, I was able to identify 3 ground pins and 3 +12V pins. (The remaining two pins are both gray wires on the BP1 and BP2 cables; one reads +3.3V and the other nothing. I am guessing the extra gray wire is a 3.3V sense signal. I simply left both pin positions that carry gray wires unpopulated.)
The mating plug that fits the motherboard backplane power connectors is Molex part #43025-0800. The insert pins for the plug are Molex part #43030-0001. You can use the official crimp tool, or you can try to do it by hand. I soldered my wires into the pins, but be careful not to use wire or excess solder so thick that the pin no longer fits into the connector housing afterward. I bought these parts from Mouser.
For the cable, I used a spare PCIe auxiliary power cable that came with a PC power supply with detachable cables. After assembly, the finished cable measures exactly 20.75 inches between the connectors, which is perfect for my setup. If your graphics card is a little shorter, or its PCIe auxiliary power connector is in a different location, you may need a slightly longer cable. But don't make it too long, because there isn't much room to stuff any excess.
A photo of the pinout diagram is in the gallery. Study it carefully: it shows the pin functions looking into the end of the BP3 connector on the motherboard. Double-check everything with a voltmeter before you risk frying your server or GPU.
The AMD FirePro W7000 graphics card I used was a decent workstation card I had lying around, and it barely fits in this slot. It originally had a metal RAM heatsink extending from the bottom side of the card, making the card much longer, but I removed it. Just be sure to put the screws back in the card to hold the top-side heatsink/fan assembly on securely. :) I don't think the RAM really needs a huge heatsink, and there is plenty of cool airflow in this server. No problems so far...
The back edge of the GPU card comes right up to the top CPU daughterboard, and the edge of the PCB rests directly on top of a metal bracket near the rightmost RAM slots. There is a PCB trace on the bottom of the GPU right along this edge, so I covered it with a strip of electrical tape so it doesn't wear through and short out against this metal frame.
There is very little clearance above the top of the GPU for air intake into its cooling fan. With forced airflow through the server I don't think it needs much, but I put a small rubber foot (about 3/16" thick) on the top corner of the card near the PCIe auxiliary power connector. When the server's top cover is installed, the foot gently pushes the card down and creates a small gap where air can be drawn into the fan. The cover was extremely difficult to slide off again against the bare rubber, so I added a dab of lithium grease on top of the foot to lubricate it a bit. Now it's ok.
The auxiliary power cable bends down and to the right under the card and routes back along the top right edge of the server, where there is a perfect channel for cables. It then comes out just in front of the system fans and curls around behind the drive backplane to plug into the BP3 connector. See the gallery photos. The routing is very clean and doesn't get in the way of serviceability or anything.
I used the latest AMD Radeon Pro WHQL drivers (17.q4.1) with no issues at all. Once they are installed, you can attach up to 4 monitors to the FirePro W7000 (it has 4 DisplayPort connectors, which apparently also support audio). You can pick your primary display, and Windows will switch to it during the OS boot process. The server's built-in VGA graphics can be used as an extended desktop or simply disabled. I noticed that when it is set to disabled, the last image from the boot screen just stays on that monitor after Windows switches over to the W7000. I kind of wish Windows would black it out or turn it off, but oh well. Just don't plug a monitor in.
I didn't mess around with the BIOS (it takes FOREVER to reboot this server), so I don't know whether there's a way to tell it which graphics card to use for POST, early boot, etc. I wouldn't recommend changing that anyway, since iDRAC uses the built-in VGA for remote access. You probably want to keep that, or at least a VGA monitor, connected for management purposes.
Finally, I decided to "hack" my temporary install of Windows 10 to support all 4 CPUs installed in this server. Windows Home only supports 1 CPU socket, and Pro only 2; you normally have to use a "Server" edition of Windows to enable more sockets. However, I found a little tool that modifies the registry and seems to work just fine, even on Home. (Of course, I did all this testing with an unactivated install of Windows, just in case it all went south...) The tool is called "ProductPolicyEditor" and can be found here. Read about it here. Changing these registry values requires disabling Windows' Software Protection Service (sppsvc), which is apparently also involved in Windows activation. If you re-enable the service, the values are changed back on the next reboot, but you can run the tool again if needed. The tool reboots into setup mode, lets you disable sppsvc, and modifies the ProductPolicy registry values; after rebooting, they take effect. I make no claims regarding the utility, safety, security, or legality of this tool. I didn't write it. Use at your own risk.
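If you want a quick sanity check that the OS really does see the extra processors after a tweak like this, a tiny cross-platform Python snippet works. Note this is just a generic check I'm suggesting, not part of the tool: `os.cpu_count()` reports logical processors, not sockets, so you need to know your own cores-per-socket and threads-per-core to interpret the number.

```python
import os

# Logical processors the OS exposes. On a 4-socket R820 this should be
# sockets x cores-per-socket x threads-per-core; if Windows is still
# capped at 2 sockets, the number will come up short.
visible = os.cpu_count()
print("Logical CPUs visible to the OS:", visible)
```

Run it before and after the registry change; if the count doesn't roughly double going from 2 usable sockets to 4, the hack didn't take.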
This is useful for quick and dirty testing, but if you are running a system with more than 2 CPU sockets you should probably just use a Server edition of Windows, or some other operating system that supports your hardware. :)
A variety of 3D games I had lying around ran perfectly, using a remarkably tiny amount of CPU and GPU resources. :)
More photos, etc. can be found at this Dropbox link.