Description

I'm using this machine for training and evaluating neural networks. It's also connected to the internet and serves as my server for quickly downloading and batch-processing data remotely.

My main goals were to keep the GPU power and memory high, keep the cost down, and keep it quiet.

The X99-E WS/USB 3.1 has enough space for four cards. The manual that ships with it has some out-of-date information that worried me; for example, it says you can only run two PCIe cards at x16 if they're in slots 1 and 3. Fortunately, this isn't true. I was also concerned that I would have to update the BIOS on the motherboard before I could use the CPU, but the board I was shipped (ordered from Amazon) already had the latest firmware.

This setup has enough space for 4 GPUs in theory, but in practice the MSI Gaming X takes up around 2.5 slots, so I would have to switch to completely different GPUs. With 4 GPUs the motherboard would also run each PCIe slot at x8, which can be a problem for some applications.

The 400Q case combined with the H80i help keep everything quiet. I work at the intersection of two busy streets, and I can't hear the fans over the noise of the streets.

I configured the fans so cool air enters from the front and leaves through the back. I used an extra fan from the H80i on the front of the case, and positioned the front fan that shipped with the case closer to the two GPUs. I found that the watercooling tubes from the H80i wanted to push on the top GPU, but zip-tying them together holds them back.

I had a little trouble figuring out the best way to run some of the cables, but eventually settled on running everything except the GPU power around the back of the motherboard tray. The ATX 12V power cables at the top of the motherboard had to stretch a little, but everything else was comfortable once it was organized. It might have made more sense to run the GPU power around the back as well, because right now it's very close to stopping the fans on the bottom GPU. But if I did that, I would have to route the main ATX power differently, because the two would overlap and make it hard to put the side panels on.

The additional hard drive in the photo is something else I had lying around and threw in.

I had some minor hiccups installing Ubuntu 16.04.3. Adding "nomodeset" to the GRUB boot options got me a GUI for installing and using Ubuntu until I could install the NVIDIA drivers.
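
For anyone hitting the same thing, the change is roughly this (a sketch; your default command line may already contain more than "quiet splash"):

    # one-time: at the GRUB menu, press 'e' and append "nomodeset" to the line starting with "linux"
    # to make it stick after install, edit the default kernel command line:
    sudo nano /etc/default/grub
    # change: GRUB_CMDLINE_LINUX_DEFAULT="quiet splash"
    #     to: GRUB_CMDLINE_LINUX_DEFAULT="quiet splash nomodeset"
    sudo update-grub

Once the NVIDIA drivers are installed you can remove "nomodeset" again.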

Comments

  • 26 months ago
  • 2 points

I have always been curious what these neural training PCs actually do for these networks. Is there a TLDR of why they need so much horsepower and what neural networks actually do?

Anyways, sick build, you are living the dream my dude.

  • 26 months ago
  • 2 points

thanks! tl;dr is that neural nets are just a big set of equations: multiplications and additions to convert from an input (like an image) to an output (like the image label: cat, school bus, flute, etc.). figuring out the right numbers for the equations requires trying out different combinations and tweaking them until converging on the right set of numbers (usually via a process called stochastic gradient descent). there's no shortcut for finding these numbers; you have to set up a machine like this to process data for days and wait for it to converge. when it's done you get something that can track faces, drive a car, give you better recommendations on Spotify, pick stocks, write music, etc. there are a couple demos you can run in your browser here: https://mil-tokyo.github.io/webdnn/ but the tech is already behind a lot of everyday products including speech recognition, translation, image tagging, and even some advertising.
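
in equation form, a single "tweak" step looks roughly like this (just a sketch: w is the current set of numbers, \eta is a small step size, and L measures how wrong the guess was on one example):

    w_{t+1} = w_t - \eta \, \nabla_w L(w_t)

the GPUs are what make evaluating L and its gradient fast enough to repeat this millions of times.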

  • 26 months ago
  • 1 point

Ohhhhh, thanks for the tldr!

  • 26 months ago
  • 1 point

When you put the CPU cooler on the back, is it an exhaust or intake?

  • 26 months ago
  • 3 points

From the fan position it's exhaust.

  • 23 months ago
  • 1 point

Correct, exhaust. Corsair also has a diagram showing this as the recommended design for their 400Q case.

  • 24 months ago
  • 1 point

Can you share why you went with the X99-E WS instead of the X299? Supposedly X99 is an outdated platform and it has terrible reviews.

Thanks for sharing your research process here (https://pcpartpicker.com/forums/topic/242541-deep-learning-with-2x1080-ti)

Very useful.

  • 23 months ago
  • 2 points

I went with the X99 because I was already spending a bunch on everything else and wanted a fast multicore CPU without spending a bunch more on the CPU. I've also seen other similar machines using this motherboard, and wanted to go with something I was sure would work. My guess is that in another year the price will come down and enough people will have built machines with X299 to make it the clear answer, but the time wasn't right for me yet.

  • 23 months ago
  • 1 point

Now that it's been 3 months since the build, how is it doing expectation-wise? Is it performing where you want it to?

  • 21 months ago
  • 1 point

It continues to run cool and quiet while letting me push things pretty far. I took it across the country in a flight case recently and didn't have any big problems. The GPU power cables started bumping into the fan after that, but the thumbscrews made it easy to remove the side panel and fix things.

  • 22 months ago
  • 1 point

I am using your design as a basis for my own deep learning machine. Imitation IS flattery. Thanks for sharing.

I have so many questions though...

I'm curious about the use of the GPUs in slots 1 and 5. Is the system configuring itself in triple PCIe mode, as shown in the manual, with the unused slot 3 configured as x16?

Do you get full PCIe 3.0 x4 bandwidth to the M.2 connector? And how do you use the M.2 connector in the first place? It looks like it's totally blocked by the GPU in slot 5. How fast does Ubuntu boot from the 960 Evo? Did you have any problems with NVMe?

  • 21 months ago
  • 1 point

I'll share what I can remember.

I know that using slots 1 and 5 gives me x16 in both. I didn't check to see how slot 3 is configured.

You must install the 960 Evo in the M.2 connector before installing a GPU in slot 5. It is inaccessible after installing the GPU.

PCIe 3.0 x4 bandwidth tops out at ~4 GB/s; the Evo claims 3.2 GB/s sequential read and 1.8 GB/s sequential write. Testing with the fio tool, I see 2.4 GB/s sequential read and 1.7 GB/s sequential write. I'm not sure why the read is lower than reported.
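
For reference, the tests were along these lines (a reconstruction with illustrative flags, not a copy of the exact invocation):

    # 1 MiB sequential reads on a 4 GiB test file, O_DIRECT to bypass the page cache
    fio --name=seqread --rw=read --bs=1M --size=4G --direct=1 --ioengine=libaio --iodepth=32
    # same thing for sequential writes
    fio --name=seqwrite --rw=write --bs=1M --size=4G --direct=1 --ioengine=libaio --iodepth=32

(fio leaves its test files behind, so delete seqread.0.0 and seqwrite.0.0 afterwards.)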

The output of systemd-analyze is: Startup finished in 11.599s (firmware) + 6.274s (loader) + 9.573s (kernel) + 12.931s (userspace) = 40.378s

Part of that might be my configuration. systemd-analyze blame reports the top offender as: 8.078s NetworkManager-wait-online.service
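
If you see the same offender, the usual fix looks like this (assuming you don't need the network fully up before login):

    # list the slowest units at boot
    systemd-analyze blame | head
    # stop the boot from waiting on the network
    sudo systemctl disable NetworkManager-wait-online.service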

I haven't had any NVMe problems.

  • 20 months ago
  • 1 point

Thanks for your help. I have gone ahead and bought my components. I'll put the build on partpicker uk when I have got it working.

  • 19 months ago
  • 1 point

Very nice build! I am planning on building mine and am researching what components to buy now. The plan is to build one with two 1080 Tis like yours for now, but with space to upgrade to 4 in the near future. Do you think this case is compatible?

Thanks for sharing the build!

  • 15 months ago
  • 1 point

Thanks for all this detail. I'm about to do the same thing, only want to put it in our server rack. Have you been happy with the build? Are there any components you would do differently?

Thanks!! -Jeff

  • 13 months ago
  • 1 point

Greetings! Were Ryzen CPUs considered for the build and eliminated for some reason? Thanks

[comment deleted]