==Documentation==
 
The documentation from NVIDIA is here:
*https://docs.nvidia.com/dgx/digits-devbox-user-guide/index.html
*https://developer.nvidia.com/devbox
 
Unfortunately, the form to get help is closed:
*https://info.nvidianews.com/early_access_nvidia_3_15.html
 
Other people who have bought one include:
*https://www.pyimagesearch.com/2016/06/06/hands-on-with-the-nvidia-digits-devbox-for-deep-learning/
 
Other people who have built one include:
*Hardware spec only:
**https://www.azken.com/download/DIGITS_DEVBOX_DESIGN_GUIDE.pdf
**https://www.reddit.com/r/buildapc/comments/3gewmz/build_complete_nvidia_digits_devbox/
**https://cellmatiq.com/?p=155
**http://graphific.github.io/posts/building-a-deep-learning-dream-machine/
*Better instructions:
**https://medium.com/yanda/building-your-own-deep-learning-dream-machine-4f02ccdb0460
 
Information on related builds or on installing OS and the key software components include:
*https://www.oreilly.com/learning/build-a-super-fast-deep-learning-machine-for-under-1000
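 
Once the OS and driver are in place per guides like the one above, a quick sanity check is to query the GPUs before installing the deep learning frameworks. This is a minimal sketch, assuming the NVIDIA driver is installed and <code>nvidia-smi</code> (which ships with it) is on the PATH:
<syntaxhighlight lang="python">
# Sketch: confirm the NVIDIA driver and the GPUs are visible after the OS install.
# Assumes nvidia-smi (installed with the driver) is on the PATH.
import subprocess

result = subprocess.run(
    ["nvidia-smi",
     "--query-gpu=name,driver_version,memory.total",
     "--format=csv,noheader"],
    capture_output=True, text=True, check=True)
print(result.stdout)  # one line per GPU: name, driver version, total memory
</syntaxhighlight>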
 
Some firms, including Lambda Labs and Bizon-tech, are selling variants of these boxes, but the details of their specs are limited (the motherboard and configuration details are missing entirely):
*https://lambdalabs.com/deep-learning/workstations/4-gpu
**https://pcpartpicker.com/b/FGP323
*https://bizon-tech.com/us/bizon-g3000
*http://deeplearningbox.com/
 
And the original is currently unavailable from Amazon:
*https://www.amazon.com/Lambda-Deep-Learning-DevBox-Preinstalled/dp/B01BCDK1KC
 
At around $15k (the Lambda variants run from $10k to $23k), buying one is prohibitively expensive for most people, but the parts for the original spec now cost perhaps $5k.
 
==Hardware==
We mostly followed the original hardware spec from NVIDIA, updating the capacity of the drives and other minor things, as we had many of these parts available as salvage from other boxes. We did, however, have to buy the ASUS X99-E WS motherboard (as well as some new drives) just for this project. The original spec is here:
*https://developer.nvidia.com/devbox
We opted to use a Xeon E5-2620 v3 processor rather than the Core i7-5930K (which we did have available). Both provide 40 PCIe lanes and mount in the LGA 2011-v3 socket, and both have 6 cores, 15 MB of cache, etc. The i7 has a faster clock speed, but the Xeon takes registered (buffered) ECC DDR4 RDIMMs, which means we can put 256 GB on the board rather than just 64 GB. For the GPUs we have a TITAN RTX and an older TITAN Xp available to start; we can add a GTX 1080 Ti later, or buy additional GPUs if needed. We put the whole thing in a Rosewill RSV-L4000 case.
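As a minimal sketch of what this mixed-GPU setup looks like from software (assuming a CUDA-enabled PyTorch install, which is not part of the hardware spec above), the cards can be enumerated like this; the TITAN RTX and TITAN Xp report different memory sizes and compute capabilities, which matters when splitting work across them:
<syntaxhighlight lang="python">
# Sketch: enumerate the installed GPUs with PyTorch.
# Assumes a CUDA-enabled PyTorch build on top of a working driver install.
import torch

if not torch.cuda.is_available():
    raise SystemExit("No CUDA-capable GPU visible; check the driver install.")

for i in range(torch.cuda.device_count()):
    props = torch.cuda.get_device_properties(i)
    print(f"cuda:{i}: {props.name}, "
          f"{props.total_memory / 1024**3:.1f} GB, "
          f"compute capability {props.major}.{props.minor}")
</syntaxhighlight>
On this box that should list cuda:0 and cuda:1 with roughly 24 GB and 12 GB respectively.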
Old notes on a prior look at a [[GPU Build]] are on the wiki too.
 
We would have had to buy many more parts to build the more recent variants from Lambda Labs, Bizon-tech, etc., and it isn't clear that they would deliver much more performance than ours will. Their specs are here for reference (it isn't clear which motherboard they are using):
*https://lambdalabs.com/deep-learning/workstations/4-gpu
*https://bizon-tech.com/us/bizon-g3000
