When the personal computer first arrived on the scene, you could build your own machine, tailored to your needs, for less than a commercial system cost. Today PCs are cheaper to buy than to build, so building your own makes little sense unless you cannot get what you want off the shelf. Still, building your own computer is much like working on your car was in the 1950s-1970s: the PC, like the car, is no longer a black box.
Interestingly, the growth of computer server farms has led to a potential resurgence in DIY PC building. How is that? Server farms refresh their hardware every 3-5 years, and the old parts are sold as scrap. Why would you want yesterday's computer parts? Because some of them, while no longer the latest hardware, still have plenty of useful life. In particular, 8-core CPUs such as the Intel Xeon E5-2670 and its equivalents make excellent workstations for those on a budget. A current Intel i7 chip with 8 cores runs about $1,700; what originally cost $1,500-2,000 per chip can now be purchased for 4-8 cents on the dollar.
As with all optimizations, what were minor cost components (motherboard, memory, disks) now make up a more significant portion of the purchase. Since one of our employees decided he wanted to build a computer, we chose pre-owned Xeons for our build. What follows is our journey to build a DIY workstation.
We decided to test two different motherboards: a Dell T5610 motherboard and an ASUS Z9PA-D8, both dual-CPU boards. They normally run $240-310 depending on the source; we were able to get the Dell board for $150 (tax included). To accommodate the motherboards we purchased a used Cosmos 1000 computer case, and we opted to actively cool each CPU with a Cooler Master Hyper 212 EVO. As both motherboards are quad channel (read: install memory in groups of four), we chose four 8GB DDR3 ECC Kingston sticks. For the operating system we cloned Windows 7 Professional using Macrium Reflect (disk cloning software); once everything works, we will acquire the appropriate licenses.
Learning #1: The Dell motherboard is huge (13"x14"). While it technically could fit in the Cosmos 1000 case, the alignment of the motherboard with the I/O opening is difficult to achieve. Hence we opted to use the ASUS motherboard (12"x10") in the case. We will save the Dell motherboard for a DIY computer rack.
Learning #2: The Dell motherboard requires proprietary power cables. You will need two 5-pin to 4-pin converter cables if you decide to use cooling fans.
Learning #3: Not all 8-pin cables are the same. You will need EPS 8-pin power cables for either motherboard, plus an 8-pin Y cable to split power to the two CPUs. A spare 8-pin cable at least 8 inches long is also needed, as the two 8-pin connectors on the Dell motherboard are ~14 inches apart. Fortunately, the ASUS motherboard has its CPU power connectors side by side.
Learning #4: The CPU fans are great, but they take up a lot of space (6.5" above the motherboard). If you are trying to cram as many DIY servers as possible into a DIY rack, opt for passive cooling instead, which should let you go from ~5U to ~2U per server. The fans do fit in the Cosmos 1000 case.
We installed all the components but could not get the system to POST (it did not pass the power-on self-test). The troubleshooting procedure calls for removing components one at a time to see if anything changes. With the exception of the memory, nothing made a difference. After removing the hard drives and CD-ROMs, we removed one CPU - and the system passed POST. Now to figure out why the second CPU kept us from passing POST. We re-installed the second CPU to duplicate the error. Now no power at all. Bad power supply? A short?
Since we tested the power supplies (PSUs), we are confident they are not the issue. We returned the ASUS motherboard to MacMall without issue and got a partial refund on the Dell motherboard. To shortcut our testing, we bought a Dell T5610 workstation case that included everything but the CPUs and a hard disk. What a bargain - for $449 we got everything we wanted. Add $120 for the Xeon E5-2670s and ~$100 for a hard disk (though we already had disks on hand).
The Dell workstation case was a pleasant surprise: nice cable management, easy access to the components. We removed the CPU coolers, inserted the Xeons, applied thermal compound, re-installed the coolers, and proceeded to POST. No issues. We had been worried that the CPUs were the problem; they were not. This simply means we will have to install Windows 7 Pro from disk. Meanwhile, we proceeded with a CentOS 7 (Linux) install - smooth as silk. The experience restored our faith in our abilities but spooked us when it comes to building a computer from scratch. We now have a 16-core (32-thread) workstation for essentially $670. Not bad, considering these workstations cost 10x that or more when they first came out; current pricing is still around 4x what we paid for our parts.
Dell machines can be frustrating unless you know the quirks. The Dell T5610 has RAID capability built in - not bad, until you realize that Windows does not recognize it. Even worse, we could not find the appropriate driver on the Dell support site. The easy fix: boot, hit F12 to enter the BIOS configuration, and change the SATA option from RAID ON to AHCI. Then you will be able to see the disks.
Second caveat: set the BIOS option to manually select the graphics card you want. In our case we had two GPUs, a Quadro 600 and a GeForce GTX 1060. We wanted to use the 1060, but auto mode selected the Quadro card.
So what fun is having a lot of cores if you cannot use them? We trained a series of neural network models on 514 stocks and futures (time series) over 15+ years - one model per time series. The code was written in Python using the Keras deep learning library. In previous parallel studies we ran our models on an Intel i7 chip with 4 cores (8 threads). Would the job scale on the new workstation?
| Threads | Wall clock time | CPU |
|---|---|---|
| 8 | 245.7 | Single-socket Intel i7-2630QM (4 cores) |
| 32 | 59.8 | Dual-socket Xeon E5-2670 (16 cores total) |
Since the training was designed to be embarrassingly parallel, we had a good chance that the wall clock times would scale nicely. We were pleasantly surprised to see that, even with disk I/O as a potential bottleneck, we scaled essentially perfectly: 4x the threads yielded a 4.1x reduction in wall clock time (245.7 / 59.8). This is primarily a computationally intensive job, though. How would the workstation do on other consumer-type applications?
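The per-series training parallelizes cleanly because each model is fit independently. Here is a minimal sketch of the pattern, assuming a hypothetical `train_one` as a stand-in for the real Keras fitting code (the actual training script is not shown in this article):

```python
# Sketch of the embarrassingly parallel layout: one independent training
# job per time series, fanned out over a process pool.
from multiprocessing import Pool, cpu_count

def train_one(name):
    # Hypothetical stand-in: the real version would load the series
    # `name`, build a Keras model, fit it, and return a metric.
    return name, f"model_{name}"

def train_all(names, workers=None):
    # Each job is independent (no shared state), so wall clock time
    # should drop almost linearly with the number of workers.
    with Pool(workers or cpu_count()) as pool:
        return dict(pool.map(train_one, names))

if __name__ == "__main__":
    print(train_all(["AAPL", "CL=F", "ES=F"], workers=2))
```

With 514 series and a pool sized to the machine's 32 threads, this layout matches the near-perfect scaling shown in the table above.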
We normally use a Dell XPS 8100 (i7-860 @ 2.8 GHz, 4 cores) with Adobe Premiere CS5 to render our video interview series for the Crighton Theatre. Rendering can run anywhere from real time down to 1/12 of real time; special effects - not normally used in the interview show - can be even slower, say 1/60 of real time. For our test we had a 72-minute interview: mostly straight cuts, no special effects, typical transitions. The workstation rendered the video in 43 minutes 10 seconds, but we noticed that the CPU load hovered around 50% (40-60%). On another video (2 hr 15 min long), the XPS took 2 hr 49 min to render while the workstation took 1 hr 34 min - a 1.8x speedup.
Does Premiere only use the physical cores? Is it limited in the number of threads it can use?
It turns out the software was simply not being pushed hard enough. Having only dual-layer DVDs on hand, we decided to make a high-quality rendering of a video (6 Mbps average, 8 Mbps max) instead of the usual fit-on-a-disk encode (~2.8 Mbps average), producing a DVD with about 8 GB of video. Now all 32 threads were saturated. The workstation rendered the video in 23 minutes 7 seconds; with a quarter of the threads, the XPS 8100 took 67 minutes 57 seconds on the same job. That is only a 2.94x speedup. Since the workload appeared to be split evenly across all threads, other factors (disk, perhaps) may be limiting the speedup. Still, Adobe Premiere appears able to use all threads when processing video, provided there is enough work to go around.
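The quoted render times work out as follows; this is just arithmetic on the numbers above, with parallel efficiency taken as the speedup divided by the increase in thread count:

```python
# Check the rendering speedup quoted above: XPS 8100 (8 threads)
# vs. the Xeon workstation (32 threads) on the high-bitrate job.
def to_minutes(m=0, s=0):
    return m + s / 60

xps = to_minutes(67, 57)         # 67 min 57 s on 8 threads
workstation = to_minutes(23, 7)  # 23 min 7 s on 32 threads

speedup = xps / workstation      # ~2.94x
efficiency = speedup / (32 / 8)  # fraction of the ideal 4x achieved
print(f"{speedup:.2f}x speedup, {efficiency:.0%} efficiency")
```

So the saturated job runs at roughly 73% parallel efficiency; the earlier, lighter render (1.8x with the CPU only ~50% loaded) did even worse, which fits the observation that Premiere needs enough work to keep all threads busy.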
The coolest part about this workstation? Seeing all those available threads for work...