I have wanted a RAID setup ever since the 1990s, back when I would get computer magazines from my neighbour. One of the articles featured perfect setups, which included an external RAID array. I soon learned that RAID (originally redundant array of inexpensive disks, now commonly redundant array of independent disks), particularly RAID 5, can provide a great balance of storage and redundancy: if a single drive in the array fails, no data is lost.
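To put a number on that trade-off (my own illustrative figures, not from the magazine article): RAID 5 spends the equivalent of one drive on parity, so usable space is (number of drives − 1) × drive size.

```shell
# RAID 5 capacity rule of thumb: parity costs one drive's worth of space.
# Example numbers (three 2TB drives) - a minimal sketch, not this build's specs.
drives=3
size_tb=2
usable_tb=$(( (drives - 1) * size_tb ))
echo "RAID 5 usable space: ${usable_tb} TB"   # prints: RAID 5 usable space: 4 TB
```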
Along with this, I have a dangerous combination of being paranoid about losing data and rarely making backups (and I never learn, having lost a 60GB hard disk around 12 years ago with only the most crucial data recovered).
Keep in mind a RAID is not a backup solution. If the house burned down and I had nothing off-site, I would be very much out of luck. For many other (usually less catastrophic) scenarios, however, it wouldn’t hurt to have.
The implementation had been left on the back burner for some time, until I watched a YouTube video of a friend setting up a FreeNAS server for his business. In the video he used mostly “spare parts” (which I wouldn’t consider leftovers) to build a file server with two RAID 1 arrays. At this point I looked to my left at the old case sitting on the bookcase, then to my right at the old SSD in my main PC, and thought to myself “not only is this doable, but straightforward as well”.
Boy was I wrong on the second part.
This entry is split up into days, and I will add to it as needed until the final video of the software setup (I am going about it this way because recording the whole hardware phase would have required a lot of footage). A disclaimer: usually I aim for the more inexpensive route and post something “on the cheap” – this isn’t one of those times. Enjoy!
Day 1: A Tale of Two Cases
Having purchased my $10 Partners for Care PC (a Lenovo M92p) a.k.a. hospital surplus, it’s time to install the SSD (which I will use for drive cache) and prepare the bays. This is when I ran into my first problem: my 60GB SSD bracket does not fit the drive caddy. The bracket is too short, leaving only two pins of the caddy holding the bracket in. This isn’t too bad, though: the caddy has a bottom, and I won’t be installing the server upside-down, so gravity alone will keep the SSD from tilting up.
Next up are the 5.25″ drive bays. I need two full-height bays to fit the Icy Dock Fatcage hard drive caddy for the three drives I ordered. So let’s remove the DVD drive and front panel to give myself…
…crap. Two drive bays separated by half an inch of chassis. There is literally a beam running through the two bays. I considered taking a hacksaw to it, but then one of the bays would not line up. I guess I still have that spare case and power supply.
So let’s start transferring. Start by looking at the PSU. The main board connector uses 14 pins, while the spare PSU I have (with a higher wattage) offers a choice between 20 and 24 pins. For now (at least), I will have to transfer the original PSU as well. A quick measurement: will the Lenovo mobo fit into the other case? Looks like it.
I proceeded to gut the unbranded ATX-type case. With its mobo long gone, there wasn’t much left to do. Next up, detaching the Lenovo mobo and PSU, which wasn’t much of a problem, though the back plate was a bit tricky to line up and remove. I installed the PSU first while the space was available. The back plate wasn’t made for a case like this, so I had to bend down the top locking edges so the plate could rest in place. Finally, the motherboard. It was a tight fit, and the screws from the Lenovo case were too small (luckily I had screws that fit fine), but everything seems to line up thus far.
Next issue: documentation for the front panel is sparse, and the connectors on the Lenovo and the generic case are different. After about 20 minutes of searching, I found the right jumper configuration for the power and HDD LEDs, as well as the power switch. The reset switch, front audio and front USB are all out of luck.
Now back to mounting the SSD. Once again things do not line up, but I found a configuration that is rather sturdy.
Finally, toss in an old spare hard drive and boot up. Since I didn’t want to crawl under my desk every time, I set up the old TV over DisplayPort. The system tested fine, except my SSD did not show up. Turns out, I had plugged that drive into an eSATAp port. Wait, eSATA? So I don’t have four SATA 3.0 ports? Looking at the documentation, I only have two. The third is SATA 2.0, while the last is that eSATAp. So how am I going to hook up four drives?
I finished the night looking for a host bus adapter, adding about 20% to the cost of the build. Crap.
Day 2: Tidying Up
This was a quick night: mostly me deciding on an adapter for the larger power supply, then tidying up the internals with zip ties. Currently the system stands at a 3rd-generation Intel i5, 4GB RAM, a 500GB HDD and the 60GB SSD.
Day 3: A Melody of Beeps
More RAM (2 x Corsair Value Select 4GB DDR3L) came in the mail today, so I promptly installed that.
Set the PC back up, turned it on for a memory test and then…
…over and over. What did I screw up? I re-seated everything, plugged back in, same beep code. I then removed the old RAM, and it booted without issue. Okay, is there a conflict between DDR3 and DDR3L? According to the documentation, as long as the DDR3L can still run at 1.5V (which this RAM can), it shouldn’t be a problem.
So I started swapping memory with my HTPC. First I tried the Kingston HyperX Blu. Still the beeps. Then I started to see a pattern: the Blu and the original RAM had a maximum bus speed of 1333 MHz, while the new pair topped out at 1600 MHz. One last shot: I used the other stick of RAM from the HTPC, since it was rated at 1600 MHz. Success!
As I am writing this section, it is still going through the memory test. Hopefully all goes well, to mark the end of day three.
You may notice that the loose hard drive has been taken out, another attempt at cleaning up for the next phase.
Day 4: The NASty Setup
After registering the serial numbers of the hard drives that arrived a day before, I ran an experiment Saturday night, building a virtual machine with an instance of FreeNAS. It gave me an idea of what I will need to do over the next week or so, to the point that I wanted to move up part of the install-fest schedule.
Even though I cannot connect the hard drives just yet, I can still install FreeNAS itself. That is exactly what I did late Sunday morning. It ended up being quite the fight with the BIOS to select the boot media (either the installer USB key or the installed USB key); in the end I had to turn UEFI back on and reinstall FreeNAS to match.
Once installed and booted, my router picked an agreeable IP address, which I then reserved, and I proceeded to start configuring the FreeNAS instance. Without the array there was not much to do, but after several attempts (and fixing my main PC’s e-mailer) I managed to set up email alerts between the new system and my personal account.
Finally, I set up SSH for easier access, which included temporarily allowing root login until I can create a regular user with sudo.
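For when I get around to it, that follow-up plan roughly looks like this. This is a sketch using FreeBSD-style commands (FreeNAS normally does all of this through the web GUI), and “nasuser” is a made-up example name, not an account from this build:

```shell
# Hypothetical follow-up: create a regular user, give it sudo via the
# wheel group, then turn root SSH logins back off.
pw useradd -n nasuser -m -G wheel -s /bin/csh
passwd nasuser

# Via visudo, enable the wheel group in sudoers:  %wheel ALL=(ALL) ALL
# In /etc/ssh/sshd_config, set:  PermitRootLogin no
service sshd restart
```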
All this can be found on my YouTube channel.
Day 5: Power Up
Two packages arrived at my door today. Unfortunately I was not available to accept them. Luckily one was small enough to be left in the mail box. That package just so happens to be the last thing I ordered, the power supply adapter.
The first thing to do is test the 500W PSU and adapter to verify they will work with the motherboard. So once again I ripped open the case and unplugged the original 280W PSU, adding the new connections in its place.
With the 500W PSU sitting precariously on top of the case, I proceeded to plug it in, flip the switch and press the power button. What greeted me? Fan grinding.
Not too bad a situation: a connector I had unplugged was resting on the CPU fan. I just brushed it out of the way and let the system get as far as the GRUB boot menu.
Now the fun part. I turned the PC off, unplugged the power to the PSU, and removed the four screws holding in the 280W unit. It’s a tight squeeze, but once I got the old PSU over the bottom lip (after backing it up toward the drive bays a little) so it was against the edge of the main board, I tilted the top toward me. With a small “klunk” the PSU managed to squeeze past the top case frame.
Now to repeat that one more time, in reverse.
Once the 500W was screwed in place, I tidied up the cabling again. The main 20-pin cable I curled toward the bottom of the case. With a cable tie I shortened the run of the CPU power connector, tight enough that it now curls down and inward, well clear of the fan.
As for the unused cables already tied up, it was inconvenient to have them bundled in the space between the PSU and the drive bays (I don’t know how deep the Icy Dock will sit in the end). So I cut the tie, unfolded the bundle once and tied it back up. The slack is now tucked under the cable bundle below the SSD. With one final system test to confirm the SSD (which is still being powered from the main board) powers up, it looks like my night is complete!
Day 6: Server Config Leaves Me Thirsty
Whelp, the rest of the parts arrived a day later…
This ended up being a long night. The hardware install itself took only a few minutes, but it turns out a lot of configuration was left to be done. Physically installing the host bus adapter card was very straightforward: find a PCI Express slot and insert it.
The drive caddy was a little trickier. As I presumed, the caddy has a bit of depth to it, leaving not a lot of space between it and the power supply. To make matters worse, the cable placements were tricky (the power connectors faced backwards, for example, forcing a bit of a bend). So why not set up the cabling with the caddy out of the case?
With that complete, it was just a matter of holding the slack while inserting the caddy into the drive bays. Once in place, I secured the caddy, plugged in the SAS connector and then cleaned up the cabling, once again.
With the case no longer on its back, I opened each drive drawer, placed and screwed in each drive, and closed the drawer again. The hardware portion is now complete.
I went to power on, and it eventually became apparent that the host bus adapter does not show up. There are at least two possibilities:
- The BIOS is out of date.
- The BIOS wants to use its own SATA controller instead.
I tried the second option in passing, but then decided I would need to do the former. The BIOS update came in the form of an ISO image, so it was probably best to burn a CD. That also meant digging out an optical drive for the server to read the disc once it was burned.
Once I hooked the cables up, the firmware flashing did not cause too much of an issue, at least initially. There was a lot of rebooting, which the update warned of. Problem is, the reboots continued even after the update, to the point of never reaching GRUB. After recalling a similar issue on the forum where I had searched the original question, I tried option two again. That’s what was needed.
While all this was happening, I could see the chattiness of the host bus adapter, and that it was only picking up two drives (or three, once I moved the SSD onto the last SAS-to-SATA breakout cable I freed from the cable tie). So I turned everything off again, re-seated the connectors, determined which drive was not showing (the bottom/third one), and re-seated that chain of connections again. Finally, all drives showed.
I then allowed a full boot and went downstairs in preparation for the final video, when I got an alert on the FreeNAS dashboard that the host bus adapter was running the P16 firmware where FreeNAS was expecting the P21 version. Okay… P21 does not exist… yet. But I knew from reading about this before that it really means P20. No worries, I will download another flash utility. This time it goes on a DOS boot disk, so I fired up a VM, installed Rufus, and set up the FreeNAS installer stick with DOS boot and the firmware update.
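For reference, flashing an LSI SAS2-family adapter from DOS generally follows the shape below. The firmware file names are assumptions (2118it.bin and mptsas2.rom are typical of 9211-8i-style P20 IT packages, not something confirmed in my notes), and at the DOS prompt you would type the commands without the # annotations:

```shell
sas2flsh -listall            # confirm the utility can see the adapter
sas2flsh -o -e 6             # erase the existing flash (do not reboot mid-way!)
sas2flsh -o -f 2118it.bin    # write the P20 IT-mode firmware
sas2flsh -o -b mptsas2.rom   # optional boot ROM, only needed to boot from the card
```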
I brought the stick upstairs, inserted it, booted to it, and before erasing the firmware ahead of replacing it I queried for the card and… some PAL error. Essentially, UEFI was getting in the way of recognizing the card. I modified the stick to attempt booting in “UEFI mode”, but it does not look like that option exists on the Lenovo M92p.
Running low on options, I tried a different computer. HTPC to the rescue. Because the HTPC only takes low-profile brackets, I had to unscrew the full-height bracket from the card. I inserted it into the computer, booted, and it was not recognized there either. Another BIOS I had to update; this time it had to be done in two parts, using a flash drive formatted for DOS. And yet after both flashes, the card still would not show up.
I went to try a different slot when… I realized that the card was not seated properly. As in, sitting about half an inch out of the back of the slot. Did I just damage the card? Anyway, I re-seated it, turned the HTPC back on, and luckily not only did the card’s LED come on, but it made an appearance in the POST as well. But would the firmware flash? If it didn’t under DOS, at least this BIOS appears to have a UEFI mode to fall back on.
I never had to go to plan B in the end. Aside from a typo in my written instructions that I caught after erasing the firmware, the P20 version installed successfully, and I returned the card to its proper home.
Now I could turn it on for the video portion. There were many goof-ups at that point (in the process of setting up the “tank” volume, adding a user, setting up NFS and installing a couple of plug-ins), and some configuration I had to revise after the video was complete, but I am glad to say this project is done.
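Under the hood, the “tank” volume the FreeNAS GUI built corresponds roughly to the following ZFS commands (a sketch only; the da0..da2 device names are examples and will differ per system):

```shell
zpool create tank raidz da0 da1 da2   # RAID-Z: one-drive redundancy across three disks
zpool status tank                     # verify all members show ONLINE
zfs create tank/share                 # a dataset to export over NFS
```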
Since the recording, I have put the server in its place (and determined that the BIOS still required a keyboard at boot, so I had to disable that behaviour again). I copied a great deal of data over to the server in the past day, and yet I am only using about 10% of the “tank” thus far.
Day 7: Rest
I was tempted to put that last heading in, given the bulk of the work happened over six non-consecutive days. At any rate, time to sign off. Take care.