Installation of a Western Digital Black2 Dual Drive in a Mac

This is a follow-up to the previous post on this topic – this one contains more comprehensive instructions.

The first stage is to prepare the drive for use in the Mac. Due to the drive configuration and Western Digital’s Windows-only focus (or lack of foresight, depending on how you look at it), this stage must be completed on a Windows PC.

You’ll need:

From the WD Black2 Box:
  • The USB -> SATA cable
  • The USB key (this has the software to ‘unlock’ the spinning partition of the dual drive)
  • The drive itself
Also:
  • A Windows PC/laptop from which you can easily remove the hard disk
  • Skill commensurate with removing and reinstalling hard disks
  • The ability to find your way around the terminal command line (linux or Mac)
  • A USB based installer for OS X (see here for a useful tool to help with this)
  • A working Time Machine backup of your Mac – update this before you start.
 
Stage One – Unlocking the spinning disk
  1. Boot the laptop into Windows off its main hard disk – make sure you have an Internet connection and a browser with Adobe Flash capability.
  2. Connect the Black2 disk to the laptop using the USB -> SATA cable.
  3. Insert the WD USB key – crazy automatic things will start happening and you’ll find yourself at the product website on www.wdc.com – you can safely remove the USB key at this point.
  4. On the Overview tab which shows by default, click the Data Transfer Software link.
  5. Download the Acronis True Image WD Edition software (~230MB in size).
  6. Go back to the Overview tab and download the Partition Software as well – we’ll need that in step 12.
  7. Install the above software and start it, selecting Clone Drive.
  8. Use the Automatic option and, after some processing, you’ll be told that Windows needs to restart – click through this message so Acronis can start its own boot loader and complete the clone process – the laptop will shut down automatically once it’s finished.
  9. Now things get physical. Remove the drive from your donor laptop and replace with the Black2.
  10. Boot and wait. Hopefully it’ll just start up pretty much like normal here. Don’t be surprised if Windows reports that a chkdsk needs to be run during startup – the disk has been completely re-written after all!
  11. Once Windows has started, it’ll likely request a reboot to complete installation of the new hardware. I know, I know, using Windows is a pain.
  12. Now we need the Partition Software which hopefully was downloaded back in step 6. Install it and follow the wizard through.
You should now have two partitions available on the Black2.
Stage Two – Drive partitioning 
  1. Remove the Black2 from the PC laptop and put its own disk back. Happy Windows machine.
  2. Connect the Black2 back to the SATA USB adaptor and connect it up to your Mac.
  3. Fire up Disk Utility and erase the disk – make it a single Mac OS Extended (Journaled) volume.
  4. Download the OS X Recovery Disk Assistant from here – run it to create a correctly sized recovery partition on the disk.
  5. Now, in Disk Utility again, create two partitions on the disk. Make the first 119GB (to allow for the first part of the disk being used by the Recovery partition) and the second 1TB. I called mine SSD and HDD just for clarity – there’s a quick Terminal check below.
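Before moving on, it’s worth a quick sanity check from Terminal that the layout looks right. The disk number below is just an example – yours will almost certainly differ, so check the output of diskutil list first:

diskutil list disk4

You should see the two volumes you just created (SSD and HDD in my case) – note their Identifiers, as we’ll need them in Stage Three.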
 
Stage Three – Create the Fusion drive
  1. In Terminal, type: diskutil list and press Enter. You should be able to find the disk easily if you named the disks as I did in Stage Two above.
  2. Once you’ve found the SSD and HDD partitions, note down the Identifier for each of them.
  3. In Terminal, type: sudo diskutil cs create Fusion disk4s2 disk4s3 (the last two items should be the Identifiers of your partitions – SSD first, then HDD).
  4. Check through the resulting text to make sure everything worked without error – here’s mine for reference:
      Started CoreStorage operation
      Unmounting disk4s2
      Touching partition type on disk4s2
      Adding disk4s2 to Logical Volume Group
      Unmounting disk4s3
      Touching partition type on disk4s3
      Adding disk4s3 to Logical Volume Group
      Creating Core Storage Logical Volume Group
      Switching disk4s2 to Core Storage
      Switching disk4s3 to Core Storage
      Waiting for Logical Volume Group to appear
      Discovered new Logical Volume Group “2A40C88F-0E1F-433D-BEA5-55A19BEBCB9F”
      Core Storage LVG UUID: 2A40C88F-0E1F-433D-BEA5-55A19BEBCB9F
      Finished CoreStorage operation
  5. Once the Fusion drive is created, it needs to be formatted. But before we can do that, we need to find the ID for the Fusion drive. In Terminal, type: diskutil cs list – the long alphanumeric string for the Logical Volume Group is the one you want – copy that to the clipboard – we’ll use it in the command in Step 6.
  6. Now, in Terminal, type: diskutil cs createVolume <ID string> jhfs+ “Macintosh HD” 100%
    • This will create a filesystem called Macintosh HD that takes all the space available on the Fusion drive
  7. All going well, the disk is now ready for final installation in your OS X device. (The Terminal commands for this stage are recapped below.)
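For reference, here’s the whole of Stage Three condensed into the handful of commands involved – the disk4s2/disk4s3 identifiers are from my machine, so substitute the ones you noted from diskutil list, and paste your own Logical Volume Group UUID into the last command:

diskutil list
sudo diskutil cs create Fusion disk4s2 disk4s3
diskutil cs list
diskutil cs createVolume <LVG UUID> jhfs+ "Macintosh HD" 100%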
Stage Four – Install the unit in your machine
I’ll leave this one alone, assuming A) you know what you are doing or B) you can follow one of the many excellent resources on the net – from instructions at OWC or Lifehacker, to videos on YouTube.
If installing into a Mac with a hard disk temperature sensor, you can bypass the sensor with a small jumper wire (this avoids the fan-on-full issue which will occur if the sensor is left unconnected). There are instructions out there on how to do this bit as well.
Stage Five – Reinstall your OS
 

This is where your USB installer for OS X and your Time Machine backup come into play – plug the installer into the Mac, boot from it, reinstall OS X onto the new drive, then follow the prompts to restore from your Time Machine backup.

Note that the first steps of this procedure would also enable the drive for use in a linux machine. With root mounted on the SSD and /home on the HDD portion, this would also speed up your favourite linux box!
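As a very rough sketch of what that might look like – the device names and filesystem choices here are purely illustrative, and you’d normally use UUIDs from blkid in practice – the relevant /etc/fstab entries would be along these lines:

/dev/sda2   /       ext4   defaults,noatime   0  1
/dev/sda3   /home   ext4   defaults           0  2

The first line puts the root filesystem on the 120GB SSD portion and the second puts /home on the 1TB spinning portion.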

I hope this has helped – let me know in the comments if you have any questions that I might be able to answer for you.

What?! A “Windows only” hard disk??

Came across these Western Digital Black2 Dual Drive units the other day. What a cool idea, I thought! Just the thing for a couple of laptop users here at work who prefer the speed of an SSD in their laptops but would like a bit of extra room for VMs and the like.

So I ordered three – one for my work iMac as well. I’d recently upgraded the iMac with an older 256GB SSD – which was fine – though not a huge amount of space obviously.

The units arrived and I set about hooking one up to the iMac using the handy USB->laptop drive SATA cable that it came with.

Only the 120GB SSD drive portion appeared.

Digging a little further, I found that the unit is supported in Windows only – you need to install software in Windows to enable access to the 1TB disk in the Black2 Dual.

Bugger.

Once I’d finished ranting about how crap it was that a hard disk vendor would build a hard disk only for Windows, I had a think. There *must* be a way to make this work – surely! Just because they haven’t built this ‘enabler’ software for OSX surely doesn’t mean that it absolutely wouldn’t work.

So, I went to work getting it set up on a Windows laptop, figuring that I could unlock the disk in Windows & then pop it back in my iMac and configure a Fusion drive across the two disks.

Annoyingly – or cleverly I guess, depending on how you look at it – you cannot unlock the 1TB portion of the drive while it is connected via USB. It actually has to be resident inside the machine (or at least, connected to the SATA bus) to unlock.

This meant I had to image the existing SSD in the laptop onto this one – using the aforementioned handy cable.

Once it was unlocked, I connected it up to the iMac and converted the partition table from MBR to GPT using the gdisk utility. Note that the 1TB portion shows up as a partition, NOT as a second hard disk as I had suspected it might based on the reviews I’d read.
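For anyone wanting to do the same, the gdisk step is quick. This is only a sketch – the disk number is whatever the Black2 shows up as on your machine:

sudo gdisk /dev/disk3

gdisk builds an equivalent GPT in memory from the existing MBR when it opens the disk; nothing is changed until you type w at its prompt to write the new table and exit.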

I removed all the partitions from the first 128GB of the disk and created an EFI partition then ran the Apple Recovery Disk Assistant tool to create a Recovery partition on the new disk.

Excitedly, I then used the directions here to create the Fusion drive.

diskutil cs create Fusion disk3s2 disk3s3

Unfortunately this resulted in a POSIX Input/Output error so it seemed like that was the end of the road.
Frustrated, I posted a brief report, along with a question, to a MacRumours forum.

Overnight, “Weaselboy” replied with a few further links to check which renewed my hope that it might work.

One in particular – this excellent (as usual) article from Anandtech – described how the controller uses LBA to address the different areas of the disk. Here was the reason for my renewed hope.

Ok, I thought, let’s just wipe the whole thing in my iMac and create the partitions again.

So I did.

And, this time, it worked.  Similar steps would mean this disk could be ‘enabled’ for use in a linux machine as well – it would work really well with / mounted to the SSD and /home on the 1TB mechanical portion.

the unthinkable has happened

I never thought I’d see the day when I would lay aside the excellence of linux and actually enjoy using another operating system – a proprietary operating system at that!

But it has happened. Through various events, an opportunity arose a couple of months back to use an early Intel-model iMac at work. My quad-core i5, 4GB RAM, 64-bit Debian-running workstation was flattened and redeployed with Windows 7 to someone requiring the horsepower.

The iMac, a spare loaded with XP that had been sitting unused for many months, became my workstation. Needing to run iTunes, and refusing to use Windows outside of a VM or RDP session, I found my only option was OSX, as the spec wouldn’t cope with a VM running atop linux.

A good opportunity, I thought, to see how an increasing percentage of the other side live – to see what all the fuss was about.

I wasn’t prepared for what happened next.

At first, it took a little time to get used to the slightly different keyboard layout & the plethora of new, sometimes unwieldy hotkey combinations. However, I did feel at home with the familiarity of the interface, having come from a GNOME environment – it was uncannily similar.

Slowly but surely I felt a growing sense of wonder of just how simple it was to use, of how things just worked. I enjoyed that sense of the technology getting out of the way and just letting me get on with what I needed to do. And yet, a bash command line was just a click away…

I discovered I *could* have the comforts of a commonly used mainstream OS and UNIX too. I did a little investigation and found that, completely by happenstance, I had all the right hardware at home to make a Hackintosh. So, for roughly as long as I’ve had the iMac at work, I’ve also had an almost-Mac-Pro at home.

Of course, my Apple-loving friends all nodded knowingly, tut-tutted and wondered why it had taken me so long…

I began to understand that it’s not just about a single device, or the OS, or an App Store. It’s the ecosystem that all these things exist in that is so appealing. It is all there, designed to work together – not perfect, but much closer to completeness than anything I’ve previously come across, proprietary or not. And I really like it.

What has ensued is a philosophical crisis of sorts: How can I *like* a proprietary OS? Is this niceness and ease worth giving up some freedom for? Where is my loyalty? If I like this, does this mean I might like Microsoft one day? How am I ever to afford the ‘real’ hardware to run at home?

Of course, these are all questions I’m willing to spend some time getting to the answers to.

I know there are many in the FOSS community who have trodden this path before me – some of whom are much cleverer than me, and whose opinions are valued much more highly than mine.

But I still can’t help but feel a little guilty.

Ubuntu LTS Server upgrade – really difficult?

At my place of work, we use a Java-based trouble-ticketing system from Atlassian called Jira.

It is hosted on a LAMP server virtual machine in our production VMware environment. The system has been in daily use (well, weekday use) since near the end of 2008 – requiring minimal maintenance in that time (the occasional reboot after security updates have been installed).

Up until yesterday, we had been using Ubuntu 8.04 LTS Server. I decided it was time to move to the latest LTS release – 10.04 – which was released earlier this year and had just received its first .1 refresh.

Some googling around revealed the potential for various issues with the process so I took a snapshot before beginning – just to be safe.

I then found this link which detailed how to upgrade the server to the next LTS release.

I was shocked at how simple the process appeared to be – surely not?! This is that crazy technical, awful command line operating system with a really high cost of ownership, isn’t it?

So, SSH’ing into the server, I took a copy of /etc (just being extra safe again), fired up a screen session and ran the command as instructed on the page above.

sudo do-release-upgrade
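For the record, the whole remote prep amounted to something like this – the backup path and screen session name are just examples:

sudo cp -a /etc /root/etc-backup
screen -S upgrade
sudo do-release-upgrade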


Various lists were obtained from the internet and the upgrades calculated; I then had to press Y to show my acceptance of the results.

Everything slowed down at this point due to our internet connection speed (changing soon, yay!). I disconnected and went to sleep.

This morning, I connected back to the server and screen session to find a reboot necessary. So, a Y and a reboot later, the 10.04.1-based system was up and running.

I fired up a browser and pointed to the Jira system – fail. Oh noes, I thought, now it gets difficult.

Well, no, not really. Over the course of various Ubuntu releases since 8.04, the sun-java6-* packages were moved into the partner repository.

So, I uncommented the partner repository in /etc/apt/sources.list, ran an apt-get update and reinstalled the sun-java6-jre package.
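In concrete terms it was just a matter of uncommenting the lucid partner lines in /etc/apt/sources.list (the exact wording of those lines will depend on your release) and then running:

sudo apt-get update
sudo apt-get install sun-java6-jre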

A reboot (only to test that everything would start by itself as it should) and Jira is running again, no data lost and inbound email requests to the system are working. Awesome.

Just so you get the significance of that, imagine doing an in-place upgrade (i.e. not a fresh install) of a Windows 2000 Server running IIS 5 and SQL 2000 and having it come out running Windows Server 2008, IIS 7 and SQL 2008.

Two reboots, no data loss, no restores necessary and all done remotely. And Jira was actually still running and available for most of the time except when the box was rebooting and having java re-installed.

Yep, *really* difficult. Watch out.

the inspirational missing fork…

Yesterday, while preparing my rice noodles, tuna and sweet-chilli sauce for lunch, I realised I had misplaced my fork. And there were no other forks in the lunch room drawers.

Blast. “Oh well”, I thought; then grabbed a spoon and went and enjoyed the meal.

Once finished, I realised that the spoon hadn’t actually been that hard to use. In fact, it had worked well – better than I expected it would.

Then it occurred to me; if the fork had not gone missing, I would never have even tried using the spoon – believing it to be ‘not suited for purpose’ (to use a tech business term, if I may).

I made a mental note to blog about this wonderful thought – the inspirational missing fork – completely unaware that Don Christie (President, NZ Open Source Society) would make a similar comment today on the NZOSS Openchat list:

“..the idea that there are multiple platforms and options is as important as how to use an individual platform.”

Indeed, how often do we not even consider the possibility of an alternative being perfectly capable of performing a given function, simply because we’ve always had a ‘fork’ at our disposal?

The trick, I guess, is to look around now for an alternative before the proverbial fork goes missing (or becomes unavailable/unusable for some reason)….

KDE team removes support for underscore, starts enforcing STD3 from RFC1122

Interestingly, the latest build of KDE 4 (4.3.90 aka 4.4 RC1) no longer supports the underscore character in host names.

While this was allowed in previous KDE4 versions, the KDE team have removed support for the underscore as “STD3 requires all DNS domain names to be limited to Letters, Digits and Hyphen.”

Here are two examples of bugs that have been filed and subsequently closed with a ‘WONTFIX’ resolution: 220500 222291

So, for any of you sysadmins out there who have the audacity to have hosts (or DNS aliases/addresses) on your network with underscores in their names: you’ll no longer be able to connect to those hosts using KDE4 apps like KRDC (Remote Desktop) or the Konqueror web browser.

what a difference an AHCI makes

Last week, I noticed how, whenever heavy disk IO was taking place on my quad-core workstation at work – 4GB of RAM, 64-bit Ubuntu – the whole desktop environment would pretty much grind to a halt.

SSH’ing in from a remote machine and using top, iotop and nethogs didn’t show anything particularly heart-stopping going on either.

I googled around and found that this seemed to be a fairly common problem with any of the newer kernel releases.

One post in particular said that a person had fixed the problem by disabling the SATA disk controller’s AHCI mode in the BIOS – switching it back to IDE.

Cool, I thought – let’s have a go! Interestingly, the BIOS was already set to IDE. I decided I’d try enabling AHCI instead.

Wow – what a difference that made. I then remembered one of the other posts I came across that just said to switch the BIOS setting as that forces the OS to load a different disk controller driver.
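If you’re curious which driver the kernel has actually loaded for the controller, a couple of quick checks will show it (output obviously varies by machine – on Intel chipsets, ahci generally indicates AHCI mode and ata_piix the legacy IDE mode):

lspci -k | grep -iA3 sata
dmesg | grep -i ahci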

It certainly did the trick – said work-beastie is now much faster and more responsive under load.

segfaulting multimedia processes -or- The Case of the Badly Cooled……Case

A wee while ago (yes, I’m catching up on things I’d hoped to blog about for a while!), I had a problem with my home PC. This culminated in a post to the Ubuntu Forums.

General stability of this machine is great – it’s normally on for weeks at a time serving the family’s various document/web/email/printing needs – and has done this for about four years, with the only major hardware changes being a new 7600GT graphics card (most recently, about 12 months ago) and a new Socket 478 P4 Extreme Edition CPU (about 18 months ago).

My forum post ended with: “So, what do you guys think? Hardware or software? And how do I troubleshoot this one further? (BTW, I’ve been a linux user for about 8 years now, so I’m not really a guru and definitely not a noob. Perhaps more of a goob. 😀 )”

Basically, I had an issue where, whenever I’d do some ‘heavy lifting’ tasks – like audio or video encoding – the app would just disappear. Very odd it was – I tried all sorts of things to fix it: new linux distros, replacement RAM, etc.

Starting the processes from the command line, I was able to see that the app termination was actually a segfault – which I subsequently found in the dmesg log.

Turned out to be the CPU overheating. Interestingly, there was nothing in dmesg on Ubuntu about the overheating – though, when I had Debian (lenny) installed, it did show messages about it, and when I booted into the Fedora 10 Live CD, that also complained about the CPU overheating in dmesg.

So, to solve the problem, I transplanted the guts of my box into a new case which breathes better and also used the correct heatsink for my CPU (one with a copper core).

The problem was that I was using the same case and heatsink from my old 2.8GHz P4, which wasn’t cutting it with the new 3.4GHz P4EE and the amount of heat it generates.

Once the correct heatsink and better case with more efficient thermal dynamics were in place, the differences in internal temperature were quite remarkable:

If anyone is interested, here’s some temps from lm-sensors that show the difference in internal temps between the two cases and heatsinks. These are both just at system idle with no loading.

Before
SDA: 37C | SDB: 34C | GPU: 57C | CPU: 40C

After
SDA: 33C | SDB: 28C | GPU: 40C | CPU: 23C
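If you want to collect similar readings yourself, the usual tools are lm-sensors for the CPU/motherboard sensors and hddtemp for the drives – a rough sketch, assuming both packages are installed:

sensors
sudo hddtemp /dev/sda /dev/sdb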

Under load, the CPU was getting to around 70C; now it’s able to stay around 57C – and with no segfaulting going on! Yaay!

The rather cool thing – from my point of view anyway – is that Windows would merely have blue screened under the same circumstances (or just rebooted as the default blue screen setting dictates). Obviously that would make things much harder to troubleshoot.

So linux dealt with the overheating by terminating the offending process. A much more elegant way of handling things – don’t you think?

TTF vs ttf in linux

During a recent migration of a friend’s family laptop from Vista to linux, I discovered a curious problem.

Said friend had a bunch of add-on TTF fonts in Vista which they still wanted to be available to them in linux. No worries, I thought – I’ll just copy them out of the windows partition and put them in ~/.fonts and away they’ll go.

Or not, as it turned out.

Curiously, some were showing up in OpenOffice, and some weren’t. Several font cache updates, reboots and various google searches later, they still were not showing.

At that point I thought I’d better have a look in ~/.fonts to see if I could spot a pattern with ones that were appearing, and ones that were not.

Bingo! All the non-working ones had extensions of .TTF – while the working ones were lower case.

Right, I thought – there must be a whizbang bash command to fix that! Off to google for help again and I was able to construct this command:

for i in *.TTF; do mv "$i" "$(basename "$i" .TTF).ttf"; done

Voila! Now all the fonts showed correctly (after a restart of OpenOffice of course).
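If any stragglers still don’t show up after the rename, forcing a rebuild of the font cache is worth a try – a quick sketch, assuming the fonts live in ~/.fonts:

fc-cache -f -v ~/.fonts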

Capitalisation only makes a difference in a real operating system after all!

A change of OS can void warranty…

I’m totally gobsmacked.

Today, I was talking to a friend about her new Vista laptop and the various troubles she’s been having. She said she’d love to run Ubuntu on it but couldn’t because removing Vista would void the laptop’s warranty.

Now, I can understand playing with the hardware itself voiding the warranty (overclocking etc), but formatting the hard drive and installing another OS? What the ….

I did a bit of googling on the subject and it turns out to be quite widespread too – apparently a lot of people have been told this by various manufacturers.

Watch out for that, I’d say… it certainly makes even more sense to buy a machine with nothing on it.