The Application Experience

By this point I've talked a lot about the synthetic benchmark experience with Apple's Fusion Drive, but what about the real-world user experience? In short, it's surprisingly good. While I would describe most SSD caching implementations I've used as being more HDD-like than SSD-like, Apple's Fusion Drive ends up almost halfway between an HDD experience and an SSD experience.

Installing anything of reasonable size almost always goes to the SSD first, which goes a long way towards making Fusion Drive feel SSD-like. This isn't just true of application installs; copying anything in general hits the SSD first. The magic number appears to be 4GB, although with a little effort you can get the Fusion Drive to start writing to the HDD after only 1 to 2GB. I used Iometer to create a sequential test file on the Fusion Drive, monitored the point at which writes stopped going to the SSD, stopped the process, renamed the file, and started the file creation again. The screenshot below gives you a good idea of the minimum amount of space Apple will keep free on the SSD for incoming writes:

You can see that if you're quick enough, you can get the SSD down below 2GB of incoming writes before the HDD takes over. I don't know for a fact that this threshold equals the remaining free space on the SSD, but that's likely what it is, since there's no sense in exposing a 121GB SSD and not using all of it.
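If you want to approximate this test without Iometer, a rough Python sketch is below. It simply writes a large sequential file and reports per-chunk throughput; on a Fusion Drive the handoff from SSD to HDD should show up as a sharp, sustained drop in write speed. The mount point and sizes are assumptions for illustration, not anything Apple documents.

import os
import time

# Write a large sequential file and report per-chunk throughput. On a
# Fusion Drive the first few GB land on the SSD (fast); the handoff to
# the HDD shows up as a sudden, sustained drop in MB/s.
CHUNK = 64 * 1024 * 1024                  # 64MB per write
TOTAL = 8 * 1024 * 1024 * 1024            # write 8GB in total
PATH = "/Volumes/Fusion/testfile.bin"     # hypothetical mount point

buf = os.urandom(CHUNK)
written = 0
with open(PATH, "wb") as f:
    while written < TOTAL:
        start = time.monotonic()
        f.write(buf)
        f.flush()
        os.fsync(f.fileno())              # force the chunk out to the device
        elapsed = time.monotonic() - start
        written += CHUNK
        print(f"{written / 2**30:5.2f} GB: {CHUNK / 2**20 / elapsed:7.1f} MB/s")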

In most real-world scenarios where you're not aggressively trying to fill the SSD, Fusion Drive will keep at least 4GB of the SSD free. Note that when you first use a mostly empty Fusion Drive, almost anything you write to the drive, of any size, will go straight to the SSD. As capacity pressure increases, however, Apple's policy shifts towards writing up to 4GB of any given file to the SSD and the remainder onto the hard drive.
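To make that policy concrete, here's a toy model in Python of the placement behavior described above. It is purely illustrative: the names, the ~4GB reserve, and the per-file cap are inferences from my testing, not Apple's actual CoreStorage logic.

# Toy model of the write-placement behavior described above. This is an
# illustration only, not Apple's CoreStorage implementation; the numbers
# are inferred from testing.
SSD_RESERVE = 4 * 2**30    # Fusion Drive appears to keep ~4GB of SSD free
PER_FILE_CAP = 4 * 2**30   # under pressure, at most ~4GB of a file hits the SSD

def place_write(size, ssd_free, under_pressure):
    """Return (bytes_to_ssd, bytes_to_hdd) for an incoming write."""
    headroom = max(ssd_free - SSD_RESERVE, 0)  # never dip into the reserve
    to_ssd = min(size, headroom)
    if under_pressure:
        to_ssd = min(to_ssd, PER_FILE_CAP)     # cap any single file's SSD share
    return to_ssd, size - to_ssd

# A 10GB write to a pool under capacity pressure with 20GB of SSD free:
# roughly 4GB goes to the SSD, the remaining 6GB to the HDD.
print(place_write(10 * 2**30, 20 * 2**30, True))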

I confirmed this by installing Apple's OS X developer tools as well as Xcode itself. The latter comes closer to the magic 4GB crossover point, but the bulk of the application still ended up on the SSD by default.

The same is true for data generated by an application. I used Xcode to build Adium, a 682MB project, and the entire compile process hit the SSD; the mechanical side of the Fusion Drive never lifted a finger. I then tried building a larger project, Firefox, at nearly 2GB. In that case I did see a very short period of HDD activity, but the vast majority of the work was confined to the SSD.
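For anyone who wants to watch this themselves, OS X's iostat will print per-device throughput while a build runs; a minimal wrapper is sketched below. The disk identifiers are assumptions, so check the diskutil output to see which device is the SSD and which is the HDD in your pool.

import subprocess

# Identify which device is the SSD and which is the HDD first.
subprocess.run(["diskutil", "list"])

# Then sample both halves of the Fusion Drive once per second while the
# build runs in another terminal. iostat prints KB/t, tps, and MB/s per
# device; if the HDD's MB/s stays near zero, the compile is being
# serviced entirely by the SSD.
subprocess.run(["iostat", "-w", "1", "disk0", "disk1"])  # disk IDs are assumptions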

I grabbed a large video file (> 10GB) I had cloned over when I migrated my personal machine to the iMac, and paid attention to its behavior as I copied the file to a new location. For the first 2GB of the transfer, the file streamed from the SSD and was written back to the SSD. For the next 2GB, the file was read off the HDD but still written to the SSD. After around 4GB, both the source and target shifted to the HDD. Fusion Drive ended up caching far more of that large video than I expected. In my opinion the right move here would be to keep large files on the hard drive by default unless they are heavily accessed; Apple's approach is a reasonable compromise, but it's still more aggressive about putting blocks on the SSD than I anticipated.
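If you want to see where a given file actually lives, a crude trick is to read it sequentially in large chunks and time each one: chunks served by the SSD come back several times faster than chunks on the HDD, so the boundary shows up as a step in the timings. The sketch below makes obvious assumptions: the path is hypothetical, and the file cache needs to be cold (e.g. right after a reboot, or after running OS X's purge command) for the timings to mean anything.

import os
import time

# Read an existing file sequentially in large chunks and time each one.
# Chunks served by the SSD come back several times faster than chunks
# on the HDD, so the SSD-to-HDD boundary appears as a step in the
# timings. The file cache must be cold for this to mean anything.
CHUNK = 256 * 1024 * 1024                 # 256MB per read
PATH = "/Volumes/Fusion/big_video.mov"    # hypothetical file

offset = 0
with open(PATH, "rb", buffering=0) as f:
    while True:
        start = time.monotonic()
        data = f.read(CHUNK)
        if not data:
            break
        elapsed = time.monotonic() - start
        print(f"{offset / 2**30:5.2f} GB in: {len(data) / 2**20 / elapsed:7.1f} MB/s")
        offset += len(data)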

I repeated the test with a different video file that I had never accessed and got a completely different result: the entire file was stored on the hard drive portion of the Fusion Drive. I repeated the test once more with my iPhoto library, which I had been accessing regularly. To my surprise, the bulk of the library was on the HDD, although there were a few bursts of reads from the SSD while I was copying it. In both cases, the copy target was, of course, the SSD.

My AnandTech folder is over 32GB in size and contains text, photos, presentations, benchmark results, and pretty much everything associated with every review I've put together. Although this folder is very important, the truth is that the bulk of that 32GB is rarely accessed. When I went to duplicate the folder, I discovered that almost none of it resided on the SSD. The same was true for my 38GB Documents folder, the bulk of which, again, goes largely unread.

Applications, on the other hand, were almost always on the SSD.

In general, Apple's Fusion Drive does a fairly good job of automating what I typically do manually: keeping my OS and applications on the SSD, and big media files on the HDD. About the only difference between how I organize my data by hand and how Fusion Drive does it is that I put my Documents and AnandTech folders on my SSD by default. I don't do this just for performance, but also for reliability: my HDD is more likely to die than my SSD.

Comments

  • guidryp - Friday, January 18, 2013 - link

    These claims about the effort in setting up SSD/HD combo are getting quite silly.

    There is essentially ZERO time difference in setting up an SSD/HD partitioned combo vs Fusion. Your payback would be on Day 1.

    The only effort is simply deciding which partition to load new material on. That decision takes what? Microseconds.

    It is as simple as installing OS/Apps on the SSD and Media on the HD, versus installing OS/Apps/Media on Fusion. The effort is essentially the same.

    But that simple manual partition will perform better, create less system thrashing and less wear on all your drives.
  • Zink - Friday, January 18, 2013 - link

    But then you end up with an SSD filled with no-longer-relevant data, and you need to figure out how to free up space again. A combo drive takes care of that for you and keeps the SSD filled to the brim with the data that gets used most. You can download any games, start any big video editing project, and know that you are getting 50%-100% of the benefit of the SSD without worrying about managing segregated data. With a segregated setup you end up playing games from the HDD or editing video files that are on the HDD, and sometimes see 0% of the benefit of the SSD. Fusion seems like the future.
  • KitsuneKnight - Friday, January 18, 2013 - link

    If you can divide your data up as OS, Apps, and Media, and OS + Apps fits on the SSD, then sure, it's not too bad.

    Unfortunately, my Steam library is approximately 250GB... That alone would fill up most SSDs out there. And that's not even counting all my non-Steam games, which would push almost any SSD towards being totally full. If I'd bought more recent games, it'd likely be quite a bit larger than that (AAA games seem to range from 10-30GB these days).

    Unless you sprung for a 500GB SSD (which isn't exactly cheap, even today), you'd have quite a pickle on your hands, likely having to move most of the library manually to the HDD (which is a bit of a pain with Steam). That makes it suddenly much more complicated than OS/apps on the SSD and media on the HDD, especially since SSDs massively improve the load times of large games (unlike the minimal impact they have on media).

    And then there are the other examples I've already given: the artist I know who works on absurdly massive PSDs and has many terabytes of them (what's the point of an SSD if it doesn't benefit your primary use of a computer?), as well as my situation with VMs on my non-gaming machine (which actually has an SSD + HDD setup right now). A lot of people could probably do the divide you're talking about, but likely even more people could fit all their data on either a 128 or 256GB SSD.
  • name99 - Friday, January 18, 2013 - link

    Then WTF are you complaining about?
    You can still buy an HD-only Mac mini and add your own USB3 SSD as a boot disk.
    Or you can buy a Fusion Mac mini and split the two drives apart.

    It's not enough that things can be done your way; you ALSO want everyone else, who wants a simple solution, to have to suffer?
  • Mr Perfect - Friday, January 18, 2013 - link

    Intel SRT is useful for everyone; there's no reason to look down on it. Could I sit there and manually move files back and forth between the SSD and HD? Sure. But why? Seriously, I have better things to do with my time than move the program of the week between storage mediums. Last week I was playing Metro 2033, this week it's World of Tanks. Next week I might finish one of those run-throughs of D:HR or Portal 2 that I left hanging. SRT takes care of all of that. This is 2013; an enthusiast-class workstation should damn well be able to handle something as simple as caching, and it can. Enterprise-class servers have been doing it for some time, so why isn't it good enough for a gaming rig?

    My one complaint with SRT is the cache size limit. Why would Intel even impose a limit?
  • EnzoFX - Saturday, January 19, 2013 - link

    You're framing it in your own way so that only your solution works. Fail. Unnecessary stress on the SSD? The better argument for most people would be putting that SSD to good use, not trying to NOT use it.

    It also isn't simply about putting the files where they go and then being done with it. Files are changed and updated, and if you're on multiple drives, copied back and forth. Some people don't want to deal with that. Actually, no one should want to deal with that. Everyone just has a different threshold for how many barriers they'll tolerate before a solution stops being good. Is it that hard to understand?
  • lyeoh - Saturday, January 19, 2013 - link

    Do you manually control the data in your CPU's L1 and L2 caches too? There are plenty of decent caching algorithms created by very smart people. If the algorithms were that bad, your CPU would run very slowly.

    There should be no need for you to WASTE TIME moving crap around from drive to drive. The OS can know how often you use stuff, and whether the accesses are sequential, random, or slow.

    If Windows 8's Storage Spaces were more like Fusion Drive out of the box (or even better), we geeks would be more impressed by Windows 8.
  • Feldur - Monday, June 29, 2015 - link

    Having designed and built both computers and operating systems, I qualify as not naive. I'm interested, therefore, in your assertion that because I prefer letting the Fusion Drive do the work, I must be lazy. You're making a judgement about how I should spend my time: that shuffling files around (non-trivial if I want the level of granularity a Fusion Drive can offer) is a better investment of my time than developing software or playing with my dog. It's interesting that you think you know me that well, regardless of the fact that you're dead wrong.

    How do you reconcile that?
  • StormyParis - Friday, January 18, 2013 - link

    The device is technically nice, but at around $450 for 128GB + 2TB the price is wayyy too high:

    Apple's 128GB SSD + 2TB HDD "Fusion Drive" is about $450 ($400 as an upgrade).

    A regular 256GB SSD is $170.
    A regular 3TB HDD is $150.
    Regular equivalent for Apple's price: 256GB SSD + 2x 3TB HDD = $470.

    You can get twice the SSD storage and three times the HDD storage for about Apple's price. This will take up more physical space, but it also offers way more room, both on the SSD side (plenty of space for your OS, apps, and live data files) and on the HDD side (3TB + 3TB backup, or 6TB JBOD for your archives and media).
  • jeffkibuule - Friday, January 18, 2013 - link

    Hence the DIY route.
