We met with AMD and, among other things, they wanted to show us essentially final versions of several upcoming FreeSync displays. Overall, AMD and their partners are still on target to launch FreeSync displays this quarter, with AMD telling us that as many as 11 displays could hit the market before the end of March. For CES, AMD had several displays running, including a 28” 60Hz 4K display from Samsung, a 27” 144Hz QHD display from BenQ, and a 34” 75Hz 2560x1080 display from LG. The three displays were all running on different GPUs: an R9 285 for the BenQ, an R9 290X for the Samsung, and an A10-7850K APU for the LG UltraWide.

More important than the displays and the hardware powering them is the fact that FreeSync worked just as you’d expect. AMD had several demos running, including a tearing test demo with a large vertical block of red moving across the display, and a greatly enhanced version of their earlier windmill demo. We could enable/disable FreeSync and V-SYNC, set the target rendering speed from 40 to 55 Hz in 5Hz increments, or have it vary (sweep) over time between 40 Hz and 55 Hz. The Samsung display, meanwhile, was even able to show the current refresh rate in its OSD, and with FreeSync enabled we could watch the fluctuations, as can be seen here. [Update: Video of demo has been added below.]

Having seen and used G-SYNC, there was nothing particularly new being demonstrated here, but it is proof that AMD’s FreeSync solution is ready and delivering on all of AMD's feature goals, and it should be available in the next few months. AMD also took a moment to briefly address the issue of minimum refresh rates and pixel decay over time, stating that the minimum refresh rate will vary on a per-monitor basis, depending on how quickly the panel's pixels decay. The most common outcome is that some displays will have a minimum refresh rate of 30Hz (33.3ms between refreshes), while others with pixels that decay more quickly will have a 40Hz (25ms) minimum.
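For those wondering where those millisecond figures come from, the maximum time a panel can hold a frame is simply the reciprocal of its minimum refresh rate. Here's a quick illustrative calculation showing the arithmetic behind the numbers quoted above (this is just a sketch, not anything from AMD's software):

```python
# Frame time in milliseconds for a given refresh rate: t = 1000 / f
def frame_time_ms(refresh_hz: float) -> float:
    return 1000.0 / refresh_hz

# The two minimums AMD cited: slower-decaying panels can hold a frame longer.
for hz in (30, 40):
    print(f"{hz} Hz minimum -> up to {frame_time_ms(hz):.1f} ms between refreshes")
# 30 Hz minimum -> up to 33.3 ms between refreshes
# 40 Hz minimum -> up to 25.0 ms between refreshes
```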

On the retail front, what remains to be seen is just how much more FreeSync displays will cost on average compared to non-FreeSync displays. FreeSync is royalty free, but that doesn’t mean there are no additional costs involved in creating a display that works with FreeSync. Better panels and other components are needed, which will increase the BoM (Bill of Materials), and that cost will be passed on to consumers.

Perhaps the bigger questions, though, are how much FreeSync displays will end up costing compared to their G-SYNC equivalents, and whether Intel and others will support the standard. If FreeSync does gain traction, it will also be interesting to see whether NVIDIA begins supporting it or remains committed to G-SYNC. Either way, we should start to see shipping hardware in the near future, and we’ll get answers to many of the remaining questions over the coming year.

Comments

  • GuniGuGu - Thursday, January 8, 2015 - link

    Was there a 4k IPS monitor?
  • JarredWalton - Thursday, January 8, 2015 - link

    No, there wasn't. I didn't take proper notes, but I believe the Samsung panel was TN, the BenQ was TN as well (maybe?), and the LG display was IPS (or AHVA or similar).
  • Azusis - Thursday, January 8, 2015 - link

    Is it likely that the BenQ monitor was using the AU Optronics panel M270DAN02.3? It's the only panel I can think of that is 27" 1440p 144Hz. It's likely this panel is also being used in the upcoming Acer XB270HU, which is another 27" 1440p 144Hz IPS with G-Sync.

    Article on AU Optronics: http://techreport.com/news/27019/au-optronics-pane...
  • JarredWalton - Thursday, January 8, 2015 - link

    Entirely possible -- I am quite sure Samsung was 4K TN, but it may be that both the BenQ and LG were wide viewing angle. I know one of them was, but that's all I can remember. (Too many products seen in too few days....)
  • Azusis - Thursday, January 8, 2015 - link

    Hah, I can only imagine. Thanks for the input! Regardless, I think a monitor upgrade in Q1 is in my future. :)
  • croc - Thursday, January 8, 2015 - link

    I am really not liking this "FreeSync" vs. "G-Sync" differentiation in the monitor segment. Monitors last for donkeys' years, and graphics cards seem to get replaced every time a new fruit fly hatches (around here, anyway...)! I DO NOT want any lock-in between my monitor and a brand of GPU. In fact, I am tempted to not buy a monitor that has any kind of lock-in. How long will it be before the smart monitor manufacturers decide to start making a monitor that will do both...?
  • JarredWalton - Thursday, January 8, 2015 - link

    Keep in mind that FreeSync uses DisplayPort Adaptive Sync and should thus always work. Of course, G-SYNC displays work fine with AMD GPUs as well, but without the variable refresh rate aspect. G-SYNC will remain an NVIDIA exclusive I'm sure, but if FreeSync gains enough support NVIDIA might be forced to abandon G-SYNC and support FreeSync instead. Time will tell if FreeSync will get enough support, but it seems more likely than G-SYNC taking over.
  • chizow - Thursday, January 8, 2015 - link

    Again, it will come down to which solution is better. Why would Nvidia need to abandon G-Sync when their solution holds an overwhelming majority of the potential TAM for variable refresh rate monitors that matches their high-end dGPU marketshare?

    I suspect many expected the same of CUDA as soon as OpenCL rolled out a few years ago, y'know, open standards, multiple disinterested industry parties backing it, etc., and yet CUDA is still alive and dominating today, because it's the better solution.

    I expect nothing less when it comes to G-Sync and FreeSync. The market will respond, and the better solution will prevail.
  • Senti - Thursday, January 8, 2015 - link

    Silly, nVidia doesn't care about "potential" – all they need is "vendor lock-in". I'm quite sure that the technical differences for video cards are minimal.

    About CUDA: almost everyone who writes it hates nVidia for that. It's far from the "better solution", but you get great developer tools for it (including very important ones, like the profiler), while for OpenCL nVidia provides you almost nothing (even AMD's profiler is more informative on an nVidia GPU than nVidia's own). And it's all pure marketing, as both CUDA and OpenCL compile into the same PTX on nVidia, so there's no technical reason for such disparity in support. With equal support and tools, CUDA would have been dropped and forgotten long ago.
  • invinciblegod - Thursday, January 8, 2015 - link

    You forgot the fact that CUDA came first and therefore already had developer tools created for it. Creating new tools requires time and money. If CUDA works fine for nVidia, why would they bother making new tools?
