There is always a fairly fluid movement of engineers in the companies we cover, but recently AMD has made a number of substantial hires into several of its biggest departments.

The newest hire, as reported by AMD, is Dan McNamara, for several years the Senior Vice President of Intel’s Network and Custom Logic Group (formerly the Programmable Solutions Group) and one of Intel’s hires from the Altera acquisition, having spent 11 years at Altera. Dan is set to be AMD’s SVP and GM of the Server Business Unit, a role in which he will work to accelerate AMD’s EPYC portfolio and engage better with AMD’s customers about server solutions built on AMD hardware. This is a slight jump away from his previous focus on SoCs, ASICs, and FPGAs, which may lead some readers to think that AMD might be going in that direction; note, however, that Forrest Norrod is still heading up AMD’s Enterprise, Embedded, and Semi-Custom Business Group. Dan’s hiring was the focus of a recent AMD blog post about promotions and new hires.

While not specifically promoted by AMD in that post, the company has also made two key hires, both of whom spent the last 20+ years at IBM. First on that list is Dr. Bradley (Brad) McCredie, whom AMD actually hired back in June. Brad started at IBM in 1991, focusing on packaging and mainframes, and went on to spend over 28 years at the company, including stints in POWER system development and a turn as President of the OpenPOWER Foundation. He is now at AMD as a Corporate Vice President of GPU Platforms, where he will specifically oversee the execution of AMD’s data center strategy across CPU and GPU, reporting directly to Forrest Norrod.

The other IBM hire is Joshua (Josh) Friedrich, a 20-year IBM veteran whose roles included POWER5 clock gating, POWER6 frequency lead, POWER7 chip power lead, POWER8 chip circuit lead, and POWER9 concept/high-level design and uncore development; his final role was developing future POWER designs at IBM. Within AMD, Josh’s role is listed as Corporate Vice President, and a spokesperson states that his role is in CPU/GPU integration technologies, reporting to CTO Mark Papermaster. That isn’t a lot to go on, as it could cover APUs or something more unique; on probing AMD for more information, the company confirmed that it’s more on the platform/solution side, aimed at creating differentiated products.

There is one departure to note: Scott Aylor, the Corporate Vice President and GM of AMD’s Data Center Solutions Group, is currently on leave and is set to leave the company at a future date. Dan McNamara is taking over his role, and CRN reports that Aylor’s departure is not related to the new hires.

Title image, from left to right: Brad McCredie, Dan McNamara, Josh Friedrich

Update 1/22: Our moles have done some extra digging, and it turns out AMD hired two other long-time IBM employees in 2019.

Greg Wetli, whom AMD hired back in February 2019 to manage server processor validation, spent 31 years at IBM working on POWER processor validation as well as different aspects of chip design and tooling, going as far back as POWER4.

Norman James, hired back in March 2019 as an AMD Fellow on system architecture, spent 23 years at IBM, starting as a senior engineer on POWER6 before working his way up to Lead Engineer of IBM's Cognitive Systems group, focusing on deep learning and machine learning.

Comments

  • HStewart - Tuesday, January 21, 2020 - link

    I think the opposite can be true, especially dealing with coding. 64-bit and extra processors have made developers lazy - I'm pretty sure very few developers monitor memory leaks and certainly don't count clock cycles.

    There is no wonder in today's world there is so many viruses and spyware.
  • Lord of the Bored - Wednesday, January 22, 2020 - link

    Of course you hate 64-bit. Can't stand that Intel had to copy AMD64, can you?

    Also, IBM shipped the first 64-bit computer in 1961. It wasn't very good, but a few years later the Cray-1 was 64-bit and it was the best in the world.
    Extra word size and address space and multithreading aren't what makes people lazy, as those have been around FOREVER. What makes them lazy is a complete lack of consequences for shitting out bad code.

    And there's so much malware because there's so many computers.
  • Korguz - Wednesday, January 22, 2020 - link

    Lord of the Bored, you also forgot that intel also copied amd with the on die memory controller :-)

    "There is no wonder in today's world there is so many viruses and spyware. " what does that have to do with anything?? i think there are so many of those.. is because some people have nothing else better to do with their time, instead of doing something useful with it.
  • BurntMyBacon - Wednesday, January 22, 2020 - link

    @Korguz: " "There is no wonder in today's world there is so many viruses and spyware. " what does that have to do with anything?? i think there are so many of those.. is because some people have nothing else better to do with their time, instead of doing something useful with it."

    I think this statement actually makes some sense in context. Access to a surplus of memory and CPU resources has allowed bad programmers to write inefficient code that often has memory leaks and other flaws that can be exploited by viruses and spyware. Before this surplus of resources, code needed to be better written just to be usable. Now, slow memory leaks and tanking an entire CPU core for no good reason can be hidden to some extent, as many people have more memory and CPU cores than they would otherwise need. The 64-bit part of the comment only makes sense in that it allowed for more memory capacity in PCs. I don't think 64-bit instructions really play into it.
  • HStewart - Wednesday, January 22, 2020 - link

    I am aware of at least one application I used at work that has a 64/32-bit problem: Cisco WebEx. It's still 32-bit, and the developer has yet to upgrade it to 64-bit.
  • Xyler94 - Monday, January 27, 2020 - link

    And you can thank AMD for making it possible to continue to use 32bit X86 with a 64bit instruction set.

    Intel beat AMD at making a pure 64bit solution, but the world demanded 32bit and 64bit support together, so they had to go back to the drawing board, and AMD won the race, hence AMD64 lives on today.

    If you've ever booted a Linux machine running 64bit, you'll constantly see references to AMD64.
  • Korguz - Tuesday, January 28, 2020 - link

    BurntMyBacon and all that was around before x86-64, point is ?? programmers were lazy with 32bit, and maybe more lazy with 64bit.
  • FreckledTrout - Wednesday, January 22, 2020 - link

    They did ship a 64-bit machine, although not what we would truly call 64-bit by today's standards; it did have 64-bit registers. That mainframe sold for around $10 million, which in today's prices would be about $86 million. It also weighed around 70,000 pounds and took up an entire large room.
  • BurntMyBacon - Wednesday, January 22, 2020 - link

    @ Lord of the Bored: "Extra word size and address space and multithreading aren't what makes people lazy, as those have been around FOREVER."

    In this context, you are talking about a small group of people with (at the time) very specialized education trying to program as efficiently as possible to solve mission critical problems on the most powerful computer in the world. This is not a good equivalent to the rather large group of programmers today programming (in many cases) non-mission critical software for commonly underutilized computers. Despite the Cray 1's relative power, the compute resources of this machine were extremely limited for the workloads that it was used for. Inefficiencies were not as easy to hide and the impact was far more severe when compared to today's underutilized PCs.

    @Lord of the Bored: "Extra word size and address space and multithreading aren't what makes people lazy, as those have been around FOREVER."
    I agree with your point. I don't think the extra word size and address space and multithreading are making programmers lazy. However, I do think that they allow lazy programmers to exist where more restricted resources might have forced them to better optimize or find a different job.
  • Lord of the Bored - Thursday, January 23, 2020 - link

    My point was actually that "better computers" don't make more malware and worse code. It is the AVAILABILITY of computers that makes more malware.

    As far as the lack of equivalence to modern software development and the Cray-1, that lack of equivalence is my point. A few elites that understood every step of the process coded for the Cray-1.
    The overall environment for most modern software development is... different, to put it mildly, and it generates a lot of shitty code. Why learn the system when you can cut/paste blindly from Stack Exchange?
