Intel CPU Innovation.. or Lack Thereof?

December 31, 2016 · 33 Comments

Has Intel pushed performance ahead in the last 10 years? Let’s take a look!

Buy Intel CPUs on Amazon:

Enter our Razer giveaway:

Buy Kraken Pro V2:
Buy Kraken 7.1 V2:

GFuel link: Use offer code “LINUS” to save 10% over at

Discuss on the forum:

Our Affiliates, Referral Programs, and Sponsors:

Check out our Linus Tech Tips posters at

Intro Screen Music Credit:
Title: Laszlo – Supernova
Video Link:
iTunes Download Link:
Artist Link:

Outro Screen Music Credit: Approaching Nirvana – Sugar High

Sound effects provided by


33 Replies to “Intel CPU Innovation.. or Lack Thereof?”

  1. You need to redo the test with AMD CPUs

  2. fx 8350 still relevant 😀 haha

  3. Dave G says:

    I disagree with you Linus, and here is why…

    I have been using and programming computers since the 1970s. I went to university for my Bachelor's in Electronics Engineering, I am a multi-published author in many electronics magazines, books, and journals, and I have been a hardware and software developer for more than 40 years now. Yes, I'm old.

    In the early days of processors, we saw leaps and bounds from one generation to the next: significant increases in core clock speed (e.g., 500 MHz to 1 GHz), wider registers and buses (e.g., 8-bit to 16-bit), and improvements in instructions per clock, cache, and so on.

    But there are practical limits to core clocks; we can't just keep doubling them. There are practical limits to bus width, and practical limits to IPC, cache, etc. So the "leaps and bounds" that we saw in earlier generations simply will not occur anymore.
    Are cars and planes still doubling in speed and performance every year?

    However, overall system performance has still increased significantly if we compare, for example, a "total system available then" built around an i7-2600K versus a "total system available now" built around an i9-9900K. Over the past few years we moved up to faster DDR4, faster USB 3.1, faster SSDs and NVMe PCIe drives, faster GPUs, etc.

    However, we are also seeing a lot of software starting to lag significantly behind the hardware.
    Who needs an 8-core, 16-thread processor if they are simply surfing Facebook and YouTube and playing online card games? Even the majority of current AAA game titles run on engines that support no more than 2 to 4 threads.

    I know that some people will point to the push that AMD has brought about recently with their Threadripper CPUs, but I have to disagree there. If we ignore price for a moment, Intel has had high-core-count CPUs in their Xeon series for many years. And as I stated above, what good is a 16-core or 32-core Threadripper or Xeon for 98% of the population, other than bragging rights for those who like "specs" and toys like rainbow-colored water cooling and RGB LEDs?

    I have had computers based on Intel processors all the way back to the 8088.
    I still have computers in my office ranging from the i7-2600K (4-core HT) to the i7-6950X (10-core HT). Looking strictly at the raw performance numbers of the CPU alone, the i7-2600K is still a great processor, but the overall system comparison is night and day between my i7-2600 / 8GB DDR3-1333 / HD 6850 / WD Black 500GB / Windows 7 and my i7-6950X / 64GB DDR4-2666 / GTX 1080 / WD Blue SSD / Windows 10. The newer system is significantly faster in every way. Even the benchmarks for the processor alone show a noticeable difference: PassMark i7-2600 = 8,186; i7-6950X = 19,945.

    The biggest performance boost we need these days, in my opinion, is better use of threading, so that the high-core-count Intel i9/Xeon and AMD Threadripper processors can actually be utilized to their maximum.
    As I mentioned earlier, the majority of game engines use only two to four threads. Most software that people run doesn't require a high-core-count system. Even software like Adobe Premiere Pro doesn't utilize the full hardware. My video-editing system is an i7-6800K (6-core, 12-thread) / 32GB DDR4-2666 / dual AMD W7100 8GB / WD Blue SSD, and even on 2.5K RAW BMD footage with a LUT, color correction, and 8 to 10 GPU effects, I still never see over 50% CPU and about 15% GPU usage.

    I also develop software (Demenzun Media Inc.), and my current 3D software can utilize up to 1,024 cores plus SIMD for significant performance, easily maxing any CPU at 100%. Unfortunately, the vast majority of software doesn't. So most people's CPUs will typically sit idle most of the time…

    And for the gamers, since the vast majority of games are not CPU-intensive, upgrading your GPU is almost always the best way to improve performance.  Unfortunately those RTX-2080s are pretty pricey.  🙂

    Feel free to trash me in the comments; I know that I am more of a "productivity software" user than a gamer, which is what most of the people who visit this channel are.

  4. Watching this on my PC with a 2990WX

  5. Andy T says:

    3 years on- how well did this work out for them?

  6. Will you be doing any AMD EPYC or Opteron server build guides?

  7. I like the amd cpu content even more.

  8. January, 2017
    Ryzen has entered the chat

  9. Yeah, competition has returned, SO…

  10. past ball says:

    I just want updates over and over until my phone turns into a transformer

  11. Sulphurous says:

    I thought he was going to say their motto was no longer "lead ahead" at the end.

  12. He actually advertised GFUEL once…

  13. I had a Matrox G400 and a Cyrix CPU on a VIA chipset at some point. There used to be a time when you had more choices for running x86.

  14. Geier S says:

    I think AMD Ryzen 2 proved that it is very much possible to improve CPUs at lower cost.

  15. SoylentGamer says:

    Also gaming is kind of starting to peak in graphical fidelity. VR will likely push this further for a few years, but after that, Moore's law for gaming is dead. We'll be gaming on cheap silicon for centuries. No need to waste valuable graphene chips on silly games. They'll be for the servers, supercomputers and mainframes.

  16. ShavedBird says:

    This certainly aged well

  17. Adam Nenn says:

    I’m here from the future to tell you it gets better

  18. 0:43 the CEO of Linus Media Group ladies and gentlemen

  19. fm00078 says:

    WHOA, a 2016 comparison. AMD not continuing to innovate, per se, gave Intel a break from the mad rush to out-manufacture them, thus giving Intel time to think-tank other venues & ventures.
    If there's a next step in CPU design, maybe Intel will go back to a double-CPU style (no, not like that quad-size memory stick), but instead something like a 1.5x or 2x wider CPU chip. This is not innovation but a temporary solution. Double-wide CPUs can still fit on current motherboard designs without reinventing the wheel.
    Another point: manufacturing costs would be minimal. Maybe that's why the 2016 chip was $1,800.00… redesign & tooling?
    Progressing through Moore's Law there will be walls & stumbling blocks. This forces manufacturers to rethink and/or retool. My wider-CPU suggestion gives users more speed and manufacturers a little more time to innovate. A++
    BUT, I still think a hologram CPU is in the future, though not for at least another 10 to 30 years. Why such a big time spread? Kids these past 20 years don't wanna work, thus shrinking the innovation pool.

  20. fm00078 says:

    HEY, Linus… No need to make a whole new video, add to this video and update with last 3.5 years of CPU's… if possible.
    GREAT WORK by the way.

  22. BOI this video aged well

  23. Mumu Alpaka says:

    3 years later, AMD took over the scenes lol

  24. Year is 2020. Date is 20 July. I report Intel has fallen. AMD has ryzen in its place.

  25. Fabian says:

    Meanwhile Intel in 2020: 14nm++++++++++++++++++

  26. Noob YTP says:

    Linus is the only…
    whatever to fit in every single role.

  27. azrul nizam says:

    …then Ryzen happen

    Intel: panic

  28. I still use Win7 Ultimate in 2020

  29. Watching this from 2020: well let’s just say that things have changed over the last 3-4 years

  30. Jumpier Wolf says:

    Good thing AMD stepped in

Leave a Comment

Your email address will not be published. Required fields are marked *