
25 Comments


  • shabby - Monday, October 16, 2023 - link

    100-200MHz bumps lol
  • shabby - Monday, October 16, 2023 - link

    Only the 14700K has a decent core bump; the rest are junk
  • Flunk - Monday, October 16, 2023 - link

    ZZZZ, why bother?
  • kpb321 - Monday, October 16, 2023 - link

    Okay, so they released the i9-13900KS as a regular chip in the form of the i9-14900K, and the higher-end i7 chips are now partially binned i9 chips. Pretty big yawn. No wonder they are fine announcing these ahead of time. Maybe it will push the prices of 12th and 13th gen chips down even more, making them even better options, but this is the non-upgrade I think pretty much everyone expected.
  • Duwelon - Monday, October 16, 2023 - link

    What are desktop users (i.e. gamers) supposed to do with 8+ E-cores? Do they assist gaming workloads even a little bit? What's the point of them on desktop again?
  • Tom_Yum - Monday, October 16, 2023 - link

    To win multicore benchmark scores.
  • wrosecrans - Monday, October 16, 2023 - link

    The short answer is that 8 extra cores are of no particular benefit to somebody playing a video game. No current game requires that many cores, because the market for such a game would be vanishingly small.

    If you have a zillion browser tabs open to crappy pages with JS that burns cycles, it can be helpful. It's kind of sad that so many modern web pages have gotten so bad that looking at the web is a serious driver for faster CPUs. But the main benefit is for folks doing things like content creation. If you are rendering CGI while also editing video, you can't have too many cores. Or if you are a developer working on big projects, being able to build across more cores can be a big help. (Though at this point, some real-world applications have fewer source files than a modern desktop has cores. So if you mainly work on tiny utilities, even developers may start to see a point of diminishing returns with more cores!)
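To put rough numbers on that diminishing-returns point, here is a minimal Amdahl's-law sketch; the 60% and 95% parallel fractions are illustrative assumptions, not measurements of any actual game or build:

```python
# Amdahl's law: overall speedup when a fraction `parallel` of the work
# can be spread across `cores` cores and the rest stays serial.
def speedup(parallel: float, cores: int) -> float:
    return 1.0 / ((1.0 - parallel) + parallel / cores)

for cores in (4, 8, 16, 24, 32):
    # Hypothetical cases: a game loop that is ~60% parallel vs. a big build that is ~95% parallel.
    print(f"{cores:>2} cores: game ~{speedup(0.60, cores):.2f}x, build ~{speedup(0.95, cores):.2f}x")
```

Even the 95%-parallel case tops out around 20x no matter how many cores are added, while the mostly serial case barely moves past 8 cores, which is the shape of the argument above.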
  • Duwelon - Monday, October 16, 2023 - link

    I wonder if we're not just witnessing an AI-driven fad. I built a 16-core machine three years ago (Zen 3) thinking I'd get a lot of use out of it with ML-driven photo editing/enhancement tools. The technology is good for niche cases, but it's very primitive and not ready for commercial use in many cases, at least not when it's obvious it's computer-generated. In other words, are the vast majority of these E-cores doing absolutely nothing?
  • Samus - Tuesday, October 17, 2023 - link

    Exactly this. Bearing in mind that most games are built around an 8-core ecosystem (since the Xbox and PS5 are both 8-core), it's doubtful PC ports, or PC games that may be ported to consoles, will be designed to utilize more than 8 cores. The fact is most AAA games are still GPU-limited on the most powerful CPUs.

    Elsewhere, in other applications, cores are incredibly useful, but not to most typical desktop users. Compression/decompression benefits tremendously, as does any other multithreaded task. But with rendering, and especially AI plugins for Adobe apps and the like, using GPU acceleration, CPU performance is again becoming less critical.

    And this is why Intel can get away with calling 13th gen parts 14th gen parts: nobody cares.
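As a concrete illustration of the compression point above, here is a minimal sketch (an assumed example, not anything from the article or the comment) of chunked parallel compression, the same basic trick multithreaded tools such as pigz or zstd -T use to scale across cores:

```python
import zlib
from concurrent.futures import ProcessPoolExecutor

def compress_chunk(chunk: bytes) -> bytes:
    # Each chunk is compressed independently, so chunks can run on separate cores.
    return zlib.compress(chunk, level=9)

if __name__ == "__main__":
    data = b"some highly compressible text " * 1_000_000   # ~30 MB of dummy input
    chunk_size = 4 * 1024 * 1024
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]
    with ProcessPoolExecutor() as pool:                     # spreads work across available cores
        compressed = list(pool.map(compress_chunk, chunks))
    print(f"{len(data)} bytes -> {sum(len(c) for c in compressed)} bytes in {len(chunks)} chunks")
```

Throughput scales roughly with core count until the chunks run out, which is why (de)compression is one of the few everyday desktop tasks where extra E-cores visibly help.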
  • Dizoja86 - Monday, October 16, 2023 - link

    Multitasking and productivity. A lot of us use our computers for more than gaming.

    Although there are few current games that would utilize every core, I would definitely make use of these in working with photography and scientific software. I definitely don't regret my 5900X for similar reasons, even though a 5800X would do the same job in most games.
  • aparangement - Monday, October 16, 2023 - link

    I'm wondering what exactly the "more than gaming" workloads are that actually depend on the E-cores.

    For laptops I guess a few E-cores make sense, since lightweight tasks can be shifted onto them. But even for a laptop, wouldn't 2 or 3 E-cores be enough?

    Servers, on the other hand, might also benefit from a large number of E-cores, since most of their tasks are heavily threaded and E-cores are more energy efficient.

    But home computers? I really don't get it...
  • Samus - Tuesday, October 17, 2023 - link

    E-cores are exceptionally powerful. Intel's E-cores are between Skylake and Coffee Lake cores in performance, and the Zen 2 cores used in the Xbox and PS5 are comparable to Coffee Lake.

    I wouldn't dare say E-cores are as powerful as Zen 2 cores, but in the case of a console SoC I bet they are pretty close, due to power budgeting, the lower clock speed of the Xbox and PS5 (~3.5 GHz), and the way RDNA2 GPU utilization reduces the package power available to the CPU cores, limiting how high they can clock.
  • Duwelon - Tuesday, October 17, 2023 - link

    Who in the mainstream are these E-cores on desktop helping? Personally, I suspect that whether a mid-tier or higher PC has 8 E-cores or 8 million E-cores, the vast majority of users wouldn't notice a single difference, because practically speaking almost nothing scales past 4 or 8 cores.
  • Tilmitt - Monday, October 16, 2023 - link

    Back to refreshing Skylake for 5 years again.
  • lmcd - Monday, October 16, 2023 - link

    Ironically, the Skylake refreshes were more justified. Skylake had a ton of errata: Kaby Lake fixed most of its sleep issues, Coffee Lake supposedly fixed virtualization-based security, and Comet Lake fixed Meltdown issues.

    Raptor Lake is maybe a few fixes and a better spin, and this refresh is even more pointless (and probably isn't even a new spin).
  • meacupla - Tuesday, October 17, 2023 - link

    At least this one still works on LGA1700.
    When was the last time Intel supported the same socket for 3 generations of CPUs?
  • duploxxx - Tuesday, October 17, 2023 - link

    So just a new stepping is now called a generation? Yeah, you are so aligned with Intel marketing tricks... All this is the result of Meteor Lake failing: not capable of fitting enough P-cores in the package and unable to clock high enough in GHz to leap past RDL.
  • PeachNCream - Tuesday, October 17, 2023 - link

    Yeah, whatever. 253 W TDP. That's stupid even in an obsolete desktop form-factor case. Managing the cooling and the power consumption on a monthly utility bill for something like that, plus the inevitably mandatory, comically gigantic graphics card people will pair it with, is beyond reason. While it isn't quite as irresponsible as having children, the impact of entertainment is dreadfully high for all of us living on this dying planet together.
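For anyone curious about the utility-bill angle, a back-of-envelope estimate (every figure here is an assumption, not a measurement: 253 W CPU limit, a ~350 W GPU, 4 hours/day at full load, $0.15/kWh):

```python
cpu_w, gpu_w = 253, 350             # assumed sustained draws at full load
hours_per_day, usd_per_kwh = 4, 0.15
kwh_per_month = (cpu_w + gpu_w) / 1000 * hours_per_day * 30
print(f"~{kwh_per_month:.0f} kWh/month, roughly ${kwh_per_month * usd_per_kwh:.2f}/month")
```

That works out to roughly 72 kWh and about $11 a month under these assumptions; the cooling load of 600+ W dumped into the room is arguably the bigger practical nuisance.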
  • Tilmitt - Tuesday, October 17, 2023 - link

    Your genes will be replaced by those of people with no inhibitions about reproducing.
  • charlesg - Tuesday, October 17, 2023 - link

    What a depressing and distorted viewpoint of reality!
    Assuming you actually believe this nonsense, and aren't a bot, maybe it's time to subscribe to a new ideology that is closer to reality?
    You'll be much happier.
  • jimbo2779 - Tuesday, October 17, 2023 - link

    I thought they were dropping the "i" branding.
  • ingwe - Tuesday, October 17, 2023 - link

    These power draws just feel wild. I am running a 5600X at 30 W and it does just fine. I realize I am leaving some performance on the table, but even in some more demanding games it isn't a problem. I can't imagine running my CPU at 250 W.
  • Eletriarnation - Tuesday, October 17, 2023 - link

    This is exciting, not because of anything about the chips themselves, but because this is the third generation of processors on a single socket. We haven't seen Intel do this since Socket 775, unless I'm forgetting something. Maybe they'll be a bit more willing to let platforms last 3+ years going forward.
  • Samus - Tuesday, October 17, 2023 - link

    I couldn't help but notice that too, but does 14th gen even count as a generational advancement?
  • hubick - Tuesday, October 17, 2023 - link

    How many people work at Intel? Y'all showed up every day for a year, and THIS is what we get from all those hours?

    This amounts to NOTHING. Nothing of real use, anyway. Garbage. Replace everyone involved with someone who can get Thunderbolt 5 out the door faster; at least that will be useful for something.

    Also, DDR5 is still basically a wash over DDR4 due to latency, except maybe at the very, very high end? Y'all need to push RAM vendors to make a system upgrade worthwhile.

    I feel like my Threadripper 3960X came at a hell of a good time to buy a new system. I'm not feeling even slightly compelled to replace it with this garbage. Maybe when TB5 does come out... but that probably won't actually be on shelves until 2025.
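On the DDR4-vs-DDR5 latency point, the "wash" claim is easy to sanity-check: first-word CAS latency in nanoseconds is CL cycles divided by the memory clock, i.e. CL * 2000 / (MT/s). The kits below are common examples chosen for illustration, not ones named in the article:

```python
def cas_latency_ns(cl: int, mt_per_s: int) -> float:
    # Memory clock in MHz is half the transfer rate, so latency = CL / (MT/s / 2000) ns.
    return cl * 2000 / mt_per_s

for name, cl, mts in [("DDR4-3600 CL16", 16, 3600),
                      ("DDR5-6000 CL30", 30, 6000),
                      ("DDR5-7200 CL34", 34, 7200)]:
    print(f"{name}: ~{cas_latency_ns(cl, mts):.1f} ns")
```

All three land around 9-10 ns of first-word latency, so DDR5's gains come mostly from bandwidth rather than latency, which is consistent with the comment's point.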
