Intel F-processors: end of the road for integrated graphics?
Intel F-processors come with their on-chip graphics unit disabled. Strange, as F-processors aren't much cheaper than their non-F cousins. Is Intel planning to remove integrated GPUs from its desktop Core i series altogether?
Since the first Core i series launched in 2010, Intel has almost always integrated GPUs into its CPUs. At the recent CES, the chip maker announced processors with an added "F". These chips still have integrated graphics units, but they are either deactivated or defective.
Less bang for your buck
Essentially, F-processors are completely normal ninth-generation Core i CPUs. If you're annoyed that Intel is selling these chips at roughly the same price as their non-F counterparts, or only slightly less, you're not alone. So am I. But Intel's pricing isn't what this article is about. I'm wondering why the Sultan of semiconductors from Santa Clara simply deactivates the GPUs instead of removing them.
On one hand, shifting a whole production line to GPU-free processors is hard to justify. On the other hand, functioning CPUs with defective GPUs can be sold for profit instead of gathering dust in some warehouse. Intel is still battling difficulties in manufacturing its 14 nm++ chips. The boost in usable output that comes from adding another suffix to a processor's name isn't something the company should frown at.
Revenue is the central reason for the existence of F-processors. It carries even more weight considering that Intel is planning to enter the graphics card market. An exact entry date is unknown, but 2020 seems like a safe bet. You can't game on integrated graphics, of course. But with an F-processor, even an office user is forced to add a dedicated graphics card to their PC. A new avenue for Intel, as the CPU market is getting more and more competitive. With the risk of losing market share, especially in the mobile sector, Intel is expected to jump at any opportunity to cushion its losses.
ARM: the kingslayer?
Think of Apple what you will, but you can't claim they don't follow trends. The tech giant is adopting another one next year: Apple will begin using ARM chips instead of Intel's. Intel will lose another important buyer of its mobile CPUs. Microsoft is also working on making Windows more ARM-friendly, so Windows notebooks could foreseeably run on ARM processors, too. Intel definitely needs to think about new sources of revenue.
Tech YouTuber Coreteks has summarised Intel’s ARM problem in this highly-informative video.
Even if I don't agree with everything Coreteks says, he makes a plausible case for why ARM is the future of notebooks and cheap PCs. With the advances ARM chips have made over the last ten years, they can certainly take on Intel's mobile processors, while being more efficient and much cheaper.
According to the IDC's Personal Computing Device Forecast 2019–2023, notebooks and mobile workstations account for 42.4 percent of the personal computing device market in 2019, while desktop PCs sit at only 22.6 percent. Intel currently holds between 75 and 80 percent of the CPU market. If the notebook and mobile workstation business slips away to ARM, it'll be one hefty financial blow for Intel.
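To get a feel for the scale, here's a quick back-of-envelope sketch. The percentages are the ones quoted above; combining device-market share with CPU-market share this way is my own rough simplification, since the two figures aren't measured on the same base.

```python
# Back-of-envelope estimate of Intel's mobile exposure.
# Shares below are the figures quoted in the article; multiplying them
# together is an illustrative assumption, not a rigorous market model.

notebook_share = 0.424            # notebooks + mobile workstations, IDC 2019 forecast
intel_cpu_share = (0.75, 0.80)    # Intel's estimated overall CPU market share

# Rough slice of the whole personal computing device market that rides
# on Intel mobile CPUs, at the low and high end of Intel's share:
exposure = tuple(round(notebook_share * s, 3) for s in intel_cpu_share)
print(exposure)  # → (0.318, 0.339)
```

In other words, very roughly a third of the personal computing device market would be up for grabs if ARM took over the notebook segment.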
On top of that, Intel is losing market share to AMD, and not just in the desktop sector. Intel is under extreme pressure to find new markets, and the lucrative graphics card business seems like its best bet. Why shouldn't the average Joe buy a dedicated graphics card for his PC, and an Intel one at that? Bundled deals would be a perfect fit here.
Integrated graphics have given all they’ve got
F-processors are only the beginning. I believe that Intel will stop adding on-chip graphics units to its desktop processors from now on. Still, it's wrong to paint Intel as the big bad company here. Processors without integrated graphics are standard at AMD, and Team Red has been selling graphics cards for years.
What do you think? Am I blinded by my presumptions and expecting too much of Intel? Or are you of the same mindset? Let me know in the comments.
From big data to big brother, Cyborgs to Sci-Fi. All aspects of technology and society fascinate me.