

Apple's last tower topples… and the others will follow


in reply to Powderhorn

Just because Apple failed to make expandable hardware doesn't mean it won't still work for PCs.
in reply to realitista

The interesting thing is that the people who will care the most about this are professional users, who actually did require a machine with real expandability to stuff full of the likes of SDI video I/O cards (e.g. aja.com/products/kona-5).

If you ask those people, they'll gladly tell you how much it sucked dealing with Thunderbolt-to-PCIe expansion cages during the "Trashcan" era just to use their machines for their work.

While Thunderbolt's throughput has certainly improved a bunch since then (80Gbps symmetrical or 120/40Gbps asymmetrical for TB5, vs 20Gbps for TB2 back in that era), latency and stability still frankly leave a lot to be desired versus a real PCIe slot.

For people who already perceive Apple devices as overpriced toy computers, Apple further alienating what was at one point its primary target audience - high-end professional users - will certainly seem like an odd choice.

in reply to Powderhorn

The writing on the wall is large and clear. You can still have high-end kit, but you don't get to put it together from discrete bits. The fastest parts – the CPU, GPU, volatile and non-volatile storage – all get assembled as a single, highly integrated, non-upgradable component.


Honestly I'm shocked desktop PCs have lasted this long.

That being said, PC gaming is a growing trend, not shrinking, so I suspect there will continue to be at least some availability in the future for those components?

Additionally, while Macs are really great at some workloads, they're still inferior in others to existing desktop machines with dedicated GPUs, and the closest competitor from Apple will still cost at least twice as much.

in reply to artyom

Desktop PCs are so much more powerful and fast than laptops of the same spec. Not to mention cheaper.

High integration on laptops decreases space and cost while wildly increasing battery life for the same battery capacity.

in reply to Kairos

This isn't about laptop vs. desktop but about modular vs. integrated processors.
in reply to artyom

Integrated processors let laptops be faster without also using more power. Strictly speaking it'd be cheaper to just use a faster CPU, but battery life is more important than cost, so lots of money is spent on integrating processors.

Desktops are still around because they're upgradable and faster than their laptop counterparts.

in reply to artyom

An AIO is effectively a laptop without a keyboard. They're functionally very similar (appealing to less power-hungry users). They're just less mobile.

Presumably it's cheaper for Apple to just put the integrated CPUs in everything because it'd be expensive to make another model.

I guarantee you this trade-off only makes sense for Apple. Other AIOs don't always use the new laptop chips from Intel because it makes more sense to use the desktop ones with all the space they have.

in reply to Kairos

They put them in everything because they're smaller and more efficient (and thus quieter) and because they're competitive with PC desktops in performance. And economies of scale don't hurt either.
in reply to artyom

I get what you mean. What I'm trying to say is that desktop/non-integrated CPUs are cheaper, and that cost savings carries over into a larger form factor. Apple doesn't put a desktop chip in their iMacs because they don't make one; that's not what their customer base needs. If they did, it'd be 4x faster for the same price.

And these ARM chips are slower than x86. x86 is so much faster, at least for single-core performance, which matters a LOT more for desktop use cases.

in reply to Kairos

Again, it depends on the workload. There are endless comparisons between high-end desktops and comparably-priced Mac desktops, and while the PC is often more powerful, that's not always the case, and the Mac does it while being much quieter and not turning the room it's in into a sauna.
in reply to artyom

Yes, as it turns out, when your workload is 99999 idle applications, a larger number of cores helps more than single-core performance. SoCs don't change that. They just reduce power and space usage at the expense of cost. It makes no sense to point at the special-case computing company and say that their special case will suddenly override a 50-year pattern.
in reply to Kairos

It has nothing to do with idle applications; you should really look more into this, because what you're saying is simply misinformed.
in reply to artyom

Can you give me an example? The video you linked has a timestamp to something about video encoding.
in reply to Kairos

The Mac is comparable in photo and video editing and dominates in LLM generation.
in reply to artyom

Which again would be cheaper if they put the chips in separate enclosures - just way bigger, with more power usage.

Macs are good at video editing because Apple actually gives a shit about hardware encoding. NVENC is the only competitor. Everything else is shit.

in reply to artyom

I'm not sold that modular desktops are going away in general. SoCs have some benefits in terms of power usage, but those are most substantial on phones and least substantial on the desktop.

My understanding is that memory may move away from DIMMs to CAMM2 to permit higher speeds, but that's still a modular system.

in reply to tal

CAMM has been around for years now but I've never seen a single model using them. Even Framework passed on them with their new desktop.
in reply to artyom

You don't need to as long as you're getting sufficient speeds from non-soldered DIMMs, and desktops are generally still using non-soldered DIMMs.
in reply to tal

Yes. Apple can do these things because their SoC is their market differentiator. It's not an overall market direction.
in reply to artyom

PC gaming is a growing trend, not shrinking


Wait until we see the 2026 stats for hardware sales. 📉

Though I think the supply issues will hurt consoles just as much.

in reply to Powderhorn

I'm not understanding the logic here. Apple killed their last tower. That isn't surprising, and their user base is perfectly happy buying nothing but SoCs.

Then there is a still-expanding PC gaming market, where building the machine from discrete parts is a portion of the hobby. By and large, this has never really overlapped with Apple's user base.

The article does a poor job of explaining why we should expect non-Apple machines to go in the same direction.

in reply to circuitfarmer

They already are. Increased speed and efficiency are solid reasons. The Mac Pro was absolutely enormous in comparison to the new Mac Studio, which absolutely blows it away in terms of performance, while being a lot cheaper. Strix Halo is a great example of similar benefits on the PC front.

The vast majority of PCs aren't sold to hobbyists. Gamers mostly benefit from the existence of other markets that these chips can be sold to. If those go, these chips get taken off the market.

in reply to Powderhorn

Aside from them, discrete graphics cards are history, just as disk controllers were a few decades earlier. DIMM slots are going too. The primary storage will be built in. (The industry missed a great deal there.)


Discrete disk controllers are still around.

My last desktop had a PCI SATA card that I added after I exhausted all of the on-motherboard SATA slots.

My current one has a JBOD SATA USB Mass Storage enclosure.

in reply to tal

We are talking about Apple, the "you'll pay $bigbucks to have one usb port and you'll be happy about it" Apple here...
in reply to SharkAttak

I mean, Apple is the example the author is using to come to his conclusions, but he's talking about the industry as a whole regarding the disk controllers.
in reply to Powderhorn

This is a bad article. It's just an Apple fanboy watching their company continue its trend of shitting on customers and assuming that everyone else inevitably will too, apparently never once reflecting on whether their insistence on sticking with Apple is the real problem.

Their argument boils down to CPUs increasingly integrating basic versions of other components over time meaning that desktops will disappear... ignoring that the desktop market has stayed surprisingly flat that entire time and has certainly not disappeared.

If your argument is that integrated CPUs will outclass discrete components connected with high speed buses then you need to make it from an engineering standpoint, not a headline one.

I also don't understand his reasoning that because Nvidia didn't buy ARM, they don't get to make an integrated CPU... Nvidia made and sold an integrated ARM CPU before ever being rumoured to buy them, and they still make and sell them to this day, because ARM's entire business model is based on companies like Nvidia licensing their designs.

in reply to masterspace

It’s just an Apple fanboy


checks article history

Almost all of their articles are about Linux.

in reply to tal

Hey now, let's let this user craft their own reality!
in reply to tal

And what hardware do they run Linux on? And what phone do they use? And what TV device?

And if they're not a Mac fanboy, then their insistence on looking at Apple as the only possible sign of industry trends is mind-bogglingly narrow.

in reply to masterspace

It's an opinion piece. I don't agree with all of it, either.

This said, do you really miss having a northbridge and southbridge?

in reply to Powderhorn

Do I really miss it? It never once came up in any practical situation.

You would buy a mobo and a CPU and put them together and not think about the specific buses or controllers you have available, unless you had a very specific reason to.

Unless we're talking about a mobile, power-constrained device, I certainly would rather have expandable RAM and graphics cards than everything slammed into a single unchanging chip.

And again, the fact that the author states that Nvidia can't release an integrated SoC because they didn't buy ARM, when they actively sell an integrated SoC licensed from ARM, makes the entire rest of their "opinion" untrustworthy.
