🔗 Gurman: Forget the M3, the M4 is on the way
And for RAM fiends out there, Gurman also reports that Apple is considering a new memory ceiling of 512GB, up from the current high-end maximum of 192GB.
Gurman: Forget the M3, the M4 is on the way – Six Colors
The speed of these jumps is exhilarating. The GPU, NPU, and higher memory ceiling are clearly targeting the training of LLMs. However, I'd be surprised if special attention weren't also paid to inference, and to how the hardware could speed it up locally for a decent-sized LLM.
And if it's for Gemini, I think it would be quite the coup for Google. But this is Apple, so it's more likely a homegrown LLM.
https://sixcolors.com/link/2024/04/gurman-forget-the-m3-the-m4-is-on-the-way