Memory
Latest about memory

Apple may switch to 16GB of memory as default for new M4 series
By Mark Tyson published
An Apple insider has noted that a quartet of new base-level Mac computers are being tested, all with at least 16GB of RAM. Upcoming M4 machines are likely to adhere to this minimum.

Nvidia officially announces a new RTX 4070 with slower GDDR6 memory
By Andrew E. Freedman published
Nvidia is adding a GDDR6 variant of the RTX 4070 in order to improve supply. The change shouldn't affect performance much, if at all, but it probably won't change the pricing either.
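
For a rough sense of why the downgrade is expected to matter little, here is a back-of-the-envelope Python sketch. The 192-bit bus and the 21 Gbps (GDDR6X) versus 20 Gbps (GDDR6) per-pin rates are assumptions based on how the card is commonly reported, not figures from this article.

def bandwidth_gbs(bus_width_bits, per_pin_gbps):
    # Peak memory bandwidth in GB/s = bus width (bits) * per-pin rate (Gbps) / 8
    return bus_width_bits * per_pin_gbps / 8

# Assumed figures: 192-bit bus, 21 Gbps GDDR6X vs 20 Gbps GDDR6
gddr6x = bandwidth_gbs(192, 21.0)
gddr6 = bandwidth_gbs(192, 20.0)
print(f"GDDR6X: {gddr6x:.0f} GB/s, GDDR6: {gddr6:.0f} GB/s "
      f"({100 * (1 - gddr6 / gddr6x):.1f}% less)")

Under those assumptions the GDDR6 card gives up roughly 5% of peak bandwidth, which is why the performance impact is expected to be small.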

China's new Hygon CPU spotted with 64 Zen cores
By Anton Shilov published
Hygon's C86-7490 processors use AMD's SP5 packaging, but what resides under the heat spreader?

G.Skill launches ultra-low-latency RAM for Intel and AMD CPUs
By Aaron Klotz published
G.Skill has unveiled a new DDR5 memory kit aimed squarely at performance. The kit runs at DDR5-6400 with a very low CAS latency of 30.
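
To see what "ultra-low-latency" means in absolute terms, here is a minimal Python sketch that converts CAS cycles to nanoseconds; the DDR5-6000 CL36 comparison point is a typical mainstream kit chosen for illustration, not something from the article.

def cas_latency_ns(data_rate_mts, cas_cycles):
    # First-word latency in ns: the memory clock runs at half the transfer
    # rate, so latency = cas_cycles / (data_rate / 2), converted to ns.
    return 2000 * cas_cycles / data_rate_mts

print(f"DDR5-6400 CL30: {cas_latency_ns(6400, 30):.2f} ns")  # G.Skill's new kit
print(f"DDR5-6000 CL36: {cas_latency_ns(6000, 36):.2f} ns")  # assumed mainstream kit

At DDR5-6400 CL30 the first-word latency works out to about 9.4 ns, versus 12 ns for the assumed mainstream comparison.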

Pick up 32GB of speedy DDR5-6000 Team T-Force Vulcan memory for only $86
By Stewart Bendle published
Snag a deal on 32GB of DDR5-6000 memory with this Team T-Force Vulcan kit for only $86.

Nvidia RTX 4070 with slower GDDR6 memory is on the way, according to rumors
By Mark Tyson published
Green team is likely responding to component pricing and supply conditions.

China's CXMT begins mass-producing HBM2 memory well ahead of schedule
By Anton Shilov published
China-based memory maker CXMT has begun mass-producing HBM2 memory, well ahead of its previously telegraphed 2026 timeline.

Ampere unveils monstrous 512-core AmpereOne Aurora processor — custom AI engine, support for HBM memory
By Paul Alcorn published
Ampere is adding a 512-core AmpereOne Aurora processor to its roadmap, and it also divulged pricing for its AmpereOne lineup.

SK hynix announces its GDDR7 memory touting 60% faster speeds, 50% improved power efficiency
By Jowi Morales published
Korean firm plans to start GDDR7 mass production in 3Q24.
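
The headline figure lines up with the per-pin data rates usually cited for the two standards. Here is a quick Python sketch assuming 32 Gbps for GDDR7, 20 Gbps for GDDR6, and a hypothetical 256-bit card; none of these numbers come from the announcement itself.

def card_bandwidth_gbs(bus_width_bits, per_pin_gbps):
    # Peak bandwidth in GB/s = bus width (bits) * per-pin rate (Gbps) / 8
    return bus_width_bits * per_pin_gbps / 8

g6 = card_bandwidth_gbs(256, 20.0)   # assumed GDDR6 baseline
g7 = card_bandwidth_gbs(256, 32.0)   # assumed GDDR7 rate
print(f"GDDR6: {g6:.0f} GB/s, GDDR7: {g7:.0f} GB/s (+{100 * (g7 / g6 - 1):.0f}%)")

Under those assumptions the same 256-bit bus jumps from 640 GB/s to 1,024 GB/s, a 60% increase.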

New memory tech unveiled that reduces AI processing energy requirements by 1,000 times or more
By Jeff Butts published
Seeking to improve the energy efficiency of AI applications, a research team in Minnesota may have found a way to cut energy consumption by a factor of 1,000 or more.