China allows first wave of chatbots, India's sun-spotter soars; ASUS smacks down speculation it will quit smartphones
Samsung last Friday announced it has developed a 32-gigabit DDR5 DRAM die using its 12 nanometer-class process technology.
The Korean giant has mass-produced 16Gb DRAM since May 2023, and claimed its new and denser product "paves way to DRAM modules of up to 1TB capacity" without offering any hint of a roadmap or timeframe for those colosso-modules to debut.
Nor has Samsung hinted at the price a 32Gb module will command.
But SangJoon Hwang, executive vice president of DRAM Product & Technology at Samsung Electronics, said 1TB modules will be welcomed by those running AI and big data applications.
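The density arithmetic behind Samsung's 1TB claim is straightforward to sketch. The die counts below are illustrative, not Samsung's published module design: they simply show that halving the number of dies needed per terabyte is what a 32Gb part buys over a 16Gb one.

```python
# Illustrative capacity arithmetic for DRAM modules (not Samsung's design).
GBIT = 2**30  # bits in one gigabit

def module_capacity_bytes(die_gbit: int, dies_per_module: int) -> int:
    """Total module capacity in bytes for a given die density and die count."""
    return die_gbit * GBIT * dies_per_module // 8

# A 1TB (2**40-byte) module holds 8Tbit of DRAM in total, so:
dies_32gb = (2**40 * 8) // (32 * GBIT)  # 256 dies with 32Gb parts
dies_16gb = (2**40 * 8) // (16 * GBIT)  # 512 dies with 16Gb parts
print(dies_32gb, dies_16gb)                      # 256 512
print(module_capacity_bytes(32, 256) == 2**40)   # True
```

In other words, the denser die means a 1TB module needs 256 chips instead of 512, which is why higher per-die density is the gating factor for these "colosso-modules".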
Designed for DARPA, Intel's RISC-based 7nm chip can hit a theoretical bandwidth of 1TB/s
Intel has unveiled a mind-boggling new chip that combines optical networking technology with cores that each run 66 threads, rather than the standard two. The result is a monstrously powerful eight-core CPU with 528 threads.
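The headline thread count follows directly from the reported core configuration; a quick check of the arithmetic:

```python
# Thread-count arithmetic as reported: 8 cores, each running 66 threads.
cores = 8
threads_per_core = 66  # versus the usual 2 with conventional SMT
total_threads = cores * threads_per_core
print(total_threads)  # 528
```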
Revealed at the Hot Chips 2023 conference, the 7nm prototype CPU is designed to churn through massive data-processing workloads at 1TB/s, and it is the first to use a direct mesh-to-mesh photonic fabric, with silicon photonic interconnects linking chips together.
It was designed specifically for DARPA's Hierarchical Identify Verify Exploit (HIVE) program, according to The Register. That program aims to build a graph analytics processor with a novel architecture that can process data at lightning speeds without consuming excessive amounts of power.
There is no question in our minds here at The Next Platform that quantum computing, in some fashion, will be part of the workflow for solving some of the peskiest computational problems the world can think of.
What we have had our doubts about is if anyone can ever make any money from quantum computing. It is not going to be a volume product, and it is not going to be inexpensive, either. The cost will probably be a fraction of the value of the benefits derived from quantum computing, of course, just like in the HPC simulation and modeling business, but the HPC business has not been particularly profitable over the past six decades, either.
Now, AI training and inference at scale - that is another story entirely. And we aren't writing that one today, on purpose, because, frankly, we have AI poisoning. Maybe we all have AI poisoning at this point. Let's take a breather together...
While the theory of quantum mechanics is complex, the tech might offer practical uses for organizations. Learn about the challenges and opportunities of quantum computing.
Quantum computing may seem like a technology that belongs in the distant future, but there are potential real-world applications today. Still, to take advantage of them, organizations need to address the challenges quantum computing poses.
Many organizations are looking into practical use cases of quantum computing and for good reason. Quantum computing has the potential to solve computational problems significantly faster than existing classical computing approaches. Businesses might want to use quantum computing for optimizing investments, improving cybersecurity or discovering new paths for value creation.
Two semiconductor analysts have written an Emerging Memories report and suggest MRAM has good prospects for replacing SRAM and NOR flash in edge computing devices needing to process and analyze data in real-time.
Tom Coughlin of Coughlin Associates and Objective Analysis' Jim Handy have produced a 272-page, 30-table analysis of the prospects for five emerging memory technologies: MRAM, Phase-Change Memory (PCM), Ferroelectric RAM (FeRAM), Resistive RAM (ReRAM) and NRAM/UltraRAM. These are typically non-volatile memories with DRAM data access speeds, and roadmaps to increased densities that promise to go beyond the scaling limits of NAND and NOR, while using less electricity than constantly refreshed DRAM or always-powered SRAM.