Nvidia’s DGX Station is powered by the GB300 Grace Blackwell Ultra
OEMs are making their own versions – Dell’s is the Pro Max with GB300
HP’s upcoming GB300 workstation will be the ZGX Fury AI Station G1n
Nvidia has unveiled two DGX personal AI supercomputers powered by its Grace Blackwell platform.
The first of these is DGX Spark (previously called Project Digits), a compact AI supercomputer that runs on Nvidia’s GB10 Grace Blackwell Superchip.
The second is DGX Station, a supercomputer-class workstation that resembles a traditional tower and is built with the Nvidia GB300 Grace Blackwell Ultra Desktop Superchip.
Dell and HP reveal their versions
The GB300 features the latest-generation Tensor Cores and FP4 precision, and the DGX Station includes 784GB of coherent memory space for large-scale training and inferencing workloads, connected to a Grace CPU via NVLink-C2C.
The DGX Station also features the ConnectX-8 SuperNIC, designed to supercharge hyperscale AI computing workloads.
Nvidia’s OEM partners - Asus, HP, and Dell - are producing their own versions of the DGX Spark, powered by the same GB10 Superchip. HP and Dell are also preparing their own takes on the DGX Station using the GB300.
Dell has shared new details about its upcoming AI workstation, the Pro Max with GB300 (its DGX Spark version is called Pro Max with GB10).
The specs for its supercomputer-class workstation include 784GB of unified memory, up to 288GB of HBM3e GPU memory, and 496GB of LPDDR5X memory for the CPU.
The system delivers up to 20,000 TOPS of FP4 compute performance, making it well suited for training and inferencing LLMs with hundreds of billions of parameters.
HP’s version of the DGX Station is called the ZGX Fury AI Station G1n. Z by HP is now one of the company’s product lines, and the “n” at the end of the name signifies that it’s powered by an Nvidia processor - in this case, the GB300.
HP says the ZGX Fury AI Station G1n “provides everything needed for AI teams to build, optimize, and scale models while maintaining security and flexibility,” noting that it will integrate into HP’s broader AI Station ecosystem, alongside the previously announced ZGX Nano AI Station G1n (its DGX Spark alternative).
HP is also expanding its AI software tools and support offerings, providing resources designed to streamline workflow productivity and enhance local model development.
Pricing for the DGX Station and the Dell and HP workstations isn’t known yet, but they obviously aren’t going to be cheap. Pricing for the tiny DGX Spark starts at $3,999, and the larger machines will cost significantly more.
Asus' new Ascent GX10 brings AI supercomputing power directly to developers
Promises 1000 TOPS of AI processing and can handle models up to 200 billion parameters
It's cheaper than Nvidia DGX Spark, with less storage but similar performance
AI development is getting ever more demanding, and Asus wants to bring high-performance computing straight to the desks of developers, researchers, and data scientists with the Ascent GX10, a compact AI supercomputer powered by Nvidia’s Grace Blackwell GB10 Superchip.
Asus’s rival to Nvidia’s DGX Spark (previously Project Digits) is designed to handle local AI workloads, making it easier to prototype, fine-tune, and run impressively large models without relying entirely on cloud or data center resources.
The Ascent GX10 comes with 128GB of unified memory, and the Blackwell GPU with fifth-generation Tensor Cores and FP4 precision support means it can deliver up to 1000 TOPS of AI processing power. It also includes a 20-core Grace Arm CPU, which speeds up data processing and orchestration for AI inferencing and model tuning. Asus says it will allow developers to work with AI models of up to 200 billion parameters without running into major bottlenecks.
Powerful yet compact
“AI is transforming every industry, and the Asus Ascent GX10 is designed to bring this transformative power to every developer’s fingertips,” said KuoWei Chao, General Manager of Asus IoT and NUC Business Group.
“By integrating the Nvidia Grace Blackwell Superchip, we are providing a powerful yet compact tool that enables developers, data scientists, and AI researchers to innovate and push the boundaries of AI right from their desks.”
Asus has built the GX10 with NVLink-C2C, which provides more than five times the bandwidth of PCIe 5.0, allowing the CPU and GPU to share memory efficiently, improving performance across AI workloads.
The system also comes with an integrated ConnectX network interface, so two GX10 units can be linked together to handle even larger models, such as Llama 3.1 with 405 billion parameters.
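Those model-size claims line up with a simple weights-only estimate: FP4 stores each weight in 4 bits (0.5 bytes), which can be compared against the 128GB of unified memory per unit. The sketch below is a rough back-of-the-envelope calculation, not an Asus or Nvidia sizing tool; it ignores KV cache, activations, and quantization metadata, and uses decimal gigabytes as spec sheets do.

```python
import math

FP4_BYTES_PER_PARAM = 0.5   # 4 bits per weight
UNIFIED_MEM_GB = 128        # unified memory per GX10 unit (decimal GB)

def gx10_units_needed(params_billion: float) -> int:
    """Weights-only estimate; real deployments also need headroom for
    KV cache, activations, and quantization scales."""
    weights_gb = params_billion * FP4_BYTES_PER_PARAM
    return math.ceil(weights_gb / UNIFIED_MEM_GB)

print(gx10_units_needed(200))  # 200B params -> 100GB of weights -> 1 unit
print(gx10_units_needed(405))  # 405B params -> ~203GB -> 2 linked units
```

By this measure, a 200-billion-parameter model fits comfortably in one unit, while Llama 3.1 405B spills over 128GB and needs the two-unit ConnectX pairing described above.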
Asus says the Ascent GX10 will be available for pre-order in Q2 2025. Pricing details have not yet been confirmed by Asus, but Nvidia says it will cost $2999 and come with 1TB of storage.
In comparison, Nvidia’s own DGX Spark is a thousand dollars more ($3999) and comes with 4TB of storage.
HP ZBook Fury G1i is a powerful 18-inch mobile workstation
It's powered by up to an Intel Core Ultra 9 285HX and next-gen Nvidia RTX graphics
There's also a 16-inch model available with the same high-end specs and features
It’s a personal preference, but I’ve always liked laptops with bigger screens. That means 16 inches for me, but HP thinks 18-inch laptops are what professionals should be aiming for if they are looking to replace their desktop PCs and get a solid productivity boost.
Billed as the world’s most powerful 18-inch mobile workstation, the HP ZBook Fury G1i 18” still manages to fit into a 17-inch backpack.
That extra two inches gives you roughly 30% more screen area to work with, which can come in handy when handling complex datasets, editing high-resolution media, or working across multiple windows.
Three-fan cooling
HP is pitching the laptop at developers and data scientists who need to train and run LLMs directly on the machine.
The Fury G1i 18” runs on Intel’s latest Core Ultra processors, up to the top-end Core Ultra 9 285HX, with peak speeds of 5.5GHz. These chips also include an NPU with up to 13 TOPS of AI performance. HP says the machine will support next-gen Nvidia RTX GPUs.
There’s support for up to 192GB of DDR5 memory and up to 16TB of PCIe Gen5 NVMe storage. Connectivity includes Thunderbolt 5, HDMI 2.1, USB-A ports, an SD card slot, and Ethernet.
The 18-inch display has a WQXGA (2560x1600) resolution, coupled with a fast 165Hz refresh rate, trading pixel density for smoother motion. Thermal performance is handled by a redesigned three-fan cooling system, along with HP’s Vaporforce tech, allowing up to 200W TDP without throttling under sustained load.
Other features include a spill-resistant RGB-backlit keyboard, four Poly Studio speakers, dual-array microphones, and an optional IR camera for facial login.
The Fury G1i is also available in a 16-inch model for anyone who feels 18 inches is too big to lug around. Pricing and availability details for both models are expected shortly.
AMD targets Nvidia’s Blackwell with upcoming Instinct MI355X accelerator
Oracle plans massive 30,000-unit MI355X cluster for high-performance AI workloads
That’s in addition to Stargate, Oracle’s 64,000-GPU Nvidia GB200 cluster
While AI darling Nvidia continues to dominate the AI accelerator market, with a share of over 90%, its closest rival, AMD, is hoping to challenge the Blackwell lineup with its new Instinct MI355X series of GPUs.
The MI355X, now expected to arrive by mid-2025, is manufactured on TSMC’s 3nm node and built on AMD's new CDNA 4 architecture. It will feature 288GB of HBM3E memory, bandwidth of up to 8TB/sec, and support for FP6 and FP4 low-precision computing, positioning it as a strong rival to Nvidia’s Blackwell B100 and B200.
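Those memory specs translate fairly directly into a ceiling on inference speed: LLM decoding is typically memory-bandwidth-bound, because generating each token requires streaming the full weight set from HBM. The sketch below is a rough roofline-style estimate under stated assumptions (a hypothetical 300-billion-parameter model quantized to FP4, weights only, single stream, no batching), not an AMD benchmark.

```python
HBM_BANDWIDTH_GBPS = 8000   # 8TB/s, per AMD's stated MI355X spec
FP4_BYTES_PER_PARAM = 0.5   # 4 bits per weight

def max_decode_tokens_per_sec(params_billion: float) -> float:
    """Bandwidth-bound ceiling for single-stream decoding:
    each generated token reads every weight from HBM once."""
    weights_gb = params_billion * FP4_BYTES_PER_PARAM
    return HBM_BANDWIDTH_GBPS / weights_gb

# Hypothetical 300B-parameter FP4 model: 150GB of weights
print(round(max_decode_tokens_per_sec(300), 1))  # ~53.3 tokens/sec ceiling
```

Real-world throughput lands below this bound and batching raises aggregate throughput, but the calculation shows why bandwidth, not just raw FLOPS, is the headline number for inference hardware.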
In 2024, we reported on a number of big wins for AMD, which included shipping thousands of its MI300X AI accelerators to Vultr, a leading privately-held cloud computing platform, and to Oracle. Now, the latter has announced plans to build a cluster of 30,000 MI355X AI accelerators.
Stargate
This latest news was revealed during Oracle’s recent Q2 2025 earnings call, where Larry Ellison, Chairman and Chief Technology Officer, told investors, “In Q3, we signed a multi-billion dollar contract with AMD to build a cluster of 30,000 of their latest MI355X GPUs.”
Although he didn’t go into further detail beyond that, Ellison did talk about Project Stargate, saying, “We are in the process of building a gigantic 64,000 GPU liquid-cooled Nvidia GB200 cluster for AI training.”
He later added, “Stargate looks to be the biggest AI training project out there, and we expect that will allow us to grow our RPO even higher in the coming quarters. And we do expect our first large Stargate contract fairly soon.”
When questioned further about Stargate by a Deutsche Bank analyst, Ellison gave a reply that could just as easily apply to the cluster of MI355X AI accelerators Oracle is planning to build.
"The capability we have is to build these huge AI clusters with technology that actually runs faster and more economically than our competitors. So it really is a technology advantage we have over them. If you run faster and you pay by the hour, you cost less. So that technology advantage translates to an economic advantage which allows us to win a lot of these huge deals,” he said.
Ellison also touched on Oracle’s data center strategy, saying, “So, we can start our data centers smaller than our competitors and then we grow based on demand. Building these data centers is expensive, and they’re really expensive if they’re not full or at least half full. So we tend to start small and then add capacity as demand arises.”
Kioxia launches 122.88TB SSD with PCIe Gen5 and dual-port support
The LC9 Series NVMe SSD is designed for AI workloads and hyperscale storage
The new drive comes in a compact 2.5-inch form factor
After nearly seven years at the top, Nimbus Data’s massive ExaDrive 100TB 2.5-inch SSD has been dethroned by Kioxia, which has unveiled a new 122.88TB model that not only offers a higher storage capacity but also supports PCIe Gen5, a first for this category.
Several companies have previously announced 120TB-class SSDs, including Solidigm, but Kioxia's LC9 Series 122.88TB NVMe SSD stands out by pairing its ultra-high capacity with a compact 2.5-inch form factor and a next-gen interface with dual-port capability for fault tolerance or connectivity to multiple compute systems.
"AI workloads are stretching the capabilities of data storage, asking for larger capacities and swifter access to the extensive datasets found in today's data lakes, and Kioxia is ready to offer the necessary advanced technologies including 2 Tb QLC BiCS FLASH generation 8 of 3D flash memory, CBA and the complimenting AiSAQ," said Axel Störmann, VP & Chief Technology Officer for SSD and Embedded Memory products at Kioxia Europe GmbH.
Supporting AI system developers' needs
The 122.88TB SSD is aimed at hyperscale storage systems, AI workloads, and other data-intensive applications that rely on capacity and speed. There’s no word on availability or pricing yet, but the company does plan to showcase the new drive at "various upcoming conferences".
"This new LC9 Series NVMe SSD is an instrumental Kioxia product expansion that will support AI system developers' needs for high-capacity storage, high performance, and energy efficiency for applications such as AI model training, inference, and Retrieval-Augmented Generation on a vaster scale," Störmann said.
Reporting on the new SSD, ServeTheHome notes, “This is a hot segment of the market, and it is great to see Kioxia joining. As AI clusters get larger, the shared storage tier is usually measured in Exabytes. Teams have found that replacing hard drives with SSDs often reduces power, footprint, and TCO compared to running hybrid arrays. Moving from lower-capacity drives to the 122.88TB capacity in a 2.5-inch drive form factor really highlights the advantage of flash in these systems.”
The EU is officially out of control. It's now demanding that Apple break down the competitive advantage it's built with attractive features like AirPlay and AirDrop and essentially open them up to the competition. Thereby stripping Apple – bit by bit – of its competitive advantage.
Ever since the EU first implemented its Digital Markets Act, it's treated Apple like a global monopoly or rather a disrespectful child that deserves to spend time in a corner.
I know many cheer these changes. Why should Apple force people to use its App Store or its now-retired Lightning cable?
Apple has complied but also warned about the dangers of such compliance. When the EU forced sideloading, Apple promised, "the risks will increase." If we haven't seen that happen, it may be because the majority of iPhone owners are still using the trusted and well-regarded App Store.
I consider this a change no one, save the EU and some software companies that pressed the issue, wanted.
In the case of USB-C, I've long believed Apple was heading in that direction anyway but the threat of fines forced Apple's hand and made it accelerate its plans.
Open sesame
Now, though, we have the EU demanding that Apple open up nine core iOS features, including push notifications for non-Apple smartwatches, seamless pairing between non-Apple headphones and Apple devices, and AirPlay and AirDrop. In the last instance, the EU is demanding Apple open iOS up to third-party solutions and ensure they work as well as native software.
Naturally, Apple is not happy and shared this comment with TechRadar:
"Today’s decisions wrap us in red tape, slowing down Apple’s ability to innovate for users in Europe and forcing us to give away our new features for free to companies who don’t have to play by the same rules. It’s bad for our products and for our European users. We will continue to work with the European Commission to help them understand our concerns on behalf of our users."
As I'm sure you can gather from the tone, Apple is fed up. This constant stream of EU enforcements, all designed to diminish Apple and hoist up competitors, is ridiculous and increasingly unfair.
Let's zero in on AirDrop as an example.
Drop it like it's hot
AirDrop, which lets you quickly share files, photos, and videos between iPhones and other Apple ecosystem devices, arrived more than a decade ago on iOS 7. It was a transformative and brilliant bit of programming that instantly opened up an ad-hoc network between, say, a pair of iPhones. It did require some learning. Open AirDrop settings on phones could result in you unexpectedly receiving an illicit photo (yes, it happened to me once and it was terrible). Apple has since vastly improved AirDrop controls.
Not a lot of people used it at first, but every time I went to a party where I was often taking pictures, I would grab the host and quickly drop the photos onto their phones. They were usually shocked and deeply appreciative.
There was, for years, nothing quite like it on the Android side until Samsung unveiled Quick Share and Google launched Nearby Share in 2020. The two later merged to become just Quick Share.
There's no doubt Apple's success with AirDrop spurred the development of Quick Share and isn't that exactly how competition is supposed to work? You don't look at one company's successful deployment of technology and then demand that they make it possible for you to deploy a copycat app, and on the successful company's platform no less.
But this is what the EU is demanding of Apple. It must make it possible for competitors to compete with Apple on its own platform, and why? Because apparently, they cannot do it without the EU's help.
I actually do not think that's true. Google and Samsung, for instance, are not stepping up to say they do not need this help because it serves them no purpose to do so. If the EU wants to slap Apple, let them. It certainly doesn't harm any of these competitors (until they fall under the EU's watchful gaze).
In the EU's world, there is no difference between competitors. They want a level playing field, even if at an innovation level, one company is outperforming the other.
Ecosystem FTW
Apple has built a fantastic ecosystem that confers significant benefits on those who live inside it. Yes, that does in a way define which smartwatch and earbuds I use. But, for more than 20 years, it had no impact on the laptop I carried. I was a dyed-in-the-wool Windows fan and even though I used an iPhone and AirPods, and I wore an Apple Watch, I saw no need to switch to a MacBook.
When I did make the switch, it was to see if I liked the macOS experience better than Windows (spoiler alert: I did), and, yes it turns out that there were instant benefits to the switch, like AirDrop access to files on my iPhone and iPad.
Everything is easier when you have all Apple products but that's not an unfair advantage, it's engineering and excellence. The EU would like to wipe that out and make Apple as average as possible so it's fair for everyone. But that's not fair to Apple and, honestly, not to you, the Apple user, either. You pay a premium for the best programming, the best products, and the best interoperability.
You won't get that by mixing and matching some from Apple and some from, for instance, Samsung, even if the EU wants you to. I love many Samsung, Google, OnePlus, and Microsoft products and there is nothing wrong with a non-homogenous setup. There should not, however, be an issue with all-Apple-all-the-time.
The EU needs to step back and get out of the way of smart technology and only act when consumers are being harmed. There was no harm here, just some small companies whining because they weren't winning.
You might think this is an EU-only issue but remember that what starts in Europe usually flies over the Atlantic to the US and eventually all global markets. Put another way, when the EU sneezes, we all catch a cold.
Google Messages is improving its message-deleting features
You'll soon be able to delete a message for everyone
We now have screenshots showing how the feature works
It's not a great feeling, sending a text and then regretting it – instantly, the next morning, or any time in between – and Google Messages looks set to give users a safety net with the ability to remotely delete messages for everyone in a conversation.
This was first spotted last month, but now the folks at Android Authority have actually managed to get it working. This is based on some code digging done in the latest version of Google Messages for Android.
While the feature isn't live for everyone yet, the Android Authority team tweaked the app to get it to display some of the functionality. Deleting a text brings up a dialog asking if you just want to wipe your local copy of it or erase it for all recipients.
If a message is wiped, that brings up a "Message deleted" placeholder in the conversation for everyone who's participating. It seems as though there's a 15-minute window for deleting – so you'll need to be relatively quick.
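The reported 15-minute window suggests the client simply compares a message's send timestamp against the current time before offering the delete-for-everyone option. Here's a hypothetical sketch of that check – the window length comes from Android Authority's report, and the function name and structure are illustrative, not a published Google API.

```python
from datetime import datetime, timedelta, timezone

# Reported remote-delete window; not officially confirmed by Google
DELETE_WINDOW = timedelta(minutes=15)

def can_delete_for_everyone(sent_at: datetime, now: datetime) -> bool:
    """True while the message is still inside the remote-delete window."""
    return now - sent_at <= DELETE_WINDOW

sent = datetime(2025, 3, 19, 12, 0, tzinfo=timezone.utc)
print(can_delete_for_everyone(sent, sent + timedelta(minutes=10)))  # True
print(can_delete_for_everyone(sent, sent + timedelta(minutes=20)))  # False
```

Past the cutoff, the app would presumably fall back to the existing delete-for-me behavior, which only removes your local copy.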
Bring it back
The upgrade comes courtesy of RCS Universal Profile v2.7, which Google Messages is in the process of adding support for. The remote delete feature may not be available for devices with older software installed – so bear that in mind for your text chats.
Up until now, deleting a text only removed the message from your own phone. Once it had been delivered and downloaded on the recipient's device(s), there was nothing you could do to take it back.
That will change when this update finally rolls out in full, though it's not clear exactly when that will be. Considering Android Authority has been able to access some of the screens that show the feature working, it shouldn't be too long now.
Support for this feature varies in other apps: WhatsApp lets you delete sent messages for all recipients, while iMessage lets you delete sent messages, but only your local copy (though you can unsend messages within a two-minute window).