Wednesday, March 19, 2025

Razer Wants to Be Your Copilot. New Developer Tools May Bring More AI to Games

The new software development kit aims to help developers catch more bugs and give better in-game advice.

from CNET https://ift.tt/Mx9W8UV

Latest Tech News

The EU is officially out of control. It's now demanding that Apple break down the competitive advantage it's built with attractive features like AirPlay and AirDrop and essentially open them up to the competition, thereby stripping Apple – bit by bit – of that advantage.

Ever since the EU first implemented its Digital Markets Act, it's treated Apple like a global monopoly or rather a disrespectful child that deserves to spend time in a corner.

It's used the strength of the union to force Apple to make technical changes that theoretically benefit its constituents, like a charging port standard (USB-C) and, more recently, the sideloading of apps outside the protective arms of the App Store.

I know many cheer these changes. Why should Apple force people to use its App Store or its now retired lightning cable?

Apple has complied but also warned about the dangers of such compliance. When the EU forced sideloading, Apple promised, "the risks will increase." If we haven't seen that happen, it may be because the majority of iPhone owners are still using the trusted and well-regarded App Store.

I consider this a change no one, save the EU and some software companies that pressed the issue, wanted.

In the case of USB-C, I've long believed Apple was heading in that direction anyway, but the threat of fines forced Apple's hand and made it accelerate its plans.

Open sesame

Now, though, we have the EU demanding that Apple open up nine core iOS features, including push notifications for non-Apple smartwatches, seamless pairing between non-Apple headphones and Apple devices, and AirPlay and AirDrop. In the last instance, the EU is demanding Apple open iOS up to third-party solutions and ensure they work as well as native software.

Naturally, Apple is not happy and shared this comment with TechRadar:

"Today’s decisions wrap us in red tape, slowing down Apple’s ability to innovate for users in Europe and forcing us to give away our new features for free to companies who don’t have to play by the same rules. It’s bad for our products and for our European users. We will continue to work with the European Commission to help them understand our concerns on behalf of our users."

As I'm sure you can gather from the tone, Apple is fed up. This constant stream of EU enforcements, all designed to diminish Apple and hoist up competitors, is ridiculous and increasingly unfair.

Let's zero in on AirDrop as an example.

Drop it like it's hot

AirDrop on an Apple device.

(Image credit: TechRadar)

AirDrop, which lets you quickly share files, photos, and videos between iPhones and other Apple ecosystem devices, arrived more than a decade ago on iOS 7. It was a transformative and brilliant bit of programming that instantly opened up an ad-hoc network between, say, a pair of iPhones. It did require some learning. Open AirDrop settings on phones could result in you unexpectedly receiving an illicit photo (yes, it happened to me once and it was terrible). Apple has since vastly improved AirDrop controls.

Not a lot of people used it at first, but every time I went to a party where I was often taking pictures, I would grab the host and quickly drop the photos onto their phones. They were usually shocked and deeply appreciative.

There was, for years, nothing quite like it on the Android side until Samsung unveiled Quick Share and Google launched Nearby Share in 2020. The two later merged to become just Quick Share.

There's no doubt Apple's success with AirDrop spurred the development of Quick Share and isn't that exactly how competition is supposed to work? You don't look at one company's successful deployment of technology and then demand that they make it possible for you to deploy a copycat app, and on the successful company's platform no less.


But this is what the EU is demanding of Apple. It must make it possible for competitors to compete with Apple on its own platform, and why? Because apparently, they cannot do it without the EU's help.

I actually do not think that's true. Google and Samsung, for instance, are not stepping up to say they do not need this help because it serves them no purpose to do so. If the EU wants to slap Apple, let them. It certainly doesn't harm any of these competitors (until they fall under the EU's watchful gaze).

In the EU's world, there is no difference between competitors. It wants a level playing field, even if, at an innovation level, one company is outperforming the other.

Ecosystem FTW

Apple has built a fantastic ecosystem that pays significant benefits to those who live inside of it. Yes, that does in a way define which smartwatch and earbuds I use. But, for more than 20 years, it had no impact on the laptop I carried. I was a dyed-in-the-wool Windows fan and even though I used an iPhone and AirPods, and I wore an Apple Watch, I saw no need to switch to a MacBook.

When I did make the switch, it was to see if I liked the macOS experience better than Windows (spoiler alert: I did), and, yes it turns out that there were instant benefits to the switch, like AirDrop access to files on my iPhone and iPad.

Everything is easier when you have all Apple products but that's not an unfair advantage, it's engineering and excellence. The EU would like to wipe that out and make Apple as average as possible so it's fair for everyone. But that's not fair to Apple and, honestly, not to you, the Apple user, either. You pay a premium for the best programming, the best products, and the best interoperability.


You won't get that by mixing and matching some from Apple and some from, for instance, Samsung, even if the EU wants you to. I love many Samsung, Google, OnePlus, and Microsoft products and there is nothing wrong with a non-homogenous setup. There should not, however, be an issue with all-Apple-all-the-time.

The EU needs to step back and get out of the way of smart technology and only act when consumers are being harmed. There was no harm here, just some small companies whining because they weren't winning.

You might think this is an EU-only issue but remember that what starts in Europe usually flies over the Atlantic to the US and eventually all global markets. Put another way, when the EU sneezes, we all catch a cold.




from Latest from TechRadar US in News,opinion https://ift.tt/G28BEob

Latest Tech News


  • Google Messages is improving its message-deleting features
  • You'll soon be able to delete a message for everyone
  • We now have screenshots showing how the feature works

It's not a great feeling, sending a text and then regretting it – instantly, the next morning, or any time in between – and Google Messages looks set to give users a safety net with the ability to remotely delete messages for everyone in a conversation.

This was first spotted last month, but now the folks at Android Authority have actually managed to get it working. This is based on some code digging done in the latest version of Google Messages for Android.

While the feature isn't live for everyone yet, the Android Authority team tweaked the app to get it to display some of the functionality. Deleting a text brings up a dialog asking if you just want to wipe your local copy of it or erase it for all recipients.

If a message is wiped, that brings up a "Message deleted" placeholder in the conversation for everyone who's participating. It seems as though there's a 15-minute window for deleting – so you'll need to be relatively quick.
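To make that timing rule concrete, here is a minimal Python sketch of how a messaging client might decide between a remote delete and a local-only delete. The 15-minute constant mirrors the behavior described above, but the names and structure are a hypothetical illustration, not Google Messages' actual code.

```python
# Hypothetical client-side logic for "delete for everyone" with a 15-minute window.
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional

REMOTE_DELETE_WINDOW = timedelta(minutes=15)  # window described in the report

@dataclass
class SentMessage:
    message_id: str
    sent_at: datetime
    deleted_for_everyone: bool = False

def delete_message(msg: SentMessage, for_everyone: bool,
                   now: Optional[datetime] = None) -> str:
    """Return which kind of deletion was performed."""
    now = now or datetime.utcnow()
    if for_everyone and now - msg.sent_at <= REMOTE_DELETE_WINDOW:
        msg.deleted_for_everyone = True   # recipients see a "Message deleted" placeholder
        return "deleted for everyone"
    return "deleted for you"              # only the local copy is removed

# A message sent 20 minutes ago can no longer be recalled for everyone.
old = SentMessage("m1", datetime.utcnow() - timedelta(minutes=20))
print(delete_message(old, for_everyone=True))  # -> "deleted for you"
```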

Bring it back

The upgrade comes courtesy of RCS Universal Profile v2.7, which Google Messages is in the process of adding support for. The remote delete feature may not be available for devices with older software installed – so bear that in mind for your text chats.

Up until now, deleting a text only removed the message on your own phone. Once it had been delivered and downloaded on the recipient's device(s), there was nothing you could do to bring it back.

That will change when this update finally rolls out in full, though it's not clear exactly when that will be. Considering Android Authority has been able to access some of the screens that show the feature working, it shouldn't be too long now.

Support for this feature varies in other apps: WhatsApp lets you delete sent messages for all recipients, while iMessage lets you delete sent messages, but only your local copy (though you can unsend messages within a two-minute window).




from Latest from TechRadar US in News,opinion https://ift.tt/W4eXITM

Latest Tech News


  • Microsoft & Oracle absent from nuclear pledge signed by Amazon, Meta, and Google
  • It aims to triple nuclear capacity by 2050 to support global energy needs
  • Nuclear seen as key to powering AI-driven data centers with clean energy

Even though Microsoft is seriously exploring nuclear energy as a way to power its data centers – even signing a deal in 2024 to purchase energy from the restarted Three Mile Island (TMI) nuclear plant – it is notably absent from a new Large Energy Users Pledge that supports the global expansion of nuclear capacity.

That pledge has attracted major signatories such as Amazon, Meta, and Google, but neither Microsoft nor Oracle, which is also exploring nuclear energy, is on the list.

Led by the World Nuclear Association, the pledge was first introduced at the World Nuclear Symposium in September 2023, and has gained backing from 14 major global banks and financial institutions, 140 nuclear industry companies, and 31 countries.

Around-the-clock clean energy

Its purpose is to drive home nuclear energy’s “essential role in enhancing energy security, resiliency and providing continuous clean energy,” and sets a target to triple global nuclear capacity by 2050.

Nuclear power currently supplies about 9% of the world’s electricity via 439 reactors.
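As a rough illustration of what that tripling target implies, the quick calculation below assumes today's 439 reactors average roughly 1 GW each; that average is an assumption for illustration only, so the totals are approximate.

```python
# Back-of-the-envelope arithmetic on the pledge's "triple by 2050" target.
reactors_today = 439          # figure cited above
avg_reactor_gw = 1.0          # assumed average reactor size, for illustration

capacity_today_gw = reactors_today * avg_reactor_gw
capacity_2050_gw = 3 * capacity_today_gw          # the pledge's target
additional_gw = capacity_2050_gw - capacity_today_gw

print(f"~{capacity_today_gw:.0f} GW today -> ~{capacity_2050_gw:.0f} GW by 2050 "
      f"(~{additional_gw:.0f} GW of new capacity)")
```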

The call to action goes beyond traditional energy applications. It also outlines nuclear's potential to serve high-demand sectors like data centers, where the rise of artificial intelligence has led to soaring energy needs.

While it typically takes at least five years to construct an atomic plant, micro nuclear reactors, expected to be available by the early 2030s, could be a quicker, cheaper solution for powering large-scale computing operations.

"We are proud to sign a pledge in support of tripling nuclear capacity by 2050, as nuclear power will be pivotal in building a reliable, secure, and sustainable energy future," said Lucia Tian, Google’s Head of Clean Energy & Decarbonization Technologies.

"Google will continue to work alongside our partners to accelerate the commercialization of advanced nuclear technologies that can provide the around-the-clock clean energy necessary to meet growing electricity demand around the world."

That message was echoed by Urvi Parekh, Head of Global Energy at Meta. “As global economies expand, the need for a reliable, clean, and resilient energy supply is paramount. Nuclear energy, with its ability to provide continuous power, can help meet this rising demand. We’re excited to join alongside this multi-organizational effort with the Tripling Nuclear Pledge to reiterate our commitment to nuclear energy.”

Brandon Oyer, Head of Americas Energy and Water for AWS, emphasized the urgency of scaling nuclear power. “Accelerating nuclear energy development will be critical to strengthening our nation’s security, meeting future energy demands, and addressing climate change. Amazon supports the World Nuclear Association’s pledge, and is proud to have invested more than $1 billion over the last year in nuclear energy projects and technologies, which is part of our broader Climate Pledge commitment to be net-zero carbon by 2040.”

You can view the Large Energy Users Pledge, which is signed by Meta, Amazon, Google and ten other companies, with a statement of support by Siemens Energy, here (PDF).




from Latest from TechRadar US in News,opinion https://ift.tt/xUdu05N

Tuesday, March 18, 2025

Latest Tech News

Nvidia has taken the world a step closer to smart, humanoid robots with the launch of its latest advanced AI model.

At its Nvidia GTC 2025 event, the company revealed Isaac GR00T N1, which it says is "the world’s first open Humanoid Robot foundation model", alongside several other important development tools.

Nvidia says its tools, which are available now, will make developing smarter and more functional robots easier than ever, along with allowing them to have more humanoid reasoning and skills - which doesn't sound terrifying at all.

Isaac GR00T N1

“The age of generalist robotics is here,” said Jensen Huang, founder and CEO of NVIDIA. “With NVIDIA Isaac GR00T N1 and new data-generation and robot-learning frameworks, robotics developers everywhere will open the next frontier in the age of AI.”

The company says its robotics work can help fill a shortfall of more than 50 million workers caused by a global labor shortage.

Nvidia says Isaac GR00T N1, which can be trained on real or synthetic data, can "easily" master tasks such as grasping, moving objects with either a single arm or multiple arms, and passing items from one arm to the other – but it can also carry out multi-step tasks that combine a number of general skills.

The model is built on a dual-system architecture inspired by the principles of human cognition: “System 1” is a fast-thinking action model, mirroring human reflexes or intuition, while “System 2” is a slow-thinking model for "deliberate, methodical decision-making."

Powered by a vision language model, System 2 is able to consider and analyze its environment, and the instructions it was given, to plan actions - which are then translated by System 1 into precise, continuous robot movements.
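To picture how that split might look in practice, here is a minimal Python sketch of a dual-system control loop: a slow vision-language planner ("System 2") breaks an instruction into steps, and a fast action model ("System 1") turns each step into continuous motor commands. The class and method names are illustrative assumptions, not Nvidia's GR00T N1 API.

```python
# Hypothetical dual-system robot control loop (System 2 plans, System 1 acts).
from typing import List

class System2Planner:
    """Slow, deliberate reasoning: a stand-in for the vision-language model."""
    def plan(self, camera_image: bytes, instruction: str) -> List[str]:
        return ["move_to_object", "grasp", "transfer_to_other_arm", "place"]

class System1Controller:
    """Fast, reflex-like policy: maps a planned step to continuous commands."""
    def act(self, step: str) -> List[float]:
        return [0.0] * 7  # e.g. target joint velocities for a 7-DoF arm

def control_loop(image: bytes, instruction: str) -> None:
    planner, controller = System2Planner(), System1Controller()
    for step in planner.plan(image, instruction):  # System 2: plan the steps
        commands = controller.act(step)            # System 1: execute each one
        print(step, commands)

control_loop(b"<camera frame>", "move the cup to the other arm")
```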

Among the other tools being released are a range of simulation frameworks and blueprints, such as the NVIDIA Isaac GR00T Blueprint for generating synthetic data. These help produce the large, detailed synthetic data sets needed for robot development, which would be prohibitively expensive to gather in real life.

There is also Newton, an open source physics engine, created alongside Google DeepMind and Disney Research, which Nvidia says is purpose-built for developing robots.

Huang was joined on stage by Star Wars-inspired BDX droids during his GTC keynote, showing the possibilities of the technology in theme parks or other entertainment locations.

Nvidia first launched Project GR00T ("Generalist Robot 00 Technology") at GTC 2024, primarily focusing on industrial use cases. The idea was robots that could learn and become smarter by watching human behavior, understanding natural language, and emulating movements, allowing them to quickly pick up the coordination, dexterity, and other skills needed to navigate, adapt, and interact with the real world.



from Latest from TechRadar US in News,opinion https://ift.tt/ZjcuSKr

Latest Tech News

Amazon is turning off the ability to process voice requests locally. It's a seemingly major privacy pivot and one that some Alexa users might not appreciate. However, this change affects exactly three Echo devices and only if you actively enabled Do Not Send Voice Recordings in the Alexa app settings.

Right. It's potentially not that big of a deal and, to be fair, the level of artificial intelligence Alexa+ is promising, let alone the models it'll be using, all but precludes local processing. It's pretty much what Daniel Rausch, Amazon's VP of Alexa and Echo, told us when he explained that these queries would be encrypted, sent to the cloud, and then processed by Amazon's and partner Anthropic's AI models at servers far, far away.

That's what's happening, but let's unpack the general freakout.

After Amazon sent an email to customers – actually, it seems, only those who own an Echo Dot 4, Echo Show 10 (3rd Gen), or Echo Show 15 – saying that the option to have Alexa voice queries processed on-device would end on March 28, some in the media cried foul.

They had a point: Amazon didn't have the best track record when it comes to protecting your privacy. In 2019, there were reports of Amazon employees listening to customer recordings. Later, there were concerns that Amazon might hold onto recordings of, say, you yelling at Alexa because it didn't play the right song.

Amazon has since cleaned up its data act with encryption and, with this latest update, promises to delete your recordings from its servers.

A change for the few

Amazon Echo Dot (2020)

(Image credit: Future)

This latest change, though, sounded like a step back because it takes away a consumer control, one that some might've been using to keep their voice data off Amazon's servers.

However, the vast majority of Echo devices out there aren't even capable of on-device voice processing, which is why most of them didn't even have this control.

A few years ago, Amazon published a technical paper on its efforts to bring "On-device speech processing" to Echo devices. They were doing so to put "processing on the edge," and reduce latency and bandwidth consumption.

Turns out it wasn't easy – Amazon described it as a massive undertaking. The goal was to put automatic speech recognition, whisper detection, and speech identification locally on a tiny, relatively low-powered smart speaker system. Quite a trick, considering that in the cloud, each process ran "on separate server nodes with their own powerful processors."

The paper goes into significant detail, but suffice it to say that Amazon developers used a lot of compression to get Alexa's relatively small AI models to work on local hardware.

It was always the cloud

In the end, the on-device audio processing was only available on those three Echo models, but there is a wrinkle here.

The specific feature Amazon is disabling, "Do Not Send Voice Recordings," never precluded your prompts from being handled in the Amazon cloud.

The processing power that these few Echos had was never meant to handle the full Alexa query locally. Instead, the silicon was used to recognize the wake word ("Alexa"), record the voice prompt, use voice recognition to make a text transcription of the prompt, and send that text to Amazon's cloud, where the AI acts on it and sends a response.

The local audio is then deleted.
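Here is a minimal Python sketch of that flow. Every function is a hypothetical stand-in for what the article describes (wake-word detection, local transcription, a cloud request), not Amazon's actual Echo firmware.

```python
# Hypothetical on-device flow: wake word -> record -> local ASR -> send text -> delete audio.
from typing import Optional

def detect_wake_word(audio_frame: bytes) -> bool:
    return b"alexa" in audio_frame.lower()          # stand-in for the wake-word model

def record_prompt() -> bytes:
    return b"what's the weather today"              # stand-in for microphone capture

def local_speech_to_text(audio: bytes) -> str:
    return audio.decode()                           # stand-in for the on-device ASR model

def send_to_cloud(text: str) -> str:
    return f"Response for: {text}"                  # stand-in for the encrypted cloud request

def handle_utterance(frame: bytes) -> Optional[str]:
    if not detect_wake_word(frame):
        return None
    audio = record_prompt()
    transcript = local_speech_to_text(audio)        # the audio never leaves the device...
    del audio                                       # ...and the local recording is discarded
    return send_to_cloud(transcript)                # only the text goes to Amazon's servers

print(handle_utterance(b"Alexa, what's the weather?"))
```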

Big models need cloud-based power

Amazon Echo Show 15

(Image credit: Amazon)

Granted, this is likely how everyone would want their Echo and Alexa experience to work. Amazon gets the text it needs but not the audio.

But that's not how the Alexa experience works for most Echo owners. I don't know how many people own those particular Echo models, but there are almost two dozen different Echo devices, and this affects just three of them.

Even if those are the most popular Echos, the change only affects people who dug into Alexa settings to enable "Do Not Send Voice Recordings." Most consumers are not making those kinds of adjustments.

This brings us back to why Amazon is doing this. Alexa+ is a far smarter and more powerful AI with generative, conversational capabilities. Its ability to understand your intentions may hinge not only on what you say, but your tone of voice.

It's true that even though your voice data will be encrypted in transit, it surely has to be decrypted in the cloud for Alexa's various models to interpret and act on it. Amazon is promising safety and security, and to be fair, when you talk to ChatGPT Voice and Gemini Live, their cloud systems are listening to your voice, too.

When we asked Amazon about the change, here's what they told us:

“The Alexa experience is designed to protect our customers’ privacy and keep their data secure, and that’s not changing. We’re focusing on the privacy tools and controls that our customers use most and work well with generative AI experiences that rely on the processing power of Amazon’s secure cloud. Customers can continue to choose from a robust set of tools and controls, including the option to not save their voice recordings at all. We’ll continue learning from customer feedback, and building privacy features on their behalf.”

For as long as the most impactful models remain too big for local hardware, this will be the reality of our Generative AI experience. Amazon is simply falling into line in preparation for Alexa+.

It's not great news, but it's also not the privacy and data-safety disaster it's being made out to be.




from Latest from TechRadar US in News,opinion https://ift.tt/NA6bSu9

Latest Tech News


  • Write-once memory card offers tamper-proof, long-term data storage
  • TeamGroup’s 256GB D500R WORM SD card can store up to 374 CDs’ worth of data
  • Built-in protection features guard against power loss, damage and tampering

CDs and DVDs are (or rather were) great for storage, but they can be affected by disc rot, a form of physical deterioration that affects optical discs, causing them to become unreadable over time due to corrosion or damage to the reflective layer. If you have a lot of important data stored on discs and are worried about losing it, TeamGroup has a solution to your fears – the D500R ISD WORM SD card.

Shown off at Embedded World 2025 (where it won the Embedded Vision category's top honor and the 2025 Community Choice Award), the D500R WORM card uses Write Once, Read Many technology for non-erasable, tamper-proof, long-term data retention storage.

The D500R ISD uses MLC NAND Flash, which offers better durability and steady performance, and promises read and write speeds of up to 70MB/s and 65MB/s respectively. It supports capacities from 8GB to 256GB, and the largest card is big enough to store data from around 374 CDs, so it can likely back up all your discs easily.

Seals the data in place

As you can tell from the description, the card can be read many times, but only written to once. Via a combination of hardware design and firmware control, the D500R ISD seals the data in place and ensures it can’t be deleted, overwritten, or updated. This makes it useful for users who require lasting data protection without the risk of accidental changes or even malicious tampering from ransomware.
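Conceptually, WORM storage behaves like the small Python sketch below: writes to a block succeed exactly once, reads succeed indefinitely, and overwrites or erasures are refused. This is an illustration of the idea only, not TeamGroup's firmware.

```python
# Minimal illustration of Write Once, Read Many (WORM) semantics.
class WormStore:
    def __init__(self) -> None:
        self._blocks = {}

    def write(self, block_id: int, data: bytes) -> None:
        if block_id in self._blocks:
            raise PermissionError(f"block {block_id} already written; WORM media is immutable")
        self._blocks[block_id] = data

    def read(self, block_id: int) -> bytes:
        return self._blocks[block_id]               # reads succeed any number of times

    def erase(self, block_id: int) -> None:
        raise PermissionError("erase is not supported on WORM media")

store = WormStore()
store.write(0, b"archived camera footage")
print(store.read(0))
# store.write(0, b"tampered data")  # would raise PermissionError
```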

The card includes features like Power Fail Management, which helps preserve data in the event of an unexpected power cut, and Bad Block Management, which detects and isolates faulty storage sectors, helping extend the product’s lifespan.

It also has a built-in S.M.A.R.T system that tracks usage and sends alerts if an attempt is made to alter or delete the data in some way.

Built to withstand demanding environments, it meets multiple military-grade shock and vibration standards and is rated IP58 for dust and water resistance.

TeamGroup suggests it could be useful for law enforcement footage, medical records, or financial archives, but it could be used for backing up family photos and videos or company data.

While it’s clearly a great solution for long-term archival storage, it’s probably worth noting that the D500R WORM card is a lot easier to lose than a box of CDs or DVDs, so you might want to think about where you’ll keep it. There’s also no word on pricing yet, but it's fair to assume it won’t be cheap.




from Latest from TechRadar US in News,opinion https://ift.tt/tNoA3XI

Latest Tech News

Saturday Night Live is one of the most iconic pieces of programming in TV history, and it’s currently in its 50th season. To celebrate that achievement, NBC and showrunners, including Lorne Michaels, planned, produced, and executed – all live – the SNL 50th Anniversary Special, which aired a few weeks back on February 16, 2025.

Rather than firing up a classic TV antenna or opting for my Hulu live TV subscription, I decided to open up Peacock, using an Apple TV 4K – one of the best streaming devices – to ultimately watch the anniversary special. It’s not the first time I’ve turned to the NBC-owned streaming service to watch a live event, that honor goes to the Summer 2024 Paris Olympics, in which I also used the multiview functionality to watch multiple competitions at once.

This time around, I was greeted with an SNL 50 overlay and could jump right into the live broadcast of the show with a single click. For those wondering, I hopped in about midway through the opening with Paul Simon and Sabrina Carpenter on one of the best streaming services.

SNL50 The Homecoming Concert banner on Peacock

(Image credit: Peacock)

The stream began without a hitch and looked pretty sharp, if I do say so myself. I also didn’t encounter stutters or stoppages in the stream, and I didn’t see commercials either – rather, I was presented with a message along the lines of “the SNL 50th Anniversary would resume soon”, with the theme song playing on a loop and various glamour shots of the acclaimed Studio 8H, where the show tapes.

It was a nice experience for a live-stream event, an easy way to watch the 50th special without the need for linear TV, and it went off pretty much without a hitch – something the linear broadcast didn’t manage for some viewers, as the special ran over its scheduled slot. Even so, 14.8 million viewers across Peacock and NBC (linear TV) watched the special either live or within the same day. That makes it NBC’s most-watched primetime event in the last five years, besting the Golden Globes.

Peacock has a sort of special sauce in the streaming service sector – yes, it’s a relatively new offering, having premiered in 2020, but it’s backed by NBC, which has been in the business of live TV and real-time events for years. The streaming service also knows how to pair its backlog of content with other elements and live events. With the Paris Olympics, we saw the ability to watch each individual event, view a few of them together in multiview, and enjoy a trove of new content, including Snoop Dogg at the games and a show hosted by Alex Cooper.

The SNL 50 takeover was a similar feat, which had a goal of letting fans reach the content they wanted – be it all the episodes from the 50th season or previous ones, specially-released documentaries, or even the SNL 50 Homecoming Concert, which also streamed live on Peacock. We’ll also see a similar takeover come March 21, 2025, when Wicked hits Peacock and begins streaming for all subscribers of the service.

Building a platform that supports the highest levels of performance

Speaking to TechRadar, Patrick Miceli, CTO of global streaming and the NBCU Media Group, said that since Peacock’s inception, “we’ve invested in building a platform and technical infrastructure that supports the highest levels of performance, security, and reliability to deliver a seamless and incomparable experience for our customers.”

SNL 50: The Anniversary Special

(Image credit: Theo Wargo/NBC)

Miceli further noted that even with the complexities of live-streaming, Peacock has “achieved a 100% success rate in video quality and reliability on our NFL and Olympics coverage last year”. This comes as other services, including Netflix with the Logan Paul vs Mike Tyson fight, have struggled to keep streams afloat with heavy audience demand.

Suffice to say, Peacock and NBC at large felt confident going into the weekend with the Olympics, NFL games, and other events, including the Macy’s Thanksgiving Day Parade in the history books. Still, Miceli shared, “we do still spend a lot of time preparing for each event, accounting for a variety of different scenarios to ensure we’re ready for anything”.

I was impressed with Peacock’s performance, and countless others shared that the stream was a top performer on social media using the hashtag #SNL50. Given the performance here, I’m keen to see what kind of package Peacock curates for future specials, live events like the next Olympics, and NBA games, which will hit the streaming service in 2025.

Of course, even though it’s not live, I’ll still head to Peacock whenever I need my Bravo fix, like Summer House or reruns of Vanderpump Rules, or a rewatch of a timeless classic from SNL alum Amy Poehler (Parks and Recreation).




from Latest from TechRadar US in News,opinion https://ift.tt/30xMHyz

Monday, March 17, 2025

Best Food and Drink Subscription Gifts for 2025

Celebrate the foodies in your life with these unique, and of course, delicious, edible and drinkable treats. We’ve found the very best food and drink subscriptions to gift in 2025.

from CNET https://ift.tt/AyrnhJD

Latest Tech News

Since its inception in 2020, YouTube Shorts has been TikTok’s fierce competitor in the short-form content sector, but recently YouTube has been having a bit of a run-in with an issue that many users have picked up on. As reported by people on Reddit (see below), a new bug auto-plays the Shorts tab as soon as you open the YouTube app.

YouTube app opens to shorts from r/youtube

This issue has come and gone for YouTube before, and it has resurfaced for more and more people over the past few weeks. For many users, the app will automatically open on the Shorts tab if that was the last thing viewed or used, as opposed to opening the YouTube homepage as it usually would.

However, 9to5Google has noticed a growing trend where YouTube automatically launches Shorts no matter what was previously viewed - and this is the case for both free users and paying YouTube Premium subscribers. As it stands, it’s not certain whether the issue is cropping up for iOS users, but 9to5Google has observed that Android devices - notably the Nothing Phone (3a) and the Pixel 9 Pro Fold - have fallen victim to this annoying bug.

There’s no doubt that YouTube has devoted a substantial amount of attention to its answer to TikTok’s popular short-video feed, and it even gained an edge over it with a music video remix feature last year. But it’s hard to tell if this bug is a minor fault in the system or a tactic to get more users to ride the YouTube Shorts train.

How you can fix the YouTube Shorts bug

If you’re an Android user who’s having issues with YouTube playing you Shorts as soon as you launch the app, then there’s a way you can change this. As recommended in 9to5Google’s report, opening and force-stopping the app won’t be enough to solve the issue, but clearing the app's storage/cache will help.
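For anyone who prefers to do this from a computer rather than through Android's Settings app, the Python snippet below shows one hedged way to reset the app's data over adb. It assumes adb is installed, USB debugging is enabled, and that the package name is com.google.android.youtube; note that "pm clear" wipes all of the app's data (including sign-in state), not just its cache.

```python
# Reset YouTube's app data over adb (assumes adb + USB debugging are set up).
import subprocess

def clear_youtube_data() -> None:
    # "pm clear" removes the app's stored data and cache on the connected device.
    subprocess.run(
        ["adb", "shell", "pm", "clear", "com.google.android.youtube"],
        check=True,
    )

if __name__ == "__main__":
    clear_youtube_data()
```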

Though this is the trick to getting YouTube to stop auto-playing the Shorts tab when you open the app, 9to5Google says that the issue will return when Shorts are opened again. YouTube has yet to comment on its efforts to fix the issue, but we’ll update this story if any new information surfaces.




from Latest from TechRadar US in News,opinion https://ift.tt/O1trRhy

Sunday, March 16, 2025

The Power of Apple Cider Vinegar: Proper Dosages, Benefits and Safety Tips

There's a lot of craze around apple cider vinegar. This is what you need to know before you start drinking it.

from CNET https://ift.tt/DBshpQ6

La Liga Soccer Livestream: How to Watch Atlético Madrid vs. Barcelona From Anywhere

It's a crucial top-of-the-table clash at the Metropolitano Stadium.

from CNET https://ift.tt/86B3DgM

Latest Tech News


  • Despite predictions, HDDs are here to stay and increasing in capacity
  • Seagate recently sold one exabyte of HAMR storage to two hyperscalers
  • The "tens of thousands of drives" likely cost between $33 and $35 million

Although the likes of Pure Storage, IBM, and Meta believe the writing is on the wall for hard drives, the technology doesn’t look like it will be going away any time soon.

Seagate and its main rival Western Digital are working on magnetic recording methods that will allow the drives to continue increasing in capacity, helping them maintain a clear advantage over SSDs when it comes to storage density.

The main technology leading this charge is HAMR, or heat-assisted magnetic recording, which could see HDDs hitting incredible 100TB capacities. HAMR works by briefly heating the disk surface with a laser to make it easier to write data at higher densities. HDMR - short for heated dot magnetic recording - is HAMR’s likely successor and could lead to even larger drives by focusing the heat and magnetic energy into smaller, more precise areas for even denser data storage.

Not an unreasonable outlay

In a recent Wall Street Journal article covering Seagate’s “fight to store the world’s data”, John Keilman mentioned something which caught my attention: “Seagate said two large cloud-computing customers have each ordered one exabyte’s worth of HAMR storage, which works out to tens of thousands of hard drives.”

Keilman didn’t name names - Seagate wouldn’t have told him who the buyers were - but we can narrow the list of suspects down to the usual big US hyperscalers, including Apple, Oracle, Microsoft, Google, Amazon, and Meta. It’s possible that Chinese hyperscalers could have come shopping for the drives, but that seems unlikely to me.

Keilman doesn’t say what capacity drives were sold, but we can assume they will have been Seagate’s highest-capacity commercial HDD, the Exos M, which ranges from 30TB (CMR) to 36TB (SMR), with a breakthrough 3TB-per-platter density. Based on timing, it’s likely we’re talking about the 30TB models, as the 32TB drive was only added to the range in December 2024, followed by the 36TB model just a month later.

Assuming the hyperscalers in question paid bulk pricing of around $500 per drive (refurbished models of Seagate's Exos 28TB HDD can currently be purchased for as low as $365), their combined bill likely came to somewhere between $33 and $35 million. For a full exabyte of cutting-edge, high-capacity storage, $16 million or so per customer isn't an unreasonable outlay.
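The quick calculation below shows how those figures fall out, assuming one decimal exabyte per customer, 30TB drives, and roughly $500 per drive; the drive capacity and price are the assumptions stated above, not confirmed Seagate pricing.

```python
# Back-of-the-envelope maths for the exabyte orders described above.
EXABYTE_TB = 1_000_000       # 1 EB = 1,000,000 TB (decimal)
DRIVE_TB = 30                # assumed Exos M capacity per drive
PRICE_PER_DRIVE = 500        # assumed bulk price in USD
CUSTOMERS = 2

drives_per_customer = EXABYTE_TB // DRIVE_TB               # ~33,333 drives each
cost_per_customer = drives_per_customer * PRICE_PER_DRIVE  # ~$16.7 million each

print(drives_per_customer, cost_per_customer)              # 33333 drives, ~$16.7M
print(CUSTOMERS * cost_per_customer)                       # ~$33.3M combined
```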

Seagate previously revealed that a 60TB drive was on its way, and the firm recently announced plans to acquire Intevac, a HAMR specialist, which could help it achieve that 100TB capacity goal faster, as well as ramp up HAMR drive production.




from Latest from TechRadar US in News,opinion https://ift.tt/APUxIKn

Saturday, March 15, 2025

Best Internet Providers in Fayetteville, North Carolina

Fayetteville's fastest provider is Metronet, but it's not the only option in town. CNET's experts bring you the top picks.

from CNET https://ift.tt/nkVYFJl

Latest Tech News


  • Meta is reportedly readying its first in-house AI training chip for deployment
  • The dedicated AI accelerator, made with TSMC, completed tape-out
  • Meta’s shift to custom silicon aims to reduce its dependence on Nvidia hardware

Like many of Nvidia’s highest spending customers, Meta is looking to slash its reliance on the GPU maker’s expensive AI hardware by making its own silicon.

In 2024, the social media giant began advertising for engineers to help build its own state-of-the-art machine learning accelerators, and now, according to an exclusive report from Reuters, Meta is at the testing stage for its first in-house chip designed for training AI systems.

Sources told Reuters that following its first tape-out of the chip, Meta has started a limited deployment, and if testing goes well, it plans to scale production for wider use.

RISC-V business

According to Reuters, “Meta's new training chip is a dedicated accelerator, meaning it is designed to handle only AI-specific tasks. This can make it more power-efficient than the integrated graphics processing units (GPUs) generally used for AI workloads.”

Taiwan-based chipmaker TSMC produced the silicon for Meta as part of the Facebook owner’s Meta Training and Inference Accelerator (MTIA) program, something which Reuters points out has had “a wobbly start for years and at one point scrapped a chip at a similar phase of development.”

In 2023, Meta unveiled its first-generation in-house AI inference accelerator designed to power the ranking and recommendation systems for Facebook and Instagram, and then in April 2024 it debuted a new version that doubled the compute and memory bandwidth.

At the 2024 Hot Chips symposium, Meta revealed that its inference chip was built on TSMC's 5nm process, with the processing elements on RISC-V cores.

Like a growing number of tech firms, Facebook has thrown its weight behind RISC-V in order to realize its AI ambitions, and although the Reuters report doesn’t provide any details on the technical aspects of Meta’s new AI training chip, it seems a fair bet that it too will be based on the open source RISC-V architecture.

The Reuters article does note that Meta executives say they want to start using their own chips for training by next year.




from Latest from TechRadar US in News,opinion https://ift.tt/b5Zx4fR

Heat Domes and Surging Grid Demand Threaten US Power Grids with Blackouts

A new report shows a sharp increase in peak electricity demand, leading to blackout concerns in multiple states. Here's how experts say ...