Tuesday, December 31, 2024

Best Internet Providers in Provo, Utah

Provo may be a Google Fiber hotspot, but some other providers offer great internet options. Here are CNET’s top picks for home internet plans.

from CNET https://ift.tt/C7WQ8iv

Monday, December 30, 2024

Best Dash Cam Deals: Prepare Yourself for 2025 and Keep Your Car Trips Safe

A dash cam can make a big difference when it comes to making a claim through your insurance.

from CNET https://ift.tt/JSf7tsB

Latest Tech News


  • The U7inh instance has 1,920 virtual CPUs and 32TB of memory, and you can run four of them
  • The need to sustain SAP workloads encouraged HPE to work with AWS
  • However, HPE may also be helping AWS win lucrative new leads

As businesses face relentless data growth, challenges like data silos and outdated legacy systems, such as Unix-based servers, increasingly stand in the way of progress.

HPE has positioned its Compute Scale-up Server 3200, which it launched in 2023, as the answer to these challenges, promising scalability and performance for mission-critical applications.

The server supports workloads like SAP HANA and ERP with up to 16 sockets and 32TB of shared memory, enabling seamless scalability and reduced server sprawl. Powered by 4th Gen Intel Xeon Scalable processors (Sapphire Rapids), it doubles core counts, integrates AI accelerators, and features DDR5 memory with PCIe 5.0 for enhanced performance and bandwidth, ideal for data-intensive applications.

Eroding on-premises market share?

AWS has now announced the general availability of a new Amazon Elastic Compute Cloud (Amazon EC2) U7inh-32tb.480xlarge instance which runs on the 16-socket HPE Compute Scale-up Server 3200 and is built on the AWS Nitro System.

While that sounds like a positive move for HPE customers, it does raise concerns about the strategic implications.

AWS’s new U7inh instance features 1,920 vCPUs, 32TB of DDR5 memory, 160 Gbps of EBS bandwidth, and 200 Gbps of network bandwidth. AWS says, “You can run your largest in-memory database workloads like SAP HANA or seamlessly migrate workloads running on HPE hardware to AWS.”
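Those headline figures imply an unusually memory-heavy shape for a single instance. A quick back-of-envelope calculation, using only the numbers quoted above rather than AWS documentation:

```python
# Published figures for the U7inh-32tb.480xlarge, taken from the announcement.
vcpus = 1920
memory_gib = 32 * 1024          # 32TB of DDR5, expressed in GiB
sockets = 16                    # HPE Compute Scale-up Server 3200

vcpus_per_socket = vcpus // sockets         # 120 vCPUs per socket
memory_per_vcpu_gib = memory_gib / vcpus    # ~17.07 GiB per vCPU
```

At roughly 17 GiB per vCPU, the instance is skewed heavily toward memory, which is exactly the profile in-memory databases like SAP HANA want.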

As The Register notes, however, AWS’s announcement of the new offering mentions "customers that currently run on-premises with HPE servers have also asked how we can help them migrate to AWS to take advantage of cloud benefits while continuing to use HPE hardware."

By partnering with AWS, HPE potentially opens the door for the cloud giant to gain access to customers running critical workloads on-premises who may be considering cloud migration.

This partnership could inadvertently help AWS capture more enterprise leads, potentially eroding HPE’s on-premises market share. The timing is particularly striking, as AWS has acknowledged a growing trend of customers revisiting on-prem solutions.

You might also like



from Latest from TechRadar US in News,opinion https://ift.tt/xIZ0K9B

Sunday, December 29, 2024

Best Internet Providers in Presque Isle, Maine

In Presque Isle, national providers often offer the best value. Here’s CNET’s guide to help you choose your next ISP.

from CNET https://ift.tt/RT07gOw

Latest Tech News

New year's resolutions are just the kind of chore you don't need after a week of relaxing excess – so this year we're committing to 'tech resolutions' instead.

These friendlier, less intimidating commitments involve using gadgets or apps to nudge your life in the right direction. Or they can simply help you survive the brutal month that is January. Best of all, they all involve tech.

The tech resolutions below don't involve giving up indulgences or hitting gyms. Instead, they're a mix of ideas from the TechRadar team on how they'll be using gadgets, apps or a mix of the two to try new hobbies, save money or just have a blast with new board game discoveries in early 2025.

There are guides on how to slash your streaming bills, set up your iPhone 16 to take better photos in 2025, and use Notion to plan your new year. But there are also fun side quests, like how to rediscover the joy of CDs and one writer's advice on the best cheap gadgets to raise your bread-making game.

Whichever part of your tech life needs a shot of new year's enthusiasm, you'll find some valuable nuggets of advice below. And if it inevitably all goes wrong, you can always blame the gadgets...

The money savers

1. I'm slashing my streaming bills by 71% in 2025 with subscription hopping – here's how

A person holding a remote to a TV screen showing the Disney Plus, Netflix and Prime Video logos

(Image credit: Netflix / Disney+ / Amazon Prime Video)

Our streaming bills have quickly become bigger than Elon Musk's ego – it's time to take action. TechRadar contributor Esat Dedezade has broken down how he's embracing 'subscription hopping' in 2025 to save hundreds on his bills.

The tactic involves a little planning, but fortunately we've done all of that for you – including a 'cheat sheet' that shows all of the biggest shows landing on Netflix, Prime Video, Disney Plus and more in the first few months of 2025, so you can quickly create your own plan. Trust us, you'll feel extremely smug afterwards.


2. I review EVs for a living – here are 5 ways I'm cutting my charging bills in 2025

A man sitting in an Electrogenic DeLorean DMC-12 and a person holding a phone at an EV charging station.

(Image credit: Leon Poultney / Getty Images)

Owning an EV can be an expensive business, not least because of pesky depreciation. But whether you've bought new or second-hand, there is one thing you can control – charging costs.

TechRadar's EV expert Leon Poultney, who spends roughly 72% of his life on the road in electric cars, has broken down all of his top tips for saving cash on EV charging in 2025. And no, it doesn't involve buying a solar farm.


3. YouTube Premium is the only digital subscription I'm keeping for the whole of 2025 – here's why

Three Android phones on a purple and pink background showing YouTube Premium

(Image credit: Google / YouTube)

YouTube Premium has so many hidden benefits that it may well be the best-value streaming subscription out there. That's the compelling argument made by TechRadar contributor David Nield, who describes why it's the only digital subscription he's keeping for the whole of 2025.

As he describes, YouTube Premium isn't just about getting respite from ads (although that is one major benefit). It also brings a host of bonuses, like YouTube Music – which could convince you to ditch a separate music streaming service.

The life upgraders

4. Why I’m skipping the PS5 Pro in 2025 and upgrading my gaming PC instead

A PS5 Pro next to a pair of hands carefully inserting an MSI graphics card into a PC case.

(Image credit: Sony / Shutterstock / Skrypnykov Dmytro)

Our PS5 Pro review was enough to convince TechRadar contributor Darren Allan to skip the console and make a different gaming plan for 2025 – and it involves cables.

That's because this particular Plan B is built around hooking up a gaming PC in another room to a living room TV. A not inconsiderable task, but one that means saving lots of cash on a PS5 Pro – and ultimately creating a better gaming setup.


5. I’m swapping Spotify for CDs in 2025 with the affordable FiiO DM13 – here’s why

The FiiO DM13 CD player sitting open on a speaker

(Image credit: FiiO)

Do you have a tower of dusty CDs at home that audibly grumble every time you open Spotify? So does TechRadar contributor and CD hoarder Tom Wiggins, but he has a plan to put that right in 2025.

The FiiO DM13, a modern Discman tribute act, is the key to this particular tech resolution. And it means riding the mid-90s comeback and living like it's the height of Britpop, with even better sound quality.

The tech optimizers

6. I’m a photographer – 5 ways to set up your iPhone 16 to take great photos in 2025

Two iPhone 16 Pro phones on a grey background showing their cameras and settings

(Image credit: Apple / Future)

Looking to take better photos with your iPhone in 2025? This guide from TechRadar's former cameras editor will help set you up.

While it's mainly focused on the iPhone 16 and 16 Pro (including their new Camera Control button), a lot of the tips also apply to older iPhones that are running iOS 18.

It's now possible to get the experience and results of a traditional compact camera from your iPhone – here's how to do it.


7. 5 reasons why I'm finally upgrading to Windows 11 in January

A finger touching a screen showing the Windows 11 logo

(Image credit: Shutterstock / mundissima)

Yes, it's finally time – Windows 11 might be a magnet for online criticism (justifiably so, in many cases), but TechRadar computing writer Darren Allan explains why he's going to be upgrading to Microsoft's OS as the new year begins.

And no, it isn't just because the sands of time are running out for Windows 10 – there are also now positive reasons to upgrade, including some much-needed interface improvements.

from Latest from TechRadar US in News,opinion https://ift.tt/QT39XdC

Saturday, December 28, 2024

Latest Tech News

I used to own a lot of CDs. And by “a lot”, I mean a lot. Since I was a teenager in the late ‘90s, a significant chunk of my disposable income went on music, but when Spotify launched in 2009 that began to slow down, until on July 29, 2016 I ordered my final CD from Amazon (Drive Like Jehu’s self-titled album, if you’re interested).

I’ve been pretty much streaming-only ever since, and while I’ve purged a lot of CDs in recent years (mainly the albums I couldn't name a single song from without looking at the track-listing) there are a couple of hundred I can’t bring myself to get rid of.

Without a means to play any of them – I’d streamlined my hi-fi setup to include just a pair of Ruark MR1 Mk 2 speakers and an iFi Uno DAC connected to my disc-less MacBook Air when I moved out of London a few years ago – that seemed stupid.

But I spent a good chunk of my young adult life accumulating these shiny silver discs, some of which were acquired at gigs by obscure math-rock bands of the mid-2000s and certainly won’t be found on Spotify or Apple Music.

Even if they do now live in boxes under the stairs, existing as little more than a back-up archive to a hard drive full of ripped MP3s, they’re still part of my identity. And then one day the perfect solution dropped into my inbox.

What's in a name?

A black FiiO DM13 CD player with a 3.5mm aux cable plugged in.

(Image credit: Future)

Try to buy a small, affordable CD player these days and you’ll mainly find cheap, plastic all-in-one systems made by companies called things like Pjlopj, Lvcdodvd and Gelielim (I actually made one of those up but I bet you don’t know which one).

In all honesty, the name FiiO isn’t much better, but the company at least has some recent pedigree when it comes to making hi-fi gear, with a number of its products earning five-star reviews right here on TechRadar. So news of its new $139 / £139 (around AU$275) DM13 CD player was music to my ears.

Here was a battery-powered CD player that’s barely any bigger than an old Sony Discman – not quite pocket-sized, but small enough to stash in a drawer when not in use – that has Bluetooth onboard so you can connect a pair of headphones, load up a copy of OK Computer and hit the streets like it’s 1997 all over again.

Mine arrived in the post just before Christmas, and its brushed metal chassis makes it vaguely reminiscent of Apple’s old Superdrive (RIP), particularly if you opt for the silver version. It can even convert CDs into MP3s if you hook it up to a computer.

I attempted to pair it with my Ruarks over Bluetooth but with its limited single-line display and very basic instruction manual, getting the two to talk to each other was like trying to change the clock on a microwave using only morse code.

Fortunately, it has standard aux and optical outputs as well, so I just dug out my box of miscellaneous cables (we’ve all got one) and went wired instead. I had created the perfect hi-fi setup for a millennial with limited space.

Don't look back in anger

A pile of CDs on a desk. A speaker and plant can be seen in the background.

(Image credit: Future)
Top tips for CD revivalists

1. Check eBay for CD bargains
A copy of Oasis’s debut album Definitely Maybe will set you back the best part of $35 / £30 on vinyl, but resellers such as Music Magpie (or Discogs in the US) have eBay shops where you can pick up a CD copy for far less. Try your local charity shops, too.

2. Only buy stuff you really love
If you start buying everything on CD you’ll quickly end up with a collection that’s hard to manage and you’ll be forced to purge some of it. Spotify and the other streaming services are the perfect tool for quality control, allowing you to try before you buy.

3. Keep your CDs out of the sun
My CD collection spent over a decade of its life by a window where it would catch the evening sun. As a result a good chunk has severely faded spines and partially bleached back covers, which bothers me from a sentimental perspective rather than a resale one. Try to look after yours better.

It’s not just because I’m a hoarder who can’t let go of the past that I’m planning to spend 2025 like it’s the height of Britpop all over again (and the Oasis reunion has nothing to do with it either).

Spotify’s refusal to increase its streaming quality has been bothering me for some time, but it was only when I was listening to some of those old MP3s that I realised just how noticeable it is. Why was I choosing to listen to audibly inferior versions of stuff I’d spent so much time and money collecting?

I also rarely listen to a full album from start to finish anymore, so rather than just switching to a different digital format I figured a CD player would be the perfect way to reconnect with music all over again.

I’m not the only one who’s plotting a physical-media renaissance. Sales of CDs rose 2% in 2023 and were up again 3.2% in the first half of 2024. That small upward trend is partially down to younger generations developing an interest in owning tangible formats but not having the disposable income to spend on vinyl (apparently it all goes on snozzberry vapes).

I get that. I bought a lot of my CDs for £7 or less in shops like Fopp or Rounder Records (another RIP) in Brighton, UK and the thought of spending over £20 on just one album back then would’ve horrified me. What if it was rubbish? (Which, considering a proportion of it was mid-noughties math-rock, was fairly likely.)

With people like me offloading hoards of old CDs there are bargains to be found – and while the format isn’t as indestructible as was once claimed, the discs are often in decent nick, even if the cases and liner notes aren’t.

I probably won’t ditch Spotify completely. As portable as the FiiO DM13 is – connecting a pair of Bluetooth headphones presented fewer issues and it does have skip protection – my pockets are only big enough for my iPhone 16 Pro. Streaming is also unbeatable when it comes to discovering new stuff, plus I don’t have space to add significantly to my existing CD collection. My bank balance wouldn’t thank me either.

But the contents of those boxes under the stairs are going to get a chance to shine again in 2025 – and it’s all down to another little black box with a silly name.

from Latest from TechRadar US in News,opinion https://ift.tt/PmXZeSV

Best Internet Providers in Salinas, California

For Salinas residents, AT&T Fiber is a solid choice for internet service, but we found some other options to consider, too.

from CNET https://ift.tt/tUzA98F

Best Internet Providers in Tennessee

Are you an ISP from the Volunteer State? Because you're the only 10 I see. Here are CNET's recommendations of the best home internet in Tennessee.

from CNET https://ift.tt/ecZIgk9

Latest Tech News


  • AWS is Netflix's only cloud computing platform
  • But AWS is also part of Amazon, which owns Amazon Prime Video, a huge rival to Netflix
  • Netflix engineers have struggled to keep track of how many resources they use on AWS

Netflix, the world’s most popular streaming platform, may dominate home entertainment, but it’s struggling to manage one of its biggest operational challenges: cloud computing costs.

Despite its tech-forward image, Netflix has admitted it doesn’t fully know how much it spends on the cloud, an oversight made even more surprising given that its cloud provider, AWS, is part of Amazon - owner of Prime Video, one of Netflix’s largest competitors.

Relying on AWS for compute, storage, and networking, Netflix’s cloud infrastructure supports its global streaming service. Engineering teams use self-service tools to create and deploy applications, generating vast amounts of data. However, the complexity of this ecosystem makes it difficult for Netflix to understand exactly how resources are used and how costs accumulate.

Keeping its content flowing

The Platform Data Science Engineering (DSE) team at Netflix has taken on the task of untangling this problem. The team’s mission is to help the company’s engineers understand resource usage, efficiency, and associated costs.

Yet, as Netflix acknowledged in a recent blog post, its cloud cost management is still a work in progress.

To address the challenges it finds itself facing, Netflix has developed two tools: Foundational Platform Data (FPD) and Cloud Efficiency Analytics (CEA). FPD provides a centralized data layer with a standardized model, aggregating data from applications like Apache Spark. CEA builds on this by applying business logic to generate cost and ownership attribution, providing insights into efficiency and usage patterns.
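Netflix hasn't published the internals of FPD or CEA, but the flow it describes (aggregate usage into a standardized model, then apply per-platform cost heuristics to attribute spend to owning applications) can be sketched in a few lines. Everything below, from the record shape to the platform names and rates, is illustrative rather than Netflix's actual schema:

```python
from collections import defaultdict

# Hypothetical usage records in the spirit of a centralized data layer
# like FPD: each row ties platform usage back to an owning application.
usage = [
    {"platform": "spark", "app": "recs",   "cpu_hours": 1200.0},
    {"platform": "spark", "app": "search", "cpu_hours": 300.0},
    {"platform": "flink", "app": "recs",   "cpu_hours": 500.0},
]

# Hypothetical per-platform rates ($ per CPU-hour), standing in for the
# business logic a layer like CEA would apply.
rates = {"spark": 0.05, "flink": 0.08}

def attribute_costs(rows, rates):
    """Roll platform usage up into per-application cost attribution."""
    costs = defaultdict(float)
    for row in rows:
        costs[row["app"]] += row["cpu_hours"] * rates[row["platform"]]
    return dict(costs)

per_app = attribute_costs(usage, rates)   # {'recs': 100.0, 'search': 15.0}
```

The hard part in practice, as the article notes, isn't the arithmetic but getting the ownership and rate data right across multi-tenant platforms.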

The hurdles are significant. Netflix’s sprawling infrastructure includes services with multiple owners, varying cost heuristics, and multi-tenant platforms that complicate tracking.

Data delays and platform-specific customizations add a further layer of complexity. Regular audits and data transformations are necessary to maintain accuracy, but the company admits it has yet to achieve full visibility into its cloud spending.

Looking ahead, Netflix says it plans to expand its tools and incorporate predictive analytics and machine learning to optimize usage and detect cost anomalies.
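In its simplest form, the cost-anomaly detection Netflix says it plans to add could be a rolling z-score over daily spend. A toy sketch, where the window size and threshold are arbitrary choices rather than anything Netflix has disclosed:

```python
import statistics

def flag_cost_anomalies(daily_costs, window=7, threshold=3.0):
    """Flag days whose spend deviates sharply from the trailing window.

    A toy rolling z-score detector; window and threshold are arbitrary.
    """
    anomalies = []
    for i in range(window, len(daily_costs)):
        history = daily_costs[i - window:i]
        mean = statistics.mean(history)
        stdev = statistics.stdev(history) or 1e-9   # avoid divide-by-zero
        if abs((daily_costs[i] - mean) / stdev) > threshold:
            anomalies.append(i)
    return anomalies

# A week of steady spend, then a sudden spike on the final day (index 8).
spend = [100, 101, 99, 100, 102, 98, 100, 100, 250]
```

Production systems would layer seasonality handling and forecasting on top, but the core idea of comparing today's spend against recent history is the same.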

While the company works to refine its approach, its situation highlights a striking irony: the world’s most popular streaming platform relies on its rival’s technology to deliver its own service, yet it is still figuring out the true cost of keeping its content flowing.

from Latest from TechRadar US in News,opinion https://ift.tt/HxfgiUT

Friday, December 27, 2024

Best Hotel Mattresses in 2024

Did you know you can experience the luxury feeling of a hotel mattress at home? These are the best hotel mattresses to buy, tested by our experts.

from CNET https://ift.tt/tJ4ZnWy

Latest Tech News


  • Trillium has hit general availability just months after preview release
  • Powerful AI chip offers more than four times the training performance
  • Google uses it to train Gemini 2.0, the company's advanced AI model

Google has been developing Tensor Processing Units (TPUs), its custom AI accelerators, for over a decade. Just months after its preview release, the company has announced that its sixth-generation TPU, Trillium, has reached general availability and is now available for rent.

Trillium doubles both the HBM capacity and the Interchip Interconnect bandwidth, and was used to train Gemini 2.0, the tech giant’s flagship AI model.

Google reports that Trillium offers up to a 2.5x improvement in training performance per dollar compared to prior TPU generations, making it an appealing option for enterprises seeking efficient AI infrastructure.

Google Cloud’s AI Hypercomputer

Trillium delivers a range of other improvements over its predecessor, including more than four times the training performance. Energy efficiency has been increased by 67%, while peak compute performance per chip has risen by a factor of 4.7.

Trillium naturally improves inference performance as well. Google’s tests indicate over three times higher throughput for image generation models such as Stable Diffusion XL and nearly twice the throughput for large language models compared to earlier TPU generations.

The chip is also optimized for embedding-intensive models, with its third-generation SparseCore providing better performance for dynamic and data-dependent operations.

Trillium TPU also forms the foundation of Google Cloud’s AI Hypercomputer. This system features over 100,000 Trillium chips connected via a Jupiter network fabric delivering 13 Petabits/sec of bandwidth. It integrates optimized hardware, open software, and popular machine learning frameworks, including JAX, PyTorch, and TensorFlow.
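Dividing the quoted fabric bandwidth across the quoted chip count gives a rough sense of per-chip network capacity. Both figures are approximate ("over 100,000 chips"), so treat this as an order-of-magnitude estimate rather than a spec:

```python
# Quoted figures from the Hypercomputer description above.
fabric_pbps = 13          # Jupiter network fabric, petabits per second
chips = 100_000           # "over 100,000 Trillium chips"

gbps_per_chip = fabric_pbps * 1_000_000 / chips   # 1 Pb = 1,000,000 Gb
```

That works out to roughly 130 Gb/s of average fabric bandwidth per chip across the whole system.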

With Trillium now generally available, Google Cloud customers have the opportunity to access the same hardware used to train Gemini 2.0, making high-performance AI infrastructure more accessible for a wide range of applications.

from Latest from TechRadar US in News,opinion https://ift.tt/6KJwYDz

Thursday, December 26, 2024

Latest Tech News


  • Project Infinity and Mobile Security Rewards Program bolster Samsung's security strategy
  • Red, Blue, and Purple teams safeguard Galaxy devices from cyber threats
  • CTI task force scours the Dark Web to prevent device breaches

Samsung has always prioritized security for its Galaxy smartphones, and with the launch of the Galaxy S24 series, it promised an unprecedented seven years of mobile security updates.

Behind this extended protection lies a secretive and highly specialized security initiative known as Project Infinity - but Samsung has now lifted the veil and provided some details about the project.

Project Infinity comprises multiple task forces which ensure that the billions of Galaxy smartphone users worldwide are protected from the ever-growing threat of cybercrime.

The invisible guardians of Galaxy devices

At the core of Project Infinity are three distinct teams, Red, Blue, and Purple, alongside a Cyber Threat Intelligence (CTI) taskforce. These groups operate globally in countries such as Vietnam, Poland, and Brazil, working in the shadows to prevent and mitigate cyberattacks.

Each team has a specific role, from proactive threat detection to creating and deploying defensive measures. Their work is largely invisible to the public, only surfacing when you receive a security patch on your device.

The CTI task force specializes in identifying potential cyber threats, ensuring that hackers can’t exploit vulnerabilities in Galaxy devices. The team scours the Deep Web and Dark Web, looking for signs of illicit activity, from malware to stolen data.

By analyzing system behaviors, such as unusual data requests or suspicious network traffic, the team can identify and neutralize threats, while collaborating with other departments to roll out security updates.

“Occasionally, we engage in security research by simulating real-world transactions," noted Justin Choi, Vice President and Head of the Security Team, Mobile eXperience Business at Samsung Electronics.

"We closely monitor forums and marketplaces for mentions of zero-day or N-day exploits targeting Galaxy devices, as well as any leaked intelligence that could potentially serve as an entry point for system infiltration.”

Samsung’s security operation is modeled on military-style tactics, with the Red and Blue teams simulating attacks and defenses, respectively.

Through techniques like "fuzzing," which involves throwing random data at software, they can find hidden vulnerabilities that might otherwise go unnoticed. Meanwhile, the Blue team works tirelessly to develop and implement patches that protect against these vulnerabilities.
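Fuzzing is conceptually simple: generate many random or malformed inputs, feed them to the target, and record anything that crashes. A minimal, illustrative harness follows; the target here is a toy parser with a planted bug, not anything from Samsung's stack:

```python
import random

def buggy_parser(data: bytes) -> int:
    # Toy target with a planted bug: crashes on empty input.
    header = data[0]              # IndexError when data is empty
    return header % 7

def fuzz(target, iterations=200, seed=0):
    """Throw random byte strings at `target`, recording inputs that raise."""
    rng = random.Random(seed)
    # Seed the corpus with deliberate edge cases, then add random inputs.
    corpus = [b"", b"\x00", bytes(range(256))]
    corpus += [
        bytes(rng.randrange(256) for _ in range(rng.randrange(32)))
        for _ in range(iterations)
    ]
    crashes = []
    for data in corpus:
        try:
            target(data)
        except Exception as exc:
            crashes.append((data, type(exc).__name__))
    return crashes

crashes = fuzz(buggy_parser)
# The empty edge-case input reliably surfaces the hidden IndexError.
```

Industrial fuzzers like AFL or libFuzzer add coverage feedback and input mutation on top of this loop, but the core idea is the same: crashes mark vulnerabilities that code review missed.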

The Purple team combines the expertise of both Red and Blue teams, focusing on critical areas of Galaxy’s security infrastructure. They also work with external security researchers to ensure no potential weak spot goes unnoticed.

from Latest from TechRadar US in News,opinion https://ift.tt/DweYpmX

Latest Tech News


  • HBM4 chips poised to power Tesla's advanced AI ambitions
  • Dojo supercomputer to integrate Tesla’s high-performance HBM4 chips
  • Samsung and SK Hynix compete for Tesla's AI memory chip orders

As the high-bandwidth memory (HBM) market continues to grow, projected to reach $33 billion by 2027, the competition between Samsung and SK Hynix intensifies.

Tesla is fanning the flames, as it has reportedly reached out to both Samsung and SK Hynix, two of South Korea's largest memory chipmakers, seeking samples of their next-generation HBM4 chips.

Now, a report from the Korean Economic Daily claims Tesla plans to evaluate these samples for potential integration into its custom-built Dojo supercomputer, a critical system designed to power the company’s AI ambitions, including its self-driving vehicle technology.

Tesla’s ambitious AI and HBM4 plans

The Dojo supercomputer, driven by Tesla’s proprietary D1 AI chip, helps train the neural networks required for its Full Self-Driving (FSD) feature. This latest request suggests that Tesla is gearing up to replace older HBM2e chips with the more advanced HBM4, which offers significant improvements in speed, power efficiency, and overall performance. The company is also expected to incorporate HBM4 chips into its AI data centers and future self-driving cars.

Samsung and SK Hynix, long-time rivals in the memory chip market, are both preparing prototypes of HBM4 chips for Tesla. These companies are also aggressively developing customized HBM4 solutions for major U.S. tech companies like Microsoft, Meta, and Google.

According to industry sources, SK Hynix remains the current leader in the high-bandwidth memory (HBM) market, supplying HBM3e chips to NVIDIA and holding a significant market share. However, Samsung is quickly closing the gap, forming partnerships with companies like Taiwan Semiconductor Manufacturing Company (TSMC) to produce key components for its HBM4 chips.

SK Hynix seems to have made progress with its HBM4 chip. The company claims that its solution delivers 1.4 times the bandwidth of HBM3e while consuming 30% less power. With a bandwidth expected to exceed 1.65 terabytes per second (TB/s) and reduced power consumption, the HBM4 chips offer the performance and efficiency needed to train massive AI models using Tesla’s Dojo supercomputer.
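Those SK Hynix claims can be combined into a couple of derived figures. Taking the quoted ratios at face value:

```python
hbm4_bandwidth_tbps = 1.65   # expected to exceed this, per the report
bandwidth_ratio = 1.4        # HBM4 vs HBM3e bandwidth, per SK Hynix
power_ratio = 0.70           # "30% less power"

implied_hbm3e_tbps = hbm4_bandwidth_tbps / bandwidth_ratio   # ~1.18 TB/s baseline
bandwidth_per_watt_gain = bandwidth_ratio / power_ratio      # ~2x efficiency
```

In other words, 1.4x the bandwidth at 0.7x the power works out to roughly double the bandwidth per watt, which is the metric that matters most for power-constrained AI training clusters.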

The new HBM4 chips are also expected to feature a logic die at the base of the chip stack, which functions as the control unit for memory dies. This logic die design allows for faster data processing and better energy efficiency, making HBM4 an ideal fit for Tesla’s AI-driven applications.

Both companies are expected to accelerate their HBM4 development timelines, with SK Hynix aiming to deliver the chips to customers in late 2025. Samsung, on the other hand, is pushing its production plans with its advanced 4-nanometer (nm) foundry process, which could help it secure a competitive edge in the global HBM market.

Via TrendForce

from Latest from TechRadar US in News,opinion https://ift.tt/sSFDUyt

Latest Tech News


  • Broadcom is rumored to have an ongoing partnership with Apple to help it build its own AI chip
  • TikTok parent company, ByteDance, OpenAI also reportedly in the picture
  • The move comes as hyperscalers look to reduce their dependency on AI chips from Nvidia

Nvidia has ridden the generative AI boom to record-breaking revenues and profits over the past two years, and while it remains well ahead of its competitors, the company is facing growing pressure - not only from rival AMD but also from hyperscalers which have traditionally relied on Nvidia GPUs but are now looking to reduce their dependence on its hardware.

As The Next Platform notes, “Nvidia’s biggest problem is that its biggest customers have massive enough IT expenditures that they can afford to compete with Nvidia and AMD and design their own XPUs for serial and parallel computing. And when they do so, it is chip design and manufacturing houses Broadcom and Marvell, who have vast expertise running chippery through the foundries of Taiwan Semiconductor Manufacturing Co, who will be benefiting.”

In its most recent earnings conference call, Hock Tan, President and CEO of Broadcom, told investors, “Specific hyperscalers have begun their respective journeys to develop their own custom AI accelerators or XPUs, as well as network these XPUs with open and scalable Ethernet connectivity. As you know, we currently have three hyper-scale customers who have developed their own multi-generational AI XPU roadmap to be deployed at varying rates over the next three years. In 2027, we believe each of them plans to deploy one million XPU clusters across a single fabric.”

Gaining its fair share

Without naming specific companies, Tan added, “To compound this, we have been selected by two additional hyperscalers and are in advanced development for their own next-generation AI XPUs.”

It is widely believed that Broadcom is working with Google and Meta, and as we previously reported, with ByteDance and OpenAI on custom AI chips.

Apple is also thought to be developing its first artificial intelligence server chip, codenamed “Baltra,” with Broadcom providing the advanced networking technologies essential for AI processing.

During the Q&A portion of the earnings call, when Tan was asked about market share, he responded, “All we are going to do is gain our fair share. We're just very well positioned today, having the best technology, very relevant in this space. We have, by far, one of the best combination technologies out there to do XPUs and to connect those XPUs. The silicon technology that enables it, we have it here in Broadcom by the boatloads, which is why we are very well positioned with these three customers of ours.”

from Latest from TechRadar US in News,opinion https://ift.tt/8N9PaFG

Wednesday, December 25, 2024

Best Outdoor Smart Plugs for 2024

Being outside doesn’t mean you can’t stay connected, thanks to our list of the best outdoor smart plugs you can buy in 2024.

from CNET https://ift.tt/LuNyUYz

Latest Tech News


  • Castrol planning fluid-as-a-service model launch to eliminate waste and increase sustainability
  • Immersion cooling has emerged as an essential component in the race to reach AGI
  • Castrol wants to play a key role in immersion cooling as integrated smart city data centers become mainstream

Founded in 1899, CC Wakefield & Co. Limited initially focused on producing lubricants for trains and heavy machinery. Over time, the company expanded its expertise to develop specialized lubricants for automobiles and airplane engines, incorporating castor oil - a plant-based oil derived from castor beans - to ensure performance under extreme temperature conditions. The product was called Castrol, and the company was later renamed after its famous creation.

125 years later, Castrol remains at the forefront of innovation, applying its extensive expertise in fluid engineering to address modern challenges.

One of its key focus areas is the development of advanced dielectric fluids for immersion cooling systems. This approach sees entire servers submerged in non-conductive fluids that absorb and transfer heat away from the components, eliminating the need for traditional fans.

Advanced thermal management

The Castrol ON Liquid Cooling Centre of Excellence in Pangbourne, UK, serves as a state-of-the-art research and development hub for liquid cooling technologies.

The facility develops customized solutions and rigorously tests fluid dynamics, material compatibility, and server performance, to address the challenges of traditional cooling methods.

In a recent visit, StorageReview had the opportunity to see Castrol’s cutting-edge immersion tanks from providers like GRC and Submer and was impressed by the adaptability and efficiency of the solutions.

Writer Jordan Ranous noted, “In one of the test cells, we observed GRC’s tank, which had a striking green glow due to the specific fluid Castrol was using. The servers submerged in this tank were undergoing compatibility and performance testing. Castrol ensures that every component, from CPUs to cables, can operate effectively in immersion cooling environments without degradation.”

Castrol’s ON range of single-phase dielectric fluids, including DC15 and DC20, aims to deliver advanced thermal management, durability, and safety while maintaining efficient performance at operating temperatures between 40°C and 50°C, with some systems capable of handling up to 70°C.

Chris Lockett, VP of Electrification and Castrol Product Innovation at BP, Castrol’s parent company, told StorageReview, “At the moment, about 40% of power consumption in data centers goes toward cooling. Immersion cooling can drop that figure to less than 5%, significantly lowering power and water usage.”

Data centers account for an estimated 2–3% of global power consumption, with current liquid cooling efforts primarily focused on direct-to-chip solutions. Immersion cooling has the potential to establish a new standard for thermal management and Castrol wants to lead this transformation, positioning itself as “a one-stop partner for the liquid cooling solutions of today and tomorrow.”
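Taking Lockett's figures at face value, the facility-level saving implied by moving cooling from 40% of total power to under 5% can be estimated with a quick back-of-the-envelope calculation (the 1 MW IT load below is an assumed example, not a figure from the article):

```python
def facility_power_kw(it_load_kw: float, cooling_share: float) -> float:
    """Total facility power when cooling consumes `cooling_share` of the total.

    The IT load accounts for the remaining (1 - cooling_share) of facility power.
    """
    return it_load_kw / (1.0 - cooling_share)

it_load_kw = 1000.0  # assumed 1 MW of IT equipment, for illustration only

air_cooled = facility_power_kw(it_load_kw, 0.40)  # ~40% of power goes to cooling
immersion = facility_power_kw(it_load_kw, 0.05)   # immersion: under 5%

saving = 1.0 - immersion / air_cooled
print(f"Air-cooled total:  {air_cooled:.0f} kW")
print(f"Immersion total:   {immersion:.0f} kW")
print(f"Facility power saved: {saving:.1%}")
```

Under these assumptions, immersion cooling trims total facility power by roughly a third for the same IT load, before counting any water savings.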

You might also like



from Latest from TechRadar US in News,opinion https://ift.tt/RNmptK2

Tuesday, December 24, 2024

Best Internet Providers in North Dakota

The Peace Garden State boasts fast local and national internet providers. Here are our top picks for North Dakota.

from CNET https://ift.tt/xsilODL

Latest Tech News


  • $250 GPU card is competitive with both the GeForce 4060 and the RX 7600 on numerous benchmarks
  • However, both are set to be replaced by new models launching at CES 2025
  • Driver updates from Intel will hopefully drive the performance of the B580 even further

Over two years after its first discrete GPU release, Intel has launched the Arc B580 “Battlemage,” marking its second generation of dedicated graphics cards.

The B580, which will mostly be sold through add-in-board (AIB) partners like Maxsun, Sparkle, and ASRock, features Intel’s updated Xe2 architecture.

It offers efficiency improvements and second-generation Ray Tracing Units (RTUs) alongside enhanced XMX engines, Intel’s counterpart to Nvidia’s Tensor cores.

Unfortunate timing

Puget Systems recently put the $250 card through its paces and found it competes effectively with Nvidia’s GeForce RTX 4060 and AMD’s Radeon RX 7600 across a range of benchmarks. With 12GB of VRAM, the B580 stands out in the budget category, surpassing the RTX 4060’s 8GB at a lower price point.

This additional memory gives it an edge in workflows demanding higher VRAM capacity, such as GPU effects in Premiere Pro and Unreal Engine, but performance in creative applications delivered mixed, and surprising, results.

In graphics-heavy tasks like GPU effects for DaVinci Resolve, Adobe After Effects, and Unreal Engine, the B580 impressed, often matching or exceeding more expensive GPUs. Puget Systems noted the B580 matched the RTX 4060 across resolutions in Unreal Engine while benefiting from its superior VRAM capacity.

Unfortunately, inconsistencies in media acceleration held it back in other areas. In Premiere Pro, for example, Intel’s hardware acceleration for HEVC codecs lagged behind expectations, with Puget Systems observing slower results compared to software-based processing. These issues appear to be driver-related, something Intel is likely to address in upcoming updates.

Shortly after it launched in 2022, Puget Systems tested the Arc A750 (8GB and 16GB models) and came away disappointed. The B580 shows clear improvements over its predecessor, and Intel’s continued driver development will no doubt extend the performance of the B580 even further. Intel's release timing is unfortunate, however.

While the B580 is a strong contender in the entry-level segment right now, Nvidia and AMD are expected to reveal replacements for the GeForce RTX 4060 and the RX 7600 at CES 2025, and those new models are likely to diminish the appeal, and competitiveness, of Intel's new GPU significantly.

You might also like



from Latest from TechRadar US in News,opinion https://ift.tt/6WrITmd

Monday, December 23, 2024

A Week Left to Spend Your 2024 FSA Money: How It Works and What You Can Buy

If you don't use your Flexible Spending Account funds, you could lose them at the end of the year.

from CNET https://ift.tt/U6F10uA

Latest Tech News


  • HBM is fundamental to the AI revolution as it allows ultra fast data transfer close to the GPU
  • Scaling HBM performance is difficult if it sticks to JEDEC protocols
  • Marvell and others want to develop a custom HBM architecture to accelerate its development

Marvell Technology has unveiled a custom HBM compute architecture designed to increase the efficiency and performance of XPUs, a key component in the rapidly evolving cloud infrastructure landscape.

The new architecture, developed in collaboration with memory giants Micron, Samsung, and SK Hynix, aims to address limitations in traditional memory integration by offering tailored solutions for next-generation data center needs.

The architecture focuses on improving how XPUs - used in advanced AI and cloud computing systems - handle memory. By optimizing the interfaces between AI compute silicon dies and High Bandwidth Memory stacks, Marvell claims the technology reduces power consumption by up to 70% compared to standard HBM implementations.

Moving away from JEDEC

Additionally, its redesign reportedly decreases silicon real estate requirements by as much as 25%, allowing cloud operators to expand compute capacity or include more memory. This could potentially allow XPUs to support up to 33% more HBM stacks, massively boosting memory density.
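The 25% area figure and the 33% stack figure quoted above are consistent with each other. Under the simplifying assumption that the freed interface area converts directly into room for extra HBM stacks:

```python
# If the custom HBM interface shrinks to 75% of the standard footprint
# (a 25% reduction), the same silicon area fits 1/0.75 ≈ 1.33x as many
# stacks. This is a consistency check on the article's figures, not a
# claim about Marvell's actual floorplan.
standard_area = 1.0
custom_area = standard_area * (1.0 - 0.25)  # 25% smaller interface

extra_stacks = standard_area / custom_area - 1.0
print(f"Additional HBM stacks in the same area: {extra_stacks:.0%}")
```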

“The leading cloud data center operators have scaled with custom infrastructure. Enhancing XPUs by tailoring HBM for specific performance, power, and total cost of ownership is the latest step in a new paradigm in the way AI accelerators are designed and delivered,” Will Chu, Senior Vice President and General Manager of the Custom, Compute and Storage Group at Marvell said.

“We’re very grateful to work with leading memory designers to accelerate this revolution and help cloud data center operators continue to scale their XPUs and infrastructure for the AI era.”

HBM plays a central role in XPUs, which use advanced packaging technology to integrate memory and processing power. Traditional architectures, however, limit scalability and energy efficiency.

Marvell’s new approach modifies the HBM stack itself and its integration, aiming to deliver better performance for less power and lower costs - key considerations for hyperscalers who are continually seeking to manage rising energy demands in data centers.

ServeTheHome’s Patrick Kennedy, who reported the news live from Marvell Analyst Day 2024, noted that cHBM (custom HBM) is not a JEDEC solution and so will not be standard, off-the-shelf HBM.

“Moving memory away from JEDEC standards and into customization for hyperscalers is a monumental move in the industry,” he writes. “This shows Marvell has some big hyperscale XPU wins since this type of customization in the memory space does not happen for small orders.”

The collaboration with leading memory makers reflects a broader trend in the industry toward highly customized hardware.

“Increased memory capacity and bandwidth will help cloud operators efficiently scale their infrastructure for the AI era,” said Raj Narasimhan, senior vice president and general manager of Micron’s Compute and Networking Business Unit.

“Strategic collaborations focused on power efficiency, such as the one we have with Marvell, will build on Micron’s industry-leading HBM power specs, and provide hyperscalers with a robust platform to deliver the capabilities and optimal performance required to scale AI.”

More from TechRadar Pro



from Latest from TechRadar US in News,opinion https://ift.tt/9UONTXo

Sunday, December 22, 2024

Best Internet Providers in Las Vegas, Nevada

Las Vegas has a decent variety of options for good internet. This list will help you make your choice based on speed, value and availability.

from CNET https://ift.tt/DVQl7xY

Latest Tech News


  • First look at Dell Pro Max 18 Plus emerges in new images
  • Pictures show a completely redesigned mobile workstation laptop
  • Pro Max could either replace popular Precision range or be a whole new range, offering up to 256GB RAM and up to 16TB SSD

Leaked details suggest Dell is developing a new addition to its workstation offerings, designed to deliver high-performance capabilities for professional workloads.

Available in two sizes, the Dell Pro Max 18 Plus is expected to debut officially at CES 2025; the range could either replace the popular Precision line or form an entirely new one.

The device allegedly features an 18-inch display, while the Pro Max 16 Plus provides a smaller 16-inch alternative with similar specifications. According to information shared by Song1118 on Weibo, which includes Dell marketing slides, the laptops will be powered by Intel’s upcoming Core Ultra 200HX “Arrow Lake-HX” CPUs. For graphics, the series will reportedly feature Nvidia’s Ada-based RTX 5000-class workstation GPUs, though the exact model isn’t named in the leaked documents.

Triple-fan cooling system

The Pro Max series is set to offer up to 200 watts for the CPU/GPU combination in the 18-inch version and 170 watts in the 16-inch model. VideoCardz notes that while we have already seen much higher targets in ultra-high-end gaming machines, “this would be the first laptop confirmed to offer 200W for a next-gen Intel/Nvidia combo.”

The laptops will reportedly support up to 256GB of CAMM2 memory. The 18-inch model can accommodate up to 16TB of storage via four M.2 2280 SSD slots, while the 16-inch version supports 12TB with three slots. The heat generated by these high-power components will be managed by an “industry first” triple-fan cooling system.

Additional features look to include a magnesium alloy body to reduce weight, an 8MP camera, and a tandem OLED display option. Connectivity options include Thunderbolt 5 (80/120Gbps), WiFi 7, Bluetooth 5.4, and optional 5G WWAN. The two laptops also feature a quick-access bottom cover for easy serviceability and repairability of key components like batteries, memory, and storage.

The Dell Pro Max 16/18 Plus laptops are expected to be officially unveiled along with pricing at CES on January 7, 2025, with a mid-2025 release window.

You might also like



from Latest from TechRadar US in News,opinion https://ift.tt/IsbKH3r

Saturday, December 21, 2024

Latest Tech News

  • Shuttle XH610G2 offers compact design supporting Intel Core processors up to 24 cores
  • Exclusive heat pipe technology ensures reliable operation in demanding environments
  • Flexible storage options include M.2 slots and SATA interfaces

Shuttle has released its latest mini PC, aimed at meeting the diverse demands of modern commercial tasks.

With a compact 5-liter chassis measuring just 250mm x 200mm x 95mm, the Shuttle XH610G2 employs the Intel H610 chipset, making it compatible with a broad spectrum of Intel Core processors, from the latest 14th Gen models back to the 12th Gen series.

The company says the device is designed to handle applications that require significant computational power like image recognition, 3D video creation, and AI data processing.

Shuttle XH610G2

The Shuttle XH610G2 features exclusive heat pipe cooling technology that allows it to operate reliably even in demanding environments. Capable of withstanding temperatures from 0 to 50 degrees Celsius, it is suitable for continuous operation in a variety of commercial settings.

The Shuttle XH610G2 can accommodate Intel Core models with up to 24 cores and a peak clock speed of 5.8GHz. This processing power allows the workstation to handle intensive tasks while staying within a 65W thermal design power (TDP) limit. The graphics are enhanced by the integrated Intel UHD graphics with Xe architecture, offering capabilities to manage demanding visual applications, from high-quality media playback to 4K triple-display setups. The inclusion of dual HDMI 2.0b ports and a DisplayPort output facilitates independent 4K display support.

The XH610G2 offers extensive customization and scalability with support for dual PCIe slots, one x16 and one x1, allowing users to install discrete graphics cards or other high-performance components like video capture cards.

For memory, the XH610G2 supports up to 64GB of DDR5-5600 SO-DIMM memory split across two slots, making it well suited to resource-intensive applications. Running at a low 1.1V, this memory configuration also minimizes energy consumption, a significant advantage in power-conscious environments.

In terms of storage, this device features a SATA 6.0Gb/s interface for a 2.5-inch SSD or HDD, along with two M.2 slots for NVMe and SATA storage options. Users are recommended to choose a SATA SSD over a traditional HDD to ensure faster performance.

The I/O options on the XH610G2 further enhance its flexibility: four USB 3.2 Gen 1 ports, two Ethernet ports (one 1GbE, one 2.5GbE), and an optional RS232 COM port for specialized peripherals, which can be particularly useful in industrial or legacy environments.

Furthermore, the compact chassis includes M.2 expansion slots for both WLAN and LTE adapters, providing options for wireless connectivity that can be critical in setups where wired connections are not feasible.

You might also like



from Latest from TechRadar US in News,opinion https://ift.tt/YuyqwPS

Latest Tech News

  • TeamGroup claims CAMM2 memory promises high-speed DDR5 performance
  • Revolutionary design offers dual-channel operation in a single module
  • Limited motherboard compatibility poses challenges for CAMM2 adoption

TeamGroup has introduced its Compression Attached Memory Module 2 (CAMM2), promising high-speed DDR5 performance with its new T-Create lineup.

The company says CAMM2 features a revolutionary design that offers significant advantages over traditional memory types like SO-DIMM, U-DIMM, and R-DIMM. It supports dual-channel operation with just one module, streamlining system architecture and lowering power consumption.

The built-in Client Clock Driver (CKD) boosts signal integrity, making CAMM2 well-suited for slim notebooks while its optimized thermal design enhances heat dissipation, allowing higher performance despite the smaller form factor.

CAMM2-compatible motherboards are very scarce

The T-Create CAMM2 modules are designed with DDR5-7200 specifications and a CAS latency of CL34-42-42-84, delivering remarkable read, write, and copy speeds of up to 117GB/s, 108GB/s, and 106GB/s, respectively.

This performance is achieved through manual overclocking, which has driven latency down to 55ns, a significant reduction compared to typical DDR5 JEDEC specifications. TeamGroup is now focused on pushing boundaries and the company says it is working to achieve even faster speeds, aiming to reach DDR5-8000 and even DDR5-9000 in future iterations.
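Those throughput figures sit close to the theoretical ceiling of the interface. Assuming the usual two 64-bit DDR5 channels (which a single CAMM2 module carries), peak bandwidth at DDR5-7200 works out as follows; this is a generic DDR5 calculation, not a figure from TeamGroup:

```python
transfers_per_sec = 7200e6  # DDR5-7200: 7,200 megatransfers per second
bytes_per_transfer = 8      # one 64-bit DDR5 channel moves 8 bytes per transfer
channels = 2                # a single CAMM2 module carries both channels

peak_gb_s = transfers_per_sec * bytes_per_transfer * channels / 1e9
print(f"Theoretical peak: {peak_gb_s:.1f} GB/s")  # ~115.2 GB/s
```

The measured 117GB/s read figure slightly exceeding this theoretical 115.2GB/s is consistent with the manual overclocking TeamGroup describes.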

One major hurdle for TeamGroup lies in the availability of CAMM2-compatible motherboards, which remains limited. The T-Create CAMM2 memory was tested on MSI’s Z790 Project Zero, one of the few boards currently compatible with the new form factor.

Other brands, such as Gigabyte, hint at possible CAMM2-enabled designs, like an upcoming TACHYON board. However, the CAMM2 ecosystem is still emerging, and widespread adoption may depend on the release of more compatible boards and competitive pricing.

Nevertheless, TeamGroup expects to launch the first-generation T-Create CAMM2 modules by Q1 2025, with broader motherboard support potentially arriving as manufacturers introduce new CPU platforms. With AMD and Intel rumoured to announce budget-friendly CPUs at CES 2025, the rollout of mid-range boards compatible with CAMM2 could align with TeamGroup’s release plans, potentially helping CAMM2 secure a foothold in the market.

CAMM2 offers a couple of advantages over the widely used SO-DIMM, UDIMM, and RDIMM standards. Notably, CAMM2 modules operate in dual-channel mode while only occupying a single physical slot. Furthermore, they incorporate a Client Clock Driver (CKD), similar to CUDIMM memory, which bolsters signal integrity at high speeds, allowing for more reliable and faster memory performance.

These features make CAMM2 particularly appealing for laptops, which often face limitations with current SO-DIMM speeds or non-upgradeable LPDDR5/5X options.

Via Tom's Hardware

You might also like



from Latest from TechRadar US in News,opinion https://ift.tt/wjS3NFW

Latest Tech News


  • We might not see the OnePlus Open 2 until later in 2025
  • Previous leaks predicted a Q1 2025 launch
  • Major upgrades have been rumored for the foldable

A quick browse through our OnePlus Open review will tell you why we're very much looking forward to the foldable phone's successor – though if a new leak is to be believed, the wait for the OnePlus Open 2 might be longer than originally thought.

According to tipster Sanju Choudhary (via GSMArena), the handset is going to break cover during the second half of next year – anytime from July onwards. That contradicts an earlier rumor that it would be unveiled in the first three months of 2025.

There's no indication whether or not OnePlus has changed its plans, or if the launch date was originally set for the first quarter of next year and has since been pushed back (engineering foldable phones is a tricky challenge, after all).

It's also fair to say that none of these rumors can be confirmed until OnePlus actually makes its announcement. The original OnePlus Open was launched in October 2023, which doesn't really tell us much about a schedule for its successor.

Upgrades on the way

Whenever the next OnePlus folding phone shows up, it sounds like it's going to be worth the wait – which has lasted 14 months and counting. Rumors have pointed to major upgrades in terms of the rear camera and the internal components.

We've also heard that the OnePlus Open 2 will have the biggest battery ever seen in a foldable, as well as being thinner and more waterproof than the handset it's replacing. That's a significant number of improvements to look forward to.

In our OnePlus Open review, we described the phone as "the only foldable phone that doesn't compromise", and there was particular praise for the design and the camera setup – so the upcoming upgrade has a lot to live up to.

Before we see another foldable from OnePlus, we'll see the OnePlus 13 and the OnePlus 13R made available worldwide: OnePlus has confirmed this is happening on January 7, so we could also get a teaser for the OnePlus Open 2 at the same time.

You might also like



from Latest from TechRadar US in News,opinion https://ift.tt/bJIzVCo

Friday, December 20, 2024

Latest Tech News

  • 15% of Steam users' playtime dedicated to 2024 games
  • 47% of playtime on games up to eight years old
  • Many reasons for this, including more older games to play

Steam’s end-of-year review has always revealed some fascinating PC gaming trends, and this year’s is no exception. According to 2024’s stats, only 15% of Steam users’ total playtime went to games that launched in 2024.

Digging further into the data PC Gamer reports on, 47% of total playtime on Steam was spent on games released in the last seven years, while 37% went to games that launched eight or more years ago. The question is: why, and what does this mean?

One possible explanation is that gamers could be focusing more on their backlogs rather than new releases. We do know that playtime for current releases is higher this year than in 2023, as there was an increase from 9% to 15%, which means players are buying new titles at least. There are other possibilities for this trend as well.

Other possibilities for this statistic

One reason could be that older games are easier to access due to their cheaper prices, especially due to the many Steam sales. There’s also the influence of the Steam Deck and what’s considered ‘Steam Deck playable,’ since many recent AAA games may be too demanding for a portable PC.

There’s also the fact that older live service games like Counter-Strike, Dota 2, and PUBG have made up Steam's Most Played charts, while newer titles have an incredibly difficult time breaking through and building a player base.

Another reason is that Steam has over 200,000 titles released over the course of decades, compared to the relatively paltry 18,000 games released in 2024 according to SteamDB. So naturally, more users will spend more time playing older games versus recent ones.

Regardless, 15% of playtime dedicated to new games is a respectable showing, not far off 2022’s 17%, and it means the numbers are recovering after the dip to 9% in 2023. Hopefully next year we’ll see another increase as gamers delve into more new titles.
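The year-on-year shares quoted above line up as a quick sanity check (a trivial sketch; the 9% figure for 2023 comes from the comparison earlier in the piece):

```python
# Share of total Steam playtime spent on that year's new releases.
new_game_share = {2022: 17, 2023: 9, 2024: 15}

dip = new_game_share[2022] - new_game_share[2023]       # the 2023 slump
recovery = new_game_share[2024] - new_game_share[2023]  # the 2024 rebound
print(f"2023 dip: -{dip} points; 2024 recovery: +{recovery} points")

# 2024's full split: 15% current-year + 47% (1-7 years old) + 37% (8+ years)
# accounts for roughly all playtime (99%, with rounding).
assert new_game_share[2024] + 47 + 37 == 99
```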

You might also like...



from Latest from TechRadar US in News,opinion https://ift.tt/oy487SC

Latest Tech News


  • OpenAI announced upcoming o3 and o3-mini AI models.
  • The new models are enhanced "reasoning" AI models that build on the o1 and o1-mini models released this year.
  • Both models handily outperform existing AI models and will roll out in the next few months.

The final day of the 12 Days of OpenAI brought back OpenAI CEO Sam Altman to show off a brand new set of AI models coming in the new year. The o3 and o3-mini models are enhanced versions of the relatively new o1 and o1-mini models. They're designed to think before they speak, reasoning out their answers. The mini version is smaller and aimed more at carrying out a limited set of specific tasks, but with the same approach.

OpenAI is calling it a big step toward artificial general intelligence (AGI), which is a pretty bold claim for what is, in some ways, a mild improvement to an already powerful model. You might have noticed there's a number missing between the current o1 and the upcoming o3 model. According to Altman, that's because OpenAI wants to avoid any confusion with British telecom company O2.

So, what makes o3 special? Unlike regular AI models that spit out answers quickly, o3 takes a beat to reason things out. This “private chain of thought” lets the model fact-check itself before responding, which helps it avoid some of the classic AI pitfalls, like confidently spewing out wrong answers. This extra thinking time can make o3 slower, even if only a little bit, but the payoff is better accuracy, especially in areas like math, science, and coding.

One great aspect of the new models is that you can adjust that extra thinking time manually. If you’re in a hurry, you can set it to “low compute” for quick responses. But if you want top-notch reasoning, crank it up to “high compute” and give it a little more time to mull things over. In tests, o3 has easily outstripped its predecessor.
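To illustrate the idea, here is a hypothetical sketch of how such a setting might surface in client code. The model identifier, parameter name, and accepted values below are assumptions for illustration only, not OpenAI's documented API:

```python
def build_request(prompt: str, effort: str = "low") -> dict:
    """Build a hypothetical o3 request; 'effort' trades latency for accuracy."""
    if effort not in ("low", "medium", "high"):
        raise ValueError("effort must be 'low', 'medium', or 'high'")
    return {
        "model": "o3-mini",          # assumed model identifier
        "reasoning_effort": effort,  # assumed parameter name
        "messages": [{"role": "user", "content": prompt}],
    }

# Quick answer for a simple question; more deliberation for a hard one.
quick = build_request("What is 12 * 7?", effort="low")
careful = build_request("Prove that x**2 >= 0 for all real x.", effort="high")
```

The design choice mirrors what Altman described: the same model, with the caller deciding how much thinking time a given question deserves.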

This is not quite AGI; o3 can't take over for humans in every way. It also does not reach OpenAI's definition of AGI, which describes models that outperform humans in the most economically valuable projects. Still, should OpenAI reach that goal, things get interesting for its partnership with Microsoft since that would end OpenAI's obligation to give Microsoft exclusive access to the most advanced AI models.

New year, new models

Right now, o3 and its mini counterpart aren’t available to everyone. OpenAI is giving safety researchers a sneak peek via Copilot Labs, and the rest of us can expect the o3-mini model to drop in late January, with the full o3 following soon after. It’s a careful, measured rollout, which makes sense given the kind of power and complexity we’re talking about here.

Still, o3 gives us a glimpse of where things are headed: AI that doesn’t just generate content but actually thinks through problems. Whether it gets us to AGI or not, it’s clear that smarter, reasoning-driven AI is the next frontier. For now, we’ll just have to wait and see if o3 lives up to the hype or if this last gift from OpenAI is just a disguised lump of coal.

You might also like



from Latest from TechRadar US in News,opinion https://ift.tt/AtjHbLM

Latest Tech News


  • Asus' latest monitor releases come with a kit to mount a mini PC at the back
  • There's also a groove to place your smartphone, plus an integrated USB hub
  • Sadly it is not a 4K display, merely a full HD+ one

Mini PCs are becoming increasingly powerful, and with their compact designs and wealth of ports they offer a versatile solution for users who need a capable setup but don’t necessarily have the workspace to dedicate to a traditional desktop PC.

Recognizing this trend, Asus has introduced two 24-inch monitors, the BE248CFN and BE248QF, which are designed to accommodate these miniature marvels. Each monitor includes a mounting kit to securely attach a mini PC at the back of the stand, positioned closer to the base for easier access.

The two monitors offer other practical features, including a groove at the base that you can use to stash a smartphone. There’s also an integrated USB hub for users managing multiple devices.

Not 4K, sadly

(Image: the Asus BE248CFN with a mini PC mounted behind the screen. Credit: Asus)

Both models offer ergonomic adjustments to suit various viewing preferences. The stands tilt from -5 to 35 degrees, swivel 180 degrees left and right, pivot 90 degrees in either direction, and offer 130mm of height adjustment. The IPS panels deliver wide 178-degree viewing angles and 16.7 million colors, with a 5ms response time, 350cd/m² brightness, and a 3,000:1 contrast ratio.

Rather disappointingly, the display resolution of the two screens is Full HD+ (1,920 x 1,200), rather than 4K upwards, which may limit their appeal to those requiring higher detail or sharper visuals, such as content creators, or those who like to have a lot of windows open on screen at the same time.

Connectivity varies slightly between the two models. The BE248CFN includes HDMI 1.4, DisplayPort 1.4, USB Type-C with a 96W power delivery function, a four-port USB 3.2 Gen 1 hub, and Gigabit Ethernet. The BE248QF adds a mini D-Sub 15-pin connector, catering to users with legacy hardware.

Both monitors incorporate 2W stereo speakers and Asus Eye Care technologies, such as Flicker-Free and Low Blue Light, which should make them comfortable to use during extended work sessions.

There’s no word on pricing or global availability as yet, but they should be on sale soon, starting in Japan, before hopefully heading to other countries.

You might also like




from Latest from TechRadar US in News,opinion https://ift.tt/AIZ1pNj

Thursday, December 19, 2024

I Set Up My Own ADT Home Security System. Here's How It Works

Commentary: I didn't need a technician to come to my home to set up ADT's smart security system. Here's what it includes and how I did my own DIY installation.

from CNET https://ift.tt/VB5lSek

Latest Tech News


  • Apple developing "Baltra" server chip for AI, targeting 2026 production
  • Israeli silicon team leading project; Mac chip canceled for focus
  • Broadcom collaboration and TSMC’s N3P tech to enhance development

Apple is reportedly developing its first server chip tailored specifically for artificial intelligence.

A paywalled report by Wayne Ma and Qianer Liu in The Information claims the project, codenamed “Baltra,” aims to address the growing computational demands of AI-driven features and is expected to enter mass production by 2026.

Apple’s silicon design team in Israel, which was responsible for designing the processors that replaced Intel chips in Macs in 2020, is now leading the development of the AI processor, according to sources. To support this effort, Apple has reportedly canceled the development of a high-performance Mac chip made up of four smaller chips stitched together.

Central to Apple’s efforts

The report notes this decision, made over the summer, is intended to free up engineers in Israel to focus on Baltra, signaling Apple’s shift in priorities toward AI hardware.

Apple is working with semiconductor giant Broadcom on this project, using the company’s advanced networking technologies needed for AI processing. While Apple usually designs its chips in-house, Broadcom’s role is expected to focus on networking solutions, marking a new direction in their partnership.

To make the AI chip, The Information says Apple plans to use TSMC’s advanced N3P process, an upgrade from the technology behind its latest processors, like the M4. This move highlights Apple’s focus on enhancing performance and efficiency in its chip designs.

The Baltra chip is expected to drive Apple’s efforts to integrate AI more deeply into its ecosystem. By leveraging Broadcom’s networking expertise and TSMC's advanced manufacturing techniques, Apple appears determined to catch up to rivals in the AI space and establish a stronger presence in the industry.

In November 2024, we reported that Apple approached its long-time manufacturing partner Foxconn to build AI servers in Taiwan. These servers, using Apple’s M-series chips, are intended to support Apple Intelligence features in iPhones, iPads, and MacBooks.

You might also like



from TechRadar - All the latest technology news https://ift.tt/7KHpRVC

Wednesday, December 18, 2024

Sony’s WF-1000XM5 Wireless Earbuds Make a Great Gift at This Record-Low Price

The Sony WF-1000XM5 wireless earbuds offer superb sound quality and you can now snag them at Amazon for $198, their lowest price ever.

from CNET https://ift.tt/CFWAean

Latest Tech News


  • Huawei may be adding HBM support to Kunpeng SoC
  • Clues hint at a replacement for the Kunpeng 920, launched in 2019
  • New SoC with HBM may target HPC, server market rivals

Huawei engineers have reportedly released new Linux patches to enable driver support for High Bandwidth Memory (HBM) management on the company’s ARM-based Kunpeng high-performance SoC.

The Kunpeng 920, which debuted in January 2019 as the company’s first server CPU, is a 7nm processor featuring up to 64 cores based on the Armv8.2 architecture. It supports eight DDR4 memory channels and has a thermal design power (TDP) of up to 180W. While these specifications were competitive when first introduced, things have moved on significantly since.

Introducing a new Kunpeng SoC with integrated HBM would align with industry trends as companies seek to boost memory bandwidth and performance in response to increasingly demanding workloads. It could also signal Huawei’s efforts to maintain competitiveness in the HPC and server markets dominated by Intel Xeon and AMD EPYC.

No official announcement... yet

Phoronix’s Michael Larabel notes that Huawei has not yet formally announced a new Kunpeng SoC (with or without HBM), and references to it are sparse. Kernel patches, however, have previously indicated work on integrating HBM into the platform.

The latest patches specifically address power control for HBM devices on the Kunpeng SoC, introducing the ability to power on or off HBM caches depending on workload requirements.

The patch series includes detailed descriptions of this functionality. Huawei explains that HBM offers higher bandwidth but consumes more power. The proposed drivers will allow users to manage HBM power consumption, optimizing energy use for workloads that do not require high memory bandwidth.

The patches also introduce a driver for HBM cache, enabling user-space control over this feature. By using HBM as a cache, operating systems can leverage its bandwidth benefits without needing direct awareness of the cache’s presence. When workloads are less demanding, the cache can be powered down to save energy.
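As a sketch of the workflow those patches enable, the snippet below builds the kind of one-line toggle an administrator might script around such a driver. The sysfs path and attribute name are purely illustrative assumptions; the real interface is whatever the merged drivers expose:

```python
def hbm_cache_power_command(enable: bool, device: str = "hbm_cache0") -> str:
    """Return a shell command toggling a (hypothetical) HBM cache power node.

    Both the device name and the sysfs path are invented for illustration;
    they are not taken from Huawei's actual patch series.
    """
    state = 1 if enable else 0
    return f"echo {state} > /sys/devices/platform/{device}/power_on"

# Power the cache down for a bandwidth-light batch job, then back up afterwards.
print(hbm_cache_power_command(False))
print(hbm_cache_power_command(True))
```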

While we don't have any concrete details on future Kunpeng SoCs, integrating HBM could potentially allow them to compete more effectively against other ARM-based server processors, as well as Intel’s latest Xeon and AMD EPYC offerings.

You might also like



from TechRadar - All the latest technology news https://ift.tt/fexBqHY

Could Apple's New Adaptive Power Feature Extend Your iPhone's Battery Life?

With this new feature being tested in the iOS 26 developer beta, you may be able to ditch the Low Power Mode setting in the future. from C...