Sunday, February 2, 2025

Best Web Hosting Services for 2025

We compared popular web hosting services like A2 Hosting, HostGator and many others to help you find the best service to fit your needs now and in the future.

from CNET https://ift.tt/DMxzlKV

Are Tips Taxable Income? Everything to Know When Filing Your Return

A lot of Americans work jobs where they can collect tips. If you're in that group, you need to read this.

from CNET https://ift.tt/fr9Dbvz

10 of the Best Peacock Shows to Watch Right Now

Check out the new comedy series Laid or the third season of The Traitors.

from CNET https://ift.tt/nuv9zWC

Latest Tech News


  • A shipping manifest has detailed what looks like a professional workstation card
  • It could possibly be the successor to Nvidia's RTX 6000 Ada, the most expensive graphics card in the world
  • Based on the RTX 5090, it is expected to have a whopping 96GB of memory, twice that of its predecessor

The GeForce RTX 5090, the latest flagship graphics card for gamers and creatives in Nvidia's GeForce 50 series, was unveiled at CES 2025 and has just gone on sale - but shortly before it did, rumors began to swirl of an RTX 5090 Ti model featuring a fully enabled GB202-200-A1 GPU and dual 12V-2×6 power connectors, theoretically allowing for up to 1,200 watts of power.

This speculation began following the appearance of a prototype image on the Chinese industry forum Chiphell - reporting on the image, ComputerBase said, “With 24,576 shaders, the GB202-200-A1 GPU is said to offer 192 active streaming multiprocessors, which were previously rumored to be the full expansion of the GB202 chip. The memory is said to continue to offer 32GB capacity, but with 32Gbps instead of 28Gbps, it will exceed the 2TB/s mark.”

Shortly after the engineering card surfaced online, ComputerBase also spotted shipping documents on NBD Data listing a graphics card with 96GB of GDDR7 memory, marked as “for testing.” It is a reasonable assumption that this unidentified model is actually a professional workstation card, potentially – let’s say probably – the RTX 6000 Blackwell.

Useful for AI applications

Nvidia RTX 6000 Blackwell (Image credit: NBD)

The GeForce RTX 5090 features 32GB of GDDR7, using sixteen 2GB modules connected through a 512-bit memory interface. 48GB would be possible if sixteen 3GB chips were used instead of 2GB chips.

If two of these 3GB chips were connected to each 32-bit controller, placing 16 chips on both the front and back of the graphics card in a "clamshell" configuration, the 96GB mentioned in the documents – which is twice as much as the RTX 6000 Ada, the most expensive graphics card in the world – would become a reality.
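
To make that memory arithmetic concrete, here is a minimal sketch in Python, based only on the bus width and chip capacities discussed above (the figures are illustrative, not confirmed specifications):

```python
# A 512-bit memory bus is served by sixteen 32-bit memory controllers.
BUS_WIDTH_BITS = 512
CONTROLLER_WIDTH_BITS = 32
controllers = BUS_WIDTH_BITS // CONTROLLER_WIDTH_BITS  # 16

def total_vram_gb(chip_capacity_gb, chips_per_controller):
    """Total memory for a given GDDR7 chip density and number of chips per controller."""
    return controllers * chips_per_controller * chip_capacity_gb

print(total_vram_gb(2, 1))  # RTX 5090 today: sixteen 2GB chips     -> 32
print(total_vram_gb(3, 1))  # same board with 3GB chips             -> 48
print(total_vram_gb(3, 2))  # clamshell: 3GB chips front and back   -> 96
```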

The shipping records indicate these GPUs use a 512-bit memory bus, reinforcing this theory. The internal PCB designation PG153, seen in the documents, aligns with known Nvidia Blackwell designs and has not yet appeared in any existing consumer graphics cards.

Nvidia is expected to introduce the RTX Blackwell series for workstations at its annual GPU Technology Conference (GTC 2025), so we should know more about them come March 2025. And yes, if you’re thinking 96GB of GDDR7 memory is overkill for gaming or creative purposes, I’d agree with you. It is a good amount for AI tasks though, so we can expect to see Nvidia announce an AI version of the RTX 6000 Blackwell when it finally takes the wraps off its next-gen product.

from Latest from TechRadar US in News,opinion https://ift.tt/v3i7J5X

Saturday, February 1, 2025

Healthy Aging Is as Easy as Getting Plenty of These 6 Essential Vitamins and Minerals

Whether through your diet or supplements, these are the top nutrients you need to keep your body in top shape as you age.

from CNET https://ift.tt/MohzFam

Latest Tech News


  • Mukesh Ambani's Reliance touches the lives of nearly everyone in India
  • Reliance plans to build a 3GW data center in India, which could become the largest in the world
  • Ambani has pledged to use green and renewable energy to power this giga data center

Mukesh Ambani’s Reliance Group, one of India's largest and most influential conglomerates, is developing a large-scale data center in Jamnagar - a small town in Gujarat that’s already home to Reliance’s major oil refining and petrochemical operations.

Reports from Bloomberg claim the data center, which could become the world’s largest, is expected to reach a total capacity of 3 gigawatts, significantly boosting India’s current data center capacity, which is estimated at under 1 gigawatt.

That will make it five times the size of Microsoft’s 600-megawatt facility in Boydton, Virginia.

Operational by 2027

Nvidia will provide Reliance Group with the AI chips it needs for the project, which comes at a time when tech firms are investing heavily in AI infrastructure. In the US, OpenAI, SoftBank, and Oracle recently announced Project Stargate, a $500 billion investment venture, and Meta’s CEO Mark Zuckerberg said on Facebook that his company was earmarking $65 billion in capital expenditure for 2025 and “building a 2GW+ data center so large it would cover a significant part of Manhattan.”

Reliance reportedly plans to power the facility primarily with renewable energy, integrating it with its existing solar, wind, and green hydrogen projects. However, Bloomberg believes achieving a stable energy supply may require backup from fossil fuels or other sources.

Although a Reliance spokesperson declined to provide further details on the Jamnagar project, they did point Bloomberg to previous remarks from Mukesh Ambani’s son Akash, CEO of Reliance Jio Infocomm, who claimed the company aims to complete the data center within 24 months.

Headquartered in Mumbai, Reliance was founded by Dhirubhai Ambani in 1966 as a small textile business. Over the decades, it expanded into petrochemicals, refining, and other industries, and following Dhirubhai’s death in 2002, Mukesh took control of the company and led its transformation into a global powerhouse.

Despite the conglomerate's success, it’s not clear how Reliance will fund the $20 billion to $30 billion the data center will reportedly cost. As Bloomberg notes, “Reliance Industries Ltd., the group’s primary listed entity, has the equivalent of about $26 billion on its balance sheet.”

from Latest from TechRadar US in News,opinion https://ift.tt/whue6Wx

Friday, January 31, 2025

Best Internet Providers in Waukesha, Wisconsin

Get the best internet in Waukesha with fiber, cable and wireless options from providers like AT&T, Spectrum and T-Mobile -- here’s how they compare.

from CNET https://ift.tt/7kp6PwR

Latest Tech News


  • Nvidia's H800 was launched in March 2023 and is a cut-down version of the H100
  • It is also significantly slower than Nvidia's H200 and AMD's Instinct range
  • These artificial constraints have forced DeepSeek's engineers to innovate

It was widely assumed that the United States would remain unchallenged as the global AI superpower, particularly after President Donald Trump’s recent announcement of Project Stargate - a $500 billion initiative to bolster AI infrastructure across the US. However, this week saw a seismic shift with the arrival of China’s DeepSeek. Developed at a fraction of the cost of its American rivals, DeepSeek seemingly came out of nowhere swinging, making such an impact that it wiped $1 trillion from the market value of US tech stocks, with Nvidia the major casualty.

Obviously, anything developed in China is going to be highly secretive, but a tech paper published a few days before the chat model stunned AI watchers does give us some insight into the technology that drives the Chinese equivalent of ChatGPT.

In 2022, the US blocked the importation of advanced Nvidia GPUs to China to tighten control over critical AI technology, and has since imposed further restrictions, but evidently that hasn’t stopped DeepSeek. According to the paper, the company trained its V3 model on a cluster of 2,048 Nvidia H800 GPUs - crippled versions of the H100.

Training on the cheap

The H800, launched in March 2023 to comply with US export restrictions to China, features 80GB of HBM3 memory with 2TB/s bandwidth.

It lags behind the newer H200, which offers 141GB of HBM3e memory and 4.8TB/s bandwidth, and AMD’s Instinct MI325X which outpaces both with 256GB of HBM3e memory and 6TB/s bandwidth.

Each node in the cluster DeepSeek trained on houses 8 GPUs connected by NVLink and NVSwitch for intra-node communication, while InfiniBand interconnects handle communication between nodes. The H800 has lower NVLink bandwidth compared to the H100, and this, naturally, affects multi-GPU communication performance.

DeepSeek-V3 required a total of 2.79 million GPU-hours for pretraining and fine-tuning on 14.8 trillion tokens, using a combination of pipeline and data parallelism, memory optimizations, and innovative quantization techniques.

The Next Platform, which has done a deep dive into how DeepSeek works, says “At the cost of $2 per GPU hour – we have no idea if that is actually the prevailing price in China – then it cost a mere $5.58 million to train V3.”
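
Those numbers are easy to sanity-check. Here's a quick back-of-the-envelope sketch in Python using only the figures quoted in this article, and treating the $2 per GPU-hour rate as an assumption rather than a known price:

```python
# Figures from the DeepSeek-V3 paper plus The Next Platform's assumed hourly rate.
total_gpus = 2048          # Nvidia H800s in the training cluster
gpus_per_node = 8          # linked by NVLink/NVSwitch inside each node
gpu_hours = 2_790_000      # pretraining plus fine-tuning
cost_per_gpu_hour = 2.00   # assumed USD rate, not a confirmed price

nodes = total_gpus // gpus_per_node     # 256 nodes, connected over InfiniBand
days = gpu_hours / total_gpus / 24      # roughly 57 days if the cluster ran flat out
cost = gpu_hours * cost_per_gpu_hour    # about $5.58 million

print(f"{nodes} nodes, ~{days:.0f} days, ${cost:,.0f}")
```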

from Latest from TechRadar US in News,opinion https://ift.tt/V0bjifp

Thursday, January 30, 2025

Best High-End Bluetooth Headphones and Earbuds

Want one set of headphones for multiple devices without the hassle of switching? Get the best multipoint Bluetooth headphones or earbuds.

from CNET https://ift.tt/zU3Yjm1

Latest Tech News


  • Following the $500 billion Project Stargate launch, Meta is also doling out the dollars
  • Meta's $65 billion is lower than Microsoft's $80 billion commitment
  • AWS is set to spend more than $75 billion while Google has yet to say how much it will spend

If you have a few hundred billion dollars burning a hole in your pocket, you’re probably considering spending it on an AI data center or two. US President Donald Trump recently announced OpenAI, SoftBank, and Oracle would launch a new AI infrastructure venture called Project Stargate, investing $500 billion over four years across the US. OpenAI noted that $100 billion would be made available “immediately.”

Elon Musk, no stranger to building AI data centers and perhaps a bit miffed about being left out, claimed that Project Stargate doesn’t actually have the money, stating, “SoftBank has well under $10 billion secured.”

While that was unfolding, Meta CEO Mark Zuckerberg made an announcement on Facebook, revealing the company is “building a 2GW+ data center so large it would cover a significant part of Manhattan,” while also outlining other AI plans.

A defining year for AI

Zuckerberg's full post reads: “This will be a defining year for AI. In 2025, I expect Meta AI will be the leading assistant, serving more than 1 billion people. Llama 4 will become the leading state-of-the-art model, and we’ll build an AI engineer that will start contributing increasing amounts of code to our R&D efforts. To power this, Meta is building a 2GW+ data center so large it would cover a significant part of Manhattan. We’ll bring online ~1GW of compute in '25 and end the year with more than 1.3 million GPUs. We’re planning to invest $60-65 billion in capex this year while also growing our AI teams significantly, and we have the capital to continue investing in the years ahead. This is a massive effort, and over the coming years, it will drive our core products and business, unlock historic innovation, and extend American technology leadership. Let’s go build!”

$65 billion on capital expenses certainly isn’t nothing, but it pales in comparison to the $80 billion Microsoft plans to invest in fiscal 2025 or the $75 billion-plus AWS intends to spend this year. We don't know how much Google will be pumping into AI infrastructure, but it's likely to be a similar figure.

That said, Meta's investment is higher than most would have expected. Reuters points out: “The $60 billion to $65 billion capital spending outlined for 2025 would mark a significant jump from the company's estimated expenditure of $38 billion to $40 billion last year. It is also above analysts' estimates of $50.25 billion for 2025, according to LSEG data.”

from Latest from TechRadar US in News,opinion https://ift.tt/0apkCl4

Wednesday, January 29, 2025

NordVPN Launches NordWhisper Protocol Designed to Bypass VPN Blocks

NordVPN’s newly launched NordWhisper VPN protocol aims to bypass VPN restrictions by mimicking traditional web traffic.

from CNET https://ift.tt/8B9yPO1

Latest Tech News


  • Wine 10 is now out with more than 6,000 changes in its release log
  • The compatibility layer - which turns 32 this year - allows Linux/Unix users to run Windows software
  • Unlike virtual machine solutions, the open source project doesn't require a copy of Windows

If you want to run Windows software on a Linux OS, you'll need to install the Wine compatibility layer. Wine, a recursive acronym for "Wine Is Not an Emulator," provides a runtime environment for running Windows applications natively on Linux without virtualization. It can also be configured as the default installer for Windows software, simplifying the setup process.
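
As a rough illustration of how that looks in practice, here's a minimal Python sketch that checks for an installed Wine binary and launches a Windows program through it (it assumes Wine is already installed and on the PATH, and uses Wine's bundled notepad.exe as the example application):

```python
import shutil
import subprocess

# Locate the Wine binary; bail out if the compatibility layer isn't installed.
wine = shutil.which("wine")
if wine is None:
    raise SystemExit("Wine is not installed or not on PATH")

# Print the installed version, e.g. "wine-10.0".
subprocess.run([wine, "--version"], check=True)

# Launch a Windows application through Wine; notepad.exe is one of Wine's built-in programs.
subprocess.run([wine, "notepad.exe"], check=False)
```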

First released on July 4, 1993, Wine was created by Bob Amstadt (the project’s original lead) and Eric Youngdale as an open source implementation of the Windows API for Unix-based systems. Over the past 32 years, it has evolved into a powerful tool for bridging the gap between Windows and Linux environments, and after a year of development, the developers behind it have announced the stable release of Wine 10.

This new version includes over 6,000 individual changes. While many are minor fixes, there are some notable highlights, including full support for the ARM64EC architecture and hybrid ARM64X modules, allowing seamless integration of ARM64EC and plain ARM64 code.

OpenGL support

High-DPI support has been improved in this release, with automatic scaling for non-DPI aware applications. Vulkan enhancements now support child window rendering and Vulkan Video extensions.

Direct3D updates include a new HLSL-based fixed function pipeline, Vulkan shader backend improvements, and dynamic state extensions to reduce stuttering. And, in a welcome move, OpenGL is now supported within the Wayland driver, which is enabled by default. This addition will allow for better integration with Wayland-based Linux environments.

Other changes include an experimental FFmpeg-based multimedia backend, improved HID and touchscreen input handling, enhanced Unicode and locale compatibility, and solid RPC/COM support on ARM platforms.

A number of other tweaks have been made too, such as process elevation, improved serial port event handling, and support for modern vector extensions like AVX-512. Developers also benefit from updated build tools, static analysis options, and bundled library upgrades like Capstone, Vkd3d 1.14, and FFmpeg.

from Latest from TechRadar US in News,opinion https://ift.tt/kUGA5D8

Tuesday, January 28, 2025

Best Website Builder for 2025

A website builder is a great tool for building your personal brand in 2025. These are the best ones, regardless of your experience level.

from CNET https://ift.tt/Jo2V1gC

Latest Tech News


  • An AI startup has teased a design for a keyboard dock for smartphones
  • With a beige color scheme and a pseudo-mechanical keyboard, it has a strong 1980s feel to it
  • It is expected to cost less than $250 and will appeal to those looking for a Psion 5MX replacement

Amber.Page, an AI startup for writers, has teased a new keyboard case that transforms an iPhone into a portable, laptop-like writing device.

Shared by company founder Justin Mitchell on Threads, the renders of what he’s calling the AmberDeck show a clamshell-style design reminiscent of retro tech, like a mini Tandy TRS-80 Model 100 or the Psion 5MX. The latter had a similar design with what was widely considered one of the best keyboards ever made for a device of its size.

The device offers a minimalist setup for drafting articles, editing text, and even tackling writer’s block with the support of the startup’s AI-powered writing assistant.

Mitchell wrote, “Been cooking on some hardware for Amber over the holiday break. Always wanted a clamshell iPhone keyboard for focused writing on the go, so I made one.”

The case isn’t intended as a permanent enclosure for your phone; rather, it serves as a docking station, turning your device into a focused typing tool with a high-travel pseudo-mechanical keyboard that features a compact 60% layout. Yanko Design says when docked, you won’t be able to use your phone for usual tasks such as accessing the camera or making calls, but if you're focused on writing that will be a good thing.

A work in progress

The clamshell case includes an adjustable hinge for screen positioning and a charging port for the keyboard itself. Yanko Design notes the case doesn’t appear to support pass-through charging for the phone, and the keyboard lacks backlighting, which will limit its usability in low-light settings. That said, this is still only a prototype, so things could change.

Indeed, in another thread, Mitchell has updated the design based on feedback and suggestions from his social media followers, including changes made to improve functionality and compatibility. In the updated design, the hinge is centered and features a dual-action mechanism, allowing the case to lay completely flat thanks to an internal gap inside the top shell.

The arrow keys have been reconfigured to a standard layout, with the shift key moved to the right and enlarged for a more intuitive typing experience. A gap has been added around the faceplate, making it easy to pop off and customize or swap as needed. To accommodate a wider range of devices, including Android phones, the inset on the inside has been squared off for better versatility. Additionally, the lip of the case has been moved to the top.

Mitchell has suggested the AmberDeck could sell for under $250, and while these early renders are just a preview, the case could potentially appeal to writers, journalists, and editors looking to turn their phones into compact writing devices that can be used anywhere.

from Latest from TechRadar US in News,opinion https://ift.tt/jOhkb3U

Monday, January 27, 2025

Tax Season 2025 Opens Today: Our Essential Cheat Sheet for Filing Your Tax Return

Taxes are no doubt confusing, but this handy guide can help you finish your taxes and get your refund from the IRS.

from CNET https://ift.tt/NHxUGD1

Latest Tech News

They say fortune favors the bold, so why not rebel against cookie-cutter colorways and mix things up with some eye-catching tech instead? As a...