Saturday, November 9, 2024

Best Solar Panel Installation Companies in Alaska

Discover a local solar installer to build your home's renewable energy system.

from CNET https://ift.tt/7TkA4sh

7 Ways to Use Lemons to Naturally Clean Your Home

There's much more to lemons than their pleasant scent. The acidic fruit is a natural cleaner that can help you keep your house spick and span.

from CNET https://ift.tt/3VQimar

Latest Tech News


  • Samsung to release 400-layer NAND chip for AI data centers
  • New BV NAND tech boosts density and minimizes heat buildup
  • Plans for 1,000-layer NAND by 2030 to expand capacity

Samsung is working to launch a record-breaking 400-layer vertical NAND flash chip by 2026, reports have claimed.

A report by the Korea Economic Daily says Samsung’s Device Solutions (DS) division aims to advance the NAND flash market with its cutting-edge V10 NAND, designed to meet surging demand in AI data centers.

The company’s memory roadmap, as outlined in the report, shows plans for an advanced 10th-generation NAND that will utilize bonding technology to separately build memory cells and the peripheral circuitry on different wafers, later fusing them into a single chip. Known as bonding vertical NAND flash (BV NAND), this new approach minimizes heat buildup and maximizes both capacity and performance, creating what Samsung has described as a “dream NAND for AI.”

1,000 layers by 2030

The BV NAND design, boasting a 1.6x increase in bit density per unit area, supports ultra-high-capacity solid-state drives (SSDs) ideal for AI applications.

Samsung’s current 286-layer V9 NAND chips marked a significant milestone, but the 400-layer V10 is expected to redefine capacity limits, potentially breaking the 200TB storage threshold for ultra-large AI hyperscaler SSDs, while improving energy efficiency.
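Those headline figures can be put in rough perspective with some back-of-envelope arithmetic. The sketch below is purely illustrative: the 128TB baseline is an assumption standing in for today's largest enterprise SSDs, and real capacities depend on die size, cell design and packaging, none of which are public.

```python
# Back-of-envelope arithmetic on the reported NAND figures.
v9_layers = 286          # Samsung's current V9 NAND
v10_layers = 400         # planned V10 NAND
layer_ratio = v10_layers / v9_layers   # ~1.4x more layers

density_gain = 1.6       # reported BV NAND bit-density gain per unit area
baseline_tb = 128.0      # hypothetical ultra-large SSD capacity today (assumption)
projected_tb = baseline_tb * density_gain  # how a 1.6x density gain clears 200TB

print(f"Layer count grows {layer_ratio:.2f}x; "
      f"a {baseline_tb:.0f}TB-class drive scales to ~{projected_tb:.0f}TB")
```

On these assumed numbers, a 1.6x density gain is already enough to push a 128TB-class drive past the 200TB threshold mentioned in the report.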

For future releases, the world’s largest memory chipmaker plans to introduce its 11th-generation V11 NAND in 2027 with a 50% faster data transfer speed, further optimizing performance for high-demand data storage needs.

Samsung’s ambitious NAND roadmap extends even further, with plans for chips exceeding 1,000 layers by 2030, KED reports. This advancement aims to keep Samsung at the forefront of the high-capacity NAND market, where demand is spurred by AI applications that require expansive storage solutions to process vast volumes of data.

In the DRAM sector, Samsung aims to release sixth-generation 1c DRAM and seventh-generation 1d DRAM by the end of 2024, targeting use in high-performance AI chips. According to the Korea Economic Daily report, the company also has plans for sub-10 nm 0a DRAM by 2027, using a vertical channel transistor structure for greater stability and efficiency.


from TechRadar - All the latest technology news https://ift.tt/FaPXsG2

Friday, November 8, 2024

GOP, Cruz Tell FCC 'Pencils Down' After Trump Win

Republican lawmakers asked FCC Chairwoman Jessica Rosenworcel to halt any new work that could be considered controversial.

from CNET https://ift.tt/dPnvMaz

Latest Tech News


  • Trillium offers 4x training boost, 3x inference improvement over TPU v5e
  • Enhanced HBM and ICI bandwidth for LLM support
  • Scales up to 256 chips per pod, ideal for extensive AI tasks

Google Cloud has unleashed its latest TPU, Trillium, the sixth-generation model in its custom AI chip lineup, designed to power advanced AI workloads.

First announced back in May 2024, Trillium is engineered to handle large-scale training, tuning, and inferencing with improved performance and cost efficiency.

The release forms part of Google Cloud’s AI Hypercomputer infrastructure, which integrates TPUs, GPUs, and CPUs alongside open software to meet the increasing demands of generative AI.

A3 Ultra VMs arriving soon

Trillium promises significant improvements over its predecessor, TPU v5e, with over a 4x boost in training performance and up to a 3x increase in inference throughput. Trillium delivers twice the HBM capacity and doubled Interchip Interconnect (ICI) bandwidth, making it particularly suited to large language models like Gemma 2 and Llama, as well as compute-heavy inference applications, including diffusion models such as Stable Diffusion XL.

Google is keen to stress Trillium’s focus on energy efficiency as well, with a claimed 67% increase compared to previous generations.

Google says its new TPU has demonstrated substantially improved performance in benchmark testing, delivering a 4x increase in training speeds for models such as Gemma 2-27b and Llama2-70B. For inference tasks, Trillium achieved 3x greater throughput than TPU v5e, particularly excelling in models that demand extensive computational resources.

Scaling is another strength of Trillium, according to Google. The TPU can link up to 256 chips in a single, high-bandwidth pod, expandable to thousands of chips within Google’s Jupiter data center network, providing near-linear scaling for extensive AI training tasks. With Multislice software, Trillium maintains consistent performance across hundreds of pods.
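The idea of near-linear scaling can be made concrete with a toy model. In the sketch below, the per-chip throughput figure and the 95% scaling efficiency are both invented for illustration, since Google has not published exact numbers:

```python
def cluster_throughput(n_chips, per_chip_tflops, efficiency=0.95):
    """Toy model of aggregate accelerator-pod throughput.

    efficiency < 1.0 stands in for interconnect and software overhead:
    'near-linear' scaling means the total grows almost, but not exactly,
    in proportion to chip count.
    """
    return n_chips * per_chip_tflops * efficiency

# One 256-chip Trillium pod vs. a multi-pod job over the Jupiter fabric
# (the 900 TFLOPS per-chip figure is hypothetical).
pod = cluster_throughput(256, per_chip_tflops=900)
multi_pod = cluster_throughput(4096, per_chip_tflops=900)
print(f"{pod / 1000:.0f} PFLOPS per pod, {multi_pod / 1000:.0f} PFLOPS across 4,096 chips")
```

The point of the model is only that aggregate throughput stays close to the ideal `n_chips * per_chip` line as pods are added, rather than flattening out.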

Tied in with the arrival of Trillium, Google also announced the A3 Ultra VMs featuring Nvidia H200 Tensor Core GPUs. Scheduled for preview this month, they will offer Google Cloud customers a high-performance GPU option within the tech giant’s AI infrastructure.


from TechRadar - All the latest technology news https://ift.tt/Ane2uSP

Thursday, November 7, 2024

Best Solar Panel Installation Companies in Tucson

With all the sun you see in Tucson getting solar just makes sense. These are our suggestions for the best solar installation companies in the area.

from CNET https://ift.tt/421t8NU

Latest Tech News


  • Minisforum MGA1 boosts graphics with AMD Radeon 7600M XT
  • Supports three displays, 8K at 60Hz via HDMI and DisplayPort
  • Features OCuLink, USB 3.2, USB-C ports; requires OCuLink connectivity

Minisforum, best known for its range of mini PCs like the AtomMan Series and EliteMini, has launched the MGA1, an external GPU docking station. Powered by the AMD Radeon 7600M XT GPU and backed with 8GB of GDDR6 RAM, the MGA1 gives users a straightforward way to boost a connected device's graphics capabilities.

Built with advanced RDNA 3.0 architecture and a 6nm production process, and functioning as both an eGPU and a docking station, the MGA1 is designed for use with compatible laptops and mini PCs. The substantial upgrade in graphics performance it provides makes it well suited to graphics-intensive tasks like video editing, 3D rendering and even gaming.

The MGA1 supports up to three displays through its HDMI 2.1 and dual DisplayPort 2.0 connections, each capable of 8K at 60Hz for ultra-high-resolution and smooth refresh rates. It also features three USB 3.2 ports with 10Gbps transfer speeds, an OCuLink 4i port, and a USB-C 3.1 port with 65W Power Delivery, making it suitable for charging devices like - you guessed it - Minisforum’s own mini PCs.

As Tom’s Hardware notes, “By adding an eGPU like the MGA1 to your system, you’d get the best of both worlds - a mini-PC that won’t take up much space on your desk and easy to carry anywhere, and a gaming PC that will give you the performance you need to play AAA titles.”

The docking station includes high-speed data transfer through its OCuLink 4i port with PCIe 4.0 x4 compatibility, ensuring reliable performance for demanding tasks.

But - and it will be a deal breaker for many - USB4 and Thunderbolt 4 connectivity are missing, so you'll need to make sure you have an OCuLink port on your laptop or mini PC before buying the MGA1.

Priced at a rather steep $559, the MGA1 provides a balanced mix of power and connectivity in a compact form, though there are more versatile eGPUs available that may be better suited for use with a broader range of devices, such as GPD's newly upgraded G1.


from TechRadar - All the latest technology news https://ift.tt/46aNwJs

Wednesday, November 6, 2024

'Look Back' Anime Movie Hits Streaming: When to Watch on Prime Video

Tatsuki Fujimoto adapted Look Back for the silver screen, and you can watch it at home.

from CNET https://ift.tt/pJrEy6b

Latest Tech News

The Samsung Galaxy Ring took the wearables world by storm, with the product shoving this smartwatch alternative into the spotlight – and helping to generate more interest in competitors in the best smart ring category from the likes of Oura and RingConn. Now it looks like Samsung is gearing up to launch an upgrade to this product line shortly, with a tipster’s comments making it sound like we could see the Galaxy Ring 2 fairly soon.

Korean leaker Lanzuk – who has a track record of spoiling Samsung’s release plans – says Samsung is “planning to launch its Galaxy Ring 2 model a bit earlier than originally scheduled” (translated from Korean). Specifics are thin on the ground, but that could mean we’ll get a Galaxy Ring 2 in 2025, and maybe even in the first half of the year at that.

In fairness, the Galaxy Ring was first shown off in January 2024, so an early Galaxy Ring 2 announcement was also already kind of on the cards. However, we didn’t get our hands on the Galaxy Ring until July, so in 2025, we could get the Galaxy Ring 2 in, say, February instead of needing to wait half a year.

Alternatively, Samsung may want to stick with a more condensed reveal-and-release schedule some time in the middle of the year (say May or June), especially as a refresh after barely six months could upset purchasers of the original Galaxy Ring. A yearly refresh is expected; a twice-yearly one is far less likely.

'More features' on the way too

Will Samsung take cues from Oura? (Image credit: Oura)

Beyond teasing the release date, Lanzuk added that the device will supposedly be thinner, have a longer battery life, and contain “more features.” Again, details are light, but this could include sleep apnea detection, expanded gesture controls, or improved fitness tools.

Perhaps it’ll also include that adaptive sizing feature teased by a recently awarded Samsung patent, though we’ll have to wait and see what’s announced.

On this note, as with all leaks we should take Lanzuk’s blog post and our speculation with a pinch of salt. Until Samsung makes an announcement we don’t know what we’ll get from the Galaxy Ring 2 – nor when it will launch, assuming it does ever launch.


from TechRadar - All the latest technology news https://ift.tt/YKu0yM3

Tuesday, November 5, 2024

Best Smart Scale for 2024

Stay on top of your weight journey the smart way with CNET's top picks of the best smart scales, tested by our health experts.

from CNET https://ift.tt/Mx4orYP

Latest Tech News

Google unveiled its Axion processors at Google Next '24, revealing custom Arm-based CPUs built on Neoverse V2 architecture and designed to support a wide range of data center workloads, including web servers, media processing, and AI applications.

Google's main cloud rivals, Amazon and Microsoft, already have their own CPUs based on Arm technology, but at launch, Google stated its chips would offer up to 30% better performance than current Arm instances and up to 60% higher energy efficiency compared to similar x86-based options.

Fast forward to now, and Google Cloud has begun offering C4A virtual machines powered by Axion processors. The instances are optimized for various general-purpose workloads, such as web and app servers, containerized microservices, databases, and AI inference.

The core count mystery

With its Titanium technology - a system of custom silicon microcontrollers and tiered scale-out offloads designed to optimize performance for customer workloads - Google reports up to 65% better price-performance and 60% greater energy efficiency than comparable x86-based VMs, along with 30% better price-performance for MySQL and 35% for Redis workloads.

Although certain key details of the Axion processors remain unknown, such as core count, Google confirms the chips have been designed exclusively for its data centers, with no plans for commercial sale. Google services - including Bigtable, Spanner, BigQuery, F1 Query, Blobstore, Pub/Sub, Google Earth Engine, and the YouTube Ads platform - have already begun deploying Axion-based servers.

"Spanner is one of the most critical and complex services at Google, powering products including YouTube, Gmail, and Google Ads," noted Andi Gutmans, VP/GM of Databases at Google. "In our initial tests on Axion processors, we've observed up to 60% better query performance per vCPU over prior-generation servers. As we scale out our footprint, we expect this to translate to a more stable and responsive experience for our users, even under the most demanding conditions."

C4A VMs are currently available in Google Cloud regions across the US, Europe, and Asia, with multiple configuration options. By keeping Axion exclusive to its own platform and not bringing the chip to the open market, Google aims to strengthen its own cloud ecosystem, appealing to enterprises looking for more powerful, energy-efficient options.


from TechRadar - All the latest technology news https://ift.tt/I02oQOt

Monday, November 4, 2024

Best Food Processors of 2024: KitchenAid, Cuisinart and More

If you love being in the kitchen and want to save time on chopping, slicing and mixing, these food processors can be a game changer.

from CNET https://ift.tt/8hsBtWX

Latest Tech News

Seagate and BAE Systems have tested the first high-capacity data storage solution designed for use in space.

The hardened SSD was evaluated aboard the International Space Station as part of a mission aimed at improving data storage for Low Earth Orbit (LEO) satellites, addressing challenges such as heat dissipation, unpressurized environments, and the absence of conventional cooling.

This technology could ultimately expand CDNs and support AI-driven applications in space. By adding storage to satellite infrastructures, AI inferencing and real-time analysis could reach previously inaccessible regions, providing last-mile connectivity where fiber or cell networks are absent.

Drives in space

Seagate’s “Space Drive” was part of a broader payload by BAE Systems which included Linux-based software for real-time data processing.

This software enables containerized applications that can be updated in orbit, adapting to the evolving demands of space-based systems.

Additionally, the payload contained a radio frequency sounder and dual-band short-wavelength infrared (SWIR) camera for enhancing atmospheric measurement capabilities, hurricane modeling, and weather forecasting.

“Our team was able to assemble, integrate, and test this payload in just eight months,” Steve Smith, vice president of engineering, science and analysis for BAE Systems Space & Mission Systems, told Aerospace Manufacturing.

The 2TB Seagate SSD, which uses PCIe Gen3 x4 connectivity and can be seen in the “Terrestrial Demo Unit” photo below, has been specially designed to withstand the harsh conditions of space, and Seagate plans to sell it in 2025. Delivered to the ISS via a NASA resupply mission and assembled by the astronauts onboard, the device achieved impressive speeds of over 2Gbps using the Seagate SSDs.

Set for one year, the mission will conclude with the payload’s return to Earth for analysis. Engineers from BAE Systems and Seagate will examine the effects of space exposure on the SSD's performance and durability and use this data to refine future designs, advancing resilient storage solutions for space-based applications.

Space Drive customer development unit that Seagate plans to sell in 2025

(Image credit: Seagate)


from TechRadar - All the latest technology news https://ift.tt/ZLDW6kR

Sunday, November 3, 2024

Best Christmas Decorations on Amazon in 2024

We scoured Amazon and found all the best Christmas decorations, from lights to trees and everything in between. All it takes is a few clicks to make your season merry and bright.

from CNET https://ift.tt/RDuK7Zg

Latest Tech News

Over 27 million tons of single-use polystyrene packaging are produced worldwide each year, yet only 12% is recycled - most ends up in landfills after its initial use.

Researchers at RMIT University and Riga Technical University have developed an innovative way to generate electricity using waste polystyrene, addressing both energy needs and the environmental impact of the ubiquitous packaging material.

The invention repurposes discarded polystyrene into a device that generates static electricity from motion, such as wind or airflow. The device is a thin patch, made from multiple layers of polystyrene, each around "one-tenth the thickness of a human hair," according to lead researcher Dr. Peter Sherrell, who went on to explain, “We can produce this static electricity just from air blowing on the surface of our clever patches, then harvest that energy.”

Producing electricity consistently

The patch, which can capture turbulent airflow from air conditioning units, could reduce energy demand by up to 5% and lower the carbon footprint of these systems. Tests show the device can reach up to 230 volts, comparable to household voltage but at a lower power level.

Sherrell noted, “The biggest numbers come from a compression and separation, where you've got faster speeds and bigger motion, while smaller motion generates less energy. This means that in addition to air conditioners, integrating our patches in high traffic areas such as underground walkways could supplement local energy supply without creating additional demand on the grid."

The device’s longevity stems from the same properties that make polystyrene slow to decompose. “The great thing here is the same reason that it takes 500 years for polystyrene to break down in landfill makes these devices really stable – and able to keep making electricity for a long time,” Sherrell said.

This process involves learning how to modify plastics to optimize their energy-generating potential: “We've studied which plastic generates more energy and how when you structure it differently – make it rough, make it smooth, make it really thin, make it really fat – how that changes all this charging phenomenon.”

This static electricity generation project is part of the team’s ongoing research into triboelectric nanogenerators, as published in Advanced Energy and Sustainability Research. RMIT has filed a provisional patent for its device and is now looking for industry partners to help develop the technology for commercial applications.


from TechRadar - All the latest technology news https://ift.tt/yY7WKHC

Could Apple's New Adaptive Power Feature Extend Your iPhone's Battery Life?

With this new feature being tested in the iOS 26 developer beta, you may be able to ditch the Low Power Mode setting in the future. from C...