Monday, July 8, 2024

Latest Tech News

Windows 11 is getting fixes for various frustrating stuttering issues that crop up here and there in its interface.

Windows Latest points out there are numerous problems around glitchy animations and generally sluggish loading when using certain parts of the desktop interface in Windows 11, both in testing and in the release version of the OS. However, the good news is that the 24H2 update is bringing fixes for all these gremlins when it’s rolled out later this year.

The problems witnessed in test builds of Windows 11 include freezing or stuttering with Task View previews and other UI elements. However, Microsoft has implemented a cure in the recent build 26100 (24H2 in the Release Preview Channel).

Microsoft observed: “This update addresses an issue believed to be the underlying cause of some Insiders noticing stutters in some animations recently (dropping frames), particularly with Task View.”

More broadly, Windows Latest notes that issues around sluggish or buggy animations are evident in older Windows 11 versions already out there. Furthermore, the tech site also highlights a glitch with the Quick Settings interface, which has been slow to appear when invoked on Windows 11 23H2 (or indeed older versions of the OS).

The Quick Settings panel itself may appear swiftly enough, but some elements may not actually become visible – and therefore usable – for a few seconds. Again though, with the 24H2 update this problem has been ironed out, we’re told.

Small bugs, but seriously annoying glitches

While these glitches might sound like little things – and indeed they are in the grand scheme of bugs, which can really mess with your PC in the worst cases – they are still wrinkles that can hamper the overall experience of using an operating system.

When you flit around from menu to menu in Windows 11, if, at times, you’re having to pause to wait for a panel of options to actually appear, it feels like using a piece of beta software more than anything else. And true, some of these problems are indeed only in testing right now – but not all of them, with others actually affecting the finished version of Windows 11.

A modern OS must not only look good – and Windows 11 has done some impressive work on that front – but it has to feel good, too, which means no unresponsive bits of interface ruining the smooth flow of navigating around the desktop. Still, at least these fixes are inbound now, and should be here before too long – we’re expecting the 24H2 update to arrive in September or thereabouts.

Note that the 24H2 update is not to be confused with Windows 11 24H2 as installed on Copilot+ PCs – the latter was needed for these Arm-based machines, but it doesn’t contain all the fixes and features that’ll be delivered in September (in theory). Think of it as a barebones version, with the meat still to be added when Microsoft finalizes everything for the full 24H2 update (which is officially still in testing).




from TechRadar - All the latest technology news https://ift.tt/a2xHY7K

Sunday, July 7, 2024

Best Extended July 4th Mattress Deals: 23 Different Designs to Find the Best Night's Sleep

Shop extended July 4th sales on mattresses and bedding from Purple, DreamCloud, Nectar and other top brands to score the best deals.

from CNET https://ift.tt/RT0veXA

Latest Tech News

Japanese memory giant Kioxia says it has begun sampling shipments of 2Tb QLC devices, utilizing its eighth-generation BiCS FLASH 3D flash memory technology. 

This new chip will be used in SSDs, servers, and other forms of digital storage hardware that require high-capacity, high-efficiency memory components.

Kioxia's BiCS FLASH technology expands the memory die vertically and laterally, while the integration of CBA (CMOS directly Bonded to Array) technology supports denser memory creation and interface speeds of up to 3.6Gbps. The new product features a bit density about 2.3 times higher than Kioxia's previous fifth-generation QLC device and 70% greater write power efficiency. With a 16-die stacked architecture in a single memory package, Kioxia says its new QLC device can achieve 4TB capacity.
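As a quick sanity check, the quoted 4TB package capacity follows directly from stacking sixteen 2Tb dies. A back-of-the-envelope sketch, assuming the binary prefixes conventionally used for flash die capacities:

```python
# Back-of-the-envelope check of Kioxia's quoted capacity figures.
# Assumes binary prefixes (1 Tbit = 1024 Gbit), as is conventional
# for flash die capacities.
DIE_CAPACITY_TBIT = 2    # eighth-gen BiCS FLASH QLC die
DIES_PER_PACKAGE = 16    # 16-die stacked architecture

die_capacity_gbyte = DIE_CAPACITY_TBIT * 1024 / 8            # bits -> bytes
package_capacity_tbyte = die_capacity_gbyte * DIES_PER_PACKAGE / 1024

print(f"Per die: {die_capacity_gbyte:.0f}GB")          # 256GB
print(f"Per package: {package_capacity_tbyte:.0f}TB")  # 4TB
```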

Pure Storage to adopt the technology

Hideshi Miyajima, Chief Technology Officer of Kioxia, said, “With its industry-leading high bit density, high speed data transfer, and superior power efficiency, the 2Tb QLC product will offer new value for rapidly emerging AI applications and large storage applications demanding power and space savings."

Pure’s DirectFlash Module (DFM) storage (which allows all-flash arrays to communicate directly with raw flash storage) already offers improved density and efficiency, as well as longer life spans compared to SSDs. The firm has previously said it will release 150TB DFMs in 2025, with the aim of shipping 300TB DFMs by 2026. Incorporating Kioxia’s technology in its products will assist that aim. 

Kioxia also added a new 1Tb QLC memory device to its portfolio. Compared to the 2Tb QLC, the 1Tb device provides approximately 30 percent faster sequential write performance and about a 15 percent improvement in read latency, and will be used in high-performance applications including client SSDs and mobile devices.

Charles Giancarlo, Chief Executive Officer of Pure Storage, said, "We have a long-standing relationship with Kioxia and are delighted to incorporate their eighth-generation BiCS FLASH 2Tb QLC flash memory products to enhance the performance and efficiency of our all-flash storage solutions. Pure’s unified all-flash data storage platform is able to meet the demanding needs of artificial intelligence as well as the aggressive costs of backup storage.” 




from TechRadar - All the latest technology news https://ift.tt/1wA7lnr

Saturday, July 6, 2024

Best Solar Panel Installation Companies in North Carolina

North Carolina has multiple programs designed to help make the switch to clean energy easier and several great solar panel companies to choose from. Here's what you need to know to get solar in the Tar Heel State.

from CNET https://ift.tt/lvzBLRj

Latest Tech News

Despite US trade restrictions aimed at keeping advanced chips and chip-making equipment out of China, domestic semiconductor production continues to impress there. 

Loongson recently informed investors that the first samples of the Loongson 3C6000/3D6000/3E6000 series of server-grade processors have been successfully returned from the fab and are "meeting expectations". In line with its roadmap, the release is on track for Q4 2024.

Loongson claims that its 3C6000 design, a single chip with 16 cores and 32 threads, significantly improves the performance-price ratio of its server CPUs.

The chip features a hexa-issue LA664 processing core, which Loongson says doubles the general processing performance compared to the previous generation 3C5000.

Additionally, it includes DDR4-3200 4x4 RAM, boosting memory bandwidth multiple times over its predecessor. The PCIe 4.0 x64 interface also enhances IO performance significantly compared to the 3C5000. The 3C6000 supports high-speed national encryption standards calculation, with SM3 encryption bandwidth exceeding 20Gbps.

Loongson's 3D6000 contains two 3C6000 chips connected via "Loongson Coherent Link" technology, creating a 32-core/64-thread processor, while the 3E6000 connects four 3C6000 chiplets for 64 cores and 128 threads. Chiplet architectures are increasingly recognized as the future of microprocessors, and things are no different in China.
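The core and thread counts scale exactly as you'd expect from the chiplet math. A simple illustration based on the figures above, assuming SMT2 (2 threads per core, implied by the 16-core/32-thread 3C6000):

```python
# How Loongson's chiplet configurations scale, per the figures above.
# Assumes SMT2 (2 threads per core), implied by 16 cores / 32 threads.
CORES_PER_DIE = 16
THREADS_PER_CORE = 2

for name, dies in [("3C6000", 1), ("3D6000", 2), ("3E6000", 4)]:
    cores = CORES_PER_DIE * dies
    threads = cores * THREADS_PER_CORE
    print(f"{name}: {dies} die(s) -> {cores} cores / {threads} threads")
```

This reproduces the article's numbers: 32 cores/64 threads for the dual-chiplet 3D6000, and 64 cores/128 threads for the quad-chiplet 3E6000.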

The Coherent Link technology is similar to Nvidia's NVLink and AMD's Infinity Fabric (or the recently announced UALink) and enables core cache coherent interconnects between multiple devices, ensuring all resources are virtualized, allowing for dynamic allocation of devices and chips. Loongson says its technology is compatible with mainstream hardware ecosystems and PCIE electrical standards, and supports the interconnection and upgrading of 1 to 8 chips.

While there’s no independent verification of the 3C6000's performance, Loongson does continue to make impressive strides within regulatory constraints. Leveraging its in-house LoongArch ISA (a homegrown architecture with MIPS heritage) and domestic Chinese fabs, the company may not be directly challenging EPYC and Xeon chips yet, but the gap is narrowing.




from TechRadar - All the latest technology news https://ift.tt/hzqKVcu

Friday, July 5, 2024

34 Best Apple July 4th Deals Still Available: Save on AirPods, Apple Watch, MacBooks and More

Even though the Fourth is over, there are plenty of Apple-specific deals you can take advantage of.

from CNET https://ift.tt/wAavdfY

Latest Tech News

WhatsApp is apparently developing an artificial intelligence-fueled image creator that will allow users to make AI avatars of themselves to place in virtual settings they suggest. The feature, first identified by WABetaInfo in a future beta version of the Android version of WhatsApp, will likely use WhatsApp parent company Meta’s Llama portfolio of AI models. 

“This feature allows you to take photos of yourself once, then ask Meta AI to generate AI images of you,” the screenshot of the beta explains. “To generate an AI image of yourself, type ‘Imagine me…’ in your Meta AI chat. You can also use this feature in other chats by typing ‘@Meta AI imagine me…’”

Users would upload some photos of themselves and Meta’s AI Llama model would create an AI avatar of them that can then be placed into any image setting based on text prompts. While the feature is currently in the beta testing phase and available to a limited group of users, it remains unclear when it will be widely available.

The avatars won’t be WhatsApp’s first AI feature. The messaging platform recently added in-app custom stickers that users can create using text prompts. But making it possible to embed yourself in an AI-generated image would be a major step forward for the company.

AI Identity

The idea makes a lot of sense as an attractive feature for WhatsApp on its own. But, it would also stand out among Meta's peers. AI image generators provided by the likes of OpenAI and Google are usually very reluctant to create AI avatars of any real person, let alone one that can be used repeatedly. ChatGPT will almost always say it can't make an image of a real person, and Google Gemini pushes back on the idea of replicating a person in favor of an image of a character with a similar likeness or just in the same clothes. In fact, Google has recently made it easier to remove unauthorized AI avatars on YouTube, setting up a method for those who spot deepfake versions of themselves to request their takedown.

Privacy and security are the obvious concerns around AI avatars of real people. While WhatsApp has not yet detailed the specific privacy measures for the AI-generated avatars, it likely will have to have some strict rules when the feature comes out. The company will want to ensure that user data is handled securely and that users have control over how their likeness is used. Offering the feature as optional is a step in that direction. That the feature appears to employ the Meta AI interface now available across Facebook, Instagram, and other Meta properties suggests WhatsApp may be a testing ground for an eventual wider rollout. 




from TechRadar - All the latest technology news https://ift.tt/y1aF3oH

Thursday, July 4, 2024

Apple Cider Vinegar Is the Latest Health Hack. 4 Unexpected Ways It Can Help

Apple cider vinegar isn't just a kitchen staple. Here's what to know about the potential health benefits, precautions and dosage.

from CNET https://ift.tt/D0ix7Eo

Latest Tech News

The topic of 8K TVs has become complicated over the past few years. At one stage, many brands including LG, Samsung, Sony, TCL, and Hisense jumped on the 8K TV bandwagon, embracing the new technology in an attempt to future-proof their TVs. So, if 8K TVs were meant to be the next big thing, what happened?

The main factor is price. You’re often paying double for one of the best 8K TVs compared to a 4K equivalent. For example, Samsung’s 2024 flagship 8K TV, the Samsung QN900D, is roughly $4,999 / £4,999 / AU$6,499 for the 65-inch model, while the Samsung QN95D, its 4K equivalent, is £2,899. (The QN95D is a UK-only model; its US/Australia price would be roughly $2,699 / AU$4,099.) Also, there’s the ongoing lack of available 8K content, with a limited number of YouTube videos being the exception. As a result of these factors, 8K TVs lost popularity amongst consumers and companies began to move away from the tech.

I never really bought into the 8K TV hype when I used to work in AV retail, mainly for the reasons stated above. However, after testing the Samsung QN800D, a fantastic mid-range 8K TV, that skepticism turned into belief – I’m starting to get 8K TVs. Still, there’s no getting over the fact that 8K TVs are expensive. 

Recent developments suggest that this could change in the future. Hisense, maker of some of the best TVs including the Hisense U7N and the Hisense U8K, has joined the 8K Association, a not-for-profit organization dedicated to future investment in and development of 8K technologies. But why is this such a big deal?

Could affordable 8K TVs be on the way? 

Samsung QN900D showing image of lizard

The Samsung QN900D (pictured) is the best 8K TV of 2024, but it carries a high price tag. (Image credit: Future)

Hisense TVs are popular amongst consumers and critics alike for offering solid picture quality and features at a fraction of the price of some competitors. I tested a budget and premium mini-LED TV side-by-side, with the budget model represented by the Hisense U6N, and the premium represented by the Sony X95L. Although the X95L was clearly the superior TV thanks to its richer contrast, deeper blacks, and more natural textures, the U6N offered solid performance across the board at a price $1,200 / £700 lower than the X95L (X93L in the US).

If Hisense can achieve this in the world of 4K TVs, why not 8K TVs? The company joining the 8K Association could signal the arrival of more affordable 8K TVs, ones with similar features to more premium options from major rivals such as Samsung.

In a statement, David Gold, president of Hisense USA and Hisense Americas, said: “We are eager to contribute to the 8K ecosystem and collaborate with other industry leaders to accelerate the integration of 8K technology into the home entertainment experience.” So it appears that Hisense is keen to get 8K TVs into more homes – hopefully by selling them at lower prices. 

8K TVs – should they stay premium? 

Hisense U80G ULED 8K TV

Hisense has dabbled in the world of 8K before, with the Hisense U80 (pictured) – but this was still at a premium price. (Image credit: TechRadar)

My excitement for cheaper 8K TVs does come with reservations. 8K TVs, particularly those from Samsung, are designed with not just 8K in mind, but also 4K. The aforementioned Samsung QN900D and QN800D both use AI upscaling on 4K sources, and this processing gives an incredible boost to textures, detail, color and high dynamic range in pictures. 

Samsung’s AI technology is strengthened by the quality of the mini-LED backlighting used in its TVs. Part of this is the number of local dimming zones used – the more zones the better, as I discovered during a mini-LED backlight demo. 

Hisense sometimes makes performance sacrifices, such as the number of local dimming zones used in the TV, to achieve low price tags. But can this be done at the 8K level, where there isn’t as much room for picture quality compromise? This isn’t Hisense’s first foray into 8K TVs, with the U80G from 2021 and the recent ULED X 8K displayed at IFA 2023, among its contributions. But both those sets were premium models, which begs the question: Does Hisense believe 8K should remain premium? 

Final thoughts 

Ultimately, Hisense’s commitment to the 8K Association signals that it is looking to an 8K future for both TVs and projectors. Hisense has provided budget alternatives to higher-end 4K models that offer unbeatable bang for your buck, so if there’s a chance the company can do the same with 8K TVs, then sign me up.




from TechRadar - All the latest technology news https://ift.tt/pjcCzsO

Wednesday, July 3, 2024

5K Video Camera

64MP WiFi, 18X & IR night vision 3.0.

from CNET https://ift.tt/Qlf7i5C

Latest Tech News

Intel has unveiled the industry's first fully integrated bidirectional optical I/O chiplet at the recent Optical Fiber Communication Conference (OFC) 2024. 

This optical compute interconnect (OCI), showcased by Intel’s Integrated Photonics Solutions group, supports 64 channels of 32Gbps data transmission in both directions over up to 100 meters of fiber optics. 

The technology, which can be attached to CPUs and GPUs - a previously complicated task to achieve - addresses AI infrastructure's increasing demand for higher bandwidth, lower power consumption, and longer reach.

Meeting AI demand

It's well documented that AI-based applications, including LLMs and generative AI, are causing unprecedented demand in I/O bandwidth and driving the need for longer reach to support larger CPU/GPU clusters. Electrical I/O, which relies on copper traces, offers high bandwidth density and low power but is limited to short distances. Intel says its co-packaged optical I/O solution can transmit data over much longer distances with higher efficiency and reduced power consumption, vital for AI/ML infrastructure scaling.

The OCI chiplet integrates a silicon photonics integrated circuit (PIC), complete with on-chip lasers and optical amplifiers, with an electrical IC. It supports 4Tbps bidirectional data transfer, compatible with PCIe Gen5, using 8 wavelengths at 200GHz spacing on a single fiber. It also consumes just 5 pico-Joules (pJ) per bit, compared to 15 pJ/bit for pluggable optical transceiver modules.
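Those headline figures are internally consistent. A rough sketch using the article's numbers (real link budgets involve encoding and protocol overheads this ignores):

```python
# Rough check of the OCI chiplet's quoted throughput and power figures.
CHANNELS = 64
GBPS_PER_CHANNEL = 32
PJ_PER_BIT_OCI = 5         # quoted for the OCI chiplet
PJ_PER_BIT_PLUGGABLE = 15  # quoted for pluggable transceivers

per_direction_tbps = CHANNELS * GBPS_PER_CHANNEL / 1000      # ~2Tbps
bidirectional_tbps = 2 * per_direction_tbps                  # ~4Tbps

# Power at full bidirectional throughput: energy/bit x bits/second
bits_per_second = bidirectional_tbps * 1e12
watts_oci = bits_per_second * PJ_PER_BIT_OCI * 1e-12
watts_pluggable = bits_per_second * PJ_PER_BIT_PLUGGABLE * 1e-12

print(f"{bidirectional_tbps:.3f}Tbps bidirectional")
print(f"~{watts_oci:.1f}W at 5pJ/bit vs ~{watts_pluggable:.1f}W at 15pJ/bit")
```

At full throughput that works out to roughly 20W for the OCI link versus about 60W for equivalent pluggable modules, which is where the efficiency claim for AI clusters comes from.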

Intel’s OCI chiplet is only a prototype right now, but the company says it’s working with select customers to co-package OCI with systems-on-chips (SoCs) and system-in-packages (SiPs).

“The ever-increasing movement of data from server to server is straining the capabilities of today’s data center infrastructure, and current solutions are rapidly approaching the practical limits of electrical I/O performance," said Thomas Liljeberg, senior director of Product Management and Strategy at Intel’s Integrated Photonics Solutions Group.

"However, Intel’s achievement empowers customers to seamlessly integrate co-packaged silicon photonics interconnect solutions into next-generation compute systems, and increases reach, enabling ML workload acceleration that promises to revolutionize high-performance AI infrastructure.”




from TechRadar - All the latest technology news https://ift.tt/tJVlYh6

Latest Tech News

In a major milestone for modernization, Japan’s government has eliminated the use of floppy disks in all its systems.

The achievement comes two or three decades after the technology’s prime, and marks a pivotal milestone in Japan’s ongoing campaign to digitize and streamline government operations.

By mid-June, the Digital Agency had successfully abolished 1,034 regulations governing the use of floppy disks, retaining only one environmental regulation related to vehicle recycling.

Floppy disks are now extinct in Japan

An iconic tool used in early computers up until the 2000s, the floppy disk continues to at least partly live on as the widely recognized symbol for saving a document.

While advancements have seen the likes of CDs, DVDs and USBs come and go (to a certain degree), making way for the cloud, floppy disks continued to be used in Japan for their numerous benefits. Known for their reliability and stability, they’re also less hackable than more modern solutions.

The eradication of floppy disks follows the establishment of the Digital Agency, which was tasked with creating a more efficient and digitally adept governmental framework. Taro Kono, Japan’s Minister for Digital Transformation and the head of the Agency, has been at the forefront of the country’s digitization efforts since assuming the position in August 2022.

Speaking to Reuters, Kono stated: “We have won the war on floppy disks on June 28!”

However, despite these advancements, Japan’s journey towards full digitization has faced numerous challenges. The failure of a contact-tracing app during the pandemic and the slow adoption rates of the My Number digital identification card have highlighted these issues.

Still, with Kono at the helm, Japan’s ambitious plans to ditch less efficient systems could see the progress they need.




from TechRadar - All the latest technology news https://ift.tt/UP5yMIR

Tuesday, July 2, 2024

Instant Pot Pro

10-in-1 slow cooker, rice cooker, steamer & more, 8qt.

from CNET https://ift.tt/oG0RXNK

A Strands Hint for July 2, #121 Can Be Found in Your Kitchen

Any Julia Child-wannabes out there? You'll have an easy time with the July 2 Strands puzzle, No. 121.

from CNET https://ift.tt/nVFQvTp

Latest Tech News

The developer behind some tools for Ryzen processors, including ClockTuner and Project Hydra, has told us about an exciting new introduction for Ryzen 9000 chips that AMD is bringing in, namely Curve Shaper, a new add-on for Curve Optimizer (in Team Red's Ryzen Master software).

In a post on X, the dev described the new ability as an "incredible new overclocking feature", no less.

So, what does the new Curve Shaper feature do? It gives enthusiasts control over the power curve for the whole temperature range, and prevents unnecessary boosting of the CPU (and power wastage therein) when the processor is idling or not doing much.

At present, the Curve Optimizer feature is available in Ryzen Master to hand-tune the AVFS curve of either specified CPU cores, or the entire processor, which can result in increased performance, but at the cost of higher temperatures. Multi-core overclocking benefits the likes of intensive rendering, whereas single-core better benefits gaming performance (to a large extent, at any rate).

In short, this new feature is a bid to lower temperatures when possible while engaging in overclocking Ryzen 9000 processors by using Curve Optimizer. That means you won't have to go in and manually disable the feature in the settings if you're planning on low activity instead of gaming or heavy CPU-bound processes, so everything's on track to be much smarter and more power-efficient for those wanting to squeeze the most out of the best processors from Team Red.

We'll soon get to see what AMD Zen 5 can really do

Everything we've seen about AMD Zen 5 (Ryzen 9000) from leaks to the official unveiling at Computex 2024 has given the impression that it's more of a slightly faster iteration than a revolution.

This isn't entirely unexpected from the second generation of CPUs on the AM5 platform, though, as Team Red is working on how best to optimize its platform rather than reinventing the wheel. With features such as Curve Shaper baked into Curve Optimizer, those wanting the best performance will have the ability to push harder while gaining the mentioned efficiency benefits.

Also notable with Ryzen 9000 is that PC enthusiasts may get the ability to make the best gaming CPUs even better - given that X3D variants are rumored to be lined up for full overclocking support this time around (with the caveat that some safeguards are likely to be in place).

Via VideoCardz




from TechRadar - All the latest technology news https://ift.tt/2GcMhgx

Today's Wordle Hints, Answer and Help for May 10, #1786

Here are hints and the answer for today's Wordle for May 10, No. 1,786.

from CNET https://ift.tt/FztnkY5