Saturday, July 6, 2024

Best Solar Panel Installation Companies in North Carolina

North Carolina has multiple programs designed to help make the switch to clean energy easier and several great solar panel companies to choose from. Here's what you need to know to get solar in the Tar Heel State.

from CNET https://ift.tt/lvzBLRj

Latest Tech News

Despite US trade restrictions aimed at keeping advanced chips and chip-making equipment out of China, domestic semiconductor production continues to impress there. 

Loongson recently informed investors that the first samples of its 3C6000/3D6000/3E6000 series of server-grade processors have come back from the fab and are "meeting expectations". In line with its roadmap, the release remains on track for Q4 2024.

Loongson claims that its 3C6000 design, a single chip with 16 cores and 32 threads, significantly improves the performance-price ratio of its server CPUs.

The chip features a hexa-issue LA664 processing core, which Loongson says doubles the general processing performance compared to the previous generation 3C5000.

Additionally, it includes DDR4-3200 4x4 RAM, boosting memory bandwidth several times over its predecessor, and a PCIe 4.0 x64 interface that significantly enhances I/O performance compared to the 3C5000. The 3C6000 also supports hardware acceleration of China's national cryptography standards, with SM3 throughput exceeding 20Gbps.
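Rough peak-bandwidth arithmetic puts those interface claims in context. This sketch assumes "4x4" describes four 64-bit DDR4 channels - an interpretation on our part, not something Loongson has confirmed:

```python
# Back-of-the-envelope bandwidth figures for the 3C6000's stated interfaces.
# Assumes "4x4" DDR4-3200 means four 64-bit (8-byte) channels - an
# assumption, not confirmed by Loongson's disclosure.

def ddr4_bandwidth_gbs(mt_per_s, channels, bus_bytes=8):
    """Peak bandwidth in GB/s: transfers/s * bytes per transfer * channels."""
    return mt_per_s * bus_bytes * channels / 1000

def pcie4_bandwidth_gbs(lanes):
    """PCIe 4.0: 16 GT/s per lane with 128b/130b encoding."""
    return lanes * 16 * (128 / 130) / 8  # GT/s -> GB/s, per direction

print(ddr4_bandwidth_gbs(3200, 4))        # 102.4 GB/s peak memory bandwidth
print(round(pcie4_bandwidth_gbs(64), 1))  # ~126.0 GB/s per direction over x64
```

Around 100GB/s of memory bandwidth would indeed be several times what a narrower DDR4 configuration on the 3C5000 generation could deliver.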

Loongson's 3D6000 contains two 3C6000 chips connected via "Loongson Coherent Link" technology, creating a 32-core/64-thread processor, while the 3E6000 connects four 3C6000 chiplets for 64 cores and 128 threads. Chiplet architectures are increasingly recognized as the future of microprocessors, and things are no different in China.
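The core and thread counts scale linearly with die count, as a quick check confirms (at the same ratio, a hypothetical eight-die configuration would reach 128 cores / 256 threads):

```python
# Core/thread counts implied by Loongson's chiplet scaling: each 3C6000
# die contributes 16 cores with 2 threads per core (SMT2).
CORES_PER_DIE, THREADS_PER_CORE = 16, 2

for name, dies in [("3C6000", 1), ("3D6000", 2), ("3E6000", 4)]:
    cores = CORES_PER_DIE * dies
    print(f"{name}: {cores} cores / {cores * THREADS_PER_CORE} threads")
# 3C6000: 16 cores / 32 threads
# 3D6000: 32 cores / 64 threads
# 3E6000: 64 cores / 128 threads
```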

The Coherent Link technology is similar to Nvidia's NVLink and AMD's Infinity Fabric (or the recently announced UALink): it enables cache-coherent interconnects between multiple devices, so resources can be virtualized and dynamically allocated across devices and chips. Loongson says the technology is compatible with mainstream hardware ecosystems and PCIe electrical standards, and supports the interconnection and upgrading of 1 to 8 chips.

While there’s no independent verification of the 3C6000's performance, Loongson continues to make impressive strides within regulatory constraints. Leveraging its in-house, MIPS-derived LoongArch ISA and domestic Chinese fabs, the company may not be directly challenging EPYC and Xeon chips yet, but the gap is narrowing.

More from TechRadar Pro



from TechRadar - All the latest technology news https://ift.tt/hzqKVcu

Friday, July 5, 2024

34 Best Apple July 4th Deals Still Available: Save on AirPods, Apple Watch, MacBooks and More

Even though the Fourth is over, there are plenty of Apple-specific deals you can take advantage of.

from CNET https://ift.tt/wAavdfY

Latest Tech News

WhatsApp is apparently developing an artificial intelligence-fueled image creator that will let users make AI avatars of themselves and place them in virtual settings they describe. The feature, first identified by WABetaInfo in an upcoming beta of WhatsApp for Android, will likely use the Llama family of AI models from WhatsApp's parent company, Meta.

“This feature allows you to take photos of yourself once, then ask Meta AI to generate AI images of you,” the screenshot of the beta explains. “To generate an AI image of yourself, type ‘Imagine me…’ in your Meta AI chat. You can also use this feature in other chats by typing ‘@Meta AI imagine me…’.”

Users would upload some photos of themselves and Meta’s AI Llama model would create an AI avatar of them that can then be placed into any image setting based on text prompts. While the feature is currently in the beta testing phase and available to a limited group of users, it remains unclear when it will be widely available.

The avatars won’t be WhatsApp’s first AI feature. The messaging platform recently added in-app custom stickers that users can create using text prompts. But making it possible to embed yourself in an AI-generated image would be a major step forward for the company. 

AI Identity

The idea makes a lot of sense as an attractive feature for WhatsApp on its own. But, it would also stand out among Meta's peers. AI image generators provided by the likes of OpenAI and Google are usually very reluctant to create AI avatars of any real person, let alone one that can be used repeatedly. ChatGPT will almost always say it can't make an image of a real person, and Google Gemini pushes back on the idea of replicating a person in favor of an image of a character with a similar likeness or just in the same clothes. In fact, Google has recently made it easier to remove unauthorized AI avatars on YouTube, setting up a method for those who spot deepfake versions of themselves to request their takedown.

Privacy and security are the obvious concerns around AI avatars of real people. While WhatsApp has not yet detailed the specific privacy measures for the AI-generated avatars, it likely will have to have some strict rules when the feature comes out. The company will want to ensure that user data is handled securely and that users have control over how their likeness is used. Offering the feature as optional is a step in that direction. That the feature appears to employ the Meta AI interface now available across Facebook, Instagram, and other Meta properties suggests WhatsApp may be a testing ground for an eventual wider rollout. 

You might also like



from TechRadar - All the latest technology news https://ift.tt/y1aF3oH

Thursday, July 4, 2024

Apple Cider Vinegar Is the Latest Health Hack. 4 Unexpected Ways It Can Help

Apple cider vinegar isn't just a kitchen staple. Here's what to know about the potential health benefits, precautions and dosage.

from CNET https://ift.tt/D0ix7Eo

Latest Tech News

The topic of 8K TVs has become complicated over the past few years. At one stage, many brands including LG, Samsung, Sony, TCL, and Hisense jumped on the 8K TV bandwagon, embracing the new technology in an attempt to future-proof their TVs. So, if 8K TVs were meant to be the next big thing, what happened?

The main factor is price: you’re often paying close to double for one of the best 8K TVs compared to a 4K equivalent. For example, Samsung’s 2024 flagship 8K TV, the Samsung QN900D, is roughly $4,999 / £4,999 / AU$6,499 for the 65-inch model, while the Samsung QN95D, its 4K equivalent, is £2,899 (the QN95D is a UK-only model; its US/Australia price would be roughly $2,699 / AU$4,099). There’s also the ongoing lack of available 8K content, with a limited number of YouTube videos being the main exception. As a result, 8K TVs lost popularity among consumers, and companies began to move away from the tech. 
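Using the prices quoted above, the premium works out to a bit under double in most regions:

```python
# The 8K-vs-4K price premium, using the 65-inch prices quoted in the text.
qn900d_8k = {"US$": 4999, "UK£": 4999, "AU$": 6499}  # flagship 8K QN900D
qn95d_4k  = {"US$": 2699, "UK£": 2899, "AU$": 4099}  # 4K equivalent QN95D

for cur in qn900d_8k:
    ratio = qn900d_8k[cur] / qn95d_4k[cur]
    print(f"{cur}: {ratio:.2f}x")  # roughly 1.6x-1.9x, approaching double
```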

I never really bought into the 8K TV hype when I used to work in AV retail, mainly for the reasons stated above. However, after testing the Samsung QN800D, a fantastic mid-range 8K TV, that skepticism turned into belief – I’m starting to get 8K TVs. Still, there’s no getting over the fact that 8K TVs are expensive. 

Recent developments suggest that this could change in the future. Hisense, maker of some of the best TVs including the Hisense U7N and the Hisense U8K, has joined the 8K Association, a not-for-profit organization dedicated to future investment in and development of 8K technologies. But why is this such a big deal?

Could affordable 8K TVs be on the way? 

Samsung QN900D showing image of lizard

The Samsung QN900D (pictured) is the best 8K TV of 2024, but it carries a high price tag. (Image credit: Future)

Hisense TVs are popular among consumers and critics alike for offering solid picture quality and features at a fraction of the price of some competitors. I tested a budget and a premium mini-LED TV side by side - the budget model represented by the Hisense U6N, and the premium by the Sony X95L. Although the X95L was clearly the superior TV thanks to its richer contrast, deeper blacks, and more natural textures, the U6N offered solid performance across the board at $1,200 / £700 less than the X95L (sold as the X93L in the US). 

If Hisense can achieve this in the world of 4K TVs, why not 8K TVs? The company joining the 8K association could signal the arrival of more affordable 8K TVs, ones with similar features to more premium options from major rivals such as Samsung. 

In a statement, David Gold, president of Hisense USA and Hisense Americas, said: “We are eager to contribute to the 8K ecosystem and collaborate with other industry leaders to accelerate the integration of 8K technology into the home entertainment experience.” So it appears that Hisense is keen to get 8K TVs into more homes – hopefully by selling them at lower prices. 

8K TVs – should they stay premium? 

Hisense U80G ULED 8K TV

Hisense has dabbled in the world of 8K before, with the Hisense U80 (pictured) – but this was still at a premium price. (Image credit: TechRadar)

My excitement for cheaper 8K TVs does come with reservations. 8K TVs, particularly those from Samsung, are designed with not just 8K in mind, but also 4K. The aforementioned Samsung QN900D and QN800D both use AI upscaling on 4K sources, and this processing gives an incredible boost to textures, detail, color and high dynamic range in pictures. 
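The pixel arithmetic shows why that upscaling processing matters so much: an 8K panel has four times the pixels of a 4K source, so the TV must synthesize three new pixels for every one it receives.

```python
# Resolution arithmetic behind 4K-to-8K upscaling.
res_4k = (3840, 2160)
res_8k = (7680, 4320)

pixels_4k = res_4k[0] * res_4k[1]  # 8,294,400 pixels in a 4K frame
pixels_8k = res_8k[0] * res_8k[1]  # 33,177,600 pixels in an 8K frame

print(pixels_8k // pixels_4k)  # 4 - every 4K pixel must become four 8K pixels
```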

Samsung’s AI technology is strengthened by the quality of the mini-LED backlighting used in its TVs. Part of this is the number of local dimming zones used – the more zones the better, as I discovered during a mini-LED backlight demo. 

Hisense sometimes makes performance sacrifices, such as reducing the number of local dimming zones in a TV, to achieve low price tags. But can this be done at the 8K level, where there isn’t as much room for picture-quality compromise? This isn’t Hisense’s first foray into 8K TVs - the U80G from 2021 and the ULED X 8K displayed at IFA 2023 are among its contributions - but both of those sets were premium models, which raises the question: does Hisense believe 8K should remain premium? 

Final thoughts 

Ultimately, Hisense’s commitment to the 8K association signals that it is looking to an 8K future for both TVs and projectors. Hisense has provided budget alternatives to higher-end 4K models that offer unbeatable bang for your buck, so if there’s a chance the company can do the same with 8K TVs, then sign me up.  

You might also like...



from TechRadar - All the latest technology news https://ift.tt/pjcCzsO

Wednesday, July 3, 2024

5K Video Camera

64MP WiFi, 18X & IR night vision 3.0.

from CNET https://ift.tt/Qlf7i5C

Latest Tech News

Intel has unveiled the industry's first fully integrated bidirectional optical I/O chiplet at the recent Optical Fiber Communication Conference (OFC) 2024. 

This optical compute interconnect (OCI), showcased by Intel’s Integrated Photonics Solutions group, supports 64 channels of 32Gbps data transmission in both directions over up to 100 meters of fiber optics. 

The technology, which can be co-packaged with CPUs and GPUs - previously a complicated task to achieve - addresses AI infrastructure's increasing demand for higher bandwidth, lower power consumption, and longer reach.

Meeting AI demand

It's well documented that AI-based applications, including LLMs and generative AI, are causing unprecedented demand in I/O bandwidth and driving the need for longer reach to support larger CPU/GPU clusters. Electrical I/O, which relies on copper traces, offers high bandwidth density and low power but is limited to short distances. Intel says its co-packaged optical I/O solution can transmit data over much longer distances with higher efficiency and reduced power consumption, vital for AI/ML infrastructure scaling.

The OCI chiplet integrates a silicon photonics integrated circuit (PIC) - with on-chip lasers and optical amplifiers - together with an electrical IC. It supports 4Tbps of bidirectional data transfer, is compatible with PCIe Gen5, and uses 8 wavelengths at 200GHz spacing on a single fiber. It also consumes just 5 pico-Joules (pJ) per bit, compared to 15 pJ/bit for pluggable optical transceiver modules.
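Taking the quoted figures at face value, the headline numbers check out with simple arithmetic, and the pJ/bit gap translates into a roughly 3x power saving at full throughput:

```python
# Sanity-checking Intel's OCI numbers: 64 channels at 32 Gbps each, and the
# quoted 5 pJ/bit vs 15 pJ/bit for pluggable optical transceiver modules.
channels, gbps_per_channel = 64, 32

per_direction_tbps = channels * gbps_per_channel / 1000  # 2.048 Tbps
bidirectional_tbps = 2 * per_direction_tbps              # ~4 Tbps, as quoted

def power_watts(tbps, pj_per_bit):
    # bits/s * joules/bit = watts
    return tbps * 1e12 * pj_per_bit * 1e-12

print(per_direction_tbps)                   # 2.048 Tbps each way
print(power_watts(bidirectional_tbps, 5))   # ~20 W at the OCI's 5 pJ/bit
print(power_watts(bidirectional_tbps, 15))  # ~61 W at pluggable-module rates
```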

Intel’s OCI chiplet is only a prototype right now, but the company says it’s working with select customers to co-package OCI with systems-on-chips (SoCs) and system-in-packages (SiPs).

“The ever-increasing movement of data from server to server is straining the capabilities of today’s data center infrastructure, and current solutions are rapidly approaching the practical limits of electrical I/O performance," said Thomas Liljeberg, senior director of Product Management and Strategy at Intel’s Integrated Photonics Solutions Group.

"However, Intel’s achievement empowers customers to seamlessly integrate co-packaged silicon photonics interconnect solutions into next-generation compute systems, and increases reach, enabling ML workload acceleration that promises to revolutionize high-performance AI infrastructure.”

More from TechRadar Pro



from TechRadar - All the latest technology news https://ift.tt/tJVlYh6

Latest Tech News

In a major milestone for modernization, Japan’s government has eliminated the use of floppy disks in all its systems.

The achievement comes two or three decades after the technology’s prime, and marks a pivotal milestone in Japan’s ongoing campaign to digitize and streamline government operations.

By mid-June, the Digital Agency had successfully abolished 1,034 regulations governing the use of floppy disks, retaining only one environmental regulation related to vehicle recycling.

Floppy disks are now extinct in Japan

An iconic tool used in early computers up until the 2000s, the floppy disk continues to at least partly live on as the widely recognized symbol for saving a document.

While advancements have seen the likes of CDs, DVDs and USBs come and go (to a certain degree), making way for the cloud, floppy disks continued to be used in Japan for their numerous benefits. Known for their reliability and stability, they’re also less hackable than more modern solutions.

The eradication of floppy disks follows the establishment of the Digital Agency, which was tasked with creating a more efficient and digitally adept governmental framework. Taro Kono, Japan’s Minister for Digital Transformation and the head of the Agency, has been at the forefront of the country’s digitization efforts since assuming the position in August 2022.

Speaking to Reuters, Kono stated: “We have won the war on floppy disks on June 28!”

However, despite these advancements, Japan’s journey towards full digitization has faced numerous challenges. The failure of a contact-tracing app during the pandemic and the slow adoption rates of the My Number digital identification card have highlighted these issues.

Still, with Kono at the helm, Japan’s ambitious plans to ditch less efficient systems could see the progress they need.

More from TechRadar Pro



from TechRadar - All the latest technology news https://ift.tt/UP5yMIR

Tuesday, July 2, 2024

Instant Pot Pro

10-in-1 slow cooker, rice cooker, steamer & more, 8qt.

from CNET https://ift.tt/oG0RXNK

A Strands Hint for July 2, #121 Can Be Found in Your Kitchen

Any Julia Child-wannabes out there? You'll have an easy time with the July 2 Strands puzzle, No. 121.

from CNET https://ift.tt/nVFQvTp

Latest Tech News

The developer behind several tools for Ryzen processors, including ClockTuner and Project Hydra, has told us about an exciting new feature that AMD is bringing to Ryzen 9000 chips: Curve Shaper, a new add-on for Curve Optimizer (in Team Red's Ryzen Master software).

In a post on X, the dev described the new ability as an "incredible new overclocking feature", no less.

So, what does the new Curve Shaper feature do? It gives enthusiasts control over the power curve for the whole temperature range, and prevents unnecessary boosting of the CPU (and power wastage therein) when the processor is idling or not doing much.

At present, the Curve Optimizer feature in Ryzen Master lets users hand-tune the AVFS (adaptive voltage and frequency scaling) curve of either specific CPU cores or the entire processor, which can yield increased performance, but at the cost of higher temperatures. Multi-core overclocking benefits intensive workloads such as rendering, whereas single-core tuning largely benefits gaming performance.

In short, the new feature aims to lower temperatures where possible while overclocking Ryzen 9000 processors with Curve Optimizer. You won't have to manually disable the feature in the settings when the system is doing light work rather than gaming or heavy CPU-bound processing, so everything's on track to be much smarter and more power-efficient for those wanting to squeeze the most out of the best processors from Team Red.
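AMD hasn't published implementation details yet, but the concept can be sketched as a temperature-dependent offset table: Curve Optimizer applies one flat voltage offset, while Curve Shaper would let that offset vary by temperature band. The function and band values below are purely illustrative assumptions, not AMD's actual interface:

```python
# Illustrative model (NOT AMD's actual implementation) of the Curve Shaper
# idea: instead of one flat Curve Optimizer offset, the undervolt offset
# varies across temperature bands, so a cool, lightly loaded CPU isn't
# boosted (and undervolted) as aggressively as one under heavy load.

def effective_offset(temp_c, base_offset_mv, shaper_bands):
    """Return the offset for the matching band, else the flat CO offset."""
    for lo, hi, offset_mv in shaper_bands:
        if lo <= temp_c < hi:
            return offset_mv
    return base_offset_mv

# Hypothetical bands: (low °C, high °C, offset in mV).
bands = [(0, 50, -10), (50, 75, -20), (75, 95, -30)]

print(effective_offset(40, -25, bands))  # -10: gentle when cool and idle
print(effective_offset(85, -25, bands))  # -30: full undervolt under load
```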

We'll soon get to see what AMD Zen 5 can really do

Everything we've seen about AMD Zen 5 (Ryzen 9000) from leaks to the official unveiling at Computex 2024 has given the impression that it's more of a slightly faster iteration than a revolution.

This isn't entirely unexpected from a second-generation AM5 platform, though, as Team Red is working on how best to optimize its platform rather than reinventing the wheel. With features such as Curve Shaper baked into Curve Optimizer, those wanting the best performance will be able to push harder while gaining the aforementioned efficiency benefits.

Also notable with Ryzen 9000 is that PC enthusiasts may get the ability to make the best gaming CPUs even better - given that X3D variants are rumored to be lined up for full overclocking support this time around (with the caveat that some safeguards are likely to be in place).

Via VideoCardz

You may also like...



from TechRadar - All the latest technology news https://ift.tt/2GcMhgx

Monday, July 1, 2024

Best Laptop of 2024

Whether you're after a MacBook, Windows PC or Chromebook, these are the best laptops we've tested and reviewed, including the best laptop overall.

from CNET https://ift.tt/qd05oat

Latest Tech News

Windows 11 is creeping up on its three-year anniversary since launch, and the OS has apparently hit an all-time high for users - almost 30% of all Windows PCs now run Windows 11, at least according to one analytics firm.

That may not seem like a lot - frankly, it isn’t - but it’s at least a marked improvement in recent times, where Windows 11’s adoption has actually slightly dropped, and this is certainly a positive sign compared to the cold reception that the operating system initially received.

Neowin flagged that Statcounter’s most recent monthly report shows Windows 11 at 29.7% of market share, with Windows 10 still currently enjoying a large majority of 66.1%. 

Normally, when a new operating system drops, it’s widely adopted. Still, if we’re celebrating a high of 30% nearly three years on from release, that’s obviously not a great indication that Windows 11 is being welcomed with open arms - despite all its extra perks and AI features, which are continuously being added.

That raises the question: why are so many people reluctant to move to Windows 11? For starters, the more demanding system requirements, which rule out older CPUs and machines without TPM, are a hard barrier to adoption for some PCs.

Windows 11 laptop showing Copilot

(Image credit: Microsoft)

Furthermore, since its launch, Windows 11 has suffered more than its fair share of poor updates and buggy behavior. Plus, the OS is slowly turning into a conduit for ads that you can’t escape in some cases. Also, there’s just not a lot of difference between Windows 10 and Windows 11 for people who aren’t really that fussed about AI or Copilot (and Copilot is in Windows 10 anyway, even if all of Microsoft’s various AI features aren’t). 

Could this small victory for Windows 11 - which represents a monthly uptick of just over 2% in Statcounter’s figures - simply be the result of people buying new machines? You’d be hard-pressed to find a new Windows desktop PC or laptop that isn’t running Windows 11, and downgrading your system is just not worth the effort for many (or may not even be possible), especially given that Windows 10 isn’t far off its end of life, which rolls around in October 2025.

It might be the case that we’ll have to wait until Windows 12 eventually debuts and hope that it’s a big enough improvement to get Windows 10 users to jump ship and skip Windows 11 - although, again, system requirements are likely to prove an insurmountable hurdle for some older PCs.

You might also like...



from TechRadar - All the latest technology news https://ift.tt/XjSPHhn

Latest Tech News

Threat actors are abusing a vulnerability in an outdated D-Link router to steal people’s sensitive data, researchers have claimed.

Cybersecurity experts from GreyNoise recently reported observing hackers in the wild, abusing a critical vulnerability in D-Link DIR-859 Wi-Fi routers. 

The flaw is described as a path traversal vulnerability that leads to information disclosure, and is tracked as CVE-2024-0769. It has a severity score of 9.8/10, and was first discovered in January 2024.
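As a generic illustration of the flaw class (not D-Link's actual code), a path traversal bug lets "../" sequences in a requested path escape the intended directory; a safe handler resolves the path and verifies it stays inside the served root:

```python
# Generic illustration of path traversal (not D-Link's firmware code):
# "../" sequences in a request can escape the intended directory and
# expose files such as DEVICE.ACCOUNT.xml. A safe handler resolves the
# requested path and rejects anything outside the web root.
import os

WEB_ROOT = os.path.realpath("/var/www")

def resolve_safely(user_path):
    # realpath collapses "../" segments; then verify containment
    full = os.path.realpath(os.path.join(WEB_ROOT, user_path.lstrip("/")))
    if os.path.commonpath([full, WEB_ROOT]) != WEB_ROOT:
        raise PermissionError("path traversal attempt blocked")
    return full

print(resolve_safely("css/site.css"))  # resolves inside the web root
try:
    resolve_safely("../../etc/DEVICE.ACCOUNT.xml")
except PermissionError as err:
    print(err)  # path traversal attempt blocked
```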

A fair warning

The researchers said that the threat actors are targeting the ‘DEVICE.ACCOUNT.xml’ file, in order to grab all account names, passwords, user groups, and user descriptions, found on the device. 

The worst part is that the device reached end-of-life in early 2020, meaning D-Link will not be patching this flaw. Instead, users are advised to replace the hardware with a newer model that still receives vendor support. D-Link did release a security advisory warning its customers of a vulnerability discovered in the ‘fatlady.php’ component of the device. In the advisory, the company explained that the flaw affects all versions of the firmware and allows threat actors to escalate privileges and gain full control of the device through the admin panel.

The researchers subtly criticized D-Link, suggesting that publishing a security advisory without a patch is meaningless. 

"It is unclear at this time what the intended use of this disclosed information is, it should be noted that these devices will never receive a patch," the researchers said. 

"Any information disclosed from the device will remain valuable to attackers for the lifetime of the device as long as it remains internet facing.”

Still, disclosures like this can serve as a warning that motivates users to migrate to a newer device, or at the very least they shift responsibility for a potential data breach onto the consumer.

Via BleepingComputer

More from TechRadar Pro



from TechRadar - All the latest technology news https://ift.tt/YZ9n5AB

Could Apple's New Adaptive Power Feature Extend Your iPhone's Battery Life?

With this new feature being tested in the iOS 26 developer beta, you may be able to ditch the Low Power Mode setting in the future. from C...