Picking up right where you left off in the event of a failure will now be possible for those running Elastic Compute Cloud (EC2) instances on AWS.
Previously, if customers wanted to set an EC2 instance to recover automatically, they could only do so by setting up an alarm in Amazon CloudWatch.
Now though, if an underlying hardware problem comes along, EC2 instances will be automatically recovered on Amazon’s cloud computing service, retaining the same instance ID, private IP addresses, public IPv4 address, Elastic IP addresses and all instance metadata. Unfortunately though, data in memory will still be lost in the event of a failure.
Automatic recovery of EC2 instances
While automatic recovery of EC2 instances will likely be a big deal for businesses that run their workloads in the cloud, AWS only released a brief announcement regarding the new feature.
While the announcement also contained a link to the company's documentation on how to recover instances, it didn’t mention anything about recovery points for data on disks or the time required for an auto-recovered instance to resume operations.
EC2 customers will also be able to disable the new auto-recovery feature. For most customers this probably isn’t a good idea but AWS points out in its documentation that those running instances in placement groups might want to disable the feature so that their instances are restored to the same placement group.
Still, hardware faults in a server could indicate a problem with an entire rack or even a whole row, which is why having an instance automatically recover to another availability zone or region could also make sense.
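For those who want to see what the two set-ups look like in practice, here's a minimal sketch using the boto3 Python SDK. It covers both the older CloudWatch-alarm route and the new per-instance toggle; the instance ID and region are placeholders, and the parameters for the new toggle are our best reading of AWS's documentation rather than something spelled out in the announcement.

```python
# Sketch only: illustrates the two recovery set-ups discussed above.
# The instance ID and region below are placeholders, not real resources.
import boto3

REGION = "us-east-1"
INSTANCE_ID = "i-0123456789abcdef0"  # hypothetical instance

ec2 = boto3.client("ec2", region_name=REGION)
cloudwatch = boto3.client("cloudwatch", region_name=REGION)

# Older approach: a CloudWatch alarm on the system status check that
# triggers the built-in EC2 recover action when the underlying host fails.
cloudwatch.put_metric_alarm(
    AlarmName=f"recover-{INSTANCE_ID}",
    Namespace="AWS/EC2",
    MetricName="StatusCheckFailed_System",
    Dimensions=[{"Name": "InstanceId", "Value": INSTANCE_ID}],
    Statistic="Minimum",
    Period=60,
    EvaluationPeriods=2,
    Threshold=0,
    ComparisonOperator="GreaterThanThreshold",
    AlarmActions=[f"arn:aws:automate:{REGION}:ec2:recover"],
)

# Newer approach (assumed parameter names): opt out of, or back into,
# simplified automatic recovery on a per-instance basis.
ec2.modify_instance_maintenance_options(
    InstanceId=INSTANCE_ID,
    AutoRecovery="disabled",  # or "default" to leave automatic recovery on
)
```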
Consider this the other shoe dropping. Almost three months after deciding to make the E3 gaming conference an all-digital event, the Entertainment Software Association (ESA) has pulled the plug on the entire affair.
"We previously announced that E3 would not be held in person in 2022 due to the ongoing health risks surrounding COVID-19," the ESA told TechRadar. "Today, we announce that there will also be no digital E3 showcase in 2022."
The news first came via an industry insider: Will Powers, Americas PR lead for popular gaming gear company Razer. He tweeted the news of the apparent cancellation on Thursday:
"Just got an email... It's official, E3 digital is official cancelled for 2022. Lots of mixed feelings about this..." – March 31, 2022
That's the bad news. The good news is, as of today, the ESA is planning to bring back E3 in 2023. The organization told TechRadar, "E3 will return in 2023 with a reinvigorated showcase that celebrates new and exciting video games and industry innovations."
The ESA added that it will "devote all our energy and resources to delivering a revitalized physical and digital E3 experience next summer."
It's been a rough few years for E3. In the face of the pandemic, it canceled the event in 2020, successfully held an all-digital one last year, but was forced to again cancel in-person festivities for 2022 in the face of the Omicron surge.
While we're mostly through that COVID variant wave, the ESA has made another tough decision for the once-bustling event (66,000 attendees in 2019), pulling the plug on the digital experience as well.
The loss of even a virtual event could be a tough pill to swallow for gamers who see it as more than just an industry trade show. As TRG's Vic Hood wrote when E3 canceled this year's in-person event:
"[E3] has grown to become a celebration of gaming: media attend to report on the biggest upcoming games, while fans attend to get their hands on the newest releases and to be among like-minded individuals. I can only describe it as a paradise for gamers."
It's not, according to the ESA, the end of E3, and the event has been through rough patches before. In 2016, multiple major game studios, including EA and Activision Blizzard, pulled out of the show, holding their own game showcases nearby. Plus, interest has waxed and waned in the periods between new console launches and major title releases.
E3 has, in general, rebounded and was looking strong right up until the pandemic when virtually all in-person tradeshows around the world were either canceled or shifted to virtual events.
As we write this, the ESA has not updated its social media or its news blog, though we assume it will soon. Notably, the E3 Twitter account page still has an E3 2021 logo on it and has posted nothing about any change in status for the event.
The official E3 page does say, "E3 2022 See you next year," with no hint of irony that we're already well into 2022. Subscribers to the E3 newsletter get a welcome message that still includes "E3 2021" in the header art.
Netflix has confirmed that Top Boy, the British gangster drama it rebooted in 2019, will end after a third and final season.
The drama, which ran for two seasons on British broadcaster Channel 4 before being dropped in 2014, has just debuted its second season on the streamer.
Top Boy stars Ashley Walters and Kane Robinson (better known to some as rapper Kano) and is set in the heart of London on the fictional Summerhouse estate. It follows partners Dushane and Sully, two notorious drug dealers who, despite a wish to lead honest lives, are corrupted by the promise of increased wealth and respect in their borough.
The show's first two seasons aired on Channel 4 in 2011 and 2013 before Netflix stepped in with a reboot in 2017, a decision that was partly helped by the presence of rapper Drake as one of the show's executive producers.
Netflix launched the show's new season in 2019, billing it as the first and rebranding the Channel 4 era as Top Boy: Summerhouse, before ordering a follow-up, which arrived on March 18.
Top Boy will return for a third and final season! pic.twitter.com/yp9UC28coQ – March 31, 2022
Stars Walters and Robinson, who also serve as executive producers, have released a statement as the news of the show's final bow was unveiled.
Speaking about the decision, they said: "For those that have followed the journey from the start, you will know how much this show means to everyone on our team and we wholeheartedly know how much it means to you. These characters have been a part of our lives for over a decade now and without everyone’s support we couldn’t have come this far."
They continued: "Whilst the journeys of Dushane and Sully have remained at the core of the show, the new characters that have entered the world of Top Boy have become a key part of the show’s legacy, representing each new storyline in a raw, authentic way. With all this being said, and staying true to our original goal, every story must have an ending and so season three will be our finale. A chance to come full circle and end the journey in the right way."
Newly discovered code on Twitter’s TweetDeck site points to the company possibly making TweetDeck an exclusive feature to Twitter Blue subscribers.
TweetDeck is a (currently) free platform that lets desktop users scroll through multiple timelines of different accounts, topics, or hashtags at once. Big proponents of the feature are media workers and businesses with multiple PR accounts that search for trending topics and interact with other Twitter users.
The rumor mill started up when Twitter user @wongmjane, an established tipster, posted that the company is working on a new TweetDeck signup page, advertising an ad-free experience as a big selling point.
Previously, Wong posted that gates in the TweetDeck app's code may ask for a Twitter Blue subscription in the future, redirecting users to the signup page if they aren't subscribed.
Twitter has been teasing “a new & improved” TweetDeck for a while now, implying that upon launch, the app will be reworked and redesigned.
We reached out to Twitter for any comment, and a spokesperson told us that they had nothing to share at this moment.
Analysis: All signs point to exclusivity
Keep in mind that this is just a report, but the rumors are pretty strong. If they hold true and TweetDeck does become a paid feature, it’d be another expense for businesses that take advantage of it to keep up with the numerous interactions that company accounts often receive.
TweetDeck would also be another feature in Twitter Blue’s slim portfolio and could prove interesting when paired with the rumored “ad-free experience” touted on the incomplete new site.
As long as gaming laptops have existed, there have only been two companies that have produced GPUs for them. Nvidia and AMD have had an iron grip on the market for years, and now Intel Arc mobile graphics processors are finally here. I don't know if an Arc 7 GPU is going to be faster than an RTX 3080 Ti – it probably won't – but since it's Intel at the helm, I at least know it's going to result in a good experience.
To be clear, while Intel has said that laptops using its Arc 3 graphics are available now, I have not even seen one in person, so all I have to go off of is the information that Intel has provided. I'm not exactly in the business of trusting internal benchmarks, and neither should you.
But let's be honest, while AMD and Nvidia both make mobile graphics solutions, the best gaming laptops on the market are using a combination of Intel processors and Nvidia graphics. AMD Navi started to chip away at Team Green's dominance, but Intel can hit a lot harder, and it's about more than raw frame rates.
(Image credit: Nvidia)
A stable platform
Intel has had a rocky few years, as it tried to catch up with AMD's Zen architecture. But even when Intel was furthest behind in terms of raw performance, it still excelled where it really matters – especially for laptops – reliability. When you're looking at charts and numbers it's pretty easy to forget about what the experience of actually using something is like, and Intel processors have never really had the same kind of adjustment period as AMD processors do.
It seems like every time a new AMD chipset comes out on desktop, there are a number of critical bugs that Team Red has to jump on after release. For instance, shortly after the release of its 5000-series processors there were widespread reports of USB problems, where devices would just stop responding, according to Tom's Hardware.
Intel generally doesn't have the same kind of problems with its new platforms. And while Intel is admittedly going to be new at the whole discrete graphics thing, the company has proven that it puts a lot of value on the experience of the user. So even if performance isn't quite there with this first generation of Arc graphics, it at least will likely result in a user-friendly product. Maybe that's why Intel prioritized laptop GPUs instead of trying to take on the RTX 3080 immediately.
(Image credit: Intel)
An actual competitor for DLSS
It's impossible to overestimate the impact DLSS has had on PC gaming since it debuted with the RTX 2080 in 2018. While it wasn't as exciting at first, it's become a critical technology for AAA game developers that want people to actually be able to play their games on affordable hardware.
And while AMD has come up with a competing technology in FSR, or FidelityFX Super Sampling, it just doesn't have the same visual quality as DLSS. However, it does have the advantage of being usable on any GPU.
What keeps DLSS out of reach of FSR is that it uses the Tensor cores in Nvidia's RTX graphics cards to apply a deep learning algorithm specific to each game that lets you render the game at a lower resolution with the regular CUDA cores, then use the Tensor cores to scale it up to your display's native resolution. Because this is a hardware-accelerated approach that is so finely tuned, it's hard to even tell the difference between native resolution and DLSS on the Quality preset.
But XeSS could be a legitimate alternative to DLSS with a similar visual quality – because it's taking a similar approach. Let's, uh, break that down really quick.
For example, the Intel A370M, one of the first GPUs the company is launching, comes equipped with 8 Xe cores. Intel has thankfully released the layout for each of these cores. Each Xe core will come with 16 Xe Vector Engines (XVE) and 16 Xe Matrix Engines (XMX). The XVE threads will basically serve the same function as CUDA cores in Nvidia GPUs. Then, the XMX cores are specially designed for AI workloads, and are able to perform this specialized workload a lot faster than the standard Vector units.
I won't go too much into why because I'm not an engineer, but this is a very similar structure to Nvidia Ampere, and should be just as efficient at upscaling workloads – at least on paper.
XeSS won't actually be available until later this year, but I can't wait to get my hands on it to see how it performs, and more importantly, how games look when the technology is turned on.
Because let's face it, performance gains between Nvidia's DLSS and AMD's FSR are pretty similar – to the point where we get the same framerate in Cyberpunk 2077 when switching between them on the new RTX 3090 Ti – but the image quality is so much better with Nvidia's tech.
The technology is definitely there for Intel, too, and it looks like XeSS is going to be just as important for PC gamers as the other upscaling technologies. But, it's also important to keep in mind that it took a while for Nvidia to get DLSS looking as good as it does now. I remember when the tech first became available in Battlefield 1 and Metro Exodus, and it has come a long way. It'd be fantastic if Intel is going to be able to avoid those growing pains, but there will probably be some issues along the way.
But because it is an Intel technology it will more than likely actually work, and you probably won't have to mess with it too much to get it going.
Although a bit later than initially expected, the US Department of Defense (DOD) has announced that it plans to award up to $9bn in cloud infrastructure contracts in December of this year.
Following the controversy surrounding its now-canceled Joint Enterprise Defense Infrastructure or JEDI contract, the Pentagon announced its new Joint Warfighter Cloud Capability (JWCC) initiative back in July of last year. Unlike with the previous JEDI contract, the US military will now rely on multiple cloud providers as opposed to just a single one.
Although Amazon, Microsoft, Oracle and other cloud providers competed to win the $10bn JEDI contract, in the end the Pentagon awarded it to Microsoft before deciding to cancel it altogether.
In a recent call with reporters, Pentagon CIO John Sherman explained that the US military's timeline was a bit too bold which is why it will now be wrapping up bidding for the JWCC initiative in December. When JWCC was first announced in July of last year, the Pentagon initially intended to award contracts in April 2022.
Unclassified, secret and top secret networks
Sherman provided further details on how the JWCC initiative will provide enterprise cloud capabilities for the DOD in a recent press briefing, saying:
“In terms of what it comprises, the JWCC, it is going to be a multi-cloud effort that will provide enterprise cloud capabilities for the Department of Defense at all three security classifications: unclassified, secret, and top secret. All the way from the continental United States here, out to the tactical edge.”
Once the contracts have been awarded, the Pentagon expects to immediately have access to its unclassified network. From here, secret networks will come online 60 days after contracts have been awarded while both top-secret and tactical edge networks will come online no later than 180 days following the awarding of contracts to cloud service providers.
Just like with the JEDI contract though, Amazon, Google, Microsoft and Oracle are all competing for government contracts once again and the Pentagon reached out to all four companies in November of last year according to Sherman. These new contracts will have a three-year base period and two-year option periods.
We'll likely hear more towards the end of the year when the DOD actually begins awarding contracts for its new JWCC initiative.
My OnePlus 9 Pro battery is so swollen it split open the case.
I discovered this potentially dangerous situation by accident when I noticed the one-year-old Android phone sitting imperfectly in its carbon-fiber case. Absentmindedly, I reached over and pushed one corner of the phone down, trying to reseat it. It popped back out. After a few attempts, I removed the case and discovered the truth: The battery had expanded and split the chrome case along one long-glued seam, creating, in one area, a quarter-inch chasm.
"Not again," I thought.
A few years ago, a Google Pixel 3 XL that I mostly kept on a Pixel charging stand by my bed appeared to jump off the charger of its own accord. It turned out that the battery had expanded so much that the case no longer sat flush with the charging base.
I eventually sent the phone back to Google and got a replacement; at least that phone was more than a few years old. But I reviewed the OnePlus 9 Pro just over a year ago. At the time, I called it "a gorgeous device," and "one of the best devices I've used in the Android space." Even with the split back, it still looks pretty good.
Does it still work? Sure. I powered it up and it launched, like Head Wound Harry, as if there wasn't any critical damage.
Still, I won't use it now or ever again.
As soon as I posted a short video of the split phone on Twitter, I got a fast flood of responses and at least one warning: "I'd, uh, turn it off."
This is concerning. OnePlus 9Pro 5G battery is now bursting the phone’s seams. Not the first time I’ve seen an Android phone do this pic.twitter.com/5HfdokTXG5 – March 28, 2022
I also found a community of people who have suffered similar battery calamities on a variety of Android phones and iPhones. I've owned and tested every iPhone since the iPhone 4 and never had a battery balloon or case split.
Still, the tales of puffy batteries traveled through numerous Samsung handsets, iPhones, MacBooks, and Pixels.
It's a big enough problem that there are FAQs and services devoted to it. I found a place called Bebat that tries to explain why cellphone lithium-ion batteries balloon and what to do about it. Yes, it's also selling a repair service.
No one, including Bebat, is certain why all these batteries occasionally puff up. It could be:
Overheating
Overuse
Too much charging
A defective part
It's almost like no one quite understands how these batteries work. Though we know they do - most of the time.
An expanding phone battery and concerns over what could happen next if I keep using or charging it (explosion and fire come to mind) take me back seven or so years to Samsung's Galaxy Note 7.
The Galaxy Note 7 was an Android industry darling right up until units began to explode and catch fire. The culprit was the lithium-ion battery.
Before you can understand what goes wrong with such a power source, it helps to understand how most lithium-ion batteries work. It's something we all got a crash course in back in 2016.
(Image credit: Future / Lance Ulanoff)
Like most batteries, there's a positive and negative side, usually made of two different conductive materials (say, aluminum and graphite). Since a battery creates power through chemistry and flow of charged ions, there's also a liquid (called an electrolyte), and a thin plastic layer made to separate the positive and negative sides. The cells charge when we plug the phone in.
Unlike the batteries in, say, your remote control, phone batteries can't be round and thick, nor can everything sit neatly in just two layers. Usually, smartphone batteries fold the layers over and over, sandwiching them to make the pack as thin and flat as possible while storing as much energy as possible.
As you can imagine, if all this isn't done perfectly, something can go wrong. In Samsung's case, it was a huge battery being squished into too small a space, deforming some layers, as well as a production issue in which a bad weld perforated these layers in some devices.
(Image credit: Future / Lance Ulanoff)
Samsung learned its lesson and instituted some rigorous battery oversight and testing for all future devices. There has not been a notable incident since.
Which is good. But why are our phone batteries still expanding?
Since this issue cuts across devices, it's clearly some intrinsic lithium-ion problem.
Even experts like Bebat don't offer any clear-cut idea of why or what companies could do to prevent this. They do know what you can't do though:
There is no point waiting for the battery to “shrink”. The ever growing pressure can cause damage to the entire device. Leave the battery in the device only if it is stuck. Never try to “solve” the swelling yourself by pricking a hole in it or with any other creative stunt. That is very dangerous: not only is the gas flammable, but also toxic.
In every support forum, the advice is the same. Stop using the device immediately and get it to a service center. One MacBook user claimed to me on Twitter that an Apple Store genius told him to give the laptop to them and "we can put it in back in the special safe in case it explodes."
I don't think I'm in any imminent danger here, but I have turned off the OnePlus 9 Pro and reached out to the company's representatives for comment. My more immediate concern is that, while hardly common, I'm not sure expanding, gas-filled batteries are uncommon.
Making these incidents public is the first step, I hope, in pushing phone manufacturers to be more transparent, to work on safer, less expansion-prone batteries, and to look for a less volatile power storage system than lithium-ion. It's a tall order, I know.
Now, where do I store this scary OnePlus 9 Pro?
We recently heard a leak about the new iPad for 2022 - that's the entry-level slate that Apple refreshes every year. This leak suggested that the device could get a big design update, possibly including the removal of the home button and a sizeable reduction in the bezel size, to bring it more in line with Apple's more modern tablets.
If this information is true, Apple is bringing its last family of tablets in line with its new design, which it's slowly been rolling out to its different iPads for the last few years.
This design matches the specs of these tablets, and suits them well for businesspeople or creatives looking for a sleek device. But Apple seems to be forgetting something - it has a much wider audience than that.
A history of iPads
I first bought the iPad 9.7 in 2017 - I was a student at the time, and needed a device that I could take to campus and work on, that also wasn't as chunky or fragile as a low-end laptop.
For just £300 (around $395 / AU$525 - I got it for a discount, I don't remember why, but also bought a case that offset the saving), I had an incredibly portable device that served me well for years after.
We gave the 2017 iPad full marks in our review (Image credit: Future)
I used the tablet for university essays, creative writing (I minored in creative writing - yeah, I really wasted my youth), screenplays, watching movies at home, listening to music out loud and Duolingo too, which really took over one peculiar year of my life.
When my smartphone broke, I didn't buy a new one - I just used my old flip phone for calls and texts, and relied on the iPad for all social media.
I could use my iPad in the university library, in the media room for the student newspaper I was an editor at, in various cafes and pubs around campus, at home on my desk or bed, at my partner's house, even in the bath. It was a perfect utility device.
The main reason I loved it was its portability - I studied in a fairly small city that you could traverse without public transport, so I spent an hour or more every day on bikes and walking, and wasn't burdened by a massive laptop.
The iPad sat within my budget and fit my needs, and I couldn't find a laptop that did the same. And I'm not alone.
The iPad's audience
The entry-level iPad has remained a tablet designed for people exactly like I was - not particularly fussed about tech, who just want a useful, portable and inexpensive tablet to use.
Students don't need the iPad Pro (Image credit: Future)
I know plenty of people who don't care about tech but use their iPads all the time - seniors who find phones too small, musicians who need a big screen to see sheet music, reading fans who don't want a Kindle, the list goes on and on.
These people don't need the fastest, flashiest processor, or a super top-end screen, and also don't want to spend loads of money on a fancy tablet when they're only going to use 10% of the features.
And that's the thing Apple doesn't seem to realize.
The slow slipping of the iPad
iPads were great products for normal people, like I was as a student, but Apple's new tablets aren't catering to that kind of audience.
Admittedly the iPad Pro line was never designed for the everyday user - Pro is short for 'professional' after all - but the entry-level and Air lines used to be.
The iPad Air was once a great option for people who wanted something like the entry-level tablet, but with a larger display - well, that was the state of things by the iPad Air 3.
The fourth-gen option changed up the design quite a bit, eschewing the classic iPad design for iPad-Pro-inspired sleekness, and the fifth-gen option brought 5G connectivity and a super-powerful chipset.
The new device isn't one that any average buyer should consider. Not only is it more powerful than anyone needs (including professionals), but it's a lot pricier than the iPad Air 3, so people with limited budgets have been forgotten.
(Image credit: Future)
I'm worried that the standard iPad range is going to go the same way. Previously, these tablets have used the same processors as the iPhones, which makes them powerful enough as it is, without getting the unnecessarily powerful - and expensive - Apple M1.
This might seem outlandish - but the iPad Air getting the M1 also sounded unlikely, until it happened - and thanks to the high price tag, it's arguably a less tempting tablet compared to its predecessor as a result.
Students don't need the M1. Seniors don't need the M1. Musicians and readers and teachers and children don't need the M1. What all those people do need is affordable tech.
There might be some people who really want an M1 chipset in a tablet - but the only apps that will benefit from that are a select few work or creativity apps, so these people will be professionals. If only there were an iPad for them - a Pro iPad, say...
I wouldn't buy any of these
If I was a student now, in 2022, I wouldn't buy the iPad Pro or Air or even Mini - the iPad 10.2 from 2021 would be my only option. And if the new iPad for 2022 does bring some of the unnecessary improvements I'm expecting, and costs more as a result, it'll be ruled out too.
There have been several years of on-and-off recession where I live, and in many places around the world. There's currently a cost of living crisis going on, so buying tech isn't really a priority to many people.
That is to say, Apple should be making its tech more affordable, not bumping up the specs for no good reason. Sure, its new devices might be more tempting to current or aspiring creatives or professionals, but for the legion of Apple buyers who need a familiar and reliable device, there's nothing out there worth buying anymore.
For those unfamiliar, the DMA aims to rein in big tech platforms in Europe so that smaller companies can better compete with Meta, Google, Microsoft and others.
As part of the new bill, large tech companies with a market capitalization of over €75bn and a user base of more than 45m in the EU would be required to create products that are interoperable with smaller platforms. While this will likely be fine for online collaboration tools and office software, there are a number of security risks for messaging services like WhatsApp that include end-to-end encryption as part of their offerings.
The EU hopes that the DMA will help smaller competitors by breaking open some of the services provided by large tech giants that are considered gatekeepers due to the size of their customer base as well as their revenue. As a result, iPhone users could potentially be able to install third-party apps outside of the App Store, outside sellers may soon rank higher on Amazon's ecommerce platform and messaging apps would be required to allow users to send messages across multiple protocols, according to a new report from The Verge.
End-to-end encryption concerns
The DMA poses a serious problem for secure messaging services that include end-to-end encryption as part of their offerings.
Cryptographers agree that it will be difficult or even impossible to maintain encryption between apps, which could put users at risk of having their messages and data exposed. While Signal is small enough that it likely won't be affected by the EU's new legislation, WhatsApp, which uses the Signal protocol, will likely need to change how its platform works.
As cryptographic standards need to be precisely implemented, security experts that spoke with The Verge warned that there is no easy way for secure messaging apps to provide both security and interoperability to their users. Essentially, different forms of encryption with different design features can't easily be fused together to comply with the DMA.
Internet security researcher and Columbia University computer science professor Steven Bellovin provided further insight on the matter in a statement to The Verge, saying:
“Trying to reconcile two different cryptographic architectures simply can’t be done; one side or the other will have to make major changes. A design that works only when both parties are online will look very different than one that works with stored messages .... How do you make those two systems interoperate?”
As it stands now, every messaging service is responsible for its own security but by making them interoperable, users of one service could be exposed to vulnerabilities that may exist in another messaging platform.
Thankfully, there's still time for either the EU to reverse course or for secure messaging app providers to devise a way to make their services interoperable with smaller competitors, as the Digital Markets Act won't be implemented before next year.
The transition to hybrid work is leading businesses to purchase new equipment for both their remote workers and offices which is why HP has announced that it has entered a definitive agreement to acquire Poly.
If approved, the all-cash transaction at $40 per share would see HP acquire Poly for $1.7bn though the deal itself is valued at $3.3bn as it also includes Poly's net debt.
While employers and employees alike upgraded their devices and peripherals when working from home during the pandemic, the rise of hybrid work is creating sustained demand for technology that enables seamless collaboration across home and office environments. At the same time, traditional office spaces are being upgraded to better support hybrid work and collaboration with a focus on meeting room solutions.
According to a recent study from Frost & Sullivan, of the more than 90m meeting rooms today, less than ten percent have video capability and HP aims to capitalize on this through its acquisition of Poly.
HP president and CEO Enrique Lores provided further insight in a press release on why HP wants to add Poly's products and solutions to its portfolio of printers and business laptops, saying:
“The rise of the hybrid office creates a once-in-a-generation opportunity to redefine the way work gets done. Combining HP and Poly creates a leading portfolio of hybrid work solutions across large and growing markets. Poly’s strong technology, complementary go-to-market, and talented team will help to drive long-term profitable growth as we continue building a stronger HP.”
The perfect match?
In a more hybrid world, cloud-based platforms like Zoom and Microsoft Teams will play an important role in the future of work as they enable workers across industries to collaborate from anywhere.
Zoom CEO Eric Yuan praised the deal saying that high quality audio and video has become an “essential component of work across every industry”. He also noted that the combination of Poly and HP's offerings will unlock new opportunities for HP to partner with Zoom to “turn any space into a hub for dynamic video collaboration”.
Through its acquisition of Poly, HP aims to drive growth and scale its peripherals and workforce solutions businesses which represent a $110bn and $120bn segment opportunity respectively. With Poly's devices, software and services combined with HP's strengths across compute, device management and security, the deal will create a robust portfolio of hybrid meeting solutions.
HP expects the transaction to close by the end of this year once it receives Poly stockholder and regulatory approval.
We'll likely hear more on HP's plans to enter the video conferencing hardware market once its acquisition of Poly is complete.
Windows 11 gamers could get some really beefy benefits from DirectStorage tech, which was recently announced to have arrived on Microsoft’s newest OS – but it’ll be some time yet before developers incorporate it into games.
However, there’s been an eye-opening revelation concerning exactly how much difference this will make when it comes to relieving the pressure on the PC’s processor.
As TweakTown reports, Cooper Partin, a senior software engineer at Microsoft, explained that the DirectStorage implementation for PC is specifically designed for Windows.
Partin noted: “DirectStorage is designed for modern gaming systems. It handles smaller reads more efficiently, and you can batch multiple requests together. When fully integrated with your title, DirectStorage, with an NVMe SSD on Windows 11, reduces the CPU overhead in a game by 20-40%.
“This is attributed to the advancements made in the file IO stack on Windows 11 and the improvements on that platform in general.”
Analysis: Freed-up CPU resources will make a major difference elsewhere
A 40% reduction is a huge difference in terms of lightening the load on the CPU, although that is a best-case scenario – but even 20% is a big step forward for freeing up processor resources.
Those resources can then be used elsewhere to help big open-world games run more smoothly – as we’ve seen before, DirectStorage isn’t simply about making games load more quickly. There’s much more to it than that, and now we’re getting some exciting glimpses of exactly how much difference this Microsoft tech could make to PC games.
Of course, while the public SDK (software development kit) has been released, it’s still up to game developers to bake in this tech when they’re coding, and it’ll be quite some time before we see DirectStorage appearing in many games.
The first game which uses DirectStorage is Forspoken, and we got a glimpse of that at GDC, where it was shown to load up in a single second. Forspoken is scheduled to arrive in October 2022.
On this edition of the TechRadar Pro Tech Talk, AVNation’s Tim Albright chats with Martin Fishman, Director of Strategic Alliances and Enterprise at Joan, about the return to offices for employees, and how the Joan 6 Pro display device can make the transition smoother for SMB and enterprise clients.
About Joan 6 Pro
A next-generation, ultra-low-power, touchscreen ePaper device designed for advanced meeting room management in enterprises. Mount it wirelessly with the Smart magnet or use a fully wired PoE dock.
Imagine if your iPhone setup started by asking if you want Siri, Alexa, or Google Assistant.
It could happen. Maybe.
The European Union's Digital Markets Act (DMA), a set of rules targeting so-called digital gatekeepers like Amazon, Microsoft, Facebook, and Apple, could have wide-reaching implications for everything from search and browsers to messaging services across multiple platforms.
Now it also seems to target your favorite digital assistants.
Policymakers in the EU have been working on the DMA for almost two years with the goal of "ensuring fair and open markets." It targets companies and digital services that have huge userbases (45 million monthly active users) and significant revenues (€75bn). If enacted in October of this year, it would enforce:
Interoperability between these large companies and third-party providers
Customer access to the data generated in these companies' services
The ability for advertisers to use their own tools to measure ad performance on these large companies' platforms (think Google AdSense)
The opening of proprietary messaging services to interoperation with third-party messaging services
The opening of the platforms to promotions and use of third-party transaction services
This week, however, the EU committees met again to further negotiate the DMA, and hidden among the laundry list of new requirements is this:
"...a requirement to allow users to freely choose their browser, virtual assistants or search engines."
Browser choice and search engine selection are already a given on Apple's iOS, macOS and iPadOS, as they are on Microsoft's Windows and in browsers like Microsoft Edge and Google Chrome.
What no one really asks us about – and what is often baked into the hardware – is our virtual assistant.
Every iPhone arrives with Siri built-in at a system level. Siri is the assistant that responds when you hold the power sleep button. It's the digital voice that responds when you say, "Hey Siri." In Google devices, Google Assistant is the default. More crucially, there isn't an Echo device sold with the option to switch from Alexa to another assistant.
As written, this DMA opens Apple, Amazon, and, possibly, Google up to violations and fines if they do not allow consumers to choose between, say, Siri and Alexa or Google Assistant and even Samsung's Bixby. Individual violations for any of the DMA rules would result in fines equal to 10% of overall worldwide revenue and could grow to 20% of revenue for repeat violations.
It's worth noting that the EU Commission, which has oversight over the European Union and not the worldwide operations of Apple, Facebook, Microsoft, and other qualifying companies, is going after each firm's global earnings.
It further raises the question of the actual teeth of these potential rules.
Though developed with the aim of protecting small businesses and tinier companies, and especially the interests of European companies and customers, there is a chance that rules enacted in the EU could affect customers around the world.
Google and Apple released statements that, while appearing to support the sentiment of the DMA, both expressed concern about how the rules could impact innovation, choice, privacy, and security.
An Apple spokesperson shared this statement with TechRadar on the latest provisions:
"Apple has always been committed to creating the best, most innovative products for our customers, and to ensuring that their privacy and security are always protected. We remain concerned that some provisions of the DMA will create unnecessary privacy and security vulnerabilities for our users while others will prohibit us from charging for intellectual property in which we invest a great deal. We believe deeply in competition and in creating thriving competitive markets around the world, and we will continue to work with stakeholders throughout Europe in the hopes of mitigating these vulnerabilities.”
The company is not, for now, talking about virtual assistants.
The voice assistant question
For Google, Apple, and Amazon, Google Assistant, Siri, and Alexa represent important connective tissue across their ecosystems of connected devices and services.
Apple lets Siri talk through and control iPads, iPhones, Macs, and HomePods, as well as services like Apple Music and Maps. Google Assistant threads throughout almost all of Google's Knowledgebase-driven systems. Alexa is an interesting case because, perhaps even more than Google Assistant, it weaves through all Amazon hardware and services and, as an open service, lives across countless third-party hardware. Plus, unlike Apple's Siri, it allows you to embed third-party service control. Maybe the EU will look more kindly on Alexa.
If, in October, the EU is able to enact the full scope of the DMA, its impact will be felt well beyond the EU's borders. It's likely Congress and the White House, which have been thinking through and discussing regulation and big-tech breakups for years, could use the DMA as a quick-fix stop-gap template until they can come up with their own rules (I won't hold my breath).
If all that comes to pass, it will be a reckoning day for Apple, Google, Amazon, Facebook, and other big tech companies that meet the threshold. It could also be a sea-change for your favorite virtual assistant.
"Hey Siri, are you worried?"
Google Cloud customers will now be able to suspend their virtual machines (VMs) when not in use which will help lower their cloud spending.
The software giant's cloud computing division has announced that its new Suspend/Resume feature for VMs is now generally available after launching in alpha several years ago.
The new feature works in a similar way to closing the lid of your laptop or putting your desktop PC to sleep. By suspending a Google Compute Engine VM, the state of your instance will be saved to disk so that you can pick up later right where you left off.
The best part about Suspend/Resume in Google Cloud though is that customers will no longer need to pay for cores or RAM when their VMs are in a suspended state. However, they will still need to pay the cloud storage costs of their instance memory as well as other VM running costs like OS licensing but these may be reduced.
Suspending and resuming VMs
When a Google Cloud customer suspends an instance, an ACPI S3 signal is sent to the instance's operating system just like when you close a laptop's lid or put a PC to sleep.
The company makes the case that using this type of signal allows for broad compatibility with a wide selection of OS images so that customers don't have to use a cloud specific OS image or install daemons. At the same time, undocumented and custom OS images that respond to the ACPI S3 signal may also work with Google Cloud's Suspend/Resume feature.
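To make that workflow concrete, here's a minimal sketch using the google-cloud-compute Python client. The project, zone and instance names are placeholders, and the exact method names are an assumption based on the underlying Compute Engine API rather than anything Google showed in its announcement.

```python
# Sketch only: suspending and resuming a Compute Engine VM.
# Project, zone and instance names below are hypothetical placeholders,
# and the client methods are assumed from the Compute Engine API surface.
from google.cloud import compute_v1

PROJECT = "my-project"       # hypothetical project ID
ZONE = "us-central1-a"       # hypothetical zone
INSTANCE = "demo-instance"   # hypothetical VM name

instances = compute_v1.InstancesClient()

# Suspend: Compute Engine sends an ACPI S3 signal to the guest OS and
# saves the instance state to storage, as described above.
operation = instances.suspend(project=PROJECT, zone=ZONE, instance=INSTANCE)
operation.result()  # recent client versions return an operation you can wait on

# Resume: the VM picks back up where it left off, with memory intact.
operation = instances.resume(project=PROJECT, zone=ZONE, instance=INSTANCE)
operation.result()
```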
It's also worth noting that storage is dynamically provisioned when Suspend is requested and is separate from the instance's boot disk. Other cloud providers require users to ensure that they have sufficient space in their boot disk to save instance states which may increase the costs of running VMs.
In a new blog post announcing the general availability of Suspend/Resume, Google Cloud also pointed out that the feature can be used by organizations to deal with demand spikes as they can initialize instances with their critical applications and then suspend them so that they can be resumed later. Although Compute Engine instances can be created quickly, resuming an instance is much faster than creating an entirely new instance.
Internet technology pioneer Stephen Wilhite passed away on March 14 from COVID-related complications. He was 74 and he leaves behind an incredible legacy, the GIF, a game-changer for the blossoming World Wide Web in the 1990s.
Wilhite, who was interested in compression technology, created the GIF at his home in 1987. “I saw the format in my head and then started programming,” he told the New York Times. He then brought the technology to his job at CompuServe, the first major Internet service provider in the US, where he made finalizing tweaks.
In addition to his passion for technology, post-retirement Wilhite was an avid outdoorsman and enjoyed building model trains in his basement.
GIFs made the Internet
The GIF, or Graphics Interchange Format, has been a massive component of the Internet since its inception. TechRadar’s US Editor-in-Chief, Lance Ulanoff, noted in 2016 that “for webmasters in the 1990’s, GIFs were as crucial to the site-building process as HTML….if HTML was the skeleton of our websites, then GIFs were the skin and blood.”
Per his obituary, Wilhite received a Webby Lifetime Achievement Award in New York in 2013 for his invention, during which he reiterated the correct pronunciation of GIF via, naturally, a GIF that simply stated: “it’s pronounced ‘JIF’, not ‘GIF’”. The crowd roared with excitement in response as Wilhite walked wordlessly off the stage.
In further response to the never-ending debate about the pronunciation, a somewhat annoyed Wilhite told the New York Times “The Oxford Dictionary accepts both pronunciations. They are wrong. It is a soft ‘G,’ pronounced ‘JIF’. End of story.”
While today you may associate GIFs with short animations that you see in memes or send in group chats to your friends, the early days of the format were much smaller in scale, usually consisting of just a few low-resolution frames at a time or even single-pixel spacers to help prop up complicated HTML designs.
From humble beginnings to humor
Given that GIFs were the building blocks of the early Internet, it’s almost surprising to observe how they're used today for comedy purposes, displaying fan-made excerpts of classic shows like Friends, or even, in some cases, severely compressed yet full episodes of Spongebob.
GIFs like those blew up on Tumblr and Reddit in the sites' early days, despite being regarded as jokes; “No serious web developer or artist would use GIFs,” noted Lance.
These days, we all look upon GIFs endearingly; they have withstood the test of time and prospered despite new technologies sprouting up around them. Platforms like Giphy were created by people who love the format, and others like Tenor have followed in its wake.
GIFs are a critical component of the Internet that will likely never be replaced, so may Wilhite rest in peace knowing that the Internet will always celebrate his invention.
While the correct pronunciation may be dismaying for some, a commenter on The GIF Pronunciation Page says that the pronunciation may be an homage to peanut butter being “one of the principal three programmer foods”, with the other two being Doritos and Pepsi.
If you were looking at the teaser image for The Witcher 4 and thinking, “That’s not like any Witcher medallion I know,” then you were right on the money - publisher CD Projekt has revealed that it has created a whole new Witcher school for the new game.
We still know very little about The Witcher 4 beyond the fact that it’s in active development. CD Projekt put out a short announcement earlier this week confirming the open-world RPG was in the works and that the team had swapped from its own REDengine to Unreal Engine 5. Beyond that, all we have to go on is the teaser image – a Witcher medallion half-covered in snow.
(Image credit: CD Projekt Red)
While the medallion looks similar to the one we’ve come to know across The Witcher trilogy, that of the School of the Wolf, it has a narrower face with pointier ears.
Fans weren’t sure if they were looking at a redesigned School of the Wolf medallion, one belonging to another canon Witcher school, or an entirely new medallion.
Our own Jessica Weatherbed called it, saying that not only was it a new medallion, but that it would be an entirely new school that didn’t feature in the games, books, or television series.
Yea, i'm pretty convinced this is a new Witcher school. Possibly a prequel set in the Second age of Witchers. Not book canon of course, but then, non of the games are either. pic.twitter.com/eC3gt2Em2t – March 21, 2022
"I can confirm that the medallion is, in fact, shaped after a lynx," global communication director Robert Malinowski told Eurogamer.
However, while the School of the Lynx doesn’t appear in official Witcher fiction, there are stories about the school elsewhere on the internet – on a Witcher fan fiction wiki.
On the fan wiki, the School of the Lynx forms after the School of the Wolf dissolves. Founded by the surviving members – including Geralt – the school eventually departs its homeland and explores the Western continent. However, remember that this is all fan fiction and unlikely to be similar to the story that CD Projekt is working up for its new RPG. For a start, Geralt may not even feature in the new game. Indeed, the developer has talked about having a different protagonist for future games.
Also, we have no guarantees that the new Witcher game is set after the events of the original Witcher trilogy. To avoid the difficulties of picking which possible ending to the games is canon, CD Projekt may make the new game a prequel to the old trilogy.
While we are still likely years away from playing The Witcher 4, we now know a little more about the world that CD Projekt is developing.
YouTube is bringing two new features to provide more context on the health content on the platform. The company has unveiled health source information panels and content shelves in India to assist users in finding videos from authoritative sources.
In an effort to further support businesses transitioning to hybrid work, Dynabook has refreshed its Tecra line of business laptops with the latest Intel processors and Windows 11 Pro.
The new 14-inch Tecra A40-K is a performance-rich laptop that measures just 18.9mm thin and is well-suited for today's work-from-anywhere professionals. The updated device sports a 14-inch narrow-bezel display, webcam with privacy shutter, backlit keyboard and a large ClickPad though it can also be outfitted with an optional fingerprint reader.
The 15-inch Tecra A50-K meanwhile is designed with productivity in mind and features a 19.9mm thin chassis which houses a 15.6-inch thin-bezel display, a full-size backlit keyboard, a large ClickPad with optional fingerprint reader and a webcam with a privacy shutter.
Dynabook Americas general manager James Robbins provided further insight on how the company's new Tecra A40-K and Tecra A50-K laptops were specifically designed with professionals in mind, saying:
“With more than 35 years of experience creating powerful, feature rich laptops for businesses, our engineers take the time to truly understand the computing pain points faced by professionals, as well as what they need and want in their laptops. While productivity and portability remain top priorities, we’ve paid extra attention to making our Tecra laptops feel premium while maximizing their performance, durability and available features without compromising price competitiveness.”
Refreshed Tecra A40-K and Tecra A50-K
Dynabook's new Tecra laptops are also getting a major performance boost as they are configurable with the new hybrid architecture found in 12th Gen Intel Core P-Series i5 and i7 processors, optional Intel Iris Xe graphics and up to 64GB of memory.
When it comes to connectivity and ports, both the Tecra A40-K and Tecra A50-K feature Wi-Fi 6E and Thunderbolt 4 as well as full-size HDMI, Gigabit LAN, 3.5mm audio jacks, USB-A ports and a microSD card reader.
Dynabook's new Tecra laptops also include AI tools to help increase both productivity and collaboration, with Cortana-enabled dual mics providing a new AI noise-reduction function, while an HD camera with face authentication comes with its own AI-enhanced functionality. When combined with stereo speakers with DTS audio processing, the Tecra A40-K and Tecra A50-K are ideal for video conferencing. Both Tecra models also meet Microsoft's strict Secured-core PC requirements, while Dynabook's proprietary BIOS provides another extensive security layer to mitigate BIOS-level security threats.
Although Dynabook will offer multiple configurations of both laptops on its website and from its network of resellers, the Tecra A40-K will start at $1,019.99 (around £769.07) and the Tecra A50-K starts at $969.99 (around £731.37).
After allegedly gaining access to Microsoft's Azure DevOps source code repositories over the weekend, the South American-based data extortion hacking group Lapsus$ has now made some of the company's internal files available online.
In a recent post on Telegram, the group shared a screenshot of Microsoft's Azure DevOps account to show that they had hacked one of the company's servers which contained the source code for Bing, Cortana and a number of other internal projects.
Now though, Lapsus$ has made the source code for over 250 Microsoft projects available online in a 9GB torrent. According to the group, the torrent itself contains 90 percent of the source code for Bing and 45 percent of the source code for both Bing Maps and Cortana.
While Lapsus$ says that they only leaked some of Microsoft's source code, security researchers that spoke with BleepingComputer say that the uncompressed archive actually contains 37GB of projects. After examining the contents of the torrent more closely, the security researchers are confident that the leaked files are legitimate internal source code from the company.
Paying for access
In addition to internal source code, some of the leaked projects contain emails and other documentation that was used internally by Microsoft engineers working on mobile apps. The projects themselves all appear to be related to web-based infrastructure, websites or mobile apps and at this time, it seems that Lapsus$ did not steal any source code for Microsoft's desktop software such as Windows 11, Windows Server and Microsoft Office.
Microsoft may be the latest victim but over the past few months, the Lapsus$ group has made a name for itself by successfully attacking Nvidia, Samsung, Vodafone, Ubisoft and Mercado Libre.
While it's still unknown as to how the group has managed to target the source code repositories of so many big companies in such a short time, some security researchers believe Lapsus$ is paying corporate insiders for access. In fact, in a previous post on its fast-growing Telegram channel, the group said that it actively recruits employees and insiders at telecoms, large software and gaming companies, call centers and dedicated server hosting providers.
Besides recruitment, Lapsus$ also uses its Telegram channel to announce new leaks and attacks as well as for self-promotion. The group has already amassed close to 40k subscribers on the platform which it even uses to chat with its fans.
Now that the Lapsus$ group has gained a great deal of notoriety online, expect law enforcement agencies and even large companies like Microsoft to begin taking action to disrupt its activities before it strikes again.
LG has unveiled the price and tentative release dates for its new 2022 OLED TV models that include the new LG C2 OLED, LG G2 OLED, LG B2 OLED and Z2 OLED.
Pricing, as shown in a press release sent to TechRadar, shows that the LG C2 OLED will start off at $1,399 (around £1,060, AU$1,900) for the smallest 42-inch C2 OLED and rockets up to $24,999 (around £19,000, AU$34,800) if you’re after the monstrous 88-inch 8K Z2 OLED.
In terms of release dates, LG’s spread them out quite a bit, with most models arriving in March and a few slated for release in April and May.
Here’s the pricing and release date information shared with TechRadar:
LG B2 OLED pricing and release dates
55-inch OLED55B2PUA is available March 2022 for $1,499
65-inch OLED65B2PUA is available March 2022 for $1,999
77-inch OLED77B2PUA is available March 2022 for $3,299
LG C2 OLED pricing and release dates
42-inch OLED42C2PUA is available May 2022 for $1,399
48-inch OLED48C2PUA is available March 2022 for $1,499
55-inch OLED55C2PUA is available March 2022 for $1,799
65-inch OLED65C2PUA is available March 2022 for $2,499
77-inch OLED77C2PUA is available March 2022 for $3,499
83-inch OLED83C2PUA is available April 2022 for $5,499
LG G2 OLED pricing and release dates
55-inch OLED55G2PUA is available April 2022 for $2,199
65-inch OLED65G2PUA is available March 2022 for $2,999
77-inch OLED77G2PUA is available March 2022 for $3,999
83-inch OLED83G2PUA is available April 2022 for $6,499
LG Z2 OLED pricing and release dates
77-inch OLED77Z2PUA is available April 2022 for $12,999
88-inch OLED88Z2PUA is available April 2022 for $24,999
(Note that the pricing above is for the US only, but we're expecting UK and AU pricing sometime in the next few weeks - stay tuned!)
Analysis: QD-OLED has its advantages... and a price markup
Without having seen Samsung's QD-OLED TV for ourselves yet (something we're expecting will change in the next few weeks), it's hard to say how it will stack up against LG's current crop of OLED TVs.
That said, based on price alone, the Samsung QD-OLED will have to be significantly better to justify its higher price tag.
To wit, pricing information sent to TechRadar last week shows that the S95B OLED TV will cost $2,199.99 for the 55-inch version and $2,999.99 for the 65-inch model. Considering that the LG C2 OLED costs just $1,799 for the same size and uses LG's OLED Evo technology, that $400 markup could be a big factor in determining which of the two models folks choose.
If you're looking to save even more, though, the LG B2 OLED in a 55-inch is only $1,499 and while it doesn't use LG OLED Evo technology for a 20% boost in brightness, it's still using a native 120Hz panel with four HDMI 2.1 inputs.
For gamers especially, it's going to be a tough year to pick out a new TV.