
Microsoft Drops Azure Egress Fees

By: msmash
March 14, 2024 at 18:40
Microsoft has eliminated egress fees for customers removing data from its Azure cloud, joining Amazon Web Services and Google in this move. The decision comes as the European Data Act's provisions targeting lock-in terms are set to take effect in 2025. Microsoft adds: Azure already offers the first 100GB/month of egressed data for free to all customers in all Azure regions around the world. If you need to egress more than 100GB/month, please follow these steps to claim your credit. Contact Azure Support for details on how to start the data transfer-out process. Please comply with the instructions to be eligible for the credit. Azure Support will apply the credit when the data transfer process is complete and all Azure subscriptions associated to the account have been canceled. The exemption on data transfer out to the internet fees also aligns with the European Data Act and is accessible to all Azure customers globally and from any Azure region.

Read more of this story at Slashdot.

Broadcom Is 'Holding the Sector To Ransom' With VMware License Changes, Claims CISPE

By: BeauHD
March 21, 2024 at 22:00
couchslug shares a report from ITPro: A European cloud trade body has called for an investigation into Broadcom amid concerns over changes it has made to VMware licensing structures. The Cloud Infrastructure Service Providers in Europe (CISPE) consortium called on regulatory and legislative bodies across Europe to investigate the changes Broadcom has made to the VMware operating model, which it says will "decimate" the region's cloud infrastructure. "CISPE calls upon regulators, legislators and courts across Europe to swiftly scrutinize the actions of Broadcom in unilaterally canceling license terms for essential virtualization software," the trade body said in a statement. Since acquiring VMware in November 2023, Broadcom has embarked on a comprehensive overhaul of software licensing at the firm, which has drawn widespread criticism from customers. Broadcom stated it would continue to support customers under a perpetual licensing agreement for the period defined in the contract, but following this customers would need to exchange any remaining licenses for subscription-based products. This has left both cloud service vendors and customers in limbo, according to CISPE, without any solid information on how, when, or if they will be able to license VMware products essential for their operations from April 2024. Moreover, even if they are able to relicense the VMware software, a number of customers reported dramatic price hikes of as much as 12 times. CISPE's characterisation of the move was far less charitable, arguing Broadcom is using VMware's market dominance, controlling almost 45% of the virtualization market, to charge exorbitant rents from cloud providers. Several CISPE members admitted that without the ability to license VMware products they will be unable to operate and will go bankrupt, with some stating that over 75% of their revenue depends on VMware virtualization tech. Members added that they often received termination notices late, if at all, with short notice periods that spanned just a few weeks. In addition, CISPE complained about the decision to remove hundreds of products without any notice, and re-bundle the outstanding products under new prohibitive contract terms, despite there being no changes to the products themselves. Francisco Mingorance, secretary general of CISPE, said the changes will hurt both European customers and cloud service providers by increasing costs and reducing choice. "At a time when our members are moving to support the requirements for switching and portability between cloud services outlined in the Data Act, Broadcom is holding the sector to ransom by leveraging VMware's dominance of the virtualization sector to enforce unfair license terms and extract unfair rents from European cloud customers," Mingorance said. CISPE noted that for some cloud sector applications that require certifications by software or service providers, VMware products are the only viable option. As such, the association called for Broadcom to be recognized as a designated gatekeeper under the terms of the Digital Markets Act (DMA) that came into force on March 7, 2024. Mingorance argued Broadcom's moves will only further restrict an already limited set of options for cloud providers in Europe, warning that Broadcom has a dangerous degree of control over the region's digital ecosystems.
"As well as inflicting financial damage on the European digital economy, these actions will decimate Europe's independent cloud infrastructure sector and further reduce the diversity of choice for customers," he explained. "Dominant software providers, in any sector from productivity software to virtualization, must not be allowed to wield life or death power over Europe's digital ecosystems."

Read more of this story at Slashdot.

Amazon Bets $150 Billion on Data Centers Required for AI Boom

By: msmash
March 28, 2024 at 14:41
Amazon plans to spend almost $150 billion in the coming 15 years on data centers, giving the cloud-computing giant the firepower to handle an expected explosion in demand for artificial intelligence applications and other digital services. From a report: The spending spree is a show of force as the company looks to maintain its grip on the cloud services market, where it holds about twice the share of No. 2 player Microsoft. Sales growth at Amazon Web Services slowed to a record low last year as business customers cut costs and delayed modernization projects. Now spending is starting to pick up again, and Amazon is keen to secure land and electricity for its power-hungry facilities. "We're expanding capacity quite significantly," said Kevin Miller, an AWS vice president who oversees the company's data centers. "I think that just gives us the ability to get closer to customers." Over the past two years, according to a Bloomberg tally, Amazon has committed to spending $148 billion to build and operate data centers around the world. The company plans to expand existing server farm hubs in northern Virginia and Oregon as well as push into new precincts, including Mississippi, Saudi Arabia and Malaysia.

Read more of this story at Slashdot.

Cloud Server Host Vultr Rips User Data Ownership Clause From ToS After Web Outage

By: BeauHD
March 29, 2024 at 00:20
Tobias Mann reports via The Register: Cloud server provider Vultr has rapidly revised its terms-of-service after netizens raised the alarm over broad clauses that demanded the "perpetual, irrevocable, royalty-free" rights to customer "content." The red tape was updated in January, as captured by the Internet Archive, and this month users were asked to agree to the changes by a pop-up that appeared when using their web-based Vultr control panel. That prompted folks to look through the terms, and there they found clauses granting the US outfit a "worldwide license ... to use, reproduce, process, adapt ... modify, prepare derivative works, publish, transmit, and distribute" user content. It turned out these demands have been in place since before the January update; customers have only just noticed them now. Given Vultr hosts servers and storage in the cloud for its subscribers, some feared the biz was giving itself way too much ownership over their stuff, all in this age of AI training data being put up for sale by platforms. In response to online outcry, largely stemming from Reddit, Vultr in the past few hours rewrote its ToS to delete those asserted content rights. CEO J.J. Kardwell told The Register earlier today it's a case of standard legal boilerplate being taken out of context. The clauses were supposed to apply to customer forum posts, rather than private server content, and while, yes, the terms make more sense with that in mind, one might argue the legalese was overly broad in any case. "We do not use user data," Kardwell stressed to us. "We never have, and we never will. We take privacy and security very seriously. It's at the core of what we do globally." [...] According to Kardwell, the content clauses are entirely separate to user data deployed in its cloud, and are more aimed at one's use of the Vultr website, emphasizing the last line of the relevant fine print: "... for purposes of providing the services to you." He also pointed out that the wording has been that way for some time, and added the prompt asking users to agree to an updated ToS was actually spurred by unrelated Microsoft licensing changes. In light of the controversy, Vultr vowed to remove the above section to "simplify and further clarify" its ToS, and has indeed done so. In a separate statement, the biz told The Register the removal will be followed by a full review and update to its terms of service. "It's clearly causing confusion for some portion of users. We recognize that the average user doesn't have a law degree," Kardwell added. "We're very focused on being responsive to the community and the concerns people have and we believe the strongest thing we can do to demonstrate that there is no bad intent here is to remove it."

Read more of this story at Slashdot.

Don't rely on Google Drive alone to back up your files

March 31, 2024 at 08:01


World Backup Day falls on March 31. It's a good opportunity to check whether you are following best practices, such as not relying solely on the cloud to keep your files and data.

Irish Power Crunch Could Be Prompting AWS To Ration Compute Resources

By: msmash
April 12, 2024 at 17:40
Datacenter power issues in Ireland may be coming to a head amid reports from customers that Amazon is restricting resources users can spin up in that nation, even directing them to other AWS regions across Europe instead. From a report: Energy consumed by datacenters is a growing concern, especially in places such as Ireland where there are clusters of facilities around Dublin that already account for a significant share of the country's energy supply. This may be leading to restrictions on how much infrastructure can be used, given the power requirements. AWS users have informed The Register that there are sometimes limits on the resources that they can access in its Ireland bit barn, home to Amazon's eu-west-1 region, especially with power-hungry instances that make use of GPUs to accelerate workloads such as AI. "You cannot spin up GPU nodes in AWS Dublin as those locations are maxed out power-wise. There is reserved capacity for EC2 just in case," one source told us. "If you have a problem with that, AWS Europe will point you at spare capacity in Sweden and other parts of the EU." We asked AWS about these issues, but when it finally responded the company was somewhat evasive. "Ireland remains core to our global infrastructure strategy, and we will continue to work with customers to understand their needs, and help them to scale and grow their business," a spokesperson told us. Ireland's power grid operator, EirGrid, was likewise less than direct when we asked if they were limiting the amount of power datacenters could consume.
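As a minimal, hypothetical sketch of the workaround described above (fall back to another EU region when Dublin has no GPU capacity), the following tries eu-west-1 first and retries elsewhere when EC2 reports it is out of capacity; the instance type and AMI IDs are placeholders, not values from the report.

```python
# Hypothetical fallback: launch a GPU instance in eu-west-1 (Dublin) and,
# if EC2 reports it is out of capacity, retry in eu-north-1 (Stockholm).
# Instance type and AMI IDs are placeholders, not values from the report.
import boto3
from botocore.exceptions import ClientError

REGIONS = ["eu-west-1", "eu-north-1"]
AMIS = {"eu-west-1": "ami-xxxxxxxxxxxxxxxxx", "eu-north-1": "ami-yyyyyyyyyyyyyyyyy"}

def launch_gpu_instance(instance_type="g5.xlarge"):
    for region in REGIONS:
        ec2 = boto3.client("ec2", region_name=region)
        try:
            resp = ec2.run_instances(
                ImageId=AMIS[region],
                InstanceType=instance_type,
                MinCount=1,
                MaxCount=1,
            )
            print(f"Launched in {region}")
            return resp["Instances"][0]["InstanceId"]
        except ClientError as err:
            if err.response["Error"]["Code"] in ("InsufficientInstanceCapacity", "InstanceLimitExceeded"):
                print(f"No capacity in {region}, trying the next region...")
                continue
            raise
    raise RuntimeError("No capacity in any configured region")
```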

Read more of this story at Slashdot.

Amazon Cloud Unit Kills Snowmobile Data Transfer Truck Service

By: msmash
April 17, 2024 at 16:40
At Amazon's annual cloud conference in 2016, the company captured the crowd's attention by driving an 18-wheeler onstage. Andy Jassy, now Amazon's CEO, called it the Snowmobile, and said the company would be using the truck to help customers speedily transfer data to Amazon Web Services facilities. Less than eight years later, the semi is out of commission. From a report: As of March, AWS had removed Snowmobile from its website, and the Amazon unit has stopped offering the service, CNBC has confirmed. The webpage devoted to AWS' "Snow family" of products now directs users to its other data transport services, including the Snowball Edge, a 50-pound suitcase-sized device that can be equipped with fast solid-state drives, and the smaller Snowcone. An AWS spokesperson said in an emailed statement that the company has introduced more cost-effective options for moving data. Clients had to deal with power, cooling, networking, parking and security when they used the Snowmobile service, the spokesperson said.

Read more of this story at Slashdot.

AI already makes your life simpler, you just don't see it [Sponsored]

By: humanoid xp
April 25, 2024 at 05:47

This article was produced in collaboration with OVHcloud

AI is everywhere, or nearly so: in our email, in our smartphones, in our everyday digital lives. And it is precisely when it is invisible that it is most convenient.


This is content created by independent writers within the Humanoid xp entity. Numerama's editorial team did not take part in its creation. We are committed to our readers to ensure that this content is interesting, of high quality, and matches their interests.

Learn more

This technology simplifies your everyday life and you aren't even aware of it [Sponsored]

By: humanoid xp
April 25, 2024 at 10:15

This article was produced in collaboration with OVHcloud

It is everywhere: in our email, in our smartphones, in our everyday digital lives. And yet, you don't see it. What is it? Artificial intelligence, of course!


This is content created by independent writers within the Humanoid xp entity. Numerama's editorial team did not take part in its creation. We are committed to our readers to ensure that this content is interesting, of high quality, and matches their interests.

Learn more

US 'Know Your Customer' Proposal Will Put an End To Anonymous Cloud Users

By: BeauHD
April 25, 2024 at 21:50
An anonymous reader quotes a report from TorrentFreak: In late January, the U.S. Department of Commerce published a notice of proposed rulemaking for establishing new requirements for Infrastructure as a Service (IaaS) providers. The proposal boils down to a 'Know Your Customer' regime for companies operating cloud services, with the goal of countering the activities of "foreign malicious actors." Yet, despite an overseas focus, Americans won't be able to avoid the proposal's requirements, which cover CDNs, virtual private servers, proxies, and domain name resolution services, among others. [...] Under the proposed rule, Customer Identification Programs (CIPs) operated by IaaS providers must collect information from both existing and prospective customers, i.e. those at the application stage of opening an account. The bare minimum includes the following data: a customer's name, address, the means and source of payment for each customer's account, email addresses and telephone numbers, and IP addresses used for access or administration of the account. What qualifies as an IaaS is surprisingly broad: "Any product or service offered to a consumer, including complimentary or "trial" offerings, that provides processing, storage, networks, or other fundamental computing resources, and with which the consumer is able to deploy and run software that is not predefined, including operating systems and applications. The consumer typically does not manage or control most of the underlying hardware but has control over the operating systems, storage, and any deployed applications. The term is inclusive of "managed" products or services, in which the provider is responsible for some aspects of system configuration or maintenance, and "unmanaged" products or services, in which the provider is only responsible for ensuring that the product is available to the consumer." And it doesn't stop there. The term IaaS includes all 'virtualized' products and services where the computing resources of a physical machine are shared, such as Virtual Private Servers (VPS). It even covers 'baremetal' servers allocated to a single person. The definition also extends to any service where the consumer does not manage or control the underlying hardware but contracts with a third party for access. "This definition would capture services such as content delivery networks, proxy services, and domain name resolution services," the proposal reads. The proposed rule, National Emergency with Respect to Significant Malicious Cyber-Enabled Activities, will stop accepting comments from interested parties on April 30, 2024.
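For illustration only, the minimum data set quoted above could be modeled along the following lines; the field list mirrors the proposal's text, while the class name and types are assumptions rather than anything defined in the rulemaking.

```python
# Hypothetical model of the minimum record a Customer Identification
# Program (CIP) would keep per customer under the proposed rule.
# The field list follows the proposal's text; names and types are assumptions.
from dataclasses import dataclass, field
from typing import List

@dataclass
class CipCustomerRecord:
    name: str                                   # customer's name
    address: str                                # customer's address
    payment_means_and_source: str               # means and source of payment
    email_addresses: List[str] = field(default_factory=list)
    telephone_numbers: List[str] = field(default_factory=list)
    access_ip_addresses: List[str] = field(default_factory=list)  # IPs used for access or administration
```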

Read more of this story at Slashdot.

How an Empty S3 Bucket Can Make Your AWS Bill Explode

By: msmash
April 30, 2024 at 19:10
Maciej Pocwierz, a senior software engineer at Semantive, writing on Medium: A few weeks ago, I began working on the PoC of a document indexing system for my client. I created a single S3 bucket in the eu-west-1 region and uploaded some files there for testing. Two days later, I checked my AWS billing page, primarily to make sure that what I was doing was well within the free-tier limits. Apparently, it wasn't. My bill was over $1,300, with the billing console showing nearly 100,000,000 S3 PUT requests executed within just one day! By default, AWS doesn't log requests executed against your S3 buckets. However, such logs can be enabled using AWS CloudTrail or S3 Server Access Logging. After enabling CloudTrail logs, I immediately observed thousands of write requests originating from multiple accounts or entirely outside of AWS. Was it some kind of DDoS-like attack against my account? Against AWS? As it turns out, one of the popular open-source tools had a default configuration to store its backups in S3. And, as a placeholder for a bucket name, they used... the same name that I used for my bucket. This meant that every deployment of this tool with default configuration values attempted to store its backups in my S3 bucket! So, a horde of misconfigured systems is attempting to store their data in my private S3 bucket. But why should I be the one paying for this mistake? Here's why: S3 charges you for unauthorized incoming requests. This was confirmed in my exchange with AWS support. As they wrote: "Yes, S3 charges for unauthorized requests (4xx) as well[1]. That's expected behavior." So, if I were to open my terminal now and type: aws s3 cp ./file.txt s3://your-bucket-name/random_key. I would receive an AccessDenied error, but you would be the one to pay for that request. And I don't even need an AWS account to do so. Another question was bugging me: why was over half of my bill coming from the us-east-1 region? I didn't have a single bucket there! The answer to that is that the S3 requests without a specified region default to us-east-1 and are redirected as needed. And the bucket's owner pays extra for that redirected request. The security aspect: We now understand why my S3 bucket was bombarded with millions of requests and why I ended up with a huge S3 bill. At that point, I had one more idea I wanted to explore. If all those misconfigured systems were attempting to back up their data into my S3 bucket, why not just let them do so? I opened my bucket for public writes and collected over 10GB of data within less than 30 seconds. Of course, I can't disclose whose data it was. But it left me amazed at how an innocent configuration oversight could lead to a dangerous data leak! Lesson 1: Anyone who knows the name of any of your S3 buckets can ramp up your AWS bill as they like. Other than deleting the bucket, there's nothing you can do to prevent it. You can't protect your bucket with services like CloudFront or WAF when it's being accessed directly through the S3 API. Standard S3 PUT requests are priced at just $0.005 per 1,000 requests, but a single machine can easily execute thousands of such requests per second.
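The arithmetic behind that bill is easy to reproduce. A minimal sketch using the request volume and public PUT price quoted above (the extra charge for requests redirected from us-east-1 is not included):

```python
# Back-of-the-envelope cost of unauthorized S3 PUT requests, using the
# figures quoted in the article. The surcharge for requests redirected
# from us-east-1 is not included, so the real bill comes out higher.
PUT_PRICE_PER_1000 = 0.005        # USD per 1,000 standard S3 PUT requests
requests_per_day = 100_000_000    # roughly what the author observed in one day

put_cost = requests_per_day / 1000 * PUT_PRICE_PER_1000
print(f"PUT request charges alone: ${put_cost:,.2f} per day")   # -> $500.00

# A single machine sending ~1,000 requests per second sustains about
# 86.4 million requests per day, so very few hosts are needed.
print(f"{1000 * 86400:,} requests/day from one host at 1,000 req/s")
```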

Read more of this story at Slashdot.

Alternative Clouds Are Booming As Companies Seek Cheaper Access To GPUs

By: BeauHD
May 6, 2024 at 22:02
An anonymous reader quotes a report from TechCrunch: CoreWeave, the GPU infrastructure provider that began life as a cryptocurrency mining operation, this week raised $1.1 billion in new funding from investors, including Coatue, Fidelity and Altimeter Capital. The round brings its valuation to $19 billion post-money and its total raised to $5 billion in debt and equity -- a remarkable figure for a company that's less than 10 years old. It's not just CoreWeave. Lambda Labs, which also offers an array of cloud-hosted GPU instances, in early April secured a "special purpose financing vehicle" of up to $500 million, months after closing a $320 million Series C round. The nonprofit Voltage Park, backed by crypto billionaire Jed McCaleb, last October announced that it's investing $500 million in GPU-backed data centers. And Together AI, a cloud GPU host that also conducts generative AI research, in March landed $106 million in a Salesforce-led round. So why all the enthusiasm for -- and cash pouring into -- the alternative cloud space? The answer, as you might expect, is generative AI. As the generative AI boom times continue, so does the demand for the hardware to run and train generative AI models at scale. GPUs, architecturally, are the logical choice for training, fine-tuning and running models because they contain thousands of cores that can work in parallel to perform the linear algebra equations that make up generative models. But installing GPUs is expensive. So most devs and organizations turn to the cloud instead. Incumbents in the cloud computing space -- Amazon Web Services (AWS), Google Cloud and Microsoft Azure -- offer no shortage of GPU and specialty hardware instances optimized for generative AI workloads. But for at least some models and projects, alternative clouds can end up being cheaper -- and delivering better availability. On CoreWeave, renting an Nvidia A100 40GB -- one popular choice for model training and inferencing -- costs $2.39 per hour, which works out to $1,200 per month. On Azure, the same GPU costs $3.40 per hour, or $2,482 per month; on Google Cloud, it's $3.67 per hour, or $2,682 per month. Given generative AI workloads are usually performed on clusters of GPUs, the cost deltas quickly grow. "Companies like CoreWeave participate in a market we call specialty 'GPU as a service' cloud providers," Sid Nag, VP of cloud services and technologies at Gartner, told TechCrunch. "Given the high demand for GPUs, they offers an alternate to the hyperscalers, where they've taken Nvidia GPUs and provided another route to market and access to those GPUs." Nag points out that even some Big Tech firms have begun to lean on alternative cloud providers as they run up against compute capacity challenges. Microsoft signed a multi-billion-dollar deal with CoreWeave last June to help provide enough power to train OpenAI's generative AI models. "Nvidia, the furnisher of the bulk of CoreWeave's chips, sees this as a desirable trend, perhaps for leverage reasons; it's said to have given some alternative cloud providers preferential access to its GPUs," reports TechCrunch.
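The cost deltas quoted above follow from simple hourly-to-monthly arithmetic. A minimal sketch, assuming a 730-hour month of continuous on-demand use of a single A100 40GB (the assumption that matches the Azure and Google Cloud figures; providers' published monthly prices may differ slightly due to rounding and billing terms):

```python
# Hourly-to-monthly cost for one Nvidia A100 40GB, using the on-demand
# rates quoted above and assuming a 730-hour month of continuous use.
# Published monthly prices may differ slightly (rounding, billing terms).
HOURS_PER_MONTH = 730

hourly_rates = {            # USD per GPU-hour, as quoted in the article
    "CoreWeave": 2.39,
    "Azure": 3.40,
    "Google Cloud": 3.67,
}

for provider, rate in hourly_rates.items():
    monthly = rate * HOURS_PER_MONTH
    print(f"{provider:>12}: ${rate:.2f}/h -> ${monthly:,.0f}/month")

# Training clusters multiply the delta: an 8-GPU node costs roughly
# eight times these figures, so the gap between providers widens fast.
```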

Read more of this story at Slashdot.

Google Cloud Accidentally Deletes UniSuper's Online Account Due To 'Unprecedented Misconfiguration'

By: BeauHD
May 10, 2024 at 23:20
A "one-of-a-kind" Google Cloud "misconfiguration" resulted in the deletion of UniSuper's account last week, disrupting the financial services provider's more than half a million members. "Services began being restored for UniSuper customers on Thursday, more than a week after the system went offline," reports The Guardian. "Investment account balances would reflect last week's figures and UniSuper said those would be updated as quickly as possible." From the report: The UniSuper CEO, Peter Chun, wrote to the fund's 620,000 members on Wednesday night, explaining the outage was not the result of a cyber-attack, and no personal data had been exposed as a result of the outage. Chun pinpointed Google's cloud service as the issue. In an extraordinary joint statement from Chun and the global CEO for Google Cloud, Thomas Kurian, the pair apologized to members for the outage, and said it had been "extremely frustrating and disappointing." They said the outage was caused by a misconfiguration that resulted in UniSuper's cloud account being deleted, something that had never happened to Google Cloud before. While UniSuper normally has duplication in place in two geographies, to ensure that if one service goes down or is lost then it can be easily restored, because the fund's cloud subscription was deleted, it caused the deletion across both geographies. UniSuper was able to eventually restore services because the fund had backups in place with another provider. "Google Cloud CEO, Thomas Kurian has confirmed that the disruption arose from an unprecedented sequence of events whereby an inadvertent misconfiguration during provisioning of UniSuper's Private Cloud services ultimately resulted in the deletion of UniSuper's Private Cloud subscription," the pair said. "This is an isolated, 'one-of-a-kind occurrence' that has never before occurred with any of Google Cloud's clients globally. This should not have happened. Google Cloud has identified the events that led to this disruption and taken measures to ensure this does not happen again."

Read more of this story at Slashdot.

How Microsoft and Red Hat Are Collaborating on Cloud Migrations

By: EditorDavid
May 11, 2024 at 15:34
SiliconANGLE looks at how starting in 2021, Microsoft and Red Hat have formed "an unlikely partnership set to reshape the landscape of cloud computing..." First, their collective open-source capabilities will lead to co-developed solutions to simplify the modernization and migration of Red Hat technologies to the cloud, seamlessly integrating them with Microsoft's Azure platform, according to João Couto, EMEA VP and COO of cloud commercial solutions at Microsoft. "We have acquired GitHub, which is also one of the largest repositories of open source worldwide," he said. "In that context, it makes a lot of sense to work together with Red Hat." Transcribed from their interview: What we have been doing so far is making sure that we are co-developing solutions together with Red Hat. And making these solutions available to our customers — making it easy for customers to transform, to modernize [their] Red Hat technology running on-prem, and moving them into cloud using our own Microsoft cloud technology, but Red Hat solutions, in a very, very seamless, integrated way. And also leveraging all the entire portfolio of Red Hat automation tools, so that they can make it easier for customers not just to do the migration, but also to do management, run the operation, and all the troubleshooting also from the customer-care perspective. So that's basically an end-to-end partnership approach that we are taking... "[Customers] get an integrated support experience from Red Hat technical teams and Microsoft technical teams. And this means that these two technical teams are often colocated, so whenever a customer has a challenge, they are being answered by Microsoft and Red Hat technical teams, all working together to solve this challenge from the customer. So this brings also an increased level of confidence to customers to move to cloud... "We have both engineering teams from both sides working together to achieve this level of integration between the two solutions. So when you talk about Red Hat Enterprise Linux or when you have the Azure Red Hat OpenShift, which is a new solution that we have recently launched — these are solutions that using open source, are bringing in an additional level of integration, flexibility, automation to customers. So that they can migrate, and manage, their solutions in a more seamless way, and in a more easy way. So we are embedding this kind of overlying partnership from an open source perspective to bring these innovations live to customers."

Read more of this story at Slashdot.
