
Introducing Azure DevOps, to help developers ship software faster and with higher quality

Today we are announcing Azure DevOps. Working with our customers and developers around the world, it’s clear DevOps has become increasingly critical to a team’s success. Azure DevOps captures over 15 years of investment and learnings in providing tools to support software development teams. In the last month, over 80,000 internal Microsoft users and thousands of our customers, in teams both small and large, used these services to ship products to you.

The services we are announcing today span the breadth of the development lifecycle to help developers ship software faster and with higher quality. They represent the most complete offering in the public cloud. Azure DevOps includes:

Azure Pipelines

CI/CD that works with any language, platform, and cloud. Connect to GitHub or any Git repository and deploy continuously. Learn More >

Azure Boards

Powerful work tracking with Kanban boards, backlogs, team dashboards, and custom reporting. Learn more >

Azure Artifacts

Maven, npm, and NuGet package feeds from public and private sources. Learn more >

Azure Repos

Unlimited cloud-hosted private Git repos for your project. Collaborative pull requests, advanced file management, and more. Learn more >

Azure Test Plans

All-in-one planned and exploratory testing solution. Learn more >

Each Azure DevOps service is open and extensible. They work great for any type of application regardless of the framework, platform, or cloud. You can use them together for a full DevOps solution or with other services. If you want to use Azure Pipelines to build and test a Node service from a repo in GitHub and deploy it to a container in AWS, go for it. Azure DevOps supports both public and private cloud configurations. Run them in our cloud or in your own data center. No need to purchase different licenses. Learn more about Azure DevOps pricing.

Here’s an example of Azure Pipelines used independently to build a GitHub repo:

[Screenshot: an Azure Pipelines build of a GitHub repository]
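As a concrete sketch, a minimal azure-pipelines.yml for such a build might look like the following. The trigger branch, the Node.js toolchain, and the npm steps here are illustrative assumptions, not taken from the post:

```yaml
# Illustrative only: build and test a Node.js repo on a
# Microsoft-hosted Linux agent.
trigger:
- master

pool:
  vmImage: 'Ubuntu 16.04'

steps:
- task: NodeTool@0
  inputs:
    versionSpec: '8.x'
  displayName: 'Install Node.js'

- script: |
    npm install
    npm test
  displayName: 'Install dependencies and run tests'
```

Committing a file along these lines to the root of a connected GitHub repository is enough for Azure Pipelines to pick it up and run the build on each push.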

Alternatively, here’s an example of a developer using all Azure DevOps services together from the vantage point of Azure Boards.

[Screenshot: Azure Boards, with work items linked across the Azure DevOps services]

Open Source projects receive free CI/CD with Azure Pipelines

As an extension of our commitment to provide open and flexible tools for all developers, Azure Pipelines offers free CI/CD with unlimited minutes and 10 parallel jobs for every open source project. With cloud-hosted Linux, macOS, and Windows pools, Azure Pipelines is great for all types of projects.

Many of the top open source projects are already using Azure Pipelines for CI/CD, such as Atom, CPython, Pipenv, Tox, Visual Studio Code, and TypeScript – and the list is growing every day.

We want everyone to have extremely high quality of service. Accordingly, we run open source projects on the same infrastructure that our paying customers use.

Azure Pipelines is also now available in the GitHub Marketplace, making it easy to get set up for your GitHub repos, open source or otherwise.

Here’s a walkthrough of Azure Pipelines:

Learn more >

The evolution of Visual Studio Team Services (VSTS) 

Azure DevOps represents the evolution of Visual Studio Team Services (VSTS). VSTS users will be upgraded into Azure DevOps projects automatically. For existing users, there is no loss of functionality, simply more choice and control. The end-to-end traceability and integration that has been the hallmark of VSTS is all there. Azure DevOps services work great together. Today is the start of a transformation, and over the next few months existing users will begin to see changes show up. What does this mean?

  • URLs will change from abc.visualstudio.com to dev.azure.com/abc. We will support redirects from visualstudio.com URLs so there will not be broken links.
  • As part of this change, the services have an updated user experience. We continue to iterate on the experience based on feedback from the preview. Today we’re enabling it by default for new customers. In the coming months we will enable it by default for existing users.
  • Users of the on-premises Team Foundation Server (TFS) will continue to receive updates based on features live in Azure DevOps. Starting with the next version of TFS, the product will be called Azure DevOps Server and will continue to be enhanced through our normal cadence of updates.

Learn how to enable these changes for your existing VSTS organizations today.

Learn more

To learn more about Azure DevOps, please join us:


  • Keynote: Watch our live Azure DevOps keynote on September 11, 2018 from 8:00 – 9:30 AM Pacific Time.


  • Live training: Join our live Mixer workshop with interactive Q&A on September 17, 2018 from 8:30 AM – 2:30 PM Pacific Time.

You can save the date and watch both live streams on our events page. There you’ll also find additional on-demand videos and other resources to help get you started.

We couldn’t be more excited to offer Azure DevOps to you and your teams. We can’t wait to see what amazing things you create with it.


Plan to map UK’s network of heart defibrillators using Azure could save thousands every year

Thousands of people who are at risk of dying every year from cardiac arrest could be saved under new plans to make the public aware of their nearest defibrillator.

There are 30,000 cardiac arrests outside of UK hospitals annually, but fewer than one in 10 of those patients survive, compared with a 25% survival rate in Norway, 21% in North Holland, and 20% in Seattle, in the US.

A new partnership between the British Heart Foundation (BHF), Microsoft, the NHS and New Signature aims to tackle the problem by mapping all the defibrillators in the UK, so 999 call handlers can tell people helping a cardiac arrest patient where the nearest device is.

Ambulance services currently have their own system of mapping where defibrillators are located but this is not comprehensive.

[Image: printout of a heart monitor] It is hoped the partnership can evolve to capture heart data from cardiac arrest patients

“There is huge potential ahead in the impact that technology will have in digitally transforming UK healthcare,” said Clare Barclay, Chief Operating Officer at Microsoft. “This innovative partnership will bring the power of Microsoft technology together with the incredible vision and life-saving work of BHF and the NHS. This project, powered by the cloud, will better equip 999 call handlers with information that can make the difference between life and death and shows the potential that innovative partnerships like this could make to the health of the nation.”

Cardiac arrest occurs when the heart fails to pump effectively, resulting in a sudden loss of blood flow. Symptoms include a loss of consciousness, abnormal or absent breathing, chest pain, shortness of breath and nausea. If not treated within minutes, it usually leads to death.

Defibrillators can save the life of someone suffering from a cardiac arrest by providing a high-energy electric shock to the heart through the chest wall. This allows the body’s natural pacemaker to re-establish the heart’s normal rhythm.

However, defibrillators are used in just 2% of out-of-hospital cardiac arrests, often because bystanders and ambulance services don’t know where the nearest device is located.



Owners of the tens of thousands of defibrillators in workplaces, train stations, leisure centres and public places across the country will register their device with the partnership. That information will be stored in Azure, Microsoft’s cloud computing service, where it will be used by ambulance services during emergency situations. The system will also remind owners to check their defibrillators to make sure they are in working order.

It is hoped that the partnership can evolve to enable defibrillators to self-report their condition, as well as capture heart data from cardiac arrest patients that can be sent to doctors.

Simon Gillespie, Chief Executive of the BHF, said: “Every minute without CPR or defibrillation reduces a person’s chance of surviving a cardiac arrest by around 10%. Thousands more lives could be saved if the public were equipped with vital CPR skills, and had access to a defibrillator in the majority of cases.

Everything you need to know about Microsoft’s cloud

“While we’ve made great progress in improving the uptake of CPR training in schools, public defibrillators are rarely used when someone suffers a cardiac arrest, despite their widespread availability. This unique partnership could transform this overnight, meaning thousands more people get life-saving defibrillation before the emergency services arrive.”

Simon Stevens, Chief Executive of NHS England, added: “This promises to be yet another example of how innovation within the NHS leads to transformative improvements in care for patients.”

The defibrillation network will be piloted by West Midlands Ambulance Service and the Scottish Ambulance Service, before being rolled out across the UK.



How to upgrade your financial analysis capabilities with Azure

In corporate finance and investment banking, risk analysis is a crucial job. To assess risk, analysts review research, monitor economic and social conditions, stay informed of regulations, and create models for the investment climate. In short, the inputs into an analysis make for a highly complex and dynamic calculation, one that requires enormous computing power. The vast number of calculations, and the way the math is structured, typically allows for a high degree of parallelization across many separate processes. To satisfy that need, grid computing employs any number of machines working together to execute a set of parallelized tasks, which is perfect for risk analysis. By using a networked group of computers that work together as a virtual supercomputer, you can assemble and use vast compute grids for specific time periods and purposes, paying only for what you use. And by splitting tasks across multiple machines, you significantly reduce processing time, increasing efficiency and minimizing wasted resources.
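To make the scatter/gather pattern concrete, here is a minimal, self-contained Python sketch of a parallel Monte Carlo value-at-risk estimate. The portfolio value, drift, and volatility figures are invented for illustration, and in a real grid each task would be dispatched to a separate machine rather than a local process pool:

```python
import random
from concurrent.futures import ProcessPoolExecutor

def simulate_chunk(args):
    """One grid task: simulate n one-day P&L outcomes for a portfolio."""
    seed, n_paths, value, mu, sigma = args
    rng = random.Random(seed)  # independent, reproducible stream per task
    return [value * rng.gauss(mu, sigma) for _ in range(n_paths)]

def value_at_risk(pnl, confidence=0.99):
    """Loss threshold exceeded in only (1 - confidence) of scenarios."""
    losses = sorted((-x for x in pnl), reverse=True)
    return losses[int((1 - confidence) * len(losses))]

def run_grid(n_workers=4, paths_per_worker=25_000):
    # Each tuple is an independent task a grid scheduler could hand to
    # any machine; results are gathered and merged afterwards.
    # Assumed inputs: $1M portfolio, zero drift, 2% daily volatility.
    tasks = [(seed, paths_per_worker, 1_000_000, 0.0, 0.02)
             for seed in range(n_workers)]
    with ProcessPoolExecutor(max_workers=n_workers) as pool:
        pnl = [x for chunk in pool.map(simulate_chunk, tasks) for x in chunk]
    return value_at_risk(pnl)

if __name__ == "__main__":
    print(f"Estimated 99% one-day VaR: ${run_grid():,.0f}")
```

Because the tasks share no state, doubling the number of workers roughly halves the wall-clock time, which is exactly the property that makes burst-to-cloud grids attractive for this workload.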

The Azure Industry Experiences team has recently authored two documents to help those involved in banking scenarios. We show how to implement a risk assessment solution that takes advantage of cloud grid computing technologies.

The first document is a short overview for technical decision makers, especially those considering a burst-to-cloud scenario. The second is a solution guide, aimed at solution architects, lead developers, and others who want a deeper technical treatment of the strategy and technology.

Recommended next steps

  1. Read the Risk Grid Computing Overview
  2. Read the Risk Grid Computing Solution Guide

Nimble Collective now powered by Microsoft Azure accelerates the future of animation

Nimble Collective and Microsoft Azure

— Nimble Collective on Microsoft Azure extends global reach;  plans to revolutionize animated content creation —

June 28, 2018 – MOUNTAIN VIEW, CA  – Today, Nimble Collective, the leading cloud-based animation technology platform, announced that it is available and optimized on the Microsoft Azure cloud platform. As Nimble Collective’s exclusive cloud partner, Microsoft Azure provides storage and compute services to power Nimble’s high-end animation platform, rendering workloads and media asset management. As a Microsoft partner, Nimble will receive go-to-market and market development support from one of the world’s largest enterprise salesforces.

“Microsoft Azure’s support of Nimble lets us expand our global reach to bring new voices, stories and artists to the world of animation,” said Rex Grignon, Co-Founder and CEO, Nimble Collective. “It will help us further reduce cost and complexity and enable new and established studios to accelerate their animation production. Nimble Collective is excited to partner with a platform company with a trusted public cloud – Microsoft Azure – to deliver a world-class cloud-based animation production and management solution.”

Nimble Collective’s cloud-based animation production platform streamlines the traditional animation studio infrastructure. With full end-to-end production capabilities hosted in Microsoft Azure’s secure hyperscale environment, Nimble’s platform integrates everything digital animators need, from streaming workstations, asset management, license brokering, and versioning to an elastic compute farm and MPAA certification. All of it is seamlessly woven together in a browser-based workflow that dramatically simplifies and improves archaic animation processes.

“We go beyond data and storage to deliver a complete industrial-strength platform that dramatically lowers barriers to entry for every animator,” said Grignon. “Microsoft Azure uniquely gives us the features, performance, and scalability to offer our customers better user experiences, reliability, and security at a fraction of the cost of in-house studios.”

“Microsoft is committed to helping content creators achieve more using the cloud, with a partner-focused approach to this industry’s transformation,” said Tad Brockway, General Manager, Azure Storage, Media and Edge at Microsoft Corp. “We’re excited to work with innovators like Nimble Collective to help them transform how animated content is produced, managed and delivered.”

“This partnership with Azure represents further validation of the promise of Nimble Collective, and we can’t wait to see what the future holds,” said James Bennett, Co-founder/Creative Director with Shomen Productions. “Nimble is our solution of choice for our remote-based production studio. We’re impressed with their user-friendly pipeline and amazing support team.”

About Nimble Collective

Founded in 2014 by Academy Award-winning animators and technology entrepreneurs, Nimble Collective is revolutionizing the animation content market by offering studio-level capabilities without the costly and complex infrastructure. With all the production capabilities of the animation pipeline hosted in a secure cloud environment, animators and their collaborators, wherever they are in the world, can spend more time creating instead of managing infrastructure. With Nimble Collective, studios can save up to 75% on overhead and drive faster time to market. Nimble Collective is the brainchild of animators Rex Grignon (Toy Story, Madagascar, and founding head of character animation at DreamWorks), Jason Schleifer (Lord of the Rings, Megamind) and Scott LaFleur (How to Train Your Dragon, Megamind). Learn more at NimbleCollective.com.

Media Contact for Nimble Collective

Juliet Travis
Liftoff Communications
510-612-9622
[email protected]


New Azure innovations advance the intelligent cloud and intelligent edge

Today, I gathered with the tech community in the Seattle area at the GeekWire Cloud Tech Summit to talk about how customers are using the cloud and what the future holds. I joined GeekWire’s Todd Bishop and Tom Krazit on stage for a fireside chat to share more about Microsoft’s vision for emerging cloud innovation, but I also got to connect with many of you directly. In those conversations, it became even more apparent just how many of you are turning to the intelligent cloud to explore how emerging innovation like serverless, blockchain, edge computing, and AI can help you create solutions that can change your business — and people’s lives.

At Microsoft, we’re continually releasing technology that’s inspired by our customers and what you tell us you need to make the intelligent cloud and intelligent edge a reality for your businesses. For example, you may need to build applications to work in remote areas with low connectivity. Or you need to store, access, and drive insights from your data faster because of competitive pressures. And, you need confidence that your data and applications will be secure, resilient, and highly available across the globe.

During my time with Tom and Todd, I announced a few of our latest Azure solutions and newest regions and availability zones designed to bring this vision to you, our customers, and I’d like to share more on these new technologies.

Introducing the next level in big data analytics

Azure Data Lake Storage Gen2 and general availability of Azure Data Factory capabilities

Data is currency for enterprises today, and we know you need to be able to easily store and quickly access your data to drive actionable insights. You also need to be able to ingest and integrate big data quickly and easily to get to insights more readily. Today, we are introducing a preview of a highly scalable, highly performant, and cost-effective data lake solution for big data analytics, Azure Data Lake Storage Gen2, designed to deliver the scale, performance, and security needed by your most demanding and sensitive workloads.

Because Azure Data Lake Gen2 is built on the foundations of Azure blob storage, all data — from data that is constantly in use through to data that only needs to be referenced occasionally or stored for regulatory reasons — can coexist in a single store without having to copy data. This means that you will have greater speed to insight over your data along with rich security at a more cost-effective price. Azure Data Lake Storage also provides a unified data store where unstructured object data and file data can be accessed concurrently via Blob Storage and Hadoop File System protocols.

Today also brings the general availability of new features in Azure Data Factory that deliver data movement as a service, so you can build analytics across hybrid and multicloud environments and turn raw data into actionable insights. The new features include a web-based graphical user interface to create, schedule, and manage data pipelines; code-free data ingestion from over 70 data source connectors to accelerate data movement across on-premises and cloud; and the ability to lift SQL Server Integration Services packages to Azure and run them in a managed execution environment in Azure Data Factory. You can also start taking advantage of the native ADF connector for Azure Data Lake Storage to load your data lake at scale.

Enabling the intelligent edge

Azure IoT Edge is generally available

In the next 10 years, nearly all our everyday devices and many new devices will be connected. These devices are all becoming so “smart” that they can power advanced algorithms that help them see, listen, reason, predict and more, without a 24/7 dependence on the cloud. This is the intelligent edge, and it will define the next wave of innovation in how we address world issues: distributing resources like water and oil, increasing food production and quality, and responding to natural disasters.

A key part of our strategy to deliver the promise of edge computing is Azure IoT Edge, which enables consistency between cloud and edge. This means you can push AI and machine learning to the edge, providing the most comprehensive and innovative edge offering on the market. As of today, Azure IoT Edge is generally available globally, with new updates for increased flexibility, scalability, security, and more.

Also today, the Azure IoT Edge runtime is open sourced and available on GitHub. If you are a developer, this gives you even greater flexibility and control of your edge solutions, so you can modify the runtime and debug issues. Azure IoT Edge now also supports more languages than any other edge solution, including C#, C, Node.js, Python, and Java, and we’ve added support for the Moby container management system. Additionally, we’ve released a Device Provisioning Service that enables you to provision tens of thousands of devices with zero touch. The new Security Manager for Azure IoT Edge acts as a well-bounded security core for protecting the IoT Edge device and all its components by abstracting the secure silicon hardware.

Azure IoT Edge customers like Schneider Electric and a farmer in Carnation, Washington are building sophisticated solutions that deliver real-time insights in areas with unreliable connectivity. Now that the solution is production-ready, with enhanced features, we can’t wait to see what else you build.

Azure global infrastructure

New Azure regions

We continuously invest in our cloud infrastructure to give you more compute power to enable the intelligent cloud and intelligent edge. We’ve announced 54 Azure regions to help you deliver cloud services and apps to nearly every corner of the globe and to provide everything that’s needed to run mission-critical applications, across scenarios, with a full set of resiliency solutions.

Today, we expanded our Azure presence in China, one of the most dynamic cloud markets in the world, with two additional regions now generally available. We were the first international cloud provider in China in 2014 (in partnership with 21Vianet), and today’s announcement doubles the number of Azure regions available there. We continue to see immense opportunity in China for cloud services to fuel innovation, and multinational corporations including Adobe, Coke, Costco, Daimler, Ford, Nuance, P&G, and Toyota are choosing our intelligent cloud services to help deliver for their customers in China. This builds on our recently announced plans to expand our cloud infrastructure in Europe and the Middle East, and on announced plans for new regions coming to Norway.

We’re also constantly increasing Azure’s resiliency capabilities with the addition of new Azure Availability Zones. Our Availability Zone in the Netherlands is now generally available, adding to the Zones already available in Iowa and Paris. The combination of region pairs and Availability Zones not only increases Azure’s resiliency capabilities, but broadens customer choice for business continuity architectures and delivers an industry-leading SLA for virtual machines.

It’s an exciting future

It was great to see all of you at GeekWire Cloud Tech Summit today. For those of you who weren’t able to be there live, you can follow along online. As always, we will continue to focus on building the technologies you need to drive innovation and disruption with the intelligent cloud and intelligent edge. Let us know what you think about these new solutions by sharing your feedback and comments.


Azure Kubernetes Services – with new regions, more features – now generally available

They say time flies when you’re having fun, and as I approach two years working on containers in Azure, I see the truth in that saying. Over the last two years we have launched a Kubernetes service in Azure, acquired Deis, joined the Linux Foundation, launched the Draft and Brigade open source projects, launched the first serverless container infrastructure in the major public clouds, and most recently acquired GitHub, where Kubernetes was born. We’ve also seen incredible growth in Kubernetes on Azure, with five times the number of customers and ten times the usage of a year ago. To say that the excitement never ends at Microsoft and Azure is an understatement!

Today, I’m incredibly excited to announce that the Azure Kubernetes Service (AKS) is now generally available. We are also adding five new regions including Australia East, UK South, West US, West US 2, and North Europe. AKS is now generally available in ten regions across three continents, and we expect to add ten more regions in the coming months!

With AKS in all these regions, users from around the world, or with applications that span the world, can deploy and manage their production Kubernetes applications with the confidence that Azure’s engineers are providing constant monitoring, operations, and support for our customers’ fully managed Kubernetes clusters. Azure was also the first cloud to offer a free managed Kubernetes Service and we continue to offer it for free in GA. We think you should be able to use Kubernetes without paying for our management infrastructure.

Going from preview to general availability requires dedication and hard work by both the AKS engineering team as well as the customers who volunteered their time and patience to try out our new service. I’m extremely grateful to everyone inside and outside of Microsoft who contributed their time to improving AKS and making the general availability possible. The product that we ship today is better because of your hard work. Thank you!

In addition to the work on AKS, the team has also been engaged with the upstream open source Kubernetes community. With open source, it is not enough to simply consume software; it is critical to engage with and contribute to the projects you use. Consequently, I’m incredibly proud of the nearly seventy Microsoft employees who have made contributions to Kubernetes.

The Kubernetes API is just the beginning. From its inception, a core component of Microsoft’s DNA has been building the platforms to empower and enable developers to become more productive. It has been awesome to see this heritage pull through into a new generation of tools to enable builders of cloud native applications. As we showed this past May at the Microsoft Build conference, Azure is the most complete and capable cloud for cloud native application development. On Azure, our tools make it easier to build and debug your containerized applications with Kubernetes. To make this even better, we’re excited to announce even more features now available in all AKS regions including Kubernetes role-based access control (RBAC), Azure Active Directory based identity, the ability to deploy clusters into pre-existing custom virtual networks, and more. Furthermore, you can even deploy and manage your clusters using both open source tools like Terraform and Azure’s own Resource Manager templates. Come check them out!
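As a rough sketch of how one might try this out from the command line, the following assumes the Azure CLI is installed and logged in; the resource group, cluster name, and region are placeholders, not values from this post:

```shell
# Illustrative only: create a resource group and a managed AKS cluster.
az group create --name myResourceGroup --location eastus

az aks create \
  --resource-group myResourceGroup \
  --name myAKSCluster \
  --node-count 3 \
  --generate-ssh-keys

# Fetch credentials so kubectl talks to the new cluster.
az aks get-credentials --resource-group myResourceGroup --name myAKSCluster
kubectl get nodes
```

The same operations can be driven through Terraform or Azure Resource Manager templates for repeatable deployments, as noted above.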

In addition to making Azure an industry leading place to run Kubernetes, we are also strongly committed to giving back to the Kubernetes user community, no matter where they want to run Kubernetes. To that end:

  • We lead and support the development of the Helm package manager for Kubernetes, which was recently elevated to be a top-level project in the Cloud Native Compute Foundation.
  • We have built and released Draft and Brigade to make Kubernetes more approachable for novice users.
  • We continue to improve the Kubernetes plugins for Visual Studio and Visual Studio Code, enabling an intuitive, easy to use integration between development and operations environments.
  • These integrations extend into DevOps tools where you can simply integrate AKS into your favorite CI/CD tools such as Jenkins and Visual Studio Team Services.
  • Additionally, with our work on the virtual kubelet, we are leading a cross-industry effort that brings Kubernetes management to environments without VMs using innovative technology like Azure Container Instances.

All of this work wouldn’t mean anything unless we delivered a useful service that meets users where they are and enables them to achieve great things. It’s awesome to see companies like Siemens and Varian Health find success using Kubernetes on Azure.

As proud as I am about everything that the Azure Kubernetes team has done to get us to this point, I’m even more excited to see what production applications our customers build on top of our production-grade, worldwide Azure Kubernetes Service. As for the team, we’re already hard at work on the next set of exciting features for all our AKS users and Kubernetes friends around the world!

If you want to learn more, I encourage you to come create a cluster, deploy some applications, and see what Azure Kubernetes Service has to offer, or try out one of our sample apps.

Thanks!

Brendan


New Azure offerings further commitment to offer largest scale, broadest choice for SAP HANA in the cloud

Microsoft at SAPPHIRE NOW 2018

Enterprises have been embarking on a journey of digital transformation for many years. For many enterprises this journey cannot start or gain momentum until core SAP Enterprise Resource Planning (ERP) landscapes are transformed. The last year has seen an acceleration of this transformation, with SAP customers of all sizes like Penti, Malaysia Airlines, Guyana Goldfields, Rio Tinto, Co-op, and Coats migrating to the cloud on Microsoft Azure. This cloud migration, which is central to digital transformation, helps to increase business agility, lower costs, and enable new business processes to fuel growth. In addition, it has allowed them to take advantage of advancements in technology such as big data analytics, self-service business intelligence (BI), and the Internet of Things (IoT).

As leaders in enterprise software, SAP and Microsoft provide the preferred foundation for enabling the safe and trusted path to digital transformation. Together we enable the inevitable move to SAP S/4HANA which will help accelerate digital transformation for customers of all sizes.

Microsoft has collaborated with SAP for 20+ years to enable enterprise SAP deployments with Windows Server and SQL Server. In 2016 we partnered to offer SAP-certified, purpose-built SAP HANA on Azure Large Instances supporting up to 4 TB of memory. Last year at SAPPHIRE NOW, we announced the largest scale for SAP HANA in the public cloud, with support for up to 20 TB on a single node and our M-series VM sizes up to 4 TB. With the success of M-series VMs and our SAP HANA on Azure Large Instances, customers have asked us for even more choices to address a wider variety of SAP HANA workloads.

Microsoft is committed to offering the most scale and performance for SAP HANA in the public cloud, and yesterday announced additional SAP HANA offerings on Azure which include:

  • Largest SAP HANA optimized VM size in the cloud: We are happy to announce that the Azure M-series will support large memory virtual machines with sizes up to 12 TB. These new sizes will be launching soon, pushing the limits of virtualization in the cloud for SAP HANA. They are based on Intel Xeon Scalable (Skylake) processors and will offer the most memory available of any VM in the public cloud.
  • Wide range of SAP HANA certified VMs: For customers needing smaller instances we have expanded our offering with smaller M-series VM sizes, extending Azure’s SAP HANA certified M-series VM range from 192 GB – 4 TB with 10 different VM sizes. These sizes offer on-demand and SAP certified instances with flexibility to spin-up or scale-up in minutes and to spin-down to save costs all in a pay-as-you-go model available worldwide. This flexibility and agility is something that is not possible with a private cloud or on-premises SAP HANA deployment.
  • 24 TB bare metal instance and optimized price per TB: For customers that need a higher-performance dedicated offering for SAP HANA, we are increasing our investments in our purpose-built bare metal SAP HANA infrastructure. We now offer additional SAP HANA TDIv5 options of 6 TB, 12 TB, 18 TB, and 24 TB configurations, in addition to our current configurations from 0.7 TB to 20 TB. This enables customers who need more memory but the same number of cores to get a better price per TB deployed.
  • Most choice for SAP HANA in the cloud: With 26 distinct SAP HANA offerings from 192 GB to 24 TB, scale-up certification up to 20 TB and scale-out certification up to 60 TB, global availability in 12 regions with plans to increase to 22 regions in the next 6 months, Azure now offers the most choice for SAP HANA workloads of any public cloud.

Microsoft Azure also enables customers to derive insights and analytics from SAP data with services such as Azure Data Factory SAP HANA connector to automate data pipelines, Azure Data Lake Store for hyper scale data storage and Power BI, an industry leading self-service visualization tool, to create rich dashboards and reports from SAP ERP data.

Our unique partnership with SAP to enable customer success

Last November, Microsoft and SAP announced an expanded partnership to help customers accelerate their business transformation with S/4HANA on Azure. Microsoft has been a longtime SAP customer for many of our core business processes, such as financials and supply chain. As part of this renewed partnership, Microsoft announced it will use S/4HANA for Central Finance, and SAP announced it will use Azure to host 17 internal business-critical systems.

I am very pleased to share an update on SAP’s migration to Azure from Thomas Saueressig, CIO of SAP:

“In 2017 we started to leverage Azure as IaaS Platform. By the end of 2018 we will have moved 17 systems including an S/4HANA system for our Concur Business Unit. We are expecting significant operational efficiencies and increased agility which will be a foundational element for our digital transformation.”

Correspondingly, here’s an update on Microsoft’s migration to S/4HANA on Azure from Mike Taylor, GM Partner, Enterprise Applications Services at Microsoft:

“In 2017 we started the internal migration of our SAP system that we have been running for over 25 years, to S/4HANA. As part of that journey we felt it was necessary to first move our SAP environment completely onto Azure, which we completed in February 2018. With the agility that Azure offers we have already stood up multiple sandbox environments to help our business realize the powerful new capabilities of S/4HANA.”

As large enterprises, we are going through our business transformation with SAP S/4HANA on Azure and we will jointly share lessons from our journey and reference architectures at several SAPPHIRE NOW sessions.

Last November, we announced availability of SAP HANA Enterprise Cloud (HEC) with Azure to offer customers an accelerated path to SAP HANA. We see several customers, such as Avianca and Aker, embark on their business transformation by leveraging the best of both worlds: SAP’s managed services and the most robust cloud infrastructure for SAP HANA on Azure.

“In Avianca, we are committed on providing the best customer experience through the use of digital technologies. We have the customer as the center of our strategy, and to do that, we are in a digital transformation of our customer experience and of our enterprise to provide our employees with the best tools to increase their productivity. Our new implementation of SAP HANA Enterprise Cloud on Microsoft Azure is a significant step forward in our enterprise digital transformation,” said Mr. Santiago Aldana Sanin, Chief Technology Officer, Chief Digital Officer at Avianca. “The SAP and Microsoft partnership continues to create powerful solutions that combine application management and product expertise from SAP with a global, trusted and intelligent cloud from Microsoft Azure. At Avianca, we are leveraging the strengths of both companies to further our journey to the cloud.”

Learn more about HEC with Azure.

Announcing SAP Cloud Platform general availability on Azure

The SAP Cloud Platform offers developers a choice to build their SAP applications and extensions using a PaaS development platform with integrated services. Today, I’m excited to announce that SAP Cloud Platform is now generally available on Azure. Developers can now deploy Cloud Foundry based SAP Cloud Platform on Azure in the West Europe region. We’re working with SAP to enable more regions in the months ahead.

“We are excited to announce general availability for SAP Cloud Platform on Azure. With our expanded partnership last November, we have been working on a number of co-engineering initiatives for the benefit of our mutual customers. SAP Cloud Platform offers the best of PaaS services for developers building apps around SAP and Azure offers world-class infrastructure for SAP solutions with cloud services for application development. With this choice, developers can spin up infrastructure on-demand in a global cloud co-located with other business apps, scale up as necessary in minutes boosting developer productivity and accelerating time to market for innovative applications with SAP solutions,” said Björn Goerke, CTO of SAP and President of SAP Cloud Platform.

Microsoft has a long history of working with developers with our .NET, Visual Studio, and Windows community. With our focus on open source support on Azure for Java, Node.js, Red Hat, SUSE, Docker, Kubernetes, Redis, and PostgreSQL to name a few, Azure offers the most developer friendly cloud according to the latest development-only public cloud platforms report from Forrester. We recently published an ABAP SDK for Azure on GitHub, to enable SAP developers to seamlessly connect into Azure services from SAP applications.

SAP application developers can now use a single cloud platform to co-locate application development next to SAP ERP data and boost development productivity with Azure’s Platform services such as Azure Event Hubs for event data processing and Azure Storage for unlimited inexpensive storage, while accessing SAP ERP data at low latencies for faster application performance. To get started with SAP Cloud Platform on Azure, sign up for a free trial account.

Today, we are also excited to announce another milestone in our partnership. SAP’s Hybris Commerce Cloud now runs on Azure as a “Software as a service” offering.

Customers embarking on digital transformation with SAP on Azure

With our broadest scale global offerings for SAP on Azure, we are seeing increased momentum with customers moving to the cloud. Here are some digital transformation stories from recent customer deployments.

  • Daimler AG: One of the world’s most successful automotive companies, Daimler AG is modernizing its purchasing system with a new SAP S/4HANA on Azure solution. The Azure-based approach is a foundational step in an overall digital transformation initiative to ensure agility and flexibility for the contracting and sourcing of its passenger cars, commercial vehicles, and International Procurement Services on a global basis.
  • Devon Energy: Fully committed to its digital transformation, Devon Energy is pioneering efforts to deploy SAP applications on Azure across all its systems. The Oklahoma City-based independent oil and natural gas exploration and production company is strategically partnering with Microsoft on multiple fronts such as AI, IT modernization, and SAP on Azure. Learn more about Devon’s digital transformation at their SAPPHIRE NOW session.
  • MMG: MMG, a global mining company, recognized that existing SAP infrastructure was approaching end-of-life, and that moving to the cloud would deliver the lowest long-term cost whilst providing flexibility to grow and enable new capabilities. Immediate benefits have been realized in relation to the overall performance of SAP, in particular, data loads into Business Warehouse.

For more on how you can accelerate your digital transformation with SAP on Azure, please check out our website.

In closing, at Microsoft, we are committed to ensuring Azure offers the best enterprise-grade option for all your SAP workload needs, whether you are ready to move to HANA now or later. I will be at SAPPHIRE NOW 2018 and encourage you to check our SAPPHIRE NOW event website for details on 40+ sessions and several demos that we’ll be showcasing at the event. Stop by and see us at the Microsoft booth #358 to learn more.


Azure Stack now available in 92 countries to help build, deploy and operate hybrid cloud apps

I love all the amazing things our partners and customers are doing on Azure! Adobe, for example, is using a combination of infrastructure and platform services to deliver the Adobe Experience Manager globally. HP is using AI to help improve their customer experiences. Jet is using microservices and containers on IaaS VMs to deliver a unique ecommerce experience. These three customers are just a few examples where major businesses have bet on Azure, the most productive, hybrid, trusted and intelligent cloud.

For the last few years, Infrastructure-as-a-service (IaaS) has been the primary service hosting customer applications. Azure VMs are the easiest to migrate from on-premises while still enabling you to modernize your IT infrastructure, improve efficiency, enhance security, manage apps better and reduce costs. And I am proud that Azure continues to be recognized as a leader in this key area.

As you are considering your movement and deployment in the cloud, I want to share a few key reasons you should bet on Azure for your infrastructure needs.

Infrastructure for every workload

We are committed to providing the right infrastructure for every workload. Across our 50 Azure regions, you can pick from a broad array of virtual machines with varying CPU, GPU, memory, and disk configurations for your application needs. For HPC workloads that need extremely fast interconnect, we have InfiniBand, and for supercomputing workloads, we offer Cray hardware in Azure. For SAP HANA, we provide virtual machines with up to 4 TB of RAM and purpose-built bare metal infrastructure with up to 20 TB of RAM. Not only do we have some of the most high-powered infrastructure out there, but we also provide confidential computing capabilities for securing your data while in use. These latest computing capabilities even include quantum computing. Whether it’s Windows Server, Red Hat, Ubuntu, CoreOS, SQL, Postgres or Oracle, we support over 4,000 pre-built applications to run on this broad set of hardware. With the broad set of infrastructure we provide, enterprises such as Coats are moving their entire datacenter footprint to Azure.

Today, I am announcing some exciting new capabilities:

  • Industry leading M-series VM sizes offering memory up to 12 TB on a single VM, the largest memory for a single VM in the public cloud for in-memory workloads such as SAP HANA. With these larger VM configurations, we are not only advancing the limits of virtualization in the cloud but also the performance of SAP HANA on VMs. These new sizes will be based on Intel Xeon Scalable (Skylake) processors, with more details available in the coming months.
  • Newer M-series VM sizes with memory as low as 192 GB, extending M-series VM range from 192 GB to 4 TB in RAM, available now, enabling fast scale-up and scale-down with 10 different size choices. M-series VMs are certified for SAP HANA and available worldwide in 12 regions. Using Azure ARM template automation scripts for SAP HANA, you can deploy entire SAP HANA environments in just minutes compared to weeks on-premises.
  • New SAP HANA TDIv5-optimized configurations on Azure Large Instances with memory sizes of 6 TB, 12 TB, and 18 TB. In addition, we now offer the industry-leading public cloud instance scale for SAP HANA with our new 24 TB TDIv5 configuration. This extends our purpose-built SAP HANA offering to 15 different instance choices. With these new configurations, you can benefit from a lower price per TB for TDIv5 configurations, an unparalleled 99.99% SLA for SAP HANA infrastructure, and the ability to step up to larger configurations.
  • New Standard SSDs provide a low-cost SSD-based Azure Disk solution, optimized for test and entry-level production workloads requiring consistent performance and high throughput. You will experience improved latency, reliability and scalability as compared to Standard HDDs. Standard SSDs can be easily upgraded to Premium SSDs for more demanding and latency-sensitive enterprise workloads. Standard SSDs come with the same industry leading durability and availability that our clients expect from Azure Disks.
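To make the scale-up flexibility above concrete, here is a small sketch that picks the smallest VM size covering a given in-memory footprint. The size names and RAM figures are hypothetical stand-ins for the 192 GB to 4 TB range described in the bullets, not an official Azure size list:

```python
# Hypothetical M-series-style size table spanning 192 GB to 4 TB;
# names and RAM figures are illustrative, not official Azure sizes.
SIZES_GB = {
    "M-small": 192,
    "M-medium": 1024,
    "M-large": 2048,
    "M-xlarge": 4096,
}

def smallest_fit(required_gb):
    """Return the smallest size whose RAM covers the requirement, or None."""
    candidates = [(ram, name) for name, ram in SIZES_GB.items() if ram >= required_gb]
    return min(candidates)[1] if candidates else None

print(smallest_fit(1500))  # → M-large
```

In practice the same fit-the-footprint decision drives whether a workload lands on a VM or on one of the larger bare metal instances.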

Truly consistent hybrid capabilities

When I talk with many of our customers about their cloud strategy, there is a clear need for choice and flexibility on where to run workloads and applications. Like most customers, you want to be able to bridge your on-premises and cloud investments. From VPN and ExpressRoute to File Sync and Azure Security Center, Azure offers a variety of services that help you enable, connect and manage your on-premises and cloud environments creating a truly hybrid infrastructure. In addition, with Azure Stack, you can extend Azure services and capabilities to on-premises and the edge, allowing you to build, deploy and operate hybrid cloud applications seamlessly.

Today, I’m happy to announce that we are expanding the geographical coverage of Azure Stack to meet growing customer demand globally. Azure Stack will now be available in 92 countries throughout the world. Given your excitement over Azure Stack, we continue to expand opportunities for you to deploy this unique service. For a full list of supported countries, please visit the Azure Stack overview page.

Liquid Telecom, a leading data, voice and IP provider in eastern, central and southern Africa, plans to use Azure Stack to deliver value to its customers and partners in some of the most remote parts of the world.

“We have a long history of delivering future-focused and innovative services to our customers. Microsoft Azure Stack strengthens this mission while also enabling us to deliver value to a new set of customers and partners in some of the most remote parts of Africa. By using Azure Stack, alongside Azure and ExpressRoute over our award-winning pan-African fiber network, we can now guide our customers through increasingly complex business challenges, such as data privacy, compliance and overall governance in the cloud. This helps not only us, but our entire channel of distribution and value-added partners in enhancing the customer experience on their digital journey.” David Behr, Chief Product Officer, Liquid Telecom

Built-in security and management

I frequently get asked about best practices for security and management in Azure. We have a unique set of services that make it incredibly easy for you to follow best practices whether running a single VM or 1,000s of VMs, including built-in backup, policy, advisor, security detection, monitoring, log analytics and patching. This will help you proactively protect your VMs and detect potential threats to your environment. These services are built upon decades of experience at Microsoft in delivering services across Xbox, Office, Windows and Azure with thousands of security professionals and more than 70 compliance certifications. We have also taken a leadership position on topics such as privacy and compliance to standards such as General Data Protection Regulation (GDPR), ISO 27001, HIPAA and more. Last week we announced the general availability of Azure Policy which is a free service to help you control and govern your Azure resources at scale.

Today, I’m excited to announce a few additional built-in security and management capabilities:

  • Disaster recovery for Azure IaaS virtual machines general availability: You likely need disaster recovery capabilities to ensure your applications are compliant with regulations that require a business continuity plan (such as ISO 27001). You also may need your application to run continuously in the unlikely event of a natural disaster that could impact an entire region. With the general availability of this new service, you can configure disaster recovery within minutes, not days or weeks, with a built-in disaster recovery as a service that is unique to Azure. Learn more about how to get started by visiting our documentation.

“ASR has helped Finastra refine our DR posture through intuitive configuration of replication between Azure regions. It’s currently our standard platform for disaster recovery and handling thousands of systems with no issues regarding scale or adherence to our tight RPO/RTO requirements.” Bryan Heymann, Director of Systems & Architecture, D+H

  • Azure Backup for SQL in Azure Virtual Machines preview: Today we are extending the Azure Backup capability beyond virtual machines and files to also include backup of a SQL instance running on a VM. This is a zero-infrastructure backup service that provides freedom from managing backup scripts, agents, backup servers, or even backup storage. Moreover, customers can perform SQL log backups at 15-minute intervals on SQL Servers and SQL Always On availability groups. Learn more on the key benefits of this capability and how to get started.
  • VM Run command: Customers can easily run scripts on an Azure VM directly from the Azure portal without having to connect to the machine. You can run either PowerShell scripts or Bash scripts and you can even troubleshoot a machine that has lost connection to the network. Learn more about Run Command for Windows and Linux.

More ways to save money, manage costs and optimize infrastructure

Given the agility offered by cloud infrastructure, I know you not only want freedom to deploy but also want tight control on your costs. You want to optimize your spending as you transition to the cloud. We can help drive higher ROI by reducing and optimizing infrastructure costs.

Azure offers innovative products and services to help reduce costs, like low priority VMs, burstable VMs, vCPU-constrained VMs for Oracle and SQL databases, and archive storage so customers can choose the right cost optimized infrastructure option for their app. Azure also uniquely offers free Cost Management so customers can manage and optimize their overall budget better.

With Azure Reserved VM Instances (RIs), you can save up to 72 percent. By combining RIs with Azure Hybrid Benefit, you can save up to 80 percent on Windows Server virtual machines, and up to 73* percent compared to AWS RIs for Windows VMs – making Azure the most cost-effective cloud to run Windows Server workloads. Customers like Smithfield Foods have been able to slash datacenter costs significantly, reduce new-application delivery time and optimize their infrastructure spend.
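To see how the Reserved Instance and Hybrid Benefit discounts can compound into a combined figure like the one above, here is a back-of-the-envelope sketch. The rates and the license share are made-up placeholders, not actual Azure or AWS prices:

```python
def combined_savings(ri_discount, license_share):
    """Fraction saved when an RI discount applies and the OS license
    component (license_share of the pay-as-you-go price) is waived."""
    # Pay 1.0 normally; with Hybrid Benefit only the base-compute share
    # remains, and the RI discount applies on top of that.
    remaining = (1 - license_share) * (1 - ri_discount)
    return 1 - remaining

# Illustrative only: a 72% RI discount plus a ~29% license share
# compounds to roughly 80% total savings.
print(round(combined_savings(0.72, 0.29), 2))  # → 0.8
```

Actual savings depend on region, instance type, and usage, as the disclaimer below notes.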

I hope you enjoyed this overview of some of the coolest new capabilities and services in Azure. We are constantly working to improve the platform and make the infrastructure service simpler and easier for you! Please let us know how we can make it even better.

Get started today with Azure IaaS. You can also register now for the Azure IaaS webcast I am hosting on June 18, 2018, covering many of these topics.

Thanks,

Corey

*Disclaimer:

  1. Sample annual cost comparison of two D2v3 Windows Server VMs. Savings based on two D2v3 VMs in the US West 2 region running 744 hours/month for 12 months; base compute rate at the SUSE Linux Enterprise rate for US West 2. Azure pricing as of April 24, 2018. AWS pricing as of April 24, 2018. Prices subject to change.
  2. The 80 percent savings figure is based on the combined cost of Azure Hybrid Benefit for Windows Server and a 3-year Azure Reserved Instance. It does not include the Software Assurance cost.
  3. Actual savings may vary based on location, instance type, or usage.

Unravel, now on Azure HDInsight, helps devs build fast, reliable big data apps

Unravel on HDInsight enables developers and IT admins to manage performance, auto-scaling, and cost optimization better than ever.

We are pleased to announce Unravel on the Azure HDInsight Application Platform. Azure HDInsight is a fully-managed open-source big data analytics service for enterprises. You can use popular open-source frameworks (Hadoop, Spark, LLAP, Kafka, HBase, etc.) to cover a broad range of scenarios such as ETL, data warehousing, machine learning, IoT, and more. Unravel provides comprehensive application performance management (APM) for these scenarios and more. The application helps customers analyze, optimize, and troubleshoot application performance issues and meet SLAs in a seamless, easy-to-use, and frictionless manner. Some customers report running up to 200 percent more jobs at 50 percent lower cost using Unravel’s tuning capability on HDInsight.

To learn more please join Pranav Rastogi, Program Manager at Microsoft Azure Big Data, and Shivnath Babu, CTO at Unravel, in a webinar on June 13 for how to build fast and reliable big data apps on Azure while keeping cloud expenses within your budget.

How complex is guaranteeing an SLA on a Big Data solution?

The inherent complexity of big data systems, the disparate set of tools for monitoring, and the lack of expertise in optimizing these open-source frameworks create significant challenges for end users who are responsible for guaranteeing SLAs. Users today have to monitor their applications with Ambari, which only provides infrastructure metrics to administer cluster health, performance, and utilization. Big data solutions use a variety of open-source frameworks, and monitoring applications running across all of them is a daunting task. Users have to troubleshoot issues manually by analyzing logs from YARN, Hive, Tez, LLAP, Pig, Spark, Kafka, etc. To get good performance, users may have to change settings in Spark executors, YARN queues, Kafka topic configuration, region servers in HBase, storage throttling, sizing of compute, and more. Unravelling this complexity is an art and a science.

Monitoring Big Data applications now made easy with Unravel

Unravel on HDInsight makes intelligent application and operations management for Big Data a breeze. Its Application Performance Management correlates full-stack performance and provides automated insights and recommendations. Users can now analyze, troubleshoot, and optimize performance with ease. Here are the key value propositions of Unravel on HDInsight:

Proactive alerting and automatic actions

  • Proactive alerts on applications missing SLAs, violating usage policies or affecting other applications running on the cluster.
  • Automatic actions to resolve the above issues using dynamic thresholds and company-defined policies, such as killing bad applications or redirecting apps based on priority levels.
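The alert-and-act loop above can be sketched as a simple policy check. The thresholds, field names, and actions here are hypothetical illustrations of the idea, not Unravel's actual API:

```python
# Hypothetical policy: alert on apps that miss an SLA duration, and kill
# apps hogging more than a set share of cluster memory. All thresholds
# and field names are illustrative.
SLA_SECONDS = 600
MAX_MEMORY_SHARE = 0.5

def evaluate(app):
    """Return the action a policy engine might take for one running app."""
    if app["duration_s"] > SLA_SECONDS:
        return "alert: SLA missed"
    if app["memory_share"] > MAX_MEMORY_SHARE:
        return "kill: resource hog"
    return "ok"

apps = [
    {"id": "etl-42", "duration_s": 950, "memory_share": 0.2},
    {"id": "adhoc-7", "duration_s": 120, "memory_share": 0.8},
]
for app in apps:
    print(app["id"], "->", evaluate(app))
```

A real engine would additionally learn dynamic thresholds from history rather than hard-coding them, which is what the "dynamic thresholds" bullet refers to.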

Analyze app performance

  • Intuitive, end-to-end view of application performance with drill-down capabilities into bottlenecks, problem areas and errors.
  • Correlated insights into all factors affecting app performance such as resource allocation, container utilization, poor configuration settings, task execution pattern, data layout, resource contention and more.
  • Rapid detection of performance and cost issues caused by applications.

The following is an example of how Unravel diagnosed poor resource usage and high cost caused by a Hive on Tez application. Tuning the application using the recommendations provided by Unravel reduced the cost of running it tenfold.

[Figure: cost and wastage analysis for the Hive on Tez application]

Here is an example of cluster utilization after using Unravel. Unravel enables you to utilize resources efficiently and auto scale the cluster based on SLA needs, which results in cost savings.

[Figure: cluster utilization and auto-scaling after using Unravel]

AI-driven intelligent recommendation engine

  • Recommend optimal values for fastest execution and/or least resource utilization including: data parallelism, optimal container size, number of tasks, etc.

The example below shows how Unravel’s AI-driven engine provides actionable recommendations for optimal performance of a Hive query.

[Figure: Unravel's recommendations for tuning a Hive on Tez query]

  • Identify and automatically fix issues related to poor execution, skew, expensive joins, too many mappers/reducers, caching, etc.

Following is an example of Unravel automatically detecting lag in a real-time IoT application using Spark Streaming & Kafka and recommending a solution.

[Figure: Unravel's streaming recommendations for a Spark Streaming and Kafka application]

Get started with Unravel on HDInsight

Customers can easily install Unravel on HDInsight with a single click in the Azure portal. Unravel provides a live view into the behavior of big data applications using open-source frameworks such as Hadoop, Hive, Spark, Kafka, LLAP, and more on Azure HDInsight.

[Figure: installing Unravel from the Azure portal]

After installing, you can launch the Unravel application from the Applications blade of the cluster, as shown below.

[Figure: launching Unravel from the cluster's Applications blade]

Try Unravel now on HDInsight!

Attend the webinar!

To learn more please join Pranav Rastogi, Program Manager at Microsoft Azure Big Data, and Shivnath Babu, CTO at Unravel, in a webinar on June 13 on how to build fast and reliable big data apps on Azure while keeping cloud expenses within your budget.

Summary

Azure HDInsight is the most comprehensive platform, offering a wide range of fully-managed frameworks such as Hadoop, Spark, Hive, LLAP, Kafka, HBase, Storm, and more. We are pleased to announce the expansion of the HDInsight Application Platform to include Unravel. Unravel provides comprehensive Application Performance Management (APM) across various open-source analytical frameworks to help customers analyze, optimize, and troubleshoot application performance issues and meet SLAs in a seamless, easy-to-use, and frictionless manner.

Stay up-to-date on the latest Azure HDInsight news and features by following us on Twitter #HDInsight and @AzureHDInsight. For questions and feedback, please reach out to [email protected].


Announcing ASP.NET Providers Connected Service Visual Studio Extension

The provider pattern was introduced in ASP.NET 2.0, and it gives developers the flexibility to choose where to store the state of ASP.NET features (e.g., Session State, Membership, Output Cache). In ASP.NET 4.6.2, we added async support for the Session State provider and the Output Cache provider. These providers offer much better scalability and enable the web application to adapt to the cloud environment. Furthermore, we also released SqlSessionStateProviderAsync, CosmosDBSessionStateProviderAsync, RedisSessionStateProvider, and SQLAsyncOutputCacheProvider. Through these providers, web applications can store session state in Azure resources like SQL Azure, CosmosDB, and Redis Cache, and the output cache in SQL Azure. With so many options, it may not be straightforward to pick one and configure it correctly in the application. Today we are releasing the ASP.NET Providers Connected Service Visual Studio Extension to help you pick the right provider and configure it properly to work with Azure resources. This extension is your one-stop shop for installing and configuring all the ASP.NET providers that are Azure-ready.

How to install the extension

The ASP.NET Providers Connected Service extension can be installed on Visual Studio 2017. You can install it through Extensions and Updates in Visual Studio by typing “ASP.NET Providers Connected Service” in the search box, or you can download the extension from the Visual Studio Marketplace.

How to use the extension

To use the extension, you need to make sure that your web application targets .NET Framework 4.6.2 or higher. You can open the extension by right-clicking on the project, selecting Add, and clicking on Connected Service. You will see all the Connected Services installed in your Visual Studio that apply to your project.

After clicking on the Microsoft ASP.NET Providers extension, you will see the following wizard window, where you can choose the provider you want to install and configure for your ASP.NET web application. Currently we have two sets of providers: Session State providers and the Output Cache provider.

Select a provider and click the Next button. You will see a list of providers that apply to your application and connect with Azure resources. Currently we have the SQL SessionState provider, CosmosDB SessionState provider, Redis Cache SessionState provider, and SQL OutputCache provider.

After the provider is chosen, the wizard will lead you to select the Azure instance to be used by the selected provider. In order to fetch the Azure instances that apply to the selected provider, you will need to sign in with your account in Visual Studio. Then select an Azure instance and click the Finish button; the extension will install the relevant NuGet packages and update the web.config file to connect the provider with the selected Azure instance.

Things to be aware of

  1. If the application is already configured with a provider and you want to install the same type of provider, you need to remove the existing provider first. For example, if your application is using the SQL SessionState provider and you want to switch to the CosmosDB SessionState provider, you need to remove the SessionState provider settings in web.config; then you can use ASP.NET Providers Connected Services to install and configure the CosmosDB SessionState provider.
  2. If you are installing the async SQL SessionState provider or the async SQL OutputCache provider, you need to replace the user name and password in the connection string that ASP.NET Providers Connected Services adds to web.config, since you may have multiple accounts in your Azure SQL Database instance.

Summary

ASP.NET Providers Connected Services helps you install and configure ASP.NET providers for your web application to consume Azure services. Our goal with this Visual Studio extension is to make configuration easier by providing a central place to set up different providers for ASP.NET web applications and connect them with Azure. Please install the extension from the Visual Studio Marketplace today and let us know your feedback.