
Announcing the public preview of Azure AD support for FIDO2-based passwordless sign-in

Howdy folks,

I’m thrilled to let you know that you can now go passwordless with the public preview of FIDO2 security keys support in Azure Active Directory (Azure AD)! Many teams across Microsoft have been involved in this effort, and we’re proud to deliver on our vision of making FIDO2 technologies a reality to provide you with seamless, secure, and passwordless access to all your Azure AD-connected apps and services.

In addition, we turned on a new set of admin capabilities in the Azure AD portal that enable you to manage authentication factors for users and groups in your organization. In this first release, you can use them to manage a staged rollout of passwordless authentication using FIDO2 security keys and/or the Microsoft Authenticator application. Going forward, you’ll see us add the ability to manage all our traditional authentication factors (Multi-Factor Authentication (MFA), OATH tokens, phone number sign-in, etc.). Our goal is to enable you to use this one tool to manage all your authentication factors.

Why do we feel so strongly about passwordless?

Every day, more and more of our customers move to cloud services and applications. They need to know that the data and workloads they put in these services are secure. Unfortunately, passwords are no longer an effective security mechanism. We know from industry analysts that 81 percent of successful cyberattacks begin with a compromised username and password. Additionally, traditional MFA, while very effective, can be hard to use and has a very low adoption rate.

It’s clear we need to provide our customers with authentication options that are secure and easy to use, so they can confidently access information without having to worry about hackers taking over their accounts.

This is where passwordless authentication comes in. We believe it will help to significantly and permanently reduce the risk of account compromise.

Passwordless sign-in flow

Now, all Azure AD users can sign in password-free using a FIDO2 security key, the Microsoft Authenticator app, or Windows Hello. These strong authentication factors are based on the same world-class public/private key cryptography standards and protocols, and are protected by a biometric factor (fingerprint or facial recognition) or a PIN. Users apply the biometric factor or PIN to unlock the private key stored securely on the device. The key is then used to prove to the service who the user is and which device they are using.


Check out this video where Joy Chik, corporate vice president of Identity, and I talk more about this new standard for signing in. To learn more about why this should be a priority for you and your organization, read our whitepaper.

Let’s get you started!

To help you get started on your own passwordless journey, this week we’re rolling out a bonanza of public preview capabilities. These new features include:

  • A new Authentication methods blade in your Azure AD admin portal that allows you to assign passwordless credentials using FIDO2 security keys and passwordless sign-in with Microsoft Authenticator to users and groups.


FIDO2 hardware

Microsoft has teamed up with leading hardware partners, Feitian Technologies, HID Global, and Yubico, to make sure we have a range of FIDO2 form factors available at launch, including keys connecting via USB and NFC protocols. Sue Bohn has more details on those partnerships.

Please be sure to verify that any FIDO2 security keys you’re considering for your organization support the additional options required to be compatible with Microsoft’s implementation.


Our passwordless strategy

Our passwordless strategy is a four-step approach where we deploy replacement offerings, reduce the password surface area, transition to passwordless deployment, and finally eliminate passwords:


Today’s product launches are an important milestone in getting to passwordless. In addition, the engineering work we did to provide authentication methods management for administrators, along with user registration and management, will allow us to move even faster to improve credential management experiences and to bring new capabilities and credentials online more simply. We’re working with our Windows security engineering team to make FIDO2 authentication work for hybrid-joined devices.

Of course, we look forward to feedback from you across all of these features, to help us improve before we make them generally available.

Regards,

 Alex (Twitter: @Alex_A_Simons)

 Corporate VP of Program Management

 Microsoft Identity Division



Azure Data Box Heavy now generally available

Our customers continue to use the Azure Data Box family to move massive amounts of data into Azure. One of the regular requests that we receive is for a larger capacity option that retains the simplicity, security, and speed of the original Data Box. Last year at Ignite, we announced a new addition to the Data Box family that did just that – a preview of the petabyte-scale Data Box Heavy.

With thanks to those customers who provided feedback during the preview phase, I’m excited to announce that Azure Data Box Heavy has reached general availability in the US and EU!

How Data Box Heavy works

In many ways, Data Box Heavy is just like the original Data Box. You can order Data Box Heavy directly from the Azure portal, and copy data to Data Box Heavy using standard file or object protocols. Data is automatically secured on the appliance using AES 256-bit encryption. After your data is transferred to Azure, the appliance is wiped clean according to National Institute of Standards and Technology (NIST) standards.

But Data Box Heavy is also designed for a much larger scale than the original Data Box. Data Box Heavy’s one petabyte of raw capacity and multiple 40 Gbps connectors mean that a datacenter’s worth of data can be moved into Azure in just a few weeks.

Data Box Heavy

  • 1 PB per order
  • 770 TB usable capacity per order
  • Supports Block Blobs, Page Blobs, Azure Files, and Managed Disk
  • Copy to 10 storage accounts
  • 4 x RJ45 10/1 Gbps, 4 x QSFP 10/40 Gbps Ethernet
  • Copy data using standard NAS protocols (SMB/CIFS, NFS, Azure Blob Storage)

Data Box

  • 100 TB per order
  • 80 TB usable capacity per order
  • Supports Block Blobs, Page Blobs, Azure Files, and Managed Disk
  • Copy to 10 storage accounts
  • 2 x RJ45 10/1 Gbps, 2 x SFP+ 10 Gbps Ethernet
  • Copy data using standard NAS protocols (SMB/CIFS, NFS, Azure Blob Storage)

Data Box Disk

  • 40 TB per order/8 TB per disk
  • 35 TB usable capacity per order
  • Supports Block Blobs, Page Blobs, Azure Files, and Managed Disk
  • Copy to 10 storage accounts
  • USB 3.1, SATA II or III
  • Copy data using standard NAS protocols (SMB/CIFS, NFS, Azure Blob Storage)

Expanded regional availability

We’re also expanding regional availability for Data Box and Data Box Disk.

  • Data Box Heavy: US, EU
  • Data Box: US, EU, Japan, Canada, and Australia
  • Data Box Disk: US, EU, Japan, Canada, Australia, Korea, Southeast Asia, and US Government

Sign up today

Here’s how you can get started with Data Box Heavy: order a device directly from the Azure portal.

We’ll be at Microsoft Inspire again this year, so stop by our booth to say hello to the team!


Configuring a Server-side Blazor app with Azure App Configuration


With .NET Core 3.0 Preview 6, we added authentication & authorization support to server-side Blazor apps. It only takes a matter of seconds to wire up an app to Azure Active Directory with support for single or multiple organizations. Once the project is created, it contains all the configuration elements in its appsettings.json to function. This is great, but in a team environment – or in a distributed topology – configuration files lead to all sorts of problems. In this post, we’ll take a look at how we can extract those configuration values out of JSON files and into an Azure App Configuration instance, where they can be used by other teammates or apps.

Setting up Multi-org Authentication

In the .NET Core 3.0 Preview 6 blog post we explored how to use the Individual User Accounts option in the authentication dialog to set up a Blazor app with ASP.NET Identity, so we won’t go into too much detail. Essentially, you click the Change link during project creation.

Click Change Auth during project creation

In this example I’ll be using an Azure Active Directory application to allow anyone with a Microsoft account to log into the app, so I’ll select Work or School Accounts and then select Cloud – Multiple Organizations in the Change Authentication dialog.

The Visual Studio add authentication dialog.

Once the project is created, my AzureAD configuration node contains the three key pieces of information my app’s code will need to authenticate against Azure Active Directory: my tenant URL, the client ID for the AAD app Visual Studio created for me during the project’s creation, and the callback URI so users can get back to my app once they’ve authenticated.

The appsettings.json file including the settings.
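
For context, here’s roughly how the project template consumes that configuration node. This is a sketch based on the Microsoft.AspNetCore.Authentication.AzureAD.UI package the template references; your generated Startup.cs may differ slightly. The important point is that the app simply binds an AzureAd section from IConfiguration, so it doesn’t care which provider ends up supplying the values:

// In Startup.cs (template-generated, abbreviated; exact code may vary by template version)
// using Microsoft.AspNetCore.Authentication.AzureAD.UI;
public void ConfigureServices(IServiceCollection services)
{
    services.AddAuthentication(AzureADDefaults.AuthenticationScheme)
            .AddAzureAD(options => Configuration.Bind("AzureAd", options));
    ...
}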

While this is conveniently placed in my appsettings.json file, it’d be even better if I didn’t need any local configuration files at all. A centralized configuration-management solution would be easier to manage and would let me keep my config out of source control, should there come a point when things like connection strings need to be shared amongst developers.

Azure App Configuration

Azure App Configuration is a cloud-based solution for managing all of your configuration values. Once I have an Azure App Configuration instance set up in my subscription, adding the configuration settings is simple. By default, they’re hidden from view, but I can click Show Values or select an individual setting for editing or viewing.

The config values in Azure App Configuration

Convenient .NET Core IConfiguration Integration

The Azure App Configuration team has shipped a NuGet package containing extensions to ASP.NET and .NET Core that let developers use the service without changing the code that already makes use of IConfiguration. To start with, install the Microsoft.Extensions.Configuration.AzureAppConfiguration NuGet package.

Adding the NuGet Package for Azure App Configuration

You’ll need to copy the connection string from the Azure Portal to enable connectivity between your app and Azure App Configuration.

Copying the Azure App Configuration connection string

Once that value has been copied, you can store it either with dotnet user-secrets or in a debug-time environment variable. Though it seems like we’ve created yet one more configuration value to track, think about it this way: this is the only value you’ll have to set using an environment variable; all your other configuration can be set via Azure App Configuration in the portal.

Setting up the Azure App Configuration connection string in an environment variable

Using the Azure App Configuration Provider for .NET Core

Once the NuGet package is installed, the code required to make my .NET Core app use Azure App Configuration whenever it reads configuration values from IConfiguration is simple. In Program.cs I’ll call ConfigureAppConfiguration on the host builder, then use the AddAzureAppConfiguration extension method to get the connection string from my ASPNETCORE_AzureAppConfigConnectionString environment variable. If the environment variable isn’t set, the call will no-op and the other configuration providers will do the work.

This is great, because I don’t even need to change existing (or in this case, template-generated) code; I just tell my app to use Azure App Configuration and I’m off to the races. The full update to Program.cs is shown below.

// using Microsoft.Extensions.Configuration.AzureAppConfiguration;

public static IHostBuilder CreateHostBuilder(string[] args) =>
    Host.CreateDefaultBuilder(args)
        .ConfigureAppConfiguration((hostingContext, config) =>
        {
            config.AddAzureAppConfiguration(options =>
            {
                var azureAppConfigConnectionString =
                    hostingContext.Configuration["AzureAppConfigConnectionString"];
                options.Connect(azureAppConfigConnectionString);
            });
        })
        .ConfigureWebHostDefaults(webBuilder =>
        {
            webBuilder.UseStartup<Startup>();
        });

When I run the app, it first reaches out to Azure App Configuration to get all the settings it needs to run and then works as if it were configured locally using appsettings.json. As long as my teammates or other services needing these values have the connection string to the Azure App Configuration instance holding the settings for the app, they’re good.

Running the authenticated app

Now, I can remove the configuration values entirely from the appsettings.json file. If I want to control the logging behavior using Azure App Configuration, I could move these left-over settings out, too. Even though I’ll be using Azure App Configuration as my primary configuration source, the other providers are still there.

The appsettings.json with the settings removed.

Dynamic Re-loading

Log levels are a good example of how the Azure App Configuration service can enable dynamic reloading of configuration settings you might need to tweak frequently. By moving my logging configuration into Azure App Configuration, I can change the log level right in the portal. In Program.cs, I can use the Watch method to specify which configuration settings I’ll want to reload when they change.

public static IHostBuilder CreateHostBuilder(string[] args) =>
    Host.CreateDefaultBuilder(args)
        .ConfigureAppConfiguration((hostingContext, config) =>
        {
            config.AddAzureAppConfiguration(options =>
            {
                var azureAppConfigConnectionString =
                    hostingContext.Configuration["AzureAppConfigConnectionString"];
                options.Connect(azureAppConfigConnectionString)
                       .Watch("Logging:LogLevel:Default")
                       .Watch("Logging:LogLevel:Microsoft")
                       .Watch("Logging:LogLevel:Microsoft.Hosting.Lifetime");
            });
        })
        .ConfigureWebHostDefaults(webBuilder =>
        {
            webBuilder.UseStartup<Startup>();
        });

The default refresh interval is 30 seconds, so should I need to turn up the volume on my logs to get a better view of what’s happening in my site, I don’t need to redeploy or even stop my site. Simply changing the values in the portal is enough; 30 seconds later the values are reloaded from Azure App Configuration and my logging becomes more verbose.

Changing configuration values in the portal
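
To see the effect in code, here’s a minimal, hypothetical sketch (LogLevelReporter isn’t part of the template; it’s purely illustrative) showing that anything reading IConfiguration picks up the refreshed value without a restart:

// using Microsoft.Extensions.Configuration;
public class LogLevelReporter
{
    private readonly IConfiguration _configuration;

    public LogLevelReporter(IConfiguration configuration)
    {
        _configuration = configuration;
    }

    // Reads the value on each call, so once the watched key is refreshed
    // (about 30 seconds after a change in the portal), the new level
    // shows up here automatically.
    public string CurrentDefaultLogLevel => _configuration["Logging:LogLevel:Default"];
}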

Configuration Source Ordering

The JsonConfigurationSource configuration sources – those which load settings from appsettings.json and appsettings.{Environment}.json – are loaded during the call to CreateDefaultBuilder. So, by the time I call AddAzureAppConfiguration to load in the AzureAppConfigurationSource, the JSON file providers are already in the configuration sources list.

The importance of ordering is evident here; should I want to override the configuration values coming from Azure App Configuration with my local appsettings.json or appsettings.Development.json files, I’d need to re-order the providers in the call to ConfigureAppConfiguration. Otherwise, the JSON file values will be loaded first, then the last source (the one that will “win”) will be the Azure App Configuration source.
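
One way to do that re-ordering, shown here as a minimal sketch rather than a definitive recipe, is to add the Azure App Configuration source first and then re-add the JSON files so they sit later in the source list and win on conflicting keys. Note that the re-added JSON files would then also override sources such as environment variables that CreateDefaultBuilder registered earlier:

// using Microsoft.Extensions.Configuration;
// using Microsoft.Extensions.Configuration.AzureAppConfiguration;
public static IHostBuilder CreateHostBuilder(string[] args) =>
    Host.CreateDefaultBuilder(args)
        .ConfigureAppConfiguration((hostingContext, config) =>
        {
            var env = hostingContext.HostingEnvironment;
            var azureAppConfigConnectionString =
                hostingContext.Configuration["AzureAppConfigConnectionString"];

            // Azure App Configuration goes in first...
            config.AddAzureAppConfiguration(options =>
                options.Connect(azureAppConfigConnectionString));

            // ...then the JSON files are re-added so they sit last in the source list,
            // meaning their values override matching keys from Azure App Configuration
            // (and from the other sources CreateDefaultBuilder added earlier).
            config.AddJsonFile("appsettings.json", optional: true, reloadOnChange: true)
                  .AddJsonFile($"appsettings.{env.EnvironmentName}.json",
                               optional: true, reloadOnChange: true);
        })
        .ConfigureWebHostDefaults(webBuilder =>
        {
            webBuilder.UseStartup<Startup>();
        });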

Try it Out

Any multi-node or microservice-based application topology benefits from centralized configuration, and teams benefit from it by not having to keep track of so many configuration settings, environment variables, and so on. Take a look over the Azure App Configuration documentation. You’ll see that there are a multitude of other features, like Feature Flags and dark deployment support. Then, create an instance and try wiring your existing ASP.NET Core code up to read configuration values from the cloud.

Brady Gaster

Senior Program Manager, ASP.NET Core



Azure premium files now generally available

Highly performant, fully managed file service in the cloud!

Today, we are excited to announce the general availability of Azure premium files for customers optimizing their cloud-based file shares on Azure. Premium files offers a higher level of performance built on solid-state drives (SSD) for fully managed file services in Azure.

Premium tier is optimized to deliver consistent performance for IO-intensive workloads that require high throughput and low latency. Premium file shares store data on the latest SSDs, making them suitable for a wide variety of workloads like databases, persistent volumes for containers, home directories, content and collaboration repositories, media and analytics, highly variable and batch workloads, and enterprise applications that are performance sensitive. Our existing standard tier continues to provide reliable performance at a low cost for workloads less sensitive to performance variability, and is well suited for general-purpose file storage, development/test, backups, and applications that do not require low latency.

Through our initial introduction and preview journey, we’ve heard from hundreds of our customers from different industries about their unique experiences. They’ve shared their learnings and success stories with us and have helped make premium file shares even better.

“Working with clients that have large amounts of data that is under FDA or HIPAA regulations, we always struggled in locating a good cloud storage solution that provided SMB access and high bandwidth… until Azure Files premium tier. When it comes to a secure cloud-based storage that offers high upload and download speeds for cloud and on-premises VM clients, Azure premium files definitely stands out.”

– Christian Manasseh, Chief Executive Officer, Mobius Logic

“The speeds are excellent. The I/O intensive actuarial CloudMaster software tasks ran more than 10 times faster in the Azure Batch solution using Azure Files premium tier. Our application has been run by our clients using 1000’s of cores and the Azure premium files has greatly decreased our run times.”

– Scott Bright, Manager Client Data Services, PolySystems

Below are the key benefits of the premium tier. If you’re looking for more technical details, read the previous blog post “Premium files redefine limits for Azure Files.”

Performant, dynamic, and flexible

With premium tier, performance is what you define. Premium file shares’ performance can instantly scale up and down to fit your workload characteristics. Premium file shares can massively scale up to 100 TiB capacity and 100K IOPS with a target total throughput of 10 GiB/s. Not only do premium shares include the ability to dynamically tune performance, they also offer bursting capability to meet highly variable workload requirements with short peak periods of intense IOPS.

“We recently migrated our retail POS microservices to Azure Kubernetes Service with premium files. Our experience has been simply amazing – premium files permitted us to securely deploy our 1.2K performant Firebird databases. No problem with size or performance, just adapt the size of the premium file share to instantly scale. It improved our business agility, much needed to serve our rapidly growing customer base across multiple retail chains in France.”

– Arnaud Le Roy, Chief Technology Officer, Menlog

We partnered with our internal Azure SQL and Microsoft Power BI teams to build solutions on premium files. As a result, Azure Database for PostgreSQL and Azure Database for MySQL recently opened a preview of increased scale, with 16 TiB databases and 20,000 IOPS, powered by premium files. Microsoft Power BI announced a preview of its enhanced dataflows compute engine, up to 20 times faster, built upon Azure Files premium tier.

Global availability with predictable cost

Azure Files premium tier is currently available in 19 Azure regions globally. We are continually expanding regional coverage. You can check the Azure region availability page for the latest information.

Premium tier provides the most cost-effective way to create highly performant and highly available file shares in Azure. Pricing is simple and cost is predictable: you pay a single price per provisioned GiB. Refer to the pricing page for additional details.

Seamless Azure experience

Customers receive all features of Azure Files in this new offering, including snapshot/restore, Azure Kubernetes Service and Azure Backup integration, monitoring, hybrid support via Azure File Sync, Azure portal, PowerShell/CLI/Cloud Shell, AzCopy, Azure Storage Explorer support, and the list goes on. Developers can leverage their existing code and skills to migrate applications using familiar Azure Storage client libraries or Azure Files REST APIs. The opportunities for future integration are limitless. Reach out to us if you would like to see more.

With the availability of premium tier, we’re also enhancing the standard tier. To learn more, visit the onboarding instructions for the standard files 100 TiB preview.

Get started and share your experiences

Getting started with premium file shares is simple and takes just two minutes. Please see the detailed steps for how to create a premium file share.

Visit Azure Files premium tier documentation to learn more. As always, you can share your feedback and experiences on the Azure Storage forum or email us at azurefiles@microsoft.com. Post your ideas and suggestions about Azure Storage on our feedback forum.


First Microsoft cloud regions in Middle East now available

This blog post was co-authored by Paul Lorimer, Distinguished Engineer, Office 365.


Azure and Office 365 generally available today, Dynamics 365 and Power Platform available by end of 2019

Today, Microsoft Azure and Microsoft Office 365 are taking a major step together to help support the digital transformation of our customers. Both Azure and Office 365 are now generally available from our first cloud datacenter regions in the Middle East, located in the United Arab Emirates (UAE). Dynamics 365 and Power Platform, offering the next generation of intelligent business applications and tools, are anticipated to be available from the cloud regions in UAE by the end of 2019.

The opening of the new cloud regions in Abu Dhabi and Dubai marks the first time Microsoft will deliver cloud services directly from datacenter locations in UAE and expands upon Microsoft’s existing investments in the Gulf and the wider Middle East region. By delivering the complete Microsoft cloud – Azure, Office 365, and Dynamics 365 – from datacenters in a given geography, we offer scalable, highly available, and resilient cloud services for organizations while helping them meet their data residency, security, and compliance needs.

Our new cloud regions adhere to Microsoft’s trusted cloud principles and join one of the largest and most secure cloud infrastructures in the world, already serving more than a billion customers and 20 million businesses. Microsoft has deep expertise in data protection, security, and privacy, including the broadest set of compliance certifications in the industry, and we are the first cloud service provider in UAE to achieve the Dubai Electronic Security Center certification for its cloud services. Our continued focus on our trusted cloud principles and leadership in compliance means customers in the region can accelerate their digital transformation with confidence and with the foundation to achieve compliance for their own applications.

Local datacenter infrastructure stimulates economic development for both customers and partners alike, enabling companies, governments, and regulated industries to realize the benefits of the cloud for innovation, as well as bolstering the technology ecosystem that supports the innovation. We anticipate the cloud services delivered from UAE to have a positive impact on job creation, entrepreneurship, and economic growth across the region. The International Data Corporation (IDC) predicts that cloud services could bring more than half a million jobs to the Middle East, including the potential of more than 55,000 new jobs in UAE, between 2017 and 2022.

Microsoft also continues to help bridge the skills gap amongst the IT community and to enhance technical acumen for cloud services. Cloud Society, a Middle East and Africa focused program building upon Microsoft Learn, has trained over 150,000 IT professionals in MEA. The community will further benefit from the increased availability and performance of cloud services delivered from UAE to help realize enterprise benefits of cloud, upskill in migration, and more effectively manage their cloud infrastructure.

You can learn more by following these links: Microsoft Middle East and Africa News Center, Microsoft Azure United Arab Emirates, Microsoft Office 365, Microsoft Dynamics 365, and Microsoft Power Platform.


What’s new in Azure SignalR 1.1.0 Preview 1


We just shipped 1.1.0 Preview 1 of the Azure SignalR Service SDK to support some new features in ASP.NET Core 3.0, including endpoint routing and server-side Blazor. Let’s take a look at how you can use them in your Azure SignalR application.

Here is the list of what’s new in this release:

  • Endpoint routing support for ASP.NET Core 3
  • Use SignalR service in server-side Blazor apps
  • Server stickiness

Endpoint routing support for ASP.NET Core 3

For those who are using Azure SignalR, you should be familiar with AddAzureSignalR() and UseAzureSignalR(). These two methods are required if you want to switch your app server from self-hosted SignalR to use Azure SignalR.

A typical Azure SignalR application usually looks like this in Startup.cs (note where AddAzureSignalR() and UseAzureSignalR() are used):

public void ConfigureServices(IServiceCollection services)
{
    ...
    services.AddSignalR()
            .AddAzureSignalR();
    ...
}

public void Configure(IApplicationBuilder app)
{
    ...
    app.UseAzureSignalR(routes =>
    {
        routes.MapHub<Chat>("/chat");
    });
    ...
}

ASP.NET Core 3.0 introduced new endpoint routing support, which allows routable things like MVC and SignalR to be mixed together in a unified UseEndpoints() interface.

For example, you can call MapGet() and MapHub() in a single UseEndpoints() call, like this:

app.UseEndpoints(routes =>
{
    routes.MapGet("/foo", async context =>
    {
        await context.Response.WriteAsync("bar");
    });

    routes.MapHub<Chat>("/chat");
});

This new syntax is also supported in the latest Azure SignalR SDK so you don’t need to use a separate UseAzureSignalR() to map hubs.

Now your Azure SignalR application looks like this:

public void ConfigureServices(IServiceCollection services)
{
    ...
    services.AddSignalR()
            .AddAzureSignalR();
    ...
}

public void Configure(IApplicationBuilder app)
{
    ...
    app.UseRouting();

    app.UseEndpoints(routes =>
    {
        routes.MapHub<Chat>("/chat");
    });
    ...
}

The only change you need to make, compared to a self-hosted SignalR app, is to call AddAzureSignalR() after AddSignalR().

This is very useful when SignalR is deeply integrated into your code base or into a library you’re using, for example, server-side Blazor.

Use SignalR service in server-side Blazor apps

Server-side Blazor is a new way to build interactive client-side web UI in ASP.NET Core 3. In server-side Blazor, UI updates are rendered at server side, then sent to browser through a SignalR connection. Since it uses SignalR, there is a natural need to use Azure SignalR service to handle the SignalR traffic so your application can easily scale.


If you look at some server-side Blazor code samples, you’ll see they have a call to MapBlazorHub() to set up the communication channel between client and server.

app.UseEndpoints(endpoints =>
{
    ...
    endpoints.MapBlazorHub();
    ...
});

The implementation of this method calls MapHub() to create a SignalR hub at the server side. Before this release, there was no way to change the implementation of MapBlazorHub() to use SignalR service. Now if you call AddAzureSignalR(), MapBlazorHub() will also use SignalR service to host the hub instead of hosting it on the server.

Please follow these steps to change your server-side Blazor app to use SignalR service:

  1. Open your Startup.cs and add services.AddSignalR().AddAzureSignalR() in ConfigureServices() (see the sketch after this list).
  2. Create a new SignalR service instance.
  3. Get the connection string and set it in the environment variable Azure:SignalR:ConnectionString.
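
Here’s a minimal sketch of what step 1 looks like. The surrounding registrations (AddRazorPages(), AddServerSideBlazor()) are what a template-generated server-side Blazor app typically contains and may differ slightly in your project; the only Azure SignalR-specific line is the last one:

public void ConfigureServices(IServiceCollection services)
{
    services.AddRazorPages();
    services.AddServerSideBlazor();

    // Route the underlying SignalR (Blazor hub) traffic through Azure SignalR Service.
    services.AddSignalR().AddAzureSignalR();
}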

Then run your app. You’ll see that the WebSocket connection goes through SignalR service.

Check out this repo for a complete code sample.

Server stickiness

The typical connection flow when using SignalR service is that the client first negotiates with the app server to get the URL of the SignalR service, and the service then routes the client to the app server.

When you have multiple app servers, there is no guarantee that the two servers (the one that handles the negotiation and the one that receives the hub invocation) will be the same one.

We hear from a lot of customers asking whether it’s possible to make the two servers the same one so they can share some state between negotiation and hub invocation. In this release we have added a new “server sticky mode” to support this scenario.

To enable this, you just need to set ServerStickyMode to Required in AddAzureSignalR():

services.AddSignalR().AddAzureSignalR(options =>
{
    options.ServerStickyMode = ServerStickyMode.Required;
});

Now for any connection, SignalR service will guarantee negotiation and hub invocation go to the same app server (called “server sticky”).

This feature is very useful when you have client state information maintained locally on the app server. For example, when using server-side Blazor, UI state is maintained at the server side, so you want all client requests, including the SignalR connection, to go to the same server. That’s why you need to set server sticky mode to Required when using server-side Blazor together with SignalR service.

Please note that in this mode there may be additional cost for the service to route connections to the right app server, so there may be some negative impact on message latency. If you don’t want the performance penalty, there is another Preferred mode you can use. In this mode stickiness is not always guaranteed (only when there is no additional cost to do the routing), but you can still gain some performance benefits, as message delivery is more efficient when sender and receiver are on the same app server. Also, when sticky mode is enabled, the service won’t balance connections between app servers (by default SignalR service balances the traffic by routing to the server with the fewest connections). So we recommend keeping sticky mode set to Disabled (the default value) and enabling it only when there is a need.
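
For example, opting into Preferred mode is the same one-line change as above (a sketch using the same options callback):

services.AddSignalR().AddAzureSignalR(options =>
{
    // Stick to the same app server when routing adds no extra cost;
    // otherwise fall back to the default load balancing behavior.
    options.ServerStickyMode = ServerStickyMode.Preferred;
});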

You can refer to this doc for more details about server sticky mode.

Ken Chen

Principal Software Engineering Manager



UCLA Health adopts Microsoft Azure to accelerate medical research and improve patient care

Cloud computing will help enable the delivery of more personalized health care for UCLA patients

LOS ANGELES and REDMOND, Wash. — May 30, 2019 — UCLA Health is deploying Microsoft cloud computing services to enable UCLA Health and the David Geffen School of Medicine to synthesize vast amounts of clinical and research data to speed medical discoveries and improve patient care.

Microsoft Azure, a cloud computing service, will provide UCLA Health with a standard platform through which disparate clinical information and research tools can be secured and managed by the health system. The solution will provide UCLA Health with advanced computing tools to more rapidly interpret and mobilize insights from such data and to enhance collaboration among researchers.

“Our data capabilities with Microsoft Azure will bring more medical discoveries and effective therapies to patients faster,” said Michael Pfeffer, M.D., assistant vice chancellor and chief information officer for UCLA Health Sciences. “The integration of information from structured data, like lab results and medication information, with unstructured data, like documentation, genomics and medical images, creates an incredibly powerful big-data learning platform for discovery.”

UCLA scientists will use the cloud computing tools to more efficiently analyze a variety of data sources. The artificial intelligence (AI) embedded in the tools enables speedy processing of data to glean insights for use by physicians and researchers. Machine learning enables software to recognize and act on important data patterns without the need for human instruction, producing discoveries as never before.

“Analyzing large data sets to make scientific discoveries is a race against time,” said Mohammed Mahbouba, M.D., chief data officer for UCLA Health Sciences. “Using machine learning to analyze a combination of clinical and genomics data can provide critical insights, but doing so with a traditional computing infrastructure can require significant processing time. Azure enables us to quickly deploy and scale high-performance computing environments that can reduce the required processing time sometimes from months to days to make discoveries.”

UCLA Health’s move to cloud computing is intended to advance the health system’s delivery of precision health, or the use of data and a patient’s individual circumstances, to tailor a more effective treatment. In 2017, UCLA Health and the David Geffen School of Medicine launched the UCLA Institute for Precision Health, led by Daniel Geschwind, M.D., Ph.D., to bring together faculty across multiple disciplines to make large-scale genetic and genomic research actionable for patient care. The David Geffen School of Medicine also partnered with the UCLA Samueli School of Engineering to establish a department of computational medicine, chaired by Eleazar Eskin, Ph.D., to leverage scholarship in data sciences to discover new approaches to analyzing health data.

“We are committed to creating better patient outcomes by providing UCLA Health with Microsoft Azure cloud and AI solutions to improve treatments and lives,” said Peter Lee, corporate vice president, Microsoft Healthcare. “By connecting health data and systems in the cloud in an interoperable way, we’re excited we can help advance health care data for more efficient and personalized care.”

The Azure platform employs industry-leading technology to help protect and secure sensitive data, allowing UCLA Health to continue to ensure compliance with the Health Insurance Portability and Accountability Act, or HIPAA. Patient data in UCLA Health’s platform will not be shared with Microsoft as part of this agreement.

“Another advantage of cloud computing is the way it enables UCLA researchers to more efficiently and securely work with their peers,” said Paul Boutros, Ph.D., director of Cancer Data Science at UCLA Jonsson Comprehensive Cancer Center.

“Cloud computing will allow researchers from different fields and institutions to collaborate, joining data sets and software from different formats that could not previously be integrated in a simple way,” Boutros said. “We’re bringing together new communities of experts — including computer scientists, engineers, material scientists and others — to solve the biggest health care questions. This platform allows us to provide our research collaborators with secure access to important data in one place, without moving sensitive, private health information.”

The platform’s capabilities will also enable UCLA Health to use predictive analytics, or the analysis of historical data to model and assess what might happen in the future, to aid in disease prevention.

More about UCLA Health’s efforts in precision health can be found at https://www.uclahealth.org/precision-health/.

About UCLA Health

UCLA Health is among the world’s most comprehensive academic health systems, with a mission to provide state-of-the-art patient care, train top medical professionals and support pioneering research and discovery. UCLA Health includes four hospitals on two campuses — Ronald Reagan UCLA Medical Center; UCLA Medical Center, Santa Monica; UCLA Mattel Children’s Hospital; and the Stewart and Lynda Resnick Neuropsychiatric Hospital at UCLA — as well as the David Geffen School of Medicine at UCLA. UCLA Health also offers an array of primary and specialty services at more than 170 clinics throughout Southern California. UCLA Health hospitals in Westwood and Santa Monica ranked No. 1 in Los Angeles and No. 7 nationally in the 2018-19 U.S. News & World Report assessment.

About Microsoft

Microsoft (Nasdaq “MSFT” @microsoft) enables digital transformation for the era of an intelligent cloud and an intelligent edge. Its mission is to empower every person and every organization on the planet to achieve more.

For more information, press only:

Microsoft Media Relations, WE Communications for Microsoft, (425) 638-7777, rrt@we-worldwide.com

Ryan Hatoum, UCLA Health, (310) 267-8304, rhatoum@mednet.ucla.edu

Note to editors: For more information, news and perspectives from Microsoft, please visit the Microsoft News Center at http://news.microsoft.com. Web links, telephone numbers and titles were correct at time of publication, but may have changed. For additional assistance, journalists and analysts may contact Microsoft’s Rapid Response Team or other appropriate contacts listed at http://news.microsoft.com/microsoft-public-relations-contacts.


All US Azure regions now approved for FedRAMP High impact level

Today, I’m excited to share that Azure public services now meet the US Federal Risk and Authorization Management Program (FedRAMP) High impact level, and that we have extended our FedRAMP High Provisional Authorization to Operate (P-ATO) to all of our Azure public regions in the United States. In October, we told customers of our plan to expand public cloud services and regions from FedRAMP Moderate to FedRAMP High impact level. FedRAMP High was previously available only to customers using Azure Government. Additionally, we’ve increased the number of services available at the High impact level to 90, including powerful services like Azure Policy and Azure Security Center, as we continue to drive to 100 percent FedRAMP compliance for all Azure services per our published listings and roadmap. Azure continues to support more services at the FedRAMP High impact level than any other cloud provider.

Achieving FedRAMP High means that both Azure public and Azure Government data centers and services meet the demanding requirements of FedRAMP High, making it easier for more federal agencies to benefit from the cost savings and rigorous security of the Microsoft Commercial Cloud.

While FedRAMP High in the Azure public cloud will meet the needs of many US government customers, certain agencies with more stringent requirements will continue to rely on Azure Government, which provides additional safeguards such as the heightened screening of personnel. We announced earlier the availability of new FedRAMP High services available for Azure Government.

FedRAMP was established to provide a standardized approach for assessing, monitoring, and authorizing cloud computing products and services for federal agencies, and to accelerate the adoption of secure cloud solutions by federal agencies. The Office of Management and Budget now requires all executive federal agencies to use FedRAMP to validate the security of cloud services. Cloud service providers demonstrate FedRAMP compliance through an Authority to Operate (ATO) or a Provisional Authority to Operate (P-ATO) from the Joint Authorization Board (JAB). FedRAMP authorizations are granted at three impact levels based on NIST guidelines: low, moderate, and high.

Microsoft is working closely with our stakeholders to simplify our approach to regulatory compliance for federal agencies, so that our government customers can gain access to innovation more rapidly by reducing the time required to take a service from available to certified. Our published FedRAMP services roadmap lists all services currently available in Azure Government within our FedRAMP High boundary, as well as services planned for the current year. We are committed to ensuring that our Azure services for government provide the best the cloud has to offer and that all Azure offerings are certified at the highest level of FedRAMP compliance.

New FedRAMP High Azure Government Services include:

  • Azure DB for MySQL
  • Azure DB for PostgreSQL
  • Azure DDoS Protection
  • Azure File Sync
  • Azure Lab Services
  • Azure Migrate
  • Azure Policy
  • Azure Security Center
  • Microsoft Flow
  • Microsoft PowerApps

We will continue our commitment to provide our customers the broadest compliance in the industry, as Azure now supports 91 compliance offerings, more than any other cloud service provider. For a full listing of our compliance offerings, visit the Microsoft Trust Center.


Azure Artifacts updates include pay-per-GB pricing


Azure Artifacts is the one place for all of the packages, binaries, tools, and scripts your software team needs. It’s part of Azure DevOps, a suite of tools that helps teams plan, build, and ship software. For Microsoft Build 2019, we’re excited to announce some long-requested changes to the service.

Until now, a separate, additional license was required for anyone using Azure Artifacts, beyond the Azure DevOps Basic license. We heard your feedback that this was inflexible, hard to manage, and often not cost-effective, and we’ve removed it. Now, Azure Artifacts charges only for the storage you use, so that every user in your organization can access and share packages.

Every organization gets 2 GB of free storage. Additional storage usage is charged according to tiered rates starting at $2 per GB and decreasing to $0.25 per GB. Full details can be found on our pricing page.

We’ve had support for Python packages, as well as our own Universal Packages, in public preview for some time. As of now, both are generally available and ready for all of your production workloads.

If you’re developing an open source project using a public Azure Repo or a repo on GitHub, you might want to share nightly or pre-release versions of your packages with your project team. Azure Artifacts public feeds will enable you to do just that, backed by the same scale and reliability guarantees as the private feeds you use for internal development. Interested in joining the preview? Get in touch (@alexmullans on Twitter).

With Azure Artifacts, your teams can manage all of their artifacts in one place, with easy-to-configure permissions that help you share packages across the entire organization, or just with people you choose. Azure Artifacts hosts common package types:

  • Maven (for Java development)
  • npm (for Node.js and JavaScript development)
  • NuGet (for .NET, C#, etc. development)
  • Python

Screenshot of Azure Artifacts

If none of those are what you need, Azure Artifacts provides Universal Packages, an easy-to-use and lightweight package format that can take any file or set of files and version them as a single entity. Universal Packages are fast, using deduplication to minimize the amount of content you upload to the service.

Azure Artifacts is also a symbol server. Publishing your symbols to Azure Artifacts enables engineers in the next room or on the next continent to easily debug the packages you share.

Artifacts are most commonly used as part of DevOps processes and pipelines, so we’ve naturally integrated Azure Artifacts with Azure Pipelines. It’s easy to consume and publish packages to Azure Artifacts in your builds and releases.

We’re excited for you to try Azure Artifacts. If you’ve got questions, comments, or feature suggestions, get in touch on Twitter (@alexmullans) or leave a comment.

Alex Mullans

Senior Program Manager, Azure Artifacts



Azure introduces new innovations for SAP HANA, expanded AI collaboration with SAP

For many enterprises, modernizing ERP systems is key to achieving their digital transformation goals. At Microsoft, we are committed to supporting our customers by offering the single best infrastructure choice that exists for SAP HANA, bar none.

In terms of raw capabilities, we not only have the largest number of SAP HANA-certified offerings (25 configurations that span virtual machines and purpose-built bare metal instances from 192 GB to 24 TB), but also the widest footprint of regions with SAP HANA-certified infrastructure (26, with plans to launch 8 more by the end of 2019). We also support some of the largest deployments of SAP HANA in the public cloud, such as CONA Services.

In partnership with SAP, we are very happy to announce multiple enhancements to SAP on Azure at SAPPHIRE NOW. We will offer our customers even more infrastructure choices, giving them greater VM memory and more options around bare metal instances and business continuity.

In addition, we are announcing deeper integration between SAP and Azure around AI, data protection, and identity. These integrations will help our joint customers accelerate their digital transformation with the power of the cloud.

Here’s what’s new:

  • 6 TB and 12 TB VMs for SAP HANA: Azure’s Mv2 VM series will be available on May 13, offering virtual machines with up to 6TB RAM on a single VM. This is by far the largest-memory SAP HANA-certified configuration offered on any virtual machine in the public cloud. 6TB Mv2 VMs will be generally available and production certified in U.S. East and U.S. East 2 regions. U.S. West 2, Europe West, Europe North and Southeast Asia regions will be available in the coming months.

    In addition, 12TB Mv2 VMs will become available and production certified for SAP HANA in Q3 2019. With this, customers with large-scale SAP HANA deployments can take advantage of the agility offered by Azure Virtual Machines to speed SAP release cycles by spinning up dev/test systems in minutes and simplify operational processes with Azure’s integrated tools for automated patching, monitoring, backup and disaster recovery.

  • Largest Bare Metal Instance with Intel Optane for SAP HANA: In Q4 2019 we plan to launch the largest Intel Optane optimized bare metal instances in the cloud with our SAP HANA on Azure Large Instances, including general availability of a 4 socket, 9TB memory instance and a preview of an 8 socket, 18TB memory instance. These instances enable customers to benefit from faster load times for SAP HANA data in case of a restart, offering lower Recovery Time Objective (RTO) and a reduced TCO. To learn more, please get in touch with your Microsoft representative.
  • Integration of Azure AI in SAP’s digital platform: SAP’s machine learning capabilities will leverage Azure Cognitive Services containers in preview for face recognition and text recognition. By deploying Cognitive Services in containers, SAP will be able to analyze information closer to the physical world where the data resides and deliver real-time insights and immersive experiences that are highly responsive and contextually aware.

    “SAP’s Machine Learning team is working with the Microsoft Azure Cognitive Services team to augment its own portfolio of home-grown and partner services by leveraging the containerized Vision and Text Recognition services for solving identity validation and text understanding use cases.” – Dr. Sebastian Wieczorek, VP, Head of SAP Leonardo Machine Learning Foundation

  • SAP Data Custodian on Microsoft Azure is now available: In September 2018, we announced our intent to make SAP Data Custodian, a SaaS offering, available on Microsoft Azure. We deliver on that promise today. Together, SAP and Microsoft offer unprecedented levels of data governance and compliance for our joint customers. Additionally, Microsoft will be a beta customer for SAP Data Custodian for our implementation of SAP SuccessFactors on Azure. For more information, you can read this blog from SAP.
  • Managed business continuity with Azure Backup for SAP HANA: Azure Backup support for SAP HANA databases is now in public preview. With this, customers can manage large-scale SAP HANA implementations with no infrastructure for backup. For more information, please refer to the Azure Backup for SAP HANA documentation.
  • Simplified integrations with Logic Apps connector for SAP: Today, the Logic Apps connector for SAP ECC and SAP S/4HANA is generally available for all customers. Azure Logic Apps is an integration platform-as-a-service offering connectors to 250+ applications and SaaS services. With this, customers can dramatically reduce time to market for integrations between SAP and best-in-class SaaS applications. For more information, check out our Logic Apps SAP connector documentation.
  • Boosted productivity and enhanced security with Azure Active Directory and SAP Cloud Platform: Today, Standards-based integration between Azure Active Directory and SAP Cloud Platform is in preview, enabling enhanced business security and experience. For example, when using SAP Cloud Platform Identity Provisioning and Identity Authentication Services, customers can integrate SAP SuccessFactors with Azure Active Directory and ensure seamless access to SAP applications such as SAP S/4HANA, improving end-user productivity while meeting enterprise security needs.

Customers benefiting from SAP on Azure

With more than 90% of the Fortune 500 using Microsoft Azure and SAP, our 25-year partnership with SAP has always been about mutual customer success. We are confident the announcements made today will help customers using SAP on Azure grow and innovate even more than they already are: Forrester’s Total Economic Impact Study found that SAP customers on Azure, on average, can realize an ROI of 102% with a payback in under nine months from their cloud investments.

Here are five reasons SAP customers increasingly choose Azure for their digital transformation, and some customers who are benefitting:

  • Business agility: With Azure’s on-demand SAP-certified infrastructure, customers can speed up dev/test processes, shorten SAP release cycles, and scale instantaneously to meet peak business usage. Daimler AG sped up procurement processes to deliver months faster than would have been possible in its on-premises environment, and it now powers 400,000 suppliers worldwide after moving to SAP S/4HANA on Azure’s M-series virtual machines.
  • Efficient insights: Dairy Farmers of America migrated its fragmented IT application landscape, spread across 18 data centers and including mission-critical SAP systems, over to Azure. It leverages Azure Data Services and Power BI to enable remote users to easily access SAP data in a simplified and secure manner.
  • Real-time operations with IoT: Coats, a world leader in industrial threads, migrated away from SAP on Oracle to SAP HANA on Azure several years ago, enabling Coats to optimize operations with newer IoT-driven processes. With IoT monitoring, Coats now predicts inventory, manufacturing and sales trends more accurately than ever before.
  • Transforming with AI: Carlsberg, a world leader in beer brewing, migrated 80% of its enterprise applications to Microsoft Azure, including mission critical SAP apps. By leveraging Azure AI and sensors from research universities in Denmark, Carlsberg’s Beer Fingerprinting Project enabled them to map a flavor fingerprint for each sample and reduce the time it takes to research taste combinations and processes by up to a third, helping the company get more distinct beers to market faster.
  • Mission-critical infrastructure: CONA Services, the services arm for Coca-Cola bottlers, chose Azure to run its 24 TB mission critical SAP BW on HANA system on Azure’s purpose-built SAP HANA Infrastructure, powering 160,000 orders a day, which represents an annual $21B of net sales value.

Over the past few years, we have seen customers across all industries and geographies running their mission-critical SAP workloads on Azure. Whether it’s customers in retail such as Co-op and Coca-Cola, Accenture and Malaysia Airlines in services, Astellas Pharma and Zuellig Pharma in pharmaceuticals, or Rio Tinto and Devon Energy in oil and gas, SAP on Azure helps businesses around the world with their digital transformation.

If you are at SAPPHIRE NOW, drop by the Microsoft booth #729 to learn about these product enhancements and to experience hands-on demos of these scenarios.