post

New integrations from Microsoft and NVIDIA unlock GPU acceleration for devs and data scientists

With ever-increasing data volume and latency requirements, GPUs have become an indispensable tool for doing machine learning (ML) at scale. This week, we are excited to announce two integrations that Microsoft and NVIDIA have built together to unlock industry-leading GPU acceleration for more developers and data scientists.

  • Azure Machine Learning service is the first major cloud ML service to integrate RAPIDS, an open source software library from NVIDIA that allows traditional machine learning practitioners to easily accelerate their pipelines with NVIDIA GPUs.
  • ONNX Runtime has integrated the NVIDIA TensorRT acceleration library, enabling deep learning practitioners to achieve lightning-fast inferencing regardless of their choice of framework.

These integrations build on an already-rich infusion of NVIDIA GPU technology on Azure to speed up the entire ML pipeline.

“NVIDIA and Microsoft are committed to accelerating the end-to-end data science pipeline for developers and data scientists regardless of their choice of framework,” says Kari Briski, Senior Director of Product Management for Accelerated Computing Software at NVIDIA. “By integrating NVIDIA TensorRT with ONNX Runtime and RAPIDS with Azure Machine Learning service, we’ve made it easier for machine learning practitioners to leverage NVIDIA GPUs across their data science workflows.”

Azure Machine Learning service integration with NVIDIA RAPIDS

Azure Machine Learning service is the first major cloud ML service to integrate RAPIDS, providing up to a 20x speedup for traditional machine learning pipelines. RAPIDS is a suite of libraries built on NVIDIA CUDA for GPU-accelerated machine learning, and it dramatically accelerates common data science tasks such as data preparation and model training by leveraging the power of NVIDIA GPUs.

Exposed on Azure Machine Learning service as a simple Jupyter notebook, RAPIDS uses NVIDIA CUDA for high-performance GPU execution, exposing GPU parallelism and high memory bandwidth through a user-friendly Python interface. It includes a dataframe library called cuDF, which will be familiar to pandas users, as well as an ML library called cuML that provides GPU-accelerated versions of many of the machine learning algorithms available in scikit-learn. And with Dask, RAPIDS can take advantage of multi-node, multi-GPU configurations on Azure.
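
To make this concrete, here is a minimal, hypothetical sketch of the pandas-like and scikit-learn-like APIs described above; it assumes a CUDA-capable GPU with the cudf and cuml packages installed, and the file and column names are placeholders.

    # Hypothetical sketch: GPU dataframe prep with cuDF, then a
    # scikit-learn-style estimator from cuML.
    import cudf
    from cuml.cluster import KMeans

    gdf = cudf.read_csv("taxi.csv")              # pandas-like API, GPU-backed
    gdf = gdf.dropna(subset=["pickup_longitude", "pickup_latitude"])

    km = KMeans(n_clusters=8)                    # scikit-learn-like API
    km.fit(gdf[["pickup_longitude", "pickup_latitude"]])
    print(km.cluster_centers_)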

Learn more about RAPIDS on Azure Machine Learning service or attend the RAPIDS on Azure session at NVIDIA GTC.

ONNX Runtime integration with NVIDIA TensorRT in preview

We are excited to open source the preview of the NVIDIA TensorRT execution provider in ONNX Runtime. With this release, we are taking another step towards open and interoperable AI by enabling developers to easily leverage industry-leading GPU acceleration regardless of their choice of framework. Developers can now tap into the power of TensorRT through ONNX Runtime to accelerate inferencing of ONNX models, which can be exported or converted from PyTorch, TensorFlow, MXNet and many other popular frameworks. Today, ONNX Runtime powers core scenarios that serve billions of users in Bing, Office, and more.
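
As a rough illustration, the sketch below shows how a recent build of the onnxruntime-gpu package (compiled with TensorRT support, which is an assumption here) can prefer the TensorRT execution provider; the model file and input shape are placeholders.

    # Hedged sketch: run an ONNX model, preferring TensorRT and falling
    # back to plain CUDA, then CPU.
    import numpy as np
    import onnxruntime as ort

    session = ort.InferenceSession(
        "model.onnx",  # placeholder: a model exported from PyTorch, TF, MXNet, etc.
        providers=[
            "TensorrtExecutionProvider",
            "CUDAExecutionProvider",
            "CPUExecutionProvider",
        ],
    )
    x = np.random.rand(1, 3, 224, 224).astype(np.float32)  # placeholder input
    outputs = session.run(None, {session.get_inputs()[0].name: x})
    print(outputs[0].shape)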

With the TensorRT execution provider, ONNX Runtime delivers better inferencing performance on the same hardware compared to generic GPU acceleration. We have seen up to 2X improved performance using the TensorRT execution provider on internal workloads from Bing MultiMedia services.

To learn more, check out our in-depth blog on the ONNX Runtime and TensorRT integration or attend the ONNX session at NVIDIA GTC.

Accelerating machine learning for all

Our collaboration with NVIDIA marks another milestone in our journey to help developers and data scientists deliver innovation faster. We are committed to accelerating the productivity of all machine learning practitioners regardless of their choice of framework, tool, and application. We hope these new integrations make it easier to drive AI innovation, and we strongly encourage the community to try them out. Looking forward to your feedback!

post

Now available: Azure Backup for SQL Server Virtual Machines – the simple, modern way to do backup in the cloud

How do you back up your SQL Servers today? You could be using backup software that requires you to manage backup servers, agents, and storage, or you could be writing elaborate custom scripts that require you to manage the backups on each server individually. With the modernization of IT infrastructure and the world rapidly moving to the cloud, do you want to continue using legacy backup methods that are tedious, infrastructure-heavy, and difficult to scale? Azure Backup for SQL Server Virtual Machines (VMs) is the modern way of doing backup in the cloud, and we are excited to announce that it is now generally available! It is an enterprise-scale, zero-infrastructure solution that eliminates the need to deploy and manage backup infrastructure, while providing a simple and consistent experience for centrally managing and monitoring backups on standalone SQL instances and Always On Availability Groups.

Azure Backup for SQL Server running in Azure Virtual Machines

Built into Azure, the solution combines the core cloud promises of simplicity, scalability, security, and cost effectiveness with inherent SQL backup capabilities, leveraged through native APIs, to yield high-fidelity backups and restores. The key value propositions of this solution are:

  1. 15-minute Recovery Point Objective (RPO): Working with mission-critical data that demands a low RPO? Schedule a log backup to happen every 15 minutes.
  2. One-click, point-in-time restores: Tired of elaborate manual restore procedures? Restore databases to a point in time, down to the second, in one click, without having to manually apply a chain of log backups over differential and full backups.
  3. Long-term retention: Rigorous compliance and audit needs? Retain your backups for years; recovery points older than the configured retention duration are pruned automatically by the built-in lifecycle management capability.
  4. Protection for encrypted databases: Concerned about the security of your data and backups? Back up encrypted SQL databases, secure backups with built-in encryption at rest, and control backup and restore operations with Role-Based Access Control.
  5. Auto-protection: Dealing with a dynamic environment where new databases get added frequently? Auto-protect your server to automatically detect and protect the newly added databases.
  6. Central management and monitoring: Losing too much time managing and monitoring backups for each server in isolation? Scale smartly by creating centrally managed backup policies that can be applied across databases. Monitor jobs and get alerts and emails across servers and even vaults from a single pane of glass.
  7. Cost effective: No infrastructure and no overhead of managing scale sounds like value for money already? Enjoy a reduced total cost of ownership and a flexible pay-as-you-go option.

Get started

Watch the video, “How to back up SQL Server running in Azure VMs with Azure Backup.”

Screenshot: Azure portal with the ‘Restore’ blade open inside the vault view, showing the graphical view of continuous log backups.

Customer feedback

We have been in preview for a few months now, and have seen an overwhelming response from our customers:

“Our experience with Azure SQL Server Backup has been fantastic. It’s a solution you can put in place in a couple of minutes and not have to worry about it. To restore DBs, we don’t have to deal with rolling logs and only have to choose a date and time. It gives us great peace of mind to know the data is safely stored in the Recovery Services Vaults with our other protected items.”

– Steven Hayes, Principal Architect, Acuity Brands Lighting, Inc.

“We have been using Azure Backup for SQL Server for the past few months and have found it simple to use and easy to set up. The backup and restore operations are performant and reliable as well as easy to monitor. We plan to continue using it in the future.”

– Celica E. Candido, Cloud Operations Analyst, Willis Towers Watson

Additional resources

post

Microsoft and Intel partner to speed deep learning workloads on Azure

This post is co-authored with Ravi Panchumarthy and Mattson Thieme from Intel.

We are happy to announce that Microsoft and Intel are partnering to bring optimized deep learning frameworks to Azure. These optimizations are available in a new offering on the Azure marketplace called the Intel Optimized Data Science VM for Linux (Ubuntu).

Over the last few years, deep learning has become the state of the art for several machine learning and cognitive applications. Deep learning is a machine learning technique that leverages neural networks with multiple layers of non-linear transformations, so that the system can learn from data and build accurate models for a wide range of machine learning problems. Computer vision, language understanding, and speech recognition are all examples of deep learning at play today. Innovations in deep neural networks in these domains have enabled these algorithms to reach human-level performance in vision, speech recognition, and machine translation. Advances in this field continually excite data scientists, organizations, and media outlets alike. Yet for many organizations and data scientists, doing deep learning well at scale remains challenging due to technical limitations.

Often, default builds of popular deep learning frameworks like TensorFlow are not fully optimized for training and inference on CPU. In response, Intel has open-sourced framework optimizations for Intel® Xeon processors. Now, through partnering with Microsoft, Intel is helping you accelerate your own deep learning workloads on Microsoft Azure with this new marketplace offering.

“Microsoft is always looking at ways in which our customers can get the best performance for a wide range of machine learning scenarios on Azure. We are happy to partner with Intel to combine the toolsets from both companies and offer them in a convenient pre-integrated package on the Azure marketplace for our users.”

– Venky Veeraraghavan, Partner Group Program manager, ML platform team, Microsoft.

Accelerating Deep Learning Workloads on Azure

Built on top of the popular Data Science Virtual Machine (DSVM), this offering adds new Python environments that contain Intel’s optimized versions of TensorFlow and MXNet. These optimizations leverage the Intel® Advanced Vector Extensions 512 (Intel® AVX-512) and the Intel® Math Kernel Library for Deep Neural Networks (Intel® MKL-DNN) to accelerate training and inference on Intel® Xeon® Processors. When running on an Azure F72s_v2 VM instance, these optimizations yielded an average 7.7X speedup in training throughput across all standard CNN topologies. You can find more details on the optimization practice here.

As a data scientist or AI developer, this change is transparent: you still write code against the standard TensorFlow or MXNet frameworks. Simply run your code in the new Python (conda) environments (intel_tensorflow_p36, intel_mxnet_p36) on the DSVM to take full advantage of the optimizations on an Intel® Xeon® Processor-based F-Series or H-Series VM instance on Azure. Since this product is built using the DSVM as the base image, all the rich tools for data science and machine learning are still available to you. Once you develop your code and train your models, you can deploy them for inferencing on either the cloud or edge.
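
As a quick (admittedly unscientific) illustration, one way to compare builds is to time the same script under the default environment and under intel_tensorflow_p36; the sketch below uses TensorFlow 1.x-style APIs, matching the DSVM of this era.

    # Hedged sketch: time a matrix-multiply workload; run it once per
    # conda environment and compare the wall-clock times.
    import time
    import tensorflow as tf  # TensorFlow 1.x-style session API

    a = tf.random_normal([4096, 4096])
    b = tf.random_normal([4096, 4096])
    c = tf.matmul(a, b)

    with tf.Session() as sess:
        sess.run(c)  # warm-up
        start = time.time()
        for _ in range(10):
            sess.run(c)
        print("10 matmuls: %.2f seconds" % (time.time() - start))

Activating the other environment (for example with conda) before re-running the script is all that is needed to switch builds.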

“Intel and Microsoft are committed to democratizing artificial intelligence by making it easy for developers and data scientists to take advantage of Intel hardware and software optimizations on Azure for machine learning applications. The Intel Optimized Data Science Virtual Machine (DSVM) provides up to a 7.7X speedup on existing frameworks without code modifications, benefiting Microsoft and Intel customers worldwide.”

– Binay Ackalloor, Director Business Development, AI Products Group, Intel.

Performance

In Intel’s benchmark tests run on an Azure F72s_v2 instance, here are the results comparing the Intel-optimized version of TensorFlow with the standard TensorFlow builds.

Bar graph of Default TensorFlow vs. Intel Optimization for TensorFlow

Figure 1: Intel® Optimization for TensorFlow provides an average of 7.7X increase (average indicated by the red line) in training throughput on major CNN topologies. Run your own benchmarks using tf_cnn_benchmarks. Performance results are based on Intel testing as of 01/15/2019. Find the complete testing configuration here.

Getting Started

To get started with the Intel Optimized DSVM, click on the offer in the Azure Marketplace, then click “GET IT NOW”. Once you answer a few simple questions on the Azure Portal, your VM is created with all the DSVM tool sets and the Intel optimized deep learning frameworks pre-configured and ready to use.

Screenshot of Intel Optimized Data Science VM for Linux in Azure marketplace

The Intel Optimized Data Science VM is an ideal environment for developing and training deep learning models on Intel Xeon processor-based VM instances on Azure. Microsoft and Intel will continue their long partnership to explore additional AI solutions and to bring framework optimizations to other services on Azure, like the Azure Machine Learning service and Azure IoT Edge.

Next steps

post

Azure Hybrid Benefit for SQL Server

Angelos Petropoulos


In my previous blog post I talked about how to migrate data from existing on-prem SQL Server instances to Azure SQL Database. If you haven’t heard, SQL Server 2008 end of support is coming this summer, so it’s a good time to evaluate moving to an Azure SQL Database.

If you decide to try Azure, chances are you will not be able to immediately move 100% of your on-prem databases and relevant applications in one go. You’ll probably come up with a migration plan that spans weeks, months or even years. During the migration phase you will be spinning up new instances of Azure SQL Database and turning off on-prem SQL Server instances, but for a little bit of time there will be overlap.

To help manage the cost during such transitions we offer the Azure Hybrid Benefit. You can convert 1 core of SQL Enterprise edition to get up to 4 vCores of Azure SQL Database at a reduced rate. For example, if you have 4 core licenses of SQL Enterprise edition, you can receive up to 16 vCores of Azure SQL Database.

If you want to learn more check out the Azure Hybrid Benefit FAQ and don’t forget, if you have any questions around migrating your .NET applications to Azure you can always ask for free assistance. If you have any questions about this post just leave us a comment below.

Angelos Petropoulos

post

Migrating your existing on-prem SQL Server database to Azure SQL DB

If you are in the process of moving an existing .NET application to Azure, it’s likely you’ll have to migrate an existing, on-prem SQL database as well. There are a few different ways you can go about this, so let’s go through them.

Data Migration Assistant (downtime required)

The Data Migration Assistant (download | documentation) is free, easy to use, slick and extremely powerful! It can:

  • Evaluate if your database is ready to migrate and produce a readiness report (command line support included)
  • Provide recommendations for how to remediate migration blocking issues
  • Recommend the minimum Azure SQL Database SKU based on performance counter data of your existing database
  • Perform the actual migration of schema, data and objects (server roles, logins, etc.)

After a successful migration, applications will be able to connect to the target Azure SQL databases seamlessly. There are currently a couple of limitations, but the majority of databases shouldn’t be impacted. If this sounds interesting, check out the full tutorials on how to migrate to Azure SQL DB and how to migrate to Azure SQL DB Managed Instance.

Azure Data Migration Service (no downtime required)

The Azure Data Migration Service allows you to move your on-prem database to Azure without taking it offline during the migration. Applications can keep on running while the migration is taking place. Once the database in Azure is ready you can switch your applications over immediately.

If this sounds interesting, check out the full tutorials on how to migrate to Azure SQL DB and how to migrate to Azure SQL DB Managed Instance without downtime.

SQL Server Management Studio (downtime required)

You are probably already familiar with SQL Server Management Studio (download | documentation), but if you are not it’s basically an IDE for SQL Server built on top of the Visual Studio shell and it’s free! Unlike the Data Migration Assistant, it cannot produce readiness reports nor can it suggest remediating actions, but it can perform the actual migration two different ways.

The first way is by selecting the command “Deploy Database to Microsoft Azure SQL Database…” which will bring up the migration wizard to take you through the process step by step:

The second way is by exporting the existing, on-prem database as a .bacpac file (docs to help with that) and then importing the .bacpac file into Azure:

Resolving database migration compatibility issues

There are a wide variety of compatibility issues that you might encounter, depending both on the version of SQL Server in the source database and the complexity of the database you are migrating. Use the following resources, in addition to a targeted Internet search using your search engine of choice:

In addition to searching the Internet and using these resources, use the MSDN SQL Server community forums or StackOverflow. If you have any questions or problems just leave us a comment below.

post

Venture Beat: Microsoft, Moovit and TomTom team up for multi-modal transport platform

A triumvirate of tech companies today announced what they’re touting as the “world’s first truly comprehensive multi-modal trip planner.”

Taking the stage at the Move mobility conference in London, executives from Microsoft, TomTom, and Moovit outlined how they’re pooling their various transport, data, and cloud processing smarts so developers can integrate more extensive transport options into their own applications.

“Over the last few years, cities have experienced rising urban sprawl, where residents of metropolitan areas have been pushed out toward suburban areas, often beyond the limits of public transit lines,” noted Azure Maps head Chris Pendleton. “With most jobs still residing in densely populated cities, the typical commute is becoming multi-modal.”

Moovit, for the uninitiated, comprises two core elements: a consumer-facing app that gives travelers the easiest way to get around a city, and a mobility-as-a-service (MaaS) platform that provides municipalities with data and analytics to improve city transport infrastructure.

A few months back, Moovit and Microsoft announced a partnership that would allow developers who use Azure Maps to hook into Moovit’s transit data. Last week, GPS navigation stalwart TomTom announced an expanded tie-up with Microsoft, one that will bring TomTom’s extensive maps and traffic data into the Azure fold — in effect, TomTom will serve as the primary location data provider for Microsoft’s cloud platform, Bing Maps, and Cortana.

Fast-forward to today, and the three companies are now pooling their respective capabilities for an urban transport offering that any third-party developer can leverage.

“This will lead to commuters having the best option to plan a trip [by] combining legs on public transit, ride-sharing, bike, or scooter and other legs by car, including finding available parking lot spaces in real time,” added Moovit cofounder and CEO Nir Erez.

Through Azure Maps, Microsoft’s developer-focused mapping platform, users will be able to access not only Moovit’s public transit data, but also TomTom’s real-time driving and parking data. So developers will be able to include the full gamut of transport options inside their own apps — including buses, trains, metros, ferries, carpooling, bike-sharing, and now driving and parking.

“Blending TomTom’s specific auto and parking lot data with Moovit’s multi-modal trip planning gives Azure Maps an unprecedented view of every facet of urban mobility,” Pendleton continued. “No one else has provided this comprehensive level of service in one solution.”

Multi-modal

We’ve seen a growing number of integrations between transport service providers, and the term “multi-modal” is springing up more frequently in urban mobility conversations.

Back in 2016, Moovit combined its public transport data with Uber to find the best route in dozens of cities. A year later, Uber revealed that it would make it easier for its ride-hailing customers to keep tabs on other transit options by displaying real-time public transport data alongside private vehicles. Earlier this month, Uber revealed it was doubling down on efforts to show full transit options directly inside the Uber app.

Elsewhere, navigation giant Here recently launched a new social transport app for planning and sharing rides, merging all modes of transport, including public, private, and personal, into a single platform.

Tying these myriad transport options together are maps and data, the currency of urban mobility companies. TomTom was pretty much blindsided by the arrival of iOS and Android a decade ago, with smartphones putting portable navigation in everyone’s pockets. In recent times, TomTom has been striving to refocus its core business to better compete with Google in maps and navigation, and it recently offloaded its telematics unit for $1 billion to help with this push.

Today’s announcement, alongside TomTom’s broader Azure partnership announced last week, should go some way toward helping the company gain more traction.

“Location data has become more relevant and important than ever before, and no one knows this better than TomTom,” added managing director Anders Truelsen.

post

Announcing an easier way to use latest certificates from Key Vault

Posting on behalf of Prashanth Yerramilli

When we launched Azure Key Vault a few years ago, it solved a major problem: storing sensitive and/or secret information in code or config files in plain text creates multiple problems, including security exposure. Users store their secrets in a safe store like Key Vault and use a URI to fetch the secret material. This service has been wildly popular and has become a standard for cloud applications. It is used by everyone from fledgling startups to Fortune 500 companies the world over.

Developers use Key Vault to store their ad hoc secrets, certificates, and keys used for encryption. To follow best security practices, they create secrets that are short-lived. An example of a typical flow could be:

  • Step 1: Developer creates a certificate in Key Vault
  • Step 2: Developer sets the lifetime of the secret to 30 days. In other words, the developer asks Key Vault to re-create the certificate every 30 days. The developer also chooses to receive an email when a certificate is about to expire
  • Step 3: Developer writes a polling service to check if the certificate has indeed expired

In the above scenario there are a few challenges for the customer. They have to write a polling service that constantly checks whether the certificate has expired; when it has, they wait for the new certificate and then bind it in the Windows Certificate manager.
Now, what if the developer didn’t have to poll, and didn’t have to bind the new certificate in the Windows Certificate manager either? To solve this exact problem, we built the Key Vault Virtual Machine Extension.

Azure virtual machine (VM) extensions are small applications that provide post-deployment configuration and automation tasks on Azure VMs. For example, if a virtual machine requires software installation, anti-virus protection, or a script to run inside of it, a VM extension can be used. Azure VM extensions can be run with the Azure CLI, PowerShell, Azure Resource Manager templates, and the Azure portal. Extensions can be bundled with a new VM deployment or run against any existing system.
To learn more about VM extensions, see the Azure VM extensions documentation.

The Key Vault VM Extension does just that, as explained in the steps below:

  • Step 1: Create a Key Vault and create an Azure Windows Virtual Machine
  • Step 2: Install the Key Vault VM Extension on the VM
  • Step 3: Configure Key Vault VM Extension to monitor a specific vault by specifying how often it should fetch the certificate

After completing the above steps, the latest certificate is bound correctly in the Windows Certificate Manager. This enables auto-rotation of SSL certificates without requiring a re-deployment or manual re-binding.

In the lifecycle of secrets management, fetching the latest version of a secret (for the purposes of this article, a certificate) is just as important as storing it securely. To solve this problem on Azure Virtual Machines, we’ve created a VM extension for Windows; a Linux version is coming soon. Once installed, the Key Vault Virtual Machine extension fetches the latest version of the certificate at a specified interval and automatically binds it in the certificate store on Windows.

Before we begin the tutorial, we also need to understand a concept called Managed Identities (MI).
Your code needs credentials to authenticate to cloud services, but you want to limit the visibility of those credentials as much as possible. Ideally, they never appear on a developer’s workstation or get checked in to source control. Azure Key Vault can store credentials securely so they aren’t in your code, but to retrieve them you need to authenticate to Azure Key Vault. To authenticate to Key Vault, you need a credential! A classic bootstrap problem. Through the magic of Azure and Azure AD, MI provides a “bootstrap identity” that makes it much simpler to get things started.

Here’s how it works: When you enable MI for an Azure resource such as a virtual machine, Azure creates a Service Principal (an identity) for that resource in Azure AD, and injects the credentials (of that identity) into the resource (in this case a virtual machine).

  1. Your code calls a local MI endpoint to get an access token
  2. MI uses the locally injected credentials to get an access token from Azure AD
  3. Your code uses this access token to authenticate to an Azure service (see the sketch below)
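
A minimal sketch of those three steps, using only the documented Azure Instance Metadata Service (IMDS) and Key Vault REST endpoints, is shown below; the vault and certificate names are the placeholders used elsewhere in this article.

    # Hedged sketch of the Managed Identity token flow from inside an Azure VM.
    import json
    import urllib.request

    # Steps 1-2: ask the local MI (IMDS) endpoint for an access token;
    # the locally injected credentials are exchanged with Azure AD for us.
    token_url = (
        "http://169.254.169.254/metadata/identity/oauth2/token"
        "?api-version=2018-02-01&resource=https%3A%2F%2Fvault.azure.net"
    )
    req = urllib.request.Request(token_url, headers={"Metadata": "true"})
    token = json.load(urllib.request.urlopen(req))["access_token"]

    # Step 3: use the token to fetch the latest certificate version from
    # Key Vault (versionless /secrets URI; see the note later in this post).
    secret_url = "https://myVaultName.vault.azure.net/secrets/myCertName?api-version=7.0"
    req = urllib.request.Request(secret_url, headers={"Authorization": "Bearer " + token})
    secret = json.load(urllib.request.urlopen(req))
    print(secret["value"][:40], "...")  # base64-encoded PFX, private key included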

Managed Identities

Within Managed Identities there are two types:

  1. System Assigned managed identity is enabled directly on an Azure service instance. When the identity is enabled, Azure creates an identity for the instance in the Azure AD tenant that’s trusted by the subscription. The lifecycle of the identity is managed by Azure and is tied to the Azure service instance.
  2. User Assigned managed identity is created as a standalone Azure resource. Users first create an identity and then assign that identity to one or more Azure resources.

In this tutorial I will demonstrate how to create an Azure Virtual Machine with an ARM template, which also includes creating a Key Vault VM Extension on the VM.

Prerequisites

Step 1

After the prerequisites are complete, create a System Assigned identity by following this tutorial

Step 2

Grant the newly created System Assigned identity access to your Key Vault

  • Go to https://portal.azure.com and navigate to your Key Vault
  • Select the Access Policies section and add a new policy by searching for the System Assigned identity you created in Step 1

Step 3

Create or update a VM with the following ARM template.
You can view the full ARM template here and the ARM parameters file here.

The most minimal settings in the ARM template are shown below:

    {
      "secretsManagementSettings": {
        "pollingIntervalInS": "[parameters('kvvmextPollingInterval')]",
        "observedCertificates": [
          "<KeyVault URI of a secret to be monitored/retrieved, in versionless format: https://myVaultName.vault.azure.net/secrets/myCertName>",
          "<more entries here>"
        ]
      }
    }

As you can see, we only specify the observedCertificates list and the polling interval in seconds.


Note: Your observedCertificates URIs should be of the form:

https://myVaultName.vault.azure.net/secrets/myCertName 

and not:

https://myVaultName.vault.azure.net/certificates/myCertName 

The reason is that the /secrets path returns the full certificate, including the private key, while the /certificates path does not.

By following this tutorial you can create a VM with the template specified above.

The above tutorial assumes that you are storing your certificates in the Windows Certificate Manager. The VM Extension pulls down the latest certificates at the specified interval and automatically binds those certificates in your certificate manager.

That’s all folks!

Linux Version: We’re actively working on a VM Extension for Linux and would love to hear any feedback you might have.

We are eager to hear from you about your use cases and how we can evolve the VM Extension to help you. So please reach out to us and add your feature requests to the Azure feedback forum. If you run into issues using the VM extension please reach out to us on StackOverflow.

Prashanth Yerramilli, Senior Program Manager, Azure Key Vault

Prashanth Yerramilli is the Key Vault Program Manager on the Azure Security team. He has over 10 years of software engineering experience and brings to the team a love for creating the ultimate development experience.

Prashanth can be reached at:
-Twitter @yvprashanth1
-GitHub https://github.com/yvprashanth

post

Azure Data Explorer, now available, can query 1 billion records in under a second

As Julia White mentioned in her blog today, we’re pleased to announce the general availability of Azure Data Lake Storage Gen2 and Azure Data Explorer. We also announced the preview of Azure Data Factory Mapping Data Flow. With these updates, Azure continues to be the best cloud for analytics with unmatched price-performance and security. In this blog post we’ll take a closer look at the technical capabilities of these new features.

Azure Data Lake Storage – The no-compromise data lake

Azure Data Lake Storage (ADLS) combines the scalability, cost effectiveness, security model, and rich capabilities of Azure Blob Storage with a high-performance file system that is built for analytics and is compatible with the Hadoop Distributed File System. Customers no longer have to trade off between cost effectiveness and performance when choosing a cloud data lake.

One of our key priorities was to ensure that ADLS is compatible with the Apache ecosystem. We accomplished this by developing the Azure Blob File System (ABFS) driver. The ABFS driver is officially part of Apache Hadoop and Spark and is incorporated in many commercial distributions. The ABFS driver defines a URI scheme that allows files and folders to be distinctly addressed in the following manner:


abfs[s]://file_system@account_name.dfs.core.windows.net/<path>/<path>/<filename>
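
For example, here is a hypothetical PySpark snippet that reads a file addressed with this URI scheme; the account, filesystem, and path are placeholders, and it assumes the ABFS driver and storage credentials are already configured on the cluster.

    # Hedged sketch: read a CSV from ADLS Gen2 through the ABFS driver.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("abfs-demo").getOrCreate()
    df = spark.read.csv(
        "abfss://myfilesystem@myaccount.dfs.core.windows.net/data/sales.csv",
        header=True,
    )
    df.show(5)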

It is important to note that the file system semantics are implemented server-side. This approach eliminates the need for a complex client-side driver and ensures high fidelity file system transactions.

To further boost analytics performance, we implemented a hierarchical namespace (HNS) which supports atomic file and folder operations. This is important because it reduces the overhead associated with processing big data on blob storage. This speeds up job execution and lowers cost because fewer compute operations are required.

The ABFS driver and HNS significantly improve ADLS’ performance, removing scale and performance bottlenecks. This performance enhancement is now available at the same low cost as Azure Blob Storage.

ADLS offers the same powerful data security capabilities built into Azure Blob Storage, such as:

  • Encryption of data in transit (via TLS 1.2) and at rest
  • Storage account firewalls
  • Virtual network integration
  • Role-based access security

In addition, ADLS’ file system provides support for POSIX-compliant access control lists (ACLs). With this approach, you can provide granular security protection that restricts access to only authorized users, groups, or service principals, and provides file and object data protection.
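
As an illustration, the sketch below sets a POSIX-style ACL on a single file using the azure-storage-file-datalake Python package (an assumption on our part; the package postdates this post, and the account, filesystem, path, and object ID are placeholders).

    # Hedged sketch: grant one Azure AD object ID read access to a file
    # while denying other users.
    from azure.storage.filedatalake import DataLakeServiceClient

    service = DataLakeServiceClient(
        account_url="https://myaccount.dfs.core.windows.net",
        credential="<storage-account-key>",  # placeholder credential
    )
    file_client = service.get_file_system_client("myfilesystem") \
                         .get_file_client("data/sales.csv")

    # owner: rwx, group: read+execute, others: none, plus one named user
    file_client.set_access_control(
        acl="user::rwx,group::r-x,other::---,user:<object-id>:r--"
    )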

Diagram: Azure Data Lake Storage.

ADLS is tightly integrated with Azure Databricks, Azure HDInsight, Azure Data Factory, Azure SQL Data Warehouse, and Power BI, enabling an end-to-end analytics workflow that delivers powerful business insights throughout all levels of your organization. Furthermore, ADLS is supported by a global network of big data analytics ISVs and system integrators, including Cloudera and Hortonworks.

Next steps

Azure Data Explorer – The fast and highly scalable data analytics service

Azure Data Explorer (ADX) is a fast, fully managed data analytics service for real-time analysis on large volumes of streaming data. ADX is capable of querying 1 billion records in under a second with no modification of the data or metadata required. ADX also includes native connectors to Azure Data Lake Storage, Azure SQL Data Warehouse, and Power BI and comes with an intuitive query language so that customers can get insights in minutes.

Designed for speed and simplicity, ADX is architected with two distinct services that work in tandem: the Engine service and the Data Management (DM) service. Both services are deployed as clusters of compute nodes (virtual machines) in Azure.

Diagram: Azure Data Explorer.

The Data Management (DM) service ingests various types of raw data and manages failure, backpressure, and data grooming tasks when necessary. The DM service also enables fast data ingestion through a unique method of automatic indexing and compression.

The Engine service is responsible for processing the incoming raw data and serving user queries. It uses a combination of auto scaling and data sharding to achieve speed and scale. The read-only query language is designed to make the syntax easy to read, author, and automate. The language provides a natural progression from one-line queries to complex data processing scripts for efficient query execution.
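
To give a feel for the query language, here is a hedged sketch using the azure-kusto-data Python package (one of several available clients; the cluster, database, and table names are placeholders).

    # Hedged sketch: a one-line read-only query against an ADX cluster.
    from azure.kusto.data import KustoClient, KustoConnectionStringBuilder

    kcsb = KustoConnectionStringBuilder.with_aad_device_authentication(
        "https://mycluster.westus.kusto.windows.net"
    )
    client = KustoClient(kcsb)

    # Count events per level over the last hour.
    query = "Events | where Timestamp > ago(1h) | summarize count() by Level"
    response = client.execute("mydatabase", query)
    for row in response.primary_results[0]:
        print(row)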

ADX is available in 41 Azure regions and is supported by a growing ecosystem of partners, including ISVs and system integrators.

Next steps

Azure Data Factory Mapping Data Flow – Visual, zero-code experience for data transformation

Azure Data Factory (ADF) is a hybrid cloud-based data integration service for orchestrating and automating data movement and transformation. ADF provides over 80 built-in connectors to structured, semi-structured, and unstructured data sources.

With Mapping Data Flow in ADF, customers can visually design, build, and manage data transformation processes without learning Spark or having a deep understanding of their distributed infrastructure.

Screenshot: Azure Data Factory Mapping Data Flow.

Mapping Data Flow combines a rich expression language with an interactive debugger to easily execute, trigger, and monitor ETL jobs and data integration processes.

Azure Data Factory is available in 21 regions and expanding, and is supported by a broad ecosystem of partners, including ISVs and system integrators.

Next steps

Azure is the best place for data analytics

With these technical innovations announced today, Azure continues to be the best cloud for analytics. Learn more about why analytics in Azure is simply unmatched.

post

TomTom expands partnership with Microsoft to power Microsoft cloud offerings with location-based services

TomTom selects Microsoft Azure as its preferred cloud provider; TomTom location-based services will be utilized across Microsoft technologies for cloud services including Microsoft Azure, Bing Maps and Cortana

AMSTERDAM and REDMOND, Wash. — Feb. 4, 2019 — TomTom (TOM2) and Microsoft Corp. (MSFT) today announced that they are expanding their partnership, bringing TomTom’s maps and traffic data into a multitude of mapping scenarios across Microsoft’s cloud services. With this broadened integration, TomTom will be a leading location data provider for Microsoft Azure and Bing Maps. TomTom is also expanding its relationship with Microsoft, selecting Microsoft Azure as its preferred cloud provider.

Azure Maps delivers secured location APIs to provide geospatial context to data. The Azure Maps service enhances the value of the Microsoft Azure cloud platform, helping enterprises and developers create IoT, mobility, logistics, and asset tracking solutions. TomTom’s map data and services are a significant component in completing these enterprise customer scenarios.

Anders Truelsen, Managing Director, TomTom Enterprise, said: “TomTom is proud of the relationship we’ve built with Microsoft to offer Microsoft Azure customers access to build location-aware applications, and we look forward to deepening that relationship as we extend our high-quality location technologies to an even larger audience base. We’re excited to be chosen as the location data provider to power mapping services across all of Microsoft, including Bing, Cortana, Windows and many other leading products, and for the innovations that will come forward in this continued relationship.”

“This deep partnership with TomTom is very different from anything Microsoft has done in maps before,” said Tara Prakriya, Partner Group Program Manager of Azure Maps and Connected Vehicles. “TomTom hosting their services in the Azure cloud brings with it their graph of map data. Manufacturing maps in Azure reduces the latency to customer applications, ensuring we offer the freshest data through Azure Maps. Azure customers across industries end up winning when their geospatial data and analytics, TomTom data, and Azure Maps services are all running together in the same cloud.”

Azure Maps lights up a multitude of location scenarios for Microsoft. Azure customers now have native support for scenarios ranging from map-based dashboards that visualize IoT spatial analytics to mobility scenarios for vehicle movement. For example, in agriculture, customers can easily track utilization of farm sensors for crops, livestock, tractors, and more to optimize production. Using the Azure Maps routing services powered by TomTom allows for insightful distribution of goods originating from farmlands to retail, restaurants, and home delivery. Using the freshest maps and traffic information can determine delivery range, optimize delivery routes, and provide customers with insights into delivery status.
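
As a rough illustration, the sketch below calls the Azure Maps route directions REST API (powered by TomTom data) using only the Python standard library; the subscription key and coordinates are placeholders.

    # Hedged sketch: request a driving route between two points.
    import json
    import urllib.request

    url = (
        "https://atlas.microsoft.com/route/directions/json"
        "?api-version=1.0"
        "&subscription-key=<YOUR-AZURE-MAPS-KEY>"     # placeholder key
        "&query=47.6062,-122.3321:47.6205,-122.3493"  # origin:destination
    )
    route = json.load(urllib.request.urlopen(url))
    summary = route["routes"][0]["summary"]
    print("travel time (s):", summary["travelTimeInSeconds"])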

TomTom’s fresh map and traffic information, in combination with Azure Maps services and SDKs, will help power improved smart city applications. Azure Maps SDKs using TomTom services make it simple to render a multitude of data sets from a variety of sources – such as real-time parking meter rates, street-specific traffic, carbon footprint, and noise pollution – in a consolidated, map-based application that visualizes city information crucial to its citizens.

About Microsoft

Microsoft (Nasdaq “MSFT” @microsoft) enables digital transformation for the era of an intelligent cloud and an intelligent edge. Its mission is to empower every person and every organization on the planet to achieve more.

About TomTom

TomTom is the leading independent location technology specialist, shaping mobility with highly accurate maps, navigation software, real-time traffic information and services.

To achieve our vision of a safer world, free of congestion and emissions, we create innovative technologies that keep the world moving. By combining our extensive experience with leading business and technology partners, we power connected vehicles, smart mobility and, ultimately, autonomous driving.

Headquartered in Amsterdam with offices in 30 countries, TomTom’s technologies are trusted by hundreds of millions of people worldwide.
www.tomtom.com

For further Information:

TomTom Media:

Remco Meerstra

+31 6 55 69 04 80

remco.meerstra@tomtom.com

TomTom Investor Relations:

ir@tomtom.com

Microsoft Media Relations:

WE Communications for Microsoft

(425) 638-7777

rrt@we-worldwide.com

post

Make the most of your monthly Azure Credits

Angelos Petropoulos


If you weren’t aware, Visual Studio subscribers get free monthly Azure credits that are ideal for experimenting with and learning about Azure services. When you activate this benefit, it creates a separate Azure subscription with a monthly credit balance that renews each month while you remain an active Visual Studio subscriber. If the credits run out before the end of the month, the subscription is suspended until more credits are available. No surprises, no cost, and no credit card required for any of it!

The table below shows you how many Azure credits you get based on your type of Visual Studio subscription:

Visual Studio subscription type | Monthly Azure credits
Visual Studio Professional (standard subscription) | $50
Visual Studio Test Professional | $50
MSDN Platforms | $100
Visual Studio Enterprise (standard subscription) | $150
Visual Studio Enterprise (BizSpark) | $150
Visual Studio Enterprise (MPN) | $150

Now that you know how many Azure credits you get every month for free, you are probably wondering what you can spend them on! We have put together the following simple table to help you get going:

Azure service | Tier | Estimated monthly cost
App Service | Shared | $9.49
Storage | General Purpose V2 (1 GB) | $1.06
SQL | Single DB (5 DTUs, 2 GB) | $4.90
CosmosDB | 1 GB | $23.61
Functions | Dynamic | First 1M executions free (up to 400K GB-s)
Monitor | Application Insights | First 1 GB free
Key Vault | Standard | $0.03 (per 10k operations)
Service Bus | Basic | $0.05 (per 1M operations)
Redis | Basic (C0: 250 MB cache) | $16.06
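
To make the arithmetic concrete, here is a small sketch that stacks a few of the services above against the $50 monthly credit of a Visual Studio Professional subscription; the prices are taken directly from the table.

    # Estimated costs copied from the table above.
    services = {
        "App Service (Shared)": 9.49,
        "Storage GPv2 (1 GB)": 1.06,
        "SQL Single DB (5 DTUs, 2 GB)": 4.90,
        "Redis Basic (C0)": 16.06,
    }
    total = sum(services.values())
    print("Estimated monthly total: $%.2f" % total)         # $31.51
    print("Credit left from $50:    $%.2f" % (50 - total))  # $18.49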

Hopefully, you found this information helpful and you are on your way to make use of your Azure credits. If you are interested in the cost of an Azure service you didn’t see in the table above, try the Azure price calculator. If you have any questions or problems just leave us a comment below.

Angelos Petropoulos