Johnson Controls tackles a $15B building industry problem with Azure Cosmos DB

This blog post was co-authored by Nikisha Reyes-Grange, Senior Product Marketing Manager, Azure Marketing, and Balamurugan Balakreshnan, Cloud Solution Architect, CSU-Data and AI.

Johnson Controls has been a pioneer in building management solutions and services since founder Warren Johnson invented the first electric room thermostat in 1885. Johnson Controls has introduced many innovations to the building industry over the years and is now tackling a problem that costs the building industry billions each year.

Modern buildings include multiple systems that handle everything from building management to HVAC to security. These systems are managed by protocols, proprietary systems, and applications without a common data model, which prevents interoperability, limits scalability, and costs the industry an estimated $15 billion annually.

Solution

To help building operators gather and understand data about their buildings, operations, and occupants, Johnson Controls created Digital Vault to integrate internal and external data sources and present a harmonized view of energy usage, security breaches, fire alarm status, temperature controls, and other building management systems. Digital Vault, powered by Azure Cosmos DB, simplifies object relationship management through a single Application Programming Interface (API) layer for IoT data and events, both at rest and in motion. Another API layer leverages Azure Cosmos DB’s auto-indexing and multi-model capabilities to accept data in forms ranging from key-value to graph. Digital Vault is then able to create a virtual version of a physical structure, like a floor or room, making it easy to conceptualize the relationships between building assets.

Johnson Controls Digital Vault technical architecture

Combining and processing data from the Operational Technology (OT) and Information Technology (IT) of smart environments, Digital Vault creates a real-time knowledge graph that can answer questions about a building’s past and present conditions and make predictions about its future. OT data comes from devices and sensors in the building, spanning HVAC, fire, lighting, security, IT, asset tracking, and more. The challenge in such an environment is the variety, volume, and velocity of data. As devices evolve over time, so do their schema and data models. With multiple generations of devices present in buildings, a data store must flexibly handle a variety of schemas. Sensors can save data at high rates, dramatically increasing the volume of saved data as the number of sensors increases. Velocity increases as data stores power real-time views, dashboards, and decisions, while also enabling historical data to be leveraged for advanced analytics and AI.

By processing incoming OT data, Digital Vault can perform common data cleaning and analytic functions automatically. Building owners and application developers do not have to spend time building out a processing system for streaming data. The raw and processed data from devices is available in one spot, with common algorithms for filling missing values, cleaning and smoothing outliers, and many analytic and normalization functions already applied to the stream.
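
To give a feel for the kind of cleaning described above, here is a minimal pandas sketch of two common steps – filling missing sensor readings and smoothing outliers. The column name, window size, and threshold are hypothetical; Digital Vault applies equivalent functions to the live stream at much larger scale.

```python
import pandas as pd

def clean_sensor_readings(df: pd.DataFrame) -> pd.DataFrame:
    """Fill missing values and smooth outliers in a sensor time series.

    Assumes a DataFrame indexed by timestamp with a numeric 'temperature'
    column; the column name and thresholds are illustrative only.
    """
    cleaned = df.copy()

    # Fill short gaps by interpolating between neighboring readings.
    cleaned["temperature"] = cleaned["temperature"].interpolate(limit=3)

    # Flag readings more than 3 standard deviations from a rolling mean
    # as outliers, then replace them with that rolling mean.
    rolling_mean = cleaned["temperature"].rolling(window=10, min_periods=1).mean()
    rolling_std = cleaned["temperature"].rolling(window=10, min_periods=1).std().fillna(0)
    outliers = (cleaned["temperature"] - rolling_mean).abs() > 3 * rolling_std
    cleaned.loc[outliers, "temperature"] = rolling_mean[outliers]

    return cleaned
```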

Why Azure Cosmos DB?

In order to service thousands of buildings around the world generating billions of daily data samples, Johnson Controls knew they needed a platform as a service (PaaS) data store that could provide global service, manage huge volumes of data, scale as needed, reduce costs, and keep operational complexity low. They turned to Azure Cosmos DB to meet these needs while also easily managing data in a variety of models, auto-indexing data, and integrating with other Azure services.

Digital Vault uses Azure IoT Hub to collect data streams from devices in buildings, Azure Event Hubs to manage the streams of data, and Spark in an HDInsight cluster to process the full stream and apply the analytical algorithms that embody Johnson Controls’ 135 years of experience in building management. Azure Cosmos DB is at the heart of the processing system, feeding in additional data as needed and storing processed data, with its Graph API allowing developers to process graph data in any language with minimal integration time.

Digital Vault architecture
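
To make that ingestion flow concrete, here is a deliberately simplified sketch of the pattern using the Azure Event Hubs and Azure Cosmos DB Python SDKs: consume device events, apply a processing step, and upsert the result. Digital Vault’s production pipeline does this with Spark on HDInsight at far greater scale; the connection strings, names, and processing logic below are placeholders.

```python
from azure.eventhub import EventHubConsumerClient
from azure.cosmos import CosmosClient

# Hypothetical connection details.
consumer = EventHubConsumerClient.from_connection_string(
    "<event-hubs-connection-string>",
    consumer_group="$Default",
    eventhub_name="building-telemetry",
)
cosmos = CosmosClient("https://<account>.documents.azure.com:443/", credential="<account-key>")
container = cosmos.get_database_client("digitalvault").get_container_client("telemetry")

def on_event(partition_context, event):
    reading = event.body_as_json()               # e.g. {"deviceId": ..., "temperature": ...}
    reading["id"] = str(event.sequence_number)   # Cosmos DB items need an 'id'
    reading["processed"] = True                  # stand-in for real cleaning/normalization
    container.upsert_item(reading)               # processed data lands in Cosmos DB
    partition_context.update_checkpoint(event)

with consumer:
    consumer.receive(on_event=on_event, starting_position="-1")
```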

For a building to be truly smart, it must have access to a significant amount of IT data – data that has traditionally been held in multiple silos, with no connection to OT data. To store and manage the IT portion of the knowledge graph, Johnson Controls again turned to Azure Cosmos DB. Just as with the OT data, there is a huge variety in schema and shape of IT data. Digital Vault ingests, stores, and incorporates IT data into a building’s knowledge graph using the Azure Cosmos DB Graph API. This makes it possible for Johnson Controls’ developers to use industry-standard APIs for processing graph data from any language.
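
As a rough illustration of what querying such a knowledge graph can look like, the snippet below uses the open-source gremlin_python driver against a Cosmos DB Gremlin (Graph API) endpoint to list the sensors located on a given floor. The endpoint, database, graph, and vertex/edge labels ('floor', 'located_in', 'sensor') are hypothetical; Digital Vault’s actual graph schema is not described here.

```python
from gremlin_python.driver import client, serializer

# Hypothetical Cosmos DB Gremlin endpoint and credentials.
gremlin_client = client.Client(
    "wss://<your-account>.gremlin.cosmos.azure.com:443/",
    "g",
    username="/dbs/<database>/colls/<graph>",
    password="<primary-key>",
    message_serializer=serializer.GraphSONSerializersV2d0(),
)

# Find every sensor vertex connected to floor '3' via a 'located_in' edge.
query = (
    "g.V().hasLabel('floor').has('id', 'floor-3')"
    ".in('located_in').hasLabel('sensor').values('id')"
)
sensor_ids = gremlin_client.submit(query).all().result()
print(sensor_ids)
```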

This graph understands the relationships between different types and sources of data and makes it possible to navigate among the many sensors, assets, people, and processes in the building. Digital Vault is, in turn, able to assess the current health and maintenance history of equipment and apply predictive models to anticipate future status and trends.

Introducing the Azure Blockchain Development Kit

“Developers! Developers! Developers!” That phrase is synonymous with Microsoft’s history of democratizing complex technologies and empowering anyone with an idea to build software.

Over four decades, we’ve lowered barriers to development with developer tooling, enterprise integration, DevOps, PaaS, and SaaS. Today, serverless offerings from Functions and Logic Apps to Azure DevOps and IoT Central remove friction for development in the cloud.

This morning, we’re excited to announce the initial release of the Azure Blockchain Development Kit, which is built on Microsoft’s serverless technologies and seamlessly integrates blockchain with the best of Microsoft and third-party SaaS.

This kit extends the capabilities of our blockchain developer templates and Azure Blockchain Workbench, which incorporates Azure services for key management, off-chain identity and data, monitoring, and messaging APIs into a reference architecture that can be used to rapidly build blockchain-based applications.

These tools have become the first step for many organizations on their journey to re-invent the way they do business. Apps have been built for everything from democratizing supply chain financing in Nigeria to securing the food supply in the UK, but as patterns emerged across use cases, our teams identified new ways for Microsoft to help developers go farther, faster.

This initial release prioritizes capabilities related to three key themes: connecting interfaces, integrating data and systems, and deploying smart contracts and blockchain networks.

Connect

To deliver end to end blockchain solutions for consortiums, developers need to enable organizations, people, and devices to connect to the blockchain, and to do so from a heterogeneous set of user interfaces.

Take, for example, an end to end supply chain for a commodity such as cocoa.

  • SMS and voice interfaces enable smallholder farmers in Africa to transact and track their goods at the first mile of the supply chain.
  • Internet of Things (IoT) devices deliver sensor data to track the conditions of the goods at different points in their journey to market – from the humidity in the containers where the beans are held to the temperature of the ice cream they are eventually incorporated into.
  • Mobile clients enable logistics providers to accept and transfer responsibility for products on their journey from manufacturer to retail using the compute power that already exists in the pockets of their employees. Mobile devices also have sensors such as GPS and cameras that can add complementary data to help attest to the what, where, and when of deliveries.
  • Backend Systems and Data in the form of ERP systems such as Dynamics and SAP are used to manage core processes for different participants. These systems also become clients via extension and need to interact with smart contracts to provide and receive attestable data on behalf of an organization.
  • Bots and assistants enable manufacturers and retailers to interact with the supply chain. This includes interacting with smart contracts for orders and provenance using natural language and using attestable data from the blockchain to direct actions taken on behalf of a user.
  • Web clients enable end consumers to query the origin of the product purchased at retail, typically a mix of provenance data and the story of the product’s journey from “farm to fork”.

The Azure Blockchain Development Kit includes samples for all of these scenarios, including inbound and outbound SMS, IVR, IoT Hub and IoT Central, Xamarin mobile client for iOS and Android, Dynamics integration via Common Data Service (CDS), bots and assistants (Cortana, Alexa, Google Assistant) and web UX.

Integrate

Businesses are using blockchain and smart contracts to facilitate multi-party processes. Blockchain also delivers real-time transparency of the states and events of those contracts to appropriate participants.

End to end blockchain solutions require integration with data, software, and media that live “off chain”. External updates and events can trigger actions on smart contracts. Smart contract events and state changes can then trigger actions and data updates to “off chain” systems and data. These external systems and AI will also need the ability to query attestable data from smart contracts to inform action.

Specifically, there are two areas of integration where guidance is most needed:

Documents and Media: Documents and media do not belong on chain, but business processes often involve images, videos, audio, Office documents, CAD files for 3D printers or other file types.

The common pattern is to generate a unique hash of the media and of the metadata that describes it. Those hashes are then placed on a public or private chain. If the authenticity of a file is ever questioned, the “off chain” file can be re-hashed at a later time and that hash compared to the “on chain” hash stored on the blockchain. If the hashes match, the document is authentic, but if so much as a pixel of an image or a letter in a document is changed, the hashes will not match, making it obvious that tampering has occurred.
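
The hashing step itself is simple; a minimal Python sketch of the pattern might look like the following, where the two digests are what would be written to the file registry contract on chain (the metadata fields are illustrative):

```python
import hashlib
import json
from typing import Tuple

def hash_file_and_metadata(path: str, metadata: dict) -> Tuple[str, str]:
    """Return SHA-256 digests for a file and for the metadata describing it."""
    sha = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large media files don't have to fit in memory.
        for chunk in iter(lambda: f.read(8192), b""):
            sha.update(chunk)
    file_hash = sha.hexdigest()

    # Hash the metadata in canonical (sorted-key) JSON form so the same
    # metadata always produces the same digest.
    metadata_bytes = json.dumps(metadata, sort_keys=True).encode("utf-8")
    metadata_hash = hashlib.sha256(metadata_bytes).hexdigest()

    return file_hash, metadata_hash
```

Re-running the same function against the “off chain” copy later and comparing the digests with the “on chain” values is what reveals tampering.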

Today we’re releasing a set of Logic Apps that enable the hashing of files and file related metadata. Also included are smart contracts for files and a file registry to store the hashes on chain.

Logic Apps have been created to deliver this functionality for files added to the most popular sources for documents and media, including Azure Storage, OneDrive, OneDrive for Business, SharePoint, Box, Adobe Creative Cloud, and FTP.

Documents and Media

Smart Contract Interaction: Getting blockchain off the whiteboard and into production means dealing with the realities of how counterparties interact today. The reality is that enterprise integration is messy.

Microsoft brings our decades of experience in this area to blockchain. Our work integrating enterprise systems began almost two decades ago with the introduction of BizTalk Server, and our focus on database integration traces back to our co-development of Open Database Connectivity (ODBC) in the 1990s. All of our experience has been captured and made available in Azure services. This includes the 200+ connectors available in Logic Apps and Flow, and the robust capabilities in our data platform.

Smart Contract Interaction

The Azure Blockchain Development Kit includes Workbench integration samples in both of these areas.

Logic App Connectors for Blockchain

Today, we are also announcing that we will release a set of Logic App and Flow Connectors to extend these samples to ledgers like Ethereum, Corda, Bitcoin, and others.

“At R3, we are committed to ensuring developers can deploy CorDapps quickly, securely and easily. The Azure Blockchain Development Kit will give our enterprise customers tools to integrate with the applications, software, and devices that people use every day like Outlook, Alexa, SMS, and web UX. Blockchain is moving out of the labs and into everyday business applications.”

– Mike Ward, Head of Product Management, R3

The Ethereum blockchain connector is available today and enables users to deploy contracts, call contract actions, read contract state and trigger other Logic Apps based on events from the ledger.
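
The connector exposes the same operations you could otherwise script against a node yourself. For comparison, here is a hedged web3.py sketch of reading contract state and calling a contract action directly; the RPC endpoint, contract address, ABI, and function names are placeholders, and the Logic App connector wraps equivalent steps behind its designer surface.

```python
from web3 import Web3

# Hypothetical RPC endpoint, contract address, and ABI.
w3 = Web3(Web3.HTTPProvider("https://<your-ethereum-node>:8545"))

contract_abi = []  # placeholder: use the ABI emitted by the Solidity compiler
contract = w3.eth.contract(
    address="0x0000000000000000000000000000000000000000",  # placeholder address
    abi=contract_abi,
)

# Read contract state (a view function call, no transaction required).
current_state = contract.functions.state().call()

# Call a contract action by sending a transaction from an unlocked account.
tx_hash = contract.functions.approve().transact({"from": w3.eth.accounts[0]})
receipt = w3.eth.wait_for_transaction_receipt(tx_hash)
print(current_state, receipt.status)
```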

Logic Apps

Deploy

With the mainstreaming of blockchain technology in enterprise software development, organizations are asking for guidance on how to deliver DevOps for smart contracts and blockchain projects.

Common questions include:

  • My business logic and the data schema for that logic are reflected in smart contracts. Smart contracts are written in languages I’m less familiar with, like Solidity for Ethereum, Kotlin for Corda, or Go for Hyperledger Fabric. What tools can I use to develop them?
  • How do I do unit testing and debugging on smart contracts?
  • Many blockchain scenarios reflect multi-party transactions and business workflows. These workflows include signed transactions from multiple parties happening in specific sequences. How do I think about data for test environments in that context?
  • Smart contracts are deployed to the blockchain, which is immutable. How do I need to think about things such as infrastructure as code, local dev/test, upgrading contracts, etc.?
  • Blockchain is a data technology shared across multiple organizations in a consortium. What are the impacts on source code control and on build and release pipelines in a global, multi-party environment?

While there are some nuances to the approach, the good news is that, just like other types of solution development, blockchain development can readily be addressed with a DevOps model.

DevOps Model

Today, we’re announcing the release of the whitepaper, “DevOps for Blockchain Smart Contracts.”

“We’re excited to work with Microsoft to create the canonical DevOps experience for blockchain engineers. Our paper, ‘DevOps for Blockchain Smart Contracts’, goes into rigorous detail and provides examples on how to develop blockchain applications with an eye toward CI/CD in consortium environments.”

– Tim Coulter, Founder of Truffle

Complementing the whitepaper is an implementation guide, available through the Azure Blockchain Development Kit, that shows how to implement CI/CD for smart contracts and infrastructure as code using Visual Studio Code, GitHub, Azure DevOps and OSS from Truffle.

A great platform for blockchain application development

The Azure Blockchain Development Kit is the next step in our journey to make developing end to end blockchain applications accessible, fast, and affordable to anyone with an idea. It is built atop our investments in blockchain and connects to the compute, data, messaging, and integration services available in both Azure and the broader Microsoft Cloud to provide a robust palette for a developer to realize their vision.

Logic Apps and Flow deliver a graphical design environment with more than 200 connectors, dramatically simplifying the development of end to end blockchain solutions, and Azure Functions enables the rapid integration of custom code.

A serverless approach also reduces costs and management overhead. With no VMs to manage, built-in scalability, and an approachable pricing model, the Azure Blockchain Development Kit is within reach of every developer – from enthusiasts to ISVs to enterprises.

Solutions are written using online visual workflow designers and Visual Studio Code, a free download that provides an integrated development environment on Windows, Mac, and Linux.

The resulting applications will run atop a network that has higher rated cloud performance than other large-scale providers and enable federating identities between participants using Azure Active Directory. With Azure, those applications can be deployed to more regions than any other cloud provider and benefit from more certifications.

We look forward to seeing what you’ll build, and we’ll continue to both listen and look for ways to help as we build a decentralized future together.

To learn more about how to use these samples to build and extend blockchain applications, you can find a host of videos on our Channel 9 show Block Talk.

You can also stay up to date with the latest updates from Azure Blockchain by following us on Twitter @MSFTBlockchain.

Azure teams up with Carnegie Mellon University to bring the cloud to the Living Edge Laboratory

Edge computing is one of the greatest trends that is not only transforming the cloud market, but creating new opportunities for business and society. At its core, edge computing is about harnessing compute on a device, closest to where insights need to be realized – on a connected car, a piece of machinery, or a remote oil field, for example – so these insights come without delay and with a high-degree of accuracy. This is known as the “intelligent edge” because these devices are constantly learning, often aided by AI and Machine Learning algorithms powered by the intelligent cloud. At Microsoft we see amazing new applications of the intelligent edge and intelligent cloud every day – and yet this opportunity is so expansive – the surface has barely been scratched.

To further advance inquiry and discovery at the edge, today we are announcing Microsoft is donating cloud hardware and services to Carnegie Mellon University’s Living Edge Laboratory. Carnegie Mellon University is recognized as one of the leading global research institutions. Earlier this year, the university announced a $27.5 million semiconductor research initiative to connect edge devices to the cloud. Today’s announcement builds on existing commitments to discovery in the field of edge computing.

The Living Edge Laboratory is a testbed for exploring applications that generate large data volumes and require intense processing with near-instantaneous response times. The lab is designed to broaden access to the latest innovations in edge computing and advance discovery for edge computing applications across industries. Microsoft will donate an Azure Data Box Edge, an Azure Stack system delivered in partnership with Intel, and Azure credits for advanced machine learning and AI at the edge.

This new paradigm of cloud computing requires consistency in how an application is developed to run in the cloud and at the edge. We are building Azure to make this possible. From app platform to AI, security, and management, our customers can architect, develop, and run a distributed application as a single, consistent environment.

We offer the most comprehensive portfolio to enable computing at the edge, covering the full spectrum from IoT devices and sensors to bringing the full power of the cloud to the edge with Azure Stack. With this spectrum spanning hardware and software, along with security and advanced analytics and AI services, Microsoft brings a robust platform for developers to research and create new applications for the edge.

The Living Edge Lab was established over the past year through the Open Edge Computing Initiative, a collective effort dedicated to driving the business opportunities and technologies surrounding edge computing. With the addition of Microsoft products to the lab, faculty and students will be able to use them to develop new applications and compare their performance with other components already in place in the lab. As part of this donation, Microsoft is also joining the Open Edge Computing Initiative.

Students of Carnegie Mellon are already making exciting discoveries and applications powered by Azure AI and ML services at the edge. One of these applications is designed to help visually impaired people detect objects or people nearby. The video feeds of a stereoscopic camera worn by a user are transmitted to a nearby cloudlet, and real-time video analytics is used to detect obstacles. This information is transmitted back to the user and communicated via vibro-tactile feedback.

Another, OpenRTiST, allows a user to see the world around them in real time, “through the eyes of an artist.” The video feed from the camera of a mobile device is transmitted to a local application, transformed there by a deep neural network trained offline to learn the artistic features of a famous painting, and returned to the user’s device as a video feed. The entire round trip is fast enough to preserve the illusion that the world around the user as displayed on the device is being continuously repainted by the artist.

I encourage you to learn more about Microsoft’s vision for an intelligent cloud and intelligent edge.

This is the beginning of an exciting new chapter of research at Carnegie Mellon – stemming from a collaboration on edge computing that began 10 years ago –  and we cannot wait to see what new discoveries and scenarios come to life from the Living Edge Lab.

When should you right click publish?

Some people say ‘friends don’t let friends right click publish’ but is that true? If they mean that there are great benefits to setting up a CI/CD workflow, that’s true and we will talk more about these benefits in just a minute. First, let’s remind ourselves that the goal isn’t always coming up with the best long-term solution.

Technology moves fast, and as developers we are constantly learning and experimenting with new languages, frameworks, and platforms. Sometimes we just need to prototype something rather quickly in order to evaluate its capabilities. That’s a classic scenario where right click publish in Visual Studio provides the right balance between how much time you are going to spend (just a few seconds) and the options that become available to you (quite a few, depending on the project type), such as publishing to IIS, FTP, and a folder (great for xcopy deployments and integration with other tools).

Continuing with the theme of prototyping and experimenting, right click publish is the perfect way for existing Visual Studio customers to evaluate Azure App Service (PaaS). By following the right click publish flow you get the opportunity to provision new instances in Azure and publish your application to them without leaving Visual Studio.

When the right click publish flow has been completed, you immediately have a working application running in the cloud.

Platform evaluations and experiments take time, and during that time right click publish helps you focus on the things that matter. When you are ready and the demand for automation, repeatability, and traceability rises, that’s when investing in a CI/CD workflow starts making a lot of sense:

  • Automation: builds are kicked off and tests are executed as soon as you check in your code
  • Repeatability: it’s impossible to produce binaries without having the source code checked in
  • Traceability: each build can be traced back to a specific version of the codebase in source control, which can then be compared with another build to identify the differences

The right time to adopt CI/CD typically coincides with a milestone related to maturity, either of the application or of the team building it. If you are the only developer working on your application you may feel that setting up CI/CD is overkill, but automation and traceability can be extremely valuable even to a single developer once you start shipping to your customers and you have to support multiple versions in production.

With a CI/CD workflow you are guaranteed that all binaries produced by a build can be linked back to the matching version of the source code. You can go from a customer bug report to looking at the matching source code easily, quickly and with certainty. In addition, the automation aspects of CI/CD save you valuable time performing common tasks like running tests and deploying to testing and pre-production environments, lowering the overhead of good practices that ensure high quality.

As always, we want to see you succeed, so if you run into any issues using publish in Visual Studio or setting up your CI/CD workflow, let me know in the comments section below and I’ll do my best to get your question answered.

Engineers from Walmart and Microsoft will work side-by-side in new ‘cloud factory’

Walmart is expanding its technology center in Austin, Texas, to accelerate digital innovation that transforms how associates work and delivers more convenient ways for customers to shop.

About 30 technologists, including engineers from both Walmart and Microsoft, will work together side by side in the cloud factory, expected to open in early 2019 as an extension to a strategic partnership announced in July. The factory will be an expansion of Walmart’s innovation hub, a vibrant workplace opened earlier this year in the center of Austin’s growing technology scene.

The Walmart-Microsoft team – known internally as “4.co” for its location at Fourth and Colorado streets – will initially focus on migrating Walmart’s thousands of internal business applications to Microsoft Azure. The team will also build new, cloud-native applications. The collaboration will be part of a multi-year journey to modernize Walmart’s enterprise application portfolio, create more efficient business processes and decrease operational costs associated with legacy architecture.

Walmart’s innovation hub in Austin, Texas.

The work will build on digital solutions already deployed at Walmart, including thousands of IoT sensors on HVAC and refrigeration systems that process a billion daily data messages from stores worldwide. The solution helps Walmart save energy and prevent product loss.

Based in Arkansas, the global retailer is also deploying Microsoft AI in a number of use cases, including internal chatbots that help associates navigate benefits, find mentors and better manage supplier engagements. Walmart has also deployed a natural language processing platform capable of processing 40 terabytes of unstructured text and providing near real-time insights and actions in support of business operations.

To shed light on 4.co and the company’s innovations, Transform chatted with Clay Johnson, executive vice president and enterprise chief information officer at Walmart.

TRANSFORM: Why is digital transformation important to Walmart?

JOHNSON: Technology is changing the way people live, work and shop – and the pace of change is only getting faster. We want to create new and incredibly convenient ways for our customers to shop, and as technology cycles get shorter and shorter, we must increase our speed and agility.

To do this, we’re digitally transforming processes and empowering our associates with consumer-grade technology and tools. Digital transformation is pervasive across the company and we’re pushing the envelope to accelerate innovation.

TRANSFORM: How is Walmart using Microsoft AI? Where do you see it going?

Clay Johnson, executive vice president and enterprise chief information officer at Walmart.

JOHNSON: We’re layering AI across every facet of our business. We have lots and lots of data that represents value we can leverage. That’s where AI, machine learning, and Cognitive Services in Microsoft Azure come into play.

We took our natural language processing platform and a single business case – post-payment auditing – to build out the platform at massive scale and apply it in other use cases. For associates who are tasked with finding that needle in the haystack, machine learning will help them go through unstructured text quickly. But ultimately, it’s not just about efficiency. A lot of this will drive impact to the bottom line.

With our chatbots, associates can use them to find a mentor or ask questions about benefits. They can get simple questions answered quickly. By saving them time, that’s more time they can spend on the sales floor helping our customers.

TRANSFORM: What are some benefits you’ve seen with other technologies like IoT and edge computing?

JOHNSON: With our IoT work and sensor enablement, we’re looking at our energy consumption and other factors to predict equipment failures before they happen. Improving equipment performance can result in enhanced energy efficiency, which lowers costs and our carbon footprint. That’s good for the customer and the environment. Sustainability is a big deal for us.

Putting IoT data into edge analytics lets us look at data at a store level and backhaul it to Azure to look at it across a region or the whole U.S. We started talking to Microsoft about this concept of a set of stores being a “micro-cloud,” and you roll them into Azure for data analytics and insights. A lot of this work is coming out of our Austin site.

TRANSFORM: How did the concept of 4.co come to be? What are the team’s goals?

JOHNSON: With this partnership with Microsoft, we started talking: “Hey, what’s the best way to accelerate all the stuff we’re doing here? We need help and expertise. We want to move fast. How do we partner our smart people with Microsoft’s smart people?”

Then it was obvious: “Why don’t we just co-locate the teams together?” We haven’t done something like this before with co-location, but I think the outcomes are going to be huge and strengthen our partnership even more. You’re going to see a lot more co-innovation around IoT, computer vision, big data and real-time analytics.

TRANSFORM: How do you think both companies will benefit?

JOHNSON: We’re going to learn a lot from each other. We’re going to learn a lot from Microsoft –  which apps make sense to get to the cloud quickly and which don’t.

Microsoft’s going to get to see stuff at a scale they’ve never seen before. [Walmart has 11,200 global stores and 2.2 million associates]. I think they’ll learn a lot from our footprint. Co-locating top engineers from both companies will deepen the technical brainpower for creating disruptive, large-scale enterprise solutions for Walmart.

Walmart’s innovation hub in Austin, Texas.

TRANSFORM: Why did Walmart pick Austin for its technology office and 4.co? How are you interacting with the community there?

JOHNSON: You see a lot of tech companies and startups in Austin, and we wanted to blend that culture and the universities with our culture as we continue to grow and add tech talent to Walmart. Austin is a huge recruiting opportunity for us. Because our team in Austin works closely with our technology teams in Bentonville, [Arkansas], a pipeline of technology talent into Bentonville has also emerged.

Austin is also a very meetup-heavy city. So we were really purposeful in picking a location that puts us in the middle of the tech meetup scene. We host deep learning, AI and chatbot meetups. We do a lot of Women Who Code events.

We recently did a hackathon with Microsoft and the University of Texas at Austin, where we worked with a no-kill animal shelter. The students really enjoyed doing something for the local community. [The event focused on helping dogs through predictive modeling of the canine parvovirus disease].

TRANSFORM: What’s your vision for the future of retail? 

JOHNSON: I think you’re going to see an omnichannel-type world that is a blend of e-commerce and brick and mortar, where consumers can have access to things at any point, any time. They can order online and pick up in the store or have it delivered same-day. You can see we’re starting to do that with our online grocery right now.

Top photo: Walmart’s technology center in Austin, Texas. All photos courtesy of Walmart. Follow Clay Johnson @ClayMJohnson.

Building an ecosystem for responsible drone use and development on Azure

The next wave of computing is already taking shape around us, from IoT enabling businesses to sense all aspects of their operations in real time and take informed action, to running cloud workloads on those IoT devices themselves so they don’t require “always on” connectivity to the cloud to make real-time, context-aware decisions. This is the intelligent edge, and it will define the next wave of innovation, not just for business, but also for how we address some of the world’s most pressing issues.

Drones, or unmanned aircraft systems (UAS), are great examples of intelligent edge devices being used today to address many of these challenges, from search and rescue missions and natural disaster recovery, to increasing the world’s food supply with precision agriculture. With the power of AI at the edge, drones can have a profound impact in transforming businesses and improving society, as well as in assisting humans in navigating high-risk areas safely and efficiently.

With these advanced capabilities also comes great responsibility, including respecting the laws that govern responsible drone use in our airspace, as well as being thoughtful about how drones are used to scan the environment. We believe it is important to protect data wherever it lives, from the cloud to the intelligent edge.

In addition to building the platforms for innovation, we at Microsoft believe it is crucial to also invest in companies and partnerships that will enable the responsible use of drones and associated data. Today we are making two important announcements that further our commitment to responsible use of drones as commercial IoT edge devices running on Microsoft Azure.

AirMap powered by Microsoft Azure

Today we announced that AirMap has selected Microsoft Azure as the company’s exclusive cloud-computing platform for its drone traffic management platform and developer ecosystem.

Drone usage is growing quickly across industries such as transportation, utilities, agriculture, public safety, emergency response, and more, to improve the efficiency and performance of existing business processes. Data generated by drones is infused with intelligence to augment the value that companies and governments deliver to customers and communities. However, concerns about regulatory compliance, privacy, and data protection still prevent organizations from adopting drone technology at scale.

AirMap works with civil aviation authorities, air navigation service providers, and local authorities to implement an airspace management system that supports and enforces secure and responsible access to low-altitude airspace for drones.

With AirMap’s airspace management platform running on Microsoft Azure, the two companies are delivering a platform that will allow state and local authorities to authorize drone flights and enforce local rules and restrictions on how and when they can be operated. Their solution also enables companies to ensure that compliance and security are a core part of their enterprise workflows that incorporate drones.

AirMap selected Microsoft Azure because it provides the critical cloud-computing infrastructure, security, and reliability needed to run these mission-critical airspace services and orchestrate drone operations around the world.

Earlier this year, Swiss Post, the postal service provider for Switzerland, joined with drone manufacturer Matternet and Insel Group, Switzerland’s largest medical care system, to fly time-sensitive laboratory samples between Tiefanau Hospital and University Hospital Insel in Bern. This was an alternative to ground transport, where significant traffic can cause life-threatening delays. The operations are supported by Switzerland’s airspace management system for drones, powered by AirMap and Microsoft Azure, for safety and efficiency.

DJI Windows SDK for app development enters public preview

Last May at our Build developer conference, we announced a partnership with DJI, the world’s leader in civilian drones and aerial imaging technology, to bring advanced AI and machine learning capabilities to DJI drones, helping businesses harness the power of commercial drone technology and edge cloud computing.

Today at DJI’s AirWorks conference, we are announcing the public preview of the Windows SDK, which allows applications to be written for Windows 10 PCs that control DJI drones. The SDK will also allow the Windows developer community to integrate and control third-party payloads like multispectral sensors, robotic components like custom actuators, and more, exponentially increasing the ways drones can be used in the enterprise.

With this SDK, we now have three methods to enable Azure AI services to interact with drone imagery and video in real time (a minimal sketch of the first method follows the list below):

  1. Drone imagery can be sent directly to Azure for processing by an AI workload.
  2. Drone imagery can be processed on Windows running Azure IoT Edge with an AI workload.
  3. Drone imagery can be processed directly onboard drones running Azure IoT Edge with an AI workload.
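
As a minimal sketch of the first method, the snippet below posts a captured frame to the Azure Computer Vision Analyze endpoint over plain HTTP. The endpoint region, key, and file name are placeholders; a production pipeline would more likely stream frames through IoT Hub or run the model on the device with Azure IoT Edge, as in the second and third methods.

```python
import requests

# Hypothetical Computer Vision endpoint and key.
endpoint = "https://<your-region>.api.cognitive.microsoft.com/vision/v2.0/analyze"
subscription_key = "<your-computer-vision-key>"

with open("drone-frame.jpg", "rb") as image_file:
    image_data = image_file.read()

response = requests.post(
    endpoint,
    params={"visualFeatures": "Objects,Description"},
    headers={
        "Ocp-Apim-Subscription-Key": subscription_key,
        "Content-Type": "application/octet-stream",
    },
    data=image_data,
)
response.raise_for_status()
analysis = response.json()

# Each detected object comes back with a label and a bounding box.
for obj in analysis.get("objects", []):
    print(obj["object"], obj["rectangle"])
```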

We take the security of data seriously, from the cloud to edge devices such as drones. Azure IoT Edge includes an important subsystem, called the security manager, which acts as a core for protecting the IoT Edge device and all its components by abstracting the secure silicon hardware. It is the focal point for security hardening and provides IoT device manufacturers the opportunity to harden their devices based on their choice of hardware secure modules (HSM). Finally, the Azure certified for IoT program only certifies third-party Azure IoT Edge hardware that meets our strict security requirements.

We are encouraged by all the creative applications of drones we are seeing on Azure today across industry sectors, and the following are a few examples of this.

  • SlantRange: An aerial remote sensing and data analytics company servicing the information needs of the agriculture industry, with over two decades of experience in earth science, defense, and intelligence applications. The company has patented foundational technologies for aerial crop inspections and introduced innovative analytical methods that deliver valuable agronomic data within minutes of collection, anywhere in the world, using low-power edge computing devices. These technologies have propelled SlantRange’s growth from a few Nebraska corn and soybean farms just a few seasons ago to over 40 countries and a wide variety of crops today, including contracts with many of the world’s leading agricultural suppliers and producers.
  • Clobotics: A global AI company helping wind energy companies improve their productivity by automatically inspecting, processing, and reporting wind turbine blade defects. The company has inspected over 1,000 wind turbines around the world in the past few months. Clobotics’ end-to-end solutions combine computer vision, machine learning, and data analytics software with commercial drones and sensors to help the wind energy industry automate its inspection service. Clobotics’ Wind Turbine Data Platform, along with its computer vision-based edge computing service, provides a first-of-its-kind wind turbine blade life-cycle management service. As a Microsoft worldwide Azure partner, Clobotics works closely with Microsoft Azure and the IoT platform to provide the most innovative and reliable services to its enterprise customers.
  • eSmart Systems: Brings more than 20 years of global intelligence to provide software solutions to the energy industry, service providers, and smart cities. By bringing AI to the edge on Azure IoT Edge, eSmart Systems is revolutionizing grid inspections. Their Connected Drone solution can analyze 200,000 images in less than one hour – more than a human expert can review in a year. The result is reduced operational cost, better insights into the current status of the grid, and fewer outages.

Earlier this year we announced a $5 billion investment in IoT and the intelligent edge to continue innovation, strategic partnerships, and programs. The latest announcements with strategic partners like AirMap and DJI continue to push the boundaries of what is possible, and we look forward to seeing what our joint customers go on to build.

Mad? Sad? Glad? Video Indexer now recognizes these human emotions

Many different customers across industries want insights into the emotional moments that appear in different parts of their media content. For broadcasters, this can help create more impactful promotion clips and drive viewers to their content; in the sales industry it can be super useful for analyzing sales calls and improving conversion; in advertising it can help identify the best moment to pop up an ad, and the list goes on and on. To that end, we are excited to share Video Indexer’s (VI) new machine learning model that mimics human behavior to detect four cross-cultural emotional states in videos: anger, fear, joy, and sadness.

Endowing machines with the cognitive abilities to recognize and interpret human emotions is a challenging task due to their complexity. As humans, we use multiple mediums to analyze emotions, including facial expressions, voice tonality, and speech content. Ultimately, the determination of a specific emotion is the result of combining these three modalities to varying degrees.

While traditional sentiment analysis models detect the polarity of content – for example, positive or negative – our new model aims to provide a finer granularity analysis. For example, given a moment with negative sentiment, the new model determines whether the underlying emotion is fear, sadness, or anger. The following figure illustrates VI’s emotion analysis of Microsoft CEO Satya Nadella’s speech on the importance of education. At the very beginning of his speech, a sad moment was detected.

Microsoft CEO Satya Nadella's speech on the importance of education

All the detected emotions and their specific appearances along the video are enumerated in the video index JSON as follows:

video index JSON
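
For example, a small Python helper like the one below could pull those emotion instances out of the index JSON returned by the Video Indexer API. The field names follow the general shape of the index (videos → insights → emotions → instances) and should be verified against the current API reference.

```python
import json

def list_emotion_moments(index_json: str) -> None:
    """Print each detected emotion and when it appears in the video.

    Assumes the general shape of the Video Indexer index
    (videos -> insights -> emotions -> instances); verify the exact
    field names against the current API reference.
    """
    index = json.loads(index_json)
    for video in index.get("videos", []):
        for emotion in video.get("insights", {}).get("emotions", []):
            for instance in emotion.get("instances", []):
                print(f"{emotion['type']}: {instance['start']} - {instance['end']}")
```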

Cross-channel emotion detection in VI

The new functionality utilizes deep learning to detect emotional moments in media assets based on speech content and voice tonality. VI detects emotions by capturing semantic properties of the speech content. However, semantic properties of single words are not enough, so the underlying syntax is also analyzed because the same words in a different order can induce different emotions.

Syntax

VI leverages the context of the speech content to infer the dominant emotion. For example, the sentence “… the car was coming at me and accelerating at a very fast speed …” has no negative words, but VI can still detect fear as the underlying emotion.

VI analyzes the vocal tonality of speakers as well. It automatically detects segments with voice activity and fuses the affective information they contain with the speech content component.

Video Indexer

With the new emotion detection capability in VI, which relies on speech content and voice tonality, you can gain deeper insight into the content of your videos and leverage it for marketing, customer care, and sales purposes.

For more information, visit VI’s portal or the VI developer portal, and try this new capability for free. You can also browse sample videos indexed for emotional content: sample 1, sample 2, and sample 3.

Have questions or feedback? We would love to hear from you!

Use our UserVoice to help us prioritize features, or email VISupport@Microsoft.com with any questions.

Use Hybrid Connections to Incrementally Migrate Applications to the Cloud

As the software industry shifts to running software in the cloud, organizations are looking to migrate existing applications from on-premises to the cloud. Last week at Microsoft’s Ignite conference, Paul Yuknewicz and I delivered a talk focused on how to get started migrating applications to Azure (watch the talk free) where we walked through the business case for migrating to the cloud, and choosing the right hosting and data services.

If your application is a candidate for running in App Service, one of the most useful pieces of technology that we showed was Hybrid Connections. Hybrid Connections let you host a part of your application in Azure App Service while calling back into resources and services not running in Azure (e.g. still on-premises). This enables you to try running a small part of your application in the cloud without the need to move your entire application and all of its dependencies at once – a process that is usually time consuming and extremely difficult to debug when things don’t work. So, in this post I’ll show you how to host an ASP.NET front-end application in the cloud and configure a hybrid connection to connect back to a service on your local machine.

Publishing Our Sample App to the Cloud

For the purposes of this post, I’m going to use the Smart Hotel 360 App sample that uses an ASP.NET front end that calls a WCF service which then accesses a SQL Express LocalDB instance on my machine.

The first thing I need to do is publish the ASP.NET application to App Service. To do this, right click on the “SmartHotel.Registration.Web” project and choose “Publish”

The publish target dialog defaults to App Service, and I want to create a new instance, so I will just click the “Publish” button.

This will bring up the “Create App Service” dialog.  Next, I will click “Create” and wait for a minute while the resources in the cloud are created and the application is published.

When it’s finished publishing, my web browser will open to my published site. At this point, there will be an error loading the page since it cannot connect to the WCF service. To fix this we’ll add a hybrid connection.

Create the Hybrid Connection

To create the Hybrid Connection, I navigate to the App Service I just created in the Azure Portal. One quick way to do this is to click the “Manage in Cloud Explorer” link on the publish summary page.

Right click the site, and choose “Open in Portal”. (You can also navigate there manually by logging into the Azure portal, clicking App Services, and choosing your site.)

To create the hybrid connection:

Click the “Networking” tab in the Settings section on the left side of the App Service page

Click “Configure your hybrid connection endpoints” in the “Hybrid connections” section

Next, click “Add a hybrid connection”

Then click “Create a new hybrid connection”

Fill out the “Create new hybrid connection” form as follows:

  • Hybrid connection Name: any unique name that you want
  • Endpoint Host: This is the machine URL your application is currently using to connect to the on-premises resource. In this case, this is “localhost” (Note: per the documentation, use the hostname rather than a specific IP address if possible as it’s more robust)
  • Endpoint Port: The port the on-premises resource is listening on. In this case, the WCF service on my local machine is listening on 2901
  • Service Bus namespace: If you’ve previously configured hybrid connections you can re-use an existing one; in this case we’ll create a new one and give it a name

Click “OK”. It will take about 30 seconds to create the hybrid connection; when it’s done, you’ll see it appear on the Hybrid connections page.

Configure the Hybrid Connection Locally

Now we need to install the Hybrid Connection Manager on the local machine. To do this, click the “Download connection manager” on the Hybrid connections page and install the MSI.

After the connection manager finishes installing, launch the “Hybrid Connections Manager UI”; it should appear in your Windows Start menu if you type “Hybrid Connections”. (If for some reason it doesn’t appear on the Start Menu, launch it manually from “C:\Program Files\Microsoft\HybridConnectionManager <version#>”)

Click the “Add a new Hybrid Connection” button in the Hybrid Connections Manager UI and login with the same credentials you used to publish your application.

Choose the subscription you used to publish your application from the “Subscription” dropdown, choose the hybrid connection you just created in the portal, and click “Save”.

In the overview, you should see the status say “Connected”. Note: If the state won’t change from “Not Connected”, I’ve found that rebooting my machine fixes this (it can take a few minutes to connect after the reboot).

Make sure everything is running correctly on your local machine; then, when we open the site running in App Service, we can see that it loads with no error. In fact, we can even put a breakpoint in the GetTodayRegistrations() method of Service.svc.cs, hit F5 in Visual Studio, and when the page loads in App Service the breakpoint on the local machine is hit!

Conclusion

If you are looking to move applications to the cloud, I hope that this quick introduction to Hybrid Connections will enable you to try moving things incrementally. The Azure App Service Hybrid Connections documentation is a good resource for going deeper.

As always, if you have any questions or problems, let me know via Twitter or in the comments section below.

Azure preparedness for Hurricane Florence

As Hurricane Florence continues its journey to the mainland, our thoughts are with those in its path. Please stay safe. We’re actively monitoring Azure infrastructure in the region. We at Microsoft have taken all precautions to protect our customers and our people.

Our datacenters (US East, US East 2, and US Gov Virginia) have been reviewed internally and externally to ensure that we are prepared for this weather event. Our onsite teams are prepared to switch to generators if utility power is unavailable or unreliable. All our emergency operating procedures have been reviewed by our team members across the datacenters, and we are ensuring that our personnel have all necessary supplies throughout the event.

As a best practice, all customers should consider their disaster recovery plans and all mission-critical applications should be taking advantage of geo-replication.

Rest assured that Microsoft is focused on the readiness and safety of our teams, as well as our customers’ business interests that rely on our datacenters. 

You can reach our handle @AzureSupport on Twitter; we are online 24/7. Any business impact to customers will be communicated through Azure Service Health in the Azure portal.

If there is any change to the situation, we will keep customers informed of Microsoft’s actions through this announcement.

For guidance on disaster recovery best practices, see the Azure disaster recovery documentation.

New Azure Pipelines service helps devs build, test and deploy to any platform or cloud

With the introduction of Azure DevOps today, we’re offering developers a new CI/CD service called Azure Pipelines that enables you to continuously build, test, and deploy to any platform or cloud. It has cloud-hosted agents for Linux, macOS, and Windows, powerful workflows with native container support, and flexible deployments to Kubernetes, VMs, and serverless environments.

Microsoft is committed to fueling open source software development. Our next step in this journey is to provide the best CI/CD experience for open source projects. Starting today, Azure Pipelines provides unlimited CI/CD minutes and 10 parallel jobs to every open source project for free. All open source projects run on the same infrastructure that our paying customers use. That means you’ll have the same fast performance and high quality of service. Many of the top open source projects are already using Azure Pipelines for CI/CD, such as Atom, CPython, Pipenv, Tox, Visual Studio Code, and TypeScript – and the list is growing every day.

Atom, for example, runs parallel jobs on Linux, macOS, and Windows for its CI.

Azure Pipelines app on GitHub Marketplace

Azure Pipelines has an app in the GitHub Marketplace so it’s easy to get started. After you install the app in your GitHub account, you can start running CI/CD for all your repositories.

Pull Request and CI Checks

When the GitHub app is set up, you’ll see CI/CD checks on each commit to your default branch and every pull request.

Our integration with the GitHub Checks API makes it easy to see build results in your pull request. If there’s a failure, the call stack is shown as well as the impacted files.

More than just open source

Azure Pipelines is also great for private repositories. It is the CI/CD solution for companies like Columbia, Shell, Accenture, and many others. It’s also used by Microsoft’s biggest projects like Azure, Office 365, and Bing. Our free offer for private projects includes a cloud-hosted job with 1,800 minutes of CI/CD a month, or you can run unlimited minutes of CI/CD on your own hardware, whether hosted in the cloud or on-premises. You can purchase parallel jobs for private projects from Azure DevOps or the GitHub Marketplace.

In addition to CI, Azure Pipelines has flexible deployments to any platform and cloud, including Azure, Amazon Web Services, and Google Cloud Platform, as well as any of your on-premises servers running Linux, macOS, or Windows. There are built-in tasks for Kubernetes, serverless, and VM deployments. Also, there’s a rich ecosystem of extensions for the most popular languages and tools. The Azure Pipelines agent and tasks are open source, and we’re always reviewing feedback and accepting pull requests on GitHub.

Join our upcoming live streams to learn more about Azure Pipelines and other Azure DevOps services.


  • Keynote: Watch our live Azure DevOps keynote on September 11, 2018 from 8:00 – 9:30 AM Pacific Time.


  • Live training: Join our live Mixer workshop with interactive Q&A on September 17, 2018 from 8:30 AM – 2:30 PM Pacific Time.

You can save the date and watch both live streams on our events page. There you’ll also find additional on-demand videos and other resources to help get you started.

I’m excited for you to try Azure Pipelines and tell us what you think. You can share your thoughts directly with the product team via @AzureDevOps, Developer Community, or the comments on this post.

Jeremy Epling

@jeremy_epling