
Web and Azure Tool Updates in Visual Studio 2019

Angelos Petropoulos


Hopefully by now you’ve seen that Visual Studio 2019 is now generally available. As you would expect, we’ve added improvements for web and Azure development. As a starting point, Visual Studio 2019 comes with a new experience for getting started with your code and we updated the experience for creating ASP.NET and ASP.NET Core projects to match:

If you are publishing your application to Azure, you can now configure Azure App Service to use Azure Storage and Azure SQL Database instances right from the publish profile summary page, without leaving Visual Studio. This means you can add SQL and Storage to any existing web application running in App Service; it is no longer limited to creation time.

By clicking the “Add” button you get to select between Azure Storage and Azure SQL Database (more Azure services to be supported in the future):

and then you get to choose between using an existing instance of Azure Storage that you provisioned in the past or provisioning a new one right then and there:

When you configure your Azure App Service through the publish profile as demonstrated above, Visual Studio will update the Azure App Service application settings to include the connection strings you have configured (e.g. in this case azgist). It will also apply hidden tags to the instances in Azure about how they are configured to work together so that this information is not lost and can be re-discovered later by other instances of Visual Studio.

For a 30 minute overview of developing with Azure in Visual Studio, check out the session we gave as part of the launch:

As always, we welcome your feedback. Tell us what you like and what you don’t like, tell us which features you are missing and which parts of the workflow work or don’t work for you. You can do this by submitting issues to Developer Community or contacting us via Twitter.



The future of manufacturing is open

With the expansion of IoT across all industries, data is becoming the currency of innovation. Organizations have both an opportunity and a business imperative to adopt technologies quickly, build digital competencies, and offer new value-added services that will serve their broader ecosystem.

Manufacturing is an industry where IoT is having a transformational impact, yet one that also requires many companies to come together for IoT to be effective. We see several challenges that slow down innovation in manufacturing, such as proprietary data structures from legacy industrial assets and closed industrial solutions. These closed structures foster data silos and limit productivity, hindering production and profitability. It takes more than new software to drive transformation: it takes a new approach to open standards, an ecosystem mindset, the ability to break data out of the "walled garden," and new technology.

This is why Microsoft has invested heavily in making Azure work seamlessly with OPC UA. In fact, we are the leading contributor of open source software to the OPC Foundation. To further this open platform approach, we have collaborated with world-leading manufacturers to accelerate innovation in industrial IoT to shorten time to value. But we feel we need to do more, not just directly between Microsoft and our partners but across the industry and between the partners themselves. It’s not about what any one company can deliver within their operations – it’s about what they can share with others across the sector to help everyone achieve at new levels. It’s clearly a much bigger task than any one organization can take on, and today, I’m pleased to share more about the investments we are making to advance innovation in the manufacturing space by enabling open platforms.

Announcing the Open Manufacturing Platform

Today at Hannover Messe 2019, we are launching the Open Manufacturing Platform (OMP) together with the BMW Group, our partner on this initiative. Built on the Microsoft Azure Industrial IoT cloud platform, the OMP will provide a reference architecture and open data model framework for community members, who will both contribute to and learn from others around industrial IoT projects. We've set up an initial approach and are actively working to bring new community members on board. BMW's initial use case focuses on its IoT platform, built on Microsoft Azure, powering the second generation of autonomous transport systems at one of its sites, greatly simplifying its logistics processes and creating greater efficiency. More information about this and the partnership can be found here.

The OMP provides a single open platform architecture that liberates data from legacy industrial assets, standardizes data models for more efficient data correlation, and most importantly, enables manufacturers to share their data with ecosystem partners in a controlled and secure way, allowing others to benefit from their insights. With pre-built industrial use cases and reference designs, community members will work together to address common industrial challenges while maintaining ownership over their own data. Our news release, shared jointly with the BMW Group this morning, can be found here.

A rising tide that lifts all boats

The recognition of the need for an open approach is taking hold across the industry, as evidenced by SAP’s announcement today of the Open Industry 4.0 Alliance. This alliance – focused on factories, plants and warehouses – between SAP and a number of European manufacturing leaders will help create an open ecosystem for the operation of highly automated factories.

OMP and the Open Industry 4.0 Alliance are complementary visions. Both recognize the need for an open platform for the cloud and intelligent edge on the ground in the factory. Both highlight an open data model and standards-based data exchange mechanisms that allow for cross-company collaboration.

We’ve been working closely with SAP on efforts like the Open Data Initiative and across the industry on a wide range of initiatives including the Industrial Internet Consortium, the Plattform Industrie 4.0 and the OPC Foundation. We look forward to continuing this fruitful partnership and working to align OMP and the Open Industry 4.0 Alliance. Collaboration is the lifeblood of future manufacturing and the more we work together, the more we can accomplish.

Read more here.


Transforming manufacturing with intelligent business applications

Manufacturing has been a driving force for industrial and societal transformation for centuries. Next week, at Hannover Messe—the world's leading trade show for industrial technology—decades of industrial technology innovation at Microsoft are intersecting with Industry 4.0. As Industry 4.0 ushers in new technological advances to improve operations, competition and customer demands are keeping pace. Customers expect exceptional products and services, without exception, driving a need for greater innovation.

We are joined at the event by dozens of global manufacturers representing industries from automotive to consumer electronics and construction equipment, all using Microsoft Business Applications as a competitive differentiator; intelligent technologies that help transform the entire connected manufacturing ecosystem.

Optimize manufacturing operations and deliver new services

At Hannover Messe, we are showcasing how we empower manufacturers to connect Internet of Things (IoT) sensors on key business-critical assets to business transactions in Dynamics 365 for Finance and Operations. It’s a transformative solution to manage production and stock in real-time and proactively resolve issues, optimize manufacturing operations, maximize the value of assets, and take business performance and customer satisfaction to new heights. Read more about the solution here.

Annata, a leading Microsoft Partner for automotive, heavy equipment, and industrial machinery industries, helped Iceland’s largest vehicle importer and distributor, Brimborg, rapidly expand into the commercial fleet rental market in response to the 2008-11 Icelandic financial crisis. Booth visitors will learn how Annata first unified Brimborg’s business of importing, distributing, selling and servicing cars and construction equipment, with a solution that could cover their entire business with Dynamics 365 for Finance and Operations along with the Annata 365 solution.

When the crisis hit, Brimborg deployed the extensive Rental module, which enabled it to manage this new part of its business. Brimborg is using IoT data flowing from its rental cars with the service scheduling capabilities of the Annata solution to optimize and ensure timely servicing of the fleet. All rental cars are automatically called in for service and inspections, which is highly important for keeping the fleet healthy. Brimborg is continuing to automate and innovate using IoT, Dynamics 365, and the Annata solution to run its business more efficiently, but also to open new business models and opportunities with customers in other areas of its business. It is targeting four-minute car deliveries and 30-second customer returns, and planning innovative new business models based on Microsoft and Annata solutions.

Make the leap to intelligent, connected field service

Attendees at Hannover Messe are getting a glimpse into the future of field service for manufacturers, powered by Microsoft IoT Central, Dynamics 365 for Field Service, and Dynamics 365 Mixed Reality for HoloLens and mobile devices.

Dynamics 365 Connected Field Service transforms field service organizations with solutions to detect and resolve issues remotely before the customer is even aware of them, drive efficiency, and cut costs. The solution leverages connected devices and machine learning capabilities to combine remote monitoring, digital services, and predictive maintenance.

As recently announced, Dynamics 365 Remote Assist for mobile devices brings the HoloLens experience to mobile devices, allowing floor operators and field technicians to collaborate with remote experts and troubleshoot issues in context, without leaving the job site.  Dynamics 365 Guides is a new mixed reality tool that allows employees to learn by doing, with interactive, step-by-step guidance presented on heads-up, hands-free displays in real work situations.

Dynamics 365 Connected Field Service.

Toyota’s North America Production Engineering team uses Dynamics 365 Layout and Dynamics 365 Remote Assist on Microsoft HoloLens in their North America manufacturing centers. Dynamics 365 Layout can improve business processes in valuable ways, such as the layout of digital twins of equipment on the manufacturing floor for safety and process verification, and creating innovative AR training practices through the use of holographic equipment instead of physical equipment in space. Dynamics 365 Remote Assist is in the early phases of being used to improve safety and reduce costs through the use of remote experts for equipment verification and incident response.

At Hannover Messe, Microsoft Partner Hitachi Solutions will showcase how they are helping global organizations create outcome-driven, connected field services with IoT and mixed reality to support maintenance work by field workers to increase uptime and service continuity.

Empower Your Changing Workforce

Start a conversation about Industry 4.0, and inevitably the topic lands on the workforce problem.

As a generation of highly-skilled, specialized workers eye retirement, manufacturers are in a bind to recruit young engineers and operators with the right skillset while adjusting to a new workforce, challenging traditional work culture.

Dynamics 365 for Talent helps HR teams at manufacturing organizations to solve the skills gap, offering intelligent tools to find, attract, and onboard skilled candidates. By automating many manual, time-consuming HR processes, Dynamics 365 for Talent lets HR professionals spend less time on the mundane, and more time on strategic initiatives that grow the business.

New this month, Dynamics 365 Guides allows employees to learn by doing with step-by-step instructions that guide employees to the tools and parts they need and how to use them in real work situations. Guides represents a new way to improve workflow efficiency. Now employees can learn while staying hands-on with their work. With the accompanying PC app, it’s possible for managers and frontline workers to create interactive training content by attaching photos, videos, and 3D models to information cards that stay with them while they work. When employees use Guides, information is collected to help managers understand how they’re doing and where they need help, further allowing people to improve the process.

Check out the video below to learn how PACCAR is exploring Guides and HoloLens to improve productivity and employee onboarding.

Get the full story at Hannover Messe

These solutions and customer stories are just a peek at how Microsoft Business Applications are helping transform manufacturing. If you are attending Hannover Messe, visit our booth to experience Dynamics 365 and Mixed Reality solutions firsthand, as well as chat with customers and partners.

Find more information about our location and sessions in this schedule, and be sure to check out the resources below.


Closing the skills gap in manufacturing with Microsoft 365

The manufacturing industry is being transformed by the rise in new digital industrial technology, known as Industry 4.0. New technologies are changing every stage of production, increasing productivity, optimizing operations, and unlocking new areas of growth. In order for manufacturers to capture the value this technology unlocks, they’ll need to ensure their workforce has the right skills and the right tools.

This is especially true as it relates to an organization’s Firstline Workforce. In manufacturing, Firstline Workers are the employees who deliver products and materials, drive product quality, and keep critical equipment running. To help manufacturers with their digital transformation, we’re enabling new ways to work with Microsoft 365 for Firstline Workers to learn, communicate, and collaborate more effectively.

Upskilling and equipping the Firstline Workforce

With the rise of Industry 4.0, manufacturers must reimagine the roles, skills, and tools to transform work throughout their organization. This means providing digital and soft skills, empowering workers with modern tools, and blurring the boundaries of technology with new immersive experiences. In an increasingly digital and complex landscape, the types of skills that employees need are rapidly evolving, and it is increasingly difficult for the workforce to keep pace.

Solutions in Microsoft 365 that enable Firstline Workers to learn, communicate, and collaborate include:

  • Using Microsoft Teams and SharePoint Online, manufacturers can securely centralize training efforts, easily distribute onboarding and training materials, and connect all levels of the organization to find and share best practices.

  • Using Microsoft Stream, organizations can deliver dynamic, role-based content and video to increase engagement and retention of training programs and support peer-to-peer information sharing.

To help equip workers to operate in a digitally-enabled manufacturing environment, Teams provides a single hub for teamwork to communicate, collaborate, and coordinate production from the engineering rooms to the factory floor.

  • Earlier this year, we announced new capabilities—including urgent messaging, location sharing, and image annotations—which organizations can use to create a safer and more efficient workplace. For example, these features can help workers identify, communicate, and share the location of hazardous spills to help reduce operational disruptions.

Image of three phones displaying urgent messaging, location sharing, and image annotations in Teams.

  • Additionally, Microsoft Teams is extensible and allows companies to transform business processes using Microsoft Flow and PowerApps. These services help to digitize everyday activities—such as documentation during quality assurance, data capture, and inventory management—helping reduce costs and free up time for Firstline Workers to focus on higher value activities.

As Industry 4.0 reshapes the manufacturing industry, finding new innovations to help workers learn, communicate, and collaborate remains a top priority. Microsoft is addressing these challenges through breakthroughs in hardware design, artificial intelligence (AI) experiences, mixed reality with HoloLens 2, and through business-ready solutions with Dynamics 365 and industry partners.

  • Using Dynamics 365 Remote Assist, technicians can solve problems faster by calling in remote experts via Microsoft Teams to help walk through repairs using mixed reality annotations, sharing diagrams and schematics. And with Dynamics 365 Guides, employees can learn new skills with step-by-step instructions that guide employees to the tools they need and how to use them in real work situations.

Helping our customers succeed

Leading manufacturers choose Microsoft 365 to prepare, equip, and empower their employees at all levels:

To accelerate productivity and information flow, Cummins replaced its existing productivity and collaboration tools with Microsoft 365, introducing a modern knowledge management and collaboration framework to reduce skills gaps and anchor a new culture of work.

"Our modern, tech-driven workplaces give employees the tools they need to innovate, so we can introduce new energy products and technology solutions to the market. It's also a key strategy in attracting top talent."
—Sherry Aaholm, VP and CIO for Cummins

Goodyear is using the integrated and adaptive tools in Microsoft 365 to help accelerate innovation and enable new capabilities inside the company. For example, Goodyear is connecting its workforce via tools like Teams, which is driving productivity and generating efficiencies to deliver the right products to the right place at the right time.

“Enhancing collaboration is crucial to us for improved decision making and to drive innovation, both in tires and beyond tires… Our multigenerational and multicultural global workforce is now sharing perspectives and ideas more quickly and easily than ever.”
—Sherry Neubert, CIO for The Goodyear Tire & Rubber Company

We’re incredibly excited about our opportunity to help manufacturers transform and we are just getting started!

Join us at Hannover Messe and learn more

Next week, members of the Microsoft team will be at Hannover Messe, the annual manufacturing conference. Visit us at Microsoft stand C40 and learn how Microsoft is enabling Intelligent Manufacturing.


Re-reading ASP.Net Core request bodies with EnableBuffering()



In some scenarios there's a need to read the request body multiple times. Some examples include:

  • Logging raw requests so they can be replayed in a load test environment
  • Middleware that reads the request body multiple times to process it

Usually Request.Body does not support rewinding, so it can only be read once. A straightforward solution is to save a copy of the stream in another stream that supports seeking so the content can be read multiple times from the copy.
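The copy-to-a-seekable-stream idea can be sketched with plain streams (a minimal illustration using a MemoryStream as a stand-in for a one-shot request body, not ASP.NET Core's actual implementation):

```csharp
using System;
using System.IO;
using System.Text;

class Program
{
    static void Main()
    {
        // Stands in for a forward-only request body.
        var source = new MemoryStream(Encoding.UTF8.GetBytes("{\"name\":\"demo\"}"));

        // Drain the one-shot source into a seekable copy.
        var copy = new MemoryStream();
        source.CopyTo(copy);
        copy.Position = 0;

        // First read; leaveOpen keeps the copy usable afterwards.
        var first = new StreamReader(copy, Encoding.UTF8, false, 1024, leaveOpen: true).ReadToEnd();

        // Rewind and read again — only possible because the copy is seekable.
        copy.Position = 0;
        var second = new StreamReader(copy).ReadToEnd();

        Console.WriteLine(first == second);
    }
}
```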

In the ASP.NET framework it was possible to read the body of an HTTP request multiple times using the HttpRequest.GetBufferedInputStream method. In ASP.NET Core, however, a different approach must be used.

In ASP.NET Core 2.1 we added an EnableBuffering() extension method for HttpRequest. This is the suggested way to enable multiple reads of the request body. Here is an example usage in the InvokeAsync() method of a custom ASP.NET Core middleware:

public async Task InvokeAsync(HttpContext context, RequestDelegate next)
{
    context.Request.EnableBuffering();

    // Leave the body open so the next middleware can read it.
    using (var reader = new StreamReader(
        context.Request.Body,
        encoding: Encoding.UTF8,
        detectEncodingFromByteOrderMarks: false,
        bufferSize: bufferSize,
        leaveOpen: true))
    {
        var body = await reader.ReadToEndAsync();
        // Do some processing with body…

        // Reset the request body stream position so the next middleware can read it
        context.Request.Body.Position = 0;
    }

    // Call the next delegate/middleware in the pipeline
    await next(context);
}

The backing FileBufferingReadStream uses a memory stream up to a certain size and then falls back to a temporary file stream. By default the memory threshold is 30KB. There are also other EnableBuffering() overloads that allow specifying a different threshold and/or a limit for the total buffer size:

public static void EnableBuffering(this HttpRequest request, int bufferThreshold)
public static void EnableBuffering(this HttpRequest request, long bufferLimit)
public static void EnableBuffering(this HttpRequest request, int bufferThreshold, long bufferLimit)

For example, a call of

context.Request.EnableBuffering(bufferThreshold: 1024 * 45, bufferLimit: 1024 * 100);

enables a read buffer with a limit of 100KB. Data is buffered in memory until the content exceeds 45KB, at which point it is moved to a temporary file. By default there is no limit on the buffer size, but if one is specified and the content of the request body exceeds it, a System.IO.IOException will be thrown.

These overloads offer flexibility if there’s a need to fine-tune the buffering behaviors. Just keep in mind that:

  • Even though the memory stream is rented from a pool, it still has a memory cost associated with it.
  • Once the amount read exceeds bufferThreshold, performance will be slower, since a file stream is used.
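The memory-then-file fallback can be pictured with a simplified sketch. This ThresholdBuffer class is hypothetical and only models the behavior described above; the real FileBufferingReadStream is considerably more involved:

```csharp
using System;
using System.IO;

// Hypothetical model of memory-then-file buffering with a total limit.
class ThresholdBuffer : IDisposable
{
    private readonly int _threshold;
    private readonly long _limit;
    private Stream _store = new MemoryStream();

    public ThresholdBuffer(int threshold, long limit)
    {
        _threshold = threshold;
        _limit = limit;
    }

    // True while the content still fits in the in-memory store.
    public bool InMemory => _store is MemoryStream;

    public void Write(byte[] data)
    {
        if (_store.Length + data.Length > _limit)
            throw new IOException("Buffer limit exceeded.");

        if (InMemory && _store.Length + data.Length > _threshold)
        {
            // Spill the in-memory content to a temporary file.
            var file = new FileStream(Path.GetTempFileName(), FileMode.Create);
            _store.Position = 0;
            _store.CopyTo(file);
            _store.Dispose();
            _store = file;
        }
        _store.Write(data, 0, data.Length);
    }

    public void Dispose() => _store.Dispose();
}

class Program
{
    static void Main()
    {
        using (var buffer = new ThresholdBuffer(threshold: 8, limit: 32))
        {
            buffer.Write(new byte[4]);
            Console.WriteLine(buffer.InMemory);  // still under the threshold

            buffer.Write(new byte[8]);
            Console.WriteLine(buffer.InMemory);  // spilled to a temporary file
        }
    }
}
```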
Jeremy Meng

Software Development Engineer





ASP.NET Core updates in .NET Core 3.0 Preview 2

.NET Core 3.0 Preview 2 is now available and it includes a bunch of new updates to ASP.NET Core.

Here’s the list of what’s new in this preview:

  • Razor Components
  • SignalR client-to-server streaming
  • Pipes on HttpContext
  • Generic host in templates
  • Endpoint routing updates

Get started

To get started with ASP.NET Core in .NET Core 3.0 Preview 2, install the .NET Core 3.0 Preview 2 SDK.

If you’re on Windows using Visual Studio, you’ll also want to install the latest preview of Visual Studio 2019.

Upgrade an existing project

To upgrade an existing ASP.NET Core app to .NET Core 3.0 Preview 2, follow the migration steps in the ASP.NET Core docs.

Add package for Json.NET

As part of the work to tidy up the ASP.NET Core shared framework, Json.NET is being removed from the shared framework and now needs to be added as a package.

To add back Json.NET support to an ASP.NET Core 3.0 project:
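The following sketch shows the general shape of this change (package and method names as of this preview, and subject to change): reference the Microsoft.AspNetCore.Mvc.NewtonsoftJson package and opt in from ConfigureServices:

```csharp
// In the project file (version shown is illustrative for this preview):
// <PackageReference Include="Microsoft.AspNetCore.Mvc.NewtonsoftJson"
//                   Version="3.0.0-preview-*" />

public void ConfigureServices(IServiceCollection services)
{
    // Opt back in to the Json.NET-based formatters.
    services.AddMvc()
        .AddNewtonsoftJson();
}
```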

Runtime compilation removed

As a consequence of cleaning up the ASP.NET Core shared framework to not depend on Roslyn, support for runtime compilation of pages and views has also been removed in this preview release. Instead, compilation of pages and views is performed at build time. In a future preview update we will provide a NuGet package for optionally enabling runtime compilation support in an app.

Other breaking changes and announcements

For a full list of other breaking changes and announcements for this release please see the ASP.NET Core Announcements repo.

Build modern web UI with Razor Components

Razor Components are a new way to build interactive client-side web UI with ASP.NET Core. This release of .NET Core 3.0 Preview 2 adds support for Razor Components to ASP.NET Core and for hosting Razor Components on the server. For those of you who have been following along with the experimental Blazor project, Razor Components represent the integration of the Blazor component model into ASP.NET Core along with the server-side Blazor hosting model. ASP.NET Core Razor Components is a new capability in ASP.NET Core to host Razor Components on the server over a real-time connection.

Working with Razor Components

Razor Components are self-contained chunks of user interface (UI), such as a page, dialog, or form. Razor Components are normal .NET classes that define UI rendering logic and client-side event handlers, so you can write rich interactive web apps without having to write any JavaScript. Razor components are typically authored using Razor syntax, a natural blend of HTML and C#. Razor Components are similar to Razor Pages and MVC Views in that they both use Razor. But unlike pages and views, which are built around a request/reply model, components are used specifically for handling UI composition.

To create, build, and run your first ASP.NET Core app with Razor Components run the following from the command line:

dotnet new razorcomponents -o WebApplication1
cd WebApplication1
dotnet run

Or create an ASP.NET Core Razor Components project in Visual Studio 2019:

Razor Components template

The generated solution has two projects: a server project (WebApplication1.Server), and a project with client-side web UI logic written using Razor Components (WebApplication1.App). The server project is an ASP.NET Core project setup to host the Razor Components.

Razor Components solution

Why two projects? In part it's to separate the UI logic from the rest of the application. There is also a technical limitation in this preview: we are using the same Razor file extension (.cshtml) for Razor Components that we also use for Razor Pages and Views, but they have different compilation models, so they need to be kept separate. In a future preview we plan to introduce a new file extension for Razor Components (.razor) so that you can easily host your components, pages, and views all in the same project.

When you run the app you should see multiple pages (Home, Counter, and Fetch data) on different tabs. On the Counter page you can click a button to increment a counter without any page refresh. Normally this would require writing JavaScript, but here everything is written using Razor Components in C#!

Razor Components app

Here’s what the Counter component code looks like:

@page "/counter"

<h1>Counter</h1>

<p>Current count: @currentCount</p>

<button class="btn btn-primary" onclick="@IncrementCount">Click me</button>

@functions {
    int currentCount = 0;

    void IncrementCount()
    {
        currentCount++;
    }
}
Making a request to /counter, as specified by the @page directive at the top, causes the component to render its content. Components render into an in-memory representation of the render tree that can then be used to update the UI in a very flexible and efficient way. Each time the “Click me” button is clicked the onclick event is fired and the IncrementCount method is called. The currentCount gets incremented and the component is rendered again. The runtime compares the newly rendered content with what was rendered previously and only the changes are then applied to the DOM (i.e. the updated count).

You can use components from other components using an HTML-like syntax where component parameters are specified using attributes or child content. For example, you can add a Counter component to the app’s home page like this:

@page "/"

<h1>Hello, world!</h1>

Welcome to your new app.

<Counter />

To add a parameter to the Counter component update the @functions block to add a property decorated with the [Parameter] attribute:

@functions {
    int currentCount = 0;

    [Parameter] int IncrementAmount { get; set; } = 1;

    void IncrementCount()
    {
        currentCount += IncrementAmount;
    }
}

Now you can specify IncrementAmount parameter value using an attribute like this:

<Counter IncrementAmount="10" />

The Home page then has its own counter that increments by tens:

Count by tens

This is just an intro to what Razor Components are capable of. Razor Components are based on the Blazor component model and they support all of the same features (parameters, child content, templates, lifecycle events, component references, etc.). To learn more about Razor Components check out the component model docs and try out building your first Razor Components app yourself.

Hosting Razor Components

Because Razor Components decouple a component’s rendering logic from how the UI updates get applied, there is a lot of flexibility in how Razor Components can be hosted. ASP.NET Core Razor Components in .NET Core 3.0 adds support for hosting Razor Components on the server in an ASP.NET Core app where all UI updates are handled over a SignalR connection. The runtime handles sending UI events from the browser to the server and then applies UI updates sent by the server back to the browser after running the components. The same connection is also used to handle JavaScript interop calls.

ASP.NET Core Razor Components

Alternatively, Blazor is an experimental single page app framework that runs Razor Components directly in the browser using a WebAssembly based .NET runtime. In Blazor apps the UI updates from the Razor Components are all applied in process directly to the DOM.


Support for the client-side Blazor hosting model using WebAssembly won’t ship with ASP.NET Core 3.0, but we are working towards shipping it with a later release.

Regardless of which hosting model you use for your Razor Components, the component model is the same. The same Razor Components can be used with either hosting model. You can even switch your app back and forth from being a client-side Blazor app or Razor Components running in ASP.NET Core using the same components as long as your components haven’t taken any server specific dependencies.

JavaScript interop

Razor Components can also use client-side JavaScript if needed. From a Razor Component you can call into any browser API or into an existing JavaScript library running in the browser. .NET library authors can use JavaScript interop to provide .NET wrappers for JavaScript APIs, so that they can be conveniently called from Razor Components.

public static class ExampleJsInterop
{
    public static Task<string> Prompt(this IJSRuntime js, string text)
    {
        // showPrompt is implemented in wwwroot/exampleJsInterop.js
        return js.InvokeAsync<string>("exampleJsFunctions.showPrompt", text);
    }
}

@inject IJSRuntime JS

<button onclick="@OnClick">Show prompt</button>

@functions {
    string name;

    async Task OnClick()
    {
        name = await JS.Prompt("Hi! What's your name?");
    }
}
Both Razor Components and Blazor share the same JavaScript interop abstraction, so .NET libraries relying on JavaScript interop are usable by both types of apps. Check out the JavaScript interop docs for more details on using JavaScript interop and the Blazor community page for existing JavaScript interop libraries.

Sharing component libraries

Components can be easily shared and reused just like you would normal .NET classes. Razor Components can be built into component libraries and then shared as NuGet packages. You can find existing component libraries on the Blazor community page.

The .NET Core 3.0 Preview 2 SDK doesn't include a project template for Razor Component class libraries yet, but we expect to add one in a future preview. In the meantime, you can use the Blazor component class library template:

dotnet new -i Microsoft.AspNetCore.Blazor.Templates
dotnet new blazorlib

In this preview release ASP.NET Core Razor Components don’t yet support using static assets in component libraries, so the support for component class libraries is pretty limited. However, in a future preview we expect to add this support for using static assets from a library just like you can in Blazor today.

Integration with MVC Views and Razor Pages

Razor Components can be used with your existing Razor Pages and MVC apps. There is no need to rewrite existing views or pages to use Razor Components. Components can be used from within a view or page. When the page or view is rendered, any components used will be prerendered at the same time.

To render a component from a Razor Page or MVC View in this release, use the RenderComponentAsync<TComponent> HTML helper method:

<div id="Counter">
    @(await Html.RenderComponentAsync<Counter>(new { IncrementAmount = 10 }))
</div>

Components rendered from pages and views will be prerendered, but are not yet interactive (i.e. clicking the Counter button doesn’t do anything in this release). This will get addressed in a future preview, along with adding support for rendering components from pages and views using the normal element and attribute syntax.

While views and pages can use components, the converse is not true: components can’t use view- and page-specific features, like partial views and sections. If you want to reuse logic from a partial view in a component, you’ll first need to factor that logic out into a component.

Cross platform tooling

Visual Studio 2019 comes with built-in editor support for Razor Components including completions and diagnostics in the editor. You don’t need to install any additional extensions.

Razor Components tooling

Razor Component tooling isn’t available yet in Visual Studio for Mac or Visual Studio Code, but it’s something we are actively working on.

A bright future for Blazor

In parallel with the ASP.NET Core 3.0 work, we will continue to ship updated experimental releases of Blazor to support hosting Razor Components client-side in the browser (we’ll have more to share on the latest Blazor update shortly!). While in ASP.NET Core 3.0 we will only support hosting Razor Components in ASP.NET Core, we are also working towards shipping Blazor and support for running Razor Components in the browser on WebAssembly in a future release.

SignalR client-to-server streaming

With ASP.NET Core SignalR we added streaming support, which enables streaming return values from server-side methods. This is useful when fragments of data arrive over a period of time.

With .NET Core 3.0 Preview 2 we’ve added client-to-server streaming. With client-to-server streaming, your server-side methods can take instances of a ChannelReader<T>. In the C# code sample below, the StartStream method on the Hub will receive a stream of strings from the client.

public async Task StartStream(string streamName, ChannelReader<string> streamContent)
{
    // read from and process stream items
    while (await streamContent.WaitToReadAsync(Context.ConnectionAborted))
    {
        while (streamContent.TryRead(out var content))
        {
            // process content
        }
    }
}

Clients would use the SignalR Subject (or an RxJS Subject) as an argument to the streamContent parameter of the Hub method above.

let subject = new signalR.Subject();
await connection.send("StartStream", "MyAsciiArtStream", subject);

The JavaScript code would then use the subject.next method to push each string to the server as it is captured and ready to be sent:

subject.next("example");

Using code like the two snippets above, you can create real-time streaming experiences. For a preview of what you can do with client-to-server streaming in SignalR, take a look at the demo site. If you create your own stream, you can stream ASCII art representations of image data being captured by your local web cam to the server, where it will be bounced out to other clients who are watching your stream.

Client-to-server Streaming with SignalR
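On .NET clients, the role played by the JavaScript Subject is filled by System.Threading.Channels: the client creates a Channel<string>, passes channel.Reader as the hub-method argument, and writes items to channel.Writer. The sketch below shows just the channel mechanics in isolation; the actual HubConnection.SendAsync call is left as a comment because it needs a running server, and the item names ("frame-1" etc.) are made up. The consumer loop mirrors the StartStream hub method shown earlier.

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Channels;
using System.Threading.Tasks;

public class ChannelStreamDemo
{
    // Writes the given items into a channel, then drains the reader the same
    // way the StartStream hub method does on the server.
    public static async Task<List<string>> PumpAsync(params string[] items)
    {
        var channel = Channel.CreateUnbounded<string>();

        // With a live connection, the reader would be streamed to the server:
        // await connection.SendAsync("StartStream", "MyAsciiArtStream", channel.Reader);
        ChannelReader<string> reader = channel.Reader;

        // Producer: write items as they become available, then complete.
        foreach (var item in items)
        {
            await channel.Writer.WriteAsync(item);
        }
        channel.Writer.Complete();

        // Consumer: the same WaitToReadAsync/TryRead loop as the hub method.
        var received = new List<string>();
        while (await reader.WaitToReadAsync())
        {
            while (reader.TryRead(out var content))
            {
                received.Add(content);
            }
        }
        return received;
    }

    public static async Task Main()
    {
        var received = await PumpAsync("frame-1", "frame-2");
        Console.WriteLine(string.Join(",", received)); // prints "frame-1,frame-2"
    }
}
```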

System.IO.Pipelines on HttpContext

In ASP.NET Core 3.0, we’re working on consuming the System.IO.Pipelines API and exposing it in ASP.NET Core to allow you to write more performant applications.

In Preview 2, we’re exposing the request body pipe and response body pipe on the HttpContext; you can read directly from the request pipe and write directly to the response pipe, in addition to using the existing Stream-based APIs. While these pipes are currently just wrappers over the existing streams, we will directly expose the underlying pipes in a future preview.

Here’s an example that demonstrates using both the request body and response body pipes directly.

public void Configure(IApplicationBuilder app, IHostingEnvironment env)
{
    if (env.IsDevelopment())
    {
        app.UseDeveloperExceptionPage();
    }

    app.UseRouting(routes =>
    {
        routes.MapGet("/", async context =>
        {
            await context.Response.WriteAsync("Hello World");
        });

        routes.MapPost("/", async context =>
        {
            while (true)
            {
                var result = await context.Request.BodyPipe.ReadAsync();
                var buffer = result.Buffer;

                if (result.IsCompleted)
                {
                    break;
                }

                // Tell the pipe how much of the buffer was consumed.
                context.Request.BodyPipe.AdvanceTo(buffer.End);
            }
        });
    });
}

Generic host in templates

The templates have been updated to use the Generic Host instead of WebHostBuilder as they have in the past:

public class Program
{
    public static void Main(string[] args)
    {
        CreateHostBuilder(args).Build().Run();
    }

    public static IHostBuilder CreateHostBuilder(string[] args) =>
        Host.CreateDefaultBuilder(args)
            .ConfigureWebHostDefaults(webBuilder =>
            {
                webBuilder.UseStartup<Startup>();
            });
}

This is part of the ongoing plan started in 2.0 to better integrate ASP.NET Core with other server scenarios that are not web specific.
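For example, the same HostBuilder can host a non-web workload by registering a hosted service. This is a minimal sketch: the Worker class and its console messages are made up for illustration, and it assumes the Microsoft.Extensions.Hosting package is referenced.

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;

// A hypothetical background worker; any IHostedService works here.
public class Worker : IHostedService
{
    public Task StartAsync(CancellationToken cancellationToken)
    {
        Console.WriteLine("Worker started");
        return Task.CompletedTask;
    }

    public Task StopAsync(CancellationToken cancellationToken)
    {
        Console.WriteLine("Worker stopped");
        return Task.CompletedTask;
    }
}

public class Program
{
    public static async Task Main(string[] args)
    {
        // Same generic host pattern, no web server involved.
        using (var host = new HostBuilder()
            .ConfigureServices(services => services.AddHostedService<Worker>())
            .Build())
        {
            await host.StartAsync();
            await host.StopAsync();
        }
    }
}
```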

What about IWebHostBuilder?

The IWebHostBuilder interface that is used with WebHostBuilder today will be kept, and is the type of the webBuilder used in the sample code above. We intend to deprecate and eventually remove WebHostBuilder itself as its functionality will be replaced by HostBuilder, though the interface will remain.

The biggest difference between WebHostBuilder and HostBuilder is that you can no longer inject arbitrary services into your Startup.cs. Instead you will be limited to the IHostingEnvironment and IConfiguration interfaces. This removes a behavior quirk related to injecting services into Startup.cs before the ConfigureServices method is called. We will publish more details on the differences between WebHostBuilder and HostBuilder in a future deep-dive post.
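For instance, configuration can still reach Startup through its constructor; a sketch of the shape that remains supported under the generic host:

```csharp
using Microsoft.Extensions.Configuration;

public class Startup
{
    // With the generic host, only IConfiguration and IHostingEnvironment
    // may be injected into Startup's constructor; arbitrary services cannot.
    public Startup(IConfiguration configuration)
    {
        Configuration = configuration;
    }

    public IConfiguration Configuration { get; }
}
```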

Endpoint routing updates

We’re excited to start introducing more of the Endpoint Routing story that began in 2.2. Endpoint routing allows frameworks like MVC as well as other routable things to mix with middleware in a way that hasn’t been possible before. This is now present in the project templates in 3.0.0-preview-2, we’ll continue to add more richness as we move closer to a final release.

Here’s an example:

public void Configure(IApplicationBuilder app, IHostingEnvironment env)
{
    if (env.IsDevelopment())
    {
        app.UseDeveloperExceptionPage();
    }

    app.UseRouting(routes =>
    {
        routes.MapApplication();

        routes.MapGet("/hello", context =>
        {
            return context.Response.WriteAsync("Hi there!");
        });

        routes.MapHealthChecks("/healthz");
    });

    app.UseAuthorization();
}

There’s a few things to unpack here.

First, the UseRouting(...) call adds a new Endpoint Routing middleware. UseRouting is at the core of many of the templates in 3.0 and replaces many of the features that were implemented inside UseMvc(...) in the past.

Also notice that inside UseRouting(...) we’re setting up a few things. MapApplication() brings in MVC controllers and pages for routing. MapGet(...) shows how to wire up a request delegate to routing. MapHealthChecks(...) hooks up the health checks middleware by plugging it into routing.

What might be surprising to see is that some middleware now come after UseRouting. Let’s tweak this example to demonstrate why that is valuable.

public void Configure(IApplicationBuilder app, IHostingEnvironment env)
{
    if (env.IsDevelopment())
    {
        app.UseDeveloperExceptionPage();
    }

    app.UseRouting(routes =>
    {
        routes.MapGet("/hello", context =>
        {
            return context.Response.WriteAsync("Hi there! Here's your secret message");
        })
        .RequireAuthorization(new AuthorizeAttribute(){ Roles = "secret-messages", });

        // Policy name "live" is illustrative.
        routes.MapHealthChecks("/healthz").RequireAuthorization("live");
    });

    app.UseAuthentication();
    app.UseAuthorization();
}

Now I’ve added an AuthorizeAttribute to my request delegate. This is just like placing [Authorize(Roles = "secret-messages")] on an action method in a controller. We’ve also given the health checks middleware an authorization policy as well (by policy name).

This works because the following steps happen in order (ignoring what happens before routing):

  1. UseRouting(...) makes a routing decision – selecting an Endpoint
  2. UseAuthorization() looks at the Endpoint that was selected and runs the corresponding authorization policy
  3. At the end of the middleware pipeline the Endpoint is executed (if no endpoint was matched then a 404 response is returned)
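Put in terms of Configure, that ordering looks roughly like this (a sketch only; the endpoint mappings are elided):

```csharp
public void Configure(IApplicationBuilder app)
{
    // 1. Routing selects an Endpoint for the request. The decision is
    //    deferred: the endpoint runs at the end of the pipeline, not here.
    app.UseRouting(routes =>
    {
        // ... map endpoints here ...
    });

    // 2. Middleware placed after routing can inspect the selected Endpoint
    //    and enforce its metadata, e.g. authorization policies.
    app.UseAuthentication();
    app.UseAuthorization();

    // 3. At the end of the pipeline the selected Endpoint executes,
    //    or a 404 is returned if nothing matched.
}
```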

So think of UseRouting(...) as making a deferred routing decision, with the middleware that appear after it running in the middle. Any middleware that runs after routing can see the results and read or modify the route data and chosen endpoint. When processing reaches the end of the pipeline, the endpoint is invoked.

What is an Endpoint and why did we add this?

An Endpoint is a new primitive to help frameworks (like MVC) be friends with middleware. Fundamentally an Endpoint is a request delegate (something that can execute) plus a bag of metadata (policies).

Here’s an example middleware – you can use this to examine endpoints in the debugger or by printing to the console:

app.Use(next => (context) =>
{
    var endpoint = context.GetEndpoint();
    if (endpoint != null)
    {
        Console.WriteLine("Name: " + endpoint.DisplayName);
        Console.WriteLine("Route: " + (endpoint as RouteEndpoint)?.RoutePattern);
        Console.WriteLine("Metadata: " + string.Join(", ", endpoint.Metadata));
    }

    return next(context);
});

In the past we haven’t had a good solution when we’ve wanted to implement a policy like CORS or authorization in both middleware and MVC. Putting middleware in the pipeline feels very good because you get to configure the order. Putting filters and attributes on methods in controllers feels really good when you need to apply policies to different parts of the application. Endpoints bring together all of these advantages.

As an additional problem: what do you do if you’re writing the health checks middleware? You might want to secure your middleware in a way that developers can customize. Being able to leverage the ASP.NET Core features for this directly avoids the need to build support for cross-cutting concerns into every component that serves HTTP.

In addition to removing code duplication from MVC, the Endpoint + Middleware solution can be used by any other ASP.NET Core-based technologies. You don’t even need to use UseRouting(...) – all that is required to leverage the enhancements to middleware is to set an Endpoint on the HttpContext.

What’s integrated with this?

We added the new authorize middleware so that you can start doing more powerful security things with just middleware. The authorize middleware can accept a default policy that applies when there’s no endpoint, or the endpoint doesn’t specify a policy.

CORS is also now endpoint routing aware and will use the CORS policy specified on an endpoint.

MVC also plugs in to endpoint routing and will create endpoints for all of your controllers and pages. MVC can now be used with the CORS and authorize features and will largely work the same. We’ve long had confusion about whether to use the CORS middleware or CORS filters in MVC; the updated guidance is to use both. This allows you to provide CORS support to other middleware or static files, while still applying more granular CORS policies with the existing attributes.

Health checks also provide methods to register the health checks middleware as a router-ware (as shown above). This allows you to specify other kinds of policies for health checks.

Finally, new in ASP.NET Core 3.0 preview 2 is host matching for routes. Placing the HostAttribute on an MVC controller or action will prompt the routing system to require the specified domain or port. Or you can use RequireHost in your Startup.cs:

app.UseRouting(routes =>
{
    routes.MapGet("/", context => context.Response.WriteAsync("Hi Contoso!"))
        .RequireHost("contoso.com");

    routes.MapGet("/", context => context.Response.WriteAsync("Hi AdventureWorks!"))
        .RequireHost("adventure-works.com");
});


Do you think that there are things that are missing from the endpoint story? Are there more things we should make smarter or more integrated? Please let us know what you’d like to see.

Give feedback

We hope you enjoy the new features in this preview release of ASP.NET Core! Please let us know what you think by filing issues on GitHub.


Microsoft’s Redmond campus modernization update: Demolition begins

Stack of white hardhats bearing the Microsoft logo

Today marks the next milestone in Microsoft’s campus modernization effort. Deconstruction is beginning, and buildings will start coming down.

When the project is complete, the new campus will provide a modern workplace and create greater collaboration and community. To commemorate the original buildings, the company offered an exclusive opportunity for a group of employees to say goodbye to the original campus with a demolition party to kick off the destruction. On Tuesday, one employee and nine of his teammates (who collectively donated money to charity to win the experience via the Employee Giving Campaign auction) took to the company’s first buildings equipped with hard hats, sledgehammers and “the claw.” Check out some highlights from the fun below:

“It is great to see the interest and excitement from employees for the campus modernization,” said Michael Ford, Microsoft general manager of global real estate and security. “Our employees are crucial to building an exceptional place to work, and this event was a great way to kick off this journey together.”

Moving forward

Over the next few months, Microsoft will continue the decommissioning and demolition of 12 buildings, embracing sustainable practices throughout the process.

In 2016, Microsoft became the first technology company in the U.S. to be certified Zero Waste for diverting at least 90 percent of its waste from the landfill. The company’s goal with this project is to remain in line with this certification for construction materials and divert a majority of building materials from the landfill. This means focusing on reusing, donating and recycling. From concrete and steel framing to carpets, ceiling tiles, electronic and networking gear, interior debris and loose assets like furniture, chairs and whiteboards, to even the artificial turf outside — most of the materials in the old spaces will find a new life.

“We strive to make a positive impact on the community,” Ford said. “We’re putting a lot of effort behind finding innovative ways to reduce our impact and optimize our resource usage.”

Beyond what is being recycled, the company is also considering where materials will be processed. To maximize sustainability, Microsoft’s construction team is engaging with local waste processing and recycling companies to study and prioritize the hauling distances to further shrink the project’s construction carbon footprint.

“Corporate and environmental responsibility are equally as important as budget and schedule — and we are aligning our design and construction practices with Microsoft’s global corporate responsibility and sustainability missions,” Ford said. “It feels good to know that here in our hometown we’re supporting this vision.”

Follow updates and developments as this project progresses on Microsoft’s Modern Campus site.


Microsoft Store powers new experiences for all at ultra-accessible theme park in Texas

When Gordon Hartman opened Morgan’s Wonderland in April 2010, it was the culmination of a dream he’d had for five years after seeing his daughter, Morgan, a girl with physical and cognitive disabilities, wanting to play with other vacationing kids at a hotel swimming pool – but the children were leery of her and didn’t want to interact with her.

At that moment, he resolved to create opportunities and places where people with and without disabilities can come together not only for fun, but also for a better understanding of one another.

So he and his wife Maggie built the first ultra-accessible theme park of its kind. Completely wheelchair-accessible, the destination in San Antonio, Texas, has hosted more than 1.3 million guests from the U.S. and 69 other countries since it opened. With more than 25 attractions, including rides, Wonderlands, gardens, an 8-acre catch-and-release fishing lake and an 18,000-square-foot special-event center, it’s long been established as fun and understanding for everyone.

Now it’s going to give its visitors a new technology experience.

From day one, Morgan’s Wonderland has focused on inclusion and ultra-accessible play for people with and without disabilities, says Hartman. But technology has been a game-changer for his family.

“My daughter, Morgan, has both physical and cognitive disabilities, but computer technology has made it possible for her to become more engaged, to become more inquisitive and to understand more of the world around her,” says Hartman.

To help the park’s visitors experience that same type of engagement, Microsoft Store is revamping the Sensory Village at Morgan’s Wonderland. The new Microsoft Experience includes an interactive gaming area, featuring Xbox One X stations and the Xbox Adaptive Controller, which connects to external buttons, switches, mounts and joysticks to give gamers with limited mobility an easy-to-set-up and readily available way to play Xbox One games.

Additionally, Surface devices will connect to the park’s interactive map, taking guests through the favorite attractions of the Wonder Squad, Morgan’s Wonderland’s super heroes.

Finally, a Wish Machine, powered by Surface Studio, will take submissions from visitors for the chance to have their holiday wish granted by Microsoft Store through Dec. 20.*

Three boys in front of a TV screen, playing a video game using the Xbox Adaptive Controller

“We chose the experiences based on our mission of inclusion,” says Kara Rowe, worldwide director of Microsoft Store Visuals & Experience. “Understanding that mission was paramount in determining what we chose, such as what we experience today in our stores that resonates the most with our customers.”

Rowe says the company created a custom-made Xbox One X gameplay station with the Xbox Adaptive Controllers laid out to be easily accessible for wheelchairs. And in turn, they’ll use the feedback they get from the park’s guests to help create and refine even more accessible experiences in the future – for Microsoft Store and the company’s product groups.

“We were thoughtful about the different use case scenarios for any guest to make sure that the experiences are truly ultra-inclusive,” says Rowe, referring to a phrase Hartman coined from the early days at the park.

The cross-company collaboration, which started with Microsoft volunteers from the Microsoft San Antonio datacenter and Microsoft Store locations, also helped provide auditory and visual cues, such as braille, that would help visitors to the Sensory Village.

“Every person has jumped at the chance to be involved, sharing personal stories as to why it matters to them,” Rowe says.

Visitors at the Sensory Village in Morgan's Wonderland

Hartman credits Microsoft with adding technological vitality to the park.

“It utilizes technology that enables individuals with disabilities to interact and have fun together in an environment designed to stimulate the senses,” he says.

Rowe says this is the first time a Microsoft Experience has been in a theme park.

“This is about understanding how to teach and help a community learn with technology available to them in an inclusive way,” Rowe says. “Opening up in this space is really special. It is magical for us to have local employees volunteering because our company inspires us to do these activities, to activate cross channels, to come together and align to this greater mission of empowerment.”

Microsoft has committed to maintain and evaluate the space to ensure refreshed experiences.

“Sensory Village is built on technology, and Microsoft is a technological giant,” Hartman says. “We believe Microsoft’s leadership in technology can translate into tremendous benefits for the disability community.”

*The Wish Machine experience celebrates the excitement of the holiday season and the power of a wish. Wish Machine is powered by Surface Studio, where contestants record a video up to 60 seconds long of themselves making a holiday wish. NO PURCHASE NECESSARY. Open to guests of Morgan’s Wonderland who are at least 13 years old. Enter by December 20, 2018. For Official Rules, including prize descriptions, click here. Void where prohibited.

*Editor’s note: Correction on spelling of Gordon Hartman’s first name.

Updated December 10, 2018 1:25 pm


Embark on the Voyage Aquatic, a new Minecraft Hour of Code

More than 50 percent of jobs require technology skills, and in less than a decade that number will grow to 77 percent of jobs.1 Just 40 percent of schools have classes that teach programming, and if you are a girl, black or Hispanic, or live in a rural community, you are even less likely to have access.2

Minecraft and Microsoft are committed to helping close the STEM gap and expanding opportunities for students to learn computer science. For the fourth year, we are partnering with Code.org to support Hour of Code, a global movement demystifying computer science and making coding more accessible through one-hour tutorials and events. Hour of Code helps students get ‘Future Ready’ by connecting them with STEM learning experiences and career opportunities.

Today, we launched a new Minecraft Hour of Code tutorial, the Voyage Aquatic, which takes learners on an aquatic adventure to find treasure and solve puzzles with coding. Voyage Aquatic encourages students to think creatively, try different coding solutions and apply what they learn in mysterious Minecraft worlds. 

Since 2015, learners around the world have completed nearly 100 million Minecraft Hour of Code sessions. The tutorials offer more than 50 puzzles, as well as professional development, facilitator guides and online training to help educators get started teaching computer science.

Dive into the Voyage Aquatic 

Minecraft teamed up with four YouTube creators – AmyLee33, Netty Plays, iBallisticSquid and Tomohawk, with a cumulative following of more than five million – for this year’s Minecraft Hour of Code. These creative YouTubers guide participants along their journey through caves, ruins and underwater reefs to solve puzzles and learn coding concepts.

Voyage Aquatic presents 12 unique challenges, focusing on how to use loops and conditionals, two fundamental concepts in computer science. The tutorial also includes a ‘free play’ level for participants to apply what they learn in the prior puzzles and use coding to build imaginative underwater creations.  

People of all ages and experience levels can use the Minecraft world to learn the basics of coding. The tutorial is free, open to anyone and available for any device. If your language is not available, you can help contribute to translation here.

Host an Hour of Code

Anyone can host an Hour of Code. Download a free facilitator’s guide to lead your own Minecraft Hour of Code at your school, library, museum, learning center or even at home. 

Learn how to effectively facilitate an Hour of Code with this new Microsoft Education course for educators. Learn about the benefits of Hour of Code for your students, and where to find resources to lead an Hour of Code. 

Let us know about your experience by posting on Facebook or Twitter and make sure to mention #HourofCode. Tag us @playcraftlearn!

Keep coding in Minecraft

You can continue your coding journey in Minecraft: Education Edition (or Minecraft on Windows 10) using Code Builder, a special feature that allows you to code in Minecraft. 

  • Visit for trainings, lessons and classroom activities to go beyond Hour of Code with your students. 
  • If you already have a license for Minecraft: Education Edition, click this link to launch a special Voyage Aquatic world in Minecraft. Use code to fill an aquarium with marine life. 
  • If you are not licensed, you can download a free trial of Minecraft: Education Edition for Windows 10, macOS and iPad by visiting

Save the date: Computer Science Education Week, December 3-9 

  • Attend an event: Join an Hour of Code or STEM workshop at a Microsoft Store near you. Sign up at 
  • Connect coding to careers with Skype in the Classroom: Sign up for free 30-minute Skype in the Classroom broadcasts and live Q&A with professionals who use code to create amazing things, including two Minecraft game designers! Happening December 3-7, introduce your students to nine inspiring ‘Code Creators’ in the worlds of dance, fashion, gaming, animation and artificial intelligence in a series brought to you by Microsoft and Get your questions ready as we explore code-powered creativity. Register for free at
  • Learn how to bring CS to your school: Microsoft is helping close the skills gap for all youth by increasing access to equitable computer science education. Discover resources at 

1 The Future of Jobs: Employment, Skills and Workforce Strategy for the Fourth Industrial Revolution, World Economic Forum, January 2016.

2 Pioneering Results in the Blueprint of U.S. K-12 Computer Science Education, Google Gallup poll, 2016.

The post Embark on the Voyage Aquatic, a new Minecraft Hour of Code appeared first on Minecraft: Education Edition.



Microsoft Store releases top cybersecurity tips for small business owners

October is National Cyber Security Awareness Month, and Microsoft Store is committed to helping small business owners understand the dangers of cyberattacks and the actionable steps they can take to mitigate cyber threats.

Getting hacked is a scary threat, yet most small business owners don’t think they’re at risk.

Nearly 82 percent[1] of small business owners think their business does not have data that hackers would be interested in stealing, yet 61 percent[2] have suffered a cyberattack and over half[3] have had a data breach.

What’s even scarier than not knowing your business is vulnerable to a cyberattack is finding out your business has been hacked.

Identifying Vulnerabilities

Small business owners may not know the appropriate steps to protect themselves, and others may find the topic can be daunting.

This was the case for Quants Bakery, a six-employee catering and subscription-based vegan bakery in Glendora, California. Sean Etesham, CEO, started the online-based business in September 2017 after he saw a need for more vegan options at his local coffee shops. Like so many first-time small business owners who are strapped for resources, he launched the business and operated it without a dedicated IT person or cybersecurity software.

“Our worst nightmare would be if someone broke into the back end … and took all of the customer credit card information … and posted it online,” shared Sean.

Microsoft Store recently teamed up with a Microsoft cybersecurity expert to take Sean and his now CIO, Richard Idigo, through a lifelike simulation of a common phishing scam, giving them a firsthand look at the damage that can be done in mere minutes. Sean and Richard were shocked at how easily a hacker can create a false site to access passwords and compromise their data.

“I’ve made a lot of sacrifices in terms of my time and energy that could’ve been spent elsewhere,” explained Sean about starting his business. “I cashed out my retirement accounts, and my investment accounts, and put all that money towards the business.”

Any small business owner can attest that the success and health of their business is of utmost importance, so addressing cybersecurity vulnerabilities should be a top concern.

The simulation helped Sean to understand the type of security protections and other actions he needed to have in place to safeguard his business and avoid phishing attacks. He worked with a business solutions specialist at Microsoft Store to decide on Microsoft 365 Business as the right solution for their productivity, collaboration, and security needs.

By understanding cybersecurity risk factors, every small business owner can put an affordable, actionable plan in place to mitigate risk and save time and money it takes to recover from a breach.

Addressing Vulnerabilities

Many small business owners do not have IT departments to turn to in the case of a cybersecurity emergency. During National Cyber Security Awareness Month, Microsoft Store is offering small business owners the opportunity to understand the risks of cyberattacks and will be the go-to destination for personalized guidance from in-store business solutions specialists.

If you’re a small business owner, visit Microsoft Store and online to discover cybersecurity solutions like Microsoft 365 Business that are customized for the needs of small businesses to more effectively navigate the cybersecurity landscape and mitigate cyber risks.

[1] SMEs and Cyber Attacks, by Towergate Insurance

[2] 2017 State of Cybersecurity In Small & Medium Sized Businesses, 2017 by The Ponemon Institute

[3] SMBs Are Vulnerable To Cyber Attacks, 2016 by The Ponemon Institute

[4] HACKED: Just Because It’s in the Cloud, Doesn’t Mean Bad Guys Can’t Reach It, 2017 by UPS Capital