Meaningful innovation: Human ingenuity, powered by AI

This week, I’ve had a great time in Las Vegas at our Inspire and Ready conferences (though I could have done without the 110-degree heat). I’ve had the opportunity to discuss the role of innovation and the power of artificial intelligence with partners and employees from every corner of the globe. This is an exciting time to be at the forefront of the AI revolution, and it’s been great to be part of discussions about the profound effect AI is having on how we live and work.

Making the most of AI in business

Making AI real, and delivering tangible impact from it, is something we must work together to achieve. Our partners play a critical role in making the most of the opportunities that are only possible through AI. All week, I’ve heard stories of our partners embracing the power of AI to differentiate their services and extend their capabilities to create new business opportunities for themselves and great business outcomes for their customers.

For example, there’s Xanterra – a customer-experience-driven business that hosts 26 million visitors at properties around the world. Their goal is to provide world-class service and personalization for their well-traveled guests, many of whom are repeat customers across their different global properties. Xanterra’s challenge was that their properties weren’t connected, creating a gap between the high-touch experiences the company delivers and the need to personalize those experiences for each guest. Our partner, RedPoint Global, used Microsoft AI to help Xanterra create a detailed profile of each customer by connecting data from more than 100 different sources. Now, they use this information to anticipate their guests’ needs more effectively and generate relevant, customized offers for each. As a result, the Xanterra team has increased revenue at each touchpoint, elevated the guest experience and is sending customer communications more efficiently.


So how can we help unlock more opportunities for people to use AI? Our new Microsoft Azure AI Accelerate Program helps partners take advantage of the opportunities AI can bring to their businesses. It focuses on growing partner ecosystems and overcoming adoption barriers to create more value for everyone. We’re also making it easier for partners to access AI Business School by connecting each learning path into the Partner Training Center. In addition, we are publishing AI Business School in a Box in the Partner Marketing Center – a set of unique assets to enable partners to lead AI engagements with confidence.

Making the most of AI in society: AI for Good

While the opportunity that AI presents for our customers and partners is huge, the potential for AI to positively benefit society through ingenuity, vision and scale is even bigger. Microsoft’s AI for Good initiative already has more than 300 grantees across 63 countries. All of them are using AI to tackle some of our greatest challenges that go beyond business — including climate change, humanitarian crises and enabling greater accessibility for the more than 1 billion people across the world living with a disability.

One of my favorite stories, and one that I was honored to share at Microsoft Inspire this week, is the incredible work of Wild Me, one of the program’s grantees. They’re using the power of Microsoft AI to track individual animals to monitor the health of entire species in support of conservation efforts. As a result, they’ve been able to identify 10 times more whale sharks than ever before in human history. It’s truly awe-inspiring and vital work.

Wild Me’s story is one of many showing how AI is positively impacting society and transforming the world we live in. Building upon our commitment to use tech to make a positive impact on society, we just announced our newest AI for Good program dedicated to the preservation and enrichment of cultural heritage. Through our AI for Cultural Heritage program, we’re partnering with the people preserving places of historical and cultural significance for future generations – such as the work by nonprofit organization Iconem, which is using AI to digitally re-create at-risk locations in areas of conflict. The program will also help communities preserve languages that are at risk of being lost, such as the Yucatec Maya and Querétaro Otomi in Mexico. Because of this, more people across the world will be able to enjoy historical artifacts – as exemplified by our recent partnership with the Metropolitan Museum of Art, making its collection of 1.5 million works of art that spans 5,000 years more accessible to everyone.

The innovations from the grantees we’re working with through our AI for Good initiative reflect our belief in the use of AI to inspire and nurture breakthrough ideas that have meaningful impact and can solve some of the world’s greatest problems.

I’m humbled by the work of our partners and grantees, and being at Inspire reminds me just how fortunate I am to work alongside them as they create what’s next and do powerful things with AI. Technology is just one part of the story; empowering people to shape and transform the world and do good is at the center of it. By encouraging our partners and users to create their own meaningful innovations, we’re working together to embrace change and find solutions to all sorts of challenges – business and societal alike.



How AI is changing the world of manufacturing

Henry Ford’s assembly line revolutionized manufacturing in the 20th century. The transformation driven by AI in the 21st century will be just as dramatic. 

“The Future Computed: AI and Manufacturing” is the second book in Microsoft’s The Future Computed series, looking at the impact of AI on society. 

Author Greg Shaw examines the way in which leading manufacturing companies are using AI to build the factories and supply chains of the future. He explores the exciting opportunities around this new technology and also looks at the way the manufacturing industry is again at the forefront of grappling with the challenges of adopting a new technology. In particular, The Future Computed: AI and Manufacturing looks at the importance of creating a framework for the ethical and responsible use of AI and ensuring that workers can be trained to take on new tasks.  

[Subscribe to Microsoft on the Issues for more on the topics that matter most.] 

In this video, Shaw explains the themes underlying his research, and hears directly from Çağlayan Arkan, general manager of Microsoft Global Manufacturing, and some of the manufacturing customers at the cutting edge of industrial AI. 

For more on AI innovations at Microsoft follow @MSFTIssues on Twitter.


Microsoft and Providence St. Joseph Health announce strategic alliance to accelerate the future of care delivery

Providence St. Joseph Health, in partnership with Microsoft, will develop and deploy new health care technologies that will harness the power of Microsoft Azure and AI with clinical expertise to transform the care experience

RENTON, Wash., and REDMOND, Wash., July 8, 2019 – Microsoft Corp. and Providence St. Joseph Health today announced a multi-year strategic alliance to accelerate the digital transformation of health care. The alliance will combine the power of Microsoft’s cloud, artificial intelligence (AI), research capabilities, and collaboration tools with the clinical expertise and care environments of Providence St. Joseph Health, one of the largest health systems in the country.

The two organizations will develop a portfolio of integrated solutions designed to improve health outcomes and reduce the total cost of care by combining technologies from Microsoft with Providence St. Joseph Health’s data and clinical expertise. The alliance will accelerate the health care industry’s adoption of the cloud and enable data-driven clinical and operational decision-making by leveraging Microsoft Azure, and industry interoperability standards like FHIR, to integrate siloed data sources in a cloud environment that enables security and compliance.

Providence St. Joseph Health will deploy next-generation solutions and emerging technologies from Microsoft and its partners at a Providence St. Joseph Health-affiliated hospital facility in Seattle, Wash., near Microsoft’s Redmond headquarters. This site will enable modern clinical and operational experiences for both patients and providers. The goal will be to scale these innovations across the entire Providence St. Joseph Health system, in a transformation that will bring innovative and necessary solutions to more communities.

“Providence St. Joseph Health has been on a journey to transform health care and achieve a vision of health for a better world. We’re excited to accelerate that journey by collaborating with Microsoft. Together, we’ll support doctors, nurses and all caregivers by equipping them with innovative tools and technology that make it easier to do the vitally important work of improving lives,” said Rod Hochman, M.D., president and CEO of Providence St. Joseph Health.

“Our alliance with Providence St. Joseph Health brings together the expertise of one of the largest and most comprehensive health systems in the country with the power of Azure, Microsoft 365 and Dynamics 365,” said Satya Nadella, CEO of Microsoft. “Our ambition is to accelerate Providence St. Joseph Health’s digital transformation and to build new innovations together that are designed to improve health care delivery and outcomes.”

As part of the strategic alliance, Providence St. Joseph Health will use Microsoft Azure as its preferred cloud platform and standardize productivity and collaboration tools for its 119,000 caregivers on Microsoft 365, and will continue to improve and support patient engagement using technologies including Dynamics 365. Providence St. Joseph Health doctors and nurses will use Microsoft Teams, which is part of the Microsoft 365 platform, for more secure communication and collaboration, enabling them to bring together chat, video meetings and conferencing, and line-of-business applications into a single hub.

About Providence St. Joseph Health
Providence St. Joseph Health is a national, not-for-profit Catholic health system comprising a diverse family of organizations and driven by a belief that health is a human right. With 51 hospitals, 829 physician clinics, senior services, supportive housing and many other health and educational services, the health system and its partners employ more than 119,000 caregivers serving communities across seven states – Alaska, California, Montana, New Mexico, Oregon, Texas and Washington – with system offices based in Renton, Wash., and Irvine, Calif.

About Microsoft
Microsoft (Nasdaq “MSFT” @microsoft) enables digital transformation for the era of an intelligent cloud and an intelligent edge. Its mission is to empower every person and every organization on the planet to achieve more.

Media Contacts:
Elizabeth Brophy, Providence St. Joseph Health,
Microsoft Media Relations, WE Communications for Microsoft,


Icertis narrows the gap between AI expectations and reality

Transform recently sat down with several Microsoft customers at an event highlighting emerging trends in data and artificial intelligence. We talked with Vivek Bharti, general manager of product management at Icertis, which provides contract management services to enterprise customers.

TRANSFORM: Describe how your company uses AI to serve customers.

VIVEK BHARTI: We strive to solve the hardest contract management problems using technology, while making that technology easy for customers to use.

All of a company’s commitments and obligations are in its contract documents. The problem is that those are written in natural language, which traditional systems can’t easily process. We take their contract repository – usually a set of folders of PDFs and the like – convert the contracts into intelligent documents and help customers manage their obligations for the life of each contract.

In the process, we save them money or help them garner more revenue.

TRANSFORM: What are the biggest challenges you see with AI?

BHARTI: One challenge is that the technology seems to be ahead of the business. The pace of innovation is so fast that technologies are producing possibilities left, right and center. But it’s not necessarily true that the business use cases have been found. It’s resulting in very hyped-up speculation from customers. There’s a gap between expectations and reality.

Another challenge is creating pools of people who understand both the customer and the technology – people who can connect technology with the business problems it should solve. At a tactical level, we solve that by having a pool of domain experts who sensitize the engineers to the business problem.

Conversely, every single person who joins the company goes through product training. So we set a foundation where everyone is aligned on the company vision and what it is we’re building for. That starts sensitizing people who are more business oriented – sales folks, business analysts and so on – to what the technology is capable of.

TRANSFORM: How does that intersection of technology and business know-how help you serve customers better?

BHARTI: If you look at the size of our company, and the kind of customers we are acquiring, some of them have 400,000 suppliers. And they’ve trusted our system to manage (their contracts). It’s not because we are a big company. They saw that everyone that they interacted with from our company was actually talking about their problems, not about their product. So that’s how we’re making a difference.


How AI could boost GDP and help reduce greenhouse gas emissions

The application of AI technologies in four areas – agriculture, water, energy and transport – has the potential to increase global GDP by up to $5.2 trillion by 2030, according to a new report from Microsoft and PricewaterhouseCoopers. That is an increase of 4.4% in global GDP over the next 11 years, relative to business as usual.

At the same time, these technologies could reduce global greenhouse gas emissions by up to 4%. That is equivalent to the predicted 2030 annual emissions of Australia, Canada and Japan combined.

This map shows where those changes could occur.

Summary of regional GDP and GHG impacts relative to the baseline by 2030 in the “Expansion” Scenario

Europe could see the greatest rise in GDP – an increase of 5.4% – while the United States could see the greatest fall in greenhouse gas emissions – a drop of 6.1%.

The report also predicts that, without addressing some of the blockers to technology and AI adoption and readiness, the impact of AI will be felt less in some parts of the world. If progress is made on those blockers, however, these regions could benefit greatly from low-carbon, sustainable economic growth.


Of the four sectors detailed in the report, the impact of AI within the energy and transport sectors is predicted to contribute most to the rise in GDP and to the fall in harmful emissions. The following shows the changes forecast in each of the sectors.

Key sectoral results – impact on GDP and GHG emissions by 2030 in the “Expansion” scenario

The high transformation potential of the energy and transport sectors – two heavy-emitting industries – is due to an array of innovations, some of which are already being realized, including traffic-optimization systems, vehicle-sharing services, increased efficiency of renewables and the smart management of energy consumption.



How AI, drones and cameras are keeping our roads and bridges safe

“It's a dangerous business, Frodo, going out your door. You step onto the road, and if you don't keep your feet, 
there's no knowing where you might be swept off to.” ― J.R.R. Tolkien, The Lord of the Rings

Europe’s roads are the safest in the world. Current figures show that there are 50 fatalities per one million inhabitants, compared to the global figure of 174 deaths per million. Despite this, each loss remains a tragedy. In 2017, 25,300 people lost their lives on European roads.   

The causes of these accidents vary from human error and weather conditions to damaged structures and surfaces. While some things are beyond our control, road and bridge conditions are variables that can be managed.

As soon as a road is paved, a combination of traffic and weather conditions begins to degrade and erode the surface. Undetected cracks, abrasions or defects can quickly lead to bigger problems, such as costly repairs, major traffic delays and, in the worst cases, unsafe conditions. These problems are shared by bridges, particularly when concrete is critical to maintaining the integrity of the structure. The earlier faults are detected, the sooner they can be addressed, saving time and money while minimising disruption. Ultimately, this helps ensure that the roads themselves are safer for those travelling on them.

The detection of these faults, however, can be very difficult to carry out manually, especially as early-forming cracks are hard to spot with the naked eye. Predicting where faults are likely to occur, so that appropriate measures can be taken in advance, also poses a massive challenge. Thankfully, technology is here to help.


Building bridges
Built more than 20 years ago, the Great Belt Bridge is a suspension bridge connecting the Danish islands of Zealand and Funen. Holding company Sund & Bælt, which is responsible for the maintenance of the bridge, has worked with Microsoft to deploy an innovative solution that combines the flexibility of drones with the power of artificial intelligence (AI).

The drones fly around the bridge and capture thousands of pictures of the concrete structure – a method that’s far safer and faster than having a worker dangle 200 metres above the surface to take pictures manually. The expertise and experience of these workers are instead used to help train a machine learning algorithm that automatically detects cracks in the surface of the concrete once the photos have been uploaded to Microsoft’s Azure cloud. After the AI produces a list of areas of concern, the same experts select which areas need maintenance and repair.
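As a rough illustration of that triage step, here is a minimal sketch of turning per-image crack scores from a model into a short, prioritized review list for the experts. The names and the threshold are hypothetical, not Sund & Bælt's actual code:

```python
# Keep only areas the model scores above a confidence threshold, so human
# experts review a short list rather than thousands of photos.
CONFIDENCE_THRESHOLD = 0.8  # illustrative value

def areas_for_review(predictions, threshold=CONFIDENCE_THRESHOLD):
    """predictions: list of (image_id, crack_score in [0, 1])."""
    flagged = [(img, score) for img, score in predictions if score >= threshold]
    # Highest-confidence detections first, so experts triage the worst areas.
    return sorted(flagged, key=lambda p: p[1], reverse=True)

predictions = [("span_a_001.jpg", 0.95), ("span_a_002.jpg", 0.12),
               ("pylon_3_044.jpg", 0.83), ("deck_9_310.jpg", 0.40)]
print(areas_for_review(predictions))
# [('span_a_001.jpg', 0.95), ('pylon_3_044.jpg', 0.83)]
```

In practice the threshold would be tuned against expert-labeled photos to balance missed cracks against wasted review time.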

Drone flying beneath a concrete bridge


Microsoft makes AI debugging and visualization tool TensorWatch open source


The rise of deep learning is accompanied by ever-increasing model complexity, larger datasets, and longer training times for models. When working on novel concepts, researchers often need to understand why training metrics are trending the way they are. So far, the available tools for machine learning training have focused on a “what you see is what you log” approach. As logging is relatively expensive, researchers and engineers tend to avoid it and rely on a few signals to guesstimate the cause of the patterns they see. At Microsoft Research, we’ve been asking important questions surrounding this very challenge: What if we could dramatically reduce the cost of getting more information about the state of the system? What if we had advanced tooling that could help researchers make more informed decisions effectively?

Introducing TensorWatch

We’re happy to introduce TensorWatch, an open-source system that implements several of these ideas and concepts. We like to think of TensorWatch as the Swiss Army knife of debugging tools with many advanced capabilities researchers and engineers will find helpful in their work. We presented TensorWatch at the 2019 ACM SIGCHI Symposium on Engineering Interactive Computing Systems.

Custom UIs and visualizations

The first thing you might notice when using TensorWatch is that it extensively leverages Jupyter Notebook instead of prepackaged user interfaces, which are often difficult to customize. TensorWatch provides interactive debugging of real-time training processes using either the composable UI in Jupyter Notebooks or the live shareable dashboards in Jupyter Lab. In addition, since TensorWatch is a Python library, you can also build your own custom UIs or use TensorWatch within the vast Python data science ecosystem. TensorWatch also supports several standard visualization types, including bar charts, histograms and pie charts, as well as 3D variations.

With TensorWatch—a debugging and visualization tool for machine learning—researchers and engineers can customize the user interface to accommodate a variety of scenarios. Above is an example of TensorWatch running in Jupyter Notebook, rendering a live chart from multiple streams produced by an ML training application.


Streams, streams everywhere

One of the central premises of the TensorWatch architecture is that we uniformly treat data and other objects as streams. This includes files, console output, sockets, cloud storage, and even visualizations themselves. With a common interface, TensorWatch streams can listen to other streams, which enables the creation of custom data-flow graphs. Using these concepts, TensorWatch makes it trivial to implement a variety of advanced scenarios. For example, you can render many streams into the same visualization, render one stream in many visualizations simultaneously, persist a stream in many files, or not persist it at all. The possibilities are endless!
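The stream-listening idea can be sketched in a few lines of plain Python. This is a conceptual illustration of the architecture described above, not the actual TensorWatch API:

```python
# A stream records items written to it and forwards each item to any
# subscribed streams, allowing fan-out (one source, many sinks) and,
# symmetrically, fan-in (many sources subscribing one sink).
class Stream:
    def __init__(self):
        self.listeners = []   # downstream streams
        self.items = []       # records written to this stream

    def subscribe(self, other):
        """Make `other` receive every item written to this stream."""
        self.listeners.append(other)

    def write(self, item):
        self.items.append(item)
        for listener in self.listeners:
            listener.write(item)

# One training-metric source fanned out to a file-like sink and a
# visualization-like sink.
source = Stream()
file_sink = Stream()
vis_sink = Stream()
source.subscribe(file_sink)
source.subscribe(vis_sink)

for step in range(3):
    source.write(("loss", step, 1.0 / (step + 1)))

print(len(file_sink.items), len(vis_sink.items))  # both sinks received all 3 items
```

Because every component speaks the same interface, visualizations, files and sockets are interchangeable nodes in the same data-flow graph.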

TensorWatch supports a variety of visualization types. Above is an example of a TensorWatch t-SNE visualization of the MNIST dataset.


Lazy logging mode

With TensorWatch, we also introduce lazy logging mode. This mode doesn’t require explicit logging of all the information beforehand. Instead, you can have TensorWatch observe the variables. Since observing is essentially free, you can track as many variables as you like, including large models or entire batches during training. TensorWatch then allows you to perform interactive queries that run in the context of these variables and return streams as a result. These streams can then be visualized, saved or processed as needed. For example, you can write a lambda expression that computes the mean weight gradients in each layer of the model at the completion of each batch and sends the result as a stream of tensors that can be plotted as a bar chart.
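The idea behind lazy logging can be illustrated with a small, self-contained sketch (conceptual only, not the actual TensorWatch API): observing a variable just stores a reference, and a query lambda is evaluated only when someone asks for the result.

```python
# Lazy logging: nothing is computed or serialized at observation time;
# queries run on demand against the live variables.
class LazyWatcher:
    def __init__(self):
        self.observed = {}   # name -> zero-arg callable returning the current value

    def observe(self, name, getter):
        """Observing is nearly free: we only store a reference."""
        self.observed[name] = getter

    def query(self, expr):
        """Run a query lambda against the observed variables, on demand."""
        return expr(self.observed)

weights = [0.5, -1.2, 3.0]
watcher = LazyWatcher()
watcher.observe("weights", lambda: weights)

# No logging has happened yet; the mean is computed only when requested.
mean_weight = watcher.query(
    lambda vars: sum(vars["weights"]()) / len(vars["weights"]()))
print(mean_weight)  # ≈ 0.7667
```

The contrast with eager logging is that the cost is paid per query, not per training step, so tracking many variables stays cheap.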

Phases of model development

At Microsoft Research, we care deeply about improving debugging capabilities in all phases of model development—pre-training, in-training, and post-training. Consequently, TensorWatch provides many features useful for pre- and post-training phases as well. We lean on several excellent open-source libraries to enable many of these features, which include model graph visualization, data exploration through dimensionality reduction, model statistics, and several prediction explainers for convolution networks.

Open source on GitHub

We hope TensorWatch helps spark further advances in efficiently debugging and visualizing machine learning, and we invite the ML community to participate in this journey via GitHub.


How ‘alfred’ AI solution helps thyssenkrupp Materials Services optimize its logistics network

Alfred Krupp led his family’s company, founded in 1811, to industrial prominence. Now, over 200 years later, he is the namesake and inspiration for an artificial intelligence solution built by thyssenkrupp Materials Services (tkMX), one of Germany-based thyssenkrupp AG’s strategic business areas – and the largest materials distributor and service provider in the western world.

The “alfred” AI solution, powered by Microsoft Azure, helps the company analyze and process more than two million orders per year and better serve its 250,000 global customers.

Though alfred has been in place for just under a year, the solution is already helping tkMX optimize its logistics network – allocating materials to the right location much faster, minimizing transport volume and enhancing usage of the company’s transport capacity.

Axel Berger, head of the Digital Transformation Office, thyssenkrupp Materials Services.

Transform caught up with Axel Berger, head of digital transformation at tkMX, to hear more about how alfred is changing the business.

TRANSFORM: Tell me about alfred and why tkMX developed it. What business challenges were you facing?

AXEL BERGER: We are a wholesaler, so data insights and data algorithms are possibly among the strongest levers we have to improve our business. We had a lot of data that we weren’t using before, for three main reasons.

First, we didn’t really have the expertise to work on specific data science topics – we had the data, but it wasn’t always available. Second, data quality was an issue. And third, we lacked the technology to store data in different formats to use it and make it available in one central location on a massive scale. We also lacked the related tools to really analyze it, visualize it and finally, build algorithms out of it that could be deployed in different scenarios.

There are many possible use cases for wholesalers, and it took us a long time to pinpoint the use case that we should implement first. The major topic we’ve focused on is network optimization. How can we optimize, for example, transport costs or our supply network? How can we reduce the stock that is delivered from A to B without sacrificing our service levels? So the first project that we’ve worked on is network simulations within our German trade network.

It’s important to note that alfred is growing through its use cases. We didn’t create a huge global platform that could do everything. The first use case requires a specific amount of data, computing power and certain tools. But with additional use cases that we are now implementing, alfred is growing.

thyssenkrupp Material Services’ alfred AI solution.

TRANSFORM: I understand you developed alfred internally. Can you tell us a little about that?

BERGER: Alfred came to life in early 2018. The biggest challenge was definitely data availability. You can have the greatest technology, the best tools, but the biggest challenge is to get quality data. Another challenge is to have the domain knowledge, the expertise in the specific topic to really make it relevant.

Everybody’s thinking that if you just use data and artificial intelligence, in the end this artificial intelligence will give you the insights that you don’t know yet. But that’s not happening. It’s about having the right data of the right quality, the expertise and the domain knowledge on a specific topic, and the technology to run it. Technology is the easy part, because nowadays there is someone like Microsoft with the technology. But to bring data and domain knowledge into the project and to understand the use case and the questions you are trying to answer, that is the hardest part.

TRANSFORM: Can you walk me through what alfred might do over the course of one day?

BERGER: There are so many things that alfred can do! Alfred dynamically tells us from which site we should ship which material to which customer. Alfred optimizes our stock levels. Alfred tells us what the perfect price for a specific customer for a specific product is. Alfred visualizes and tells us which customers are profitable, and which customers are not.

Alfred can help us build a predictive maintenance model for our machinery, and tell us which machine is about to break. Alfred also helps us to optimize our supply network in terms of physical sites – where should we open the next site or close it down, and which materials should be subbed somewhere else. It helps us to get better purchase prices because it helps us in negotiations and the bundling of materials that we want to purchase. These are all current or potential use cases.

TRANSFORM: I understand it’s still evolving, but what is the biggest benefit alfred has had on your business?

BERGER: We handed over decision-making to alfred (a machine) that relies on data. One of the taglines that we use for alfred is “intelligence in each transaction,” which means that we want to build decision engines. Alfred already delivered the first decision engine: The system tells us from which location the customer is to be supplied – taking into account all relevant frame data. That was our first decision engine, you could say.
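A toy version of such a decision engine might look like the following sketch. The data and logic are purely illustrative, not thyssenkrupp's actual system: pick the site that can fulfill an order at the lowest cost, given current stock levels.

```python
# Minimal "which site supplies this customer" decision engine sketch.
def choose_site(sites, material, quantity):
    """sites: list of dicts with 'name', 'stock' (material -> qty), 'cost'
    (cost of shipping to this customer from that site)."""
    candidates = [s for s in sites if s["stock"].get(material, 0) >= quantity]
    if not candidates:
        return None  # no single site can fulfill; a real system might split the order
    return min(candidates, key=lambda s: s["cost"])["name"]

sites = [
    {"name": "Essen",   "stock": {"steel_coil": 120}, "cost": 410.0},
    {"name": "Hamburg", "stock": {"steel_coil": 30},  "cost": 280.0},
    {"name": "Munich",  "stock": {"steel_coil": 200}, "cost": 350.0},
]
print(choose_site(sites, "steel_coil", 100))  # Munich: cheapest site with enough stock
```

A production engine would fold in the "relevant frame data" Berger mentions – service levels, transport capacity, bundling – as additional constraints and cost terms, but the shape of the decision is the same.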

TRANSFORM: What has been the employee reaction to alfred? Have they embraced alfred, or was there some resistance early on?

BERGER: People weren’t resistant to alfred, because right away we could show them how alfred would help them in their daily work, and the benefits we’d gain. With the use case we’ve been working on, alfred doesn’t imply any layoffs or redundancies. It is purely optimizing the way we are working, and helping to enhance the impact our employees are driving. So alfred is seen positively.

Employees collaborate in a thyssenkrupp Materials Services warehouse.
All company data can be combined on one platform with alfred.

TRANSFORM: Did you do any training to prepare employees for alfred?

BERGER: Yes, absolutely. We helped them, trained them, involved them in the process very early. We trained them in the tools. What we are also planning is to deploy data labs, small versions of alfred, so people working on a specific data problem can use alfred to solve their own problems with just a push of the button. We teach them how to do this – how to use Microsoft Power BI, for example, to visualize their own data. That helped a lot because they started to work with data and to better understand what it’s all about and how it can be utilized.

TRANSFORM: How else has alfred helped your employees achieve more and optimized their work?

BERGER: Alfred has helped employees by enabling them to simulate tkMX’s network setup, which was extremely difficult before because our network is extremely complex. It has helped with data availability – the employees have much more data that they can now access themselves, without involving anybody from data warehousing. And obviously by increasing data transparency.

TRANSFORM: Have new roles or opportunities opened up to support alfred?

BERGER: Yes, of course. Roles like data engineering, data architecture, data science, solution designers – these are all new roles that we staff now.

TRANSFORM: What advice would you give other companies that are considering launching an AI initiative?

BERGER: I’d like to shift the focus away from the buzzword “AI” and better discuss what’s behind it. I don’t believe that there is an artificial intelligence as such. We have focused algorithms.

In other words, what I would recommend is to calm down and not be afraid of AI, because the methods are 60 years old. What has changed are the opportunities that advanced technologies such as cloud and edge computing provide, and the pace at which they evolve. So businesses need to get used to these new technologies and use technology that is easy to handle – like Microsoft Azure. With Azure we can quickly launch applications for data aggregation, manipulation and analysis with the click of a button, with only a few people in the beginning.

To start, I would recommend taking your data, searching for your first use cases and simply building them, without engineering them forever. Clarify the questions you want to answer. Don’t believe in overarching algorithms that will find the question – the use case – for you; otherwise everybody ends up expecting results for something you don’t even know is a problem.

TRANSFORM: Based on your experience, what concerns or rewards do you see for society as AI becomes more ubiquitous?

BERGER: Again, I would say calm down and get in touch with the methods and technologies behind AI. People fear things they don’t know. If you get in touch with it and understand what’s really behind AI, then I think it’s easier for people to understand that we are far away from real artificial intelligence. We see specific use cases, specific technologies to solve specific problems, but nothing like a mastermind.

It’s important to talk about AI and engage in the public debate, because the evolving technology around machine learning and AI raises questions, both ethical and legal. Take the much-used example of the autonomous car: how do we cope as we give more and more autonomy and decision-making capacity to machines?

I studied mechatronics some 25 years ago. With mechatronics we were already talking about cyber-physical systems and about programming and automating machines. So IoT is nothing new; it’s just that the technology has evolved, and that gives us new opportunities.

When you look at artificial intelligence, the methodologies date from the 1940s and 1950s – neural networks, for example. It’s nothing new. It’s all about cheaper storage, more computing power and better connectivity, but also about the standardization and harmonization of data. And when you come back to that point, you realize it’s feasible to cope with it, because we’ve been coping with it for many years.

A thyssenkrupp truck travels on a highway.
AI will help thyssenkrupp Materials Services efficiently manage their logistics network.

TRANSFORM: You talked about what alfred is doing now. In 10 years, where do you want the platform to be?

BERGER: Technology is evolving so fast, it’s hard to foresee. Do you know the saying, ‘The appetite comes with eating’? It’s like when you’re working on a project, you’re finding new data insights, new data points that give you the motivation to go to the next step. So I am convinced there will be so many more use cases in the future that I cannot foresee right now.

I will learn, we all will learn, the machine will learn. We will get more and more data created out of the data that we already have – other data sources, third-party data and so forth. So right now, I cannot foresee all the use cases we will see in the future. We will work under one paradigm: ‘Intelligence in each transaction’. Over time, alfred will also make decisions in our ERP system automatically. In the everyday transactions we do, we would like to have more intelligence, and alfred will help us with that.

TRANSFORM: Is there anything you would like to add?

BERGER: I’m a great believer in removing the mystique of buzzwords like AI and focusing on what’s behind it instead – helping people and companies understand the technologies and methods that help us make our businesses as well as our personal lives easier and better.

It’s part of my role as CDO, but I also believe that digitalization is a bunch of buzzwords. If you ask someone at a conference what they really mean by digitalization, most people’s answers get very thin. Why?

Because they don’t really know – they are looking at digitalization from a great height. And I think if you really want to go beyond the buzzwords, you need to go into the use cases and the business, and you really need to redefine the opportunities. So I am trying hard to get away from these buzzwords and really get down to the use cases.

Top photo: thyssenkrupp Materials Services receives around 14 million order items annually. With alfred, these can be efficiently processed and analyzed. (All photos courtesy of thyssenkrupp Materials Services)


Louisville signs alliance with Microsoft to accelerate city’s AI practices, digital transformation

Image of Louisville Metro Hall

At Louisville’s Entrepreneur Center, Mayor Greg Fischer announced a three-year digital alliance with Microsoft to boost the region’s digital transformation and accelerate AI practices. The remarks were part of a kick-off event for a weekend of start-up activities at the center, which supports the growth of local tech companies. The alliance will focus on strengthening the region’s ability to navigate its major industries’ digital transformation by skilling the population in the key digital fronts of the future: artificial intelligence, Internet of Things, and data science.

Upon signing, Mayor Fischer said, “Artificial intelligence is the next frontier in technology, and through this collaboration with Microsoft, we will prepare our workforce for the tech revolution and create economic opportunity, while not losing sight of the need for equity within economic growth. We are excited to collaborate with Microsoft to ensure Louisville residents and businesses are ready for the future economy.”

With a physical hub in the Entrepreneur Center in Louisville’s Innovation District, the alliance will encompass public events, technology investment, and skills training for all ages throughout the greater Louisville area, with a population of approximately 620,000 people.

Microsoft will collaborate with the city of Louisville, the Brookings Institution and the University of Louisville, as well as other local partners, in this wide-reaching public-private initiative to increase digital skills along the full continuum of the educational pipeline.

Also presenting at the event was Louisville Chief of Civic Innovation and Technology Grace Simrall who said, “In essence, the goal of this collaboration is to create a regional hub, a center of gravity for AI and IoT.”

A changing landscape

In recent years, some 12% of jobs in the Louisville region have been in manufacturing, employing over 250,000 workers. Some 285,000 people in the region are currently working in healthcare and education. As digital transformation brings increased efficiency to these and other industries, there will be a growing demand for workers who command strong digital skills. In addition to manufacturing and healthcare, AI will bring new opportunities in fields ranging from public safety to smart building design and construction. The alliance will help bridge the digital divide to help communities thrive in the coming job market.

Focus on AI

The digital alliance will work to create a comprehensive strategy focusing on the impact of AI, IoT, and data science and on how to help communities adapt to them. All components of the digital alliance will emphasize building people’s agility with these technologies. Four fellows will be sponsored to act as ambassadors for the city’s AI initiatives within industry and non-profit organizations. Public events will include ideation and design-thinking workshops and symposiums to explore what an AI future looks like.

Full lifecycle educational pipeline

The digital alliance will take a multi-pronged approach to reach people at various points along the continuum of job readiness, from familiarizing children with the basic concepts of digital literacy to empowering current or returning members of the workforce to reskill and upskill for the new economy.

Empowering youth

The alliance will provide virtual and in-person programs for youth, in schools and at public events, with activities spanning the entire K-12 and P-20 learning spectrum. Programs will include one-time public events, such as youth esports and robotics competitions, and longer programs, including the following:

  • YouthSpark Live events bring young people together to plan their futures. Students identify necessary skills, learn how technology fits in, and get connected to programs that will help them get where they want to go. The events focus on three key areas: employment, careers and entrepreneurship.
  • DigiGirlz teaches middle and high school girls about technology careers, connects them with STEM industry professionals and Microsoft employees, and lets them participate in hands-on computer science workshops.
  • DigiCamps provides the same experience of developing and learning about cutting-edge technologies as DigiGirlz for both girls and boys.

Training the workforce

The digital alliance will coordinate and host digital literacy workshops for parents, transitioning members of the workforce and veterans, expanding digital literacy skills in a consistent, predictable manner using Microsoft’s Digital Literacy curriculum. The training, which is available in 30 languages, is online and downloadable, and its three levels, from basic to advanced, are designed to be self-directed and self-paced.

The digital literacy curriculum starts with the absolute basics of computer and software use and progresses to larger issues of online safety and digital lifestyles. Microsoft will provide resources for Train the Trainer sessions for Microsoft certifications through the Digital Literacy curriculum.

Developing thought leadership

The alliance will host a high-level AI, IoT and Data Science Summit in Louisville in the coming year, along with corporate AI briefings and business roundtables. Executive briefings with directors and cabinet officials focused on innovation, AI, cybersecurity and digital transformation will be held at the Microsoft Executive Briefing Center in Redmond, Washington, to support corporate digital transformation.

To tap into and further grow the capacity of existing tech leaders, the digital alliance will host at least three start-up and tech hackathons focused on civic innovation. Hackathons connect the city with local tech leaders, start-ups, partners, and technology thought leaders to explore solutions to local challenges. The Civic Innovation hackathon will challenge participants to use design thinking and technology to address topics such as public safety, security, smart city, transportation, and education. Microsoft will aid with program development, session facilitators, and speakers.

For more details about the Louisville digital alliance, please see the press release from the Office of Mayor Greg Fischer.

Learn more about how Microsoft is empowering digital transformation and innovation for state and local governments at the Microsoft Smart Cities website.

Learn more about this and other initiatives from the Brookings Institution.

To stay on top of the latest research on the potential for AI in the public sphere, access Microsoft’s resource guide to AI in government.


Learn more about Microsoft’s Airband Initiative to connect rural America.

Gain new insights into how to get started with digital skills.

See how cities and states are transforming with intelligent technology.


How AI and satellites are used to combat illegal fishing

Fishing is a way of life for coastal communities around the world. An estimated four million fishing vessels sail the world’s oceans, providing fish for a global seafood market valued at over $120 billion.

“It’s hard to overstate the importance of fish,” says Nick Wise, CEO of the nonprofit organization OceanMind. “There are three billion people in the world who rely on seafood as their primary source of protein, mostly in developing nations. Twelve percent of the world’s population relies on the wild-capture seafood industry directly or indirectly for their livelihoods.”

Overfishing — when fish are caught faster than stocks can replenish — is a significant factor in the decline of ocean wildlife populations, not least because of the bycatch of other marine life such as turtles and cetaceans. Each of these creatures is an important part of the ocean’s ecosystems and biodiversity.

The United Nations Food and Agriculture Organization estimates that one-third of all fish stocks are now overfished and no longer biologically sustainable.

“A collapse in fish stocks and a failure to manage fishing sustainably,” says Wise, “would lead to a food security crisis and result in significant poverty around the world.”

To fight back against this overfishing, OceanMind is using AI to map and analyze vessel data, then feeding that information to government authorities to help catch perpetrators.



Smart tracking

Regional, national and international regulations are used to manage fishing efforts and can include restrictions on fishing out of season, using banned gear or techniques, or catching more than a set quota.

There are many ways of trying to catch those flouting the law, such as patrol boats, on-board cameras, and the remote electronic monitoring of discards.

However, the vastness of the ocean makes the job difficult.

OceanMind’s system currently tracks thousands of boats across the globe – with the capability of tracking millions – by gathering data from a wide range of sources, including collision-avoidance transponders aboard boats, radar images, satellite imagery and cellphone signals. Analyzing these enormous datasets is beyond the capability of any one person, so OceanMind has developed machine-learning algorithms that predict the type of fishing behavior based on vessel location and flag suspicious and potentially illegal activity, such as fishing too close to the shore.
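OceanMind hasn’t published its models, but the kind of rule described above – slow, steady movement close to shore – can be illustrated with a toy heuristic. This is a hypothetical sketch, not OceanMind’s actual system: the class names, the 5-knot “fishing speed” threshold and the 12 km inshore limit are all illustrative assumptions.

```python
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt

@dataclass
class AisPoint:
    """One position report from a vessel's collision-avoidance transponder."""
    lat: float
    lon: float
    speed_knots: float

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two coordinates, in kilometres."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

def looks_like_fishing(track, max_speed=5.0):
    """Heuristic: trawling vessels move slowly and steadily, so treat a track
    as 'fishing' if most of its points are below max_speed knots (assumed)."""
    slow = sum(1 for p in track if p.speed_knots < max_speed)
    return slow / len(track) > 0.8

def flag_inshore_fishing(track, shore_points, limit_km=12.0):
    """Flag a track that appears to be fishing within limit_km of any shore
    reference point (12 km mirrors a typical territorial-waters limit)."""
    if not looks_like_fishing(track):
        return False
    return any(
        haversine_km(p.lat, p.lon, s_lat, s_lon) < limit_km
        for p in track
        for (s_lat, s_lon) in shore_points
    )
```

A real pipeline would replace the speed rule with a classifier trained on labeled vessel tracks and check positions against actual coastline and license geofences; the sketch only shows the shape of the flagging logic that analysts then verify.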

But the system can’t tell on its own whether anyone is breaking the law.

“The difference between legal and illegal fishing is simply whether or not the vessel had a license to do what it did in that place, at that time, and in that way,” says Wise. “That’s what makes combating illegal fishing challenging: One vessel making a particular maneuver might be legal, another vessel doing the same thing next to it might be illegal.”

OceanMind’s fisheries experts verify the alerts flagged by AI and coordinate closely with the relevant authorities, who can then decide whether to investigate further. The organization already has partnerships with governments, including Thailand’s, which can then target resources to catch offenders.

Real-time advances

Until now, OceanMind has used onsite servers to process the data that comes in every day. “We were basically running a day behind,” explains Wise. “We reviewed things that were happening yesterday.”

Through a Microsoft AI for Earth grant, OceanMind is moving its data analytics to the Microsoft Azure cloud. “The collaboration with Microsoft is going to bring all of that data through our system much more quickly and apply the AI in near real time,” says Wise.

That transformation will make a big difference to enforcement. Real-time monitoring will help authorities plan patrols that can catch illegal fishing as it happens.