Every summer, Andy Nott spends his Saturdays on a verdant cricket field with his local team from Calne, a picturesque town in southwest England. But instead of batting and fielding balls, Nott is live-scoring his team’s seven-hour matches, often from inside a scorebox that looks like a garden shed. The job, though largely hidden, is a key role in the game, requiring intense focus and meticulous record-keeping.
To score a match, Nott brings colored pens, a paper scorebook, binoculars, water, a fan or heater, and a laptop connected to NV Play, an innovative, cloud-based cricket scoring and analytics solution launched last year. Built by NV Interactive, a digital agency in New Zealand, the platform has enabled more than 2,000 U.K. recreational teams like Calne Cricket Club to produce professional-grade livestreams that make the sport more engaging to follow. Calne doesn’t yet have the budget for video, but its live scores are deeply appreciated by fans.
“We’ve heard from people lying on a beach somewhere in Spain keeping abreast of a game by looking at the internet and seeing what’s happening,” says Nott, who learned to score cricket 15 years ago on paper, a method he still uses while scoring digitally. “The technology and live scoring make the game more appealing to a wider audience.”
For NV Interactive, building NV Play was a way to democratize technology in a favorite sport of product director Matt Smith, chairman Geoff Cranko and brothers Matt Pickering, managing director, and Gus Pickering, technical director. Previously, the company had developed elite cricket solutions for 15 years, beginning with a scoring platform in 2005 for ESPNcricinfo, a global cricket news site, still in use today.
NV then went on to build digital tools for first-class and national teams in the U.K. and New Zealand, also still in use today. For NV Play, the company integrated the same advanced capabilities into a single platform that can scale from serving small, recreational teams to the highest levels of professional cricket. Built on Microsoft Azure, the flexible solution features live scoring, live video, video highlights, ball-by-ball statistics, high-performance analytics and predictive insights – all to help teams grow and shape the future of cricket.
“Historically, cricket has been scored on pen and paper,” says Smith, who grew up playing the sport. “Depending on the level of the game, the scores might end up on a spreadsheet that you share around or on a website, if you’re lucky.
“We’re making all the things typically available only to massive cricket clubs with high-performance budgets available to all levels of the game, from grassroots through to the elite.”
Even for famous, professional clubs like Middlesex Cricket in London, NV Play has enabled new ways to serve fans and new opportunities for growth. The platform helps the club deliver high-quality video livestreams and video highlights of key moments, which delight fans who can’t attend the club’s four-day matches. The features also serve Middlesex’s many global fans from Australia to India to South Africa.
“We’ve had a lot of feedback from people who enjoy having an open window on their laptop and being able to duck in and out of a game and get a real feel for what’s going on, without being here in the stadium,” says Rob Lynch, chief operating officer of Middlesex Cricket, which plays at London’s iconic Lord’s Cricket Ground venue.
The platform helped Middlesex livestream video of its professional women’s matches for the first time this year. It has also increased engagement on the club’s website, leading to new monetization opportunities.
“We work hard to keep our website a living, breathing organism and not something that goes dormant and out of date quickly,” Lynch says. “By incorporating NV Play, we’ve seen a significant increase in traffic to our site.”
Building the solution with Azure DevOps, NV Interactive worked with the England and Wales Cricket Board to deliver it in the U.K., where it’s branded as Play-Cricket Scorer Pro. An NV Play partnership with New Zealand Cricket soon followed. To date, the platform has scored more than 30,000 matches involving 90,000 players and captured more than 24 terabytes of video.
It has coded metadata for 18 million balls bowled, including details on batter, bowler, type of hit, runs scored, weather and pitch conditions. In any given week, NV Play is livestreaming scores and video of more than 750 simultaneous matches to an audience of over 2 million people. The scalability of Azure is crucial for handling the enormous usage spikes.
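NV Play’s internal schema isn’t public, but a ball-by-ball record like the ones described above can be pictured as a small structured object. The following Python sketch uses entirely hypothetical field names and values, just to illustrate what per-delivery metadata capture might look like:

```python
from dataclasses import dataclass, asdict

@dataclass
class BallEvent:
    """One delivery's metadata (illustrative field set, not NV Play's schema)."""
    match_id: str
    over: int        # over number within the innings
    ball: int        # ball number within the over (1-6)
    batter: str
    bowler: str
    shot_type: str   # e.g. "cover drive", "pull"
    runs: int
    weather: str
    pitch: str

# A scoring app might emit one record per delivery:
event = BallEvent("calne-vs-rivals-2021", over=12, ball=3,
                  batter="A. Batter", bowler="B. Bowler",
                  shot_type="cover drive", runs=4,
                  weather="overcast", pitch="dry")
record = asdict(event)  # plain dict, ready for JSON serialization and upload
```

Eighteen million such records, each tagged with conditions and outcomes, are what make ball-by-ball statistics and predictive insights queryable at scale.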
Transform recently sat down with several Microsoft customers at an event highlighting emerging trends in data and artificial intelligence (AI). We spoke with Jean Lozano, chief technology officer at MediaValet, a cloud-based digital asset management (DAM) system, which helps customers of all sizes manage their growing digital media libraries.
TRANSFORM: What is a DAM and what business problem does it solve?
JEAN LOZANO: A DAM is like a video or photo library, but it can handle way more than that. It’s typically used by enterprise marketing teams to provide a single source of truth for all brand and marketing content.
The big challenge with digital media is that it’s unstructured by nature, so discoverability can be a problem. A lot of companies invest tens of thousands of dollars in creating digital media and infographics, but without searchable metadata those assets are hard to find.
A DAM implementation contains anywhere from 10,000 to several million digital media [items] that are put into a central repository and tagged to make them discoverable. That workload is way too much for a single curator of the digital library, and that’s where AI comes in.
TRANSFORM: How does AI help you serve your customers?
LOZANO: We’re betting big on Azure AI to make MediaValet the most intelligent DAM out there. It has helped us win deals we would not otherwise have won. It’s just a matter of demonstrating capabilities.
So, for example, one of the hottest technologies we can demo is video indexing, which other DAMs can’t offer right now. Video is the fastest-growing media out there, so when our customers see their videos being analyzed and made discoverable, they see the value it adds.
As an example, we have a huge video production house as a customer. Every time they do a shoot, it’s a 500-gigabyte upload to MediaValet. But, they also have to generate video transcriptions. With video indexing, we [provide] the transcriptions as soon as they load it on our system, so they don’t have to send the files [elsewhere] for manual transcription. Does that improve their workflow? Definitely.
TRANSFORM: Have your customers fully embraced the changes and the new technologies?
LOZANO: Many of our customers are digital asset curators, and some of them probably fear that AI is going to replace them. But the guiding principle for AI is assistance, not replacement. AI is going to make their lives easier. It will enrich the data that they’re producing, but it doesn’t replace them.
TRANSFORM: How does AI make life easier for a digital asset curator?
LOZANO: We have a couple of customers that are sports franchises creating millions of images. These franchises have been around for decades. One hockey team had about 50 petabytes of video clips. So, how do you work through millions of images and hundreds of thousands of hours of video?
Machine learning is helping with that now. For example, it can detect a jersey number and identify the player who wears it. We can make those images easy for marketers, or the communications team, to find when they look for brand-approved images.
Launched in 2006, P3 is the first facility to apply a more data-driven approach to understanding how elite competitors move. It uses advanced sports-science strategies to assess and train athletes in ways that will revolutionize pro sports – and, eventually, the bodies and abilities of weekend warriors, Elliott says.
“We are challenging them and measuring them. But we’re not interested in how high they jump or how fast they accelerate,” Elliott says. “We’re interested in the mechanics of how they jump, how they accelerate and decelerate. It’s helping us unlock the secrets of human movement.”
Working directly with players and their agents or families, P3 has evaluated members of the past six NBA draft classes, amassing a database of more than 600 current and former NBA athletes.
Some of P3’s clients include NBA stars Luka Doncic and Zach LaVine plus athletes from the NFL, Major League Baseball, international soccer, track and field and more.
Many of those NBA clients, like Philadelphia 76ers guard Josh Richardson, return to P3 each summer for re-testing to pinpoint whether their movement patterns have gained asymmetries that could cause injury, or to reconfirm the health of physical systems they use to leap, land, stop and start, fueling their on-court edge.
“This is my fifth off-season now at P3,” Richardson says. “When I started with them during my NBA draft preparation, I immediately saw that their approach was different and that it could help me have the best chance to improve my athleticism. Every off-season I get to see exactly where I am physically compared to where I was before – and compared to other NBA players.
“They are able to help me identify where I might be at risk of injury and where I can improve physically. It’s important for me to know that the training I am doing is specific to my unique needs,” Richardson says.
To collect all that granular data, P3 outfitted its lab with a high-speed camera system manufactured by Simi Reality Motion Systems GmbH, a German company from the ZF Group and a Microsoft partner.
Simi offers markerless motion-capture software that removes the need for athletes to wear tracking sensors while they play or train. Simi also works with seven Major League Baseball clubs, deploying high-speed camera systems in their stadiums that have recorded every pitch of every game since the 2017 season.
Simi’s software digitizes the pitchers’ arm angles and related body movements, spanning 42 different joint centers across 24,000 pitches thrown per team per season. That produces hundreds of billions of data points that are uploaded and processed on Microsoft Azure, enabling teams to create in-depth biomechanical analyses for the players, says Pascal Russ, Simi’s CEO.
“The first team that deploys this effectively on the field to pick lineups or to see which pitch angles worked well against which batters is going to see a huge separation between them and the other teams not using this,” Russ says.
“It’s freakishly accurate.”
While Russ foresees this technology eventually remaking baseball, such seismic shifts already are occurring in the NBA through P3’s player assessments, says Benedikt Jocham, Simi’s U.S. chief operations officer.
“We provide the software solution that can quantify the movement and analyze, for example, how much pressure and torque a person is putting on various body parts,” Jocham says. “P3 adds the magic sauce. They are wizards at figuring out what it all means and making sense out of it for athletes.”
After the cameras record a player’s movements in the P3 lab, those datasets are loaded into Azure where machine-learning algorithms reveal how that player’s physical systems are most related to other NBA players who were similarly assessed. The algorithm then assigns that player into one of several clusters or branches that predict how their basketball career may unfold, Elliott says.
One branch, for example, contains athletes who had a brief NBA experience and never became significant players. Another branch encompasses players who were impactful during their first three or four seasons then sustained serious injuries that depleted their skills. In still another branch, players share rare combinations of length, power and force that fed elite careers – and they remained healthy.
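P3’s actual models and biomechanical features are proprietary. As a rough illustration of the cluster-assignment idea described above, here is a minimal nearest-centroid sketch in Python; the branch names, feature dimensions and centroid values are all invented:

```python
import math

# Hypothetical branch centroids over three made-up, normalized features:
# (braking/deceleration quality, jump mechanics score, movement asymmetry).
centroids = {
    "brief-career":  (0.30, 0.40, 0.70),
    "early-injury":  (0.60, 0.75, 0.65),
    "elite-durable": (0.85, 0.90, 0.15),
}

def assign_branch(features):
    """Return the branch whose centroid is closest (Euclidean) to the player."""
    return min(centroids, key=lambda name: math.dist(features, centroids[name]))

# A player with strong braking mechanics and low asymmetry:
print(assign_branch((0.90, 0.85, 0.10)))  # -> elite-durable
```

In practice a clustering model would learn the centroids from the database of assessed athletes rather than hard-coding them; the sketch only shows the assignment step.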
“The human eye is good at measuring size and maybe estimating weight, and very bad at comparing athletes’ physical systems and movement symmetries to one another,” Elliott says. “But we can measure those things in the lab and the machine tells us how young athletes are most alike.
“It’s a solid foothold into an area of sports science that has been out of sight until now,” he says.
The data is also helping to shatter long-held theories that successful NBA players who, at first glance, lack the size, jumping ability or quickness of traditional stars are merely compensating by tapping unmeasurable intangibles such as “intuition” or “IQ” or “heart.”
“That’s how people once would have defined (2017-18 NBA most valuable player) James Harden, as somebody who just has this super-high basketball IQ,” Elliott says. “Maybe he does. But he also has a better stopping or braking system than anybody we’ve ever assessed in the NBA.
“That creates competitive advantages,” he adds. “There’s Newtonian physics behind these advantages.”
Case in point: Dallas Mavericks rookie Luka Doncic. In its pre-draft assessment of Doncic one year ago, P3 identified that same hidden performance metric – the elite ability to stop quickly. P3 knew, before his NBA Draft, that Doncic and Harden were in the same player branch. Doncic posted a stunning first pro season.
The insights also help athletes avoid injuries by adopting new training techniques to change unhealthy movement patterns revealed in the data, says Elliott, who previously served as the first director of sports science in MLB (for the Seattle Mariners) and as the first director of sports science in the NFL (for the New England Patriots).
Every NBA player or draft prospect assessed by P3 receives a report that highlights their injury risks and compares them to league peers based on performance.
“Athletes come to us because they trust us to take better care of their bodies than would happen anywhere else,” Elliott says. “Traditionally, and still today, when these bad things happen to players, everyone says, ‘Oh, that was a freak injury.’ I’m just telling you that the machine learning models predict a whole lot of these.
“I can’t imagine a world where out of nowhere you suffer, say, a right tibial stress fracture – not your left one, not your femur, it’s your tibia, out of nowhere,” he adds. “Without a doubt, these are not random events. Sports science just has not been very good about identifying them.”
Eventually, this same information may become available to amateur athletes and everyone else, Elliott says. The same technologies could predict, for example, that a weekend warrior has too much force going through the left leg while jumping or landing plus a tiny but unhealthy rotation of the left knee and femur, causing too much friction, and, eventually, an erosion of the left knee cartilage.
“What if you identified that when you were 30 or 20, instead of learning when you’re 50 that your cartilage is gone? That really is the future,” Elliott says.
“The power of machine learning and (Microsoft) artificial intelligence is going to help us unlock these secrets in ways that have never existed. We’re already doing it, but these are only the early days of what I think is going to be a revolution in this space,” he says. “It’s coming. It’s definitely coming.”
Top photo: Stanley Johnson, a forward with the NBA’s New Orleans Pelicans, moves laterally inside an exercise band at the P3 lab. (All photos courtesy of P3.)
The Federal Aviation Administration (FAA) expects the number of drones in our airspace to increase as much as threefold by 2023, as commercial drone operations become more common. But with more drones in the air, mishaps become more likely. In 2017, the FAA reported an average of 250 safety incidents per month, in some cases halting operations at major international airports.
Israeli startup Vorpal specializes in tracking and, ideally, preventing those near misses. Its drone detection and tracking solution, VigilAir, uses a geographically distributed network of sensors that scan relevant frequencies for drone transmissions, allowing it to identify and track drones and their operators in near-real time.
Each of Vorpal’s sensors is equipped with computing hardware that runs its location-tracking software. The more drones in the sky, the more compute power is needed to handle all that data. To ensure that VigilAir can seamlessly maintain those capabilities, Vorpal is looking to the cloud, working with AT&T and Microsoft to test how edge computing could allow it to track thousands of drones at any given time.
The AT&T Foundry, a network of innovation spaces dedicated to rapid prototyping, is testing how to bring network edge compute (NEC) capabilities into AT&T’s network with Microsoft’s intelligent edge offerings, including Azure’s IoT and AI services, and Azure Stack hybrid technology. By deploying Microsoft’s advanced cloud services closer to the edge of the network, NEC could allow businesses to access low-latency network compute at a fraction of the cost of traditional, embedded processing.
Mixed reality takes digital information beyond two-dimensional screens to a three-dimensional experience by using holograms, which are images made of light and sound.
Microsoft’s HoloLens headset is a culmination of breakthroughs in artificial intelligence (AI), hardware design and mixed reality development. It allows people to interact with holograms in physical space, meaning that they can view and manipulate holographic images on their own in the air or in combination with real physical objects.
The recent release of HoloLens 2 takes the mixed reality experience a step further, allowing users to manipulate holograms the same way they would handle physical objects. The headset also offers eye tracking that can sense when a user’s eyes land on a particular location and produce relevant digital information, as well as automatic scrolling as the user reads. Users can log in via iris recognition, making sharing among multiple people easy and secure.
Putting mixed reality to work
Airbus has seen impressive results in its trials and deployments of Microsoft’s mixed reality technology in training, design and manufacturing.
“Mixed reality can help us to increase quality, safety and security,” Dumont says. “The level of human error is significantly reduced, and in aerospace, increased quality is increased safety—and needless to say, security goes with that.”
Mixed reality allows aerospace trainees to learn in an immersive virtual environment without the need for an actual physical aircraft or parts. This 3D environment can offer features that real-life training cannot, such as the ability to view elements in three dimensions from any angle.
HoloLens helps Airbus designers virtually test their designs to see if they are ready for manufacture. Mixed reality speeds up that validation process substantially, cutting the time required by 80 percent.
Mixed reality technology can also help workers on the production line access crucial information while keeping their hands free. Digital information, such as instructions or diagrams, can be overlaid on a real piece of machinery to aid in complex or hard-to-reach tasks. These kinds of mixed-reality solutions have allowed Airbus to cut manufacturing time by a third while improving quality.
Mixed reality empowers employees to execute their jobs in the most efficient and ergonomic way possible, and this contributes directly to performance improvements, according to Barbara Bergmeier, who is head of operations at Airbus Defense and Space.
“By having the right information at the right time in hands-free mode, not only does quality increase, but also safety, and this is what we are looking for. Quality without consideration of the well-being of our workers is not possible,” she says.
Working together to evolve mixed reality
Airbus is not only creating solutions for its own workforce; it has also built off-the-shelf solutions for its customers, so they too can benefit from Airbus’ expertise in building mixed reality solutions. Starting at the Paris Air Show, Airbus will sell these in partnership with Microsoft on HoloLens 2.
“HoloLens 2 was born from the inspiration that it be designed for the customer, by the customer,” says Alex Kipman, technical fellow in Microsoft’s Cloud and AI group. “Airbus has long been a strategic partner in building the future of mixed reality solutions for an industrial environment and we have learned a lot from them. We are thrilled to continue our partnership as we embark on this next era of computing, the era of mixed reality and artificial intelligence.”
The first new solution offered under this partnership is a mixed reality training program first released with Japan Airlines (JAL). It helps maintenance operators and cabin crews learn in a 3D holographic environment and access instructions, heads-up and hands-free, while on the job.
In addition, Airbus will launch a collaborative map solution that allows participants from the defense and aerospace fields to virtually connect, quickly share space data and interact with complex virtual environments to plan and prepare ahead of missions.
Airbus is working on requests from other customers for mixed reality maintenance, training and remote collaboration solutions.
Leading in real life
Airbus’ collaboration with Microsoft on mixed reality goes beyond helping the company reach its internal goals. Such technological innovation is crucial to Airbus’ larger objective to become a world leader in digital services for the aerospace industry.
“We are very optimistic about this future collaboration with Microsoft based off what we’ve done in the last four years,” Dumont says. “This is really a way for us to lead our digital transformation. It’s multifold, but the use of mixed reality and HoloLens 2 are one of the key assets for Airbus in the future.”
Top photo: Holographic technology from Microsoft will be key to helping Airbus manufacture more aircraft faster.
The Krupp family company was founded in 1811. Now, more than 200 years later, Alfred Krupp is the namesake and inspiration for an artificial intelligence solution built by thyssenkrupp Materials Services (tkMX), one of Germany-based thyssenkrupp AG’s strategic business areas – and the largest materials distributor and service provider in the Western world.
The “alfred” AI solution, powered by Microsoft Azure, helps the company analyze and process more than two million orders per year and better serve its 250,000 global customers.
Though alfred has been in place for just under a year, the solution is already helping tkMX optimize its logistics network – allocating materials to the right location much faster, minimizing transport volume and enhancing usage of the company’s transport capacity.
Transform caught up with Axel Berger, head of digital transformation at tkMX, to hear more about how alfred is changing the business.
TRANSFORM: Tell me about alfred and why tkMX developed it. What business challenges were you facing?
AXEL BERGER: We are a wholesaler, so data insights and data algorithms are possibly one of the strongest levers we have to improve our business. We had a lot of data that we weren’t using before, for three main reasons.
First, we didn’t really have the expertise to work on specific data science topics – we had the data, but it wasn’t always available. Second, data quality was an issue. And third, we lacked the technology to store data in different formats to use it and make it available in one central location on a massive scale. We also lacked the related tools to really analyze it, visualize it and finally, build algorithms out of it that could be deployed in different scenarios.
There are many possible use cases for wholesalers, and it took us a long time to pinpoint the use case that we should implement first. The major topic we’ve focused on is network optimization. How can we optimize, for example, transport costs or our supply network? How can we reduce the stock that is delivered from A to B without sacrificing our service levels? So the first project that we’ve worked on is network simulations within our German trade network.
It’s important to note that alfred is growing through its use cases. We didn’t create a huge global platform that could do everything. The first use case requires a specific amount of data, computing power and certain tools. But with additional use cases that we are now implementing, alfred is growing.
TRANSFORM: I understand you developed alfred internally. Can you tell us a little about that?
BERGER: Alfred came to life in early 2018. The biggest challenge was definitely data availability. You can have the greatest technology, the best tools, but the biggest challenge is to get quality data. Another challenge is to have the domain knowledge, the expertise in the specific topic to really make it relevant.
Everybody’s thinking that if you just use data and artificial intelligence, in the end this artificial intelligence will give you the insights that you don’t know yet. But that’s not happening. It’s about having the right data of the right quality, the expertise and the domain knowledge on a specific topic, and the technology to run it. Technology is the easy part, because nowadays there is someone like Microsoft with the technology. But to bring data and domain knowledge into the project and to understand the use case and the questions you are trying to answer, that is the hardest part.
TRANSFORM: Can you walk me through what alfred might do over the course of one day?
BERGER: There are so many things that alfred can do! Alfred dynamically tells us from which site we should ship which material to which customer. Alfred optimizes our stock levels. Alfred tells us what the perfect price for a specific customer for a specific product is. Alfred visualizes and tells us which customers are profitable, and which customers are not.
Alfred can help us build a predictive maintenance model for our machinery, and tell us which machine is about to break. Alfred also helps us to optimize our supply network in terms of physical sites – where should we open the next site or close it down, and which materials should be subbed somewhere else. It helps us to get better purchase prices because it helps us in negotiations and the bundling of materials that we want to purchase. These are all current or potential use cases.
TRANSFORM: I understand it’s still evolving, but what is the biggest benefit alfred has had on your business?
BERGER: We handed over decision-making to alfred (a machine) that relies on data. One of the taglines we use for alfred is “intelligence in each transaction,” which means we want to build decision engines. Alfred has already delivered the first one: The system tells us from which location a customer should be supplied, taking into account all relevant parameters. That was our first decision engine, you could say.
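tkMX hasn’t published how its decision engine works. As an illustrative sketch of the idea only, a location-selection rule that minimizes transport cost over sites with sufficient stock might look like the following Python; every site name, stock level and cost figure here is invented:

```python
# Hypothetical sites with stock on hand and simple transport-cost inputs.
sites = {
    "Essen":   {"stock": 120, "cost_per_km": 1.2, "km_to_customer": 40},
    "Hamburg": {"stock": 15,  "cost_per_km": 1.0, "km_to_customer": 25},
    "Munich":  {"stock": 200, "cost_per_km": 1.1, "km_to_customer": 300},
}

def pick_site(order_qty):
    """Choose the cheapest site that can fill the whole order from stock."""
    feasible = {name: s for name, s in sites.items() if s["stock"] >= order_qty}
    if not feasible:
        raise ValueError("no single site can fill this order")
    return min(feasible,
               key=lambda n: feasible[n]["cost_per_km"] * feasible[n]["km_to_customer"])

print(pick_site(50))  # -> Essen (48.0 beats Munich's 330.0; Hamburg lacks stock)
```

A production decision engine would fold in many more parameters (service levels, transport capacity, bundling), but the core idea is the same: a deterministic, data-driven rule replaces a manual judgment call for each transaction.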
TRANSFORM: What has been the employee reaction to alfred? Have they embraced alfred, or was there some resistance early on?
BERGER: People weren’t resistant to alfred, because right away we could show them how alfred would help them in their daily work, and the benefits we’d gain. With the use case we’ve been working on, alfred doesn’t imply any layoffs or redundancies. It is purely optimizing the way we are working, and helping to enhance the impact our employees are driving. So alfred is seen positively.
TRANSFORM: Did you do any training to prepare employees for alfred?
BERGER: Yes, absolutely. We helped them, trained them, involved them in the process very early. We trained them in the tools. What we are also planning is to deploy data labs, small versions of alfred, so people working on a specific data problem can use alfred to solve their own problems with just a push of the button. We teach them how to do this – how to use Microsoft Power BI, for example, to visualize their own data. That helped a lot because they started to work with data and to better understand what it’s all about and how it can be utilized.
TRANSFORM: How else has alfred helped your employees achieve more and optimized their work?
BERGER: Alfred has helped employees by enabling them to simulate tkMX’s network setup, which was extremely difficult before because our network is extremely complex. It has helped with data availability – the employees have much more data that they can now access themselves, without involving anybody from data warehousing. And obviously by increasing data transparency.
TRANSFORM: Have new roles or opportunities opened up to support alfred?
BERGER: Yes, of course. Roles like data engineering, data architecture, data science, solution designers – these are all new roles that we staff now.
TRANSFORM: What advice would you give other companies that are considering launching an AI initiative?
BERGER: I’d like to shift the focus away from the buzzword “AI” and instead discuss what’s behind it. I don’t believe there is artificial intelligence as such. We have focused algorithms.
In other words, what I would recommend is to calm down and not be afraid of AI, because the methods are 60 years old. What has changed are the opportunities that advanced technologies such as cloud and edge computing provide, and the pace at which they evolve. So, businesses need to get used to these new technologies, and use technology that is easy to handle – like Microsoft Azure. With Azure we can quickly launch applications for data aggregation, manipulation and analysis with the click of a button, with only a few people in the beginning.
To start, I would recommend taking data, searching for your first use cases, and just building them without engineering them forever. Clarify the questions you want to answer. Don’t believe in overarching algorithms that will solve the problem of finding the question, the use case. Because otherwise everybody is expecting results for something that you don’t even know is a problem.
TRANSFORM: Based on your experience, what concerns or rewards do you see for society as AI becomes more ubiquitous?
BERGER: Again, I would say calm down and get in touch with the methods and technologies behind AI. People fear things they don’t know. If you get in touch with it and understand what’s really behind AI, I think it’s easier to see that we are far away from real artificial intelligence. We see specific use cases, specific technologies to solve specific problems, but nothing like a mastermind.
It’s important to talk about AI and engage in the public debate, because with the evolving technology around machine learning and AI, there are questions to answer, including both ethical and legal ones. Take the much-discussed example of the autonomous car: how do we cope as we give more and more autonomy and decision-making capacity to machines?
I studied mechatronics some 25 years ago. With mechatronics, you were already talking about cyber-physical systems and about programming and automating machines. So IoT is nothing new. It’s just that the technology has evolved, and that gives us new opportunities.
When you look at artificial intelligence, the methodologies date from the 1940s and 1950s – neural networks, for example. It’s nothing new. It’s all about cheaper storage, more computing power and better connectivity, but also about standardization and harmonization of data. And when you come back to that point, you realize it’s feasible to cope with, because we’ve been coping with it for many years.
TRANSFORM: You talked about what alfred is doing now. In 10 years, where do you want the platform to be?
BERGER: Technology is evolving so fast, it’s hard to foresee. Do you know the saying, ‘The appetite comes with eating’? It’s like when you’re working on a project, you’re finding new data insights, new data points that give you the motivation to go to the next step. So I am convinced there will be so many more use cases in the future that I cannot foresee right now.
I will learn, we all will learn, the machine will learn. We will get more and more data created out of the data that we already have – other data sources, third-party data and so forth. So right now, I cannot foresee all the use cases we will see in the future. We will work under one paradigm: ‘Intelligence in each transaction’. Over time, alfred will also make decisions in our ERP system automatically. In the everyday transactions we do, we would like more intelligence, and alfred will help us with that.
TRANSFORM: Is there anything you would like to add?
BERGER: I’m a great believer in removing the mystique of buzzwords like AI and focusing on what’s behind it instead – helping people and companies understand the technologies and methods that help us make our businesses as well as our personal lives easier and better.
It’s part of my role as the CDO, but I also believe that digitalization is a bunch of buzzwords. If you ask someone at a conference what they really mean by digitalization, most answers get very thin. Why?
Because they don’t really know, because they are looking at digitalization from a huge height. And I think if you really want to go beyond the buzzwords, you really need to go into the use cases and the business, and you really need to redefine the opportunities. So I am trying hard to get out of these buzzwords and really get down to the use cases.
Top photo: thyssenkrupp Materials Services receives around 14 million order items annually. With alfred, these can be efficiently processed and analyzed. (All photos courtesy of thyssenkrupp Materials Services)
Alexandr Epaneshnikov, a 19-year-old Russian student who is legally blind, recently decided he wanted to be more independent by commuting on his own and relying less on his mom for rides to school. It meant taking a streetcar to a subway to his high school in Moscow, a 30-minute trip that Epaneshnikov assuredly navigates with a cane and Moovit, an urban mobility app optimized for screen readers.
“I am very happy that Moovit is accessible and offers a good amount of information about Moscow public transportation,” says Epaneshnikov, who wants to study information technology at a university. The app has helped him meet friends at cafes and restaurants, and take a train to an unfamiliar city outside Moscow to visit his girlfriend’s family.
“I feel it adds more confidence and independence,” he says.
Launched seven years ago in Israel, Moovit has become the world’s most popular transit-planning and navigation app, with more than 400 million users and service in 2,700 cities across 90 countries. The company is also a leader in inclusive technology, with innovative work that helps people across the disability spectrum use buses, trains, subways, ride-hailing services and other modes of public transit.
In addition to offering a consumer app in 45 languages, Moovit has partnered with Microsoft to provide its multi-modal transit data to developers who use Azure Maps, and a set of mobility-as-a-service solutions to cities, governments and organizations. The partnership will enable the creation of more inclusive, smart cities and more accessible transit apps.
“Our mission is to simplify urban mobility and make it accessible, because mobility is really a basic human right,” says Yovav Meydad, Moovit chief growth and marketing officer. “Efficient mobility opens a lot of opportunities for employment, education and a better life, and we want to help all users make their journey as easy as possible.”
For Moovit, the work means not only helping rural residents reach cities for work and school, but also helping people with any disability travel. Of the hundreds of emails sent to Moovit each day, those from people with low vision offer some of the most profound feedback.
“Sometimes, it’s very emotional,” says Meydad. “They say, ‘Thanks to Moovit, I’m more independent. I can now leave home on my own.’ It’s very, very important for us to make Moovit accessible for everyone.”
The company’s accessibility work began in earnest in 2015, when Meydad and other leading app developers met a focus group of people who are blind or low-vision to see how they used their apps.
“Honestly, I was shocked,” says Meydad, who wrote about the experience twice on Medium. “I saw people trying to use our product who couldn’t do it efficiently, or at all, because screens were not properly labeled or meaningful [for screen readers].” In one case, Moovit’s search button – a major feature to start a trip plan – had the unhelpful audio label of “Button 56.”
Meydad took notes and promised big changes. He worked with Moovit’s team and a developer who is blind to optimize the app for the mobile screen readers TalkBack on Android and VoiceOver on iOS. The team scrutinized every screen for accessibility, added useful labels and condensed intricate data – routes, trip duration, start and end times, entry and exit stops – into clear sentences for audio. They incorporated feedback from users around the world with low vision.
“After one quarter, we released a major version upgrade that completely changed their experience,” says Meydad.
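The “clear sentences for audio” technique Meydad describes can be sketched in a few lines: take structured trip data and flatten it into one sentence a screen reader can speak. This is a hypothetical illustration – the field names, stop names and sentence template below are invented, not Moovit’s actual schema or code:

```python
def describe_route(route):
    """Condense structured trip data into one clear sentence for a screen reader.

    `route` is a hypothetical dict; its fields are illustrative only.
    """
    # Join the legs into a single spoken phrase: "X from A to B then Y from C to D".
    legs = " then ".join(
        f"{leg['line']} from {leg['board_stop']} to {leg['exit_stop']}"
        for leg in route["legs"]
    )
    return (
        f"Take {legs}. "
        f"Departs {route['start_time']}, arrives {route['end_time']}, "
        f"about {route['duration_min']} minutes total."
    )

# Invented two-leg trip resembling the streetcar-to-subway commute in the story.
trip = {
    "legs": [
        {"line": "tram 17", "board_stop": "Ostankino", "exit_stop": "VDNKh"},
        {"line": "metro line 6", "board_stop": "VDNKh", "exit_stop": "Turgenevskaya"},
    ],
    "start_time": "8:05",
    "end_time": "8:35",
    "duration_min": 30,
}
print(describe_route(trip))
```

The point of the design is that a screen-reader user hears one coherent sentence instead of a grid of disconnected labels.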
The accessibility work didn’t stop there. To ease public transit for people who use a wheelchair, Moovit asked its “Mooviters” – 550,000 local contributors who help map transit systems for the app – to identify wheelchair-accessible stations in their cities. That enabled the company to add a feature that shows only routes with stations with ramps and elevators.
“This means the entire journey can be fully accessible,” says Meydad.
For users with hand motor disabilities, Moovit redesigned menus and buttons for easier one-handed use, especially on larger phones. For people who are colorblind and rely on color-coded transit systems, such as “the green line,” Moovit displays the name of the line instead of only the colored dot or symbol that many maps use to save space.
The company also ensures no broken or overlapped text when a user needs to magnify the font. It partnered with Be My Eyes, an app that connects sighted volunteers with people who are blind or low-vision. It’s studying how to use a phone’s vibration and flashlight to serve users with hearing loss. And it continually works with people with a disability to improve or customize the app.
For Microsoft, working with Moovit – which has developed accessibility features such as screen-reader support and global data on wheelchair-friendly routes – is part of a deep commitment to accessibility and inclusion in its products and services. Developers who use Azure Maps will soon have access to Moovit’s trip planner and rich transit data to help build innovative, accessible tools.
“What I love most about Moovit is how they’re empowering other companies to build inclusion into their solutions,” says Megan Lawrence, senior accessibility evangelist at Microsoft. “Our partnership can help people across the disability spectrum use technology to move more freely and independently, a key metric for improving quality of life.”
The clarity of Moovit’s live audio navigation also helps people with an intellectual disability who want extra guidance, such as alerts for when a bus is coming, when to transfer and when to get off. The features are a main reason why Community Living Toronto, an organization that supports people with an intellectual or developmental disability, chose Moovit as the platform for their branded transit app, Discover My Route.
“We tested many apps and Moovit was the full package,” says Angela Bradley, director of resource development and marketing at Community Living Toronto.
“It’s not just an app for riding transit. It’s almost like a coaching tool. It gives people the confidence to take transit and open up their world, which can mean seeing friends, getting a job, going to college or joining a dance class.”
Top photo: Alexandr Epaneshnikov in Moscow. (Photo courtesy of Epaneshnikov)
With that mindset in place, Hasty sets out to bridge the gap between dreams and reality.
“Not long ago, check-in was about a 30-minute process to get you on board a cruise line,” Hasty explains, touting a recent breakthrough. “It really felt like going through the TSA, not starting a vacation.”
To streamline things, Hasty and his team envisioned an “invisible experience,” fueled by customer-submitted cellphone selfies and pre-checks. But as they workshopped the idea, they discovered that removing all friction just resulted in confusion and guilt.
“It’s that feeling you get at a store with no registers; you’re supposed to just walk out, but you feel a little shady about it,” he explains. “That’s even more amplified when you board a cruise ship. You’re like, ‘Am I just supposed to walk on?’ As it turns out, guests need the feedback; something saying all is well.”
In search of a solution, Hasty turned to some old friends. “When I took it to Microsoft, some of the first conversations were: ‘We’ve never tried to do it this way. Let’s think with our hands to see what’s possible,’” he recalls. “It was never ‘We can’t get there,’ it was always ‘Well, this is what would be required to get there, so this is what we’ve got to go do.’”
In the years since Hasty has begun working with the Commercial Software Engineering (CSE) team, the relationship has empowered him to imagine with no limitations – confident in the knowledge that CSE will bridge the gap between his team and Microsoft’s product engineers, accelerating their capabilities through tech and innovation.
“They provided resources to help us validate things Royal didn’t have access to, like cognitive services and cloud computing capability that allowed us to recognize faces in a millisecond,” Hasty explains, pointing out RC’s privacy policies that ensure customer transparency and that captured images are only used for the cruise experience.
“The first prototype was a camera and a laptop with Cognitive Services. Can we see these people, connect them to our data; can we check them in just using their face? From there, we realized that we want to talk about the guest experience, the height of the camera, the quality of the camera that we need, the flow of people, how fast can the camera pick up people, how many people can be in the frame at once.”
Utilizing an open source mindset, the CSE team worked alongside Royal Caribbean every step of the way to develop a solution that was both “invisible” and interactive enough to remove that sense of guilt.
“We noticed that when all these people went by, they didn’t quite know where to look, so we put a light ring on it. Then we realized people needed feedback, so we put a screen on it,” Hasty says of the project, which takes a split second per passenger – significantly faster than the manual review process.
“The LED ring gives you simple color codes – white says ‘We’re looking for your face,’ blue says, ‘We found your face,’ and green says, ‘You’re all clear.’ It happens almost instantaneously, everyone understands it instantly, and we’ve created a beautiful appliance that you can walk through with your whole family together at once.”
“Now boarding is literally, go up the escalator, walk by the facial recognition machine and onto the ship – welcome aboard,” Hasty adds with a smile. “We like to say car to bar in minutes.”
Echoes Schneider: “Joey and I are huge fans of the CSE group. Our focus right now is on how we leverage emerging technology to transform the guest experience, and Microsoft keeps us on the next edge of technology as it relates to disruption in our industry.”
To date, the relationship has manifested itself in a variety of ways. The aforementioned Edge Access tour app, for instance, is powered by Microsoft’s Capture Studio technology. Elements of Azure, AI and dashboards manage guest experiences daily, and RC attended last summer’s One Week hackathon on the Microsoft campus in Redmond, Washington, engaging with the CSE team on video analytics initiatives.
Walk into a Starbucks store anywhere in the world and you’ll encounter a similar sight: coffee beans grinding, espresso shots being pulled and customers talking to baristas while their coffee order is hand-crafted.
The process may look like a simple everyday scene, but it is carefully orchestrated to serve Starbucks’ more than 100 million weekly customers. With the help of Microsoft, Starbucks is creating an even more personal, seamless customer experience in its stores by implementing advanced technologies, ranging from cloud computing to blockchain.
“We have a world-class team of technologists engaging in groundbreaking innovation each day. Their inventiveness and intellectual curiosity are matched by their dedication to enabling the Starbucks experience, and this is increasingly critical to how technology has to show up for us,” says Gerri Martin-Flickinger, Starbucks executive vice president and chief technology officer.
“Everything we do in technology is centered around the customer connection in the store, the human connection, one person, one cup, one neighborhood at a time.”
At the Microsoft Build 2019 conference, Microsoft CEO Satya Nadella demonstrated how Starbucks delivers its signature customer experience with new technologies.
Making recommendations more relevant with reinforcement learning
Starbucks has been using reinforcement learning technology — a type of machine learning in which a system learns to make decisions in complex, unpredictable environments based upon external feedback — to provide a more personalized experience for customers who use the Starbucks® mobile app.
Within the app, customers receive tailor-made order suggestions generated via a reinforcement learning platform that is built and hosted in Microsoft Azure. Through this technology and the work of Starbucks data scientists, 16 million active Starbucks® Rewards members now receive thoughtful recommendations from the app for food and drinks based on local store inventory, popular selections, weather, time of day, community preferences and previous orders.
“Just like their relationship with a barista, customers receive the same care and personalized recommendations when it comes from our digital platforms,” says Jon Francis, senior vice president, Starbucks Analytics and Market Research.
This personalization means that customers are more likely to get suggestions for items they will enjoy. For example, if a customer consistently orders dairy-free beverages, the platform can infer a non-dairy preference, steer clear of recommending items containing dairy, and suggest dairy-free food and drinks.
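The dairy-free example above boils down to two steps: infer a preference from order history, then filter candidate suggestions by it. The real platform uses reinforcement learning built and hosted in Azure; the sketch below shows only this simpler preference-inference step, with an invented menu, field names and threshold:

```python
from collections import Counter

# Illustrative item catalog: name -> set of attributes (not Starbucks data).
MENU = {
    "oat milk latte": {"dairy_free"},
    "almond croissant": {"dairy_free"},
    "caffe latte": {"dairy"},
    "cheese danish": {"dairy"},
}

def infer_dairy_free(order_history, threshold=0.8):
    """Infer a non-dairy preference when most past orders were dairy-free."""
    if not order_history:
        return False
    dairy_free = sum("dairy_free" in MENU[item] for item in order_history)
    return dairy_free / len(order_history) >= threshold

def recommend(order_history, popular_items):
    """Filter popular items by the inferred preference, most-ordered first."""
    if infer_dairy_free(order_history):
        popular_items = [i for i in popular_items if "dairy_free" in MENU[i]]
    already = Counter(order_history)
    return sorted(popular_items, key=lambda i: -already[i])

history = ["oat milk latte", "oat milk latte", "almond croissant", "oat milk latte"]
print(recommend(history, ["caffe latte", "almond croissant", "cheese danish"]))
# -> ['almond croissant']  (dairy items are filtered out)
```

A reinforcement learning system goes further than this static rule: it keeps adjusting which items to suggest based on whether customers actually accept them.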
In essence, reinforcement learning allows the app to get to know each customer better. And while the recommendations are driven by a machine, the end goal is personal interaction.
“Starbucks is an experience,” says Martin-Flickinger. “And it’s centered around that customer connection in the store, the human connection, one person, one cup, one neighborhood at a time. I think that mission is so critical to how technology has to show up for us.”
Now, Starbucks is looking to expand this technology to the drive-thru experience.
“As an engineering and technology organization, one of the areas we are incredibly excited to be pursuing is using data to continuously improve the experience for our customers and partners,” says Martin-Flickinger. “Using data for personalization is vital to our mobile app, and now we are leveraging data to improve our drive-thru experience.”
Because the technology does not have the individual order histories for drive-thru customers that are available for mobile app customers, it will generate relevant drive-thru recommendations based on store transaction histories and more than 400 other store-level criteria. These recommendations will be offered proactively on a digital menu display from which customers can order. Eventually, customers will be able to explicitly opt in to recommendations that are even more personalized.
Starbucks is currently testing this technology in its Tryer Center innovation hub in Seattle, with plans to roll it out soon. And according to Francis, reinforcement learning will continue to have an important role at Starbucks in many other applications going forward.
“We’re meeting our customers where they are — whether in-store, in their car or on the go through the app — using machine learning and artificial intelligence to understand and anticipate their personal preferences,” he says. “Machine learning also plays a role in how we think about store design, engage with our partners, optimize inventory and create barista schedules. This capability will eventually touch all facets of how we run our business.”
Implementing IoT to deliver a smooth coffee experience
Each Starbucks store has more than a dozen pieces of equipment, from coffee machines to grinders and blenders, that must be operational around 16 hours a day. A glitch in any of those devices can mean service calls that rack up repair costs. More significantly, equipment problems can potentially interfere with Starbucks’ primary goal of providing a consistently high-quality customer experience.
“Any time we can create additional moments of connection between our partners and customers, we want to explore and activate,” says Natarajan “Venkat” Venkatakrishnan, vice president of global equipment for Starbucks. “Our machines are what allow our partners to create that special beverage, and ensuring they are working properly is critical.”
To reduce disruptions to that experience and securely connect its devices in the cloud, Starbucks is partnering with Microsoft to deploy Azure Sphere, designed to secure the coming wave of connected internet of things (IoT) devices across its store equipment.
The IoT-enabled machines collect more than a dozen data points for every shot of espresso pulled, from the type of beans used to the coffee’s temperature and water quality, generating more than 5 megabytes of data in an eight-hour shift. Microsoft worked with Starbucks to develop an external device called a guardian module to connect the company’s various pieces of equipment to Azure Sphere in order to securely aggregate data and proactively identify problems with the machines.
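In the same spirit, proactively flagging a problem shot from telemetry can be sketched as a range check over the collected data points. The field names and thresholds below are illustrative assumptions, not the actual guardian module or Azure Sphere implementation:

```python
# Illustrative expected operating windows per telemetry field (invented values).
EXPECTED = {
    "water_temp_c": (90.0, 96.0),   # brewing temperature window
    "shot_time_s": (23.0, 30.0),    # extraction time window
    "pressure_bar": (8.0, 10.0),    # pump pressure window
}

def check_shot(reading):
    """Return the telemetry fields that are missing or outside their window."""
    problems = {}
    for field, (lo, hi) in EXPECTED.items():
        value = reading.get(field)
        if value is None or not (lo <= value <= hi):
            problems[field] = value
    return problems

shot = {"water_temp_c": 88.2, "shot_time_s": 26.0, "pressure_bar": 9.1}
print(check_shot(shot))  # flags the low water temperature
```

Aggregated in the cloud, checks like this are what turn per-shot data points into an early warning before a machine fails outright.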
The solution will also enable Starbucks to send new coffee recipes directly to machines, which it has previously done by manually delivering the recipes to stores via thumb drive multiple times a year. Now the recipes can be delivered securely from the cloud to Azure Sphere-enabled devices at the click of a button.
“Think about the complexity — we have to get to 30,000 stores in nearly 80 markets to update those recipes,” says Jeff Wile, senior vice president of retail and core technology services for Starbucks Technology. “That recipe push is a huge part of the cost savings and the justification for doing this.”
The overarching goal with Azure Sphere, Wile says, is to shift from reactive maintenance to a predictive approach that heads off issues before they happen. Longer term, the company envisions leveraging Azure Sphere for additional uses such as managing inventory and ordering supplies, and will encourage suppliers of its devices to build the solution into future versions of their products.
Using blockchain to share coffee’s journey with customers
Starbucks is also innovating ways to trace the journey that its coffee makes from farm to cup — and to connect the people who drink it with the people who grow it.
The company is developing a feature for its mobile app that shows customers information about where their packaged coffee comes from, from where it was grown and what Starbucks is doing to support farmers in those locations, to where and when it was roasted, tasting notes and more.
For Starbucks, which has long been committed to ethical sourcing, knowing where its coffee comes from is not new. Last year alone, Starbucks worked with more than 380,000 coffee farms. However, digital, real-time traceability will allow customers to know more about their coffee beans. Perhaps even more important and differentiating is the potential for coffee farmers to know where their beans go after they sell them.
This new transparency is powered by Microsoft’s Azure Blockchain Service, which allows supply chain participants to trace both the movement of their coffee and its transformation from bean to final bag. Each state change is recorded to a shared, immutable ledger providing all parties a more complete view of their products’ journey.
This can not only empower farmers with more information and visibility once the beans leave their farms, but also allows customers to see the impact their coffee purchase has on the real people they’re supporting.
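The core idea of a shared, immutable ledger – each recorded state change referencing the record before it, so history cannot be quietly rewritten – can be sketched with a simple hash chain. Azure Blockchain Service is a managed, multi-party system and works quite differently; everything below, including the farm name, is invented for illustration:

```python
import hashlib
import json

def append_state(ledger, state_change):
    """Append a state change whose hash covers both the event and the previous hash."""
    prev_hash = ledger[-1]["hash"] if ledger else "0" * 64
    record = {"state": state_change, "prev": prev_hash}
    record["hash"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()
    ).hexdigest()
    ledger.append(record)
    return ledger

ledger = []
for event in ["harvested: Finca X, Costa Rica", "milled", "exported", "roasted: Seattle"]:
    append_state(ledger, event)

# Every record's "prev" field must match the hash of the record before it;
# altering an earlier entry would break every later link in the chain.
assert all(ledger[i]["prev"] == ledger[i - 1]["hash"] for i in range(1, len(ledger)))
```

The chain is what makes each participant’s view trustworthy: no one can edit the bean’s recorded journey without the tampering being detectable.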
“While high-quality, handcrafted beverages are so important, it’s the stories, the people, the connections, the humanity behind that coffee that inspires everything we do,” says Michelle Burns, Starbucks senior vice president of Global Coffee & Tea. “This kind of transparency offers customers the chance to see that the coffee they enjoy from us is the result of many people caring deeply.”
Starbucks previewed digital traceability for shareholders at its annual meeting in March. Eventually, customers will be able to use the Starbucks mobile app to trace the journey of their Starbucks packaged coffee.
“What we’re still working on is interviewing coffee farmers in Costa Rica, Colombia and Rwanda, learning more about their stories, their knowledge and their needs in order to determine how digital traceability can best benefit them,” says Burns. “We’re forging new ground here, so we’re excited to report more in the coming months.”
Top photo: At the Starbucks store at 81st and Broadway in New York City, and at every store around the world, cutting-edge innovation powers a deceptively simple everyday scene. All photos courtesy of Starbucks. Additional reporting by Deborah Bach.
Toyota Material Handling Group is the largest forklift manufacturer in the world, but its customers require much more than warehouse trucks and equipment. To better serve them, the global business is expanding and enriching its logistics solutions with digital innovation and Toyota’s renowned principles in lean and efficient manufacturing.
By providing solutions with artificial intelligence, mixed reality and the Internet of Things (IoT), Toyota Material Handling Group is helping customers meet the global rise in e-commerce and move goods quickly, frequently, accurately and safely.
With Microsoft technologies, the solutions range from connected forklift and field service systems available today to AI-powered concepts that pave the way for intelligent automation and logistics simulation – all designed with Toyota’s standards for optimizing efficiency, operation assistance and kaizen, or continuous improvement.
“Our direction is going to more systemizing and logistics solutions, services in digital automation, AI analytics and IoT,” says Toshihide Itoh, associate director and CIO of Toyota Material Handling Group, an Aichi, Japan-based division of Toyota Industries Corporation. “We also continue to improve our forklift trucks, because this is our origin. But customers need more and more efficient logistics and we need digital innovation to accelerate and expand our business.”
At the Hannover Messe show in Germany this week, Toyota presented its vision for a future warehouse with lean logistics and pre-trained, intelligent forklifts. Enabled with machine learning and IoT services in Microsoft Azure, the vehicles can quickly learn navigation in a virtual model of a customer’s warehouse, a so-called “digital twin.” Customers can experience the trucks interacting with their physical and virtual environment.
The ability to simulate and visualize a physical environment will help solve one of the biggest challenges in the industry: the long deployment time for customized IoT solutions. Installations can normally take six months to a year, but using machine learning and digital twins can significantly shorten the time.
“For customers, this is a very important benefit,” Itoh says of the project, created by Toyota Material Handling Europe.
Once deployed, the intelligent forklifts and other automated guided vehicles (AGV) can adapt to live conditions, continually improve performance and communicate with other machines in a “swarm” that sends the right trucks to the right tasks at the right time.
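The “right trucks to the right tasks” dispatching idea can be illustrated with a toy greedy assignment. Real AGV swarm coordination accounts for routes, congestion, battery state and timing; the one-dimensional positions and names here are invented:

```python
def dispatch(trucks, tasks):
    """Greedily assign each task to the closest available truck (1-D positions)."""
    assignments = {}
    free = dict(trucks)  # truck id -> current position
    for task_id, task_pos in tasks:
        if not free:
            break  # more tasks than idle trucks
        best = min(free, key=lambda t: abs(free[t] - task_pos))
        assignments[task_id] = best
        del free[best]  # truck is now busy
    return assignments

trucks = {"agv1": 0.0, "agv2": 50.0, "agv3": 120.0}
tasks = [("pick-A", 45.0), ("pick-B", 5.0)]
print(dispatch(trucks, tasks))  # -> {'pick-A': 'agv2', 'pick-B': 'agv1'}
```

In a digital twin, a policy like this can be stress-tested against thousands of simulated warehouse days before any physical truck moves.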
Machine learning also plays a big role in a factory innovation project that Toyota Material Handling North America is working on with Microsoft. Engineers with both companies are building AI algorithms with sound to evaluate and verify welding quality, an important part of building forklifts.
Earlier this year, the teams worked with welders in Toyota’s Indiana factory and recorded sound from the factory floor. Then they created a machine learning platform to drive product quality, customer satisfaction and better training opportunities for new employees.
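Conceptually, sound-based quality checking reduces each recording to audio features and compares them against labeled examples. The sketch below uses a single loudness (RMS) feature and nearest-centroid matching purely for illustration – the actual platform’s audio features, labels and models are not public, and all values here are fabricated:

```python
import math

def rms(samples):
    """Root-mean-square loudness of a waveform snippet."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def classify(sample_rms, centroids):
    """Label a weld by the nearest class centroid in feature space."""
    return min(centroids, key=lambda label: abs(centroids[label] - sample_rms))

# Pretend training summary: average RMS loudness per labeled weld class.
centroids = {"good_weld": 0.30, "porous_weld": 0.55}
recording = [0.28, -0.31, 0.33, -0.27]  # fabricated waveform snippet
print(classify(rms(recording), centroids))  # -> good_weld
```

A production system would use far richer features (spectra over time, not one number) and a trained model, but the pipeline shape – record, featurize, classify – is the same.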
“We use machine learning and AI to do things that people cannot do by themselves, like analyze big data quickly,” Itoh says. “AI analysis can lead to new solutions and give open time for people to utilize their brain. That’s very important. So they share some work with AI and it makes everyone and everything more productive.”
Itoh became CIO of Toyota Material Handling Group in 2017, after leadership roles in the group’s divisions for logistics systems, forklift research and development, and advanced system solutions. The work helped him understand different aspects of the industry, from equipment to logistics to customers.
“The important point of view is the customers’ point of view,” he says. “Information technology is my main responsibility, but it needs to help customers and our business team improve their operations.”
He says Azure’s global scale and services have helped the company deliver valuable solutions, including a fleet management system that lets customers centrally monitor their forklift fleets. Powered by Azure IoT Edge and scheduled to deploy this spring, the telematics solution helps customers track forklift utilization, plan and predict maintenance, and improve efficiency and safety in their warehouses.
The company is also developing a fully integrated dealer management system in North America with the Microsoft platform. And Toyota Material Handling Europe and North America are collaborating on a robust customer portal, powered by many data sources, to meet customer expectations in a digital world.
“We need to expand our business and see the customers’ point of view more and make them happy,” Itoh says. “But we cannot do everything by ourselves. So I am so excited for everything that Microsoft can provide as a partner to accelerate our digital transformation.”