
How AI for Earth is inspiring new generations to help the planet

Woodland Park Zoo, Seattle: streamlining camera traps

Manoj Sarathy, a high school student, is working with Dr. Robert Long of Seattle’s Woodland Park Zoo to develop an AI tool that processes data from wildlife camera traps to monitor species in the Pacific Northwest and in the Rocky Mountains. Using machine learning, this resource will be able to identify and index a wide variety of animals in a much shorter timeframe than if the data were processed manually. Camera traps can generate large amounts of data, some of which are false positives – for example, when something triggers the camera but nothing meaningful is photographed. This AI tool examines each image and discards those that don’t require more detailed inspection.
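To make the filtering step concrete, here is a minimal sketch of the general approach, not the zoo’s actual tool: it assumes a generic COCO-pretrained object detector from torchvision and a hypothetical folder of camera-trap images, and simply flags frames with no confident animal detections as likely false triggers.

```python
# Minimal sketch: flag camera-trap images with no confident animal detections
# so only promising frames go on for detailed review.
# Assumes a generic COCO-pretrained detector, NOT the Woodland Park Zoo tool.
from pathlib import Path

import torch
import torchvision
from torchvision.io import read_image
from torchvision.transforms.functional import convert_image_dtype

# torchvision COCO label ids for animal classes (bird .. giraffe)
ANIMAL_CLASSES = {16, 17, 18, 19, 20, 21, 22, 23, 24, 25}
CONFIDENCE = 0.5

model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

def has_animal(image_path: Path) -> bool:
    """Return True if the detector finds at least one confident animal."""
    img = convert_image_dtype(read_image(str(image_path)), torch.float)
    with torch.no_grad():
        pred = model([img])[0]
    is_animal = torch.tensor(
        [int(label) in ANIMAL_CLASSES for label in pred["labels"]],
        dtype=torch.bool,
    )
    keep = (pred["scores"] > CONFIDENCE) & is_animal
    return bool(keep.any())

if __name__ == "__main__":
    # "camera_trap_images" is a hypothetical folder of JPEG frames
    for path in Path("camera_trap_images").glob("*.jpg"):
        status = "review" if has_animal(path) else "skip (likely false trigger)"
        print(f"{path.name}: {status}")
```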

[Subscribe to Microsoft on the Issues for more on the topics that matter most.]

Cornell University, Ithaca, New York: listening to the sound of the rainforest

Dr. Laurel Symes and Dr. Holger Klinck are leading a project that uses insect sounds to monitor rainforests. Working out of the Cornell Bioacoustics Research Program in Ithaca, New York, their team of researchers uses AI to monitor insect sounds to develop a better understanding of the dynamics of these habitats. Tropical rainforests are home to great swathes of biodiversity, in part because they are inaccessible to humans. The combination of plant density and high canopies makes these rainforests a challenging environment for traditional study techniques. This team’s first focus is on neotropical rainforest katydids, or bush crickets. The research will then move beyond insects to birds, monkeys and other vocal animals.

ETH Zurich: making the rainforest everyone’s joint concern

David Dao is a doctoral candidate at ETH Zurich, the Swiss Federal Institute of Technology. He is building AI and data systems to help fight deforestation. Historically, figuring out which body is responsible for looking after a section of forest has been a challenge. Dao and his team came up with an innovative idea: Make everyone a caretaker of the forest. This system allows people to make a financial investment in the forest’s wellbeing, with the hope of gaining a return – giving everyone an incentive to help fight deforestation.

University of Alberta in Canada: grizzly bears and the right to roam

Clayton Lamb is a researcher at the University of Alberta in Canada, specializing in grizzly bears. Lamb is particularly interested in the scope of the bears’ territory – or lack of it – as grizzlies only occupy a fraction of the land they formerly roamed. Much of the remaining population also faces competition for land use from humans. Lamb’s work uses AI and machine-learning tools to create a comprehensive analysis of the human and environmental factors that could limit grizzly bear density.

iNaturalist: Using AI for species identification

iNaturalist is a social media platform that helps people identify the plants, animals and insects they see anywhere in the world. With badges and features that offer a gaming element, users capture biodiversity data and share their findings. iNaturalist’s Seek app was created with families in mind, with features that mask location data and other potentially sensitive information about children using the app. There are an estimated 10 million species in the world, but only 1.8 million have been discovered and just 90,000 analyzed. So there’s plenty of scope for getting involved.

Colorado State University: Inspiration from above

The Community Collaborative Rain, Hail and Snow Network, CoCoRaHS for short, works with thousands of trained volunteer observers, including children, to gather data on rain, snow and hail. Using tools like plastic rain gauges, citizen scientists measure rain or snowfall each morning and submit their readings to the CoCoRaHS database. Making this data available to weather forecasting services is helping improve severe weather models, making it easier to issue accurate and timely alerts when needed.

City University of New York: Looking to the future 


Just off the Pacific coast of North America lies the Ocean Observatories Initiative (OOI) Regional Cabled Array. It sits atop the Juan de Fuca tectonic plate, collecting oceanographic data. Dr. Dax Soule at City University of New York uses that data to engage students from a variety of backgrounds. One such study concerns the Axial Seamount, which combines a volcanic hotspot with a mid-ocean ridge. Using cloud-based tools to access this data and then conduct important research, Soule is helping to inspire the next generation of oceanographic scientists.

For more on AI for Earth, visit Microsoft AI for Earth. And follow @MSFTIssues on Twitter.  


18 million new building footprints in Africa will help rescuers respond to natural disasters

In the last ten years, 2 billion people were affected by disasters, according to the World Disasters Report 2018. In 2017, 201 million people needed humanitarian assistance and 18 million were displaced due to weather-related disasters. Many of these disaster-prone areas are literally “missing” from the map, making it harder for first responders to prepare and deliver relief efforts.

Since the inception of Tasking Manager, the Humanitarian OpenStreetMap Team (HOT) community has mapped at an incredible rate, with 11 million square kilometers mapped in Africa alone. However, large parts of Africa with populations prone to disasters remain unmapped — about 60% of the continent’s 30 million square kilometers.

Under Microsoft’s AI for Humanitarian Action program, Bing Maps together with Microsoft Philanthropies is partnering with HOT on an initiative to bring AI Assistance as a resource in open map building. The initiative focuses on incorporating design updates, integrating machine learning, and bringing new open building datasets into Tasking Manager.

The Bing Maps team has been harnessing the power of computer vision to identify map features at scale. Building on its work in the United States and Canada, Bing Maps is now releasing country-wide open building footprint datasets for Uganda and Tanzania. These will be among the first open building datasets in Africa and will be available for use within OpenStreetMap (OSM).

In Tasking Manager specifically, the dataset will be used to help with task creation, with the goal of improving task completion rates. Tasking Manager relies on ML Enabler to connect with building datasets through an API. This API-based integration makes it convenient to access not just the Africa building footprints, but all open building footprint datasets from Bing Maps through ML Enabler, and thus the OpenStreetMap ecosystem.

“Machine learning datasets for OSM need to be open. We need to go beyond identifying roads and buildings and open datasets allow us to experiment and uncover new opportunities. Open Building Dataset gives us the ability to not only explore quality and validation aspects, but also advance how ML data assists mapping.”
– Tyler Radford (Executive Director, Humanitarian OpenStreetMap Team)

Africa presented several challenges: stark differences in landscape from the United States or Canada, unique settlements such as tukuls, dense urban areas with connected structures, imagery quality and vintage, and a lack of training data in rural areas. The team identified areas with poor recall by leveraging population estimates from CIESIN. Subsequent targeted labeling efforts across Bing Maps and HOT improved model recall, especially in rural areas. A two-step process of semantic segmentation followed by polygonization resulted in 18M building footprints — 7M in Uganda and 11M in Tanzania.
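The production pipeline itself isn’t published here, but a minimal sketch of the second step — polygonizing a model’s building-probability raster into vector footprints — might look like the following; the input file name, probability threshold, and simplification tolerance are illustrative assumptions, not the Bing Maps settings.

```python
# Minimal sketch of the polygonization step: turn a building-probability raster
# produced by a segmentation model into vector footprints.
# Not the Bing Maps production pipeline; file names, threshold, and
# simplification tolerance are illustrative assumptions.
import json

import numpy as np
import rasterio
from rasterio import features
from shapely.geometry import shape, mapping

THRESHOLD = 0.5          # probability above which a pixel counts as "building"
SIMPLIFY_TOL = 0.00001   # simplification tolerance in the raster's CRS units

with rasterio.open("building_probability.tif") as src:
    prob = src.read(1)
    transform = src.transform

mask = (prob > THRESHOLD).astype(np.uint8)

footprints = []
# features.shapes yields (geometry, value) pairs for connected pixel regions
for geom, value in features.shapes(mask, mask=mask.astype(bool), transform=transform):
    polygon = shape(geom).simplify(SIMPLIFY_TOL)
    if polygon.is_valid and not polygon.is_empty:
        footprints.append({"type": "Feature",
                           "geometry": mapping(polygon),
                           "properties": {}})

with open("footprints.geojson", "w") as f:
    json.dump({"type": "FeatureCollection", "features": footprints}, f)

print(f"Wrote {len(footprints)} candidate building footprints")
```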

Extractions in Musoma, Tanzania

Bing Maps is making this data open for download free of charge and usable for research, analysis and, of course, OSM. In OpenStreetMap, there are currently 14M building footprints in Uganda and Tanzania (as of our team’s last count). We are working to determine overlaps.

We will be making the data available for download on GitHub. The CNTK toolkit developed by Microsoft is open source and available on GitHub, as is the ResNet3 model. The Bing Maps computer vision team will be presenting the work on Uganda and Tanzania at the annual International State of the Map conference in Heidelberg, Germany, and at the HOT Summit.

– Bing Maps Team


Virtual reality project Microgravity Lab takes students to space

Virtual reality can transport us to new lands that are near, far, or imagined. As a team of Garage interns found while partnering with the Microsoft Hacking STEM and NASA STEM on Station teams, it can also demonstrate physics concepts and spark an interest in STEM careers. For the back-to-school season, we’re excited to announce the opportunity to try Microgravity Lab, a Microsoft Garage project. The VR experience for Windows Mixed Reality and corresponding lesson plan equip teachers with an engaging tool for teaching physics concepts by simulating microgravity. Interested educators can request an invite to try the VR application and corresponding lesson plans. Be sure to include your school name and your plan for using the application in the form.

Bringing space into the classroom via Windows Mixed Reality

The Garage Internship is a unique, startup-style program in which teams of interns build projects in response to challenges pitched by Microsoft engineering teams. When this Vancouver intern team heard that the Microsoft Education team was looking for a creative new way to illustrate the concept of microgravity through VR, they jumped at the opportunity to work on the project.

Microgravity Lab title screen, displaying five different experiences, settings, and other options.

An often-misunderstood concept, microgravity is difficult to simulate and understand in Earth’s gravity-laden environment. It is best explained through experiential learning. The Microgravity Lab experience for Windows Mixed Reality and its accompanying lessons give teachers the tools to bring this experiential learning to their students.

As NASA Education Specialist Matthew E. Wallace shared, “The concept of microgravity is often misunderstood by students who learn about astronauts on the International Space Station. Providing a virtual reality world for them to explore the phenomena of life on orbit is an excellent way to engage students and solidify their comprehension of concepts related to force, mass and gravitational acceleration.”

Sabrina Ng, Design Intern for the project noted, “When I think of microgravity, I think of it as something you feel, not what you see per se. Thinking about how to visualize and communicate such an abstract concept without stimulating the physical senses was a really cool challenge.”

Microgravity Lab joins a collection of eight middle school lesson plans developed in partnership with NASA to celebrate 20 years of humans living and working on the International Space Station.

Experiencing microgravity to understand Newton’s 2nd and 3rd Laws

Microgravity Lab is designed for grades 6-8. Students can explore three VR modules to understand these physics principles in the context of microgravity on the moon:

  • Conservation of momentum
  • Newton’s 2nd Law
  • Newton’s 3rd Law

The team worked closely with teachers to develop the project, testing early versions of Microgravity Lab with 7th and 8th grade classes. They refined and updated the experience based on the classroom feedback.

Implementing feedback from teachers and students, the interns added a feature to enable live microgravity data analysis via Excel. “This project gives students the experience and the fun aspects of VR, but with Excel, we found a way to expose them to Data Analysis. Data is a very important part of our world and this is a great way to introduce it to them,” shared Rébecca Vézina-Côté, the Program Manager Intern for Microgravity Lab.


Hacking STEM to engage students

Microgravity Lab joins the Hacking STEM portfolio, a collection created by teachers for teachers to offer hands-on, inquiry-driven, real-world lesson plans. The standards-aligned, interdisciplinary lesson plans teach 21st century technical skills in the context of existing curricula. The portfolio now includes 22 middle and high school lesson plans on topics ranging from circuits and robotic hands to learning how sharks swim, and now, microgravity.

“There are companies moving towards commercializing space travel and package delivery. A project like this might give students an idea of what life might be like on a space station, and hopefully inspire them to want to go further with it and see it as a future path for them as an area of interest or a future career,” shared Adrian Pang, a Software Engineer Intern with the project.

The Microgravity Lab experience makes science more engaging and introduces these concepts to students in a way that inspires lifelong learning and passionate curiosity about the world around them.

The impact of VR in the classroom

Microgravity Lab team photo

The Microsoft Education team has provided materials to enable a seamless introduction of VR to the classroom. When immersive technologies are deployed correctly and in a pedagogically consistent manner, they have the potential to support and expand curriculum, enhancing learning outcomes in ways that haven’t been previously affordable or scalable. Read more in this white paper detailing the impact of VR in the classroom.

Based on their own experience learning VR and Windows Mixed Reality, Garage interns have suggestions on how teachers can get started with VR. “Windows Mixed Reality does a great job of walking users through setting up the headset, then it’s just finding the app on the Microsoft Store, downloading it and installing it,” shared Rébecca. Crystal Song, another Software Engineering Intern, continues, “I’d encourage teachers and school administrators to not see the tech as just a toy, but something that can teach. VR has a unique ability to teach through discovery, so allowing space and time for students to explore is key.”

James Burke, a longtime Hacking STEM developer partner who worked with the interns to test the project, encourages fellow educators to think outside the box to engage and challenge students. “Kids can do a lot more than people give them credit for.” In Burke’s engineering lab at Tyee Middle School, students work on project-based learning modules that can resemble college-level multidisciplinary assignments. With future-ready equipment and real-world projects to tackle, his award-winning classroom engages with students at every level. VR is just another way to spark that passion in students.

Request an invitation to try the project

To get started with Microgravity Lab for your classroom, request an invite to try the VR application. Include your school name and your plan for using the application in the form.

More lesson plans and classroom materials are available at the Hacking STEM website.


Meet the Surface design team that built our first smart headphones


Vivian Nguyen

It’s 2 PM, and you need to finish a project by end of day. Coworkers in your open office space are chatting about the newest ramen spot. While you’d like to expound on the difference between Hokkaido- and Tokyo-style noodles, you need to focus. You put on your Surface Headphones, and Cortana greets you with a quick update:

“Hi, you’ve got about 8 hours of battery left. You’re connected to [your device name].”

As your favorite Odesza song plays, you start work immediately.

YOUR WORKPLACE ACCOMPANIMENT

Composed with the design trinity of audio quality, comfort, and seamless integration, the Surface Headphones help you create a personal space in the modern workplace and in your modern life. The idea is that, when you wear them, you escape into another world that lets you focus on you and what you need to get done. The Surface design team wanted to give you — the actual user — control over how plugged in (or not!) you want to be to your immediate environment. Check out the tech specs here.

And you can see that thoughtful approach in the hardware. Designing comfortable earmuffs was paramount because they’re the one part that touches you all the time. The team initially considered a traditional donut shape but, with inclusive design at the heart of everything Microsoft does, wanted to accommodate a diverse set of ear shapes. The earmuffs now fit ears of all shapes and sizes comfortably, with even pressure for a secure fit.

Tactile design wasn’t the only consideration. They set out to craft a device that’s both functional and beautiful. Creating a smooth seam on the earmuff, for example, was surprisingly difficult. See how the team wouldn’t take no for an answer in the video below:

See how the Surface design team wove together elegant hardware design, rich audio, and an intelligent assistant. Click here for the audio description version.

Every decision about the Surface Headphones keeps real people in mind — including the writing they don’t see or touch.

To create a holistic and seamless voice experience, Senior Writer Matt Lichtenberg, who focuses on hardware, and Senior Personality Lead for AI Chris O’Connor, who shapes the voice for intelligent experiences, fused their complementary skills. Because Cortana delivers the instructions, Matt and Chris needed to collaborate and bring together the what (instructions) and the how (Cortana).

“Words contribute to the whole experience,” said Matt, “and we wanted the headphones to be almost invisible to people while they’re wearing them. They shouldn’t have to think about them much.”

“I like to think of it as, we’re helping people achieve more with less energy,” said Chris. “How do they get the most out of this device with the least amount of effort? It’s the idea that design stays out of your way — it’s minimal and there to help you get stuff done.”

THINKING OUT OF THE BOX

From the outset, the design team wanted to understand how people naturally use headphones in a variety of vignettes. They developed a series of scenarios to answer key questions about how people interact with the headphones.

For instance, when customers initially turn on the headphones, would they want to pair and go? Or would they download the Cortana app first?

As it turns out, most want to pair and go.

When you turn on other Bluetooth devices for the first time, you’ll need to put the device in pairing mode. With the Surface Headphones, they’re immediately in pairing mode and Cortana greets you with, “Hello, you’re ready to pair.”

You connect your device, and Cortana confirms with, “You’re paired to [device name].”

“It’s a challenge to create a rich and enjoyable out-of-the-box experience,” said Chris. “If it’s boring and tedious, people blow right through it. But if it’s enjoyable and people understand the value, they’ll reach an optimal state before carrying on.”

Design is an iterative process, and we’re constantly listening to feedback. We’ve heard customers ask for more device control to turn settings on or off, including the “Hey, Cortana” voice activation, touch controls, and voice prompts. So, we delivered.

The latest firmware update, delivered through the Cortana app, lets you personalize your headphone settings, like reducing the number or duration of voice prompts. That means you can change your settings so a simple “Hello” plays when you initially turn on your headphones. The app gives you more control of your device, ensuring you get the best experience possible.

“It’s amazing how long it feels to say a few words, so you need to make them count,” said Matt.

Unlike computers, which require constant interaction, the Surface Headphones almost disappear into the background while you work, helping you focus while eliminating outside distractions. To help people achieve this, the voice writing team designed the voice prompts to avoid interruptions unless they’re critical, like letting you know when your battery is low.

“How do you thread the needle between being a voice prompt, a robot, and a conversational entity, but still get out of the way?” asked Chris. “This was one of the first areas where we had to practice design differently and pull back on personality to allow things to be shorter and faster.”

COMMUNICATING WITHOUT WORDS

Some interactions don’t even need words.

When the headphones are charging, for example, the LED light flashes. In this context, a visual cue is more intuitive. You don’t need to pick them up or put them on to know what’s happening.

In times when words feel unnatural, sound itself can communicate information. When you turn the left dial on the Surface Headphones forward, you hear a low-pitched beep to indicate maximum noise cancellation. Conversely, a high-pitched beep plays when you turn the dial in the opposite direction. This confirms the headphones are now amplifying ambient sound.

Inspired by the volume knobs of hi-fi stereos, which turn with a certain slowness, the hardware design team added headset dials to adjust volume, noise cancellation, or sound amplification. Rotating the dial is an intuitive motion that lets people choose the precise level of sound they want (or don’t want).

Our design anticipates different modes of communication contingent on how someone wants to use or interact with the headphones. But whether it’s audio or visual, each interaction remains succinct.

THE NEXT MOVEMENT IN VOICE DESIGN

The Surface Headphones are the first ambient device from Microsoft with an assistant. The Surface design team had a groundbreaking opportunity to radically reimagine headphones as more than just headphones.

In the past, people often confused or conflated digital assistants with voice control. But with increased investments in personality design and the future of interaction, Microsoft is experimenting with giving Cortana added dimension and awareness to help customers get the most out of a digital assistant.

“We decided to use the human metaphor for a digital assistant, because a real-life assistant isn’t just voice control. They don’t just take dictation. They understand what’s important to you, your family, your priorities, your goals,” explained Chris.

As we continue to infuse intelligence across our products and services, teams throughout the company are beginning to explore the potential for what a digital assistant could be.

“The headphones sparked a whole new area of thinking — one that we’re using to think through the same problem from other endpoints as we move on to work for the Office 365 apps,” said Chris.

And who knows? Maybe one day, when you slip on your Surface Headphones, Cortana can chime in with her favorite kind of ramen, too.


First reviews for new book ‘Tools and Weapons’ by Microsoft Pres. Brad Smith and Carol Ann Browne

“When your technology changes the world,” he writes, “you bear a responsibility to help address the world that you have helped create.” And governments, he writes, “need to move faster and start to catch up with the pace of technology.” 

In a lengthy interview, Mr. Smith talked about the lessons he had learned from Microsoft’s past battles and what he saw as the future of tech policymaking – arguing for closer cooperation between the tech sector and the government. It’s a theme echoed in the book, “Tools and Weapons: The Promise and the Peril of the Digital Age,” which he wrote with Carol Ann Browne, a member of Microsoft’s communications staff.

The New York Times, Sept. 8, 2019


In 2019, a book about tech’s present and future impact on humankind that was relentlessly upbeat would feel out of whack with reality. But Smith’s Microsoft experience allowed him to take a measured look at major issues and possible solutions, a task he says he relished.

“There are some people that are steeped in technology, but they may not be steeped in the world of politics or policy,” Smith told me in a recent conversation. “There are some people who are steeped in the world of politics and policy, but they may not be steeped in technology. And most people are not actually steeped in either. But these issues impact them. And increasingly they matter to them.”

Fast Company, Sept. 8, 2019


In ‘Tools & Weapons: The Promise and the Peril of the Digital Age,’ the longtime Microsoft executive and his co-author Carol Ann Browne tell the inside story of some of the biggest developments in tech and the world over the past decade – including Microsoft’s reaction to the Snowden revelations, its battle with Russian hackers in the lead up to the 2016 elections and its role in the ongoing debate over privacy and facial recognition technology.

The book goes behind-the-scenes at the Obama and Trump White Houses; explores the implications of the coming wave of artificial intelligence; and calls on tech giants and governments to step up and prepare for the ethical, legal and societal challenges of powerful new forms of technology yet to come.

GeekWire, Sept. 7, 2019


Tensions between the U.S. and China feature prominently in Smith’s new book, ‘Tools and Weapons: The Promise and the Peril of the Digital Age.’ While Huawei is its own case, Smith worries that broader and tighter strictures could soon follow. The Commerce Department is considering new restrictions on the export of emerging technologies on which Microsoft has placed big bets, including artificial intelligence and quantum computing. “You can’t be a global technology leader if you can’t bring your technology to the globe,” he says.

Bloomberg Businessweek, Sept. 7, 2019


Tell us what you think about the book @MSFTIssues. You can buy the book here or at bookstores around the world.


Bollywood, blockbusters and a $5B industry: How Eros Now is redefining online video

Being on the cutting edge of technology was baked into the DNA of Indian video company Eros Now from the start.

Its parent company, Eros International Plc., was founded in 1977, the same year the VHS videocassette format was released in North America. While some in the entertainment industry were leery about the newfangled technology, Eros was all in.

“Back then, it was a scary thing,” says Eros Digital CEO Rishika Lulla Singh. “But we embraced VHS and we continued to embrace new technology.”

That approach has served the company well. Eros International Plc., a movie distribution and production company, was India’s first VHS distributor and the first Indian media company listed on the New York Stock Exchange.

Eros Now, its on-demand video arm, was launched in 2012 and has now attracted over 18.8 million paid subscribers and 155 million users worldwide with its more than 12,000 Bollywood films, music videos and original content including series and short episodes.

Eros Now amassed its big audience largely by premiering blockbuster films and related content such as trailers and music videos on its site even before they were on YouTube, Lulla Singh says. The company aims to differentiate itself not just as a one-stop destination for online entertainment but also as a tech innovator — and a new collaboration with Microsoft is underpinning those efforts.

Eros Digital CEO Rishika Lulla Singh.

Eros Now is working with Microsoft to migrate the company’s operations to the Azure cloud platform to improve video and viewing experiences for consumers worldwide. Lulla Singh says Microsoft’s ability to innovate in cloud computing and artificial intelligence, its research in voice services and discovery, and the company’s capacity for handling big data were the primary reasons Eros Now wanted to collaborate with Microsoft to develop next-generation video technology.

“We feel that Azure can help us to drive a lot of our ambitions to create the correct architecture for the video platform,” she says. “It was the sheer sophistication of the product over everything else on the market.”

The collaboration, Microsoft’s first effort in India in streaming video, signifies a move into a thriving entertainment market. Streaming video is growing rapidly in India, where the market is projected to reach $5 billion by 2023, according to a study by The Boston Consulting Group.

“India is among the fastest-growing entertainment and media markets globally, with cutting-edge innovation in content creation, distribution and data insights,” says Anant Maheshwari, president of Microsoft India. “Our partnership with Eros Now is a significant milestone. Together, we hope to redefine the video viewing experience for consumers in India and across the globe.”

Lulla Singh has been working with teams across Microsoft and says she’s been struck by the company’s collaborative culture.

“Microsoft wants to enable other companies to be cool and to essentially realize your own ambition. The collaboration that’s come from that is incredible,” she says.

The rise in video streaming has shifted consumer expectations and the role of companies like Microsoft in the industry. Consumers are watching content on multiple devices, from smartphones to tablets, and media companies are facing increased competition to attract viewers, Maheshwari says. Those companies are looking to cloud providers for secure and scalable content delivery, he says, and for capabilities such as advanced search and smart content recommendations.

“What will differentiate video streaming services is the ability to give users exciting content to experience within the limited time and attention span they have,” he says. “AI and intelligent cloud tools will be the next drivers of the media business and will impact everything in the content value chain.”

Eros Now, Lulla Singh says, saw an opportunity to distinguish itself by creating original content that was a departure from the typical Indian television fare.

“The television landscape in India is very, very different to what happens in the U.S., where we have ‘Game of Thrones’ and a lot of sexy content,” she says. “That doesn’t really exist in the Indian television ecosystem.

“Most Indian programming is catered to more older audiences, which is actually not relevant to the new millennial audiences.”

Eros Now began developing its own original content around 2015, launching “Side Hero,” a Bollywood-inspired comedy series, last year, followed by “Smoke,” a drama about drug cartels. Last December, the company introduced “Eros Now Quickie,” a series of eight- to 10-minute episodes ranging from docudramas to comedy, and segments on food, health and travel.

Eros Now plans to continue innovating by personalizing content for customers by language, subtitles and other geographic-based preferences. Bollywood movies have been growing in popularity in regions outside India including China, Russia and Eastern Europe, Lulla Singh says. Eros Now, which currently has viewers in more than 135 countries, hopes to ride that momentum to expand into new markets and reach its goal of 50 million subscribers over the next three years.

“We just want to continue to create, continue to please our customers and grow in the process as well,” she says.

Top image: A sampling of Eros Now’s original content, which began premiering in 2018. (Images courtesy of Eros Now)


Oracle and Microsoft expand cloud partnership to boost workplace productivity

Building on Oracle and Microsoft’s cloud interoperability partnership, Oracle today announced the availability of an integration between Oracle Digital Assistant and Microsoft Teams. Enterprise customers can now access Oracle Cloud Applications through an AI-powered voice experience in Teams.

“Using Oracle Digital Assistant, business users can simply and conversationally interact with business applications directly from their Microsoft Teams interface just as they would collaborate with their fellow employees or other productivity tools,” said Suhas Uliyar, vice president, AI and Digital Assistant, Oracle. “Completing daily work tasks becomes much more efficient as the AI-trained conversational access doesn’t require additional employee training on different applications. This is yet another way we are enabling customers to run mission-critical enterprise workloads across Microsoft 365 and Oracle Cloud.”

Once Oracle Digital Assistant is enabled from the Teams App Store, users can query Oracle Cloud Applications, such as CX and HCM, through a bot conversation. Skills from Oracle Digital Assistant are auto provisioned and auto configured, tapping into the richness of the Teams experience. “Together, Oracle Digital Assistant and Teams enable our customers to transform existing workflows and save time,” said Bhrighu Sareen, general manager, Microsoft Teams Platform. “This integration reduces context change for people since it eliminates the need to switch between applications, enabling them to complete tasks like viewing a sales pipeline in Oracle CX without leaving Teams.”

Today, the integration of Oracle Digital Assistant and Teams provides customers with a frictionless work environment that boosts productivity and speeds decision making. In the future, out-of-the-box skills, or chatbots, for Oracle ERP Cloud, Oracle HCM Cloud and Oracle CX Cloud are planned to be available in Teams via the Oracle Digital Assistant. These pre-built features can enable employee self-service for scenarios spanning sales, project management, expenses, productivity, time and absence management, compensation and benefits, and recruiting.

“We’ve been working closely with our partner IntraSee to create a HR digital assistant for our global employee base and Oracle Digital Assistant was the natural choice for us because of its ability to securely operate in our hybrid cloud infrastructure,” said Mark Burgess, senior director, HR Technology Solutions, Honeywell. “Our aim is to have it be the preferred method to get questions answered 24×7, access to policies and an amazing end-to-end approach for completing transactions with more speed and accuracy. We knew we wanted our HR digital assistant to be available where employees spend their time online, and an integration with Teams was therefore essential. Our vision is to have it become to employees what J.A.R.V.I.S. is to Iron Man.”

“We are excited to hear that Oracle Digital Assistant will support Teams,” said Sayan Ray, vice president, IT, SRF Ltd. “We’ve been using Oracle Digital Assistant to deliver conversational interfaces to our backend systems with great success. We also use Teams to collaborate so this partnership will help take Oracle Digital Assistant implementation to new levels in productivity gains.”

For more information on Oracle Digital Assistant for Teams, please sign up for private preview today.


Helping first responders achieve more with autonomous systems and AirSim

With inputs from: Elizabeth Bondi (Harvard University), Bob DeBortoli (Oregon State University), Balinder Malhi (Microsoft) and Jim Piavis (Microsoft)

Autonomous systems have the potential to improve safety for people in dangerous jobs, particularly first responders. However, deploying these systems is a difficult task that requires extensive research and testing.

In April, we explored the complexities and challenges present in the development of autonomous systems and how technologies such as AirSim provide a pragmatic way to address them. Microsoft believes that the key to building robust and safe autonomous systems is providing the system with a wide range of training experiences that expose it to many scenarios before it is deployed in the real world. This ensures training is done in a meaningful way—similar to how a student might be trained to tackle complex tasks through a curriculum curated by a teacher.

With autonomous systems, first responders gain sight into the unknown

One way Microsoft trains autonomous systems is through participating in unique research opportunities focused on solving real-world challenges, like aiding first responders in hazardous scenarios. Recently, our collaborators at Carnegie Mellon University and Oregon State University, collectively named Team Explorer, demonstrated technological breakthroughs in this area during their first-place win at the first round of the DARPA Subterranean (SubT) Challenge.

Snapshots from the AirSim simulation showing the effects of different conditions such as water vapor, dust and heavy smoke. Such variations in conditions can provide useful data when building robust autonomous systems.

The DARPA SubT Challenge aspires to advance technologies that would augment difficult underground operations. Specifically, the challenge focuses on methods to map, navigate, and search complex underground environments. These environments include human-made tunnel systems, urban underground spaces, and natural cave networks. Imagine constrained environments that are several kilometers long and structured in unique ways, with regular or irregular geological topologies and patterns. Weather or other hazardous conditions, due to poor ventilation or poisonous gases, often make first responders’ work even more dangerous.

Team Explorer engaged in autonomous search and detection of several artifacts within a man-made system of tunnels. The end-to-end solution that the team created required many different complex components to work across the challenging circuit including mobility, mapping, navigation, and detection.

Microsoft’s Autonomous Systems team worked closely with Team Explorer to provide a high-definition simulation environment to help with the challenge. The team used AirSim to create an intricate maze of man-made tunnels in a virtual world that was representative of such real-world tunnels, both in complexity and in size. The virtual world was a hybrid synthesis, in which a team of artists used reference material from real-world mines to modularly generate a network of interconnected tunnels spanning two kilometers in length and spread over a large area.

Additionally, the simulation included robotic vehicles—wheeled robots as well as unmanned aerial vehicles (UAVs)—and a suite of sensors mounted on the autonomous agents. AirSim provided a rich platform that Team Explorer could use to test their methods and generate training experiences for creating the various decision-making components of the autonomous agents.

At the center of the challenge was the robots’ ability to perceive the underground terrain and discover objects (such as human survivors, backpacks, cellular phones, fire extinguishers, and power drills) while adjusting to different weather and lighting conditions. Multimodal perception is important in challenging environments, and AirSim’s ability to simulate a wide variety of sensors, along with their fusion, can provide a competitive edge. One of the most important sensors is LIDAR, and in AirSim the physical process of generating the point clouds is carefully reconstructed in software, so the sensor used on the robot in simulation uses the same configuration parameters (such as number-of-channels, range, points-per-second, rotations-per-second, horizontal/vertical FOVs, and more) as those found on the real vehicle.

It is challenging to train perception modules based on deep learning models to detect the target objects using LIDAR point clouds and RGB cameras. While curated datasets, such as ScanNet and MS COCO, exist for more canonical applications, none exist for underground exploration applications. Creating a real dataset for underground environments is expensive because a dedicated team is needed to first deploy the robot, gather the data, and then label the captured data. Microsoft’s ability to create near-realistic autonomy pipelines in AirSim means that we can rapidly generate labeled training data for a subterranean environment.
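As an illustration of how such simulated sensor data can be pulled out for labeling, here is a minimal sketch using AirSim’s Python client; the sensor and vehicle names are illustrative assumptions and must match whatever is configured in your AirSim settings.json.

```python
# Minimal sketch: read a simulated LIDAR point cloud through AirSim's Python API.
# The sensor/vehicle names ("Lidar1", "Drone1") are illustrative; they must
# match the sensors configured in your AirSim settings.json.
import numpy as np
import airsim

client = airsim.MultirotorClient()   # connect to a running AirSim instance
client.confirmConnection()

lidar_data = client.getLidarData(lidar_name="Lidar1", vehicle_name="Drone1")

if len(lidar_data.point_cloud) < 3:
    print("No LIDAR points received; check the sensor configuration.")
else:
    # The point cloud arrives as a flat list of floats: x0, y0, z0, x1, y1, z1, ...
    points = np.array(lidar_data.point_cloud, dtype=np.float32).reshape(-1, 3)
    print(f"Received {points.shape[0]} points at timestamp {lidar_data.time_stamp}")
    np.save("lidar_frame.npy", points)  # save a frame for later labeling/training
```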

Detecting animal poaching through drone simulations

With autonomous systems, the issues with data collection are further exacerbated for applications that involve first responders, since the collection process is itself dangerous. Such challenges were present in our collaboration with Air Shepherd and USC to help counter wildlife poaching.

The central task in this collaboration was the development of UAVs equipped with thermal infrared cameras that can fly through national parks at night to search for poachers and animals. The project had several challenges, the largest of which was that building such a system requires data for both training and testing. For example, labeling a real-world dataset provided by Air Shepherd took approximately 800 hours over the course of 6 months to complete. This produced 39,380 labeled frames and approximately 180,000 individual poacher and animal labels on those frames. This data was used to build a prototype detection system called SPOT, but it did not produce acceptable precision and recall values.

AirSim was then used to create a simulation in which virtual UAVs flew over virtual environments, like those found in the Central African savanna, at altitudes of 200 to 400 feet above ground level. The simulation took on the difficult task of detecting poachers and wildlife, both during the day and at night, and ultimately increased the precision of detection through imaging by 35.2%.
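For a sense of how such simulated night-time imagery can be generated, here is a minimal sketch that captures an infrared-style frame from an AirSim drone camera; this is not the Air Shepherd/USC pipeline, and the camera name and output path are illustrative assumptions.

```python
# Minimal sketch: capture a simulated infrared frame from an AirSim drone camera.
# Not the Air Shepherd/USC pipeline; camera name and file path are assumptions.
import numpy as np
import airsim

client = airsim.MultirotorClient()
client.confirmConnection()

responses = client.simGetImages([
    # camera "0", infrared rendering, not float pixels, uncompressed
    airsim.ImageRequest("0", airsim.ImageType.Infrared, False, False)
])
response = responses[0]

# Raw bytes arrive as a flat uint8 buffer; the channel count can vary by
# AirSim version, so infer it from the buffer size.
frame = np.frombuffer(response.image_data_uint8, dtype=np.uint8)
frame = frame.reshape(response.height, response.width, -1)

np.save("infrared_frame.npy", frame)
print(f"Captured {response.width}x{response.height} infrared frame")
```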

Driving innovation through simulation

Access to simulation environments means that we have a near-infinite data generation machine, where different simulation parameters can be chosen to generate experiences at will. This capability is foundational for testing and debugging autonomous systems that eventually would be provably robust and certified. We continue to investigate such fuzzing and falsification frameworks for various AI systems.

Holistic challenges such as the DARPA SubT Challenge, and partnerships with organizations like Air Shepherd, allow researchers and developers to build complete solutions that cover a wide array of research topics. There are many research challenges at the intersection of robotics, simulation, and machine intelligence that we continue to invest in on our journey to build toolchains that enable researchers and developers to create safe and useful simulations and robots.

We invite readers to explore AirSim on our GitHub repository and to join our journey to build toolchains in collaboration with the community. The AirSim environment containing the network of man-made caves was co-created with Team Explorer for the DARPA SubT Challenge and is publicly available to researchers and developers.


Analyzing data from space – the ultimate intelligent edge scenario

Space represents the next frontier for cloud computing, and Microsoft’s unique approach to partnerships with pioneering companies in the space industry means together we can build platforms and tools that foster significant leaps forward, helping us gain deeper insights from the data gleaned from space.

One of the primary challenges for this industry is the sheer amount of data available from satellites and the infrastructure required to bring this data to ground, analyze the data and then transport it to where it’s needed. With almost 3,000 new satellites forecast to launch by 2026 [1] and a threefold increase in the number of small satellite launches per year, the magnitude of this challenge is growing rapidly.

Essentially, this is the ultimate intelligent edge scenario – where massive amounts of data must be processed at the edge – whether that edge is in space or on the ground. Then the data can be directed to where it’s needed for further analytics or combined with other data sources to make connections that simply weren’t possible before.

DIU chooses Microsoft and Ball Aerospace for space analytics

To help with these challenges, the Defense Innovation Unit (DIU) just selected Microsoft and Ball Aerospace to build a solution demonstrating agile cloud processing capabilities in support of the U.S. Air Force’s Commercially Augmented Space Inter Networked Operations (CASINO) project.

With the aim of making satellite data more actionable more quickly, Ball Aerospace and Microsoft teamed up to answer the question: “what would it take to completely transform what a ground station looks like, and downlink that data directly to the cloud?”

The solution involves placing electronically steered flat panel antennas on the roof of a Microsoft datacenter. These phased array antennas don’t require much power and need only a couple of square meters of roof space. This innovation can connect multiple low earth orbit (LEO) satellites with a single antenna aperture, significantly accelerating the delivery rate of data from satellite to end user with data piped directly into Microsoft Azure from the rooftop array.

Analytics for a massive confluence of data

Azure provides the foundational engine for Ball Aerospace algorithms in this project, processing worldwide data streams from up to 20 satellites. With the data now in Azure, customers can direct that data to where it best serves the mission need, whether that’s moving it to Azure Government to meet compliance requirements such as ITAR or combining it with data from other sources, such as weather and radar maps, to gain more meaningful insights.

In working with Microsoft, Steve Smith, Vice President and General Manager, Systems Engineering Solutions at Ball Aerospace, called this type of data processing system, which leverages Ball phased array technology and imagery exploitation algorithms in Azure, “flexible and scalable – designed to support additional satellites and processing capabilities. This type of data processing in the cloud provides actionable, relevant information quickly and more cost-effectively to the end user.”

With Azure, customers gain access to advanced analytics capabilities such as Azure Machine Learning and Azure AI. This enables end users to build models and make predictions based on a confluence of data coming from multiple sources, including multiple concurrent satellite feeds. Customers can also harness Microsoft’s global fiber network to rapidly deliver the data to where it’s needed using services such as ExpressRoute and ExpressRoute Global Reach. In addition, ExpressRoute now enables customers to ingest satellite data from several new connectivity partners to address the challenges of operating in remote locations.

For tactical units in the field, this technology can be replicated to bring information to where it’s needed, even in disconnected scenarios. As an example, phased array antennas mounted to a mobile unit can pipe data directly into a tactical datacenter or Data Box Edge appliance, delivering unprecedented situational awareness in remote locations.

A similar approach can be used for commercial applications, including geological exploration and environmental monitoring in disconnected or intermittently connected scenarios. Ball Aerospace specializes in weather satellites, and now customers can more quickly get that data down and combine it with locally sourced data in Azure, whether for agricultural, ecological, or disaster response scenarios.

This partnership with Ball Aerospace enables us to bring satellite data to ground and cloud faster than ever, leapfrogging other solutions on the market. Our joint innovation in direct satellite-to-cloud communication and accelerated data processing provides the Department of Defense, including the Air Force, with entirely new capabilities to explore as they continue to advance their mission.

  1. https://www.satellitetoday.com/innovation/2017/10/12/satellite-launches-increase-threefold-next-decade/



Want to start a side gig? Hustle Up! for higher education students is here to help

While many are interested in starting a side gig, there is one group in particular that’s looking for ways to make extra money and improve business skills this time of year—higher education students.

With the arrival of a new academic year comes a diverse crop of achievement-minded students looking for innovative ways to gain invaluable on-the-go experience while earning much-needed income.

Considering student loans, single parenthood, increased costs of living, and more, the reality for today’s higher education students is that they need to earn money now, while expanding their professional know-how. They understand employers are looking for nontraditional employees with uniquely diversified expertise and specialties, and they don’t have the luxury of depending solely on internships anymore.

These students have found they can leverage their passions to start side hustles to turn a profit and gain hands-on knowledge that aligns with the theories they are learning in class.

In order to pinpoint the most advantageous resources and tips needed for a side hustle, Microsoft Store collaborated with Chris Guillebeau, a New York Times bestselling author and host of the Side Hustle School podcast.

“Side hustles are a great way to create options, which are important in today’s world. They’re a fast track to freedom and job security. Consider the purpose of an internship—experience. Why not get paid for your experience by learning to start an income-generating project?”
—Chris Guillebeau

The challenge for some people who build a side hustle is that they have amazing ideas to generate extra income, but need help managing their business operations. That’s where solutions like Microsoft 365 and other Microsoft Store resources can help.

Start to Hustle Up!

Hustle Up!, a mobile experience, was developed by Microsoft Store to help identify the right resources needed to amplify different kinds of side hustles. By asking you a series of questions, Hustle Up! explores your side hustle aptitude, identifies your strengths and interests, and connects you with the best resources to help you on your way.

Each of the four Hustle Up! outcomes—Freelancer, Maker, Reseller, and Expert—was carefully crafted to match you with your top side hustle type, and each highlights your professional skills along with top actionable tips from Chris Guillebeau. Tips include prime resources that help you maintain work, school, and life balance, such as:

  • For Freelancers, having the ability to get reviews ASAP is critical. Reviews matter a lot in business, especially when you are trying to stand out in an overly saturated market. Chris recommends that Freelancers gather real-time client feedback by creating surveys and polls using Microsoft 365 offerings.
  • Side hustlers who fall into the Expert category know how to adapt their knowledge to a product or service but can struggle trying to stay on top of all their clients’ various needs. To manage multiple asks and schedules, Chris advises Experts to keep track of their daily, weekly, and monthly tasks while on the go with OneNote.

Eager to learn and achieve more with your side hustle? Even more expert tips await! Try Hustle Up! to discover how to better your side hustle and visit Microsoft Store in person or online to uncover additional resources, fun and free workshops, and solutions that will amplify your entrepreneurial skills.