
Accelerating innovation in computational chemistry

Scientists from Microsoft, ETH Zurich, and the Pacific Northwest National Laboratory have recently presented a new automated workflow that leverages the scale of Azure to transform R&D processes in quantum chemistry and materials science. By optimizing the simulation code and refactoring it to be cloud native, the team achieved a 30-fold acceleration and a 10-fold cost reduction for the simulation of a catalytic chemical reaction. Moreover, these powerful automation capabilities free scientists from navigating a complex web of heterogeneous hardware and software packages, allowing them to focus on developing new products such as sustainably produced fertilizer, more eco-friendly paints and coatings, new methods for carbon fixation, and many others.

Solving the world’s most complex and pressing challenges requires significant breakthroughs in chemical and materials sciences

Predicting chemical synthesis and catalytic processes is a key endeavor in chemistry and also one of the field’s most pressing challenges. Reactions occur in a very complicated chemical space, and it is almost impossible to identify the comprehensive mechanisms of chemical reactions through laboratory experiments alone. Computer simulation provides an alternative and complementary route to elucidate reaction mechanisms, but the human time involved has so far been so high that researchers have only been able to consider a few key reaction pathways, typically ignoring crucial side reactions in a conventional setting. This is because modeling reactions requires chemical intuition and manual trial and error, and accurate simulation of the modeled system becomes intractable when the reactions are considered in full depth, including all possible options.

This reality is what motivates the Azure Quantum team every day to build a fully scalable quantum machine. Since quantum mechanics explains the nature and behavior of matter at the atomic level, quantum computers will be inherently capable of understanding and predicting the complexities of nature. While we’re making progress towards this vision, we’re simultaneously helping innovators accelerate progress in chemical and materials science today with new workflows leveraging state-of-the-art research and the power of Azure’s high-performance computing (HPC).

Introducing AutoRXN for automated reaction exploration

AutoRXN is a new automated workflow designed to empower scientists to explore reaction networks virtually using HPC in the Azure cloud. With this advancement, discovering and evaluating chemical reactions becomes far more accessible in the cloud, which in turn will enable organizations to transform their R&D processes and speed the development of new products. Using the AutoRXN workflow, scientists can expand the number of chemical reaction pathways explored from dozens to thousands of configurations with higher than conventional accuracy. The central workhorse behind the AutoRXN orchestration is the chemical network exploration software Chemoton, developed by our collaborators at ETH Zurich. We have adapted the chemistry simulations used today to be cloud-native for the modern hardware and network topology in Azure data centers, ensuring autonomy, stability, and minimal operator interference across all components of the workflow.

Reaction network view indicating scope of the exploration (branches omitted for clarity).

This automation enabled the research outlined in our recent paper, where we applied it to study the mechanisms of an asymmetric hydrogenation catalyst. The AutoRXN workflow carries out a huge number of comparatively cheap quantum chemical calculations for exploration, automatically refines the results with a vast number of expensive correlated ab initio calculations, and automates the collection and evaluation of data, including back-checking of results by alternative simulation approaches. The team has been able to orchestrate highly accurate computational chemistry calculations at an unprecedented rate, which is critical for high-throughput tasks.
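To make the orchestration pattern concrete, here is a minimal Python sketch of an explore-then-refine loop of this kind. The function names, placeholder energies, and thresholds are illustrative assumptions only; they are not the actual AutoRXN or Chemoton APIs.

```python
import heapq
import random

# Hypothetical stand-ins for the two fidelity levels in such a workflow:
# a cheap exploratory calculation and an expensive correlated refinement.
def cheap_energy_estimate(structure: str) -> float:
    """Fast, approximate screening energy (placeholder value)."""
    return random.uniform(-1.0, 1.0)

def refined_energy(structure: str) -> float:
    """Expensive correlated ab initio refinement (placeholder value)."""
    return cheap_energy_estimate(structure) + random.uniform(-0.05, 0.05)

def explore_and_refine(candidates, max_refinements=10, screen_cutoff=0.0):
    """Screen every candidate cheaply, refine only the most promising ones."""
    screened = [(cheap_energy_estimate(s), s) for s in candidates]
    promising = [(e, s) for e, s in screened if e < screen_cutoff]
    heapq.heapify(promising)                     # lowest-energy candidates first
    results = {}
    for _ in range(min(max_refinements, len(promising))):
        _, structure = heapq.heappop(promising)
        results[structure] = refined_energy(structure)
    return results

if __name__ == "__main__":
    intermediates = [f"intermediate_{i}" for i in range(1000)]
    print(explore_and_refine(intermediates))
```

In the real workflow the cheap and expensive stages are full quantum chemistry codes and the orchestration runs across Azure; the sketch only illustrates the screen-then-refine control flow described above.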

The AutoRXN workflow opens a new avenue of modeling and understanding chemical reactions where many side reactions can be found and studied to inform the actual performance of catalysts. The exploration scrutinizes expected reaction mechanisms and reveals the general reactivity of different atoms and functional groups in the catalyst, which enables one to improve the catalyst. 

We have already identified more than five hundred reactions and more than two thousand elementary steps that reveal a comprehensive overview of the iron-complex catalyzed asymmetric hydrogenation reaction. This is far beyond the reach of conventional manual reaction modeling, as many side reactions and catalyst degradations cannot be captured by a chemist’s intuition today. Leveraging the modern hardware heterogeneity on Azure makes the process significantly faster and more cost effective. Results from the simulations help us understand the reactivity of a catalyst and accelerate R&D into exciting new discoveries in chemical and materials science. 

Exploring the catalytic reactions on Azure high-performance computing provides researchers with a robust and reliable platform for hyper-scale chemistry and materials simulations without having to physically build out the system and infrastructure. 

Start your path to accelerated innovation today

It’s estimated that chemistry directly touches over 96 percent of all manufactured goods.1 That means the opportunity for organizations to make new chemical and materials science discoveries to solve society’s most intractable problems and generate new growth is tremendous. New technologies and methodologies like AutoRXN are emerging from advancements in cloud computing and computational chemistry. Innovation in cloud capabilities and automation enables unprecedented scalability and hardware heterogeneity, and deep collaboration between industry and academic research is fostering the development of cloud-optimized simulation codes and methodologies. These technologies have advanced computational chemistry to a stage where it can solve the challenging problems scientists have been working on for decades.

We’re excited to see how innovators leverage the hyperscale of the cloud today and a scaled quantum machine in the future to discover new materials to solve today’s seemingly unsolvable problems and launch the next wave of technological and societal progress.

Learn more

See the publication: High-throughput ab initio reaction mechanism exploration in the cloud with automated multi-reference validation.

Explore the benefits of Azure high-performance computing in the cloud.

If you are interested in co-innovation opportunities, please contact us at [email protected].


1. Source: https://www.americanchemistry.com/chemistry-in-america/news-trends/press-release/2021/us-specialty-chemical-markets-start-third-quarter-on-a-strong-note


Microsoft Quantum Innovator Series: The path to quantum at scale

Image: “The path to quantum at scale,” Microsoft Quantum Innovator Series, shown over a quantum hardware chandelier.

Get an inside, first-hand account of Microsoft’s strategy for quantum computing at scale in a new webinar series where you’ll hear directly from Microsoft Azure Quantum scientists and leaders about the path to quantum at scale and how to get involved today.

In this ongoing series, scientists and researchers can hear directly from Microsoft’s quantum scientists and leaders like Krysta Svore, Chetan Nayak, Matthias Troyer, and others about our strategy, progress, and most importantly, how Microsoft aims to empower innovators to make a breakthrough impact with quantum at scale. 

Save your spot: register for the Microsoft Quantum Innovator Series now.

The series kicks off with our first event “Have you started developing for practical quantum advantage?” on January 31 from 9:00–9:30 AM PT. Our speaker will be Dr. Krysta Svore, distinguished engineer and VP of Quantum Software, Microsoft. During this webinar, you will:

  • Learn what’s required for scalable quantum computing and what can be done now to get ready for it.
  • See the new Azure Quantum Resource Estimator—the first end-to-end toolset that provides estimates for the number of logical and physical qubits as well as runtime required to execute quantum applications on post-NISQ, fault-tolerant quantum computers.
  • Understand the number of qubits required for a quantum solution and the differences between qubit technologies.
  • Explore how Microsoft is empowering innovators today by co-designing tools to optimize quantum solutions and to run small instances of algorithms on today’s diverse and maturing quantum systems and prepare for tomorrow’s scaled quantum computers.
  • Participate in a live Q&A chat with the Azure Quantum team and be one of the first to hear about recent advancements.

Krysta Svore | Distinguished Engineer and VP of Quantum Software, Microsoft

About the Speaker:

Dr. Svore has published over 70 refereed articles and filed over 30 patents. She is a Fellow of the American Association for the Advancement of Science. She won the 2010 Yahoo! Learning to Rank Challenge with a team of colleagues, received an ACM Best of 2013 Notable Article award, and was recognized as one of Business Insider’s Most Powerful Female Engineers of 2018. A Kavli Fellow of the National Academy of Sciences, she also serves as an advisor to the National Quantum Initiative, the Advanced Scientific Computing Advisory Committee of the Department of Energy, and the ISAT Committee of DARPA, in addition to numerous other quantum centers and initiatives globally.

Microsoft Quantum Innovator Series: Why and what is the future of the topological qubit?

On February 28, we will focus on why Microsoft decided to design its quantum machine with topological qubits—an approach that is both more challenging and more promising than others—and what’s next for Microsoft’s hardware ambitions. This episode will share more about Microsoft’s quantum hardware journey, specifically touching on Microsoft’s physics breakthrough outlined in Dr. Nayak’s recent paper, and will also focus on the physics behind the topological qubit. Join our speaker Chetan Nayak, Technical Fellow and VP of Quantum Hardware and Systems Engineering, Microsoft, to:

  • Learn about topological phases in physics and how they are applied to quantum computing. 
  • Explore how topological properties create a level of protection that can, in principle, help a qubit retain quantum information despite what’s happening in the environment around it.
  • Understand the role of the topological gap and the recently discovered Majorana zero modes, and how together they impact a topological qubit’s stability, size, and speed. 
  • Learn how to examine the raw data and analysis from Microsoft’s hardware research on Azure Quantum.
  • Use interactive Jupyter notebooks and explore what’s next in engineering the world’s first topological qubit. 
  • Participate in a live Q&A chat with the Azure Quantum team and be one of the first to hear about recent advancements.

Chetan Nayak | Technical Fellow and VP of Quantum Hardware and Systems Engineering, Microsoft 

About the Speaker:

Dr. Nayak is a pioneer of the study of quantum matter, including topological and non-equilibrium phases. He holds a bachelor’s degree from Harvard and a PhD in physics from Princeton. He was an assistant, associate, and full professor at UCLA, a visiting professor at Nihon University in Tokyo, and is a professor of physics at UCSB. Chetan was a trustee of the Aspen Center for Physics and an editor of Annals of Physics. He is a Fellow of the American Physical Society and a recipient of an Alfred P. Sloan Foundation Fellowship and a National Science Foundation CAREER award. He has published more than 150 refereed articles with more than 20,000 citations and has been granted more than 20 patents. 

Microsoft Quantum Innovator Series: What kind of problems can we solve today with quantum simulation?

On April 20, we will feature Matthias Troyer, Microsoft Technical Fellow, who will discuss what kind of problems we can solve today with quantum simulation. Learn how years of Microsoft research reveal that the discovery of new chemicals, materials, and drugs that will ultimately help solve the world’s most challenging problems will greatly benefit from quantum computing. Dr. Troyer will explain what is happening today and how chemical and materials science innovators can get started on their quantum journey:

  • Learn how real progress can be made today by combining high performance computing (HPC), state-of-the-art machine learning, and quantum knowledge to fundamentally transform our ability to model and predict the outcome of chemical processes.
  • Get real-world insights from co-innovation projects happening right now with leading chemical and materials science companies around the world.
  • Find out how researchers in chemical and materials fields can get started on their quantum journey today.
  • Participate in a live Q&A chat with the Azure Quantum team and be one of the first to hear about recent advancements.

Matthias Troyer | Technical Fellow, Microsoft

About the Speaker:

Dr. Troyer is a Fellow of the American Physical Society, Vice President of the Aspen Center for Physics, a recipient of the Rahman Prize for Computational Physics of the American Physical Society “for pioneering numerical work in many seemingly intractable areas of quantum many body physics and for providing efficient sophisticated computer codes to the community” and of the Hamburg Prize for Theoretical Physics.

After receiving his PhD in 1994 from ETH Zurich in Switzerland, he was a postdoc at the University of Tokyo before returning to ETH Zurich where he was a professor of Computational Physics until joining Microsoft’s quantum computing program in early 2017. He works on a variety of topics in quantum computing, from the simulation of materials and quantum devices to quantum software, algorithms and applications of future quantum computers. His broader research interests span from high performance computing and quantum computing to the simulations of quantum devices and island ecosystems.



‘It’s magic’: Students and researchers get hands-on with quantum hardware via Azure Quantum Credits program

Since it was introduced in February, the Azure Quantum Credits program has attracted applicants ranging from enterprise innovators and solution partners to academic researchers and student explorers. It has been exciting to see the diversity of proposals submitted – featuring the use of quantum hardware accessible through Azure Quantum to investigate novel use cases, experiment with state-of-the-art algorithms, and pursue applications in industries like chemistry and materials sciences.

From New York to Tennessee, Hyderabad to Verona, and Finland to Canada—we’re delighted to be the quantum platform of choice for research in areas as diverse as molecular energy estimation, quantum computer crosstalk, protein folding dynamics, quantum machine learning for price prediction, and quantum detection of cardiovascular events in cardiac signals.

We are pleased to showcase the creativity and energy of three of our credit recipients—the University of Washington, Bar-Ilan University, and KPMG in collaboration with the Danish Technical University (DTU)—who leveraged IonQ and/or Quantinuum hardware through the Credits program.

Watch the video below to see these projects in action

The benefits of getting hands-on with quantum hardware in a classroom setting are clear. As Kai-Mei Fu, Professor of Physics at the University of Washington, described, “Our students had never accessed hardware. Many people think that you can just do everything on a simulator. It turns out, there are some surprising results that happen when you use a real quantum computer that are very important. It’s extremely valuable to be on real machines through Azure Quantum.”

Professor Emanuele Dalla Torre, of Bar-Ilan University’s Department of Physics, added, “Azure Quantum allows you to connect to different quantum computers. Through this, we were able to see that what we had imagined in our theoretical analysis was happening in the real world on a quantum computer. Our experiment with Azure Quantum gave us a hint of what the possible near-term applications of quantum computing are.”

In a public-private research endeavor, solution partner KPMG collaborated with DTU on neural networks-focused research using Azure Quantum Credits.

Bent Dalager, Partner and Global Head of KPMG’s Quantum Hub, noted, “Azure Quantum democratized the ability to use quantum computing. Instead of having to rely on a specific piece of hardware, through a language layer, you can pursue quantum computing through Azure Quantum in a tremendously more efficient way.”

In the last six months, the Azure Quantum Credits program has expanded from its initial offering of IonQ Harmony and Quantinuum H1 quantum processing units (QPUs). Along with the associated simulators and emulators, the program now also offers experimentation on IonQ Aria’s 23-algorithmic-qubit system, Quantinuum H2, and Rigetti’s 40-qubit Aspen-11 and latest 80-qubit modular-chip Aspen-M-1 endpoints.

Coming soon, Pasqal’s neutral atom-based quantum technology will be available in the Azure Quantum Credits program, allowing innovators and explorers to harness Pasqal’s impressive qubit connectivity and the ability to directly manipulate neutral atoms.

The enthusiasm we’ve enjoyed during our weekly office hours about Azure Quantum Credits has been palpable and we’ve appreciated the benefit of community feedback in continuously improving the Credits program. For quantum educators, the Azure Quantum Credits program is a cornerstone of our one-stop resource for curriculum, samples, and tools to facilitate the skilling up of a quantum-ready workforce.

Along with the $500 in credits available to all users to experiment with Azure Quantum’s participating hardware partners, we are eager to continue to empower practitioners and researchers to explore solutions on today’s leading quantum hardware using Azure Quantum Credit grants. Accelerate your exploration and apply today.


Goldman Sachs eyes quantum advantage for derivative pricing


At Goldman Sachs, our Research and Development team is always looking to push forward the cutting edge in technology for financial services. While quantum computing remains in an early stage, the promise of the technology means that we are actively researching where and how it can be applied in the future. A key approach here is for us to “work backward.” We start with a valuable, well-defined mathematical problem in finance that we can pair with a theoretical computational advantage for quantum computers. We then ask: what would the specifications of a real quantum computer need to be to achieve a practical advantage for this problem? In doing this resource estimation work we need to fill in practical details and plug gaps in theoretical approaches. It also often uncovers important optimizations that can, for example, reduce time to solution or the required quantum memory.

Resource estimation for quantum advantage in derivative pricing

One example that we have focused on is the pricing of complex derivatives. Derivatives are financial contracts whose value today is based on some statistical model of what will happen in the future. A common example of a financial derivative is a stock option. When you have a complicated contract or a complicated statistical model then it can be computationally expensive to compute the price. Derivatives are so common in finance that even a small improvement in pricing them, or in calculating related quantities, could be very valuable.

Derivatives are a good target for resource estimation because the underlying algorithm that is often used is Monte Carlo, and it’s known that there is a theoretical speedup available to quantum computers for fairly generic Monte Carlo algorithms. The algorithm builds on a subroutine called amplitude estimation and offers a quadratic speedup. For instance, to achieve an accuracy ε in the price, a classical Monte Carlo algorithm needs to run for O(1/ε²) steps. However, the quantum algorithm runs in only O(1/ε) steps. For example, if you are targeting an accuracy of one part per thousand (ε = 10⁻³), then the quantum algorithm could need only 1,000 steps versus a classical algorithm that would need 1,000,000.
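A quick way to see the scaling difference is to compare raw step counts at a few target accuracies. The sketch below ignores all constant factors and hardware overheads, which the next paragraph addresses.

```python
# Step counts for classical Monte Carlo, O(1/eps^2), versus quantum
# amplitude estimation, O(1/eps), at a few target accuracies. Constants
# and overheads for a real pricing workload are deliberately ignored.
for eps in (1e-2, 1e-3, 1e-4):
    classical_steps = 1 / eps**2
    quantum_steps = 1 / eps
    print(f"eps={eps:.0e}: classical ~{classical_steps:.0e} steps, "
          f"quantum ~{quantum_steps:.0e} steps, "
          f"ratio ~{classical_steps / quantum_steps:.0e}")
```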

Of course, this is just the theoretical scaling and details need to be filled in to see if this is practical. For example, each step on a quantum computer might take much longer than each step on a classical computer because the clock rate is slower. There also may be other overheads that influence the constant factors in the algorithm.

In 2020, we worked with co-authors at IBM to produce the first end-to-end resource estimate for derivative pricing in our paper “A Threshold for Quantum Advantage in Derivative Pricing.” We used two practical examples of derivative contracts in that paper: an autocallable and a Target Accrual Redemption Forward (TARF). These are examples that are complicated enough to price today that we would like a speedup and that are traded in enough volume that improving their pricing matters. In order to make the resource estimate practical, we introduced some modifications to the algorithm called the re-parameterization method. This resulted in the following estimates for the resources needed for the autocallable example. We include the total resources needed as well as the resources used in an important subroutine of amplitude estimation, the Q operator:

                  Total Resources   Q Operator
T-count           1.2 x 10^10       11.4M
T-depth           5.4 x 10^7        9.5k
Logical Qubits    8k                8k

We include three important figures of merit to describe the resources. The T-count gives the number of T-gate operations needed in the algorithm. The T-gate operation in many fault-tolerant quantum computing architectures requires significantly more resources than other operations and so dominates the resources needed by the computation. We also include the T-depth. This is the number of T-gate operations that need to be executed sequentially. In some architectures, this depth then determines the overall runtime of the algorithm, as other T-gates can be parallelized. Finally, we include the amount of quantum memory needed for the algorithm as measured by the number of qubits.
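To illustrate how these figures of merit translate into runtime, here is a hedged sketch that multiplies the T-depth from the table above by an assumed logical cycle time; the cycle times are placeholders, and real architectures will differ.

```python
# Illustrative wall-clock estimate from the T-depth alone, under the
# simplifying assumption that sequential T layers dominate the runtime.
# The logical cycle times below are assumptions, not measured values.
t_depth = 5.4e7                          # T-depth from the table above
for cycle_time_us in (1, 10, 100):       # assumed time per T layer, microseconds
    seconds = t_depth * cycle_time_us * 1e-6
    print(f"{cycle_time_us:>3} us per T layer -> {seconds:,.0f} s "
          f"(~{seconds / 3600:.2f} h)")
```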

Resource estimation with Q#

Resource estimation is challenging as all the details matter. For example, our paper uses fully mixed precision in the implementation, where each fixed-point register is optimized to use the right number of qubits. How can we be sure that we didn’t make mistakes when we can’t run a full implementation?

In order to take our resource estimate to the next level, we chose to use Q# and work with Mathias Soeken and Martin Roetteler on the Microsoft Azure Quantum team to develop a full Q# implementation of our algorithm. Doing resource estimation this way had many benefits:

  1. Handling complexity: We could use Q#’s features to automatically handle the allocation and management of quantum memory. Further, features like automatically generating controlled and adjoint operations made it easier for us to express the algorithm at a higher level and let the compiler figure out the details.
  2. Using libraries: Much of the resource complexity in our derivative pricing algorithm is used by reversible arithmetic on quantum registers. Q# already has many libraries for fixed-point arithmetic operations that we could import and invoke without needing to re-implement them ourselves.
  3. Finding mistakes: Since much of the code in our implementation is dealing with reversible versions of classical arithmetic, we were able to make use of Q#’s Toffoli simulator to efficiently test portions of our implementation for correctness. While the whole algorithm cannot be directly simulated, we were able to develop unit tests for key components that we could efficiently simulate to build up confidence in our resource counts.
  4. Modular design: The overall algorithm is complicated. Having a concrete implementation lets one focus on optimizing specific functions one at a time and then letting the system tell you the overall effect on resource counts.

New updates to the algorithm from using Q#

While implementing the algorithm from our previous work in Q# we made some improvements and modifications.

Firstly, we removed the arcsine and square-root arithmetic operations (Step 3 of Algorithm 4.2) and replaced them with the comparator method (Section 2.2 of this work). This reduces the resources needed for that step.

Secondly, we replaced the piecewise polynomial implementation of the exponential function with a lookup table. A lookup table can further reduce resources compared with reversible fixed-point arithmetic, which can be expensive on quantum computers. This lookup table implementation has been open sourced as part of Q#. In the resource estimate results given below, the lookup table for the exponential function has a free parameter given by the number of “swap” qubits used, and we quote resources for three different choices of swap qubits. Since we have an implementation in Q#, it is straightforward to manage and compute resource requirements for differently parameterized implementations.
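The swap-qubit parameter trades sequential gate count for qubit count, which is why the SWAP1, SWAP5, and SWAP10 columns in the tables below differ so sharply. The toy calculation here sketches that tradeoff for a generic select-swap-style lookup; the table size, value width, and cost model are assumptions, not the library’s actual numbers.

```python
# Illustrative select-vs-swap tradeoff for a table lookup. More swap bits
# shorten the sequential "select" part of the lookup but require a larger
# swap register. The table size and value width are assumptions and do not
# correspond to the Q# library's actual implementation.
table_entries = 2**16      # assumed number of entries in the lookup table
value_bits = 32            # assumed width of each stored value in qubits

for swap_bits in (1, 5, 10):
    select_iterations = table_entries // 2**swap_bits   # sequential lookup steps
    swap_ancillas = 2**swap_bits * value_bits           # extra qubits for swapping
    print(f"swap_bits={swap_bits:2d}: ~{select_iterations:6d} select steps, "
          f"~{swap_ancillas:6d} ancilla qubits")
```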

Resource estimation results

With these updates and the more detailed implementation in Q#, we calculated the resources needed for three key subroutines in derivative pricing and compared them to our previous work. The first is for the Q operator, the key operator in amplitude estimation. The second is for the payoff operator that reversibly implements the derivative payoff. The third is for the exponential function itself, which is the largest resource consumer besides the fundamental amplitude estimation itself.

The benchmark chosen is the 3-asset autocallable on 20 time steps. These parameters match real instances that one could find in practice.

Comparisons are made among the original paper estimates and three Q# implementation variants:

  • Paper: the original hand estimates from our work in Chakrabarti et al: https://quantum-journal.org/papers/q-2021-06-01-463/.
  • SWAP10: Q# implementation estimates where the exponential lookup table is set to use 10 swap bits.
  • SWAP5: Q# implementation estimates where the exponential lookup table is set to use 5 swap bits.
  • SWAP1: Q# implementation estimates where the exponential lookup table is set to use 1 swap bit.

Overall Q Operator

                  Paper    SWAP10    SWAP5    SWAP1
T-count           11.4M    14.6M     2.9M     6.3M
T-depth           9.5k     16k       16.6k    36k
Logical Qubits    8k       3.8M      124k     19.2k

Payoff Operator

                  Paper    SWAP10    SWAP5    SWAP1
T-count           189k     77k       77k      77k
T-depth           3.2k     2.7k      2.7k     2.7k
Logical Qubits    1.6k     19.2k     19.2k    19.2k

Fixed Point Exponential

                  Paper    SWAP10    SWAP5    SWAP1
T-count           7M       12.3M     617k     3.9M
T-depth           1.2k     62        1.3k     20.5k
Logical Qubits    5.4k     3.8M      124k     11.5k

Broadly speaking, our SWAP1 implementation results are close but not the same as our by-hand estimates. This means that our by-hand estimates were sometimes pessimistic (like for T-count) and other times optimistic, but not by too much.

Takeaways

By working with a Q# implementation we were able to improve the accuracy and flexibility of our resource estimates for quantum advantage in derivative pricing. The implementation also gives us a foundation to more rapidly iterate on updated versions and on other algorithms that use similar subroutines. We look forward to continuing optimization of this algorithm and implementation by taking advantage of new ideas and developments in the Q# ecosystem.

 “Working directly with the Goldman Sachs team has provided a fantastic opportunity to collaborate on resource estimation for an important problem in the finance industry, gain insights to enhance the offerings across the Azure Quantum ecosystem, and share resource estimation techniques and algorithm improvements with the community. It’s exciting to see the impact Q# can enable, from algorithm development to resource estimation and reduction, and it’s been a pleasure working with Goldman Sachs to further quantum impact.”—Dr. Krysta Svore, Distinguished Engineer and VP Quantum Software for Azure Quantum


Teaching the ABCs of quantum computing: Azure Quantum for Educators


Achieving technological leaps forward requires more than scientific and engineering breakthroughs. A critical dependency is the cultivation of a skilled workforce that can unlock the potential of emerging technology. In the field of quantum computing, now is the perfect time for educators to get ahead of the curve and prepare their students to start their quantum journeys. Along with quantum technology being on a path to scale, platform and tool maturity and accessibility are converging to enable academic institutions to meet workforce demand.

Microsoft and Azure Quantum want to empower educators and students to do just this. In our efforts to innovate across every layer of the Azure Quantum stack, we are pleased to launch Azure Quantum for Educators: a one-stop resource for curriculum, samples, and tools to facilitate the skilling up of a quantum-ready workforce. It also includes case studies on using a practical, software-driven approach to teach quantum computing to undergraduate students and a perspective on bringing hands-on use of quantum hardware to classrooms.

Azure Quantum for Educators features include:

  • Practical and programming-oriented quantum computing curriculum for educators: A free, classroom-tested and continuously improving curriculum appropriate for students with and without a physics background. Includes syllabus, lecture slides, programming assignments, an automatic homework grading tool, samples of final projects, and more.
  • Free access to hands-on quantum hardware: The Azure Quantum Credits Program provides free access for quantum hardware exploration supporting teaching, learning, and deploying quantum programs on a diverse set of quantum computers.
  • Python and Q# code samples: Run these samples against Azure Quantum’s diverse and growing hardware portfolio of trapped ion, superconducting, and neutral atom quantum processing units (QPUs), or against a variety of hardware simulators and resource estimators.
  • Case studies and white papers: Learn about ways to introduce quantum computing to a variety of academic levels and settings, including undergraduate students.
  • Azure Quantum office hours: We’re here to help! Drop in to request direct support for and provide input on quantum education initiatives.

Dr. Celia Merzbacher, Executive Director of the Quantum Economic Development Consortium (QED-C®) notes, “Initiatives like Azure Quantum for Educators help to build a robust talent pipeline of quantum-ready workers for the emerging quantum computing industry and the industries that will use the technology, from finance to pharma. Practical hands-on experience is highly sought-after by employers across the board.”

Join institutions, like the University of Washington, in leveraging Azure Quantum for Educators to enable new quantum computing capabilities in the classroom. A recent course that combined learning with doing, including access to quantum hardware through Azure Quantum, generated enthusiastic feedback from learners on:

Putting classroom concepts into immediate practice

  • “We quickly get to apply what we learned from the professors and guest lectures.”
  • “The Azure Quantum platform was useful and straightforward to use. Submitting jobs was also straightforward.”

Ease of use

  • “One of the best classes regarding quantum computing implementations.”

“We’ve enjoyed a wellspring of enthusiasm from teaching institutions globally about the Azure Quantum for Educators resources,” says Kent Foster, Microsoft University Relations Director. “Universities, colleges, business and vocational schools, and even high school educators are interested in integrating our materials and quantum computing hardware access into a broad range of classroom scenarios, ranging from for-credit classes and summer schools to multi-disciplinary student clubs and continuing education classes targeted at learners already in the workforce.”

Our commitment

With increasing government, private, and academic investment in quantum research, developing a skilled quantum workforce is critical to accelerating quantum computing breakthroughs in areas like chemistry, materials science, and finance. We are incredibly excited to partner with educators, learners, and researchers to close the talent gap and inspire the next generation of quantum enthusiasts with the Azure Quantum for Educators resources and an invitation to join our Azure Quantum Network—a vibrant coalition working together to solve for a better future.


Qubit Engineering Inc. uses Azure Quantum to optimize wind farm energy production

Qubit Engineering is using quantum-inspired capabilities available on the Azure Quantum platform to optimize wind farm layouts and, in doing so, capture more available energy with the same physical wind farm assets. 

A constructed wind farm in operation.

Wind farms have achieved tremendous efficiency gains over the last two decades through hardware innovation. Software innovation, in the form of turbine layout optimization, can now further amplify these efficiencies.

Wind farm layout optimization is considered a key profit driver for developers and owners. And, as layouts become increasingly optimized, wind power becomes even more attractive as an alternative to fossil fuels, thus helping reduce the carbon footprint for consumers and businesses. 

Optimizing wind farm layouts is challenging because individual turbine positions are highly correlated. Moving the position of a turbine even a couple of meters in the design process has the potential to affect the energy production of the entire farm over its more than 20-year lifetime.

To illustrate the complexity of the optimization, a modest wind farm with 50 turbines and 100 suitable locations yields an astronomical number of possible configurations:

    C(100, 50) = 100,891,344,545,564,193,334,812,497,256

Qubit Engineering, a quantum algorithms startup collaborating with Azure Quantum on industry solutions, takes an entirely novel approach by converting a complex optimization problem into a new format adapted to quantum-inspired optimization (QIO). Azure Quantum QIO enables Qubit Engineering to employ quantum-inspired techniques that combine classical algorithms and classical compute hardware while leveraging the scale of the Azure cloud. They combine all the constraints and equations describing the dynamics of a problem into a Quadratic Binary formulation. The problem is then cast into a large matrix with thousands of variables and millions of values—not practical to tackle using traditional techniques—but solvable using Azure Quantum QIO.
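A minimal Python sketch of what such a quadratic binary formulation looks like appears below. The site count, production values, and wake-interference penalties are invented for illustration, and the brute-force solver is only viable at toy scale; it is not Qubit Engineering’s actual model.

```python
import itertools
import math
import random

random.seed(0)

n_sites, n_turbines = 8, 3   # toy problem; real farms have far more sites

# Binary variable x[i] = 1 means "place a turbine at candidate site i".
# The quadratic binary objective combines a linear production term per
# occupied site with a pairwise wake-interference penalty.
production = [random.uniform(0.8, 1.2) for _ in range(n_sites)]
wake_penalty = {(i, j): random.uniform(0.0, 0.3)
                for i, j in itertools.combinations(range(n_sites), 2)}

def net_energy(occupied):
    gain = sum(production[i] for i in occupied)
    loss = sum(w for (i, j), w in wake_penalty.items()
               if i in occupied and j in occupied)
    return gain - loss

# Brute force works at toy scale; the C(100, 50) instance in the text is
# why heuristic solvers such as quantum-inspired optimization are needed.
best = max(itertools.combinations(range(n_sites), n_turbines),
           key=lambda layout: net_energy(set(layout)))
print("best layout:", best, "net energy:", round(net_energy(set(best)), 3))
print("configurations for 50 of 100 sites:", math.comb(100, 50))
```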

Optimized wind farm layout by Qubit Engineering using Azure Quantum.

This approach demonstrably performs 1 percent to 3 percent better than the traditional next best approach in the industry, confirmed by multiple leading turbine manufacturers and wind farm developers, including RES, the world’s largest independent renewable energy company. This 1 percent to 3 percent improvement translates to megawatts of energy which in turn can power hundreds of additional households over the lifetime of a wind farm—at no additional capital expense.

“RES have been working successfully with Qubit Engineering to improve wind farm energy yields. We look forward to seeing the improvements produced by their latest developments with Azure Quantum, which should help us to further increase the value of wind energy projects across our global portfolio,” says Tom Young, Senior R&D Specialist at RES.

Building off this success, the Qubit Engineering and Azure Quantum teams are now working together to address a truly pioneering problem: showcasing the impact of optimization on a large 1,000-turbine wind farm. This requires grappling with tens of thousands of variables and billions of values.

“Such a large problem is simply intractable using traditional optimization techniques,” says Qubit Engineering CEO Marouane Salhi. “The complexity of it makes it incredibly interesting from both an energy and computational perspective. We’re leveraging both quantum-inspired algorithms and the cloud scale offered by Azure Quantum to solve a problem relevant to today’s wind energy industry, against the backdrop of an accelerating need to develop larger and larger wind farms.”

The additional energy that could be generated by optimizing large wind farms with smarter layouts could be in the tens of megawatts per individual wind farm, collectively powering thousands of additional households with no change in physical assets. It’s easy to extrapolate the cumulative potential benefit applied to multiple wind farms.

For Qubit Engineering and Azure Quantum, this work is just scratching the surface of what is possible. Qubit Engineering is researching how to expand to other areas of renewable energy system optimization, while Azure Quantum continues to develop and share with its solution partners a cutting-edge technology platform on which these impactful applications can thrive.  

We are excited to showcase the pioneering work of Qubit Engineering as an example of our community of quantum solution partners. If your enterprise is interested in exploring quantum computing, quantum-inspired optimization, and Azure cloud services for renewable energy solutions optimization, you can express your interest today.


A practical perspective on quantum computing

There’s a lot of speculation about the potential for quantum computing, but to get a clearer vision of the future impact, we need to disentangle myth from reality. At this week’s virtual Q2B conference, we take a pragmatic perspective to cut through the hype and discuss the practicality of quantum computers, how to future-proof quantum software development, and the real value obtained today through quantum-inspired solutions on classical computers.

Achieving practical quantum advantage

Dr. Matthias Troyer, Distinguished Scientist with Microsoft Quantum, explains what will be needed for quantum computing to be better and faster than classical computing in his talk Disentangling Hype from Reality: Achieving Practical Quantum Advantage. People talk about many potential problems they hope quantum computers can help with, including fighting cancer, forecasting the weather, or countering climate change. Having a pragmatic approach to determining real speedups will enable us to focus the work on the areas that will deliver impact.

For example, quantum computers have limited I/O capability and will thus not be good at big data problems. However, the area where quantum does excel is large compute problems on small data. This includes chemistry and materials science, for game-changing solutions like designing better batteries, new catalysts, quantum materials, or countering climate change. But even for compute-intensive problems, we need to take a closer look. Troyer explains that each operation in a quantum algorithm is slower by more than 10 orders of magnitude compared to a classical computer. This means we need a large speedup advantage in the algorithm to overcome the slowdowns intrinsic to the quantum system; we need superquadratic speedups.
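As a hedged illustration of why superquadratic speedups matter, the sketch below compares break-even problem sizes for a quadratic and a cubic algorithmic speedup under an assumed per-operation slowdown; the figures are illustrative, not taken from the talk.

```python
# Back-of-the-envelope crossover analysis: with each quantum operation an
# assumed 10 orders of magnitude slower than a classical one, how large must
# a problem be before the algorithmic speedup wins? Constant factors and
# parallelism are ignored.
slowdown = 1e10   # assumed per-operation slowdown of the quantum machine

# Quadratic speedup: classical N^2 ops vs quantum slowdown * N ops.
# Break-even at N^2 = slowdown * N, i.e. N = slowdown.
print(f"quadratic speedup breaks even at N ~ {slowdown:.0e} operations")

# Cubic (superquadratic) speedup: classical N^3 vs quantum slowdown * N.
# Break-even at N^3 = slowdown * N, i.e. N = sqrt(slowdown).
print(f"cubic speedup breaks even at N ~ {slowdown ** 0.5:.0e} operations")
```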

Troyer is optimistic about the potential for quantum computing but brings a realistic perspective to what is needed to get to practical quantum advantage: small data/big compute problems, superquadratic speedup, fault-tolerant quantum computers scaling to millions of qubits and beyond, and the tools and systems to develop the algorithms to run the quantum systems.

Future-proofing quantum development

Developers and researchers want to ensure they invest in languages and tools that will adapt to the capabilities of more powerful quantum systems in the future. Microsoft’s open-source Quantum Intermediate Representation (QIR) and the Q# programming language provide developers with a flexible foundation that protects their development investments.

QIR is a new Microsoft-developed intermediate representation for quantum programs that is hardware and language agnostic, so it can be a common interface between many languages and target quantum computation platforms. Based on the popular open-source LLVM intermediate language, QIR is designed to enable the development of a broad and flexible ecosystem of software tools for quantum development.

As quantum computing capabilities evolve, we expect large-scale quantum applications will take full advantage of both classical and quantum computing resources working together. QIR provides full capabilities for describing rich classical computation fully integrated with quantum computation. It’s a key layer in achieving a scaled quantum system that can be programmed and controlled for general algorithms.

In his presentation at the Q2B conference, Future-Proofing Your Quantum Development with Q# and QIR, Microsoft Senior Software Engineer Stefan Wernli explains to a technical audience why QIR and Q# are practical investments for long-term quantum development. Learn more about QIR in our recent Quantum Blog post.

Quantum-inspired optimization solutions today

At the same time, there are ways to get practical value today through “quantum-inspired” solutions that apply quantum principles for increased speed and accuracy to algorithms running on classical computers.

We are already seeing how quantum-inspired optimization solutions can solve complex transportation and logistics challenges. An example is Microsoft’s collaboration with Trimble Transportation to optimize its transportation supply chain, presented at the Q2B conference in Freight for the Future: Quantum-Inspired Optimization for Transportation by Anita Ramanan, Microsoft Quantum Software Engineer, and Scott Vanselous, VP Digital Supply Chain Solutions at Trimble.

Trimble’s Vanselous explains how today’s increased dependence on e-commerce and shipping has fundamentally raised expectations across the supply chain. However, there was friction in the supply chain because of siloed data between shippers, carriers, and brokers; limited visibility; and a focus on task optimization vs. system optimization. Trimble and Microsoft are designing quantum-inspired load matching algorithms for a platform that enables all supply chain members to increase efficiency, minimize costs, and take advantage of newly visible opportunities. You can learn more about our collaboration in this video:

Many industries—automotive, aerospace, healthcare, government, finance, manufacturing, and energy—have tough optimization problems where these quantum-inspired solutions can save time and money. And these solutions will only get more valuable when scaled quantum hardware becomes available and provides further acceleration.

How to get started

Explore Microsoft’s quantum-inspired optimization solutions, both pre-built Azure Quantum solutions and custom solutions that run on classical and accelerated compute resources.

Learn how to write quantum code with Q# and the Quantum Development Kit. Write your first quantum program without having to worry about the underlying physics or hardware.

Azure Quantum will be available in preview early next year. Join us for our next Azure Quantum Developer Workshop on February 2, 2021, where you can learn more about our expanding partner ecosystem and the solutions available through the Azure Quantum service. Registration opens today.


Building a bridge to the future of supercomputing with quantum acceleration

Using supercomputing and new tools for understanding quantum algorithms in advance of scaled hardware gives us a view of what may be possible in a future with scaled quantum computing. Microsoft’s new Quantum Intermediate Representation (QIR), designed to bridge different languages and different target quantum computation platforms, is bringing us closer to that goal. Several Department of Energy (DOE) national laboratories are using this Microsoft technology in their research at the new National Quantum Initiative (NQI) quantum research centers.

As quantum computing capabilities mature, we expect most large-scale quantum applications will take full advantage of both classical and quantum computing resources working together. QIR provides a vital bridge between these two worlds by providing full capabilities for describing rich classical computation fully integrated with quantum computation.

QIR is central to a new collaboration between Microsoft and DOE’s Pacific Northwest National Laboratory (PNNL), born out of NQI’s Quantum Science Center (QSC) led by DOE’s Oak Ridge National Laboratory (ORNL). The goal of the PNNL project is to measure the impact of noisy qubits on the accuracy of quantum algorithms, specifically the Variational Quantum Eigensolver (VQE). To run the algorithm in simulation on a supercomputer, the team needed a language in which to write it and a representation to map it onto the machine. PNNL used Microsoft’s Q# language to write the VQE algorithm, and QIR provides the bridge, allowing easy translation and mapping to the supercomputer for the simulation.

The PNNL team is showcasing the simulation running on ORNL’s Summit supercomputer at this week’s virtual International Conference for High Performance Computing, Networking, Storage, and Analysis (SC20). You can view their presentation here: Running Quantum Programs at Scale through an Open-Source, Extensible Framework.

Q# and QIR are also helping to advance research at ORNL, which is accelerating progress by enabling the use of the Q# language for all QSC members, including four national labs, three industry partners, and nine universities. ORNL is integrating Q# and QIR into its existing quantum computing framework, so ORNL researchers can run Q# code on a wide variety of targets including both supercomputer-based simulators and actual hardware devices. Supporting Q# is important to ORNL’s efforts to encourage experimentation with quantum programming in high-level languages.

The ORNL team is using QIR to develop quantum optimizations that work for multiple quantum programming languages. Having a shared intermediate representation allows the team to write optimizations and transformations that are independent of the original programming language. ORNL chose to use QIR because, being based on the popular LLVM suite, it integrates seamlessly with ORNL’s existing platform and provides a common platform that can support all of the different quantum and hybrid quantum/classical programming paradigms.

Since QIR is based on the open source LLVM intermediate language, it will enable the development of a broad ecosystem of software tools around the Q# language. The community can use QIR to experiment and develop optimizations and code transformations that will be crucial for unlocking quantum computing.

Microsoft technology is playing a crucial role in DOE’s NQI initiative connecting experts in industry, national labs, and academia to accelerate our nation’s progress towards a future with scaled quantum computing.

Learn more about the latest developments in quantum computing from Microsoft and our QSC national lab partner PNNL in these virtual SC20 conference sessions.

Visualizing High-Level Quantum Programs (November 11 at 12 pm EST)

Complex quantum programs will require programming frameworks with many of the same features as classical software development, including tools to visualize the behavior of programs and diagnose issues. The Microsoft Quantum team presents new visualization tools being added to the Microsoft Quantum Development Kit (QDK) for visualizing the execution flow of a quantum program at each step during its execution. These tools are valuable for experienced developers and researchers as well as students and newcomers to the field who want to explore and understand quantum algorithms interactively.

Exotic Computation and System Technology: 2006, 2020 and 2035 (November 17 at 11:45am EST)

Dr. Krysta Svore, Microsoft’s General Manager of Quantum Systems and Software, is on this year’s exotic system panel. The SC20 panel will discuss predictions from past years’ sessions, what actually happened, and what will be available for computing systems in 2025, 2030, and 2035.

Density Matrix Quantum Circuit Simulation via the BSP Machine on Modern GPU Clusters (November 17 at 10 am EST)

As quantum computers evolve, simulations of quantum programs on classical computers will be essential in validating quantum algorithms, understanding the effect of system noise and designing applications for future quantum computers. In this paper, PNNL researchers first propose a new multi-GPU programming methodology which constructs a virtual BSP machine on top of modern multi-GPU platforms, and apply this methodology to build a multi-GPU density matrix quantum simulator. Their simulator is more than 10x faster than a corresponding state-vector quantum simulator on various platforms.


Toshiba joins Azure Quantum, providing machine to solve complex optimization problems

Complex optimization problems exist across every industry, such as vehicle routing, supply chain management, risk assessment, portfolio optimization, power grid operations, and many others.  

While a number of sophisticated algorithms have been developed that can solve certain optimization problems very efficiently, many real-world optimization problems remain hard to solve despite the remarkable advancements in both algorithms and computing power over the past decades. These scenarios usually involve many variables and are computationally difficult to tackle using traditional methods.

Leveraging quantum methods allows us to find more accurate solutions in far less time with much less work, even for the most complex problems. In practice, emulating quantum systems has led to promising breakthroughs in MRI technology, improving traffic congestion, materials design, and more.

Emulating nature with Azure Quantum

Many optimization algorithms, such as simulated annealing, parallel tempering Monte Carlo, or genetic algorithms, mimic natural processes. As we’ve developed a deeper understanding of quantum mechanics, new optimizers have been developed that make use of quantum mechanics to accelerate optimization and escape local minima in the cost function landscape through emulating quantum tunneling. 

Simulating these quantum effects on classical computers has led to the development of new types of quantum solutions that run on classical hardware, also called quantum-inspired optimization (QIO) algorithms. These algorithms allow us to exploit some of the advantages of quantum computing approaches today on classical CMOS-based hardware, providing a speedup over traditional approaches. Using quantum solutions on classical hardware also prepares us for the future of quantum optimization on scaled, fault-tolerant quantum hardware.  

Azure Quantum enables customers to run optimization algorithms on industry-scale classical hardware with self-service solutions designed to solve binary optimization problems on CPUs, GPUs, and FPGAs in Azure.  

Toshiba offering Simulated Bifurcation Machine through Azure Quantum

Expanding the portfolio of QIO algorithms and solvers, Microsoft is pleased to announce that Toshiba is joining the Microsoft Quantum Network and will be offering Toshiba’s Simulated Bifurcation Machine (SBM) in Azure Quantum. Toshiba joins existing partners 1Qbit, Honeywell, IonQ, and QCI in providing services to the growing quantum ecosystem. 

Toshiba’s cutting-edge technique quickly obtains highly accurate solutions for complex large-scale combinatorial optimization problems and has demonstrated an approximately 10-fold improvement over other competing devices. Some examples of combinatorial optimization problems include dynamic portfolio management, risk management, and high-frequency trading. Practical applications include optimizing routing for electrical transmission lines considering cost, safety, time, and environmental impact; or finding the shortest route between cities, considering the time of day, traffic incidents, and driver schedule.  

In principle, every computational problem we see in practice can be translated to a particular type of binary optimization problem: searching for the ground state of an Ising model. While, in general, this mapping can be too costly to be practical, combinatorial optimization problems are often easy to rewrite into this form, and for problems that are native to this form (such as planning, scheduling, or partitioning), techniques such as those employed by Toshiba’s solution provide a powerful tool for solving them. 
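To make the Ising mapping concrete, here is a minimal sketch that builds a toy Ising instance and searches for a low-energy spin configuration with ordinary simulated annealing in plain Python. The couplings are random placeholders, and the algorithm shown is standard simulated annealing, not Toshiba’s Simulated Bifurcation Machine.

```python
import math
import random

random.seed(1)

# Toy Ising instance: spins s_i in {-1, +1}, energy E(s) = sum_{i<j} J_ij s_i s_j.
n = 12
J = {(i, j): random.choice([-1.0, 1.0])
     for i in range(n) for j in range(i + 1, n)}

def energy(s):
    return sum(Jij * s[i] * s[j] for (i, j), Jij in J.items())

def simulated_annealing(steps=20000, t_start=5.0, t_end=0.01):
    s = [random.choice([-1, 1]) for _ in range(n)]
    e = energy(s)
    for step in range(steps):
        t = t_start * (t_end / t_start) ** (step / steps)   # geometric cooling
        i = random.randrange(n)
        s[i] *= -1                                          # propose a spin flip
        e_new = energy(s)
        if e_new <= e or random.random() < math.exp((e - e_new) / t):
            e = e_new                                       # accept the move
        else:
            s[i] *= -1                                      # reject: undo the flip
    return s, e

spins, best_energy = simulated_annealing()
print("approximate ground-state energy:", best_energy)
```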

Emerging from quantum computing research at Toshiba, SBM is a practical and ready-to-use Ising model solver—a software solution that is able to solve large-scale combinatorial optimization problems at high speed, while harnessing the GPU resources in the Azure cloud. 

Azure Quantum users will soon be able to utilize the Toshiba SBM to explore highly accurate solutions using quantum methods for their own scenarios. 

Build quantum solutions today

Azure Quantum is an open ecosystem of quantum partners and technologies. Building on decades of quantum research and scalable enterprise cloud offerings at Microsoft, Azure Quantum brings you rich software capabilities and development tools paired with quantum and classical hardware through a familiar Azure environment.

With our ever-growing offerings, like Toshiba’s SBM, achieve immediate impact with quantum-inspired optimization running on classical hardware today and build for tomorrow with quantum hardware. Sign up to become an Azure Quantum early adopter. 

Join us 

With this announcement, we are excited to welcome Toshiba to the Microsoft Quantum Network. The Microsoft Quantum Network is a broad community of individuals and organizations collaborating with Microsoft to advance a comprehensive quantum ecosystem, develop practical solutions, and build a robust quantum workforce.  

To build this right, we need to build it together.  



Microsoft and Copenhagen University researchers create new kind of quantum device

In a paper published this week in Nature Physics, a team of researchers from Microsoft and Copenhagen University demonstrated a novel heterostructure with remarkable properties. A heterostructure is, roughly, a device formed out of a sandwich of different solid materials. When the interfaces between the different materials are clean, the device can have properties that would be difficult, if not impossible, to obtain in any single material. But when the interfaces contain impurities, the device may capture the worst, rather than the best, properties of the materials comprising it.

The device described in the new Microsoft-Copenhagen University paper is a heterostructure between a semiconductor, a superconductor, and a ferromagnet. The three materials and the interfaces between them were fabricated within an ultra-high-vacuum molecular beam epitaxy (MBE) machine, made possible by the compatibility between the growth and fabrication conditions for the three materials: europium sulfide (ferromagnet), aluminum (superconductor), and indium arsenide (semiconductor). This results in extremely flat and clean interfaces.

The authors showed that the device has gate-tunable superconductivity and ferromagnetism induced in, and coexisting in, the semiconductor. These two phenomena, ordinarily antithetical, are able to peacefully coexist due to a property of indium arsenide called spin-orbit coupling. In fact, when such coexistence occurs in a quantum wire device of the type fabricated and measured by the Microsoft-Copenhagen University team, Majorana zero modes can result, enabling such a wire to be an integral component of a topological quantum computer. The new Nature Physics paper shows data that is consistent with the presence of Majorana zero modes in their devices.

Previous devices without a ferromagnetic layer have exhibited similar signatures upon the application of a large magnetic field in a direction aligned with the wire. But such a large field brings problems of its own, including the need to align all of the wires in a topological quantum computer to fairly high accuracy, as well as the field’s possible effect on other components higher in the stack. In the devices created by the Microsoft-Copenhagen University team, the magnetic moment due to the ferromagnetic layer is highly localized and automatically aligned with a preferred crystal axis.

Microsoft’s Quantum program has made a big bet that new methods for the design, fabrication, and measurement of these types of novel heterostructures will be essential if we are to build a commercial-scale quantum computer. While some might argue that tools invented for classical devices will be sufficient to produce quantum devices, Microsoft and Copenhagen University have already shown in previous work that long-envisioned, but never previously realized, combinations of superconducting and semiconducting elements could be grown and fabricated via MBE and probed by quantum transport, overturning conventional wisdom about what is possible. 

Thus, this work has intrinsic interest as a new device type with a unique mix of features and is also a significant step towards the creation of simpler topological quantum computing systems. It is also another example of how Microsoft and its partners, such as Copenhagen University, are reinventing the science and engineering of quantum devices.