Microsoft partners with SAP as the first global cloud provider to launch Project Embrace

When Microsoft and SAP joined forces in 2017, the partnership was aligned around a common goal: to provide enterprise customers with a clear roadmap for driving more business innovation in the cloud. It’s the same principle we called upon last year, when Microsoft teamed up with SAP and Adobe to launch the Open Data Initiative, which empowers companies to unlock more value from their most valuable asset: data.

Today, Microsoft and SAP are announcing a new effort that was built with the customer in mind. Microsoft will be the first global cloud provider to join SAP’s new Embrace program.

Project Embrace provides customers with an aligned pathway to the Intelligent Enterprise

Project Embrace is SAP’s collaboration with public cloud providers and Global Strategic Service Partners to simplify and accelerate a customer’s journey to the cloud. Through a combination of market-approved journeys, reference architectures, select service partners, and the power of the Microsoft Azure cloud, customers will be able to accelerate their digital transformation.

Embrace will help customers better manage refactoring and ongoing support costs, access the latest innovations, and improve the performance of SAP applications by leveraging Microsoft and SAP’s unified blueprint running S/4HANA on Microsoft Azure.

Microsoft joins forces with SAP to accelerate customers’ journey to S/4HANA on Azure

Microsoft has been working with SAP on Project Embrace for more than a year to deliver a joint roadmap with specific integrated reference architectures for running SAP S/4HANA on Microsoft Azure. The collaboration will provide customers with prescriptive guidance that facilitates a move from their current on-premises production landscapes to a digital enterprise in the cloud.

Working hand-in-hand with SAP to make Project Embrace a success, Microsoft is developing an integrated, end-to-end process across product engineering, sales, marketing and support teams. As part of this effort, Microsoft and SAP are also aligning their joint Partner Ecosystem, as well as collaborating closely with the SAP Max Attention team to provide a more seamless customer support experience.

Microsoft and SAP work together to drive success for their joint customers

Microsoft’s unique position as the first global cloud provider to support Project Embrace speaks to its long history of partnership and co-innovation with SAP, a focus on joint customer success, and Microsoft’s strength in the enterprise market.

Microsoft and SAP’s mutual customers have been asking for an offering like Embrace. Cemex, a global leader in the building materials industry, sees the enormous opportunity Embrace delivers, as Carlos Mantilla, the company’s Senior Director of Worldwide IT Enterprise Architecture told us:

“We at CEMEX are excited to further explore Project Embrace with SAP and Microsoft! We believe it will assist us in accelerating our, already in-progress, Digital Transformation and Journey to S4/HANA in the Azure Hyperscale cloud.”

This is a true example of partnership—the power of Microsoft, the power of SAP and a trusted relationship coming together to provide the best cloud experience for customers.

Azure Artifacts updates include pay-per-GB pricing

Alex Mullans

Azure Artifacts is the one place for all of the packages, binaries, tools, and scripts your software team needs. It’s part of Azure DevOps, a suite of tools that helps teams plan, build, and ship software. For Microsoft Build 2019, we’re excited to announce some long-requested changes to the service.

Until now, a separate, additional license was required for anyone using Azure Artifacts, beyond the Azure DevOps Basic license. We heard your feedback that this was inflexible, hard to manage, and often not cost-effective, and we’ve removed it. Now, Azure Artifacts charges only for the storage you use, so that every user in your organization can access and share packages.

Every organization gets 2 GB of free storage. Additional storage usage is charged according to tiered rates starting at $2 per GB and decreasing to $0.25 per GB. Full details can be found on our pricing page.
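To illustrate how tiered per-GB billing of this kind adds up, here is a small sketch. Only the 2 GB free grant, the $2 starting rate, and the $0.25 floor come from the announcement; the intermediate tier boundaries and rates below are assumptions for the example, so consult the pricing page for the authoritative figures.

```python
# Illustrative cost model for tiered per-GB storage pricing.
# Tier boundaries and middle rates are ASSUMED for this sketch;
# only the 2 GB free tier, $2 top rate, and $0.25 floor are from the post.
TIERS = [
    (2, 0.00),             # first 2 GB free
    (10, 2.00),            # next GBs at $2/GB (assumed boundary)
    (100, 1.00),           # assumed middle tier
    (1000, 0.50),          # assumed middle tier
    (float("inf"), 0.25),  # everything beyond, at the $0.25/GB floor
]

def monthly_cost(gb_used: float) -> float:
    """Sum the cost of each tier slice consumed by gb_used."""
    cost, prev_cap = 0.0, 0.0
    for cap, rate in TIERS:
        slice_gb = min(gb_used, cap) - prev_cap
        if slice_gb <= 0:
            break
        cost += slice_gb * rate
        prev_cap = cap
    return round(cost, 2)
```

Under these assumed tiers, 5 GB of storage would bill only the 3 GB above the free grant at the top rate.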

We’ve had support for Python packages, as well as our own Universal Packages, in public preview for some time. As of now, both are generally available and ready for all of your production workloads.

If you’re developing an open source project using a public Azure Repo or a repo on GitHub, you might want to share nightly or pre-release versions of your packages with your project team. Azure Artifacts public feeds will enable you to do just that, backed by the same scale and reliability guarantees as the private feeds you use for internal development. Interested in joining the preview? Get in touch (@alexmullans on Twitter).

With Azure Artifacts, your teams can manage all of their artifacts in one place, with easy-to-configure permissions that help you share packages across the entire organization, or just with people you choose. Azure Artifacts hosts common package types:

  • Maven (for Java development)
  • npm (for Node.js and JavaScript development)
  • NuGet (for .NET, C#, etc. development)
  • Python
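As an example of pointing one of these package types at a feed, an npm project only needs a registry setting. The organization and feed names below are placeholders for this sketch:

```ini
; Illustrative .npmrc for an Azure Artifacts npm feed.
; 'myorg' and 'myfeed' are placeholder names, not real identifiers.
registry=https://pkgs.dev.azure.com/myorg/_packaging/myfeed/npm/registry/
always-auth=true
```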

Screenshot of Azure Artifacts

If none of those are what you need, Azure Artifacts provides Universal Packages, an easy-to-use and lightweight package format that can take any file or set of files and version them as a single entity. Universal Packages are fast, using deduplication to minimize the amount of content you upload to the service.

Azure Artifacts is also a symbol server. Publishing your symbols to Azure Artifacts enables engineers in the next room or on the next continent to easily debug the packages you share.

Artifacts are most commonly used as part of DevOps processes and pipelines, so we’ve naturally integrated Azure Artifacts with Azure Pipelines. It’s easy to consume and publish packages to Azure Artifacts in your builds and releases.
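As a sketch of that integration, a build pipeline might pack and push a NuGet package to a feed like this. The feed name is a placeholder, and the fragment assumes the standard Azure Pipelines NuGetCommand task:

```yaml
# Illustrative azure-pipelines.yml fragment; 'MyFeed' is a placeholder feed name.
steps:
- task: NuGetCommand@2
  inputs:
    command: 'pack'
    packagesToPack: '**/*.csproj'
- task: NuGetCommand@2
  inputs:
    command: 'push'
    nuGetFeedType: 'internal'
    publishVstsFeed: 'MyFeed'   # target Azure Artifacts feed
```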

We’re excited for you to try Azure Artifacts. If you’ve got questions, comments, or feature suggestions, get in touch on Twitter (@alexmullans) or leave a comment.

Alex Mullans

Senior Program Manager, Azure Artifacts

ID@Xbox Game Fest brings a month of games for everyone

We’re excited to kick off the fourth annual ID@Xbox Game Fest this week, where we celebrate the diverse range of games and experiences offered by independent developers on Xbox One.

From May 7-27, we’ll be unearthing some of the ID@Xbox program’s hidden gems for players to discover. Beginning this week until May 20, we will focus on Gaming for Everyone and feature diverse stories, voices, creators, and characters across 31 participating titles. We’ll wrap up the month (May 21-27) with an ID@Xbox Super Sale highlighting even more games that have released through the ID@Xbox program.

As a Game Fest tradition, we talked with the developers behind some of these games to learn more about what Gaming for Everyone means to them as well as how their own personal stories have impacted the experience and creative processes for their games.

1979 Revolution: Black Friday – “Diversity of experience is one facet of the Gaming for Everyone ethos, and an interactive drama set against the turbulent backdrop of Middle Eastern politics and civil unrest certainly isn’t your typical video game. 1979 Revolution: Black Friday not only demonstrates the medium as a unique platform for powerful story-telling, but also how it can be used as an engaging and entertaining means to inform and educate on a complex, real-world subject that, normally, would fall beyond the interests and/or awareness of many people.” – Matt Cundy, Digerati community manager

“Our goal was to breathe life into a piece of history that few understand intimately. By engaging with a diverse cast of characters and making narrative choices, we were able to blend the forms of gaming and documentary to capture a digital recreation of Iran in 1979 for a whole new generation to discover.” – Navid Khonsari, creator of 1979 Revolution: Black Friday and iNK Stories co-founder

Dandara – “Even after many rewrites and inspirations, I think Dandara is always going to be a statement about Brazilian culture and history. It’s an outsider’s scream telling you that inspiration can be found by looking out of your own window, or even closer, on your own story. Better yet if it is able to touch you through those inspirations, with stories and people you have not stopped to think about.” – Luke Icenhower, marketing manager

Firewatch – “Firewatch is a single-player, first-person mystery. We love games that transport you to a faraway, unusual place and tell you a gripping story, but we don’t so much love games that trap you in impossible combat scenarios or esoteric puzzles along the way. So, we made Firewatch a mystery for everybody – where the game design isn’t about how good you are with a gun but what it feels like to be alone in the woods with (almost) no one to trust.” – Sean Vanaman, writer/designer

In Between – “In Between started out as part of my bachelor’s thesis back in university, dealing with the taboo topics of death and dying within a video game. In the end, our mission with In Between was to tell a story that follows a specific theme but still allows players their own interpretations and provides different possibilities for self-identification. We hope to give an insight into the thoughts and emotions of family members, friends, or anyone else who finds themselves in these seemingly hopeless situations, and to make it easier for other people to relate to them.” – Daniel, art director, co-founder

Night in the Woods – “Night in the Woods was an interesting project to write code and music for because, while we had an overall outline in mind, we left a lot of things open for ‘improvisation.’ I had to stay on my toes to make sure we could adapt the code to whatever new ideas might come up. It’s really cool that Night in the Woods has meant a lot to a bunch of folks, because it originated from this very odd and personal process.” – Alec Holowka, programmer, composer and game designer

The Path of Motus – “I think the game deals with issues that most people face daily. I know many people that had big dreams as a kid, but as they get older society beats those ideas out of their head and they settle for less than what they want in life. I hope this game speaks to those people and shows them that it’s never too late to pursue your dreams in life. I know some people are stuck in their situation due to monetary problems, so I also released a free educational video series showing people how to program games on our website (pathofmotus.com) to help folks that are interested in game development. I hope The Path of Motus can have some type of positive impact on everyone that plays.” – Michael Hicks, designer

Where the Bees Make Honey – “I make games to express myself and to communicate with others. Through the power of interactivity, games offer experiences from different cultures and communities.” – Brian Wilson, developer

See below for the rest of the games included in Game Fest this week:

39 Days to Mars

Aritana and the Harpy’s Feather

Blackwood Crossing

Celeste

Fragments of Him

Gone Home: Console Edition

I, Hope

Monica e a Guarda dos Coelhos

Mulaka

Never Alone Arctic Collection

Nippon Marathon

Old Man’s Journey

Perception

SOMA

Storm Boy

Tacoma

The First Tree

The King’s Bird

The Mooseman

The Town of Light

Three Fourths Home: Extended Edition

What Remains of Edith Finch

Wheels of Aurelia

Yonder: The Cloud Catcher Chronicles

This is just a taste of all the great games included in ID@Xbox Game Fest. The Gaming for Everyone theme will run from May 2 through May 20, so jump in and start discovering the games behind these exceptional stories.

To see all titles included in the first week of Game Fest visit Xbox.com.

3 investments Microsoft is making to improve identity management

As a large enterprise with global reach, Microsoft has the same security risks as its customers. We have a distributed, mobile workforce who access corporate resources from external networks. Many individuals struggle to remember complex passwords or reuse one password across many accounts, which makes them vulnerable to attackers. As Microsoft has embraced digital transformation for our own business, we shifted to a security strategy that places strong employee identities at the center. Many of our customers are on a similar journey and may find value in our current identity management approach.

Our goal is to reduce the risk of compromised identity and empower people to be efficient and agile whether they’re on our network or not.

Our identity management solutions focus on three key areas:

  • Securing administrator accounts
  • Eliminating passwords
  • Simplifying identity provisioning

Read on for more details for each of these investment areas, advice on scaling your investment to meet your budget, and a wrap-up of some key insights that can help you smoothly implement new policies.

Securing administrator accounts

Our administrators have access to Microsoft’s most sensitive data and systems, which makes them a target of attackers. To improve protection of our organization, it’s important to limit the number of people who have privileged access and implement elevated controls for when, how, and where administrator accounts can be used. This helps reduce the odds that a malicious actor will gain access.

There are three practices that we advise:

  • Secure devices—Establish a separate device for administrative tasks that is updated and patched with the most recent software and operating system. Set the security controls at high levels and prevent administrative tasks from being executed remotely.
  • Isolated identity—Issue an administrator identity from a separate namespace or forest that cannot access the internet and is different from the user’s information worker identity. Our administrators are required to use a smartcard to access this account.
  • Non-persistent access—Provide zero rights by default to administrator accounts. Require that they request just-in-time (JIT) privileges that give them access for a finite amount of time and log it in a system.
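
The just-in-time idea above can be sketched in a few lines. This is a minimal in-memory model assuming a simple broker; the names and structure are illustrative, not Microsoft's implementation:

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone

@dataclass
class JitGrant:
    account: str
    role: str
    expires_at: datetime

@dataclass
class JitBroker:
    """Zero standing privilege: admins hold no rights until granted."""
    grants: dict = field(default_factory=dict)
    audit_log: list = field(default_factory=list)

    def request(self, account: str, role: str, minutes: int = 60) -> JitGrant:
        # Grant access for a finite window and log the grant for auditing.
        grant = JitGrant(account, role,
                         datetime.now(timezone.utc) + timedelta(minutes=minutes))
        self.grants[(account, role)] = grant
        self.audit_log.append(("granted", account, role, grant.expires_at))
        return grant

    def has_access(self, account: str, role: str) -> bool:
        # Access exists only while an unexpired grant is on record.
        grant = self.grants.get((account, role))
        return grant is not None and datetime.now(timezone.utc) < grant.expires_at
```

A real system would of course back this with a directory and approval workflow; the point is that access defaults to nothing and every elevation is time-boxed and logged.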

Budget allocations may limit the amount that you can invest in these three areas; however, we still recommend that you do all three at the level that makes sense for your organization. Calibrate the level of security controls on the secure device to meet your risk profile.

Eliminating passwords

The security community has recognized for several years that passwords are not safe. Users struggle to create and remember dozens of complex passwords, and attackers excel at acquiring passwords through methods like password spray attacks and phishing. When Microsoft first explored the use of Multi-Factor Authentication (MFA) for our workforce, we issued smartcards to each employee. This was a very secure authentication method; however, it was cumbersome for employees. They found workarounds, such as forwarding work email to a personal account, that made us less safe.

Eventually we realized that eliminating passwords was a much better solution. This drove home an important lesson: as you institute policies to improve security, always remember that a great user experience is critical for adoption.

Here are steps you can take to prepare for a password-less world:

  • Enforce MFA—Conform to the Fast Identity Online (FIDO2) standard, so you can require a PIN and a biometric for authentication rather than a password. Windows Hello is one good example, but choose the MFA method that works for your organization.
  • Reduce legacy authentication workflows—Place apps that require passwords into a separate user access portal and migrate most users to modern authentication flows. At Microsoft, only 10 percent of our users enter a password on a given day.
  • Remove passwords—Create consistency across Active Directory and Azure Active Directory (Azure AD) to enable administrators to remove passwords from the identity directory.

Simplifying identity provisioning

We believe the most underrated identity management step you can take is to simplify identity provisioning. Set up your identities with access to exactly the right systems and tools. If you provide too much access, you put the organization at risk if the identity becomes compromised. However, under-provisioning may encourage people to request access for more than they need in order to avoid requesting permission again.

We take these two approaches:

  • Set up role-based access—Identify the systems, tools, and resources that each role needs to do their job. Establish access rules that make it easy to give a new user the right permissions when you set up their account or they change roles.
  • Establish an identity governance process—Make sure that as people move roles they don’t carry forward access they no longer need.
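
The two approaches above can be sketched as a tiny role-based provisioning model. The role names and permission strings are hypothetical, chosen only to show the shape of the idea:

```python
# Hypothetical role-to-permission mapping; all names are illustrative.
ROLE_PERMISSIONS = {
    "developer": {"repo:read", "repo:write", "pipeline:run"},
    "support":   {"ticket:read", "ticket:write"},
    "finance":   {"invoice:read", "invoice:approve"},
}

def provision(role: str) -> set:
    """Grant exactly the permissions the role needs - no more, no less."""
    return set(ROLE_PERMISSIONS.get(role, set()))

def on_role_change(current: set, new_role: str) -> set:
    """Governance step: access from the old role is not carried forward."""
    return provision(new_role)
```

The governance point is in `on_role_change`: a role move replaces the permission set rather than adding to it, so stale access cannot accumulate.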

Establishing the right access for each role is so important that, if you are only able to follow one of our recommendations, focus on identity provisioning and lifecycle management.

What we learned

As you take steps to improve your identity management, keep in mind the following lessons Microsoft has learned along the way:

  • Enterprise-level cultural shifts—Getting the technology and hardware resources for a more secure enterprise can be difficult. Getting people to modify their behavior is even harder. To successfully roll out a new initiative, plan for enterprise-level cultural shifts.
  • Beyond the device—Strong identity management works hand-in-hand with healthy devices.
  • Security starts at provisioning—Don’t put governance off until later. Identity governance is crucial to ensure that companies of all sizes can audit the access privileges of all accounts. Invest early in capabilities that give the right people access to the right things at the right time.
  • User experience—We found that if you combine user experience factors with security best practices, you get the best outcome.

Learn more

For more details on how identity management fits within the overall Microsoft security framework and our roadmap forward, watch the Speaking of security: Identity management webinar.

Red Hat and Microsoft fuel hybrid cloud development with Azure Red Hat OpenShift

  • Co-developed solution brings the industry’s most comprehensive enterprise Kubernetes platform to Microsoft Azure
  • First jointly managed OpenShift offering in the public cloud now available

BOSTON – RED HAT SUMMIT 2019 – MAY 7, 2019 – Red Hat, Inc. (NYSE: RHT), the world’s leading provider of open source solutions, and Microsoft Corp. today announced the general availability of Azure Red Hat OpenShift, which brings a jointly-managed enterprise-grade Kubernetes solution to Microsoft’s leading public cloud, Microsoft Azure. Azure Red Hat OpenShift provides a powerful on-ramp to hybrid cloud computing, enabling IT organizations to use Red Hat OpenShift Container Platform in their datacenters and more seamlessly extend these workloads to use the power and scale of Azure services. The availability of Azure Red Hat OpenShift marks the first jointly managed OpenShift offering in the public cloud.

Both Red Hat and Microsoft recognize the importance of hybrid cloud computing to modern IT, as organizations look to expand resources with public cloud infrastructure while maintaining existing on-premises investments. Kubernetes provides a common bridge between the datacenter and public cloud environments, making it a key technology in enabling true hybrid cloud computing.

Azure Red Hat OpenShift combines the innovation of enterprise Kubernetes with the world’s leading enterprise Linux platform, Red Hat Enterprise Linux, running on the scale and power of Azure. Together, these technologies provide a powerful solution for more easily managing and orchestrating cloud-native workloads across a hybrid cloud environment. With Azure Red Hat OpenShift, customers can also bring containerized applications into workflows where they exist, while mitigating many of the inherent complexities of container management.

A fully-managed, jointly-operated service, Azure Red Hat OpenShift is backed by both the open source expertise of Red Hat and the public cloud might of Microsoft. Customers receive an integrated experience, including unified sign-up, on-boarding, service management and technical support. The service is added into customers’ existing Azure bill, further streamlining the user experience.

Additionally, Azure Red Hat OpenShift offers enterprise developers and operations teams:

  • Fully managed clusters with master, infrastructure and application nodes managed by Microsoft and Red Hat; plus, no VMs to operate and no patching required.
  • Regulatory compliance provided through compliance certifications similar to those of other Azure services.
  • Enhanced flexibility to more freely move applications from on-premises environments to the Azure public cloud via the consistent foundation of OpenShift.
  • Greater speed to connect to Azure services from on-premises OpenShift deployments.
  • Extended productivity with easier access to Azure public cloud services such as Azure Cosmos DB, Azure Machine Learning and Azure SQL DB for building the next-generation of cloud-native enterprise applications.

Azure Red Hat OpenShift represents Red Hat and Microsoft’s continued mutual commitment to provide a powerful, supported and more secure choice for developing and deploying hybrid cloud workloads. Jointly supported by both companies, IT organizations can have greater confidence in adopting hybrid cloud innovation that meets the requirements of mission-critical workloads in production.

Microsoft and Red Hat are also collaborating to bring customers containerized solutions with Red Hat Enterprise Linux 8 on Azure, Red Hat Ansible Engine 2.8 and Ansible Certified modules. In addition, the two companies are working to deliver SQL Server 2019 with Red Hat Enterprise Linux 8 support and performance enhancements.

Availability

Azure Red Hat OpenShift is available now via Microsoft Azure.

Supporting Quotes

Paul Cormier, president, Products and Technologies, Red Hat

“Hybrid cloud provides a clear vision into the future of enterprise computing, where public cloud services stand alongside virtualization, Linux containers and bare-metal servers. Together, this forms the new datacenter in the hybrid cloud world. Azure Red Hat OpenShift provides a consistent Kubernetes foundation for enterprises to realize the benefits of this hybrid cloud model. This enables IT leaders to innovate with a platform that offers a common fabric for both app developers and operations.”

Scott Guthrie, executive vice president, Cloud and AI Group, Microsoft

“Microsoft and Red Hat share a common goal of empowering enterprises to create a hybrid cloud environment that meets their current and future business needs. Azure Red Hat OpenShift combines the enterprise leadership of Azure with the power of Red Hat OpenShift to simplify container management on Kubernetes and help customers innovate on their cloud journeys.”

Dave More, senior vice president, Travel Solutions Development Platform, Sabre

“Hybrid cloud technologies fuel our next generation platform, with Red Hat OpenShift forming the common, modern foundation for us to build innovative, cloud-native applications that can span from our data centers to the public cloud. Red Hat OpenShift simplifies our ability to create services that work more seamlessly across hybrid cloud architectures, letting us consume cloud-scale resources, including on Azure, while also enabling us to move workloads wherever and whenever needed through Red Hat OpenShift.”

About Red Hat, Inc.

Red Hat is the world’s leading provider of enterprise open source software solutions, using a community-powered approach to deliver reliable and high-performing Linux, hybrid cloud, container, and Kubernetes technologies. Red Hat helps customers integrate new and existing IT applications, develop cloud-native applications, standardize on our industry-leading operating system, and automate, secure, and manage complex environments. Award-winning support, training, and consulting services make Red Hat a trusted adviser to the Fortune 500. As a strategic partner to cloud providers, system integrators, application vendors, customers, and open source communities, Red Hat can help organizations prepare for the digital future.

About Microsoft

Microsoft (Nasdaq “MSFT” @microsoft) enables digital transformation for the era of an intelligent cloud and an intelligent edge. Its mission is to empower every person and every organization on the planet to achieve more.               

Media Contacts:

John Terrill
Red Hat, Inc.
+1-571-421-8132
jterrill@redhat.com

Microsoft Media Relations
WE Communications for Microsoft
+1-425-638-7777
rrt@we-worldwide.com

Azure IoT at Build: making IoT solutions easier to develop, more powerful to use

IoT is transforming every business on the planet, and that transformation is accelerating. Companies are harnessing billions of IoT devices to help them find valuable insights into critical parts of their business that were previously not connected—how customers are using their products, when to service assets before they break down, how to reduce energy consumption, how to optimize operations, and thousands of other use cases limited only by companies’ imagination.

Microsoft is leading in IoT because we’re passionate about simplifying IoT so any company can benefit from it quickly and securely.

Last year we announced a $5 billion commitment, and this year we highlighted the momentum we are seeing in the industry. This week, at our premier developer conference, Microsoft Build in Seattle, we’re thrilled to share our latest innovations that further simplify IoT and dramatically accelerate time to value for customers and partners.

Accelerating IoT

Developing a cloud-based IoT solution with Azure IoT has never been faster or more secure, yet we’re always looking for ways to make it easier. From working with customers and partners, we’ve seen an opportunity to accelerate on the device side.

Part of the challenge we see is the tight coupling between the software written on devices and the software that has to match it in the cloud. To illustrate this, it’s worth looking at a similar problem from the past and how it was solved.

Early versions of Windows faced a challenge in supporting a broad set of connected devices like keyboards and mice. Each device came with its own software, which had to be installed on Windows for the device to function. The software on the device and the software that had to be installed on Windows had a tight coupling, and this tight coupling made the development process slow and fragile for device makers.

Windows solved this with Plug and Play, which at its core was a capability model that devices could declare and present to Windows when they were connected. This capability model made it possible for thousands of different devices to connect to Windows and be used without any software having to be installed on Windows.

IoT Plug and Play

Late last week, we announced IoT Plug and Play, which is based on an open modeling language that allows IoT devices to declare their capabilities. That declaration, called a device capability model, is presented when IoT devices connect to cloud solutions like Azure IoT Central and partner solutions, which can then automatically understand the device and start interacting with it—all without writing any code.
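As a rough illustration only: the exact schema is defined by the open modeling language, and every identifier below is made up. A device capability model declaring a temperature telemetry and a serial-number property might look something like this:

```json
{
  "@id": "urn:contoso:thermostat:1",
  "@type": "CapabilityModel",
  "displayName": "Contoso Thermostat",
  "implements": [
    {
      "schema": {
        "@type": "Interface",
        "contents": [
          { "@type": "Telemetry", "name": "temperature", "schema": "double" },
          { "@type": "Property", "name": "serialNumber", "schema": "string" }
        ]
      }
    }
  ]
}
```

A cloud solution that receives this declaration can render UI and route telemetry for the device without any device-specific code being written.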

IoT Plug and Play also enables our hardware partners to build IoT Plug and Play compatible devices, which can then be certified with our Azure Certified for IoT program and used by customers and partners right away. This approach works with devices running any operating system, be it Linux, Android, Azure Sphere OS, Windows IoT, RTOSs, and more. And all of our IoT Plug and Play support is open source as always.

Finally, Visual Studio Code will support modeling an IoT Plug and Play device capability model as well as generating IoT device software based on that model, which dramatically accelerates IoT device software development.

We’ll be demonstrating IoT Plug and Play at Build, and it will be available in preview this summer. To design IoT Plug and Play, we’ve worked with a large set of launch partners to ensure their hardware is certified ready:

Image: IoT Plug and Play launch partners

Certified-ready devices are now published in the Azure IoT Device Catalog for the preview, and while Azure IoT Central and Azure IoT Hub will be the first services integrated with IoT Plug and Play, we will add support for Azure Digital Twins and other solutions in the months to come. Watch this video to learn more about IoT Plug and Play.

Announcing IoT Plug and Play connectivity partners

With increased options for low-power networking, the role of cellular technologies in IoT projects is on the rise. Today we’re introducing IoT Plug and Play connectivity partners. Deep integration between these partners’ technologies and Azure IoT simplifies customer deployments and adds new capabilities.

This week at Build, we are highlighting the first of these integrations, which leverages Trust Onboard from Twilio. The integration uses security features built into the SIM to automatically authenticate and connect to Azure, providing a secure means of uniquely identifying IoT devices that work with current manufacturing processes.

These are some of the many connectivity partners we are working with:

Image: IoT Plug and Play connectivity partners

Making Azure IoT Central more powerful for developers

Last year we announced the general availability of Azure IoT Central, which enables customers and partners to provision an IoT application in 15 seconds, customize it in hours, and go to production the same day—all without writing code in the cloud.

While many customers build their IoT solutions directly on our Azure IoT platform services, we’re seeing a groundswell of customers and partners drawn to the rapid application development Azure IoT Central provides. And, of course, Azure IoT Central is built on the same great Azure IoT platform services.

Today at Build, we’re announcing a set of new features that speak to how we’re enabling and simplifying Azure IoT Central for developers. We’ll show some of these innovations, such as new personalization features that make it easy for customers and partners to modify Azure IoT Central’s UI to conform with their own look and feel. In the Build keynote, we’ll show how Starbucks is using this personalization feature for their Azure IoT Central solution connected to Azure Sphere devices in their stores.

We’ll also demonstrate Azure IoT Central working with IoT Plug and Play to show how fast and easy this makes it to build an end-to-end IoT solution, with Microsoft still wearing the pager and keeping everything up and running so customers and partners can focus on the benefits IoT provides. Watch this video to learn more about Azure IoT Central announcements.

The growing Azure Sphere hardware ecosystem

Azure Sphere is Microsoft’s comprehensive solution for easily creating secured MCU-powered IoT devices. Azure Sphere is an integrated system that includes MCUs with built-in Microsoft security technology, an OS based on a custom Linux kernel, and a cloud-based security service. Azure Sphere delivers secured communications between device and cloud, device authentication and attestation, and ongoing OS and security updates. Azure Sphere provides robust defense-in-depth device security to limit the reach and impact of remote attacks and to renew device health through security updates.

At Build this week, we’ll showcase a new set of solutions such as hardware modules that speed up time to market for device makers, development kits that help organizations prototype quickly, and our new guardian modules.

Guardian modules are a new class of device built on Azure Sphere that protect brownfield equipment, mitigating risks and unlocking the benefits of IoT. They attach physically to brownfield equipment with no equipment redesign required, processing data and controlling devices without ever exposing vital operational equipment to the network. Through guardian modules, Azure Sphere secures brownfield devices, protects operational equipment from disabling attacks, simplifies device retrofit projects, and boosts equipment efficiency through over-the-air updates and IoT connectivity.

The seven modules and devkits on display at Build are:

  • Avnet Guardian Module. Unlocks brownfield IoT by bringing Azure Sphere’s security to equipment previously deemed too critical to be connected. Available soon.
  • Avnet MT3620 Starter Kit. Azure Sphere prototyping and development platform. Connectors allow easy expandability options with a range of MikroE Click and Grove modules. Available May 2019.
  • Avnet Wi-Fi Module. Azure Sphere-based module designed for easy final product assembly. Simplifies quality assurance with stamp hole (castellated) pin design. Available June 2019.
  • AI-Link WF-M620-RSC1 Wi-Fi Module. Designed for cost-sensitive applications. Simplifies quality assurance with stamp hole (castellated) pin design. Available now.
  • SEEED MT3620 Development Board. Designed for comprehensive prototyping. Available expansion shields enable Ethernet connectivity and support for Grove modules. Available now.
  • SEEED MT3620 Mini Development Board. Designed for size-constrained prototypes. Built on the AI-Link module for a quick path from prototype to commercialization. Available May 2019.
  • USI Dual Band Wi-Fi + Bluetooth Combo Module. Supports BLE and Bluetooth 5 Mesh. Can also work as an NFC tag (for non-contact Bluetooth pairing and device provisioning). Available soon.

For those who want to learn more about the modules, you can find specs for each and links to more information on our Azure Sphere hardware ecosystem page.

See Azure Sphere in action at Build

Azure Sphere is also taking center stage at Build during Satya Nadella’s keynote this week. Microsoft customer and fellow Seattle-area company Starbucks will showcase how it is testing Azure IoT capabilities and guardian modules built on Azure Sphere within select equipment to enable partners and employees to better engage with customers, manage energy consumption, reduce waste, ensure beverage consistency, and facilitate predictive maintenance. The company’s solution will also be on display in the Starbucks Technology booth.

Announcing new Azure IoT Edge innovations

Today, we are announcing the public preview of Azure IoT Edge support for Kubernetes. This enables customers and partners to deploy an Azure IoT Edge workload to a Kubernetes cluster on premises. We’re seeing Azure IoT Edge workloads being used in business-critical systems at the edge. With this new integration, customers can use the feature-rich and resilient infrastructure layer that Kubernetes provides to run their Azure IoT Edge workloads, which are managed centrally and securely from Azure IoT Hub. Watch this video to learn more.

Additional IoT Edge announcements include:

  • Preview of Azure IoT Edge support for Linux ARM64 (expected to be available in June 2019).
  • General availability of IoT Edge extended offline support.
  • General availability of IoT Edge support for Windows 10 IoT Enterprise x64.
  • New provisioning capabilities using X.509 certificates and SAS tokens.
  • New built-in troubleshooting tooling.
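
The SAS-token half of that provisioning pairing follows a documented scheme: the token is an HMAC-SHA256 signature over the URL-encoded resource URI plus an expiry timestamp, signed with the device’s base64-encoded symmetric key. A minimal sketch (the hub name, device ID, and key below are placeholders):

```python
import base64
import hashlib
import hmac
import time
import urllib.parse

def generate_sas_token(resource_uri: str, device_key_b64: str, ttl_seconds: int = 3600) -> str:
    """Derive a SharedAccessSignature for an IoT Hub resource.

    The signature is HMAC-SHA256 over "{url-encoded-uri}\n{expiry}",
    keyed with the base64-decoded device key.
    """
    expiry = int(time.time()) + ttl_seconds
    encoded_uri = urllib.parse.quote(resource_uri, safe="")
    to_sign = f"{encoded_uri}\n{expiry}".encode("utf-8")
    key = base64.b64decode(device_key_b64)
    signature = base64.b64encode(hmac.new(key, to_sign, hashlib.sha256).digest())
    return (
        "SharedAccessSignature "
        f"sr={encoded_uri}"
        f"&sig={urllib.parse.quote(signature, safe='')}"
        f"&se={expiry}"
    )

# Hypothetical hub/device names and a dummy key, for illustration only.
token = generate_sas_token(
    "myhub.azure-devices.net/devices/mydevice",
    base64.b64encode(b"not-a-real-key").decode(),
)
print(token[:40])
```

In production the device key never leaves the device or the provisioning service; this sketch only shows the shape of the token.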

A common use case for IoT Edge is transforming cameras into smart sensors that understand the physical world and enable a digital feedback loop: finding a missing product on a shelf, detecting damaged goods, and so on. These examples require demanding computer vision algorithms that deliver consistent and reliable results, large-scale streaming capabilities, and specialized hardware for faster processing to provide real-time insights to businesses. At Build, we’re partnering with Lenovo and NVIDIA to simplify the development and deployment of these applications at scale. With the NVIDIA DeepStream SDK for general-purpose streaming analytics, a single IoT Edge server running on Lenovo hardware can process up to 70 channels of 1080p/30fps H.265 video to offer a cost-effective, faster time-to-market solution.
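
As a sanity check on that figure, the aggregate decode workload implied by 70 simultaneous 1080p/30fps streams works out as follows:

```python
channels = 70
width, height, fps = 1920, 1080, 30  # 1080p at 30 frames per second

frames_per_second = channels * fps
pixels_per_second = frames_per_second * width * height

print(f"{frames_per_second} frames/s")            # 2100 frames/s
print(f"{pixels_per_second / 1e9:.2f} Gpixel/s")  # 4.35 Gpixel/s
```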

This summer, NVIDIA DeepStream SDK will be available from the IoT Edge marketplace. In addition, Lenovo’s new ThinkServer SE350 and GPU-powered “tiny” edge gateways will be certified for IoT Edge.

Announcing Mobility Services through Azure Maps

Today, an increasing number of apps built on Azure are designed to take advantage of location information.

Last November, we announced a new platform partnership for Azure Maps with the world’s number-one transit service provider, Moovit. What we’re achieving through this partnership is similar to what we’ve built to date with TomTom. At Build this week, we’re announcing Azure Maps Mobility Services, a set of APIs built on Moovit’s services for creating modern mobility solutions.

Through these new services, we’re able to integrate public transit, bike shares, scooter shares, and more to deliver route recommendations that let customers plan trips across alternative modes of transportation, optimizing for travel time and minimizing traffic congestion. Customers will also be able to access real-time intelligence on bike and scooter docking stations and car-share vehicle availability, including present and expected availability and real-time transit stop arrivals.
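
As a rough illustration of the kind of decision these services enable, the sketch below picks among multimodal options using real-time availability. The data and selection logic are hypothetical stand-ins, not the Azure Maps API itself:

```python
from dataclasses import dataclass

@dataclass
class RouteOption:
    mode: str                # e.g. "transit", "bike_share", "scooter_share"
    travel_minutes: float    # estimated door-to-door travel time
    vehicles_available: int  # real-time availability near the origin

def best_route(options):
    """Pick the fastest option that currently has a vehicle (or seat) available."""
    feasible = [o for o in options if o.vehicles_available > 0]
    return min(feasible, key=lambda o: o.travel_minutes)

# Hypothetical real-time data of the kind the Mobility Services would surface.
options = [
    RouteOption("transit", 28, 1),
    RouteOption("bike_share", 22, 0),     # dock is empty right now
    RouteOption("scooter_share", 19, 3),
]
print(best_route(options).mode)  # scooter_share
```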

Customers can use Azure Maps for IoT applications—or any application that uses geospatial or location data, such as apps for field service, logistics, manufacturing, and smart cities. Retail apps may integrate mobility intelligence to help customers access their stores or plan future store locations that optimize for transit accessibility. Field services apps may guide employees from one customer to another based on real-time service demand. City planners may use mobility intelligence to analyze the movement of occupants to plan their own mobility services, visualize new developments, and prioritize locations in the interests of occupants.

You can stay up to date about how Azure Maps is paving the way for the next generation of location services on the Azure Maps blog, and if you’re at Build this week, be sure to visit the Azure Maps booth to see our mobility and spatial operations services in action.

Simplifying development of robotic systems with Windows 10 IoT

Microsoft and Open Robotics have worked together to make the Robot Operating System (ROS) generally available for Windows 10 IoT. Additionally, we’re making it even easier to build ROS solutions by adding upcoming Windows support, debugging, and visualization to a community-supported Visual Studio Code extension. Read more about integration between Windows 10 IoT and ROS.
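
For context, ROS applications are structured as nodes that exchange typed messages over named topics. The toy broker below mimics that publish/subscribe pattern in plain Python; it illustrates the model only and is not the actual ROS client library (rospy/rclpy):

```python
from collections import defaultdict

class TopicBus:
    """A toy stand-in for ROS's topic-based publish/subscribe model."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def publish(self, topic, message):
        # Deliver the message to every node subscribed to this topic.
        for callback in self._subscribers[topic]:
            callback(message)

# A "sensor node" publishes range readings; a "control node" reacts.
bus = TopicBus()
log = []
bus.subscribe("/sensor/range",
              lambda m: log.append(f"stop! obstacle at {m}m" if m < 0.5 else "clear"))
bus.publish("/sensor/range", 0.3)
bus.publish("/sensor/range", 2.0)
print(log)  # ['stop! obstacle at 0.3m', 'clear']
```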

Come see us at Build

If you’re in Seattle this week, you can see some of these new technologies in our booth, and even play around with them at our IoT Hands-on Lab. I’ll also be hosting a session on our IoT Vision and Roadmap. Stop by to hear more details about these announcements and see some of these exciting new technologies in action.

Business Applications ISV news at Build 2019

Microsoft Build 2019 is here, and thousands of developers are learning about the latest technologies, sharing best practices with colleagues, and writing lots of code. Last week I wrote about business applications sessions to see at Build, and on Monday I discussed updates for helping ISVs (independent software vendors) and developers be successful from both a business and a code perspective.

A few years ago, Microsoft introduced an IP co-sell program to help our digitally transforming enterprise customers get the software they needed from our Azure ISV partners. Many of these customers were moving to the cloud, so we started by focusing on Azure. During this time roughly 3,000 ISVs have generated over $5 billion in partner revenue from the collaboration between Azure sellers and ISVs. Following up on last month’s announcement about an upcoming program for business applications ISVs, Scott Guthrie announced the expansion of the IP co-sell program to include Dynamics 365 and Power Platform partners. Including these products and Azure in the program will make it easier for ISVs and Microsoft sellers to collaborate in serving our joint enterprise customers.

Enterprise customers increasingly see software at the core of their business and are developing a deeper understanding of their software needs. Beyond person-to-person sales engagements, there will be times when they want to buy an app directly while having confidence in the quality of what they’re receiving. Like consumers, enterprises are familiar with getting their software through marketplaces, and the ability for ISVs to transact through marketplaces lowers the barrier to reaching enterprise customers. On Monday, we announced that SaaS transaction capabilities have been added to AppSource and the Azure Marketplace. As part of this, we will enable transactability support for Dynamics 365 and Power Platform partner apps over the coming months. This is just a start to what we are doing with the marketplace, and now is the perfect time to get familiar with our partner program that launches in July to learn more about the benefits that come with being a Dynamics 365 and Power Platform partner.

The Power Platform provides great general tooling for creating business apps, while the Common Data Model (CDM) creates a standard representation of that data. There are times when an industry-focused solution and data model are better than starting from scratch. For these cases, Dynamics 365 Industry Accelerators provide industry-specific implementations, and we’re announcing private previews in automotive and financial services. The accelerator for the automotive industry enables you to build connected customer experiences based on a proven common data model designed to transform consumer experiences and enable smart mobility services. In financial services, we have built accelerators to help you develop banking solutions in the retail and commercial space with enhanced ways to engage customers and provide an improved customer banking experience. The automotive and financial services accelerators join previously announced accelerators in healthcare, higher education, and nonprofit. Sign up to learn more about how you can participate or help us build the next set of industry accelerators.

During my talk at Build, I showed how all of this fits together through three phases: data at the core, empowering domain experts to be citizen developers, and showing how developers and ISVs can build depth solutions. We used healthcare as an end-to-end scenario that we can all relate to and showcased how we’re infusing AI into all three phases in a way that every industry can start to take advantage of, moving from BI to AI. Lastly, I was delighted to welcome on stage two customers who are delivering innovative solutions. HandsFree Health™ is a new startup founded by senior healthcare executives, including the former President of Aetna. They built an innovative new home healthcare device using Microsoft AI technologies spanning speech, bots, vision, and a companion Xamarin mobile app that we showed on stage. We also took a deeper look at what ISVs can do with Indegene, a global healthcare solutions provider that is building the next generation of cloud applications for life sciences with the Dynamics 365 Healthcare Accelerator.

If you are at Build or following along at home, I have created a list of some Dynamics 365 and Power Platform sessions that you might be interested in. Enjoy your Build 2019 experience and we look forward to seeing how you build, extend, or connect Dynamics 365 and the Power Platform.

Cheers,

Guggs

Azure introduces new innovations for SAP HANA, expanded AI collaboration with SAP

For many enterprises, modernizing ERP systems is key to achieving their digital transformation goals. At Microsoft, we are committed to supporting our customers by offering the single best infrastructure choice for SAP HANA, bar none.

In terms of raw capabilities, we not only have the largest number of SAP HANA-certified offerings (25 configurations spanning virtual machines and purpose-built bare metal instances from 192 GB to 24 TB), but also the widest footprint of regions with SAP HANA-certified infrastructure (26, with plans to launch 8 more by the end of 2019). We also support some of the largest deployments of SAP HANA in the public cloud, such as CONA Services.

In partnership with SAP, we are very happy to announce multiple enhancements to SAP on Azure at SAPPHIRE NOW. We will offer our customers even more infrastructure choices: greater VM memory, more bare metal instance options, and stronger business continuity.

In addition, we are announcing deeper integration between SAP and Azure around AI, data protection, and identity. These integrations will help our joint customers accelerate their digital transformation with the power of the cloud.

Here’s what’s new:

  • 6 TB and 12 TB VMs for SAP HANA: Azure’s Mv2 VM series will be available on May 13, offering virtual machines with up to 6 TB of RAM on a single VM. This is by far the largest-memory SAP HANA-certified configuration offered on any virtual machine in the public cloud. 6 TB Mv2 VMs will be generally available and production certified in the U.S. East and U.S. East 2 regions. The U.S. West 2, Europe West, Europe North, and Southeast Asia regions will follow in the coming months.

    In addition, 12 TB Mv2 VMs will become available and production certified for SAP HANA in Q3 2019. With this, customers with large-scale SAP HANA deployments can take advantage of the agility offered by Azure Virtual Machines to speed SAP release cycles by spinning up dev/test systems in minutes and to simplify operational processes with Azure’s integrated tools for automated patching, monitoring, backup, and disaster recovery.

  • Largest Bare Metal Instance with Intel Optane for SAP HANA: In Q4 2019 we plan to launch the largest Intel Optane-optimized bare metal instances in the cloud with our SAP HANA on Azure Large Instances, including general availability of a 4-socket, 9 TB memory instance and a preview of an 8-socket, 18 TB memory instance. These instances enable customers to benefit from faster load times for SAP HANA data in case of a restart, offering a lower Recovery Time Objective (RTO) and a reduced TCO. To learn more, please get in touch with your Microsoft representative.
  • Integration of Azure AI in SAP’s digital platform: SAP’s machine learning capabilities will leverage Azure Cognitive Services containers in preview for face recognition and text recognition. By deploying Cognitive Services in containers, SAP will be able to analyze information closer to the physical world where the data resides and deliver real-time insights and immersive experiences that are highly responsive and contextually aware.

    "SAP’s Machine Learning team is working with Microsoft’s Azure Cognitive Services team to augment its own portfolio of homegrown and partner services by leveraging the containerized Vision and Text Recognition services for solving identity validation and text understanding use cases." – Dr. Sebastian Wieczorek, VP, Head of SAP Leonardo Machine Learning Foundation

  • SAP Data Custodian on Microsoft Azure is now available: In September 2018, we announced our intent to make SAP Data Custodian, a SaaS offering, available on Microsoft Azure. We deliver on that promise today. Together, SAP and Microsoft offer unprecedented levels of data governance and compliance for our joint customers. Additionally, Microsoft will be a beta customer for SAP Data Custodian for our implementation of SAP SuccessFactors on Azure. For more information, you can read this blog from SAP.
  • Managed business continuity with Azure Backup for SAP HANA: Azure Backup support for SAP HANA databases is now in public preview. With this, customers can back up large-scale SAP HANA implementations without managing any backup infrastructure. For more information, please refer to the Azure Backup for SAP HANA documentation.
  • Simplified integrations with Logic Apps connector for SAP: Today, the Logic Apps connector for SAP ECC and SAP S/4HANA is generally available for all customers. Azure Logic Apps is an integration platform-as-a-service offering connectors to 250+ applications and SaaS services. With this, customers can dramatically reduce time to market for integrations between SAP and best-in-class SaaS applications. For more information, check out our Logic Apps SAP connector documentation.
  • Boosted productivity and enhanced security with Azure Active Directory and SAP Cloud Platform: Today, standards-based integration between Azure Active Directory and SAP Cloud Platform is in preview, enabling enhanced business security and experience. For example, when using SAP Cloud Platform Identity Provisioning and Identity Authentication Services, customers can integrate SAP SuccessFactors with Azure Active Directory and ensure seamless access to SAP applications such as SAP S/4HANA, improving end-user productivity while meeting enterprise security needs.

Customers benefiting from SAP on Azure

With more than 90% of the Fortune 500 using Microsoft Azure and SAP, our 25-year partnership with SAP has always been about mutual customer success. We are confident the announcements made today will help customers using SAP on Azure grow and innovate even more than they already are: Forrester’s Total Economic Impact Study found that SAP customers on Azure, on average, can realize an ROI of 102% with a payback in under nine months from their cloud investments.

Here are five reasons SAP customers increasingly choose Azure for their digital transformation, and some customers who are benefitting:

  • Business agility: With Azure’s on-demand SAP-certified infrastructure, customers can speed up dev/test processes, shorten SAP release cycles, and scale instantaneously to meet peak business usage. By moving to SAP S/4HANA on Azure’s M-series virtual machines, Daimler AG sped up procurement processes, delivering months faster than would have been possible in its on-premises environment, and now powers 400,000 suppliers worldwide.
  • Efficient insights: Dairy Farmers of America migrated its fragmented IT application landscape, spread across 18 data centers and including mission-critical SAP systems, over to Azure. It leverages Azure Data Services and Power BI to let remote users easily access SAP data in a simplified and secure manner.
  • Real-time operations with IoT: Coats, a world leader in industrial threads, migrated away from SAP on Oracle to SAP HANA on Azure several years ago, enabling Coats to optimize operations with newer IoT-driven processes. With IoT monitoring, Coats now predicts inventory, manufacturing and sales trends more accurately than ever before.
  • Transforming with AI: Carlsberg, a world leader in beer brewing, migrated 80% of its enterprise applications to Microsoft Azure, including mission critical SAP apps. By leveraging Azure AI and sensors from research universities in Denmark, Carlsberg’s Beer Fingerprinting Project enabled them to map a flavor fingerprint for each sample and reduce the time it takes to research taste combinations and processes by up to a third, helping the company get more distinct beers to market faster.
  • Mission-critical infrastructure: CONA Services, the services arm for Coca-Cola bottlers, chose Azure to run its 24 TB mission-critical SAP BW on HANA system on Azure’s purpose-built SAP HANA infrastructure, powering 160,000 orders a day and representing $21 billion in annual net sales.

Over the past few years, we have seen customers across all industries and geographies running their mission-critical SAP workloads on Azure. Whether it’s Co-op and Coca-Cola in retail, Accenture and Malaysia Airlines in services, Astellas Pharma and Zuellig Pharma in pharmaceuticals, or Rio Tinto and Devon Energy in oil and gas, SAP on Azure helps businesses around the world with their digital transformation.

If you are at SAPPHIRE NOW, drop by the Microsoft booth #729 to learn about these product enhancements and to experience hands-on demos of these scenarios.

Meet team EasyGlucose, the 2019 Imagine Cup World Champion

The 17th annual Imagine Cup brought together thousands of students from across the globe over eight months of coding, collaboration, and competition. Through hackathons, online semifinals, and in-person Regional Final events, the 2019 competition season all built up to one moment—the World Championship stage live from Microsoft Build. For the first time, our finalist teams pitched their projects to kick off Microsoft’s premier developer conference.

Congratulations to team EasyGlucose from the United States, which took home the 2019 Imagine Cup trophy for its deep learning, low-cost, and non-invasive blood glucose level monitor for diabetics. The team won USD 100,000, a mentoring session with Microsoft CEO Satya Nadella, USD 50,000 in Azure grants, and ongoing mentoring from M12.

Imagine Cup aims to empower future innovators with the tools and resources to bring their technology solutions to life with Azure. This year’s competition saw many teams developing inspiring and game-changing projects focused on solving key business and societal issues. Teams Caeli from Asia, Finderr from the UK, and EasyGlucose from the USA each won their Regional Final round to advance to the final stage of the competition. At Microsoft Build, they pitched their original machine learning, artificial intelligence, and virtual machine projects, spanning solutions in healthcare and accessibility, live to a panel of three expert judges, who selected the most comprehensive idea.

Watch the show and relive the moment of Team EasyGlucose winning the trophy:

Meet the top 3 teams and recap their journey to the World Championship:

2019 Imagine Cup World Champion: Team EasyGlucose, United States

EasyGlucose is a cloud-powered, non-invasive, and cost-effective method of blood glucose monitoring for diabetic patients. A deep learning computer vision framework using convolutional neural networks developed with Azure Virtual Machines analyzes iris morphological variation in an eye image to predict a patient’s blood glucose level. Recap their journey through the Americas Regional Final.
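
The team’s exact network isn’t public, but the building block of any convolutional framework is easy to show: a small kernel sliding over an image. A pure-Python sketch of that core operation:

```python
def conv2d(image, kernel):
    """Valid-mode 2D convolution (really cross-correlation, as CNNs use)."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    return [
        [
            sum(image[i + di][j + dj] * kernel[di][dj]
                for di in range(kh) for dj in range(kw))
            for j in range(out_w)
        ]
        for i in range(out_h)
    ]

# A vertical-edge detector applied to a tiny "image" containing an edge.
image = [
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
]
kernel = [[1, -1],
          [1, -1]]
print(conv2d(image, kernel))  # [[0, -2, 0], [0, -2, 0]]
```

A CNN layer applies many such kernels at once, with learned weights; frameworks run the same arithmetic on GPUs at scale.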

“I want to make cost-effective and painless blood glucose monitoring to all diabetic patients around the globe, and Imagine Cup enables me not only to share my idea and get invaluable public feedback, but also to obtain funding and keep validating and improving EasyGlucose.” – Bryan, EasyGlucose 

2nd place: Team Caeli, India

Caeli is a smart automated anti-pollution and drug delivery mask specifically designed for asthmatic and chronic respiratory patients. It implements breakthrough features and Azure Machine Learning in a portable format to improve the quality of life for respiratory patients living in polluted areas. Recap their journey through the Asia Regional Final.

"Caeli wanted to build something that could help our society in surviving…here in Imagine Cup we found it suitable to showcase the possibilities and draw industry attention towards this global issue." – Team Caeli

3rd place: Team Finderr, United Kingdom

The team won the Azure Champ Challenge at OxfordHack, which inspired them to submit their project to Imagine Cup. They created an app solution which uses Cognitive Services and Virtual Machines to help make finding lost objects accessible to visually impaired individuals through their phones. Recap their journey through the EMEA Regional Final.

"We're really, really excited to have the chance to be able to bring our project to fruition to help visually impaired users." – Team Finderr

Registration for the 2020 competition is now open. Join over two million student competitors worldwide in creating purpose from your passion and sign up for Imagine Cup today!

Accelerating the journey from automated to autonomous systems

Microsoft’s autonomous systems platform overcomes some of these challenges by using a unique approach called machine teaching. It relies on a developer’s or subject matter expert’s knowledge — someone who may not have a background in AI but understands how to steer a drill or keep the airflow in an office building at safe levels — to break a large problem into smaller chunks.

Instead of having reinforcement learning algorithms explore how to solve a problem randomly or naively, which could take forever, that person uses a programming language called Inkling to show the system how to solve simpler problems first and provide clues about what’s important. This shortcuts the learning process and enables the algorithms to hit on a solution much faster.

Microsoft’s platform also enables non-AI experts to establish and tweak the reward system, which is key to arriving at a solution that truly works. And it selects and configures the algorithms to tackle the task, eliminating the need for machine learning experts to custom-build solutions.

For instance, team members worked with Schneider Electric, a global company working to digitally transform energy management in homes, buildings and industries, to test whether AI could help reduce the carbon footprint of HVAC systems that are used to heat and cool large commercial buildings.

“Schneider is very focused on sustainability, and large buildings are a top contributor to carbon pollution. So there’s a really important mandate to make HVAC systems more energy efficient,” said Barry Coflan, senior vice president and chief technology officer for Schneider Electric’s EcoBuildings Division.

Building on a longstanding relationship, the two companies conducted a proof-of-concept test using the Microsoft toolchain and a Schneider-supplied simulation to train an AI system to autonomously run the HVAC systems that controlled airflow and heating in a conference room. It had to balance saving energy with other goals, such as keeping the temperature comfortable for people inside and making sure there’s enough fresh air to keep carbon dioxide levels from building up.

Optimizing for all those factors — which are controlled by different physical systems — requires far more intelligence than a simple thermostat, says Microsoft’s Hammond. The system has to account for environmental variables that are constantly changing: energy costs that fluctuate throughout the day, people coming and going from the room, what the outside weather is doing, the physics of how air flows.

Using a machine teaching approach, Schneider and Microsoft experts first taught the reinforcement learning system to control temperature well. Then the AI system learned how to control airflow to keep air quality at healthy levels. Finally, it learned to consider how room occupancy affected those outcomes.
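
A reward of the kind described, layering comfort and air-quality terms on top of an energy penalty, might look like the hypothetical sketch below (illustrative weights and thresholds, not Schneider’s actual reward):

```python
def hvac_reward(temp_c, co2_ppm, energy_kw, occupied, setpoint_c=21.0):
    """Hypothetical shaped reward balancing comfort, air quality, and energy.

    Comfort and air-quality penalties only apply when the room is occupied;
    energy use is always penalized.
    """
    reward = -0.1 * energy_kw                 # energy term, always on
    if occupied:
        reward -= abs(temp_c - setpoint_c)    # comfort term
        if co2_ppm > 1000:                    # air-quality term
            reward -= (co2_ppm - 1000) / 100.0
    return reward

# An empty room rewards saving energy; an occupied, stuffy, cold room does not.
print(hvac_reward(temp_c=18, co2_ppm=600, energy_kw=1.0, occupied=False))  # -0.1
print(hvac_reward(temp_c=18, co2_ppm=1200, energy_kw=1.0, occupied=True))  # -5.1
```

Exposing terms like these is what lets a domain expert, rather than an ML specialist, tune what “truly works” means for their building.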

Taking all those factors into account, Microsoft’s AI system was able to reduce energy consumption in the room by about 20 percent, while preserving comfort and high air quality when it mattered. The teams are now embarking on a second phase of collaboration to scale the simulation across different types of rooms and further boost energy savings.

Coflan said the laddered approach to teaching and the ability to layer in different rewards enabled Schneider Electric to understand how the AI system was learning and track which factors contributed to the biggest gains.

“A lot of what we do has safety ramifications so we really need to understand how the AI system is making decisions,” Coflan said. “This approach lets you see how the system is getting smarter and gives you an audit trail that is essential for safety and reproducibility. Our customers would want that too — you can’t just put a system out there and say ‘Trust us.’”

Microsoft’s autonomous systems platform uses a simulated warehouse environment in AirSim to train an intelligent forklift to pick up and deliver loads autonomously while recognizing and avoiding other obstacles. This video illustrates the vision for a future warehouse with pre-trained, intelligent forklifts assisting people in everyday activities.

Running simulation at scale in Azure

Because no company can afford to let a robot or an intelligent control system make millions of mistakes in a real-world factory or wind farm or highway as it is learning, reinforcement learning algorithms need to practice in a simulated environment that can replicate the thousands or millions of different real-world scenarios they might encounter.

The Microsoft toolchain also includes AirSim, an open source simulation platform originally developed by Microsoft researchers to train drones, self-driving cars, and robots in high-fidelity simulated environments. Or, the team can work with customers to train autonomous systems using existing industry-specific simulators.

In either case, running these data-hungry simulations in the Azure cloud enables the system to test thousands of different decision-making sequences in parallel, which allows the AI models to learn what does and doesn’t work much faster.

"If I have the ability to spawn thousands of simulations at once and in each one the pedestrian crossing the street is different and the curve of the road is different, suddenly the AI system is able to gather much more diverse experience in a short amount of time," said Ashish Kapoor, Microsoft principal research manager. "Azure gives us the ability to run these simulations at scale, which is really important."
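
That fan-out pattern maps naturally onto a pool of workers, each running one randomized episode. In the miniature sketch below, local threads stand in for Azure-hosted simulators, and the scenario parameters are invented for illustration:

```python
import random
from concurrent.futures import ThreadPoolExecutor

def run_episode(seed):
    """One randomized simulation episode (a stand-in for a real simulator)."""
    rng = random.Random(seed)
    pedestrian_offset = rng.uniform(-2.0, 2.0)  # every episode differs
    road_curvature = rng.uniform(0.0, 0.3)
    # ...a policy would act here; we just return the scenario summary.
    return {"seed": seed, "offset": pedestrian_offset, "curve": road_curvature}

# Fan out many episodes in parallel (Azure would spread these across machines).
with ThreadPoolExecutor(max_workers=8) as pool:
    results = list(pool.map(run_episode, range(100)))

print(len(results), "episodes collected")
```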

AirSim also allows developers to train different AI and control tools to solve different parts of more complex problems. In helping develop autonomous forklifts for Toyota Material Handling, for instance, researchers broke the task down into sub-concepts that are simpler to learn and debug: navigating to the load, aligning with the pallet, picking it up, detecting other people and forklifts, delivering the pallet, returning to the charging station.

In these complex scenarios, Kapoor said, it may make sense to use reinforcement learning to train a forklift on basic control tasks, like picking up a pallet. Machine teaching helps the system learn in progressively more difficult steps, such as aligning the lift horizontally and then finding the proper angles.

But other parts of the problem might be better solved by entirely different tools like obstacle detection and avoidance algorithms, robotics path planning or classical control techniques. Decomposing the larger task into smaller ones allows developers to select and deploy the best tool for that particular job.
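
One way to picture that decomposition is as a sequence of independently built skills, each of which can be trained, debugged, and implemented with a different tool. The function names below are illustrative, not the actual project code:

```python
# Each sub-concept is developed on its own, then sequenced into the full task.
def navigate_to_load(state):  return {**state, "at_load": True}
def align_with_pallet(state): return {**state, "aligned": True}
def pick_up_pallet(state):    return {**state, "carrying": True}
def deliver_pallet(state):    return {**state, "carrying": False, "delivered": True}
def return_to_charger(state): return {**state, "at_charger": True}

FORKLIFT_TASK = [navigate_to_load, align_with_pallet, pick_up_pallet,
                 deliver_pallet, return_to_charger]

def run_task(skills, state):
    for skill in skills:
        state = skill(state)  # RL, path planning, or classical control per skill
    return state

final = run_task(FORKLIFT_TASK, {"delivered": False})
print(final["delivered"], final["at_charger"])  # True True
```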

“We are working to provide a comprehensive platform for customers who want to build intelligent autonomous systems, covering development, operation and end-to-end lifecycle management,” Hammond said.

Top image: An experimental version of the Sarcos Guardian S, a visual inspection robot that can be used in disaster recovery or for industrial inspections, has learned to avoid obstacles and climb stairs on its own using Microsoft’s autonomous systems platform. Photo by Dan DeLong for Microsoft.

Microsoft Build 2019 — related autonomous systems links:

Jennifer Langston writes about Microsoft research and innovation. Follow her on Twitter.