
New $1M grant aims to advance the role of ‘humanware’ in academic research

Cloud computing is changing the model of cyberinfrastructure for academic research. Microsoft is supporting the efforts of the Pervasive Technology Institute (PTI) at Indiana University to go beyond technology and invest in the people on campus who work with the academic research community to adopt technologies and tools that enhance collaboration, accelerate discovery, and share findings.

This Humanware project will provide an honorarium, cloud credits, and technical support to individuals on college campuses who apply directly to PTI to become members of the Cloud Research Software Engineers (CRSE) community. This is a community that prides itself on combining technical expertise with human skills to help others across campus.

PTI program management and First Wave CRSEs attending Campus Connections Summit 2019

Brian Voss | Research Engagement Manager

Brian Voss, who coined the term Humanware, is leading this initiative for Indiana University as the Research Engagement Manager responsible for building the CRSE community at campuses across North America. Brian is a respected technology executive with over 25 years’ experience at higher education institutions as a CIO and as a director of research computing infrastructure.

Craig Stewart, Executive Director of the Pervasive Technology Institute and recipient of the grant, will participate in the summit and discuss NSF collaboration with the worldwide and US sales teams.

Eight applicants have been accepted as First Wave members of the CRSE community, from Rice, UNC, Purdue, UC Berkeley, the University of Nebraska–Lincoln, Georgia Tech, Stanford, and the University of Kentucky.

The following CRSEs will kick off the program, visiting Microsoft headquarters in Redmond, WA, and participating as guests in Campus Connections Summit 2019.

John Mulligan | Rice University

John will work with humanities- and social-science-focused teams to design and deliver interactive visualizations that allow researchers to see their data in a new light, and to build custom web interfaces that automate the cross-indexing of several databases, helping his researchers accelerate and share their work.

Eleftheria (Ria) Kontou | University of North Carolina

Ria proposes to use Microsoft Azure Machine Learning Studio to integrate geospatial data from GPS traces, travel surveys, and trip datasets with socio-demographic and economic characteristics, to assess the impact of ride-sourcing on transportation system safety. The project will use data from 1.5 million ride-sourcing trips in Austin, TX, overlaid with accident and traffic datasets, and will apply econometric models (time series with spatial autocorrelation) and heuristic algorithms to perform the analysis.
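
The econometric piece of this proposal hinges on spatial autocorrelation. As a rough illustration of the statistic involved, here is a minimal pure-Python sketch of global Moran's I; the zone values and adjacency matrix below are made up, and the real analysis would of course use the Austin trip and accident datasets:

```python
def morans_i(values, weights):
    """Global Moran's I: spatial autocorrelation of `values` under a
    symmetric spatial weights matrix `weights` (zeros on the diagonal)."""
    n = len(values)
    mean = sum(values) / n
    z = [v - mean for v in values]
    num = n * sum(weights[i][j] * z[i] * z[j] for i in range(n) for j in range(n))
    den = sum(map(sum, weights)) * sum(zi * zi for zi in z)
    return num / den

# Four zones: zones 0-1 are neighbours, zones 2-3 are neighbours.
crash_rates = [10.0, 11.0, 1.0, 2.0]   # hypothetical per-zone values
W = [[0, 1, 0, 0],
     [1, 0, 0, 0],
     [0, 0, 0, 1],
     [0, 0, 1, 0]]
print(round(morans_i(crash_rates, W), 3))  # close to +1: strong spatial clustering
```

Values near +1 indicate that similar values cluster in space, which is exactly the effect the time-series models must account for.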

Kris Ezra | Purdue University

Kris is proposing to enhance and exercise a model with respect to metrics of interest within a stochastic, parametrically defined design space, to showcase the tremendous benefit of high-performance cloud computing environments as efficient, cost effective, and well-suited to Systems of Systems (SoS) research, now and in the future.

Dan Sholler | University of California – Berkeley

Dan will study the status of cloud research in a discipline to categorize the types of research employing cloud-based tools, and document how cloud computing has changed the methodological approaches, research roles, and necessary skills required for scientific discovery.  The proposal aims to develop actionable recommendations for promoting cloud research, governing cloud services use, and augmenting the humanware systems scientists rely upon to coordinate discovery.

Derek Weitzel | University of Nebraska – Lincoln

Derek will work with the UNL community to advance the integration of cloud CI resources by adopting an NSF project, SciTokens, to securely store and transfer identity tokens that grant access to secure storage and computing resources. This aims to overcome a key barrier to using commercial cloud CI services: the management of security credentials.
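
SciTokens are built on JSON Web Tokens (JWTs), whose payload segment is base64url-encoded JSON. A minimal standard-library sketch of inspecting such a token's claims; the token below is fabricated for illustration, and real code must verify the signature against the issuer's public key before trusting any claim:

```python
import base64
import json

def decode_payload(token):
    """Decode the (unverified) payload segment of a JWT-style token.
    Real deployments must verify the signature before trusting claims."""
    payload_b64 = token.split(".")[1]
    padded = payload_b64 + "=" * (-len(payload_b64) % 4)  # restore b64 padding
    return json.loads(base64.urlsafe_b64decode(padded))

# Illustrative claims resembling a SciToken: issuer, capability scopes, expiry.
claims = {"iss": "https://demo.scitokens.org",
          "scope": "read:/data write:/data/user",
          "exp": 1735689600}
payload = base64.urlsafe_b64encode(json.dumps(claims).encode()).decode().rstrip("=")
fake_token = "eyJhbGciOiJFUzI1NiJ9." + payload + ".signature"
print(decode_payload(fake_token)["scope"])
```

The `scope` claim is what carries the capability ("read this path, write that path"), which is how SciTokens avoids shipping long-lived passwords or grid certificates around.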

Nuyun (Nellie) Zhang | Georgia Institute of Technology

Nellie, in her role with Georgia Tech’s PACE (Partnership for an Advanced Computing Environment), will provide training and one-on-one support for the Georgia Tech research community, integrating Microsoft Azure HPC into existing workflows for machine learning and data-intensive research.

The CRSEs below are starting at the same time but are unable to visit Redmond this week due to scheduling conflicts.

Josiah K. Leong | Stanford University

Josiah will work with his lab to use Microsoft Azure’s Cognitive Services platform to analyze neuroimaging data from the Adolescent Brain Cognitive Development study.

Yongwook Song | University of Kentucky

Yongwook will work with his team to develop a machine learning-based data analysis platform, using the TensorFlow estimator API in Azure Machine Learning and TFRecordDataset to maximize throughput and the utilization of cloud-scale GPUs for single-molecule studies of in vivo protein oligomerization.


Securing the future of AI and machine learning: Early findings from new research paper

Artificial intelligence (AI) and machine learning are making a big impact on how people work, socialize, and live their lives. As consumption of products and services built around AI and machine learning increases, specialized actions must be undertaken to safeguard not only your customers and their data, but also to protect your AI and algorithms from abuse, trolling, and extraction.

We are pleased to announce the release of a research paper, Securing the Future of Artificial Intelligence and Machine Learning at Microsoft, focused on net-new security engineering challenges in the AI and machine learning space, with a strong focus on protecting algorithms, data, and services. This content was developed in partnership with Microsoft’s AI and Research group. It’s referenced in The Future Computed: Artificial Intelligence and its role in society by Brad Smith and Harry Shum, as well as cited in the Responsible bots: 10 guidelines for developers of conversational AI.

This document focuses entirely on security engineering issues unique to the AI and machine learning space, but due to the expansive nature of the InfoSec domain, it’s understood that issues and findings discussed here will overlap to a degree with the domains of privacy and ethics. As this document highlights challenges of strategic importance to the tech industry, the target audience for this document is security engineering leadership industry-wide.

Our early findings suggest that:

  1. Secure development and operations foundations must incorporate the concepts of Resilience and Discretion when protecting AI and the data under its control.
  • AI-specific pivots are required in many traditional security domains such as Authentication, Authorization, Input Validation, and Denial of Service mitigation.
  • Without investments in these areas, AI/machine learning services will continue to fight an uphill battle against adversaries of all skill levels.
  2. Machine learning models are largely unable to discern between malicious input and benign anomalous data. A significant source of training data is derived from un-curated, unmoderated public datasets that may be open to third-party contributions.
  • Attackers don’t need to compromise datasets when they are free to contribute to them. Such dataset poisoning attacks can go unnoticed while model performance inexplicably degrades.
  • Over time, low-confidence malicious data becomes high-confidence trusted data, provided that the data structure/formatting remains correct and the quantity of malicious data points is sufficiently high.
  3. Given the great number of layers of hidden classifiers/neurons that can be leveraged in a deep learning model, too much trust is placed on the output of AI/machine learning decision-making processes and algorithms without a critical understanding of how these decisions were reached.
  • AI/machine learning is increasingly used in support of high-value decision-making processes in medicine and other industries where the wrong decision may result in serious injury or death.
  • AI must have built-in forensic capabilities. This enables enterprises to provide customers with transparency and accountability of their AI, ensuring its actions are not only verifiably correct but also legally defensible.
  • When combined with data provenance/lineage tools, these capabilities can also function as an early form of “AI intrusion detection,” allowing engineers to determine the exact point in time that a decision was made by a classifier, what data influenced it, and whether or not that data was trustworthy.
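
As a toy illustration of the input-validation point above: before third-party contributions enter a training set, they can at minimum be screened against the distribution of already-trusted data. The threshold and data here are hypothetical, and a real defense would also track provenance and rate-limit individual contributors:

```python
from statistics import mean, stdev

def screen_contributions(trusted, candidates, z_max=3.0):
    """Accept only candidate points within z_max standard deviations of the
    trusted data's mean: a crude first-line guard against dataset poisoning."""
    mu, sigma = mean(trusted), stdev(trusted)
    accepted, rejected = [], []
    for x in candidates:
        (accepted if abs(x - mu) <= z_max * sigma else rejected).append(x)
    return accepted, rejected

trusted = [9.8, 10.1, 10.0, 9.9, 10.2, 10.0]   # curated baseline measurements
ok, bad = screen_contributions(trusted, [10.05, 9.7, 42.0])
print(ok, bad)  # the obvious outlier is quarantined, not silently trained on
```

Note the limitation the findings point out: a patient attacker who keeps each contribution inside the acceptance band can still shift the distribution over time, which is why provenance tracking matters as much as outlier rejection.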

Our goal is to bring awareness and energy to the issues highlighted in this paper while driving new research investigations and product security investments across Microsoft. Read the Securing the Future of Artificial Intelligence and Machine Learning at Microsoft paper to learn more.


Test your quantum programming skills in the Microsoft Q# Coding Contest

Whether you’re new to quantum computing and want to improve your skills, or have done quantum programming before and need a new challenge, we have just the thing for you: The second Microsoft Q# Coding Contest. Designed to help developers ramp up quickly in quantum computing and quantum programming, this contest will help participants build the expertise they’ll need to be ready for the advent of true quantum computing.

Organized in collaboration with Codeforces.com, the contest will be held March 1-4, 2019. It will offer the participants a selection of quantum programming problems of varying difficulty. In each problem, you’ll write Q# code to implement a transformation on qubits, or perform a more challenging task. The top 50 participants will win a Microsoft Quantum T-shirt.

This contest is the second in a series that began last July. The first contest offered problems on introductory topics in quantum computing: Superposition, measurement, quantum oracles, and simple algorithms. This second contest will take some of these topics to the next level as well as introduce some new ones.

For those eager to get a head start in the competition, a warm-up round will be held February 22-25, 2019. It will feature a set of relatively simple problems and focus on helping participants become familiar with the contest environment, the submission system, and the problem format. The warm-up round is a great introduction to quantum programming, both for those new to Q# and for those looking to refresh their skills.

Another great way to prepare for the contest is to work your way through the Microsoft Quantum Development Kit katas. The katas allow you to test and debug your solutions locally, giving you immediate feedback on your code.

Katas measurements in Visual Studio

Q# can be used with Visual Studio, Visual Studio Code, or the command line, on Windows, macOS, or Linux, providing an easy way to get started with quantum programming. Any of these platforms can be used in the contest.

We hope to see you at the second global Microsoft Q# Coding Contest!


Lessons from the South Side: The dad who helped raise an empathetic engineer

One day, young Heather came home from junior high school with an idea. She’d often toss out ideas to her father for what she could be when she grew up. Back then, he was endearingly tough to please, she remembered. Not in an unloving way, but in a way that emboldened Heather to challenge herself.

Her dad was reading in his tattered, gray arm chair. She touched his arm gently and signed, “Dad! What if I become a sign-language interpreter?”

He peered at her over his book, set it down, and signed, “No.”

“What? I thought you’d be excited. I’ve been interpreting for you and mom my whole life,” she pressed. “Why not?”

“Too safe for you,” said Royce. “Believe in yourself. Do what makes you uncomfortable.”

While Heather walked out of the room thinking it strange that he wouldn’t want her to choose interpreting as a career, she knew that he was right. She would go on interpreting for her family (years later she even interpreted for then-President Barack Obama during a 2015 national monument dedication in Chicago), but as a career, she would keep looking for something that challenged her.

*****

After her sophomore year of high school, Heather went to a summer engineering program at Chicago State University. There, she got to dig into the hardware of all the gadgets she loved. As she soldered the circuit board for a phone, she thought about how, at that time, so many people with hearing loss couldn’t use the phone easily: the feedback on hearing aids made people on the other end of the line sound like they were in a construction zone. “Wow, I could make accessible technology and really change people’s lives.”

“Today, I never take it for granted that I can send my dad a text if he doesn’t see me standing at the door. The world of pagers and mobile phones really changed our world,” Heather said.

It turns out that there was a name for being an inventor of technology for people who are Deaf or hard-of-hearing. She came home that summer and told her dad, “Engineer. I’m going to be an engineer.” At that point, Heather had never heard of any female, black engineers. Surely that qualified as uncomfortable.

Royce said nothing, his kind eyes narrowing in on hers. He smirked, shrugged, and then walked down the narrow hallway to rest up for the evening’s shift at the post office.


Cleaning your room takes effort. Blurring your background on Skype does not

We’ve all had those moments: You’re about to video call your parents and your laundry is all over the place, or you’re about to have a meeting with a potential investor and your business plan is on a whiteboard behind you, or you’re being interviewed on live television and your adorable child comes marching into the room. There are plenty of life’s moments that can get in the way of you being the focus in every video call—and that’s why we’re introducing background blur in Skype video calls.

Animated image of a background being blurred in Skype.

Background blur in Skype is similar to background blur in Microsoft Teams. It takes the stress out of turning on your video and puts the focus where it belongs—on you! With a simple toggle, right-click, or even through your Skype settings, your background will be instantly and subtly blurred, leaving just you as the only focal point.*

Background blur in Skype and Teams uses artificial intelligence (AI)—trained in human form detection—to keep you in focus during your call. This technology is also trained to detect your hair, hands, and arms, making a call with background blur just as relaxed and easy as a regular video call.
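
Skype's production feature relies on a trained deep network for person segmentation, but the final compositing step can be sketched simply: given a (predicted) person mask, keep those pixels and substitute blurred ones everywhere else. A toy pure-Python version on a fake 8x8 frame:

```python
def box_blur(img, k=1):
    """Crude box blur: average each pixel over a (2k+1) x (2k+1) window."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            win = [img[yy][xx]
                   for yy in range(max(0, y - k), min(h, y + k + 1))
                   for xx in range(max(0, x - k), min(w, x + k + 1))]
            out[y][x] = sum(win) / len(win)
    return out

def composite(img, mask, k=1):
    """Keep pixels where the person mask is 1; use blurred pixels elsewhere."""
    blurred = box_blur(img, k)
    return [[img[y][x] if mask[y][x] else blurred[y][x]
             for x in range(len(img[0]))] for y in range(len(img))]

img = [[float(8 * y + x) for x in range(8)] for y in range(8)]          # fake frame
mask = [[1 if 2 <= y < 6 and 2 <= x < 6 else 0 for x in range(8)]       # stand-in for
        for y in range(8)]                                              # model output
result = composite(img, mask)
```

The hard part in the real product is producing `mask` accurately (including hair, hands, and arms); the compositing itself is cheap.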

Background blur is available on most desktops and laptops with the latest version of Skype. For more questions about background blur in Skype, read our support article. We also love to hear from you on the Skype Community, where millions of Skype users have registered to share their expertise, feedback, and Skype stories.

*We do our best to make sure that your background is always blurred, but we cannot guarantee that your background will always be blurred.


Azure Data Explorer, now available, can query 1 billion records in under a second

As Julia White mentioned in her blog today, we’re pleased to announce the general availability of Azure Data Lake Storage Gen2 and Azure Data Explorer. We also announced the preview of Azure Data Factory Mapping Data Flow. With these updates, Azure continues to be the best cloud for analytics with unmatched price-performance and security. In this blog post we’ll take a closer look at the technical capabilities of these new features.

Azure Data Lake Storage – The no compromise Data Lake

Azure Data Lake Storage (ADLS) combines the scalability, cost effectiveness, security model, and rich capabilities of Azure Blob Storage with a high-performance file system that is built for analytics and is compatible with the Hadoop Distributed File System. Customers no longer have to trade off between cost effectiveness and performance when choosing a cloud data lake.

One of our key priorities was to ensure that ADLS is compatible with the Apache ecosystem. We accomplished this by developing the Azure Blob File System (ABFS) driver. The ABFS driver is officially part of Apache Hadoop and Spark and is incorporated in many commercial distributions. The ABFS driver defines a URI scheme that allows files and folders to be distinctly addressed in the following manner:


abfs[s]://file_system@account_name.dfs.core.windows.net/<path>/<path>/<filename>

It is important to note that the file system semantics are implemented server-side. This approach eliminates the need for a complex client-side driver and ensures high fidelity file system transactions.
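
For concreteness, here is a small sketch that assembles a URI under the scheme above; the account name, file system, and path are hypothetical:

```python
def abfs_uri(file_system, account, path, secure=True):
    """Build an ABFS URI of the form
    abfs[s]://file_system@account_name.dfs.core.windows.net/<path>."""
    scheme = "abfss" if secure else "abfs"
    return f"{scheme}://{file_system}@{account}.dfs.core.windows.net/{path.lstrip('/')}"

# Hypothetical file system, storage account, and path:
uri = abfs_uri("rawdata", "contosolake", "2019/01/events.csv")
print(uri)  # abfss://rawdata@contosolake.dfs.core.windows.net/2019/01/events.csv
```

The `abfss` variant requests a TLS-protected connection, which is what analytics jobs should use in practice.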

To further boost analytics performance, we implemented a hierarchical namespace (HNS) which supports atomic file and folder operations. This is important because it reduces the overhead associated with processing big data on blob storage. This speeds up job execution and lowers cost because fewer compute operations are required.

The ABFS driver and HNS significantly improve ADLS’ performance, removing scale and performance bottlenecks.  This performance enhancement is now available at the same low cost as Azure Blob Storage.

ADLS offers the same powerful data security capabilities built into Azure Blob Storage, such as:

  • Encryption of data at rest, and of data in transit via TLS 1.2
  • Storage account firewalls
  • Virtual network integration
  • Role-based access security

In addition, ADLS’ file system provides support for POSIX compliant access control lists (ACLs). With this approach, you can provide granular security protection that restricts access to only authorized users, groups, or service principals and provides file and object data protection.
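
ADLS ACLs follow the POSIX access-control-list model. As an illustration of the entry format, here is a small parser for ACL strings in the short form used by POSIX tooling; the qualifier 'alice' stands in for the Azure AD object ID a real ADLS ACL entry would carry:

```python
def parse_acl(acl):
    """Parse a POSIX-style ACL string such as
    'user::rwx,user:alice:r-x,group::r--,other::---' into entry dicts.
    An empty qualifier means the entry applies to the owning user/group."""
    entries = []
    for part in acl.split(","):
        scope, qualifier, perms = part.split(":")
        entries.append({"scope": scope, "qualifier": qualifier or None, "perms": perms})
    return entries

acl = "user::rwx,user:alice:r-x,group::r--,other::---"
entries = parse_acl(acl)
for e in entries:
    print(e)
```

The named-user entry (`user:alice:r-x`) is what lets ADLS grant a specific principal read and execute access to a folder without widening the group or other permissions.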

Azure Data Lake Storage diagram

ADLS is tightly integrated with Azure Databricks, Azure HDInsight, Azure Data Factory, Azure SQL Data Warehouse, and Power BI, enabling an end-to-end analytics workflow that delivers powerful business insights throughout all levels of your organization. Furthermore, ADLS is supported by a global network of big data analytics ISVs and system integrators, including Cloudera and Hortonworks.


Azure Data Explorer – The fast and highly scalable data analytics service

Azure Data Explorer (ADX) is a fast, fully managed data analytics service for real-time analysis on large volumes of streaming data. ADX is capable of querying 1 billion records in under a second with no modification of the data or metadata required. ADX also includes native connectors to Azure Data Lake Storage, Azure SQL Data Warehouse, and Power BI and comes with an intuitive query language so that customers can get insights in minutes.

Designed for speed and simplicity, ADX is architected with two distinct services that work in tandem: the Engine service and the Data Management (DM) service. Both are deployed as clusters of compute nodes (virtual machines) in Azure.

Azure Data Explorer diagram

The Data Management (DM) service ingests various types of raw data and manages failure, backpressure, and data grooming tasks when necessary. The DM service also enables fast data ingestion through a unique method of automatic indexing and compression.

The Engine service is responsible for processing the incoming raw data and serving user queries. It uses a combination of auto scaling and data sharding to achieve speed and scale. The read-only query language is designed to make the syntax easy to read, author, and automate. The language provides a natural progression from one-line queries to complex data processing scripts for efficient query execution.
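
The query language referred to here is the Kusto Query Language (KQL), which reads as a left-to-right pipeline of operators. A sketch of the progression from one-liner to multi-stage query, using the well-known StormEvents sample table (treat the table and column names as illustrative):

```python
# A one-line KQL query: filter, then count.
one_liner = "StormEvents | where State == 'TEXAS' | count"

# A longer pipeline: restrict to the last 30 days, count events per state
# per day, and keep only the busiest ten rows.
pipeline = """
StormEvents
| where StartTime > ago(30d)
| summarize events = count() by State, bin(StartTime, 1d)
| order by events desc
| take 10
"""

# With the azure-kusto-data package, such strings are submitted through a
# KustoClient, e.g. client.execute("SampleDB", pipeline).
print(one_liner)
```

Because every stage consumes the previous stage's output, queries can be read top to bottom, which is what makes the language easy to author and automate.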

ADX is available in 41 Azure regions and is supported by a growing ecosystem of partners, including ISVs and system integrators.


Azure Data Factory Mapping Data Flow – Visual, zero-code experience for data transformation

Azure Data Factory (ADF) is a hybrid cloud-based data integration service for orchestrating and automating data movement and transformation. ADF provides over 80 built-in connectors to structured, semi-structured, and unstructured data sources.

With Mapping Data Flow in ADF, customers can visually design, build, and manage data transformation processes without learning Spark or having a deep understanding of their distributed infrastructure.

Azure Data Factory Mapping Data Flow

Mapping Data Flow combines a rich expression language with an interactive debugger to easily execute, trigger, and monitor ETL jobs and data integration processes.

Azure Data Factory is available in 21 regions and expanding, and is supported by a broad ecosystem of partners, including ISVs and system integrators.


Azure is the best place for data analytics

With these technical innovations announced today, Azure continues to be the best cloud for analytics. Learn more why analytics in Azure is simply unmatched.


Solving a common corporate conundrum: Making sense of all that data

Most companies today are collecting enormous amounts of data, and chances are they know that data contains crucial insights about everything from what customers want to purchase at 10 p.m. on Friday versus 7 a.m. on Wednesday to how they could run their businesses more efficiently any day of the week. What’s more, that data is all available in real time.

But too often companies can’t hear those signals, or they hear them too late.

“It would be an understatement to say we’re able to see just the tip of that iceberg. It’s more like we are analyzing a single ice cube out of that iceberg,” said Daniel Yu, director of product marketing, Azure data and artificial intelligence at Microsoft.

Part of the problem is that there’s so much data, and it’s so difficult to understand. Much of this valuable information is in what’s called unstructured or semi-structured data — generated from customer interactions on the web, software as a service apps, social media, mobile apps or IoT devices such as connected refrigerators and intelligent assistants. It is then stored in the cloud, where many of the tools for analyzing it are still maturing.

Yu said that’s left companies feeling they have two choices: They can either have powerful systems that do really sophisticated analysis but require them to know exactly what their needs are upfront, or they can opt for more flexible systems that don’t offer as many options for sophisticated analysis and are more time-consuming to manage.

“We think that’s a false choice,” Yu said. “You can have both power and flexibility in analytics, at a reasonable cost.”

Two women sit at a desk, chatting with a man standing next to them
From left, Lidia Rozhentsova, Sofia Iasonidou and Niklas Arbin of BookBeat discuss data analytics tools they have used to make sense of the overwhelming amounts of raw data they gather every day. Photo by Alexander Donka for Microsoft.

On Thursday, Microsoft announced that its customer offerings are getting an upgrade, with the general availability of Azure Data Explorer (ADX) and Azure Data Lake Storage (ADLS).

Microsoft says its Azure-based cloud analytics platform delivers the industry’s best price-performance ratio, a standard that measures the speed of a system against its hourly cost. According to independent testing by GigaOM, analytics with Azure SQL Data Warehouse is up to 14 times faster and costs 94 percent less than other cloud analytics offerings.

Microsoft said ADX can query 1 billion records of streaming data in under a second, using a simple query language, while leaving the data and its metadata in their original state. ADLS provides a repository for storing massive amounts of structured or unstructured data, with the efficiency and security features of Azure Blob Storage. This combination is optimized for analytics.

That’s exactly the kind of offering the company says everyone from small startups to big established businesses need.

A good example is BookBeat, a European-based streaming audiobook service that runs its business on the Azure platform. The company uses data analytics to serve customers crisp recommendations based on their own reading history and those of customers with similar interests. It also relies on data to launch new business models, like a shared family account that it rolled out after its data predicted, correctly, that it would succeed.

“All our teams are data-driven,” said Niklas Arbin, head of developers at BookBeat. “We use it for everything.”

Arbin said that, with Azure managing its server infrastructure, BookBeat’s technical specialists are free to work on other essential tasks. That includes mining the data to deliver real insight and value and building internal tools for managing the vast streams of books it offers.

“Azure enables us to use the highest standards in application development, which has been very hard to do in any business intelligence toolset,” Arbin said. “It gives our developers freedom to choose the best tool for the job.”

A desktop with a collection of books neatly stacked against the cubicle wall
Many of the desks at BookBeat’s offices in Stockholm, Sweden, are loaded with analog books as well. Photo by Alexander Donka for Microsoft.

The company doesn’t even have a traditional IT department.

“We don’t have to worry about things like uptime for servers,” Arbin said. “We don’t like working in the middle of the night.”

To John Chirapurath, general manager of Azure data, Blockchain and AI at Microsoft, that sounds like success. He said Microsoft’s goal is to remove complexity for customers wherever possible, from ingesting data to presenting it.

“We always strive to make it very easy for IT staff to adopt analytics and for line of business people to utilize and deliver powerful insights using beautiful products,” Chirapurath said.

Microsoft says another selling point for customers is the company’s long history of securing its Azure cloud, which includes helping customers conform to privacy standards and regulations such as the General Data Protection Regulation, or GDPR.

To Yu, the ongoing advances in cloud analytics technology, and the relentless flow of useful data, are bringing customers to a tipping point. For many, he said, it no longer makes sense to host all that data on premises and devote so many resources to managing that infrastructure, when far more interesting analytics can be done in the cloud.

“Data and analytics are changing everything for businesses,” Yu said. “There’s not a single company that isn’t thinking about this.”



Microsoft for Healthcare: technology and collaboration for better experiences, insights and care

The healthcare industry’s leading minds are getting ready to educate, intrigue, and inspire attendees next week at the HIMSS19 conference—a leading healthcare IT event in the US. We expect to see many innovative ideas and solutions to the most prevalent and persistent challenges in modern health, and we are excited to show new technologies making a real difference in people’s lives and demonstrate Microsoft’s commitment to transforming how healthcare is experienced and delivered.

Over the last few years, we have been learning alongside industry experts and making steady progress in helping health organizations navigate complex technology transformations. We have been so pleased by the enthusiastic response of the providers, payors, software developers, device manufacturers and pharmaceutical companies we’ve been working with.

But what drives us most is the profound impact on people. As we all look for more personalized and transparent approaches for healthcare services, technology transformation will help providers deliver modern patient experiences that promote patient engagement, satisfaction, and well-being while increasing the chances of more successful treatment.

This year at HIMSS, we will talk about how Microsoft’s technology and partnerships are helping empower care teams, improve clinical and operational outcomes and advance precision healthcare, with a specific focus on putting people’s privacy at the center. To kick things off, today we’re announcing several new innovations supporting the industry’s transformation:

  • Microsoft 365 for health organizations: New capabilities in Microsoft Teams that enable healthcare teams to communicate and collaborate in a secure hub for teamwork, and ultimately improve patient care.
  • Microsoft Healthcare Bot: Now generally available, this service helps organizations create AI-powered, compliant virtual assistants and chatbots for a variety of healthcare experiences.
  • Azure API for FHIR®: A new tool to help health systems interoperate and share data in the cloud.

Empowering health organizations with secure messaging and AI-powered tools

People are at the heart of healthcare – physicians, nurses, clinicians and of course, their patients. We are committed to empowering care teams with the tools they need to deliver their best care as well as empowering people as they interact with various aspects of the healthcare system.

When it comes to secure communications, many clinicians report having to choose between convenience and compliance. Adhering to compliance has often meant having to wait for critical information at the point of care. Conversely, many clinicians have turned to consumer messaging apps that facilitate communication but can compromise security.

Microsoft is working hard to ensure convenience and compliance are no longer a zero-sum equation. Today, we are announcing new capabilities in Microsoft Teams, a secure hub for teamwork that enables secure messaging and collaboration workflows that tap the wealth of patient information housed in electronic medical records.

Enable secure workflows in Microsoft Teams: The new priority notifications feature in Teams alerts the recipient of an urgent message on their mobile and desktop devices every two minutes for up to 20 minutes, until a response is received; message delegation enables clinical staff members to delegate their messages to another recipient when they are in surgery or otherwise unavailable. We are also announcing the ability to integrate FHIR-enabled electronic health records (EHR) data with Teams. The ability to view EHR data is enabled through partnerships with leading interoperability providers, including Dapasoft, Datica, Infor Cloverleaf, Kno2 and Redox. Clinical or hospital staff can securely access patient records in the same app where they take notes, message with other team members, and start a video meeting, all in a single place to coordinate care.
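
The cadence described for priority notifications is easy to state precisely: re-alert every two minutes until a response arrives, for at most 20 minutes. A toy model of that schedule:

```python
def priority_notification_times(interval_min=2, window_min=20):
    """Minute offsets at which an unanswered urgent-message alert re-fires:
    every `interval_min` minutes for `window_min` minutes. A toy model of
    the behaviour described for Teams priority notifications."""
    return list(range(0, window_min, interval_min))

schedule = priority_notification_times()
print(schedule)  # first alert at minute 0, last at minute 18: 10 alerts in all
```

In the real service the loop stops early the moment the recipient (or their delegate) responds.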

For health organizations looking to optimize operational processes or create new experiences for their people and patients, we are also announcing the Microsoft Healthcare Bot general availability.

Microsoft Healthcare Bot: The Microsoft Healthcare Bot service is now generally available after first being introduced as a research project in 2017. It is designed to empower healthcare organizations to build and deploy compliant, AI-powered virtual health assistants and chatbots, and includes important features like healthcare intelligence, medical content and terminology, and a built-in symptom checker. The Microsoft Healthcare Bot service is fully extensible to help organizations adjust the bot to solve their own business problems, and can connect to health systems, like EHRs. In addition to partners like Premera, today we are announcing bots available, or available soon, from Quest Diagnostics, Children’s Healthcare of Atlanta and Clalit Health Services.

Securely connecting data for better clinical and operational outcomes

Our bodies are a lot like complex computers, and each interaction with today’s health system creates a new data point. These data points are often spread across multiple records, with valuable insights hidden in silos. Microsoft is committed to addressing this by developing technology that connects data and surfaces important insights at exactly the right time, with privacy and security at the core.

A better-connected healthcare system would provide clinicians with more complete profiles of their patients, researchers with more complete data to study, and individuals with more information to take ownership over their health. I hear this often from leading experts in the research and care delivery communities.

With this in mind, today we’re announcing the Azure API for FHIR, a tool to help health organizations better connect systems and harness the power of data in the cloud.

Azure API for FHIR: The Azure API for FHIR will provide a method for health systems and data to ‘talk’ – what is known as interoperability – so for example, health records can connect to collaboration tools, pharmacy systems, fitness devices and others far more seamlessly. Data and insights from this more connected system can then be served up when and where they’re needed most.

API (application programming interface) is a term for technology that links software programs together. Much as a travel adapter lets your electronics plug into outlets in foreign countries, an API lets one software system connect to another. Though technical, this functionality matters to everyone who interacts with today’s healthcare systems, because interoperability is a foundational health technology need.

The Azure API for FHIR is available in public preview, and we have more than 25 technology partners in our early access program that can help health organizations build FHIR-enabled services today.
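To make the interoperability idea concrete, here is a minimal sketch of what FHIR data looks like in practice. The resource shapes (Bundle, Patient, name, birthDate) follow the public FHIR specification; the sample data and the helper function are invented for illustration and are not specific to the Azure API for FHIR:

```python
import json

# A minimal FHIR "searchset" Bundle, similar to what a FHIR server
# might return for a patient search. The structure follows the FHIR
# spec; the patient data itself is made up for this example.
bundle_json = """
{
  "resourceType": "Bundle",
  "type": "searchset",
  "total": 1,
  "entry": [
    {
      "resource": {
        "resourceType": "Patient",
        "id": "example-1",
        "name": [{"family": "Smith", "given": ["Anna"]}],
        "birthDate": "1980-04-12"
      }
    }
  ]
}
"""

def patient_display_names(bundle: dict) -> list:
    """Extract human-readable names from Patient resources in a Bundle."""
    names = []
    for entry in bundle.get("entry", []):
        resource = entry.get("resource", {})
        if resource.get("resourceType") != "Patient":
            continue  # a Bundle may mix resource types; keep Patients only
        for name in resource.get("name", []):
            given = " ".join(name.get("given", []))
            names.append(f"{given} {name.get('family', '')}".strip())
    return names

bundle = json.loads(bundle_json)
print(patient_display_names(bundle))  # prints ['Anna Smith']
```

Because every FHIR-enabled system exchanges resources in this shared shape, a collaboration tool, pharmacy system, or fitness device can consume the same record without custom translation logic for each pairing.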

Advancing precision healthcare

Some of the most exciting breakthroughs at the intersection of science and technology are in precision healthcare. We all stand to gain from a health system that can precisely care for us based on our unique biology, environments and ailments. Cloud and advanced AI are the key tools that will help achieve that future.

To advance precision care, Microsoft continues to invest in a series of services and computational biology projects, including research support tools for next-generation precision healthcare, genomics, immunomics, CRISPR and cellular and molecular biologics.

For example, Microsoft Genomics, which provides accelerated sequencing and secondary analysis, enables research insights for organizations like St. Jude Children’s Research Hospital with the St. Jude Cloud, the world’s largest public repository of pediatric cancer genomics data.

Earlier this year, we published an update on our partnership with Adaptive Biotechnologies, announcing we’ve opened up our joint research to immunosequence 25,000 individuals, targeting ovarian cancer, pancreatic cancer, celiac disease, type 1 diabetes and Lyme disease.

Work also continues on several Microsoft Research projects, including intelligent scribe Project EmpowerMD, medical imaging Project InnerEye, machine reading Project Hanover and metagenomics Project Premonition. These projects are pushing the boundaries of how technology can be applied in healthcare and we are excited to see how they might be used by health organizations in the future.

Working with the experts

Improving healthcare is not a singular or silver bullet effort. Microsoft’s ambition is not to be a healthcare provider, but to enable and empower those who are doing good things for people around the world. We see strategic alliances with leaders like Walgreens Boots Alliance, Allscripts, Hill-Rom, Novarad and others leading the way, with support from our thousands of technology partners. Here are a few examples:

  • Walgreens Boots Alliance: Walgreens Boots Alliance (WBA) and Microsoft announced a strategic partnership aimed at transforming health care delivery. Our companies will combine the power of Microsoft’s cloud and AI technologies, health care investments, and retail solutions with WBA’s customer reach, convenient locations, outpatient health care services, and industry expertise with the goal of making health care delivery more personal, affordable and accessible for people around the world.
  • Veradigm: Veradigm, an Allscripts company, and Microsoft announced a collaboration focused on implementing an innovative, integrated model for clinical research, aiming to enhance clinical research design, conduct studies more efficiently and improve the research provider and participant experience.
  • Hill-Rom: Hill-Rom and Microsoft announced a collaboration to bring advanced, actionable point-of-care data and solutions to caregivers and healthcare provider organizations. Our collaboration will combine Hill-Rom’s deep clinical knowledge and streaming operational data from medical devices with Microsoft’s cloud, IoT and AI technologies to help drive enhanced patient outcomes.
  • Novarad: Novarad, a healthcare enterprise imaging company, recently obtained 510(k) clearance from the FDA for the OpenSight Augmented Reality System for Microsoft HoloLens. OpenSight received pre-operative clearance for augmented reality usage in surgical planning, giving physicians access to a new solution that can improve surgical procedures by enhancing accuracy and shortening operative times.
  • ThoughtWire: ThoughtWire is helping save lives with its EarlyWarning application, designed to preempt and prevent cardiac arrest in hospitalized patients. This solution has already reduced code blue calls, which signal a risk of cardiac arrest, by 61 percent at Hamilton Health Sciences, a medical group of seven hospitals and a cancer center. ThoughtWire will deliver the EarlyWarning app, running on Microsoft Azure, to health systems at scale.
  • Innovaccer: Innovaccer is a healthcare data activation platform company working towards solving data interoperability challenges in healthcare and helping health systems enhance their clinical and financial outcomes with a data-first approach. Innovaccer is a portfolio company of M12, Microsoft’s venture fund.

The future is bright – a more connected future to deliver better experiences, insights and care. We are looking forward to meeting many of you next week at HIMSS19 and sharing more about what we are working on. Please be sure to stop by our booth No. 2500 to see our solutions in action, and follow our HIMSS19 story on @Health_IT to learn more.


New bot service helps organizations develop and deploy virtual health assistants

Every year, tens of millions of adults in the U.S. are asked to contact Quest Diagnostics for healthcare-related services that range from routine blood work to complex genetic and molecular testing. In today’s increasingly self-service healthcare industry, details such as where to go when and what to do beforehand are typically up to patients to figure out for themselves.

“They are really learning how to drive their healthcare experience and they have a lot of questions,” said Jason O’Meara, senior director of architecture for Quest Diagnostics in Cary, North Carolina. “To find answers to their questions,” he added, “many people don’t want to browse websites anymore if they can get to their answer more directly using a bot.”

Quest Diagnostics recently built and deployed a bot using a preview of the Microsoft Healthcare Bot service that helps people who visit the Quest Diagnostics website during call center hours find testing locations, schedule appointments and get answers to non-medical questions such as whether to fast before a blood draw or when to expect results. If the bot is unable to answer a question, or the user gets frustrated, the bot will transfer the user, along with the context of the conversation, to a person who can help – all without having the user pick up the phone.
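The handoff pattern described above can be sketched in a few lines. This is a generic illustration only, not the Quest Diagnostics bot or the Microsoft Healthcare Bot service's prebuilt handoff feature; the class, method names, and FAQ answers are all invented for the example:

```python
# Generic bot-to-human handoff sketch. All names and answers here are
# invented for illustration; a production bot would use a richer
# language model and a real agent queue.

class SupportBot:
    def __init__(self):
        self.transcript = []      # conversation context to hand over
        self.failed_answers = 0   # simple frustration signal

    def answer(self, question: str) -> str:
        self.transcript.append(("user", question))
        faq = {
            "do i need to fast": "Yes, fast for 8-12 hours before a blood draw.",
            "when will my results arrive": "Most results are ready within 2 business days.",
        }
        reply = faq.get(question.lower().rstrip("?"))
        if reply is None:
            self.failed_answers += 1
            if self.failed_answers >= 2:
                return self.hand_off()
            reply = "Sorry, I don't know that one. Could you rephrase?"
        self.transcript.append(("bot", reply))
        return reply

    def hand_off(self) -> str:
        # A real system would route the transcript to a live agent's
        # queue so the user never has to repeat themselves or pick up
        # the phone.
        return (f"Connecting you to an agent "
                f"(sharing {len(self.transcript)} messages of context).")

bot = SupportBot()
print(bot.answer("Do I need to fast?"))
print(bot.answer("Is my insurance accepted here?"))
print(bot.answer("Can you check my coverage?"))  # second miss triggers handoff
```

The key design point is that the transcript travels with the handoff, so the human agent picks up mid-conversation rather than starting over.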

Microsoft announced Thursday that the Microsoft Healthcare Bot service is now generally available in the Azure Marketplace. The cloud service includes out-of-the-box healthcare intelligence such as the ability to triage complex medical questions and a set of prebuilt services including the handoff feature and a symptom checker. Customers can extend and customize the bot to solve their unique business problems. Built-in privacy controls include the ability for bots to learn and adapt to user preferences and for users to ask bots what they know about them and to ask to be forgotten.

“You don’t have to start from scratch,” said Hadas Bitran, head of Microsoft Healthcare Israel. “It has healthcare content knowledge such as a symptom checker and information about conditions, medications and procedures. It has language models trained to understand healthcare terminology. It understands if you are complaining or if you are asking about what doctor you should see or if you are thinking about side effects of a medication.”

Virtual assistant for healthcare

Bitran, who worked on Microsoft’s virtual assistant Cortana before joining the health group, and her team launched the Healthcare Bot service as a research project in 2017 to determine the feasibility of a toolbox that would let healthcare organizations quickly and efficiently build virtual assistants tuned to their brands, along with the workflows and terminology unique to the healthcare industry.

“We were asking ourselves, ‘What are the biggest pain points of healthcare customers? How can we best help self-serve healthcare users? What would be the use cases that would be most interesting for customers,’” Bitran said.

Premera Blue Cross, a customer who used the service during the private preview stage of the project, built and deployed a bot, Premera Scout, to help consumers easily look up the status of claims and find answers to questions about benefits and services available from the health insurance provider.

“People didn’t need to call the call center and wait on the line anymore,” Bitran said. In turn, she added, customer-service employees at Premera Blue Cross now have more time to focus on complicated requests.

Building compliant health assistants

The Microsoft research and development team also knew that any bot service for the healthcare industry would need to leverage a secure cloud platform with built-in privacy controls and tools to support the user’s compliance with regulations such as the Health Insurance Portability and Accountability Act, known as HIPAA, and the General Data Protection Regulation, or GDPR.

The compliance support helps the healthcare industry keep pace with a larger trend of companies deploying conversational AI as a go-to interface for consumers to seek and find information. Quest Diagnostics, for example, found in a user-experience survey that about 50 percent of their clients would prefer to engage with a chatbot instead of a search box or frequently-asked-questions feature on a website, said O’Meara.

The Microsoft Healthcare Bot service enables organizations in the healthcare industry to meet the demand for bots that provide timely information, freeing up medical professionals to treat and care for their patients, noted Bitran.

“Virtual assistants will never replace medical professionals,” she said, adding that bots built with the Microsoft Healthcare Bot service never make a diagnosis or offer treatment. “That is not what they are for. Rather, virtual assistants help ease the burden from the healthcare system, helping medical professionals optimize their time.”

Related:

John Roach writes about Microsoft research and innovation. Follow him on Twitter.


Microsoft Translator now certified compliant to meet your needs

The Microsoft Translator team is happy to announce that the service is now certified for ISO, HIPAA, and SOC compliance. This comes as a result of Azure’s commitment to privacy and security.

Last year, Translator announced that it was GDPR compliant as a data processor. Now, Microsoft Translator is ISO, HIPAA, and SOC compliant, in addition to receiving CSA and FedRAMP public cloud attestation.

ISO: Microsoft Translator is ISO certified with five certifications applicable to the service. The International Organization for Standardization (ISO) is an independent nongovernmental organization and the world’s largest developer of voluntary international standards. Translator’s ISO certifications demonstrate its commitment to providing a consistent and secure service. Microsoft Translator’s ISO certifications are:

  • ISO 27001 Information Security Management Standards
  • ISO 9001:2015 Quality Management Systems Standards
  • ISO 27018:2014 Code of Practice for Protecting Personal Data in the Cloud
  • ISO 20000-1:2011 Information Technology Service Management
  • ISO 27017:2015 Code of Practice for Information Security Controls

HIPAA: The Microsoft Translator service complies with the US Health Insurance Portability and Accountability Act (HIPAA) and the Health Information Technology for Economic and Clinical Health (HITECH) Act, which govern how cloud services can handle personal health information. This means health services can provide translations to clients knowing that personal data is kept private. Microsoft Translator is included in Microsoft’s HIPAA Business Associate Agreement (BAA). Health care organizations can enter into the BAA with Microsoft to detail each party’s role in regard to the security and privacy provisions under HIPAA and HITECH.

Learn more about HIPAA compliance

 

SOC: The American Institute of Certified Public Accountants (AICPA) developed the Service Organization Controls (SOC) framework, a standard for controls that safeguard the confidentiality and privacy of information stored and processed in the cloud, primarily in regard to financial statements. Microsoft Translator is now SOC type 1, 2, and 3 compliant.

Learn more about SOC Compliance

 

CSA STAR: The Cloud Security Alliance (CSA) defines best practices to help ensure a more secure cloud computing environment and to help potential cloud customers make informed decisions when transitioning their IT operations to the cloud. The CSA published a suite of tools to assess cloud IT operations: the CSA Governance, Risk Management, and Compliance (GRC) Stack. It was designed to help cloud customers assess how cloud service providers follow industry best practices and standards, and comply with regulations. Microsoft Translator has received CSA STAR Attestation.

Learn more about CSA STAR

 

FedRAMP: The US Federal Risk and Authorization Management Program (FedRAMP) attests that Microsoft Translator adheres to the security requirements needed for use by US government agencies in the public Azure cloud. The US Office of Management and Budget requires all executive federal agencies to use FedRAMP to validate the security of cloud services. FedRAMP attestation for Microsoft Translator in the dedicated Azure Government cloud is forthcoming.

Learn more about FedRAMP

The Microsoft Translator service undergoes annual audits of all of its certifications to ensure it remains compliant. View more information about Microsoft’s commitment to compliance in the Microsoft Trust Center.