
Coming soon: Microsoft System Center 2019 for managing Windows Server and data centers

This blog post was authored by Vithalprasad Gaitonde, Principal PM Manager, System Center.

As customers grow their deployments in the public cloud and in on-premises data centers, management tools are evolving to meet their needs. The System Center suite continues to play an important role in managing the on-premises data center and the IT needs that evolve with the adoption of the public cloud.

Today, I am excited to announce that Microsoft System Center 2019 will be generally available in March 2019. System Center 2019 enables deployment and management of Windows Server 2019 at a larger scale to meet your data center needs.

System Center 2019 has been in private preview with Windows Server Technical Adoption Program (TAP) customers since December 2018. A big thank you to everyone who has given us feedback so far.

I would like to take a moment and give you an overview about the new release. System Center 2019 has the following areas of focus:

  • First-class tools to monitor and manage data centers
  • Support for and management of capabilities in the latest versions of Windows Server
  • Enable hybrid management and monitoring capabilities with Azure

System Center 2019 is our LTSC (Long Term Servicing Channel) release and provides the 5 years of standard and 5 years of extended support that customers can rely on. After System Center 2019 reaches general availability, the suite will continue to accrue value through Update Rollup releases every six months over the 5-year mainstream support window.

System Center 2019 is designed to deliver value in the following areas:

Hybrid

As enterprise environments now span on-premises to the cloud, customers look to leverage the innovation in Azure services using their on-premises tools. To enable this, we have integrated System Center with a set of management services in Azure to augment the on-premises tools.

  • With Service Map integration in System Center Operations Manager (SCOM), you can automatically create distributed application diagrams in Operations Manager that are based on the dynamic dependency maps in Service Map.
  • With the Azure Management Pack, you can now view performance and alert metrics in SCOM, integrate with web application monitoring in Application Insights, and monitor more PaaS services, such as Azure Blob Storage and Azure Data Factory.
  • Virtual Machine Manager (VMM) 2019 enables simplified patching of VMs by integrating with Azure Update Management.

Dashboard for Azure resources in SCOM web console

Security

With security threats growing in number and sophistication, security continues to be a top priority for customers.

  • System Center products now support service logons and remove the dependency on interactive logons, aligning with security best practices.
  • VMM 2019 now includes a new role, VM administrator, which provides just enough permissions for read-only visibility into the fabric of the data center, but prevents escalation of privilege to fabric administration.

VM Administrator role in VMM

Software defined data center

Hyper-converged infrastructure (HCI) is a significant trend in on-premises data centers today. Customers lower costs by running compute and storage on the same servers, backed by high-performance local disks.

  • With VMM 2019, you can manage and monitor HCI deployment more efficiently – from upgrading or patching Storage Spaces Direct clusters without downtime to monitoring the health of disks.
  • VMM 2019 storage optimization enables you to optimize placement of VHDs across cluster shared volumes and prevents VM outages caused when the storage runs full.

Storage Health in VMM

Modernizing operations and monitoring

Customers have come to rely on SCOM for its extensibility and the ecosystem of management packs to monitor Microsoft and third-party workloads.

  • With HTML5 dashboards and drill-down experiences in the SCOM web console, you can now use a simplified layout and extend the monitoring console using custom widgets and the SCOM REST API.
  • Taking modernization a step further, email notifications have been modernized as well, with support for HTML email in SCOM 2019.
  • SCOM 2019 brings a new alerts experience for monitor-based alerts: operators can no longer simply close an alert while its underlying monitor is in an unhealthy state; the alert must be attended to.
  • SCOM has enhanced Linux monitoring by leveraging Fluentd, and is now resilient to management server failovers in your Linux environments.
  • All the SCOM management packs will now support Windows Server 2019 roles and features.

SCOM web console

Faster backups with Data Protection Manager 2019

Data Protection Manager (DPM) 2019 will provide backups optimized in time (faster) and space (consumes less storage).

  • DPM improves backup performance with a 75 percent increase in speed and enables monitoring of key backup parameters via Log Analytics.
  • DPM also supports backup of VMware VMs to tape. In addition to Windows Server 2019, DPM now provides backups for new workloads such as SharePoint 2019 and Exchange 2019.

DPM alerts and reports using Log Analytics

Orchestrator 2019 and Service Manager 2019

Orchestrator 2019 supports PowerShell 4.0 and above, enabling you to run 64-bit cmdlets. Service Manager 2019 will ship with an improved Active Directory (AD) connector that can synchronize with a specific domain controller.

Changes to release cadence

Finally, we are making changes to the System Center release cadence to optimize the way we deliver new features. System Center has two release trains today – LTSC and the Semi-Annual Channel (SAC) – along with incremental Update Rollup (UR) releases.

Most of our customers use the Long Term Servicing Channel (LTSC), for example System Center 2016, to run their data center infrastructure. LTSC provides five years of mainstream support and five years of extended support, with Update Rollups (URs) providing incremental fixes and updates. From talking to customers, we learned that LTSC works better for most System Center deployments, as the update cycles are longer and more stable.

Based on the learnings, we will start to focus our resources on innovation plans for System Center in LTSC releases and stop SAC releases. System Center 2019 will support upgrades from two prior SAC releases so customers running System Center 1801 or System Center 1807 will be able to upgrade to System Center 2019; just as System Center 2016 can be upgraded to System Center 2019.

System Center Configuration Manager (SCCM) is not impacted by the 2019 release change and will continue current branch release cadence of three times per year as noted in the documentation, “Support for Configuration Manager current branch versions.”

Call to action

In March, customers will have access to System Center 2019 through all the channels! We will publish a blog post to mark the availability of System Center 2019 soon. As always, we would love to hear what capabilities and enhancements you’d like to see in our future releases. Please share your suggestions, and vote on submitted ideas, through our UserVoice channels.

Frequently asked questions

Q: When will I be able to download System Center 2019?

A: System Center 2019 will be generally available in March 2019. We will update this blog when the build is available for download through the Volume Licensing Service Center (VLSC).

Q: Is there any change in pricing for System Center 2019?

A: No.

Q: Will there be a new Semi-Annual Channel release along with System Center 2019?

A: No. There will not be Semi-Annual Channel releases, but new features before the next Long-Term Servicing Channel (LTSC) release will be delivered through Update Rollups.


Microsoft and Intel partner to speed deep learning workloads on Azure

This post is co-authored with Ravi Panchumarthy and Mattson Thieme from Intel.

We are happy to announce that Microsoft and Intel are partnering to bring optimized deep learning frameworks to Azure. These optimizations are available in a new offering on the Azure marketplace called the Intel Optimized Data Science VM for Linux (Ubuntu).

Over the last few years, deep learning has become the state of the art for several machine learning and cognitive applications. Deep learning is a machine learning technique that leverages neural networks with multiple layers of non-linear transformations, so that the system can learn from data and build accurate models for a wide range of machine learning problems. Computer vision, language understanding, and speech recognition are all examples of deep learning at play today. Innovations in deep neural networks in these domains have enabled these algorithms to reach human level performance in vision, speech recognition and machine translation. Advances in this field continually excite data scientists, organizations and media outlets alike. To many organizations and data scientists, doing deep learning well at scale poses challenges due to technical limitations.

Often, default builds of popular deep learning frameworks like TensorFlow are not fully optimized for training and inference on CPU. In response, Intel has open-sourced framework optimizations for Intel® Xeon processors. Now, through partnering with Microsoft, Intel is helping you accelerate your own deep learning workloads on Microsoft Azure with this new marketplace offering.

“Microsoft is always looking at ways in which our customers can get the best performance for a wide range of machine learning scenarios on Azure. We are happy to partner with Intel to combine the toolsets from both the companies and offer them in a convenient pre-integrated package on the Azure marketplace for our users” 

– Venky Veeraraghavan, Partner Group Program manager, ML platform team, Microsoft.

Accelerating Deep Learning Workloads on Azure

Built on top of the popular Data Science Virtual Machine (DSVM), this offer adds new Python environments that contain Intel’s optimized versions of TensorFlow and MXNet. These optimizations leverage the Intel® Advanced Vector Extensions 512 (Intel® AVX-512) and the Intel® Math Kernel Library for Deep Neural Networks (Intel® MKL-DNN) to accelerate training and inference on Intel® Xeon® processors. When running on an Azure F72s_v2 VM instance, these optimizations yielded an average 7.7X speedup in training throughput across all standard CNN topologies. You can find more details on the optimization practice here.

To a data scientist or AI developer, this change is quite transparent: you still code with the standard TensorFlow or MXNet frameworks. Use the new set of Python (conda) environments (intel_tensorflow_p36, intel_mxnet_p36) on the DSVM to run your code and take full advantage of all the optimizations on an Intel® Xeon processor-based F-series or H-series VM instance on Azure. Since this product uses the DSVM as its base image, all the rich tools for data science and machine learning are still available to you. Once you develop your code and train your models, you can deploy them for inferencing in either the cloud or at the edge.

“Intel and Microsoft are committed to democratizing artificial intelligence by making it easy for developers and data scientists to take advantage of Intel hardware and software optimizations on Azure for machine learning applications. The Intel Optimized Data Science Virtual Machine (DSVM) provides up to a 7.7X speedup on existing frameworks without code modifications, benefiting Microsoft and Intel customers worldwide”

– Binay Ackalloor, Director Business Development, AI Products Group, Intel.

Performance

In Intel’s benchmark tests, run on an Azure F72s_v2 instance, here are the results comparing the optimized version of TensorFlow with the standard TensorFlow builds.

Bar graph of Default TensorFlow vs. Intel Optimization for TensorFlow

Figure 1: Intel® Optimization for TensorFlow provides an average of 7.7X increase (average indicated by the red line) in training throughput on major CNN topologies. Run your own benchmarks using tf_cnn_benchmarks. Performance results are based on Intel testing as of 01/15/2019. Find the complete testing configuration here.
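To make the "average of 7.7X" concrete: one common way to compute such a figure is to average the per-topology throughput ratios (optimized over baseline). A minimal sketch follows; the throughput numbers below are made up for illustration, not Intel's measurements:

```python
def average_speedup(optimized, baseline):
    """Average the per-topology throughput ratios (optimized / baseline)."""
    ratios = [opt / base for opt, base in zip(optimized, baseline)]
    return sum(ratios) / len(ratios)

# Hypothetical throughputs (images/sec) for three CNN topologies --
# illustrative numbers only, not Intel's measurements.
baseline_tput = [10.0, 20.0, 5.0]
optimized_tput = [80.0, 150.0, 40.0]
print(round(average_speedup(optimized_tput, baseline_tput), 2))  # → 7.83
```

Run your own benchmarks with tf_cnn_benchmarks to get real per-topology numbers for your hardware.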

Getting Started

To get started with the Intel Optimized DSVM, click on the offer in the Azure Marketplace, then click “GET IT NOW”. Once you answer a few simple questions on the Azure Portal, your VM is created with all the DSVM tool sets and the Intel optimized deep learning frameworks pre-configured and ready to use.

Screenshot of Intel Optimized Data Science VM for Linux in Azure marketplace

The Intel Optimized Data Science VM is an ideal environment to develop and train deep learning models on Intel Xeon processor-based VM instances on Azure. Microsoft and Intel will continue their long partnership to explore additional AI solutions and framework optimizations to other services on Azure like the Azure Machine Learning service and Azure IoT Edge.

Next steps


How AutoPilot helps schools manage shared devices

Video: https://www.youtube.com/watch?v=JrEU84KK2lQ

In my job I chat a lot with both school leaders and IT admins about how they can simplify the management of their devices, making it faster and easier for their end users (usually students, teachers and admin staff) to get started and complete the work they need to do on a device.

Unlike many corporate environments, schools operate a very high number of “shared devices”, where students from different year levels require access to the same device and, in some scenarios, teachers also need to log into the same device with different apps and security settings. In the next two blog posts I’m going to go a bit deeper into how schools can approach this challenge with modern deployment practices: leveraging cloud identity in AzureAD, easier enrollment of devices using Microsoft Autopilot and, finally, a couple of tweaks for a faster user sign-in experience using Microsoft Intune as the Mobile Device Management (MDM) tool.

Overview:

As this blog post will be a little longer (and more technical) I’m going to give you a break down of what is to come so you can skip to the important sections relevant to you:

  • Part One:
    • Identity – why use a cloud identity?
    • Why use AutoPilot?
    • Configuring Autopilot
    • Enrolling your device
  • Part Two:
    • Intune vs Intune for Education
    • What are CSP?
    • Building a custom CSP Policy
    • Using LOB App Deployment in Intune

Now it’s worth stating at this stage that I am not an IT administrator by profession. Whilst I’m probably more technical than many, I’ve got the following working through a combination of the detailed guides in Microsoft Docs and awesome technical colleagues who have shared some of their expertise with me. Additionally, like you, I read a bunch of blogs to see how people have done this in the past. This blog is a small contribution to the community of people who like to learn from others’ experiences. If you’re reading this, are more technical than me and can see improvements or corrections to what I’ve done, I’d love to hear from you in the comments section below.

With that said, let’s get started!

Identity – why use a cloud identity?

It’s amazing how many conversations I’ve been having around cloud identities recently, as school leaders are starting to understand they need to simplify user access to key resources via Single Sign-On (SSO) and open up both cloud/internet solutions as well as traditional on-premises hosted solutions. There are plenty of confusing diagrams out there trying to explain what this is about – the following is the simplest I could find:

AzureAD.png

Essentially, the above is showing two scenarios:

  1. The user may sign into an “on-premises” identity platform (on the left), in this case Active Directory (still incredibly common in schools), which, through a tool called AzureAD Connect, automatically signs them into a cloud identity as well – in this case Azure Active Directory (AzureAD, or AAD).
  2. Alternatively, the user may sign directly into a cloud service (on the right) using their AzureAD credentials. In fact, if their device is managed by the school, it may even be joined to AzureAD only.

Why does this matter? As an example, I was talking with a school recently where teachers were required to use up to four different usernames/passwords to access their key platforms such as signing into a computer, accessing their email, accessing their Student Management System (SMS) and accessing their cloud collaboration suite (Office365 in this case). Simplifying this through a single cloud identity saves time and frustration for everyone! It also improves security as people are more likely to choose a secure password if they only have one to remember.

Additionally, schools are increasingly wanting to sign into third party cloud apps with the same credentials – this blog post I wrote shows a school accessing eight different solutions with just their AzureAD identity.

The key point is: identity matters. If your school does not have a cloud identity of some sort, you’re going to be inherently limited in what you can do.

As the focus of this blog is primarily around AutoPilot, I’m not going to go deep into Identity – some useful background reading I would share is earlier blog posts I’ve written around:

For the purposes of this blog, if you’re wanting to use AutoPilot then your Office365 Tenant must have either the AzureAD P1 or P2 plans – see the differences here. With many schools opting for the M365 A3 Suite, this includes AzureAD P1:

Azure Active Directory Premium P1. In addition to the Free and Basic features, P1 also lets your hybrid users access both on-premises and cloud resources. It also supports advanced administration, such as dynamic groups, self-service group management, Microsoft Identity Manager (an on-premises identity and access management suite) and cloud write-back capabilities, which allow self-service password reset for your on-premises users.

To proceed with AutoPilot you need your users in AzureAD (and licensed with P1 or P2) so if you’ve not got that far, best to stop and sort before continuing on (if you want help with this, check out School Data Sync which can automatically add users from your Student Information System).

Why use AutoPilot?

It’s always a good question to ask, and before answering if you’re brand new to AutoPilot then it’s worth watching the video at the top of this blog post and then getting into the official AutoPilot Documentation here. If you’re coming from an Apple device management world and are familiar with the Device Enrollment Program (DEP) then the concepts of AutoPilot will be very familiar for you.

AutoPilot

Windows Autopilot is a collection of technologies used to set up and pre-configure new devices, getting them ready for productive use. You can also use Windows Autopilot to reset, re-purpose and recover devices.
This solution enables an IT department to achieve the above with little to no infrastructure to manage, with a process that’s easy and simple.

Windows Autopilot is designed to simplify all parts of the life cycle of Windows devices, for both IT and end users, from initial deployment through the eventual end of life. Leveraging cloud-based services, it can reduce the overall costs for deploying, managing, and retiring devices by reducing the amount of time that IT needs to spend on these processes and the amount of infrastructure that they need to maintain, while ensuring ease of use for all types of end users.

Back to why you would use it:

  • Devices become enrolled / locked to your organisation. If a user (authorised or not) resets the Win10 OS back to factory settings, as soon as it connects to the internet again it will register back to your organisation, making it largely useless to anyone if it was stolen.
  • Speeds up and simplifies the Win10 setup process – you can optionally skip quite a few of the steps you normally need to work through in Win10, e.g. agreeing to the EULA, choosing privacy settings, and configuring whether the user will be an Administrator or a Standard user; depending on deployment mode you can even skip keyboard preferences.
  • Devices can be assigned to specific users, meaning that when they turn the device on for the first time and connect to the internet, they’re greeted by name as part of their organisation.
  • AutoPilot Reset allows an IT Admin to remotely reset the device, returning it to the original state, but keeping it joined to AzureAD and enrolled into Intune for management – think of this like a “spring clean” at the beginning of the school year or new Term.

In short, AutoPilot is designed to make your life easier!

Configuring Autopilot

For my demo and testing, I’m using an Acer B117 laptop, something that is available in the NZ Education Right Device Campaign: a low-cost, low-spec Win10 device with 4GB RAM and options for 64/128GB SSD storage. One of the beauties of AutoPilot is that supported OEMs can send each device’s unique Hardware Identifier (HWID) to the purchasing organisation/school in advance of the devices arriving, allowing the entire environment to be configured before you even receive the hardware.

An obvious upside of this is the ability to ‘drop ship’ devices to remote employees directly from purchase, without IT Admins ever needing to sight the device.

In my case, I needed to manually extract the HWID from the Acer laptop, which can easily be accomplished with some basic PowerShell (run as local Administrator):

md c:\HWID
Set-Location c:\HWID
Set-ExecutionPolicy Unrestricted
Install-Script -Name Get-WindowsAutoPilotInfo
Get-WindowsAutoPilotInfo.ps1 -OutputFile AutoPilotHWID.csv

PowerShell.PNG

Basic PowerShell commands will allow you to extract the unique Hardware Identifier (HWID) for your existing device – this is required for AutoPilot to run

With the HWID obtained, the process to complete the configuration of AutoPilot is easy to follow using the step-by-step instructions here; it largely consists of the following steps:

  1. Add your devices (HWID) into Intune
  2. Create an AutoPilot Device Group (tells Intune which devices in your organisation should be managed by AutoPilot). Note you can do both static and dynamic rules for adding devices here.
  3. Create an AutoPilot Deployment Profile – this is the configuration settings you want to choose and allows you to skip a number of the standard Win10 decisions that need to be made when a device is being set up for the first time.
  4. Assign an AutoPilot Deployment Profile to a Device Group – this matches what you’ve created in Step 2 with Step 3
  5. Assign a user to a specific AutoPilot Device – this optional step allows you to match a user in your organisation with a specific device. The net result of this is the first time the user turns on the computer and connects it to the internet their name and email address is pre-populated in the setup process, meaning they only need to confirm their password during the setup – very cool!
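For step 2, if you prefer a dynamic group that automatically picks up every Autopilot-registered device, Azure AD supports a dynamic membership rule along these lines (a config fragment from memory of the docs – check the current Autopilot documentation for the exact syntax before relying on it):

```
(device.devicePhysicalIDs -any _ -contains "[ZTDId]")
```

Every device imported with an Autopilot hardware hash carries a ZTDId value, so a group with this rule grows automatically as you register new devices, with no manual membership management.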

The documentation I’ve linked to is pretty clear – it took me about thirty minutes to follow along and set the above up the first time I ran it.

Enrolling Your Device

Now the fun really begins. With the configuration completed, you can take your brand new ‘out of the box’ device and enroll it using AutoPilot for a truly streamlined, managed experience.

I took some photos of the experience using my phone camera (photo quality is average) and anyone that has ever set up Windows 10 will be familiar with this process:

2019-03-01 15.09.13

A user must always choose their region

2019-03-01 15.09.31

Keyboard preference remains a requirement

2019-03-01 15.09.41

2019-03-01 15.10.17

At this point, once the device is connected to the internet it will automatically join AzureAD and enroll into Intune because the HWID is registered with your tenant. Further Win10 setup steps can be optionally skipped at this point based on the Autopilot Profile configuration.

2019-03-01 15.10.30

The device immediately starts to configure itself based on the AutoPilot Deployment Profile you’ve created and assigned to the Device Group

2019-03-08 11.43.43

This screenshot shows AutoPilot busily configuring the device and giving progress updates – the time this takes varies based on how many apps you’ve chosen to push to the device.

2019-03-01 15.12.38_LI

Done! Note the following: 1) the school logo is displayed; 2) the user is greeted by name if the device is specifically assigned to a user; 3) the school/organisation name is displayed; 4) the user’s email address is displayed; 5) a customisable welcome message is displayed with contact details for assistance.

At this point, the device settings and applications are installed (or possibly still coming down over the internet) but the device is ready for use.

The end user had minimal choices and actions required of them:

  1. Choose their country
  2. Choose their keyboard
  3. Connect to the internet (this could even be their home WiFi)
  4. Enter their organisation password (Office365)

My Thoughts

Modern deployment relies on the cloud for identity and provisioning of devices – there are no on-premises servers in the above model. This allows for fast, flexible and lower-cost management of devices – something that appeals to education institutes, where every dollar counts!

Whilst I’ve gone through the configuration pretty quickly above, along with a high level ‘rationale’ of why you’d want to do this, the next post will go a bit deeper into when to use Intune vs Intune for Education, and a couple of tweaks to make your devices run even faster at sign in and have key applications appear instantly whenever a new user signs in. I’ll likely post this in the next week or so.


Is drought on the horizon? Researchers turn to AI in a bid to improve forecasts

As winter drags on, some people wonder whether to pack shorts for a late-March escape to Florida, while others eye April temperature trends in anticipation of sowing crops. Water managers in the western U.S. check for the possibility of early-spring storms to top off mountain snowpack that is crucial for irrigation, hydropower and salmon in the summer months.

Unfortunately, forecasts for this timeframe — roughly two to six weeks out — are a crapshoot, noted Lester Mackey, a statistical machine learning researcher at Microsoft’s New England research lab in Cambridge, Massachusetts. Mackey is bringing his expertise in artificial intelligence to the table in a bid to increase the odds of accurate and reliable forecasts.

“The subseasonal regime is where forecasts could use the most help,” he said.

Mackey knew little about weather and climate forecasting until Judah Cohen, a climatologist at Atmospheric and Environmental Research, a Verisk business that consults about climate risk in Lexington, Massachusetts, reached out to him for help using machine learning techniques to tease out repeating weather and climate patterns from mountains of historical data as a way to improve subseasonal and seasonal forecast models.

The preliminary machine learning based forecast models that Mackey, Cohen and their colleagues developed outperformed the standard models used by U.S. government agencies to generate subseasonal forecasts of temperature and precipitation two to four weeks out and four to six weeks out in a competition sponsored by the U.S. Bureau of Reclamation.

Mackey’s team recently secured funding from Microsoft’s AI for Earth initiative to improve and refine its technique with an eye toward advancing the technology for the social good.

“Lester is working on this because it is a hard problem in machine learning, not because it is a hard problem in weather forecasting,” noted Lucas Joppa, Microsoft’s chief environmental officer who runs the AI for Earth program, as he explained why his group is helping fund the research. “It just so happens that the techniques he is interested in exploring have huge applicability in weather forecasting, which happens to have huge applicability in broader societal and economic domains.”

Fields being irrigated on the edge of the desert in the Cuyama Valley. Photo by Getty Images.

AI on the brain

Mackey said current weather models perform well up to about seven days in advance, and climate forecast models get more reliable as the time horizon extends from seasons to decades. Subseasonal forecasts are a middle ground, relying on a mix of variables that impact short-term weather such as daily temperature and wind and seasonal factors such as the state of El Niño and the extent of sea ice in the Arctic.

Cohen contacted Mackey out of a belief that machine learning, the arm of AI that encompasses recognizing patterns in statistical data to make predictions, could help improve his method of generating subseasonal forecasts by gleaning insights from troves of historical weather and climate data.

“I am basically doing something like machine learned pattern recognition in my head,” explained Cohen, noting that weather patterns repeat throughout the seasons and from year to year and that therefore pattern recognition can and should inform longer-term forecasts. “I thought maybe I can improve on what I am doing in my head with some of the machine learning techniques that are out there.”

Using patterns in historical weather data to predict the future was standard practice in weather and climate forecast generation until the 1980s. That’s when physical models of how the atmosphere and oceans evolve began to dominate the industry. These models have grown in popularity and sophistication with the exponential rise in computing power.

“Today, all of the major climate centers employ massive supercomputers to simulate the atmosphere and oceans,” said Mackey. “The forecasts have improved substantially over time, but they make relatively little use of historical data. Instead, they ingest today’s weather conditions and then push forward their differential equations.”

A combine harvester moving on a snow-covered field. Photo by Getty Images.

Forecast competition

As Mackey and Cohen were discussing a research collaboration, Cohen received notice of a competition sponsored by the U.S. Bureau of Reclamation to improve subseasonal forecasts of temperature and precipitation in the western U.S. The government agency is interested in improved subseasonal forecasts to better prepare water managers for shifts in hydrologic regimes, including the onset of drought and wet weather extremes.

“I said, ‘Hey, what do you think about trying to enter this competition as a way to motivate us, to make some progress,’” recalled Cohen.

Mackey, who was an assistant professor of statistics at Stanford University in California prior to joining Microsoft’s research organization and remains an adjunct professor at the university, invited two graduate students to participate on the project. “None of us had experience doing work in this area and we thought this would be a great way to get our feet wet,” he said.

Over the course of the 13-month competition, the researchers experimented with two types of machine learning approaches. One combed through a kitchen sink of data containing everything from historical temperature and precipitation records to data on sea ice concentration and the state of El Niño as well as an ensemble of physical forecast models. The other approach focused only on historical data for temperature when forecasting temperature or precipitation when forecasting precipitation.

“We were making forecasts every two weeks and between those forecasts we were acquiring new data, processing it, building some of the infrastructure for testing out new methods, developing methods and evaluating them,” Mackey explained. “And then every two weeks we had to stop what we were doing and just make a forecast and repeat.”

Toward the end of the competition, Mackey’s team discovered that an ensemble of both machine learning approaches performed better than either alone.
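As a purely illustrative sketch (not the team’s actual code or data), an ensemble of the two approaches can be as simple as averaging their predictions. Here a historical-data model (a climatological mean) is combined with a least-squares regression on extra features; the data and feature meanings are invented stand-ins:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 200 past temperature observations plus two extra features
# (stand-ins for predictors like sea-ice concentration and an El Nino
# index -- invented for illustration only).
n = 200
extra = rng.normal(size=(n, 2))
temp = 10 + 0.5 * extra[:, 0] - 0.3 * extra[:, 1] + rng.normal(scale=0.1, size=n)

# Approach 1: historical data only -- forecast the climatological mean.
hist_forecast = temp.mean()

# Approach 2: "kitchen sink" -- least-squares regression on the features.
X = np.column_stack([np.ones(n), extra])
coef, *_ = np.linalg.lstsq(X, temp, rcond=None)
ml_forecast = np.array([1.0, 0.8, -0.5]) @ coef  # feature values for the target date

# Ensemble: average the two forecasts.
ensemble_forecast = 0.5 * (hist_forecast + ml_forecast)
```

Averaging is the simplest ensembling rule; in practice the weights themselves can be learned from past forecast performance.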

Final results of the competition were announced today. Mackey, Cohen and their colleagues captured first place in forecasting average temperature three to four weeks in advance and second place in forecasting total precipitation five and six weeks out.

A flooded river under a walking bridge. Photo by Getty Images.

Forecast for the future

After the competition, the collaborators combined their ensemble of machine learning approaches with the standard models used by U.S. government agencies to generate subseasonal forecasts and found that the combined models improved the accuracy of the operational forecast by between 37 and 53 percent for temperature and 128 and 154 percent for precipitation. These results are reported in a paper the team posted on arXiv.org.
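The percentages above come from the skill metric defined in the team’s paper; as a loose illustration of how such an improvement figure can be computed, one common approach compares a combined model’s mean squared error against the operational baseline’s (the numbers below are invented, not the paper’s data):

```python
import numpy as np

def percent_improvement(errors_baseline, errors_combined):
    """Percent reduction in mean squared error relative to a baseline
    forecast (an illustrative metric, not the paper's exact skill score)."""
    mse_base = np.mean(np.square(errors_baseline))
    mse_comb = np.mean(np.square(errors_combined))
    return 100.0 * (mse_base - mse_comb) / mse_base

# Invented per-forecast errors for a baseline and a combined model.
baseline = np.array([2.0, -1.5, 1.0, -2.5])
combined = np.array([1.0, -0.5, 0.5, -1.0])
improvement = percent_improvement(baseline, combined)
```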

“I think we will continue to see these types of approaches be further refined and increase in the breadth of their use within the field of forecasting,” said Kenneth Nowak, water availability research coordinator with the U.S. Bureau of Reclamation, who organized the forecast rodeo. He added that government agencies will “look for opportunities to leverage” machine learning in future generations of operational forecast models.

Microsoft’s AI for Earth program is providing funding to Mackey and colleagues to hire an intern to expand and refine their machine learning based forecasting technique. The collaborators also hope that other machine learning researchers will be drawn to the challenge of cracking the code to accurate and reliable subseasonal forecasts. To encourage these efforts, they have made available to the public the dataset they created to train their models.

Cohen, who kicked off the collaboration with Mackey out of a curiosity about the potential impact of AI on subseasonal to seasonal climate forecasts, said, “I see the benefit of machine learning, absolutely. This is not the end; more like the beginning. There is a lot more that we can do to increase its applicability.”

Related:

John Roach writes about Microsoft research and innovation. Follow him on Twitter.


Game with Captain Marvel at your side with Xbox’s latest Custom Console Sweepstakes

To celebrate the upcoming release of Marvel Studios’ “Captain Marvel,” Xbox and Marvel Studios are excited to announce the Xbox One X Captain Marvel Custom Console Sweepstakes!

Xbox is thrilled to be celebrating one of Marvel Cinematic Universe’s most powerful heroes, timed with the movie’s much-anticipated release tomorrow. If you’re a fan of Marvel, this sweepstakes is your opportunity to game in style with Captain Marvel at your side on the world’s most powerful console.

Visit our Xbox Twitter channel to see the custom Captain Marvel Xbox One X console and enter for your chance to win! For more details, including eligibility, please visit the official rules for terms and restrictions.

Set in the 1990s, Marvel Studios’ “Captain Marvel” is an all-new adventure from a previously unseen period in the history of the Marvel Cinematic Universe that follows the journey of Carol Danvers as she becomes one of the universe’s most powerful heroes. While a galactic war between two alien races reaches Earth, Danvers finds herself and a small cadre of allies at the center of the maelstrom. The film stars Brie Larson, Samuel L. Jackson, Ben Mendelsohn, Djimon Hounsou, Lee Pace, Lashana Lynch, Gemma Chan, Rune Temte, Algenis Perez Soto, Mckenna Grace, Annette Bening, Clark Gregg, and Jude Law.

Marvel Studios’ “Captain Marvel” is produced by Kevin Feige and directed by Anna Boden & Ryan Fleck. Louis D’Esposito, Victoria Alonso, Jonathan Schwartz, Patricia Whitcher and Stan Lee are the executive producers. The story is by Nicole Perlman & Meg LeFauve and Anna Boden & Ryan Fleck & Geneva Robertson-Dworet, and the screenplay is by Anna Boden & Ryan Fleck & Geneva Robertson-Dworet. “Captain Marvel” opens on March 8, 2019 in U.S. theaters.

Stay tuned to Xbox Wire for future sweepstakes opportunities and don’t forget to catch the premiere of Marvel Studios’ “Captain Marvel”!


Progress report on digital transformation in healthcare


It’s been an incredible year so far for the health industry. We’ve seen the dream and the opportunity of digital transformation and AI start to really take shape in the marketplace.

We saw many examples of this last month at HIMSS 2019, where many of our partners and other cloud providers are offering commoditized access to complex healthcare algorithms and models to improve clinical and business outcomes.

Trust

These examples show how cloud computing and AI can deliver on the promise of digital transformation. But for health organizations to realize that potential, they have to trust the technology—and their technology partner.

Microsoft has always taken the lead on providing cloud platforms and services that help health organizations protect their data and meet their rigorous security and compliance requirements. Recently, we announced the HIPAA eligibility and HITRUST certifications of Microsoft Cognitive Services and Office 365.

It’s crucial for health organizations to feel utterly confident not only in their technology partner’s ability to help them safeguard their data and infrastructure, and comply with industry standards, but also in their partner’s commitment to help them digitally transform their way—whatever their needs or objectives are. Our mission is to empower every person and every organization on the planet to achieve more. So whether you’re a health provider, pharmaceutical company, or retailer entering healthcare, your mission is our mission. Our business model is rooted in delivering success rather than disruption for our customers.

Interoperability

Another point of vital importance as we support the movement of healthcare as an industry—and healthcare data specifically—to the cloud is ensuring that we avoid the sins of the past, specifically data silos.

To that end, we jointly announced with other leading cloud providers that we’re committed to healthcare data interoperability among cloud platforms and supporting common data standards like Fast Healthcare Interoperability Resources (FHIR). And I was particularly thrilled to see the excitement in the health industry in reaction to our launch last month of Azure API for FHIR and our commitment to develop open source FHIR servers. I hope you’ll join the huge movement behind health interoperability fueled by FHIR and encourage your technologists to start actively using the open-source project to bring diverse data sets together—and to build systems that learn from those data sets.

As my colleague, Doug Seven, recently wrote, interoperability helps you bring together data from disparate sources, apply AI to it to gain insights, and then enrich care team and patient tools with those insights to help you achieve your mission. That’s a crucial step in the digital transformation of health.
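To make the FHIR idea concrete, here is a minimal Patient resource assembled in Python. The field names follow the FHIR specification, while the values (and the idea of POSTing the payload to a server such as Azure API for FHIR) are illustrative only:

```python
import json

# A minimal FHIR Patient resource with invented demo values.
patient = {
    "resourceType": "Patient",
    "id": "example-001",
    "name": [{"family": "Doe", "given": ["Jane"]}],
    "gender": "female",
    "birthDate": "1980-04-01",
}

# Serialize as it would travel over a FHIR REST API
# (e.g., a POST to a server's /Patient endpoint).
payload = json.dumps(patient)

# Basic sanity checks any consumer of the payload could perform.
parsed = json.loads(payload)
assert parsed["resourceType"] == "Patient"
```

Because every FHIR-conformant system agrees on this shape, the same record can move between cloud platforms without bespoke translation layers.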

Teamwork

Another crucial step is supporting health teamwork. With the changing nature of care delivery, health services increasingly require coordination across multiple care settings and health professionals. So we added a new set of capabilities to our Teams platform that provide workflows first-line clinical workers such as doctors and nurses can use to access patient information and coordinate care in real time.

The end game

Why does all of this matter? To answer that question, I always come back to the quadruple aim, which all of us in the health industry strive for: enhancing both patients’ and caregivers’ experiences, improving the health of populations, and lowering the costs of healthcare.

Empowering care teams and patients with data insights and tools that help them coordinate care—and that they and your health organization can trust—will help bring about the desired outcomes of the quadruple aim. Not only will this systemic change improve clinical and business outcomes, but also, at an individual level, enhance the day-to-day and digital experiences of clinical workers and patients alike—creating better experiences, better insights, and better care across the delivery system.

Learn more about real-world use cases for AI in the e-book: “Breaking down AI: 10 real applications in healthcare.”


Immersive Reader comes to OneNote iPhone; Mac and iPad now support Math in Immersive Reader

In March we released a set of inclusive updates to enable students of all abilities to more easily access content. These updates are across both OneNote and Word, have rolled out broadly, and for the most part are available today.

  • Immersive Reader for OneNote iPhone – we brought the Immersive Reader to OneNote for iPhone, making content even more accessible on the go. This is 50% rolled out worldwide, and going to 100% next week (March 11th).
  • OneNote iPad and Mac support for math and equations – if a student has math equations on a page, when launching the Immersive Reader, the math and equations are recognized, and all Immersive Reader capabilities work, including Read Aloud, Line Focus, Page Theme Colors, and more. Great for story problems as well.
  • Word Online support for Math equations in Immersive Reader – this capability has now rolled out to 100% worldwide!

Thanks for your continued feedback, and if you have any questions or suggestions, don’t hesitate to reach out! 🚀

Mike Tholfsen

Microsoft Education

@mtholfsen



Secure your digital transformation through simplicity with help from a new Forrester study

Sometimes, technology can make things overly complex.

Even with the best of intentions, there can be too much of a good thing. In the world of cybersecurity, complexity has been a mainstay, but in recent years, it has grown beyond its breaking point and has become a liability for security practitioners.

The Forrester study, titled Security Through Simplicity (Dec. 2018) and commissioned by the Microsoft Security team, clearly shows that digital transformation, while necessary for business success, compounds the complexity of an already tangled security threat landscape. However, the study also found that vendor consolidation and strategy modernization correlate with reduced security complexity.

Digital transformation introduces new levels of complexity

Digital transformation is a critical shift under which businesses are using data-powered platforms and applications to improve nearly every aspect of their business operations. New open ecosystems and the democratization of data mean more users in varied locations sharing data across more applications, devices, platforms, and environments—both internally and externally.

As businesses continue to digitize processes, security teams must contend with an increase in attack vectors and more complicated management, all while keeping pace with increasingly sophisticated attackers. In the face of this massive challenge, security teams must evaluate and refresh their legacy security procedures, tools, and skill sets to accommodate a new and adaptable approach to enterprise security.

In the study, paid for by Microsoft, Forrester asked 481 IT security decision makers, “How challenging are the following security goals/objectives to achieve?” and found them all to be highly or extremely challenging:

Infographic showing 59% correlate security alerts from disparate technologies to detect actual threats, 57% hire trained IT security staff, 57% modernize their organization's IT security strategies, and 60% retrain IT security staff.

Reducing security complexity

So how are enterprise IT security teams successfully reducing complexity to improve their security efforts in the face of digital transformation? The study found an interesting correlation between vendor consolidation and strategy modernization in successfully achieving both business and security initiatives, when executed in concert with each other.

A high number of disparate security solutions in place for on-premises and cloud infrastructure and applications makes visibility and central management extremely difficult. Reducing the number of disparate security point solutions that must interact with each other—particularly older, legacy ones—brings complexity down to a manageable level and allows businesses the visibility, security, and control to expand their digital adoption with confidence. Vendor consolidation and modernization can also yield cost savings by lowering technology budgets, increasing management efficiencies, and avoiding the costs of a data breach or regulatory noncompliance.

A small subset (11 percent) of enterprises has successfully achieved both critical initiatives, modernization and vendor consolidation, and has been able to reduce complexity and reap the rewards of digital transformation. These organizations are:

  • 54 percent more likely to feel that their IT security strategy helps them to digitally transform their organization.
  • 42 percent more likely to feel that their IT security strategy helps reduce risk of a customer data breach.
  • 33 percent more likely to feel that their IT security strategy improves their customers’ experiences.

Key recommendations

Companies undergoing digital transformation seek new ways to engage with customers, create additional revenue streams, and place innovation at the forefront of their corporate strategy. Failing to secure their digital assets can lead to those same organizations forfeiting hard-won successes.

Forrester’s in-depth survey of 481 IT security decision makers yielded several important recommendations:

  • Implement security by design.
  • Consolidate security vendors and security solutions.
  • Increase measurement, analytics, and reporting capabilities.
  • Discover and manage shadow IT.
  • Adapt security to users.

Get your copy of the full study.


All-new ‘Inside Xbox’ coming March 12

Inside Xbox is back on Tuesday, March 12, at 2 p.m. PT / 5 p.m. ET with even more exclusive news, content, interviews and footage you won’t see anywhere else!

We’ll have some exciting news involving Halo: The Master Chief Collection! The hourlong episode will also feature DayZ, One Piece World Seeker and, of course, Xbox Game Pass! There’s much more that we can’t reveal just yet, so be sure to tune in live on Mixer, Twitch, YouTube, Facebook, and Twitter on Tuesday, March 12, at 2 p.m. PT / 5 p.m. ET.


Coming soon to Xbox Game Pass: ‘Just Cause 4,’ ‘LEGO Batman 2,’ ‘Fallout 4’ and more

Hey, Xbox Game Pass members, we’ve got even more video games coming your way because well… that’s what we do. Honestly, would you have it any other way? Whether you want to soar through explosions in a rainbow-colored wingsuit or stroll through hordes of Bloatflies in an irradiated ocean (no judgment, if that’s what you’re into), we’ve got you covered. All that is in addition to the 100+ great games you already have at your fingertips. If you can’t wait, we should also mention that you can remotely install these games to your home console right when they drop by using the Xbox Game Pass mobile app. Then they’ll be ready to play as soon as you are.

Alright, enough talk. Let’s get to the games:

Just Cause 4 (March 6)

Is it a bird? Is it a plane? No, it’s Rico Rodriguez gliding to Xbox Game Pass on March 6! Xbox One players will be able to play as the rugged rogue agent and venture to the Island of Solis where you can unleash chaos in a number of explosive & creative ways. With Rico’s newly customizable grappling hook, you’ll be able to stride, glide and ride through exotic landscapes besieged by extreme weather, pushing you and enemy Black Hand militia to their limits as you lead an army of chaos. Lightning-ridden rainforests, roaring deserts, howling snowstorms…Oh and did we mention tornados!? Take in the sights and stay a while. Just Cause 4 is an open world sandbox designed for exploratory, experimental, over-the-top fun, ready for you on Xbox Game Pass.

LEGO Batman 2 (March 6)

Calling all crime fighters! Batman, Robin, Wonder Woman, Superman, and more playable DC Comics characters than you can shake a shark-repellent-covered stick at are back to save Gotham City. Play as different members of the Justice League in this open world single or multiplayer game, while stopping Lex Luthor, Joker, and more notorious villains in their tracks.

F1 2018 (March 14)

The official videogame of the 2018 FIA Formula One World Championship, F1 2018 challenges you to make headlines as you become immersed in the world of F1 more than ever before. You will have to build your reputation both on and off the track, with time-pressured media interviews that influence your career in the sport. Do you exhibit sportsmanship or showmanship? Will you develop your team to the top or send your agent to target a rival team and driver? F1 2018 puts you in control of your destiny, featuring all of the official teams, the drivers and all 21 circuits of the thrilling 2018 season. 2018 sees the return of the French and German Grands Prix to the calendar, meaning that you can now race at Circuit Paul Ricard for the first time in the series, while the Hockenheimring also makes a return.

Fallout 4 (March 14)

Returning to your Xbox Game Pass library in a shiny set of Power Armor, Fallout 4 drops you back into the Commonwealth as you set out to search for your missing son. Explore the wasteland, fight vicious Deathclaws and gain new abilities to help shape the fate of a post-apocalyptic Boston. The more you explore, the more you discover, so be sure to scour every inch of the game’s huge map. After all, there will always be another settlement that needs your help.

So many games, so little time. Hopefully that’ll be enough to satisfy your gaming palate… for now! If you want to stay up to date on all the gaming goodness coming at you every week, be sure to follow us on Twitter and Instagram, and download the mobile app. And no worries if you get lost between now and our next drop, ‘cause we’ll keep the LEGO Bat-Signal on for ya!

Join Xbox Game Pass Today

Xbox Game Pass is the new way for you to discover and play your next favorite game. Enjoy over 100 great games for one low monthly price. Plus, even more games are added all the time, including highly-anticipated new Xbox One exclusives the day they’re released. If you haven’t tried Xbox Game Pass, join today and get your first month for $1.