Xbox now connects with Cortana and Alexa-enabled devices

Here at Team Xbox, we have a long history of offering voice controls as a way to interact with your Xbox console through Kinect and headsets. Today, starting with select Xbox Insiders in the U.S., we’re expanding voice support by introducing the Xbox Skill, which enables you to navigate and interact with Xbox One using voice commands through your Cortana and Alexa-enabled devices.

With the Xbox Skill, you can use voice commands to power your Xbox One console, adjust volume, launch games and apps, start and stop broadcasts on Mixer, capture screenshots, and more. It’s the fastest way to get into your games and one of the easiest ways to interact with your console for everyday tasks. For example, if you have the skill enabled on your Echo and you’re part of the Insider preview, just say “Alexa, start Rocket League,” and the command will automatically turn on your console, sign you in, and launch your game.

The Xbox Skill integrates with Cortana and Alexa-enabled devices such as a Windows 10 PC, Amazon Echo, Harman Kardon Invoke, or Sonos One, as well as the Cortana and Alexa apps on iOS and Android, so you can control your Xbox One console by voice.

For Xbox Insiders* in the U.S. who want to try the Xbox Skill with Cortana or Alexa, here’s how to get started:

If you use Cortana:

  1. Sign into the Xbox you want to control.
  2. On your Windows 10 PC, click here and sign in with your Microsoft account to link the skill.
  3. Try your first command! “Hey Cortana, tell Xbox to open Netflix.”

If you use Alexa:

  1. Sign into the Xbox you want to control.
  2. Click here, sign in with your Amazon Account, and click Enable.
  3. Sign in with your Microsoft account to link the skill.
  4. Let Alexa discover your console, then follow the instructions to pair your console with Alexa.
  5. Try your first command! “Alexa, start Rocket League.”

Wondering what else the Xbox Skill can do? Just say “Ask Xbox what can I say?” to discover more commands for your console. For a full list of commands, troubleshooting assistance, and to give the team feedback and ideas, you can visit the Xbox Insider Subreddit.

As always, your feedback is important to us and our partners as we continue to evolve this experience and grow our voice integration across devices, digital assistants and voice services.

*Note to Xbox Insiders: We will be rolling the Xbox Skill out to Xbox Insider rings gradually. If the Digital Assistant setting is visible on your console in Settings -> Devices, then you are currently eligible to test the Xbox Skill. If it doesn’t appear, then please be patient as we are working quickly to add more Insider rings to the beta.

Living on the edge: How a Microsoft apprenticeship helped a former gang member turn his back on the streets

“Our main users were women and they said the guys at their gym were pervs or muscly and intimidating; or it was a scary concept – what should be like going to a doctor is more like trying to work out in front of the strongest guy in class,” Uwadiae says.

Three people turned up to WeGym’s first group session, which was led by a friend of Uwadiae, and two went to the next one. His new business was up and running.

In the months that followed, Uwadiae taught himself to code so he could improve the website during the day. At night he would sneak around London in a hoodie to put up posters advertising WeGym, which would get ripped down again the next morning.

“I would put up posters for three or four hours until 2am or 3am, go home, sleep, get up at 8am and take bookings. We got five or six customers the first time I did it, then 10 after the next one.

“We have the opportunity to change the narrative around who can access a personal trainer and what the product of a personal trainer is. We’ve democratised it in a small way, for a small subset of people.”

Microsoft Hackathon 2017 winner powers Mixer’s massively successful HypeZone

HypeZone, released in December 2017, rapidly brought millions of new users to the livestreaming community Mixer. HypeZone’s secret weapon? The 2017 Hackathon Grand Prize Winner, Watch For, a Microsoft Garage project.

Last month, Microsoft’s fifth annual One Week Hackathon wrapped up with astounding numbers. This year, during the largest private hackathon on the planet, over 23,000 employees registered to hack, and ultimately created 5,800 projects. As judging for this year’s projects begins and eager hackers await the winner announcements, it’s the perfect time to reconnect with last year’s Grand Prize Winner.

Originally called Lookout, the project team now known as Watch For has made tremendous strides in both personal growth and Microsoft business growth. Over the past year, team members Lenin Ravindranath Sivalingam, Matthai Philipose and Peter Bodik have been working as an incubation startup within Microsoft Research with autonomy and ownership to steer their project in a direction they desire.

The team’s original idea, which won the 2017 Hackathon, was an app to monitor live video streams on behalf of a user and notify him or her when specified events occur. Such a seemingly simple idea, powered by artificial intelligence, can be very powerful and has many different applications.

Hackathon 2017 winning team: Matthai Philipose, Lenin Sivalingam, Yifan Wu, Peter Bodik and Victor Bahl. (Photo by Elizabeth Ong)

As part of Microsoft Research, the project team members previously worked on video analytics for enterprise scenarios in their day jobs. One of their biggest partnerships was with a city, monitoring and analyzing traffic cameras to better understand how pedestrians, bikes, and vehicles crossed intersections.

Not surprisingly, livestreams are big in enterprise settings, and that translates as well to consumer settings. For Lenin, Matthai, and Peter, the most interesting part of working on a hack project was experimenting with how best to apply video analysis to consumer scenarios.

“What attracted me to this hackathon project was the chance to apply AI in large scale and at low-cost to the consumer setting. Our project really pushes the envelope on how efficient the AI systems would need to be, and it’s also meaningful in that my kids and mother can understand it and use it,” Matthai explained, adding, “And I love the idea of working with Lenin and Peter.”

The team took what they learned over the years about video analytics and traffic cams, and created such a compelling project that not only did Microsoft CEO Satya Nadella put his influence behind them, but the senior leadership team took notice and became excited about the possibilities. Ed Essey, principal program manager of Microsoft Garage, helped prepare the team to think and work like a lean startup.

Over the course of several months, they fine-tuned a business strategy for their product – including the team’s special blend of expertise, knowledge, experience, and idea-leadership – that led the team to work on Watch For full time.

In September 2017, a few weeks after the team’s Hackathon win, the Mixer group reached out to the team, having seen their project video. Mixer, acquired by Microsoft in 2016 as Beam, is a next-generation, interactive live streaming platform with a large gaming audience.

Taking a community-first focus on features, Matt Salsamendi, principal software engineering lead for Mixer, and Chad Gibson, general manager of Mixer, saw a huge opportunity to accelerate Mixer’s vision in the computer vision space and were excited to partner with other Microsoft teams working in this area.

HypeZone Fortnite

The more popular games on Mixer tend to be multiplayer battle-royale style competitions where the last person standing wins. “Games like PlayerUnknown’s Battlegrounds (PUBG) and Fortnite are pretty new. For these games, a very simple thing works very well to light up Mixer scenarios,” Peter explained.

The scenario that Matt and Chad of Mixer wanted to execute on was how best to surface the most interesting parts of streams to a bigger audience. There are thousands of streams at any given time, of which only a couple hundred get viewed by most people. How do the rest of the streamers get any visibility and how do you avoid wasting those assets? How do Mixer fans discover those hidden gems? “The game streaming ecosystem has lots of undiscovered content, people wanting to be discovered, and viewers wanting to discover more compelling moments.”

Lenin, Matthai, and Peter started to work closely with the Mixer team last September, and an ambitious goal organically formed: launching new channels in the winter of 2017, tailored with content discovered by AI models trained to “Watch For” specific events in streams. The timing coincided with PUBG’s release on Xbox One, which was fast becoming one of the most popular games on Mixer.

Mixer already had a front-end design where a single channel could host many different people’s streams continuously – they took advantage of that, and queried Watch For’s backend to determine when to switch between streams for the most interesting content. Thus, HypeZone was born – channels on Mixer using Watch For algorithms to highlight the final, nail-biting rounds of last-person standing games like PUBG that viewers found so engaging to watch.
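
To make that switching mechanic concrete, here is a minimal sketch of the kind of loop described above. It is illustrative only: `get_excitement_scores` and `switch_channel_to` are hypothetical stand-ins for Watch For’s backend and Mixer’s front end, neither of which is a public API, and the threshold and polling interval are made up.

```python
import time

SWITCH_THRESHOLD = 0.8   # illustrative: only feature streams scoring above this
POLL_INTERVAL_S = 5      # how often to re-query the scoring backend

def get_excitement_scores():
    """Placeholder for a Watch For-style backend query.

    Returns a mapping of stream_id -> score in [0, 1] describing how
    'exciting' the current moment of each live stream looks.
    """
    raise NotImplementedError("hypothetical backend call")

def switch_channel_to(stream_id):
    """Placeholder for the front-end call that points the channel at a
    different live stream."""
    print(f"Now featuring stream {stream_id}")

def run_hype_channel():
    current = None
    while True:
        scores = get_excitement_scores()
        best_stream, best_score = max(scores.items(), key=lambda kv: kv[1])
        # Only switch when another stream is clearly more exciting.
        if best_stream != current and best_score >= SWITCH_THRESHOLD:
            switch_channel_to(best_stream)
            current = best_stream
        time.sleep(POLL_INTERVAL_S)
```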

“Matt already had the idea of HypeZone itself, to switch from stream to stream within a channel – but the experience of HypeZone evolved very quickly during our collaboration,” Lenin recalled. “We met with Matt and Chad early September. Two weeks later we had a prototype that we showed them. Then we kept improving its accuracy. By mid-October we had another prototype that they could use to run their HypeZone experience. We tested it for another 3 weeks. Then, 2 days before release, PUBG changed their UI. 1 day before release, we had to completely change all our models.”

Despite the whirlwind of activity, the Watch For team appreciated Mixer’s style of working fast and friendly. “As a business group, Mixer is very agile and easy to work with. We work close and we work well together.”

VictoryRoyale

“The choice of content for HypeZone is determined by all the analysis Watch For does. Which is one of the reasons why we were able to move so fast,” Peter explained. Peter and team had to tailor their AI models for HypeZone by building core video analytics skills specific to each game.

Over the last several months, HypeZone channels were among the most popular channels on Mixer. “It’s a win-win product. Viewers love it because it shows only the most exciting content, and streamers love it because they get featured on Mixer’s front page and get new followers. They start streaming more because they want to be featured on HypeZone and gain followers.” Game producers can also be counted among the many fans as HypeZone provides more exposure for their games.

The biggest challenge – and the team’s biggest accomplishment – was getting HypeZone to scale at low cost.

“HypeZone is driven by Watch For’s large-scale video analysis of every stream that’s coming into Mixer. For every stream, we try to understand what’s on the screen. We look for various metadata that tell us the game is exciting: text on the screen, icons that tell you the state of the game, player stats and score. Over time we have evolved to understand more and more,” Lenin explained.
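
As an illustration of the kind of on-screen metadata Lenin describes, a sketch like the one below could read a battle-royale “players alive” counter out of a single frame with off-the-shelf OCR. The crop coordinates and endgame threshold are hypothetical, and this is not the Watch For pipeline itself, which the team built for far greater scale and efficiency.

```python
import re

import cv2          # OpenCV: frame handling and cropping
import pytesseract  # Tesseract OCR bindings

# Hypothetical region (x, y, width, height) where a battle-royale HUD
# shows the "players alive" counter; real coordinates depend on the game UI.
ALIVE_COUNTER_BOX = (1180, 20, 100, 40)

def players_remaining(frame):
    """Read the 'players alive' number from a video frame, or None."""
    x, y, w, h = ALIVE_COUNTER_BOX
    crop = frame[y:y + h, x:x + w]
    gray = cv2.cvtColor(crop, cv2.COLOR_BGR2GRAY)
    text = pytesseract.image_to_string(gray, config="--psm 7 digits")
    match = re.search(r"\d+", text)
    return int(match.group()) if match else None

def is_endgame(frame, threshold=5):
    """A stream is 'hype' when only a handful of players are left."""
    alive = players_remaining(frame)
    return alive is not None and alive <= threshold
```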

The secret sauce is very much a combination of Matthai’s AI expertise and Lenin and Peter’s end-to-end distributed systems knowledge, which allows them to deeply and efficiently analyze and understand each stream’s content in real time.

“This is one of the advantages of being in a company like Microsoft. The Garage and Hackathon gave us visibility, but there was a product group (Mixer) out there looking around who had a great understanding of their customers, and that Watch For might light up their market.” Matthai recalled how it all came together. “There was an element of luck that battle royale type games came into vogue around the same time. It’s a combination of all of these things that made this partnership work so well.”

“It’s one thing to have cool demos and enthusiasm from senior leadership, but it’s another thing to see our customers enjoying, laughing and crying, wanting to see more. That’s what really lit a fire under the whole project, that connection.”

A game-changer for streaming content platforms and for how content can be surfaced and consumed, Watch For is a stellar example of using artificial intelligence for consumer scenarios. What’s next for Watch For? The team continues to work with Mixer, and other groups, to create awesome new experiences using the power of AI.

Story by Meixia Huang

Check out HypeZone on Mixer https://mixer.com/
Get videos on the Mixer Channel One on YouTube
Follow Mixer on Twitter: https://twitter.com/WatchMixer
Read more about this Hackathon team:
Artificial intelligence eclipses cloud and mobile projects to win the day at Microsoft 2017 Hackathon

Meet green warriors from India enabling a sustainable future

With a trained AI algorithm, the team hopes to classify urban and rural areas, identify forest cover, river beds, and other water bodies from satellite images, and create a precise grid map for the region. It also hopes to apply computer vision to create a comprehensive database of biodiversity in the region to help policymakers and local communities make better-informed economic, ecological, and infrastructure-related decisions.
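
A grid map of this kind is typically built by tiling the imagery and classifying each tile. The sketch below shows only that tiling step; `classify_tile` is a placeholder for whatever trained land-cover model the team ends up using, and the tile size and label set are illustrative assumptions, not the project’s actual design.

```python
import numpy as np

TILE = 64                                        # pixels per grid cell
CLASSES = ["urban", "rural", "forest", "water"]  # illustrative label set

def classify_tile(tile):
    """Placeholder for a trained land-cover classifier (e.g. a CNN).
    Should return an index into CLASSES for the given tile."""
    raise NotImplementedError("plug in a real model here")

def grid_map(image):
    """Split a satellite image (H x W x bands array) into fixed-size tiles
    and classify each one, producing a coarse land-cover grid."""
    rows, cols = image.shape[0] // TILE, image.shape[1] // TILE
    grid = np.empty((rows, cols), dtype=int)
    for r in range(rows):
        for c in range(cols):
            tile = image[r * TILE:(r + 1) * TILE, c * TILE:(c + 1) * TILE]
            grid[r, c] = classify_tile(tile)
    return grid
```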

“You can’t save an ecosystem if you don’t fully understand it,” exclaims Dr. Mariappan. “That’s where our data along with Microsoft’s AI resources can help.”

Tracking the monkey population in urban areas using AI-powered image recognition

The monkey population in urban India has spiraled out of control in recent years. India’s capital city, New Delhi, alone reports at least five cases of monkey bites daily, which can cause rabies and be fatal. It is estimated that 7,000 monkeys prowl the streets of the capital, damaging public property and attacking people. With their natural habitat shrinking owing to urbanization, authorities are struggling to prevent monkey attacks.

Managing the growth of the population is critical. Currently, there is no way to identify which monkeys have already been given birth control or sterilized without handling them further, such as by tattooing a code on or embedding a microchip in the animals. Ankita Shukla, a PhD student at Indraprastha Institute of Information Technology Delhi (IIIT Delhi), aims to use computer vision as a non-invasive alternative for identifying and tracking monkeys, one that is safer and less stressful for the animals as well as for humans.

Shukla, a native of a small town near Lucknow, had earlier worked with the Wildlife Institute of India on a project to classify endangered tigers in a nature reserve with machine learning and distance-object recognition algorithms. She wants to combine this experience in wildlife monitoring with machine learning to create a tangible solution for the simian problem in cities.

She is creating an AI-enabled app that can help the community tag monkeys in photographs and upload them to the cloud, where authorities can track the simian population’s growth, vaccination history, and movements. “With a bird’s eye view of the monkey population, we can deploy contraceptives more efficiently,” she says. “Training a deep neural network with image recognition to identify a monkey and its species, and whether it’s already been sterilized, could go a long way towards solving this crisis,” Shukla adds.
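
One way to structure the model Shukla describes is a shared image backbone with separate prediction heads for species and sterilization status. The PyTorch sketch below is a guess at that shape, not the team’s actual network; the backbone choice, species count, and input size are placeholders.

```python
import torch
import torch.nn as nn
from torchvision import models  # requires a recent torchvision for `weights=`

class MonkeyTagger(nn.Module):
    """Illustrative multi-head model: one backbone, separate heads for
    species and sterilization status."""

    def __init__(self, num_species=3):
        super().__init__()
        backbone = models.resnet18(weights=None)  # no pretraining, keeps sketch self-contained
        features = backbone.fc.in_features
        backbone.fc = nn.Identity()               # reuse the ResNet feature vector
        self.backbone = backbone
        self.species_head = nn.Linear(features, num_species)
        self.sterilized_head = nn.Linear(features, 1)  # binary logit

    def forward(self, images):
        feats = self.backbone(images)
        return self.species_head(feats), self.sterilized_head(feats)

# Example forward pass on a dummy batch of 224x224 RGB crops.
model = MonkeyTagger()
species_logits, sterilized_logit = model(torch.randn(2, 3, 224, 224))
```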

Having teamed up with Saket Anand, a professor at IIIT Delhi, she pitched the idea to the AI for Earth panel earlier this year. The team plans to leverage the Microsoft Azure platform for the processing power required to train the AI model.

“The Microsoft resources and technical assistance helped us develop a genuinely useful app,” says Shukla. “We’re now trying to take things to the next level so that we can find a solution to the monkey menace in a scientific and humane manner.”

Imagine Cup 2018: AI inspires next generation of developers

This post is authored by Nile Wilson, Software Engineer Intern at Microsoft.


Imagine Cup 2018 winning teams: smartARM (first place, front and center),
iCry2Talk (second place, attired in pink), and Mediated Ear (third place, at the right).

Every year, Microsoft hosts the Imagine Cup, a global competition bringing together creative, bright, and motivated students to develop technologies that will shape how we live, work, and play. This year, tens of thousands of students from across the world registered for the competition, but only 49 teams were selected to compete in the World Finals. In addition to the first, second and third place winners, this year’s competition also awarded the top projects in Artificial Intelligence (AI), Big Data, and Mixed Reality.

Of the 49 finalists, team smartARM won the competition with their innovative, inexpensive, AI-enabled prosthetic hand. The team was made up of Samin Khan from the University of Toronto and Hamayal Choudhry from the University of Ontario Institute of Technology. Although smartARM took home the top prize, all teams in the finals impressed the judges with their creativity and drive to have a positive impact on the world.

One other thing almost all the winning teams had in common: they used AI as a core part of their solutions.

Recent developments have accelerated the application of machine learning technologies across a wide variety of fields, from self-driving cars to AI-guided disease detection. There was a palpable sense of excitement around the profound and untapped capabilities of AI among the teams that participated at this year’s Imagine Cup.

“AI is empowerment.”
Joseph Sirosh, Corporate Vice President and CTO of AI, Microsoft

Winning AI Solutions

From helping farmers manage diseased plants to helping new parents identify the meaning of their babies’ cries, the Imagine Cup 2018 winners tackled a broad spectrum of problems. Although these solutions addressed different problems, their paths to success have a similar underlying structure – each team began by identifying a problem they were passionate about. Next, they carefully considered the resources available to them, including AI, and cleverly used those resources to build prototypes and solutions with potentially outsized impact. AI is truly in the mainstream now, empowering motivated individuals to turn their great ideas into reality and have impact on the world.

smartARM

Team smartARM from Canada won first place with their AI-infused robotic prosthetic arm, which is designed to provide users with a low-cost, multi-grip prosthesis. When individuals with upper-limb amputations decide whether they want to use a functional prosthesis, they are typically faced with the choice of purchasing a simple, single-grip arm or spending tens of thousands of dollars on a more advanced myoelectric arm. The high cost of advanced prostheses prevents many from having access to a device that could greatly aid in their day-to-day life. smartARM reduces this barrier by providing multi-grip functionality for approximately $100.


The smartARM on the stage of the Imagine Cup 2018 World Finals.

The trick to the low price achieved by smartARM is the use of 3D printing and readily accessible, low-cost cloud technology. The camera in the palm of the smartARM captures a video of whatever is in view of the arm. Once the user points the palm towards an object of interest, the video frames are sent to the Azure cloud, where the Cognitive Services Custom Vision model helps to identify the object and returns the most appropriate grip, based on the object’s size and shape. The grip determined by the model is then actuated on the smartARM once the user sends the activation signal.

This allows users to interact with a variety of objects, ranging from picking up house keys to holding a cup.
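
That flow can be sketched as a small client: send a frame to a Custom Vision prediction endpoint, then map the top label to a grip. The endpoint URL, key, and grip table below are placeholders, and the response fields (`predictions`, `tagName`, `probability`) follow the publicly documented Custom Vision prediction API as we understand it; treat the details as assumptions rather than smartARM’s actual code.

```python
import requests

# Placeholders: a real deployment would use its own Custom Vision project
# endpoint, prediction key, and published iteration name.
PREDICTION_URL = ("https://<region>.api.cognitive.microsoft.com/customvision/v3.0/"
                  "Prediction/<project-id>/classify/iterations/<iteration>/image")
PREDICTION_KEY = "<prediction-key>"

# Illustrative mapping from recognized object classes to grip patterns.
GRIP_FOR_OBJECT = {
    "cup": "cylindrical",
    "keys": "pinch",
    "phone": "lateral",
}

def choose_grip(frame_jpeg: bytes) -> str:
    """Send one camera frame to a Custom Vision classifier and map the top
    prediction to a grip; fall back to an open hand if nothing matches."""
    response = requests.post(
        PREDICTION_URL,
        headers={
            "Prediction-Key": PREDICTION_KEY,
            "Content-Type": "application/octet-stream",
        },
        data=frame_jpeg,
    )
    response.raise_for_status()
    predictions = response.json()["predictions"]
    top = max(predictions, key=lambda p: p["probability"])
    return GRIP_FOR_OBJECT.get(top["tagName"], "open")
```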

It’s a truly intriguing solution, the notion of integrating a camera and vision into a prosthetic arm. In a companion blog post, we plan to dig a bit deeper into the smartARM story and the underlying technology. Meanwhile, to learn more about smartARM, you can visit their Imagine Cup team page and LinkedIn company page.

iCry2Talk

Team iCry2Talk from Greece won second place with their mobile app, which is designed to aid parents in identifying the needs of their crying baby. New parents, and especially parents with hearing impairments or mothers with postpartum depression, often have difficulty immediately identifying the needs of their crying baby. This results in unwanted stress for both the parents and the child, typically multiple times each day.

Promptly responding to the cry is essential for the healthy physiological and psychosocial development of the infant. Although babies cannot articulate their needs through the use of language, they do give hints through their cries and body language. The cries themselves contain information that could help parents identify what the baby’s immediate needs are, but these cues can be hard for untrained ears to pick up.

Team iCry2Talk strives to improve the quality of the communication and the relationship between parents and babies by helping parents understand their child’s cries.


Mother and child looking at the iCry2Talk app. Photo courtesy of the iCry2Talk team.

How does the app work? The parent records their baby’s cry through the app using a smartphone or a digital assistant, like Alexa or Cortana. The audio clip is then sent to the cloud, processed, and classified using the team’s custom deep learning models. The result is sent back to the phone or digital assistant within seconds, and the parent is notified of the meaning of the cry through text, image, voice feedback, and sign language.
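
A much-simplified version of that classification step might look like the sketch below: summarize the recording with audio features and hand them to a trained classifier. The label set, feature choice, and classifier interface are illustrative assumptions; the team says it uses custom deep learning models rather than this kind of lightweight pipeline.

```python
import numpy as np
import librosa  # audio loading and MFCC features

# Illustrative labels, assumed to match the order used when training the model.
CRY_CLASSES = ["hungry", "tired", "discomfort", "pain"]

def cry_features(wav_path, sr=16000, n_mfcc=20):
    """Summarize a short cry recording as a fixed-length MFCC statistic vector."""
    audio, _ = librosa.load(wav_path, sr=sr, mono=True)
    mfcc = librosa.feature.mfcc(y=audio, sr=sr, n_mfcc=n_mfcc)
    # Mean and std over time give a compact descriptor of length 2 * n_mfcc.
    return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1)])

def classify_cry(wav_path, model):
    """`model` is any trained classifier exposing predict_proba (e.g. scikit-learn),
    trained with labels in CRY_CLASSES order."""
    probs = model.predict_proba([cry_features(wav_path)])[0]
    return CRY_CLASSES[int(np.argmax(probs))]
```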

The team takes a holistic approach by not only focusing on the technology, but by also building and engaging with a community consisting of parents and doctors, in addition to designing for inclusivity, accessibility, and personalization.

iCry2Talk continues to collect donated audio clips of cries from parents involved in their community effort and is constantly improving their models. Parents with babies up to 12 months old who wish to contribute to the database and join the community can contact iCry2Talk through their Facebook page or e-mail them directly at icry2talkinfo@gmail.com.

To learn more about iCry2Talk you can visit their Imagine Cup team page or their website.

Mediated Ear

Team Mediated Ear from Japan took third place with their mobile app designed to help people with hearing impairments clearly hear the voices of specific people in noisy environments. Individuals with hearing loss may use hearing aids to amplify sound, but often find it difficult to isolate single speakers in noisy environments. This can make conversations in public settings and meetings difficult, as multiple people may be speaking at once. The team began to develop Mediated Ear when a friend of theirs with hearing loss talked about the challenges of communicating in noisy, multi-speaker environments.

Speaker registering their voice to be recognized by the Mediated Ear mobile app. Image courtesy of the Mediated Ear team.

Clearly isolating individual speakers from each other and from background noises is not a simple task. The team had to work hard to develop an approach that would reliably isolate individual speakers from a mixed audio source. Mediated Ear works as a smartphone application that listens to the current conversation and allows the user to adjust the volume for individual speakers played through their earphones.

To isolate a given speaker’s voice from other voices and background noise, the user hands the phone to the speaker of interest and asks them to speak into it for one minute. After the speaker reads the passage displayed in the app, the audio file of their voice is sent to the cloud, where it is processed and fed into a modified WaveNet deep learning model. Once the model learns the speaker’s voice, the app can pick out that voice and selectively amplify it, making it easier for the user to understand what is being said and confidently engage in the conversation.
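
The final “selective amplification” step can be thought of as a per-speaker mixer applied to the separated tracks. Here is a minimal sketch of that remixing stage, assuming some separation model (the team describes a modified WaveNet) has already produced one waveform per source; the names and gain values are illustrative.

```python
import numpy as np

def remix(separated_tracks, gains):
    """Re-mix separated speaker tracks with user-chosen per-speaker volume.

    separated_tracks: dict of source_name -> 1-D float waveform (equal lengths),
    as produced by some source-separation model.
    gains: dict of source_name -> linear gain chosen in the app UI.
    """
    mix = None
    for name, track in separated_tracks.items():
        scaled = np.asarray(track, dtype=np.float32) * gains.get(name, 1.0)
        mix = scaled if mix is None else mix + scaled
    # Guard against clipping before playback.
    peak = np.max(np.abs(mix)) if mix is not None else 0.0
    return mix / peak if peak > 1.0 else mix

# Example: boost the enrolled speaker, duck everything else.
# output = remix({"speaker_a": sep_a, "background": sep_bg},
#                {"speaker_a": 2.0, "background": 0.3})
```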

With Mediated Ear, people with hearing impairments have control over who they hear and have an easier time focusing on the people they want to listen to in noisy environments.

Want to learn more about Mediated Ear? Visit their Imagine Cup team page or their website.

SochWare

Team SochWare from Nepal won the AI Award for their mobile app to help farmers identify plant diseases and take steps to mitigate crop damage. Agriculture plays a critical role in the livelihood of Nepal and its people, but difficulties in farming have led to a decline in agriculture. When farmers notice a diseased plant, it is often difficult to recognize what kind of disease the plant is afflicted with. The challenge of disease identification makes it hard to properly address the situation. Farmers often suffer losses when they either do not treat crops or apply improper chemicals to handle disease. There are also situations where excessive chemical use on crops leads to negative health effects for consumers.


Local farmer checking plant disease status with the E-Agrovet mobile app. Photo courtesy of the SochWare team.

Understanding the vitality of agriculture to their country and their families, Team SochWare decided to focus their efforts on developing a solution for this problem. Their solution takes the form of E-Agrovet, a mobile app that uses computer vision to help farmers identify plant disease and learn the proper next-steps for treatment.

How does E-Agrovet work? The farmer takes a photo of the plant of interest through the app. This photo is then sent to the cloud, processed, and fed into their Cognitive Services Custom Vision model. A report is generated based on the results of the Deep Learning image classification model and is sent back to the notification hub on the mobile phone. This report informs the farmer of what the disease is, how to mitigate it, and may also connect the farmer with experts to allow them to act.
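
The report step can be as simple as mapping the classifier’s top prediction onto curated treatment guidance. The sketch below shows that mapping; the disease labels, confidence threshold, and advice text are made up for illustration and are not SochWare’s actual data, and the prediction format mirrors a Custom Vision-style response.

```python
# Illustrative lookup; a real system would keep this in a database curated
# by agronomy experts, not hard-coded.
TREATMENTS = {
    "tomato_late_blight": {
        "disease": "Late blight",
        "advice": "Remove affected leaves and consult an expert about fungicide use.",
    },
    "healthy": {
        "disease": "No disease detected",
        "advice": "No action needed.",
    },
}

def build_report(predictions, min_confidence=0.6):
    """Turn classifier output (a list of {'tagName', 'probability'} dicts)
    into a farmer-facing report."""
    top = max(predictions, key=lambda p: p["probability"])
    if top["probability"] < min_confidence:
        return {"disease": "Unclear photo",
                "advice": "Retake the photo closer to the affected leaf."}
    return TREATMENTS.get(top["tagName"],
                          {"disease": top["tagName"], "advice": "Consult an expert."})

# Example: build_report([{"tagName": "tomato_late_blight", "probability": 0.91}])
```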

Through E-Agrovet, SochWare strives to aid farmers and reduce the use of unnecessary chemicals on crops, improving the quality of life for everyone.

Want to learn more about SochWare? Visit their Imagine Cup team page or their website.

DrugSafe

Team DrugSafe from India won the Big Data Award for their consumer-side and vendor-side apps that help fight the consumption and distribution of counterfeit drugs. The development all began when a friend was suddenly in a lot of pain. The team was surprised to find that their friend had unknowingly taken counterfeit drugs for his diabetes and was suffering as a result. Upon further investigation, the team became aware of widespread drug counterfeiting, estimated to be a $75 billion industry in 2010, according to the World Health Organization (WHO). Realizing the widespread prevalence and severity of fake medicine, and its role in the unnecessary spread of disease and suffering, the team sought to develop a solution. Because this is such a large-scale problem – one that affects both consumers and legitimate drug vendors – the DrugSafe team decided to develop both a consumer-side and vendor-side solution via mobile apps and an online dashboard.


The DrugSafe mobile app (consumer side). Image courtesy of the DrugSafe team.

What do these apps do?

The mobile app has a simple interface that lets users take photos of drug labels to check for anomalies in the text and label color. The app sends the photo through a custom pipeline involving Azure Cosmos DB and Cognitive Services and notifies the user of the validity of the drug. Users have the option of reporting the drug if it is found to be illegitimate; choosing to report takes the user to a built-in chatbot that expedites the reporting process. The app also has a community component and can warn users about pharmacies that are seeing an increase in counterfeit drug sales.
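
Conceptually, the label check boils down to comparing OCR’d text (and a few visual cues) against records of genuine products. The sketch below uses Python’s standard-library fuzzy matching to show the idea; the reference records, threshold, and color check are placeholders, and DrugSafe’s real pipeline runs through Azure Cosmos DB and Cognitive Services as described above.

```python
import difflib

# Hypothetical reference records; a real app would keep these in a database
# such as Azure Cosmos DB rather than in code.
KNOWN_LABELS = {
    "PARACETAMOL 500 MG": {"manufacturer": "ExampleCo", "color": "white"},
}

def check_label(ocr_text, detected_color, similarity_threshold=0.85):
    """Flag a drug label when its OCR'd text or packaging color does not
    closely match any known genuine product record."""
    text = " ".join(ocr_text.upper().split())
    best = max(
        KNOWN_LABELS,
        key=lambda label: difflib.SequenceMatcher(None, text, label).ratio(),
    )
    score = difflib.SequenceMatcher(None, text, best).ratio()
    if score < similarity_threshold:
        return {"status": "suspect", "reason": "label text does not match known products"}
    if detected_color != KNOWN_LABELS[best]["color"]:
        return {"status": "suspect", "reason": "label color differs from the genuine product"}
    return {"status": "ok", "matched": best}

print(check_label("Paracetam0l 500 mg", "white"))
```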

The vendor app allows drug vendors to monitor the conditions of their deliveries using an MXChip IoT Board, to ensure that their drugs are delivered safely. The IoT board records information such as temperature, pressure, humidity, and acceleration, which can help vendors monitor the condition of their shipments.
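
On the vendor side, each telemetry message can be checked against acceptable transport ranges as it arrives. A minimal sketch with made-up field names and limits follows; the real limits would depend on the drug being shipped, and the message format is an assumption rather than the team’s actual schema.

```python
import json

# Illustrative safe-transport ranges per telemetry field.
LIMITS = {"temperature_c": (2.0, 8.0), "humidity_pct": (10.0, 60.0)}

def evaluate_telemetry(message: str):
    """Check one JSON telemetry message (e.g. relayed from an MXChip-style
    IoT board through a cloud hub) against the configured transport limits."""
    reading = json.loads(message)
    alerts = []
    for field, (low, high) in LIMITS.items():
        value = reading.get(field)
        if value is None or not (low <= value <= high):
            alerts.append(f"{field} out of range: {value}")
    return {"device": reading.get("device_id"), "alerts": alerts}

print(evaluate_telemetry('{"device_id": "ship-42", "temperature_c": 11.5, "humidity_pct": 40}'))
```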

Through the adoption and use of the DrugSafe app, the team hopes to reduce the spread of disease and help individuals gain access to authentic safe medication.

To learn more about DrugSafe you can visit their Imagine Cup team page.

The Impact of AI

The number and quality of AI-centric entries at this year’s Imagine Cup tell us that the next generation of developers recognizes AI’s game-changing potential. Although each of these student projects addressed very different needs, the teams took a holistic approach and appropriately infused their solutions with AI.

We are inspired by the boldness of these young innovators. Their work reinforces our own mission, to “empower every person and every organization on the planet to achieve more”. We couldn’t be more excited to support the next generation of developers at future Imagine Cups and to see how their ideas – coupled with AI – will move and shake the world.

If you are interested in building your own intelligent solutions, we recommend getting started with the AI School – we have free tutorials that provide step-by-step instructions on how to build real-world solutions on the Microsoft AI platform.

Nile

Check out the roundup of newly announced PC gaming hardware

With gamescom and IFA now behind us, we’re taking a look at some of the hottest Windows gaming hardware announcements from across these events, bringing them all into one easy place for you to read. The last few weeks have seen a handful of new gaming laptops, GPUs, monitors and more announced for PC gamers of all varieties. Check out the list below to see which newly announced hardware you’re most looking forward to!

Acer Predator Triton 900

The newly refreshed Acer Predator Triton 900 gaming rig sports the latest 8th Generation Intel Core i7 processor for ultimate power and speed, while the laptop’s hinge pivots the screen to fit whatever angle you’d like to game at. With the garage door, you can insert an Xbox controller dongle to play Xbox games seamlessly, and a custom-engineered dual-fan keeps things cool while the RGB backlit keyboard and overclocking capabilities customize your gaming experience.

Lenovo Legion Gaming Devices and Their Beefier GPUs

Lenovo has refreshed three gaming devices announced at E3 with upgraded GPUs: the NVIDIA GeForce GTX 1060 on the Lenovo Legion Y530 Laptop, and the latest NVIDIA GeForce RTX 2070 and RTX 2080 on the Lenovo Legion T730 tower and Lenovo Legion C730 cube.

A testament to Lenovo engineering and innovation, the Lenovo Legion Y530 15.6-inch FHD display gaming laptop is thermally optimized to run cooler and quieter with a full-sized white backlit keyboard. At 24 mm thin and 2.3 kg light, it’s been redesigned to deliver the ideal balance between epic gaming performance and practical portability. It also comes with powerful NVIDIA graphics, DDR4 memory and more, all optimally cooled via a re-engineered dual-channel cooling system.

The Lenovo Legion T730 is a 28-liter desktop that gives you the style and power you demand in a gaming tower, thanks to customizable RGB LED system lighting, a transparent side panel, optional liquid cooling, and extreme processing and graphics power with overclocking options to completely immerse you in your favorite titles. The space-saving Lenovo Legion C730 is a striking 19L cube with fully loaded overclocked components – boosted by the GeForce RTX 2070 and RTX 2080 GPU upgrades.

MSI P65 Creator

Looking for a notebook that lets you keep gaming on the go while also capturing the appeal of your inner creator? Look no further. The P65’s high-performance specifications are comparable to MSI’s top-of-the-line gaming offerings. It features Intel’s latest 8th Generation Core i7 processor and up to an NVIDIA GeForce GTX 1070 Max-Q GPU, allowing for fast rendering times and multitasking. The P65 has up to almost three times more graphical performance than the leading competitor in its category. And it uses MSI’s Cooler Boost Trinity, the same system found in the company’s gaming laptops, to keep the notebook cool even during intense workloads.

With its ultra-light aluminum chassis, the P65 weighs just 4.14 pounds and measures 0.69 inches thick. At the same time, the slim design does not sacrifice battery life: the P65 has an 82Whr battery good for up to 9+ hours of regular use.

At launch, the P65 will be available in both silver and a limited-edition white. The White Limited Edition shares many of the same specifications as the Silver Edition but comes with an NVIDIA GeForce GTX 1070 Max-Q GPU, Hi-Res Audio, and Thunderbolt 3. At launch, the Silver Edition will be available with either an NVIDIA GeForce GTX 1060 Max-Q or GTX 1050 Ti. The White Limited Edition also includes an extended one-year warranty and protective laptop sleeve.

OMEN by HP

At gamescom in Cologne, Germany, OMEN by HP unveiled new hardware and mobile software innovations for gaming enthusiasts. The OMEN Obelisk is the latest addition to the company’s desktop gaming family. Also announced are new features for the OMEN Mindframe Headset – the world’s first headset with active earcup cooling technology – to add comfort and audio benefits for both competitive play and streaming.

Gamers are always hungry for more power—such as graphics cards and memory. The OMEN Obelisk will be one of the first systems to carry the NVIDIA GeForce RTX 20 Series of GPUs by way of the NVIDIA GeForce RTX 2080, powered by all-new NVIDIA Turing architecture, giving gamers access to the world’s ultimate gaming architecture. The integration of HyperX high-performance memory within the OMEN Obelisk is another critical part of engineering a desktop to utilize the latest GPUs, RAM and other components through an industry-standard upgrade path vital to gaming at the highest levels, both now and in the future.

OMEN Desktop Teardown Image

The OMEN Obelisk Desktop, the newest addition to the family of OMEN Desktops, has been engineered for peak performance and customization so PCs can keep up with the latest in gaming. With a DIY-friendly, industry-standard upgrade path, along with options for the latest graphics and processor architectures, gamers can continually achieve the performance they expect.

Top features of the OMEN Obelisk are detailed in HP’s press release and include world-class power and storage, the latest series of NVIDIA GPUs, HyperX integration, strategic cooling with extensive venting that takes advantage of thoughtfully placed components to generate optimal airflow, and a customizable, clean design.

Speaking of top features, the forthcoming OMEN Mindframe Headset is the world’s first headset with active earcup cooling by way of HP’s patented thermoelectric driven Frostcap technology. It has been built with comfort in mind, while also delivering top-notch sound quality. The OMEN by HP team has made further additions prior to launch, including noise cancellation, real-time audible feedback via sidetone to take advantage of noise cancellation, and new fabric cups for improved breathability.

Regarding pricing and availability, the OMEN Obelisk Desktop is expected to be available in September via HP.com and other retailers for a starting price of $849.99. OMEN Game Stream for Android is expected to be available later this year via the Google Play Store for free. And the OMEN Mindframe Headset is expected to be available in October via HP.com and other retailers for a starting price of $199.99.

NZXT HUE 2 for PC Builders

On Tuesday, NZXT announced 10 products comprising the new HUE 2 RGB family, delivering the broadest line of RGB lighting accessories for PC builders. NZXT, a leading developer of software-powered hardware solutions for PC gaming, noted that the new HUE 2 ecosystem – including the four-channel HUE 2 RGB controller and the external HUE 2 Ambient Lighting controller, along with several new accessories – provides PC builders more options for customizing their systems.

At the heart of the HUE 2 ecosystem are four complete RGB kits that include everything PC builders need to get started: the HUE 2 RGB Lighting Kit, HUE 2 Ambient Lighting Kit, Aer RGB 2 Starter Kit (120mm) and Aer RGB 2 Starter Kit (140mm). Choose one to get started and add accessories to complete and perfect your PC’s lighting.

Builders can choose from a variety of RGB accessories to express their creativity and build the perfect lighting for their gaming PC. All HUE 2 accessories require a HUE 2 RGB controller, purchased separately in one of the HUE 2 kits. The new HUE 2 family of products will be available in North America in mid to late August, and in the rest of the world in late August. Read more here.

Alienware by Dell – Monitors, Gaming Desktops and more!

Alienware Aurora

The new Alienware Aurora gaming desktop, designed for VR experiences and currently the designated training PC for Team Liquid, is both a proven powerhouse for gaming professionals and a complete package for gaming beginners and veterans who need significant performance and upgradability under a reliable, high-quality chassis.

For gamers looking for a mid-tower desktop, the new Alienware Aurora is ready for current Oculus VR or HTC/VIVE requirements and can also support higher performance configurations for the near future’s more demanding VR, as well as 4K gaming. Gamers can choose from a wide variety of upgradability options, spanning many performance and budgetary considerations.

Inspired by the Alienware Area-51, the external design of the system includes AlienFX customizable lighting and incorporates various ergonomic features for user handling and component access. Its internal arrangement and fan positioning deliver ambient air towards the priority components, with exhausts at the top and rear of the system to deliver maximum performance.

Alienware 15 and 17

With up to 8th Gen Intel Core i9 k-Series processors, the Alienware 15 and 17 are engineered with an improved thermal solution and a thin, hinge-forward design to extract the maximum performance from top-of-the-line components.

The design introduces premium materials such as anodized aluminum, magnesium alloy, steel reinforcements and copper thermal management to ensure stiffness, rigidity, thinness and a high-quality feel – all without sacrificing gaming performance. You can also get to internal components more easily through a new bottom door mechanism.

With an upgraded Alienware TactX keyboard featuring per-key RGB LED lighting, the Alienware 15 and 17 now support n-key rollover, enabling more than 108 key commands for maximum actions per minute. It’s the only keyboard on a laptop with 2.2mm of key travel, allowing for rapid response on any keystroke. Reinforced with a steel back plate for rigidity and uniform feedback, it’s guaranteed to stay functional for up to 10 million keystrokes.

Dell 24 and 27 gaming monitors (S2419HGF and S2719DGF)

With Dell’s 24 (S2419HGF) and 27 (S2719DGF) gaming monitors, you’ll experience sharp, tear-free graphics with a swift overclocked refresh rate at 144Hz and 155Hz respectively. You’ll also leave lag behind, with blazing fast and responsive gameplay at an extremely rapid 1 millisecond response time.

These monitors support AMD FreeSync for smooth gameplay and give you the flexibility to game with the No-Sync fast refresh option on your existing hardware. You can also personalize and preset up to three gaming profiles, in addition to three factory preset modes, with a gamer-geared menu. Both monitors are distinguished by metallic recon-blue paint to complement Dell gaming PCs for a unified look and gaming experience.

The all-new gaming monitors provide maximum reliability with Dell Premium Panel Exchange, which allows a free panel replacement during the Limited Hardware Warranty period even if only one bright pixel is found.

Thanks for reading about all of the exciting new PC gaming hardware coming soon from our preferred partners. For the latest in PC gaming hardware and games, keep it tuned here to Xbox Wire.

Podcast: Ben Cutler talks about putting the cloud under the sea

Ben Cutler from Microsoft Research. Photo by Maryatt Photography.

Episode 40, September 5, 2018

Data centers have a hard time keeping their cool. Literally. And with more and more data centers coming online all over the world, calls for innovative solutions to “cool the cloud” are getting loud. So, Ben Cutler and the Special Projects team at Microsoft Research decided to try to beat the heat by using one of the best natural venues for cooling off on the planet: the ocean. That led to Project Natick, Microsoft’s prototype plan to deploy a new class of eco-friendly data centers, under water, at scale, anywhere in the world, from decision to power-on, in 90 days. Because, presumably for Special Projects, go big or go home.

In today’s podcast we find out a bit about what else the Special Projects team is up to, and then we hear all about Project Natick and how Ben and his team conceived of, and delivered on, a novel idea to deal with the increasing challenges of keeping data centers cool, safe, green, and, now, dry as well!

Episode Transcript

Ben Cutler: In some sense we’re not really solving new problems. What we really have here is a marriage of these two mature industries. One is the IT industry, which Microsoft understands very well. And then the other is a marine technologies industry. So, we’re really trying to figure out how do we blend these things together in a way that creates something new and beneficial?

(music plays)

Host: You’re listening to the Microsoft Research Podcast, a show that brings you closer to the cutting-edge of technology research and the scientists behind it. I’m your host, Gretchen Huizinga.

Host: Data centers have a hard time keeping their cool. Literally. And with more and more data centers coming online all over the world, calls for innovative solutions to “cool the cloud” are getting loud. So, Ben Cutler and the Special Projects team at Microsoft Research decided to try to beat the heat by using one of the best natural venues for cooling off on the planet: the ocean. That led to Project Natick, Microsoft’s prototype plan to deploy a new class of eco-friendly data centers, under water, at scale, anywhere in the world, from decision to power-on, in 90 days. Because, presumably for Special Projects, go big or go home.

In today’s podcast we find out a bit about what else the Special Projects team is up to, and then we hear all about Project Natick, and how Ben and his team conceived of, and delivered on, a novel idea to deal with the increasing challenges of keeping data centers cool, safe, green, and, now, dry as well! That and much more on this episode of the Microsoft Research Podcast.

Host: Ben Cutler. Welcome to the podcast.

Ben Cutler: Thanks for having me.

Host: You’re a researcher in Special Projects at MSR. Give us a brief description of the work you do. In broad strokes, what gets you up in the morning?

Ben Cutler: Well, so I think Special Projects is a little unusual. Rather than have a group that always does the same thing persistently, it’s more based on this idea of projects. We find some new idea, something, in our case, that we think is materially important to the company, and go off and pursue it. And it’s a little different in that we aren’t limited by the capabilities of the current staff. We’ll actually go out and find partners, whether they be in academia or very often in industry, who can kind of help us grow and stretch in some new direction.

Host: How did Special Projects come about? Has it always been “a thing” within Microsoft Research, or is it a fairly new idea?

Ben Cutler: Special Projects is a relatively new idea. In early 2014, my manager, Norm Whitaker, who’s a managing scientist inside Microsoft Research, was recruited to come here. Norm had spent the last few years of his career at DARPA, which is the Defense Advanced Research Projects Agency, which has a very long history in the United States, and a lot of the seminal technology achievements, not just on the defense side, where we see things like stealth, but also on the commercial or consumer side, had their origins in DARPA. And so, we’re trying to bring some of that culture here into Microsoft Research and a willingness to go out and pursue crazy things and a willingness not just to pursue new types of things, but things that are in areas that historically we would never have touched as a company, and just be willing to crash into some new thing and see if it has value for us.

Host: So, that seems like a bit of a shift from Microsoft, in general, to go in this direction. What do you think prompted it, within Microsoft Research to say, “Hey let’s do something similar to DARPA here?”

Ben Cutler: I think if you look more broadly at the company, with Satya, we have this very different perspective, right? Which is, not everything is based on what we’ve done before. And a willingness to really go out there and draw in things from outside Microsoft and new ideas and new concepts in ways that we’ve never done, I think, historically as a company. And this is in some sense a manifestation of this idea of, you know, what can we do to enable every person in every organization on the planet to achieve more? And a part of that is to go out there and look at the broader context of things and what kind of things can we do that might be new that might help solve problems for our customers?

Host: You’re working on at least two really cool projects right now, one of which was recently in the news and we’ll talk about that in a minute. But I’m intrigued by the work you’re doing in holoportation. Can you tell us more about that?

Ben Cutler: If you think about what we typically do with a camera, we’re capturing this two-dimensional information. One stage beyond that is what’s called a depth camera, which is, in addition to capturing color information, it captured the distance to each pixel. So now I’m getting a perspective and I can actually see the distance and see, for example, the shape of someone’s face. Holoportation takes that a step further where we’ll have a room that we outfit with, say, several cameras. And from that, now, I can reconstruct the full, 3-D content of the room. So, you can kind of think of this as, I’m building a holodeck. And so now you can imagine I’m doing a video conference, or, you know, something as simple as like Facetime, but rather than just sort of getting that 2-D, planar information, I can actually now wear a headset and be in some immersive space that might be two identical conferences rooms in two different locations and I see my local content, but I also see the remote content as holograms. And then of course we can think of other contexts like virtual environments, where we kind of share across different spaces, people in different locations. Or even, if you will, a broadcast version of this. So, you can imagine someone’s giving a concert. And now I can actually go be at that concert even if I’m not there. Or think about fashion. Imagine going to a fashion show and actually being able to sit in the front row even though I’m not there. Or, everybody gets the front row seats at the World Cup soccer.

Host: Wow. It’s democratizing event attendance.

Ben Cutler: It really is. And you can imagine I’m visiting the Colosseum and a virtual tour guide appears with me as I go through it and can tell me all about that. Or some, you know, awesome event happens at the World Cup again, and I want to actually be on the soccer field where that’s happening right now and be able to sort of review what happened to the action as though I was actually there rather than whatever I’m getting on television.

Host: So, you’re wearing a headset for this though, right?

Ben Cutler: You’d be wearing an AR headset. For some of the broadcast things you can imagine not wearing a headset. It might be I’ve got it on my phone and just by moving my phone around I can kind of change my perspective. So, there’s a bunch of different ways that this might be used. So, it’s this interesting new capture technology. Much as HoloLens is a display, or a viewing technology, this is the other end, capture, and there’s different ways we can kind of consume that content. One might be with a headset, the other might just be on a PC using a mouse to move around much as I would on a video game to change my perspective or just on a cell phone, because today, there’s a relatively small number of these AR/VR headsets but there are billions of cell phones.

Host: Right. Tell me what you’re specifically doing in this project?

Ben Cutler: In the holoportation?

Host: Yeah.

Ben Cutler: So, really what’s going on right now is, when this project first started to outfit a room, to do this sort of a thing, might’ve been a couple hundred thousand dollars of cost, and it might be 1 to 3 gigabits of data between sites. So, it’s just not really practical, even at an enterprise level. And so, what we’re working on is, with the HoloLens team and other groups inside the company, to really sort of dramatically bring down that cost. So now you can imagine you’re a grandparent and you want to kind of play with your grandkids who are in some other location in the world. So, this is something that we think, in the next couple years, actually might be at the level the consumers can have access to this technology and use it every day.

Host: This is very much in the research stage, though, right?

Ben Cutler: We have an email address and we hear from people every day, “How do I buy this? How can I get this?” And you know, it’s like, “Hey, here’s our website. It’s just research right now. It’s not available outside the company. But keep an eye on this because maybe that will change in the future.”

Host: Yeah. Yeah, and that is your kind of raison d’etre is to bring these impossibles into inevitables in the market. That should be a movie. The Inevitables.

Ben Cutler: I think there’s something similar to that, but anyway…

Host: I think a little, yeah. So just drilling a little bit on the holoportation, what’s really cool I noticed on the website, which is still research, is moving from a room-based hologram, or holoported individual, into mobile holoportation. And you’ve recently done this, at least in prototype, in a car, yes?

Ben Cutler: We have. So, we actually took an SUV. We took out the middle seat. And then we mounted cameras in various locations. Including, actually, the headrests of the first-row passengers. So that if you’re sitting in that back row we could holoport you somewhere. Now this is a little different than, say, that room-to-room scenario. You can imagine, for example, the CEO of our company can’t make a meeting in person, so he’ll take it from the car. And so, the people who are sitting in that conference room will wear an AR headset like a HoloLens. And then Satya would appear in that room as though he’s actually there. And then from Satya’s perspective, he’d wear a VR headset, right? So, he would not be sitting in his car anymore. He would be holoported into that conference room.

(music plays)

Host: Let’s talk about the other big project you’re doing: Project Natick. You basically gave yourself a crazy list of demands and then said, “Hey, let’s see if we can do it!” Tell us about Project Natick. Give us an overview. What it is, how did it come about, where it is now, what does it want to be when it grows up?

Ben Cutler: So, Project Natick is an exploration of manufactured data centers that we place underwater in the ocean. And so, the genesis of this is kind of interesting, because it also shows not just research trying to influence the rest of the company, but that if you’re working elsewhere inside Microsoft, you can influence Microsoft Research. So, in this case, go back to 2013, and a couple employees, Sean James and Todd Rawlings, wrote this paper that said we should put data centers in the ocean, and the core idea was, the ocean is a place where you can get good cooling, and so maybe we should look at that for data centers. Historically, when you look at data centers, the dominant cost, besides the actual computers doing the work, is the air conditioning. And so, we have this ratio in the industry called PUE, or Power Usage Effectiveness. And if you go back a long time ago to data centers, PUEs might be as high as 4 or 5. A PUE of 5 says that, for every watt of power for computers, there’s an additional 4 watts for the air conditioning, which is just kind of this crazy, crazy thing. And so, industry went through this phase where we said, “OK, now we’re going to do this thing called hot aisle/cold aisle. We line up all the computers in a row, and cold air comes in one side and hot air goes out the other.” Now, modern data centers that Microsoft builds have a PUE of about 1.125. And the PUE we see of what we have right now in the water is about 1.07. So, we have cut the cooling cost. But more importantly, we’ve done it in a way that we’ve made the data center much colder. So, we’re about 10 degrees Celsius cooler than land data centers. And we’ve known, going back to the middle of the 20th century, that higher temperatures are a problem for components and in fact, a 10-degree Celsius difference can be a factor of 2 difference in the life expectancy of equipment. So, we think that this is one way to bring reliability up a lot. So, this idea of reliability is really a proxy for server longevity and how do we make things last longer? In addition to cooling, there’s other things that we have here. One of which is that the atmosphere inside this data center is a dry nitrogen atmosphere. So, there’s no oxygen. And the humidity is low. And we think that helps get rid of corrosion. And then the other thing is, in data centers, stuff comes in from outside. So, by having this sealed container, safe under the ocean, we hopefully have this environment that will allow servers to last much longer.
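
To make the PUE arithmetic Ben walks through concrete, here is a quick back-of-the-envelope sketch. The 240 kW IT load is an arbitrary illustrative figure; the PUE values are the ones quoted above, and in practice the non-IT overhead covers power distribution as well as cooling.

```python
def overhead_kw(pue, it_load_kw):
    """PUE = total facility power / IT power, so the non-IT overhead
    (cooling, power distribution, ...) is (PUE - 1) * IT power."""
    return (pue - 1.0) * it_load_kw

it_load_kw = 240  # arbitrary illustrative IT load
for label, pue in [("older data center", 5.0),
                   ("modern Microsoft land DC", 1.125),
                   ("Natick, in the water", 1.07)]:
    print(f"{label}: PUE {pue} -> {overhead_kw(pue, it_load_kw):.1f} kW overhead")
# Prints 960.0, 30.0 and 16.8 kW of overhead respectively for the same IT load.
```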

Host: How did data center technology and submarine technology come together so that you could put the cloud under water?

Ben Cutler: Natick is a little bit unusual as a research project because in some sense we’re not really solving new problems. What we really have here is a marriage of these two mature industries. One is the IT industry, which Microsoft understands very well. And then the other is a marine technologies industry. So, we’re really trying to figure out, how do we blend these things together in a way that creates something new and beneficial?

Host: And so, the submarine technology, making something watertight and drawing on the decades that people have done underwater things, how did you bring that together? Did you have a team of naval experts…?

Ben Cutler: So, the first time we did this, we just, sort of, crashed into it, and we, literally, just built this can and we just kind of dropped it in the water, and ok, we can do this, it kind of works. And so, then the second time around, we put out what we call a Request for Information. We’re thinking of doing this thing, and we did this to government and to academia and to industry, and just to see who’s interested in playing this space? What do they think about it? What kind of approaches would they take? And you know, we’re Microsoft. We don’t really know anything about the ocean. We’ve identified a bunch of folks we think do know about it. And on the industry side we really looked at three different groups. We looked to ship builders, we looked to people who were doing renewable energy in the ocean, which we should come back to that, and then we looked to oil and gas services industry. And so, we got their response and on the basis of that, we then crafted a Request for Proposal to actually go off and do something with us. And that identified what kind of equipment we put inside it, what our requirements were in terms of how we thought that this would work, how cool it had to be, the operating environment that needed to be provided for the servers, and also some more mundane stuff like, when you’re shipping it, what’s the maximum temperature things can get to when it’s like, sitting in the sun on a dock somewhere? And, on the basis of that, we got a couple dozen proposals from four different continents. And so, we chose a partner and then set forward. And so, in part, we were working with University of Washington Applied Physics Lab… is one of three centers of excellence for ocean sciences in the United States, along with Woods Hole and Scripps. And so, we leveraged that capability to help us go through the selection process. And then the company we chose to work with is a company called Naval Group, which is a French company, and among other things, they do naval nuclear submarines, surface ships, but they also do renewable energies. And, in particular, renewable energies in the ocean, so offshore wind, they do tidal energy which is to say, gaining energy from the motion of the tides, as well as something called OTEC which is Ocean Thermal Energy Conversion. So, they have a lot of expertise in renewable energy. Which is very interesting to us. Because another aspect of this that we like is this idea of co-location with offshore renewable energies. So, the idea is, rather than connecting to the grid, I might connect to renewable energies that get placed in the same location where we put this. That’s actually not a new idea for Microsoft. We have data centers that are built near hydroelectric dams or built near windfarms in Texas. So, we like this idea of renewable energy. And so, as we think about this idea of data centers in the ocean, it’s kind of a normal thing, in some sense, that this idea of the renewables would go with us.

Host: You mentioned the groups that you reached out to. Did you have any conversations with environmental groups, or about how this might impact sea life or the ocean itself?

Ben Cutler: So, we care a lot about that. We like the idea of co-location with the offshore renewables, not just for the sustainability aspects of this, but also for the fact that a lot of those things are going up near large population centers. So, it's a way to get close to customers. We're also interested in other aspects of sustainability. Those include things like artificial reefs. We've actually filed a patent application on the idea of undersea data centers, potentially, serving as artificial reefs.

Host: So, as you look to maybe scaling up… Say this thing, in your 5-year experiment, does really well, and you say, "Hey, we're going to deploy more of these." Are you looking, then, with the sustainability goggles on, so to speak, at Natick staying green both for customers and for the environment itself?

Ben Cutler: We are. And I think one thing people should understand, too, is that you look out at the ocean and it looks like this big, vast, open space, but in reality it's very carefully regulated. So anywhere we go, there are always authorities and rules as to what you can do and how you do it, so there's that oversight. And there are also things that we look at directly, ourselves. One of the things that we like about these, from a recyclability standpoint, is that it's a pretty simple structure. Every five years, we bring that thing back to shore, we put a new set of servers in, refresh it, send it back down, and then, when we're all done, we bring it back up, we recycle it, and the idea is you leave the seabed as you found it. On the government side, there's a lot of oversight, and the first thing to understand is that, for the data center that's there now, the seawater we eject back into the ocean is about eight-tenths of a degree Celsius warmer than the water that came in. It's a very rapid jet, so it very quickly mixes with the other seawater. In our case, the first time we did this, the water a few meters downstream was only a few thousandths of a degree warmer.

Host: So, it dissipates very quickly.

Ben Cutler: Water… it takes an immense amount of energy to heat it. If you took all of the energy generated by all the data centers in the world and pushed all of it into the ocean, you'd raise the temperature a few millionths of a degree per year. So, in net, we don't really worry about it. The place where we worry about it is this idea of local warming. One of the things that's nice about the ocean is that, because there are these persistent currents, we don't have buildup of temperature anywhere. So, for this question of local heating, it's really just, sort of, make sure your density is modest and then the impact is really negligible. An efficient data center in the water actually has less impact on the oceans than an inefficient data center on land does.
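As a rough order-of-magnitude check of that claim, the sketch below uses assumptions that are not from the interview: global data center electricity use of roughly 200-400 TWh per year, an ocean mass of about 1.4 × 10^21 kg, and a seawater specific heat of about 4,000 J/(kg·K). Under those assumptions the result comes out around a ten-millionth of a degree per year, consistent with the negligible global effect Cutler describes.

```python
# Rough order-of-magnitude check under stated assumptions (NOT from the interview):
# global data center electricity use of roughly 200-400 TWh per year, ocean mass
# of about 1.4e21 kg, seawater specific heat of about 4,000 J/(kg*K).

OCEAN_MASS_KG = 1.4e21
SEAWATER_SPECIFIC_HEAT = 4_000  # joules per kilogram per kelvin

def ocean_temp_rise_c(annual_twh: float) -> float:
    """Temperature rise if a year's worth of energy went straight into the ocean."""
    joules = annual_twh * 1e12 * 3_600  # TWh -> Wh -> J
    return joules / (OCEAN_MASS_KG * SEAWATER_SPECIFIC_HEAT)

for twh in (200, 400):
    print(twh, ocean_temp_rise_c(twh))  # on the order of 1e-7 degrees C per year
```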

Host: Let’s talk about latency for a second. One of your big drivers in putting these in the water, but near population centers, is so that data moves fairly quickly. Talk about the general problems of latency with data centers and how Natick is different.

Ben Cutler: So, there are some things you do where latency really doesn't matter. But I think latency gets you in all sorts of ways, and sometimes in surprising ways. The thing to remember is, even if you're just browsing the web, when a webpage gets painted, there's all of this back-and-forth traffic. So, ok, I've now got a data center that's, say, 1,000 kilometers away, so it's going to be 10 milliseconds roundtrip for each communication. But I might have a couple hundred of those just to paint one webpage. And now, all of a sudden, it takes me like 2 seconds to paint that webpage, whereas it would be almost instantaneous if that data center were nearby. And think about, also, I've got factories and automation and I've got to control things. I need really tight controls on latency to do that effectively. Or imagine a future where autonomous vehicles become real and they're interacting with data centers for some aspect of their navigation or other critical functions. So, this notion of latency really matters in a lot of ways that will become, I think, more present as this idea of the intelligent edge grows over time.
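As a back-of-the-envelope check of the arithmetic above, here is a small sketch. The assumptions are ours, not from the transcript: light travels through optical fiber at roughly 200,000 km/s, and the page's round trips happen one after another rather than overlapping; the function names are purely illustrative.

```python
# Back-of-the-envelope latency arithmetic: round trips multiplied by round-trip time.
# Assumes ~200,000 km/s signal speed in fiber and fully serialized round trips.

FIBER_SPEED_KM_PER_S = 200_000

def round_trip_ms(distance_km: float) -> float:
    """Round-trip propagation delay in milliseconds for a given distance."""
    return 2 * distance_km / FIBER_SPEED_KM_PER_S * 1000

def page_paint_seconds(distance_km: float, round_trips: int) -> float:
    """Total propagation delay if every round trip happens back-to-back."""
    return round_trips * round_trip_ms(distance_km) / 1000

print(round_trip_ms(1_000))            # ~10 ms per round trip at 1,000 km
print(page_paint_seconds(1_000, 200))  # ~2 s for 200 serialized round trips
print(page_paint_seconds(100, 200))    # ~0.2 s if the data center is 100 km away
```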

Host: Right. And so, what’s Natick’s position there?

Ben Cutler: So, Natick's benefit here is that more than half the world's population lives within a couple hundred kilometers of the ocean. In some sense, you're finding a way to put data centers very close to a good percentage of the population, and you're doing it in a way that's very low impact. We're not taking land, because think about if I want to put a data center in San Francisco or New York City. Well, it turns out land's expensive around big cities. Imagine that. So, this is a way to go somewhere where we don't have some of those high costs, potentially with this offshore renewable energy, and without, as we talked about before, any impact on the water supply.

Host: So, it could solve a lot of problems all at once.

Ben Cutler: It could solve a lot of problems in this very, sort of, environmentally sustainable way, as well as, in some sense, adding these socially sustainable factors as well.

Host: Yeah. Talk a little bit about the phases of this project. I know there's been more than one. You alluded to that a little bit earlier. But what have you done, stage-wise, phase-wise? What have you learned?

Ben Cutler: So, Phase 1 was a Proof of Concept, which is literally, we built a can, and that can had a single computer rack in it, and that rack only had 24 servers. That was about one-third of the space of the rack. It was a standard, what we call, 42U rack, which reflects the size of the rack. Fairly standard for data centers. The other two-thirds were filled with what we call load trays. Think of them as, all they do is, they've got big resistors that generate heat. So, it's like hairdryers. They're actually used today in data centers to commission new data centers, to test the cooling system. In our case, we just wanted to generate heat. Could we put these things in the water? Could we cool it? What would that look like? What would be the thermal properties? So, that was a Proof of Concept just to see, could we do this? Could we just, sort of, understand the basics? Were our intuitions right about this? What sort of problems might we encounter? And just, you know, I hate to use… but, you know, get our feet wet. Learning how to interact…

Host: You had to go there.

Ben Cutler: It is astonishing the number of expressions that relate to water that we use.

Host: Oh gosh, the puns are…

Ben Cutler: It's tough to avoid. So, we just really wanted to get some sense of what it was like to work with the marine industry. Every company and, to some degree, every industry has ways in which they work. And so, this was really an opportunity for us to learn some of those and become informed before we got to this next stage that we're at now, which is more of a prototype stage. So, this vessel that we built this time is about the size of a shipping container. And that's by intent, because then we've got something of a size that we can use standard logistics to ship around, whether on the back of a truck or on a container ship. Again, keeping with this idea of, if something like this is successful, we have to think about what the economics of this are. So, it's got 12 racks this time. It's got 864 servers. It's got FPGAs, which are something that we use for certain types of acceleration. And then, each of those 864 servers has 32 terabytes of disks. So, this is a substantial amount of capability. It's actually located in the open ocean in realistic operating conditions. And in fact, where we are, in the winter, the waves will be up to 10 meters. We're at 36 meters depth, so that means the water above us will vary between 26 and 46 meters deep. It's a really robust test area. So, we want to understand, can this really work? And what sort of challenges there might be in this realistic operating environment.
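For reference, a quick bit of arithmetic on the Phase 2 figures quoted above (12 racks, 864 servers, 32 TB of disk per server). These are raw sums of the stated numbers, not usable capacity after redundancy or formatting.

```python
# Raw totals implied by the Phase 2 figures quoted in the interview.

racks = 12
servers = 864
disk_tb_per_server = 32

servers_per_rack = servers / racks             # 72 servers per rack
total_disk_tb = servers * disk_tb_per_server   # 27,648 TB of raw disk
total_disk_pb = total_disk_tb / 1_000          # ~27.6 PB raw

print(servers_per_rack, total_disk_tb, total_disk_pb)
```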

Host: So, this is Phase 2 right now.

Ben Cutler: This is Phase 2. And so now we're in the process of learning and collecting data from this. Just going through the process of designing and building this, we learned all sorts of interesting things. It turns out, when you're building these things to go under the ocean, one source of cyclic loading you get is just from the waves going by. So, as you design these things, you have to think about how many waves go by this thing over its lifetime. What's the frequency of those waves? What's the amplitude of those waves? This all impacts your design and what you need to do, based on where you're going to put it and how long it will be there. So, we learned a whole bunch of stuff from this. And we expect everything will be great and grand over the next few years here, but we'll obviously be watching, and we'll be learning. If there is a next phase, it would be a pilot. And then we're talking about building something at larger scale. So, it might be multiple vessels. There might be a different deployment technology than what we used this time, to get greater efficiency. So, I think those are things that, you know, we're starting to think about, but mostly, right now, we've got this great thing in the water and we're starting to learn.

Host: Yeah. And you’re going to leave it alone for 5 years, right?

Ben Cutler: This thing will just be down there. Nothing will happen to it. There will be no maintenance until it's time to retire the servers, which, in a commercial setting, might be every 5 years or longer. And then we'll bring it back. So, it really is the idea of a lights-out thing. You put it there, it just does its thing, and then we go and pull it back later. In an actual commercial deployment, we'd probably be deeper than 36 meters. The reason we're at 36 meters is that, it turns out, 40 meters is a safe depth for human divers to reach without a whole lot of special equipment, and we wanted that flexibility in case we did need some sort of maintenance or some sort of help during this time. But in a real commercial deployment, we'd go deeper, and one of the reasons for that, also, is just that it will be harder for people to get to it. So, people worry about physical security. We, in some sense, have a simpler challenge than a submarine, because a submarine is typically trying to hide from its adversaries. We're not trying to hide. If we deploy these things, we'd always be within the coastal waters of a country and governed by the laws of that country. But we do also think about, let's make this thing safe. And so, one of the safety aspects is not just the ability to detect when things are going on around you, but also to put it in a place where it's not easy for people to go and mess with it.

Host: Who’s using this right now? I mean this is an actual test case, so, it’s a data center that somebody’s accessing. Is it an internal data center or what’s the deal on that?

Ben Cutler: So, this data center is actually on our global network. Right now, it's being used by people internally. We have a number of different teams that are using it for their own production projects. One group that's working with it is an organization inside Microsoft called AI for Earth. We have video cameras, and so one of the things they do is watch the different fish going by, and other types of much more bizarre creatures that we see, and characterize and count those, so we can kind of see how things evolve over time. And one of the things we're looking to do, potentially, is to work with other parties that do these more general assessments and then provide some of those AI technologies to them for their general research of the marine environment, and how, when you put different things in the water, that affects things, either positively or negatively. Not just, sort of, what we're doing, but other types of things that go in the water, which might be things as simple as cables or marine energy devices or other types of infrastructure.

Host: I would imagine, when you deploy something in a brand-new environment, that you have unintended consequences or unexpected results. Is there anything interesting that’s come out of this deployment that you’d like to share?

Ben Cutler: So, I think when people think of the ocean, they think this is a really hostile and dangerous place to put things, because we're all used to seeing big storms, hurricanes and everything that happens. And to be sure, right at that interface between land and water is a really dangerous place to be. But what you find is that, deep under the waves on the seabed, it's a pretty quiet and calm place. So, one of the benefits that we see out of this is that even for things like 100-year hurricanes, you will hear, acoustically, what's going on at the surface or near the land… waves crashing and all this stuff going on. But it's pretty calm down there. The idea that we have this thing deep under the water that would be immune to these types of events is appealing. So, you can imagine this data center down there when one of these storms hits. The only connectivity back to land is going to be fiber, and that fiber is largely glass with some insulating shell, so it might break off. But the data center will keep operating. Your data center will still be safe, even though there might be problems on land. So, this diversity of risk is another thing that's interesting to people when we talk about Natick.

Host: What about deployment sites? How have you gone about selecting where you put Project Natick and what do you think about other possibilities in the future?

Ben Cutler: So, for this Phase 2, we're in Europe. And Europe, today, is the leader in offshore renewable energies. Twenty-nine of the thirty largest offshore windfarms are located in Europe. We're deployed at the European Marine Energy Center, or EMEC, in the Orkney Islands of Scotland. The grid up there is 100% renewable energy. It's a mix of solar and wind as well as these offshore energies that people are testing at EMEC: tidal energy and wave energy. One of the things that's nice about EMEC is that people are testing these devices. So, in the future, we have the option to go completely off this grid. It's a 100% renewable grid, but we can go off and directly connect to one of those devices and test out this idea of co-location with renewable energies.

Host: Did you look at other sites and say, hey, this one’s the best?

Ben Cutler: We looked at a number of sites, both test sites for these offshore renewables as well as commercial sites, for example, going into a commercial windfarm right off the bat. And we just decided, at this research phase, we had better support and better capabilities at a site that was actually designed for that. One of the things is, as I might have mentioned, the waves there get very, very large in the winter. So, we wanted someplace that had very aggressive waters, so that we know that if we can survive in this space, we'll be good pretty much anywhere we might choose to deploy.

Host: Like New York. If you can make it there…

Ben Cutler: Like New York, exactly.

Host: You can make it anywhere.

Ben Cutler: That’s right.

(music plays)

Host: What was your path to Microsoft Research?

Ben Cutler: So, my career… I would say that there's been very little commonality in what I've done. But the one thing that has been common is this idea of taking things from early innovation to market introduction. So, a lot of my early career was in startup companies, either as a founder or as a principal. I was in supercomputers, computer storage, video conferencing, different types of semiconductors, and then I was actually here at Microsoft earlier, working in a group exploring new operating system technologies. After that, I went to DARPA, where I spent a few years working on different types of information technology. And then I came back here. And, truthfully, when I first heard about this idea that they were thinking about doing these underwater data centers, it just sounded like the dumbest idea to me, and… But you know, I was willing to go and then, sort of, try and think it through: ok, on the surface it sounds ridiculous, but a lot of things start that way. And you have to be willing to go in, understand the economics, understand the science and the technology involved, and then draw some conclusion about whether you think that can actually go somewhere reasonable.

Host: As we close, Ben, I’m really interested in what kinds of people you have on your team, what kinds of people might be interested in working on Special Projects here. Who’s a good fit for a Special Projects research career?

Ben Cutler: I think we're looking for people who are excited about the idea of doing something new and don't have fear of doing something new. In some sense, it's a lot like people who'd go into a startup. And what I mean by that is, you're taking a lot more risk, because I'm not in a large organization, I have to figure a lot of things out myself, I don't have a team that will know all these things, and a lot of things may fall on the floor just because we don't have enough people to get everything done. It's kind of like driving down the highway and you're, you know, lashed to the front bumper of the car. You're fully exposed to all the risk and all the challenges of what you're doing. And you're, you know, wide open. There's no end of things to do, and you have to figure out what's important, what to prioritize, because not everything can get done. But you have the flexibility to really understand that, even though I can't get everything done, I'm going to pick and choose the things that are most important and really drive in new directions without a whole lot of constraints. So, I think that's what we look for. I have only two people who actually directly report to me on this project. That's the team. But then I have other people who are core members, who worked on it, who report to other people, and then, across the whole company, more than two hundred people touched this Phase 2, in ways large and small. Everything from helping us design the data center, to people who refurbished servers that went into this. So, it's really a "One Microsoft" effort. And so, I think there are always opportunities to engage, not just by being on a team, but by interacting and providing your expertise and your knowledge base to help us be successful, because it's only in that way that we can take these big leaps. And so, in some sense, we're trying to make sure that Microsoft Research is really staying true to this idea of pursuing new things, not just five years out in known fields, but looking at these new fields, because the world is changing. So, we're always looking for people who are open to these new ideas and, frankly, are willing to bring new ideas with them as to where they think we should go and why. And that's how we as a company, I think, grow, see new markets and are successful.

(music plays)

Host: Ben Cutler, it’s been a pleasure. Thanks for coming on the podcast today.

Ben Cutler: My pleasure as well.

To learn more about Ben Cutler, Project Natick, and the future of submersible data centers, visit natick.research.microsoft.com.

Posted on Leave a comment

Forza Racing World Championship 2018 goes to the finish line in London Oct. 20-21

This year's Forza Racing Championship (ForzaRC) is in its home stretch with the ForzaRC Series 2 Playoffs taking place in Mexico City from Sept. 29-30, where the top 36 ForzaRC drivers will compete for $75,000 in cash prizes. After the Playoffs, the top 24 drivers will have the chance to move on to the biggest ForzaRC event of the year: the Forza Racing World Championship (ForzaRWC) 2018, taking place Oct. 20-21 at the Gfinity Esports Arena in London.

At the ForzaRWC 2018, drivers will race for their share of over $100,000 in cash prizes and the title of 2018 Forza Racing World Champion. Catch all the action live on Oct. 20-21 at our Mixer channel watch.ForzaRC.com, where viewers will have a chance to earn rewards and get involved in the competition by voting on race features, or on Twitch to try out the ForzaRC extension and receive in-game rewards through Twitch Drops.

Love watching online but wish you could be there to experience ForzaRWC in person? Join the team onsite and watch the competition unfold all weekend long! Tickets to attend the ForzaRWC go on sale today at 11:00 a.m. PDT. For more information about the onsite event and to purchase tickets, please visit ForzaRWC.eventbrite.com.

We're thrilled to share these details around ForzaRWC, and you can be sure there's more to come! Shortly after the Series 2 Playoffs, you can look forward to the final car and track combinations and the full lineup of competing drivers. For more updates including special guests, viewership rewards, tournament structure and more, tune in to a special ForzaRWC livestream on Oct. 10 at 12:00 p.m. PDT at watch.ForzaRC.com.

For more information on all of the above, head over to ForzaRC.com and follow us on Twitter for the latest news and updates.

Posted on Leave a comment

How one nonprofit turned a golf course into a ‘no-fail’ job training program

Shawn Bennett was familiar with the feeling of failure when he was younger. Wrestling with anxiety and substance abuse, he had repeated run-ins with the law – and lacked the support needed to put his life on track.

“I was a self-run riot on cruise control to somewhere no one wanted to be,” is how he described his life after spending time in prison for operating a vehicle while intoxicated.

His situation was like many that John Schmidt, a corporate executive, had in mind when he and other founders – committed members of their community looking for solutions for people living in poverty – created Riverview Gardens. This unusual "no-fail" job training program in Appleton, Wisconsin, has helped more than 1,200 people, including Bennett, regain their footing and reclaim their lives. Schmidt, who has also served for years on the board of a local homeless shelter, knows that any of us could face poverty, and even homelessness, with a run of bad luck and no support system.

“There’s a very fine line today between the haves and have-nots,” says Schmidt. “There’s oftentimes this perception that there’s something wrong with someone who might be homeless. But most times, these are everyday people whose luck wasn’t quite as good as somebody else’s luck in life.”

The community center at Riverview Gardens, which was formerly a country club.

Riverview Gardens is situated on 72 bucolic acres of a former country club and golf course along the Fox River in Appleton, in the northeastern part of the state. Appleton has historically been known for its paper mills. It also has a legacy of firsts: The first electricity for sale came from a hydroelectric plant built by a paper company executive in the 1880s. It’s also home to the first telephone system in the country and the first electric trolley system.

In another kind of first, Schmidt and the founding members converted the Riverview Country Club into Riverview Gardens. The private country club, Wisconsin's oldest, had filed for bankruptcy in 2011.

Lettuce is grown without soil in the pool at the former country club.

About 25 acres of the site are used for the certified organic farming of fruits and vegetables including beets, potatoes, carrots, herbs, tomatoes, onions and kale. There are also 20 passive solar greenhouses on site. The country club pool has gone from a place for swimming laps to growing lettuce without soil.

The hydroponic greenhouse is often tended by veterans, some of whom have post-traumatic stress disorder or are experiencing homelessness. They find it a calmer environment, in contrast to the noise and activity of the farm.

This nonprofit program is supported in part by Microsoft TechSpark Wisconsin, a civic program that fosters greater economic opportunity and job creation in local communities across the country, particularly in those outside major metropolitan centers. Appleton, with a population of about 75,000, is one of those communities, as is Green Bay, which is about 30 miles away.

Now crop land, Riverview Gardens was once home to a golf course and country club.

TechSpark is assisting Riverview Gardens in three areas. First, it provides technology that is being used in the hydroponics greenhouse to monitor and adjust water temperature, pH balance and nutrient levels in the pool (a simple sketch of that kind of threshold monitoring appears below). That should help reduce the grow time. The technology is already making a difference, providing a more consistent harvest.

Second, Microsoft's FarmBeats initiative is being employed. FarmBeats, also an AI for Earth featured project, uses ground-based sensors, Power BI, the Internet of Things (IoT) and TV white spaces (which leverage unused broadcasting frequencies to deliver broadband connectivity) to measure soil irrigation needs. It also helps determine the right time to apply fertilizer and other inputs, as well as how much to apply, to grow a more productive crop.

Third, Microsoft is helping Riverview Gardens undergo a digital transformation. In the past, the organization has kept much of its data – records like the seed distribution log, grow crop log, even handwashing logs – on paper. Riverview Gardens is now moving some of it to electronic records for better efficiency.
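Picking up the first of those areas, here is a minimal illustrative sketch of the kind of threshold check a hydroponics monitor might perform. The sensor names and target ranges are hypothetical, chosen only for the example; this is not Riverview Gardens' or TechSpark's actual code.

```python
# Illustrative sketch only: a minimal threshold check of the kind the hydroponics
# monitoring described above might perform. Sensor names and target ranges below
# are hypothetical, not Riverview Gardens' actual values.

from dataclasses import dataclass
from typing import List

@dataclass
class Reading:
    water_temp_c: float
    ph: float
    nutrient_ec_ms_cm: float  # electrical conductivity used as a nutrient proxy

# Hypothetical target ranges for hydroponically grown lettuce.
TARGETS = {
    "water_temp_c": (18.0, 24.0),
    "ph": (5.5, 6.5),
    "nutrient_ec_ms_cm": (1.0, 2.0),
}

def out_of_range(reading: Reading) -> List[str]:
    """Return a message for every measurement outside its target range."""
    alerts = []
    for name, (low, high) in TARGETS.items():
        value = getattr(reading, name)
        if not low <= value <= high:
            alerts.append(f"{name}={value} outside {low}-{high}")
    return alerts

# Example: water slightly too warm, everything else in range.
print(out_of_range(Reading(water_temp_c=26.2, ph=6.0, nutrient_ec_ms_cm=1.4)))
```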

Riverview Gardens took 25 acres of the former golf course and converted it to a rich growing environment for fruits and vegetables, as well as a growing environment for the people it serves.

With these tools, Riverview Gardens can increase its farm yields and raise more money from the sale of produce, which funds the program’s operations. These tools also help give the Riverview Gardens staff more time to spend with the people who need it: their clients.

The technology is “helping us understand our farming better, our water quality better, streamlining our business processes and taking a lot of variability out of the entire operation so that we can focus on the people that we’re serving, and not have to worry as much about other aspects of the business,” says Schmidt.

John Schmidt, CEO of U.S. Venture, is among the founders of Riverview Gardens.

Those who participate in Riverview Gardens’ program also can work in the kitchen or otherwise help with setup at events at the club, now a community center. They also work to help clean Appleton’s downtown streets, early in the morning, after they’ve received training on equipment used for cleanup. Or they might do maintenance – such as painting, lawn care or snow removal – at other nonprofits and businesses in town.

Once participants complete 90 hours of work – known as ServiceWorks – along with ongoing counseling about job and life skills, Riverview Gardens helps them find – and keep – jobs by following their progress for three years.

“No matter how long it takes you to do 90 hours, whether that’s three weeks or three years, we will always accept you back into the program, and you will just continue where you left off,” says Pilar Martinez, the director of community engagement at Riverview Gardens.

Baked into Riverview Gardens’ recipe for success is its “no-fail” policy. Those who may have experienced roadblocks in the past are provided the tools and opportunities to not fail.

“’Success’ is a subjective term and can be different for many different people,” says Martinez. “We look at the resiliency of the people we serve and the barriers they overcome to move themselves forward.”

No matter how long it takes you to do 90 hours, whether that’s three weeks or three years, we will always accept you back into the program.

Shawn Bennett at Riverview Gardens’ Earn-A-Bike shop. Those going through Riverview Gardens’ program, as well as volunteers who help out on the farm, can earn a refurbished bike by working a certain number of hours.

For Bennett, “coming out of prison, not having any family – there was no real support, no real comfort,” he says. Bennett, 49, earned his high school equivalency diploma in prison, and is now working as a tech intern at Fox Valley Technical College, which serves about 50,000 students a year.

There, he has earned an associate degree in computer support, and is working on two other related degrees. He was awarded a Fox Valley Technical College Foundation scholarship for an essay about his personal story, something he wrote after going through Riverview Gardens five years ago.

“The sense of community at Riverview Gardens really helped me,” says Bennett. “To be in a place like this, it makes you feel like you’re welcome here. You’re part of something.”

Carl Gustavson says Riverview Gardens made a huge difference in his life.

Carl Gustavson, 29, is also among those who found success after going through a tough time. Things became difficult for him after moving to Nashville to pursue his dream of being a musician.

“I thought I was going to be like Woody Guthrie; he rode the rails and played his guitar for people,” says Gustavson. “I kind of had a romantic view of being a musician. But the reality is you can end up living in a tent, like I did, and just start feeling like you can’t do anything.”

Gustavson is grateful for the help he has received at Riverview Gardens.

“I was frustrated – by society and by my situation,” he says. “I didn’t think it was ever going to get better. I thought I was going to be stuck in a rut forever.”

After completing ServiceWorks, he was placed in a job doing detail work at a car dealership last spring. He feels optimistic about the future, and at some point, says he would like to use the bachelor’s degree in marketing he earned in 2011.

'Success' is a subjective term and can be different for many different people. We look at the resiliency of the people we serve and the barriers they overcome to move themselves forward.

Much of the spark and enthusiasm at Riverview Gardens comes from its staff, led by executive director Cindy Sahotsky. She is often right in the middle of the action, no matter the job. When program participants visited Sacred Heart Parish to help remove large stones where a tree once stood, Sahotsky grabbed a shovel and plunged into the work at hand.

“She values people, and she expects that if she’s going to ask them to do something, she has to do her part,” says Laura Savoie, the parish’s business manager. “She pitches right in. And she does have high expectations. She expects you to do what you said that you’d do.”

Riverview Gardens executive director Cindy Sahotsky, front, center, wearing a dark sweatshirt, surrounded by some of the nonprofit’s staff, and Microsoft TechSpark Wisconsin manager Michelle Schuler, front, third from left.

There is also a three-year “follow” program, based on findings that show individuals who have been incarcerated and are tracked for that length of time, with guidance and counseling, have the lowest recidivism rate, according to Sahotsky.

The follow program offers support to Riverview Gardens alumni who are now employees elsewhere, and also offers those employers guidance regarding behavior. For employees, it can include concerns like how to get a bus pass, or feeling like a boss doesn't like them. For employers, it might mean getting Riverview Gardens' help coaching an employee who is taking breaks too often, or guiding an employee to be more patient in the workplace.

“The people we serve are individuals who have multiple barriers to long-term employment,” says Sahotsky. “Riverview Gardens really came to be to address that root cause of homelessness. It’s not because our folks can’t get jobs. It’s that they struggle to keep them because they have barriers.”

To be in a place like this, it makes you feel like you’re welcome here. You’re part of something.

Sahotsky, who also oversees the COTS homeless shelter in Appleton, the same place where Schmidt volunteers, is a former corporate human resources manager. She knows how such issues can loom large for the clients Riverview Gardens serves.

“Getting the job is just one part of that whole process,” says Sahotsky. “Keeping that job, getting to work, getting along with others – those are all part of it. Having expectations that people who have multiple barriers to stable employment are just going to get a job and keep it is probably not realistic. They’re going to need support to continue along in this process.”

The program is free to participants. In addition to the money raised from the sale of produce grown at Riverview Gardens, revenue from the rental of the country club building for special events is used to run Riverview Gardens.

Volunteers often work on the farm alongside program participants and staff. “We believe all people have value and contribute to the community in which they live,” is part of the nonprofit’s credo.

And not only are area employers involved in hiring Riverview Gardens’ clients, but many from throughout the state also come to work on the farm as volunteers. So do many residents of Appleton. It’s a true partnership. Working together in the fields, no one knows the other person’s title, or background, or standing. They just know one another by the smiles and first names they share.

"The partnerships Riverview Gardens has with employers and the larger community, to create economic opportunities for those who need them, are one of the things that make it so effective," says Microsoft TechSpark Wisconsin manager Michelle Schuler. She also serves on this nonprofit's board. "It's a real pleasure for those of us at Microsoft to work with Riverview Gardens to help digitally transform their services, and as a result, even more lives."

That transformation, Schmidt points out, is about recognizing that any of us could be in a position in which we need retraining or other support to help put our lives on better paths.

Top photo: Microsoft’s FarmBeats initiative is being employed at Riverview Gardens in Wisconsin. FarmBeats uses ground-based sensors, Power BI, the Internet of Things (IoT) and TV white spaces to measure soil irrigation needs. Follow @MSFTissues on Twitter. 

Photos courtesy of www.ImageStudios.com

Posted on Leave a comment

AI for Earth: Helping save the planet with data science

Wee Hyong Tok is a data scientist. He has a passion for numbers, a faith in technology – and a mission that might make a superhero think twice.

“I want to save the Earth,” he says matter-of-factly. “That seems like a very bold statement. But, I strongly believe that artificial intelligence (AI) can play an important role in monitoring the health of our planet.”

Singapore-born and educated, Wee Hyong has been a data guy and techie all his working life – first in academia, and later with Microsoft in China and the United States where he helped create ground-breaking products in the cloud.

For more than a year now, he has been leading an elite global research team for AI for Earth – a five-year, US$50 million Microsoft initiative that supports, and partners with, environmental groups and researchers. They are tackling some of the world’s most intractable problems by marshaling the immense power of AI, machine learning (ML), and the cloud.

Wee Hyong Tok, Principal Data Science Manager, AI & Research.

In a recent interview during a quick visit back to Singapore, Wee Hyong summed up the challenge: We live on planet Earth, and yet we know very little about it.

We have limited time to learn how to conserve its resources. Fresh water supplies are being dangerously overexploited. Land is being exhausted and degraded to produce more food for more people in ever-growing cities. Thousands of species are fading fast into extinction as their habitats disappear in a whirl of industrialization and a haze of pollution. The oceans are choking on plastics and the carbon-charged climate is changing. Precious things that are vital to our existence are under threat and, if lost, might never come back.

I strongly believe that AI can play an important role in monitoring the health of our planet.

When we hear such things, most of us tend to shrug helplessly. Such problems just seem too big, too hard, and too scary to fix. But Wee Hyong and his colleagues at AI for Earth and Microsoft Research are convinced that solutions can come in our time – if data, technology, and imagination are put to work.

“I am an optimist,” he says before describing the technical complexities surrounding his team’s quest. “We can learn how to leverage AI to solve some of the sustainability challenges facing humanity today.”

Asia’s elusive and endangered Snow Leopard. Photo: Peter Bolliger.

Boiled down, AI for Earth aims to create sustainable solutions across four areas that are key to the health of the planet and the future of humankind: agriculture, water, biodiversity, and climate change.

Wee Hyong proudly points to some early breakthroughs. The FarmBeats project is pioneering new data-driven agriculture to guide farmers in India and the United States on where and when to plant crops for the greatest yield.

Equally impressive are the strides being made in land cover mapping – traditionally a time-consuming, expensive tool that is essential for environmental management and precision conservation. Recently, the entire United States was mapped by machine-learning algorithms that processed nearly 200 million aerial images in just over 10 minutes. Done the usual way, such a project would have taken many months and cost a fortune. Deployed globally and locally, this new way of mapping could revolutionize how we mitigate the effects of urbanization, pollution, deforestation, and even natural disasters.

Endangered species are also being given new hope. Traditionally, analysts pore over thousands of images taken from satellites, drones or camera traps in the wild to study the range, populations, and behaviors of animals otherwise rarely seen by humans. It's laborious work that takes time, skill, and concentration. "Try spotting a herd of zebra on the African savannah from a satellite image," Wee Hyong says. "It's not easy."

High resolution imagery of zebra on the African savannah. Photo: Courtesy of Save The Elephants

Now computers can take on this role, thanks to deep learning techniques that enable them to make sense of the thousands of pixels in an image. This is freeing up scientists' expert time to do and study more. It's already adding invaluable knowledge about elusive snow leopards in Kyrgyzstan in Central Asia and dwindling elephant populations in Congo in Africa, where AI is also being used in the fight against the twin scourges of poaching and the ivory trade.
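To give a flavor of what such an image classifier looks like in code, here is a deliberately tiny sketch in PyTorch. It is an illustrative toy, not the models the AI for Earth teams use; in practice they start from large pretrained networks and far bigger labeled datasets, and the class and parameter names here are hypothetical.

```python
# Minimal, illustrative camera-trap image classifier in PyTorch.
# This toy network is NOT the AI for Earth models; it only shows the shape of
# the approach: pixels in, per-species scores out.

import torch
import torch.nn as nn

class CameraTrapClassifier(nn.Module):
    """Tiny CNN mapping a 3x128x128 camera-trap image to species scores."""

    def __init__(self, num_species: int = 5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),  # collapse spatial dimensions to 1x1
        )
        self.classifier = nn.Linear(32, num_species)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x).flatten(1)   # (batch, 32)
        return self.classifier(x)         # (batch, num_species)

model = CameraTrapClassifier()
batch = torch.randn(4, 3, 128, 128)       # four fake images stand in for camera-trap frames
scores = model(batch)                     # per-species scores, shape (4, 5)
print(scores.argmax(dim=1))               # predicted species index per image
```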

Project Premonition uses insects as de facto "field biologists". The project uses AI to analyze the blood that mosquitoes take from animals across an ecosystem to glean valuable data. To achieve this, AI for Earth is developing drones that autonomously locate mosquito hotspots, robotic traps to collect specimens, and cloud-scale genomics and machine learning algorithms to identify each animal bitten.

The rise of the intelligent cloud and the ability to deploy machine learning models to the intelligent edge is accelerating and enabling new exciting possibilities to study and save wildlife from the remotest corners of the Earth to suburban backyards.

African bush elephants with Mount Kilimanjaro in the background. Picture: Courtesy of Save the Elephants

It goes beyond just technology, right? They want to tell their kids they are trying to save the Earth.

Pursuing research is worthy in itself, but real value comes when a solution is launched into action in the real world. It is here that Wee Hyong’s motivation shines through: He wants to leave the world in better shape for his two children – and for all children in the world.

The same goes for his team of data scientists and software engineers who left exciting and satisfying roles in commercial product development to join AI for Earth.

“Every single person who came for a job interview said they wanted to be able to tell their kids and families that they were serving a higher purpose. It goes beyond just technology, right? It goes beyond just new deep learning techniques and approaches, or whatever. They want to tell their kids they are trying to save the Earth.”