Over the years, we have had a front-row seat to digital transformation occurring across industries and regions around the world. And in 2020, we’ve seen that digitally transformed organizations have successfully adapted to sudden disruptions. What lies at the heart of digital transformation is also the underpinning of the organizations that have proven most resilient during turbulent times: data. Data is what enables both analytical power (analyzing the past to gain new insights) and predictive power (predicting the future to plan ahead).
To harness the power of data, we first need to break down data silos. While not a new concept, this has been a constant challenge in the history of data and analytics, as the ecosystem continues to be complex and heterogeneous. We must expand beyond the traditional view that data silos are the core of the problem. The truth is, too many businesses also have silos of skills and silos of technologies, not just silos of data, and this must be addressed holistically.
For decades, specialized technologies like data warehouses and data lakes have helped us collect and analyze data of all sizes and formats. But in doing so, they often created niches of expertise and specialized technology in the process. This is the paradox of analytics: the more we apply new technology to integrate and analyze data, the more silos we can create.
To break this cycle, a new approach is needed. Organizations must break down all silos to achieve analytical power and predictive power, in a unified, secure, and compliant manner. Your organizational success over the next decade will increasingly depend on your ability to accomplish this goal.
This is why we stepped back and took a new approach to analytics in Azure. We rearchitected our operational and analytics data stores to take full advantage of a new, cloud-native architecture. This fundamental shift, while maintaining consistent tools and languages, is what enables the long-held silos to be eliminated across skills, technology, and data. At the core of this is Azure Synapse Analytics—a limitless analytics service that brings together data integration, enterprise data warehousing, and Big Data analytics into a single service offering unmatched time to insights. With Azure Synapse, organizations can run the full gamut of analytics projects and put data to work much more quickly, productively, and securely, generating insights from all data sources. And, importantly, Azure Synapse combines capabilities spanning the needs of data engineering, machine learning, and BI without creating silos in processes and tools. Customers such as Walgreens, Myntra, and P&G have achieved tremendous success with Azure Synapse, and today we move to global general availability, so every customer can now get access.
But just breaking down silos is not sufficient. A comprehensive data governance solution is needed to know where all data resides across an organization. An organization that does not know where its data is does not know what its future will be. To empower this solution, we are proud to deliver Azure Purview, a unified data governance service that helps organizations achieve a complete understanding of their data. Azure Purview helps you discover all data across your organization wherever it is stored (on-premises, across clouds, in SaaS applications, and in Microsoft Power BI), track data lineage, and create a business glossary. It also helps you understand your data exposure by using over 100 AI classifiers that automatically look for personally identifiable information (PII) and other sensitive data and pinpoint out-of-compliance data. Azure Purview is integrated with Microsoft Information Protection, which means you can apply the same sensitivity labels defined in the Microsoft 365 Compliance Center. With Azure Purview, you can view your data estate pivoting on classifications and labels and drill into assets containing sensitive data across on-premises, multi-cloud, and multi-edge locations.
The combination of Azure Synapse Analytics and Azure Purview empowers organizations to invent with purpose by developing the capabilities to achieve both analytical power and predictive power.
To learn more about Azure Synapse Analytics and Azure Purview, please visit us here.
When Microsoft unveiled Azure Synapse Analytics a year ago, the company promised to put data and the power of analytics at people’s fingertips – anywhere in an organization – while freeing up skilled tech workers to focus on higher-value tasks than managing data infrastructure.
“We started with the hypothesis that it’s too difficult for many organizations to use their own data and deploy AI, and there aren’t enough software engineers on the planet to fill the shoes of all of the analytics that’s going to need to get done,” said John Macintyre, director of product, Azure Synapse and Analytics Platforms at Microsoft. “We knew we could make this tremendously simpler.”
With Azure Synapse, Microsoft offers limitless data warehousing and analytics, connecting and simplifying multiple sources of data so any organization can get more utility out of its own information.
On Thursday, Microsoft announced that the latest version of Azure Synapse is generally available, and the company also unveiled a new data governance solution, Azure Purview.
In the year since Azure Synapse was announced, Microsoft says the number of Azure customers running petabyte-scale workloads – or the equivalent of 500 billion pages of standard printed text – has increased fivefold.
That includes global delivery giant FedEx. The company is collaborating with Microsoft to build FedEx Surround, a new platform using Azure ecosystem products including Azure Synapse that helps its customers digitize their supply chains and use data to manage and track inventory in real time.
FedEx scans each of the 16 million packages it delivers daily more than a dozen times before the packages reach their destinations. That generates enormous amounts of useful logistics intelligence. That data is combined with information about traffic and weather and stored in Azure Data Lake Storage, a scalable data storage and analytics service. Using Azure Synapse and FedEx Surround, the company extracts insights that can enable faster, more efficient deliveries.
“The ability to respond to digital signals and adjust the supply chain for the benefit of our customers and their customers is a key differentiator for us. That’s the next-generation value that we want to bring to customers, and it can’t be done without leveraging the power of data,” said Sriram Krishnasamy, senior vice president, strategic programs at FedEx Services.
In the coming months, the company plans to deploy FedEx Surround to support the distribution of COVID-19 vaccines, which will require careful orchestration to keep them preserved in the necessary temperature range while moving them quickly through the company’s network.
“The insights we gain from continuous analysis help us optimize our network. So as FedEx moves critical high value shipments across the globe, we can often predict whether that delivery will be disrupted by weather or traffic and remediate that disruption by routing the delivery from another location,” Krishnasamy said.
For Wolters Kluwer, data plays an integral role in its strategic operations, notably in its Health division. Photo courtesy of Wolters Kluwer.
Using data to better serve patients
Being able to predict and plan for changes – both immediate and longer-term – can make a difference in almost any business. For Wolters Kluwer, a global provider of professional information, software and services, data plays an integral role in its strategic operations, notably in its Health division.
For example, Wolters Kluwer built capabilities into its patient engagement platform that help healthcare providers personalize the approach to following up with a patient after they leave a hospital, based on their preferences. In addition, its clinical surveillance systems leverage real-time patient data from electronic health records to provide timely alerts about critical conditions using predictive models.
Another key area of focus for Wolters Kluwer is data standardization.
“Our customers are trying to normalize and make sense of massive amounts of data. With our Health Language solutions, we have the ability to clean and standardize data and medical terminology to enable analytics on top of it,” said Jean-Claude Saghbini, chief technology officer for Wolters Kluwer, Health.
Prior to adopting Azure Synapse, Wolters Kluwer consolidated much of its health data from multiple locations into Azure Data Lake, eliminating the “data siloes” that made it difficult to access and work with multiple sources of data. Azure Synapse provided the robust machine learning operations (MLOps) needed to create a data lake across products and data sources, as well as data pipelines to support analytics and advanced AI.
“That brought many of our key data assets into one place, so that people can use them and compute on them, and using Azure Synapse to process all of this data is one of the big enablers of that strategy,” Saghbini said.
In another example of the value of optimized data management, Wolters Kluwer was able to tailor its content to the 2 million clinicians who use its clinical decision support platform UpToDate every day. The company’s anonymized clinician search data has even been used by researchers to try to identify early signals of local or global healthcare trends. For example, one study showed that an increase in COVID-19 related searches on UpToDate could signal a spike in COVID-related deaths a month in the future.
A homegrown solution to cataloging data
As customers were previewing Azure Synapse over the past year, Microsoft engineers were busy developing a new data governance service to automate the discovery and cataloging of all data, whether from on-premises, multi-cloud or software as a service (SaaS) locations. Azure Purview, now available in public preview, will initially enable customers to understand exactly what data they have, manage the data’s compliance with privacy regulations and derive valuable insights more quickly.
Azure Purview began as a multi-year internal effort to assist in Microsoft’s own digital transformation and privacy compliance efforts. Mike Flasko, director of products for Azure Purview, heads the team that works with the company’s chief data, privacy and security officers to design analytics products and manage the company’s own volumes of data, as well as the complicated systems Microsoft deploys to manage them.
Like many companies, Microsoft’s data engineers, data scientists and business analysts all need to process and understand these large, intricate data streams.
“As we modernize and work through our own needs, we’ve learned a lot about what it takes to digitally transform Microsoft and manage data privacy,” Flasko said. “More and more customers were telling us that they needed to understand where all their data was, how it moves around and how they could access it. Their challenges were similar to what we were experiencing inside of Microsoft.”
Just as Azure Synapse represented the evolution of the traditional data warehouse, Azure Purview is the next generation of the data catalog, Microsoft says. It builds on the existing data search capabilities, adding enhancements to help customers comply with data handling laws and incorporate security controls.
“Azure Purview was designed to help customers maximize compliant use of their data,” Flasko said. “It ensures you have a comprehensive understanding of your data and how it moves and who you have shared it with, which is critical to effective data use and governance.”
The service includes three main components:
Data discovery, classification and mapping: Azure Purview will automatically find all of an organization’s data on premises or in the cloud and evaluate the characteristics and sensitivity of the data. Beginning in February, the capability will also be available for data managed by other storage providers.
Data catalog: Azure Purview enables all users to search for trusted data using a simple web-based experience. Visual graphs let users quickly see if data of interest is from a trusted source.
Data governance: Azure Purview provides a bird’s-eye view of a company’s data landscape, enabling data officers to efficiently govern data use. This enables key insights such as the distribution of data across environments, how data is moving and where sensitive data is stored.
Microsoft says these improvements will help break down the internal barriers that have traditionally complicated and slowed data governance.
“We wanted to make it as easy as possible for our applications, and our customers’ applications, to interact with each other. We did that by integrating and automating the data systems and teaching them how to speak to Azure Purview. That lets data engineers just be data engineers, and data scientists can just be data scientists,” Flasko said.
Top image: FedEx is collaborating with Microsoft to build FedEx Surround, which helps its customers digitize their supply chains and use data to manage and track inventory in real time. Image courtesy of FedEx.
Chris Stetkiewicz writes about technology and innovation for Microsoft.
Empowering organizations everywhere to gain insight from all their data sources and deliver personalized customer engagement
There is a fundamental change occurring across industries and organizations: data now comes from everywhere and everything. As customers gain access to more content, purchasing channels, and brand options than ever before, the number of touchpoints grows exponentially: every website visit, use of a product, and interaction with a customer service representative generates data. But this data is often siloed across multiple systems and organizational departments, making it difficult to gain a single source of truth.
With such an overload of information and choices available, organizations must demonstrate that they both understand and value their customers. To this end, we’ve been working to bring customer experiences to the forefront of the business conversation with our modern, unified, intelligent, and adaptable business applications. Today, I’d like to dig into our vision and strategy for Microsoft’s customer data platform, a critically important investment for Microsoft, and specifically how it helps organizations overcome data silos and leverage artificial intelligence to guide decisions and take meaningful action for their business.
Dynamics 365 Customer Insights: Microsoft’s customer data platform
Historically, a customer’s interaction with a brand ended the moment they completed the purchase and walked out the door, limiting an organization’s understanding of why or how its customers use its products and services. Dynamics 365 Customer Insights enables organizations to gain the most comprehensive view of their customers by unifying data across diverse sources, be it transactional, behavioral, or observational data, and by uniquely enriching profiles with market insights and real-time product usage.
From data analysts to marketing, sales, and service professionals, every employee in an organization can leverage AI-driven insights, including churn risk, customer lifetime value (LTV), and recommended next best actions, to power business processes across the customer journey, boost personalization, and build richer relationships.
The Microsoft customer data platform (CDP) enables breakthrough experiences for customers while maintaining the strictest compliance and security standards, so all customer data is securely managed and adheres to GDPR requirements. Built on the hyper-scale Microsoft Azure platform, the application allows organizations to run powerful analytical capabilities using Microsoft AI and Azure-based machine learning models. Customer Insights can easily be extended and customized with Microsoft Power Platform for even richer data processing and customization. Customers can also benefit from the Microsoft partner ecosystem for the development of custom applications and solutions that fit specific industry or business needs.
Let’s look at a few organizations using our CDP to deliver business outcomes and rich customer experiences:
Empowering organizations worldwide
The United Nations Children’s Fund (UNICEF) works tirelessly in over 190 countries and territories to save the lives of millions of children. As private donors and volunteers are increasingly hard to find and retain, UNICEF Netherlands found that personally engaging supporters increased their overall commitment to the organization. Key details on donors, such as their contact information, philanthropic interests, and donation history, were housed in disparate data silos, making it difficult to gain a unified view of the donor base and personalize interactions at scale. To solve this problem, the team adopted Dynamics 365 Customer Insights to quickly and easily combine data from multiple sources, analyze the data to derive insights, and activate the insights via marketing and communication channels. Customer Insights’ out-of-the-box interoperability with Dynamics 365 Marketing helps the team create and optimize marketing campaigns on-the-fly, enabling UNICEF to better develop a customer lifetime value model, which will help it identify and optimize engagement with high-impact donors. Learn more in the UNICEF video below.
American Electric Power (AEP), a competitive retail energy solutions company serving more than 400,000 residential, small-business, and commercial customers nationwide, is using Dynamics 365 Customer Insights to deliver efficient and sustainable energy solutions to customers. Previously, AEP relied on cumbersome manual processes, requiring extensive analysis and input to piece together various customer information into a holistic view of its customers, which was both time-consuming and costly. With a cloud-first approach using Dynamics 365 Customer Insights, AEP can now easily migrate large data sets across various systems of record into a single, unified customer profile. As a result, AEP can better understand its customers’ needs, identify gaps in product offerings, and ensure both front-end and back-end operations are focused and efficient enough to deliver quality, tailored experiences to its customers. Learn more in the AEP video below.
What’s next?
Thus far, we’ve seen great momentum and impact resulting from the customer data platform and how businesses are evolving customer engagement and experiences. Looking ahead, we will continue to innovate on the platform and plan to deliver more opportunities and features in the coming months.
Days before the 2018 New Year, the brisk Amsterdam cold was settling into the city’s streets and canals and Ernst-Jan Stigter was beginning to wonder if he was about to make a major visionary move or take things too far.
As GM of Microsoft’s Netherlands subsidiary, he had been helping to steer, along with a group of involved employees, a full renovation of the organization’s office, a six-story glass edifice located on the outskirts of the city alongside the swooping planes of Schiphol Airport. But early on, Stigter and his team had decided that they would not simply rebuild the space. They would use the opportunity to take the next big step in a relentless experiment their organization had been conducting for more than a decade. Since 2005, Microsoft Netherlands has been obsessed with finding ever-new ways for employees to work and engage with customers, believing that if it harnessed the latest research in behavioral science and productivity, it could end up with happier, more innovative workers and stronger partner and customer relationships.
In short, the team was constantly trying to figure out not so much how to squeeze more into their days—more emails, more projects, more checkboxes checked—but how to get more out of their days—more focus, more balance, more relationship-building and collaboration. How, as they put it, to find “more life in a day.”
Against the backdrop of a broad transformation Microsoft embarked upon after CEO Satya Nadella came onboard in 2014, the Netherlands office was seeking to future-proof its workforce and accelerate success. It was a natural next step for the change-hungry team with a growth mindset, which had begun pioneering innovative office space and prioritizing work-life balance for employees before either was part of the mainstream business paradigm.
That culture of experimentation and customer obsession is how Stigter ended up feeling more than a little nervous as an important milestone loomed in January 2018. He and about 800 of his reports and other Microsoft employees were preparing for an almost unheard-of exercise, one undertaken voluntarily as part of this new wave of transformation: while the Netherlands office was closed for 10 weeks of renovations, there would be no official temporary workspace.
Since its completion more than 675 years ago, the medieval cathedral of Notre-Dame has captivated millions of people with its incomparable beauty. From its legendary stained glass rose window to its towering spire, it’s widely regarded as one of the most stunning examples of medieval architecture in history.
A staple of the Parisian skyline and a global icon, the recent tragedy which saw the cathedral engulfed in flames shook Parisians and onlookers around the world. While thankfully no one was hurt, the iconic spire, oak frame and lead roof were lost.
The reconstruction and repair of the damage is a top priority, with calls for the work to be completed within five years. One factor that could greatly aid this process is the vast amount of data, surveys, and documentation that exists on Notre-Dame, information recorded and collected by numerous parties over decades.
Microsoft and Iconem, an innovative startup that specialises in recreating endangered cultural heritage sites in 3D, have announced the Open Notre-Dame initiative. Together, they will combine their skills to contribute to the restoration of Notre-Dame through an open data project.
“Open Notre-Dame” is an open-data initiative designed to help better understand and analyze the building throughout its history. The initiative will not only gather and analyze as many existing documents on the monument as possible, but also produce 3D models and make them available to everyone. Through this project, Iconem and Microsoft intend to help preserve and share French heritage.
Having already created detailed 3D models of other French heritage sites such as Mont-Saint-Michel, Iconem can draw on data from third parties, such as archival plans and photos, to reconstruct the historical evolution of the cathedral before the fire, improving the accuracy of the model, including the roof structure.
The temporal models of the Notre-Dame de Paris cathedral will be made available on GitHub, the world’s leading software development platform. Opening and sharing this data via GitHub will directly feed a range of initiatives and areas of expertise, including EPFL in Lausanne (which creates dynamic models of cities), Inria, and the Huma-Num consortium (CNRS and Archeovision), and will support the scientific studies of the building. Many partners already contribute to the project with surveys, images, and plans that will serve as a basis for these open-source models: aerial images (Yann Arthus-Bertrand, TSVP), very high-resolution images (from the Cornis company and initial surveys by Iconem), and thousands of pieces of documentation collected by Ubisoft.
If we continue to work together and share our knowledge, the great cathedral of Notre-Dame will be restored to its former glory once again.
Most companies today are collecting enormous amounts of data, and chances are they know that data contains crucial insights about everything from what customers want to purchase at 10 p.m. on Friday versus 7 a.m. on Wednesday to how they could run their businesses more efficiently any day of the week. What’s more, that data is all available in real time.
But too often companies can’t hear those signals, or they hear them too late.
“It would be an understatement to say we’re able to see just the tip of that iceberg. It’s more like we are analyzing a single ice cube out of that iceberg,” said Daniel Yu, director of product marketing, Azure data and artificial intelligence at Microsoft.
Part of the problem is that there’s so much data, and it’s so difficult to understand. Much of this valuable information is in what’s called unstructured or semi-structured data — generated from customer interactions on the web, software as a service apps, social media, mobile apps or IoT devices such as connected refrigerators and intelligent assistants. It is then stored in the cloud, where many of the tools for analyzing it are still maturing.
Yu said that’s left companies feeling they have two choices: they can either have powerful systems that do really sophisticated analysis but require them to know exactly what their needs are upfront, or they can opt for more flexible systems that don’t offer as many options for sophisticated analysis and are more time-consuming to manage.
“We think that’s a false choice,” Yu said. “You can have both power and flexibility in analytics, at a reasonable cost.”
From left, Lidia Rozhentsova, Sofia Iasonidou and Niklas Arbin of BookBeat discuss data analytics tools they have used to make sense of the overwhelming amounts of raw data they gather every day. Photo by Alexander Donka for Microsoft.
On Thursday, Microsoft announced that its customer offerings are getting an upgrade, with the general availability of Azure Data Explorer (ADX) and Azure Data Lake Storage (ADLS).
Microsoft says its Azure-based cloud analytics platform delivers the industry’s best price-performance ratio, a standard that measures the speed of a system against its hourly cost. According to independent testing by GigaOM, analytics with Azure SQL Data Warehouse is up to 14 times faster and costs 94 percent less than other cloud analytics offerings.
Microsoft said ADX can analyze 1 billion records of streaming data per second using a simple query language, while leaving the data and its metadata in their original state. ADLS provides a repository for storing massive amounts of structured or unstructured data, with the efficiency and security features of Azure Blob Storage. The combination is optimized for analytics.
That’s exactly the kind of offering the company says everyone from small startups to big established businesses needs.
A good example is BookBeat, a European-based streaming audiobook service that runs its business on the Azure platform. The company uses data analytics to serve customers crisp recommendations based on their own reading history and those of customers with similar interests. It also relies on data to launch new business models, like a shared family account that it rolled out after its data predicted, correctly, that it would succeed.
“All our teams are data-driven,” said Niklas Arbin, head of developers at BookBeat. “We use it for everything.”
Arbin said that, with Azure managing its server infrastructure, BookBeat’s technical specialists are free to work on other essential tasks. That includes mining the data to deliver real insight and value and building internal tools for managing the vast streams of books it offers.
“Azure enables us to use the highest standards in application development, which has been very hard to do in any business intelligence toolset,” Arbin said. “It gives our developers freedom to choose the best tool for the job.”
Many of the desks at BookBeat’s offices in Stockholm, Sweden, are loaded with analog books as well. Photo by Alexander Donka for Microsoft.
The company doesn’t even have a traditional IT department.
“We don’t have to worry about things like uptime for servers,” Arbin said. “We don’t like working in the middle of the night.”
To John Chirapurath, general manager of Azure data, Blockchain and AI at Microsoft, that sounds like success. He said Microsoft’s goal is to remove complexity for customers wherever possible, from ingesting data to presenting it.
“We always strive to make it very easy for IT staff to adopt analytics and for line of business people to utilize and deliver powerful insights using beautiful products,” Chirapurath said.
Microsoft says another selling point for customers is the company’s long history of securing its Azure cloud, which includes helping customers conform to privacy standards and regulations such as the General Data Protection Regulation, or GDPR.
To Yu, the ongoing advances in cloud analytics technology, and the relentless flow of useful data, are bringing customers to a tipping point. For many, he said, it no longer makes sense to host all that data on premises, and to devote so many resources to managing that infrastructure, when so much more interesting analysis can be done in the cloud.
“Data and analytics are changing everything for businesses,” Yu said. “There’s not a single company that isn’t thinking about this.”
Today, we are announcing the beta release of Clarity, an analytics product that empowers webmasters to visualize user behavior at scale and make data-driven decisions about what exactly to change and improve on their sites to optimize conversion, engagement, and retention. To this end, Clarity supports playback of how users interacted with and used their websites. Clarity also offers other typical functionality such as heatmaps and scroll maps.
Why do I need Clarity?
Building a compelling and user-friendly website requires a deep understanding of user behavior: users’ pain points and what they are looking for. Traditionally, web developers use user research and A/B experimentation to update web experiences. While these have proven to be great techniques, each has its own limitations. Sample users in research studies may not fully represent your target audience, and A/B experimentation tells you how your metrics are affected by a change, but not why. Clarity fills in these gaps by letting you replay your users’ sessions to see how they really use your site. With session replay, you can see where people get stuck, or where they are highly engaged.
Clarity session replay lets you visualize what users see and their interactions with the web page. Being able to replay users’ mouse movements, touch gestures and click events allows you to empathize with users and understand their pain points.
In accordance with Microsoft’s principles and approach to privacy, Clarity respects users’ privacy by using text masking. Text is masked by default in the instrumentation layer and is not uploaded.
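As a rough sketch of the general idea (an illustration of default text masking, not Clarity’s actual implementation), masking can replace every visible character on the client before anything is uploaded, preserving layout for replay while discarding the content itself:

```javascript
// Illustrative sketch only; this is not Clarity's actual masking code.
// Replacing every non-whitespace character keeps word lengths and line
// breaks intact for session replay while discarding the readable content,
// so the original text never leaves the browser.
function maskText(text) {
  return text.replace(/\S/g, "*");
}

console.log(maskText("Patient: Jane Doe")); // → "******** **** ***"
```

Because whitespace is preserved, the replayed page still looks structurally like the original, which is what makes masked replays useful for spotting layout and interaction problems.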
Clarity has yielded positive results within Microsoft and among our pre-release partners, one of which is CookWithManali.com.
Detecting Malware on Bing with Clarity
How can Clarity help find user experience problems? One example comes from Microsoft’s search engine, Bing. Engineers on the Bing team used Clarity to look into cases where users were not successful in their search experience. In some cases, Bing’s engineers were surprised to see pages with ads and other content that looked very different from the expected user experience.
The first image shows a Bing page affected by malware, while the second shows the default Bing experience after the malware was removed.
Thanks to Clarity, Bing’s engineers were able to see what users actually saw, which wasn’t at all what they expected. To diagnose the cause of the strange content, they used Clarity to determine that it came from malware installed on end users’ machines, which was hijacking the page and modifying its content. As a result, people did not engage with Bing content, and page load performance also suffered. Clarity helped the developers design and implement changes to defend Bing pages against this malware. Not only did this greatly improve the user experience, it led to a significant improvement in Bing’s business metrics.
Improving engagement on Cook with Manali with Clarity
Bing is a product with millions of users, but websites of all sizes can benefit from Clarity. One early partner is the food blog Cook with Manali, run by a young entrepreneur who does everything herself, from creating recipes to taking photos to running her site. Manali wants to create a great experience for her users, and Clarity helped her accomplish that.
All bloggers want to engage their users. Typically, a cooking blog post begins with a story about the inspiration behind the recipe, followed by detailed instructions on how to prepare the meal, photographs of the dish, step-by-step pictures, and a shorthand recipe card summarizing ingredients, instructions, and nutritional information at the end. This long post format helps bloggers connect emotionally with their readers and preemptively address any complications in the recipe.
When Manali started using Clarity, she realized that many users were abandoning the page before reaching the bottom, which contains the important information about the recipe. After replaying sessions of users who abandoned her blog, she noticed that users who were only interested in the recipe card at the bottom scrolled through the long post, gave up midway, and left the page. To fix this, she added a “Jump to Recipe” button at the top of the page and flighted the change.
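A change like this can be as simple as an in-page anchor link. The sketch below is purely illustrative (the element ids and class names are hypothetical, not taken from her site):

```html
<!-- Button near the top of the post; jumps straight to the recipe card -->
<a class="jump-to-recipe" href="#recipe-card">Jump to Recipe</a>

<!-- ...long story, photos, and step-by-step instructions... -->

<!-- Recipe card at the bottom of the post; the anchor target -->
<section id="recipe-card">
  <!-- ingredients, instructions, nutritional information -->
</section>
```

Because the browser handles fragment navigation natively, no JavaScript is required, and users who want the backstory can still scroll back up.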
With the new button added, she was able to see users utilizing the new button and getting directly to the content they cared about. Abandonment for these pages went down significantly, indicating an improvement in user satisfaction and retention. Interestingly, many users now utilize the “Jump to Recipe” button to then scroll back up to read the larger story afterwards.
Can I use Clarity on my site?
Clarity works on any HTML webpage (desktop or mobile) after you add a small piece of JavaScript to the website. As soon as the script is added, Clarity starts receiving your site’s data and you can begin using Clarity.
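The exact tag is generated per project when you onboard, so the snippet below is only a rough illustration of the common async-loader pattern such analytics tags use; the script URL and project ID here are placeholders, not the real Clarity tag:

```html
<script>
  // Illustrative async-loader pattern; the actual snippet is generated
  // for your project in the Clarity dashboard.
  (function (d, scriptUrl) {
    var s = d.createElement("script");
    s.async = true; // load without blocking page rendering
    s.src = scriptUrl; // placeholder URL, not the actual Clarity endpoint
    d.head.appendChild(s);
  })(document, "https://example.com/clarity-tag.js?project=YOUR_PROJECT_ID");
</script>
```

Loading the tag asynchronously keeps instrumentation from delaying the page itself.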
The JavaScript code listens to browser events and instruments layout changes, network requests, and user interactions. That data is then uploaded to and stored on Clarity servers running on Microsoft Azure.
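Conceptually, an instrumentation library like this collects events as they happen and flushes them to the server in batches rather than one request per event. The sketch below shows that batching idea only; the names and thresholds are assumptions for illustration, not Clarity’s actual implementation:

```javascript
// Minimal event-batching sketch: collect instrumentation events and
// flush them in batches, the way an analytics script might before
// uploading them to a server. All names and thresholds are illustrative.
function createBatcher(flush, batchSize) {
  var queue = [];
  return {
    // Record one event (e.g. a click, a scroll, or a layout mutation).
    record: function (type, detail) {
      queue.push({ type: type, detail: detail, t: Date.now() });
      if (queue.length >= batchSize) {
        flush(queue.splice(0, queue.length)); // hand off a full batch
      }
    },
    // Force any pending events out (e.g. on page unload).
    drain: function () {
      if (queue.length > 0) {
        flush(queue.splice(0, queue.length));
      }
    },
    // Number of events waiting for the next flush.
    pending: function () {
      return queue.length;
    }
  };
}
```

In a browser, `record` would be wired to listeners such as `addEventListener("click", …)` and a `MutationObserver` for layout changes, with `flush` standing in for the network upload.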
**Update 12/12, 12pm PST: Clarity is compatible with the most common two- and three-letter top-level domains (such as .com, .edu, .au, .uk, etc.); we are currently working on extending compatibility to other generic top-level domains. Please check back soon if you are looking to use Clarity for a gTLD.**
How do I get started?
Sign up at the Clarity website using your Microsoft Account! (In case you don’t have one, you can sign up here.)
When you create a new project, it will be added to the waitlist. You will be notified when your project is approved for onboarding, and you can then log in to Clarity to retrieve the uniquely generated JS code for your project. Once you have added the code to your website, you can use the Clarity dashboard to start replaying user sessions and gaining insights.
Please reach out to ClarityMS@microsoft.com if you have any questions.
Contributing to Clarity
The Clarity team has also open-sourced the JavaScript library that instruments pages to help understand user behavior on websites, available on GitHub. As Clarity is in active development with continuous improvements, join our community and contribute. Getting started is easy: just visit GitHub and read through our README.
What’s next?
Here are some of the exciting new features the Clarity team is brewing up:
Interesting sessions are automatically bubbled up based on Clarity’s AI and machine learning capabilities, helping web developers review user sessions with abnormal click or scroll behavior, session length, JavaScript errors, and more. Web developers can spend less time and gain more insight into their users by focusing on the sessions that Clarity marks as most relevant.
Related sessions are a grouping of similar sessions recommended based on a single session. This feature allows web developers to quickly understand the scope of a specific user behavior and find other occurrences for the same user as well as other users.
Heatmaps provide an aggregate view of user behavior through click/touch and scroll heatmaps. A click/touch heatmap shows the distribution of user interactions across a webpage, while a scroll heatmap shows how far users scroll down your webpage.