Desktop Analytics is a toolset that is gaining traction in the enterprise space, with organizations looking to leverage compatibility, performance, and health data to drive their ongoing Windows 10 management. We are really excited about how we can act on Desktop Analytics insights within our Dashworks platform, so we thought it was an opportune moment to share some real-world experience with the new Microsoft tooling.
One of our larger customers recently installed Desktop Analytics as a source of supplemental data to manage their migration from Windows 7 and Windows 10 1803 to Windows 10 1909. The organization is large (>10,000 seats) with a significant number of applications.
Since the Desktop Analytics output in this example is quite typical of what we see with many of our larger enterprise customers, I thought it might be interesting to give you some real-world insights into the type of results you can expect to get using Microsoft Desktop Analytics and how you can use them as part of your IT transformation and Evergreen IT processes — with or without Juriba Dashworks.
Below, I will walk you through some of the raw driver and application analysis output from Desktop Analytics before putting those results into a Dashworks context and showing you how you can turn them into actionable insights to accelerate your migration.
First, let's start with the driver data. It is important to know that the driver data created by Desktop Analytics only lists the things that will not work or need attention. In other words, any driver that is deemed 'GREEN' will not be included in the analysis provided.
In our case, we found 257 unique driver versions causing issues, resulting in 14,066 driver issue occurrences (namely, a driver not migrating to the new OS). By driver issue occurrences, I mean the cumulative number of devices encountering one or more driver issues.
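To make the terminology concrete, here is a minimal Python sketch of how those two numbers relate. The rows and field names are made up for illustration and are not the actual Desktop Analytics export schema:

```python
from collections import Counter

# Hypothetical rows from a Desktop Analytics driver-assessment export:
# one row per device/driver issue pairing. Field names are illustrative.
issue_rows = [
    {"device": "PC-001", "driver": "contoso_net.sys 1.2.0"},
    {"device": "PC-001", "driver": "fabrikam_gfx.sys 9.1.4"},
    {"device": "PC-002", "driver": "contoso_net.sys 1.2.0"},
]

# One "driver issue occurrence" = one device hitting one problem driver,
# so a single device can contribute several occurrences.
occurrences_per_driver = Counter(row["driver"] for row in issue_rows)

unique_driver_versions = len(occurrences_per_driver)      # 257 in the customer data
total_occurrences = sum(occurrences_per_driver.values())  # 14,066 in the customer data
```

With the real export loaded in place of the sample rows, `occurrences_per_driver.most_common(20)` would give you the top-20 concentration discussed next.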
The majority of issues (86%) were caused by the top 20 drivers, while more than 37% (or 5,246 occurrences) were caused by the top five drivers, which were:
For each of the 257 unique driver versions with known issues, Desktop Analytics provided guidance to help with the mitigation, including:
In addition to the suggestions above, Microsoft Desktop Analytics' guidance for five drivers was simply "Multiple". While some of the guidance wording might sound confusing at first, Microsoft has a quick guide to its Desktop Analytics Compatibility Assessment that gives further insight.
In addition, Microsoft also provides the driver availability information such as "Available in-box and from Windows Update" (188 times) which corresponded with the guidance previously discussed (e.g., in this case "Although a new driver is installed during upgrade, a newer version is available from Windows Update [...]").
While the driver assessment returned a couple of hundred rows of data, the application analysis included 14,382 unique application name/manufacturer/version combinations. Most of the assessed applications (13,317) had the upgrade decision status "Ready", while 1,065 had been set to "Not Reviewed".
While you can edit the readiness rules within the Desktop Analytics dashboard, Microsoft will, by default, set the status "Ready" for certain applications (e.g., low importance apps), whether or not any supporting information exists. This upgrade decision status can also be reviewed and manually set in Desktop Analytics to indicate whether or not an application is ready to be upgraded. Essentially, it could be compared to a very, very basic version of Juriba Dashworks' RAG status.
Another useful piece of data reveals which types of applications you have installed. For example, this organization has a large number of background process apps (e.g., Microsoft Visual C++ 2013 Redistributable or Microsoft .NET Framework 4.8), Productivity apps (Microsoft Teams), Windows Components (e.g., Microsoft Silverlight), Cloud Storage (OneDrive, Dropbox, etc), and more.
As you can see from the chart above, about 17% of the applications are marked as "Unknown". These include anything from custom apps to Microsoft product add-ins. You would always expect in-house applications to appear in this category.
Most of the assessed applications were categorized as low risk, meaning that Desktop Analytics had found no signals suggesting those applications would encounter issues during the upgrade, making it likely that they will work in the new environment as is.
Two percent of the applications were categorized as high risk (these applications will almost certainly fail during or after the upgrade and may require remediation) and a further two percent as medium risk, indicating that they might have impaired functionality, but that remediation is likely to be possible. Seven percent of the applications were categorized as "Unknown", which indicates that these applications could not be assessed.
Of all the applications marked as "Ready", 1,662 were "Highly Adopted" and 2,613 were "Adopted". These labels can be misleading, as they are easily misunderstood as adoption rates within your own organization.
In reality, "Highly Adopted" refers to applications that have been installed on at least 100,000 commercial Windows 10 devices, while the label "Adopted" is given to applications installed on at least 10,000 but fewer than 100,000 devices.
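Those thresholds are simple enough to express as a small helper. This is a sketch of the labelling logic described above, not Microsoft's implementation, and the fallback label for apps below the "Adopted" threshold is my own placeholder (Desktop Analytics uses further labels there):

```python
def adoption_label(install_base: int) -> str:
    """Map an app's commercial Windows 10 install base to its adoption label,
    using the thresholds Desktop Analytics applies."""
    if install_base >= 100_000:
        return "Highly Adopted"
    if install_base >= 10_000:
        return "Adopted"
    return "Below adoption thresholds"  # placeholder; the real tool uses other labels

print(adoption_label(250_000))  # Highly Adopted
print(adoption_label(42_000))   # Adopted
```

The key point stands: the label measures the app's footprint across Microsoft's telemetry base, not its footprint in your estate.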
Here's a note about the applications marked as "Blank" and "Insufficient Data". Although there is little to no information available, Microsoft has determined that these applications are ready for an upgrade because they have a low install count or were categorized as 'not important'. These are definitely things to quickly scan through to be sure nothing was improperly categorized. For example, a large enterprise might have a small install count of a finance-specific app that is critical to the operation but has effectively been ignored due to low penetration.
For example, 129 applications under the upgrade decision "Not Reviewed" had insufficient data. Fifty of them were of "Critical importance" and had significant installs, including important apps like the DXC Remote Rescue Calling Card, Zscaler-Network-Adapter, and WebEx Productivity Tools. These had "No Known Issues" identified but no guidance issued.
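That quick scan is easy to script once you have the assessment data exported. Below is a hedged sketch of the filter we applied; the field names and sample values are illustrative assumptions, not the real Desktop Analytics export schema:

```python
# Hypothetical application-assessment rows; field names are illustrative.
apps = [
    {"name": "Zscaler-Network-Adapter", "decision": "Not Reviewed",
     "assessment": "Insufficient Data", "importance": "Critical", "installs": 4200},
    {"name": "Contoso Screensaver", "decision": "Ready",
     "assessment": "Low install count", "importance": "Low", "installs": 12},
]

# Surface apps that were skipped for lack of data but are business-critical,
# largest install footprint first.
to_review = sorted(
    (a for a in apps
     if a["decision"] == "Not Reviewed"
     and a["assessment"] == "Insufficient Data"
     and a["importance"] == "Critical"),
    key=lambda a: a["installs"],
    reverse=True,
)
```

Against our customer's data, this kind of filter is what surfaced the 50 critical-importance apps mentioned above.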
Now that we have looked at the application status, let's have a closer look at the identified issues and the guidance given by Desktop Analytics to remediate them. The tool identified 14,382 unique application versions for our customer, yet only 115 applications (0.8%) had known issues. The issues found break down as follows:
All in all, we found the application estate data (e.g., types of applications installed, number of installs, etc.) more interesting and useful than the issues found and the guidance given as those were often cryptic. But, more importantly, you aren't easily able to act on the guidance given or your findings. For example, you cannot easily switch devices to "Green" and assign them into deployment rings automatically based on criteria. The lack of user information also makes it difficult to see which departments or locations are most impacted and where your organizational quick wins may lie.
This is where Dashworks comes in.
When migrating or continuously managing more than 1,000 devices, it is imperative to utilize automation wherever possible to be able to keep up with the pace of technological change.
While Desktop Analytics delivers insightful information that is very helpful in augmenting other existing data points, it is — compared to Juriba Dashworks — a very basic tool in terms of actionable data analysis, smart workflow management, project command and control, and automation.
For example, while Desktop Analytics can be used to determine certain readiness statuses, it makes these decisions based only upon application readiness data which, in turn and in most cases, is based on telemetry data and preset determinations. It isn't based on actual testing and real-life RAG status decisions. In addition, Desktop Analytics does not track any RAG statuses or readiness details, but with the right back-end processes set up, you could use this data to drive these devices to a ready state.
If you are using Desktop Analytics and Juriba Dashworks, I recommend that you identify all your deployment rings as well as your in-scope applications for each ring based on Juriba Dashworks reporting and then use Desktop Analytics to supplement the data around application compatibility. You can then layer on the application smoke testing and UAT processes, end user communications and other readiness activities (e.g. free hard drive space, business go/no-go approval etc). This way, you can make an informed and data-based decision whilst driving the velocity of deployment in the most efficient way.