.NET Component Vulnerability Analysis in Production

At Black Duck, we've been excited to participate in the flurry of growth in the .NET ecosystem. Our Visual Studio Extension helps developers detect open source risks early, when they are easiest and most cost-effective to eliminate. In some cases, however, a Visual Studio project, build file, or other composition metadata may not be available. Perhaps an application's source code (and the component data that comes with it) has been lost. Perhaps the application came from a vendor who never made the source code available in the first place. Or perhaps, in addition to scanning application dependencies, we want to include the actual production runtime in our scan. Is such component analysis possible?

There are two cases in which the answer is yes. The first is when the application runs in a minimal, exclusive execution environment, such as a container. In that case, the entire environment/container can be scanned, and all of its contents can be assumed to be components of the application. The other case is more elusive: when an application's dependency loading is sufficiently deterministic and transparent that its dependencies can be reliably extracted and analyzed. This is the case for .NET Core, the open source, cloud-friendly, blazing-fast platform. The directory created by running .NET Core's publish task contains the runtime, the application, and all of its dependent libraries. That directory can easily be scanned with the conventional Black Duck Hub scanner.
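As a rough illustration of what that publish folder looks like to a scanner, here is a minimal C# sketch (not the Black Duck scanner itself) that inventories the managed assemblies in a publish directory and reports their identities. The directory path is just an assumed example.

using System;
using System.IO;
using System.Reflection;

class PublishInventory
{
    static void Main(string[] args)
    {
        // Assumed example path: the output of something like `dotnet publish -c Release`.
        string publishDir = args.Length > 0 ? args[0] : @"bin\Release\netcoreapp2.0\publish";

        foreach (string dll in Directory.GetFiles(publishDir, "*.dll"))
        {
            try
            {
                // Reads the assembly's identity metadata without executing it.
                AssemblyName name = AssemblyName.GetAssemblyName(dll);
                Console.WriteLine($"{name.Name} {name.Version}");
            }
            catch (BadImageFormatException)
            {
                // Native libraries in the publish folder are not managed assemblies.
                Console.WriteLine($"{Path.GetFileName(dll)} (native)");
            }
        }
    }
}

Everything the application needs to run sits in that one folder, which is exactly what makes it such a convenient unit to scan.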

But what about applications built for the more traditional, proprietary Microsoft .NET runtime, or for Mono? Their dependencies and runtimes may be scattered across multiple locations. Fortunately, we now have a way to scan them too, without access to source code or build files.

Identifying Dependencies

As it turns out, .NET has just that kind of determinism and transparency. First, .NET libraries and applications (assemblies, in .NET jargon) most often use strong names to identify their dependencies. A strong name consists of the assembly's simple name, version, culture, and public key, backed by a digital signature, virtually guaranteeing that a specific DLL file will be loaded to satisfy each dependency. Even when strong names are not used, such as in native code invocations, .NET follows specific and predictable rules to resolve dependencies. No noodly, chaotic Java classpaths here.
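To make that concrete, here is a minimal sketch showing how those strong-named references can be read straight out of an assembly's metadata with .NET's built-in reflection API. The file path below is an assumed example; point it at any managed executable or DLL you want to inspect.

using System;
using System.Reflection;

class ReferenceLister
{
    static void Main()
    {
        // Assumed example path to the assembly under inspection.
        Assembly target = Assembly.LoadFrom(@"C:\apps\MyService\MyService.exe");

        foreach (AssemblyName reference in target.GetReferencedAssemblies())
        {
            // FullName carries the strong name: simple name, version, culture,
            // and public key token, e.g.
            // "System.Core, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089".
            Console.WriteLine(reference.FullName);
        }
    }
}

Each of those identities resolves to one specific DLL, which is what makes reliable dependency traversal possible.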

And this is why it is possible to build a dependency scanner that examines a .NET application in its runtime environment (including Mono), traverses its dependency tree, and makes the results available to the Black Duck Hub for vulnerability analysis. We have written such a scanner for you here. It uses .NET's own assembly-loading facilities to build the dependency tree of .NET assemblies. It also inspects the target's IL for native code invocations (P/Invokes) and follows the DLL search path to find the referenced native libraries.
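The released scanner works against the IL itself; as a rough approximation of the same idea, plain reflection can surface the same information, since any method flagged PinvokeImpl carries a DllImport attribute naming the native library it calls. A hedged sketch (the file path is again an assumed example):

using System;
using System.Reflection;
using System.Runtime.InteropServices;

class PInvokeFinder
{
    static void Main()
    {
        // Assumed example path; substitute the assembly you want to inspect.
        Assembly target = Assembly.LoadFrom(@"C:\apps\MyService\MyService.exe");

        var flags = BindingFlags.Public | BindingFlags.NonPublic |
                    BindingFlags.Static | BindingFlags.Instance;

        foreach (Type type in target.GetTypes())
        {
            foreach (MethodInfo method in type.GetMethods(flags))
            {
                // P/Invoke methods are flagged PinvokeImpl in the assembly metadata.
                if ((method.Attributes & MethodAttributes.PinvokeImpl) == 0)
                    continue;

                // Reflection exposes the pseudo-custom DllImport attribute; its Value
                // is the native library name to resolve on the DLL search path.
                var import = method.GetCustomAttribute<DllImportAttribute>();
                if (import != null)
                    Console.WriteLine($"{type.FullName}.{method.Name} -> {import.Value}");
            }
        }
    }
}

From there, a scanner only has to follow the platform's DLL search path to locate each referenced native library on disk.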

The scanner is released under the Apache 2.0 License, so even if you're not a Hub user, you can repurpose its dependency traversal logic for your needs. Or, to stay on top of open source vulnerabilities on virtually every platform, sign up for a trial of Black Duck Hub.
