
Authenticode Signing Service and Client

September 12, 2016


Last night I published a new project on GitHub to make it easier to integrate Authenticode signing into a CI process by providing a secured API for submitting artifacts to be signed by a code signing cert held on the server. It uses Azure AD with two application entries for security:

  1. One registration for the service itself
  2. One registration to represent each code signing client you want to allow

Azure AD was chosen as it makes it easy to restrict access to a single application/user in a secure way. Azure App Services also provide a secure location to store certificates, so the combination works well.

The service currently supports individual files or a zip archive containing supported files to sign (which works well for NuGet packages). The service code is easy to extend if additional filters or functionality are required.

Supported File Types

  • .msi, .msp, .msm, .cab, .dll, .exe, .sys, .vxd and any other PE file (via SignTool)
  • .ps1 and .psm1 via Set-AuthenticodeSignature

Deployment

You will need an Azure AD tenant. These are free if you don’t already have one. In the “old” Azure Portal, you’ll need to
create two application entries: one for the server and one for your client.

Azure AD Configuration

Server

Create a new application entry for a web/api application. Use whatever you want for the sign-on URI and App ID Uri (but remember what you use for the App ID Uri as you’ll need it later). On the application properties, edit the manifest to add an application role.

In the appRoles element, add something like the following:

{
  "allowedMemberTypes": [
    "Application"
  ],
  "displayName": "Code Sign App",
  "id": "<insert guid here>",
  "isEnabled": true,
  "description": "Application that can sign code",
  "value": "application_access"
}

After updating the manifest, you’ll likely want to edit the application configuration to enable “user assignment.” This means that only assigned users and applications can get an access token for this service; otherwise, anyone who can authenticate in your directory can call the service.

Client

Create a new application entry to represent your client application. The client will use the “client credentials” flow to log in to Azure AD
and access the service as itself. For the application type, also choose “web/api” and use anything you want for the sign-on URI and App ID URI.

Under application access, click “Add application” and browse for your service (you might need to hit the circled check to show all). Choose your service app and select the application permission.



Finally, create a new client secret and save the value for later (along with the client id of your app).

Server Configuration

Create a new App Service on Azure (I used a B1 for this as it’s not high-load). Build/deploy the service however you see fit. I used VSTS connected to this GitHub repo along with a Release Management build to auto-deploy to my site.

In the Azure App Service, in the certificates area, upload your code signing certificate and take note of the thumbprint id. In the Azure App Service, go to the settings section and add the following setting entries:

  • CertificateInfo:Thumbprint – thumbprint of your cert (the cert to sign with)
  • CertificateInfo:TimeStampUrl – URL of the timestamp server
  • WEBSITE_LOAD_CERTIFICATES – thumbprint of your cert; this exposes the cert’s private key to your app in the user store
  • Authentication:AzureAd:Audience – App ID URI of your service, from its application entry
  • Authentication:AzureAd:ClientId – client ID of your service app, from its application entry
  • Authentication:AzureAd:TenantId – Azure AD tenant ID, either the GUID or a name like mydirectory.onmicrosoft.com

Enable “always on” if you’d like, disable PHP, and save the changes. Your service should now be configured.
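
If you want to sanity-check that the certificate is actually visible to the app, one quick way is to run something like the following from the Kudu debug console. This is just a hedged sketch assuming PowerShell is available there; replace the thumbprint placeholder with your own value:

# WEBSITE_LOAD_CERTIFICATES causes App Service to load the listed certs
# into the app identity's CurrentUser\My store; this lists the match.
Get-ChildItem Cert:\CurrentUser\My |
    Where-Object { $_.Thumbprint -eq '<your cert thumbprint>' } |
    Format-List Subject, Thumbprint, HasPrivateKey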

Client Configuration

The client is distributed via NuGet and uses both a json config file and command line parameters. Common settings, like the client id and service url are stored in a config file, while per-file parameters and the client secret are passed in on the command line.

You’ll need to create an appsettings.json similar to the following:

{
  "SignClient": {
    "AzureAd": {
      "AADInstance": "https://login.microsoftonline.com/",
      "ClientId": "<client id of your client app entry>",
      "TenantId": "<guid or domain name>"
    },
    "Service": {
      "Url": "https://<your-service>.azurewebsites.net/",
      "ResourceId": "<app id uri of your service>"
    }
  }
}

Then, somewhere in your build, you’ll need to call the client tool. I use AppVeyor and have the following in my yml:

environment:
  SignClientSecret:
    secure: <the encrypted client secret using the appveyor secret encryption tool>

install: 
  - cmd: appveyor DownloadFile https://dist.nuget.org/win-x86-commandline/v3.5.0-rc1/NuGet.exe
  - cmd: nuget install SignClient -Version 0.5.0-beta3 -SolutionDir %APPVEYOR_BUILD_FOLDER% -Verbosity quiet -ExcludeVersion -pre

build: 
 ...

after_build:
  - cmd: nuget pack nuget\Zeroconf.nuspec -version "%GitVersion_NuGetVersion%-bld%GitVersion_BuildMetaDataPadded%" -prop "target=%CONFIGURATION%" -NoPackageAnalysis
  - ps: '.\SignClient\SignPackage.ps1'
  - cmd: appveyor PushArtifact "Zeroconf.%GitVersion_NuGetVersion%-bld%GitVersion_BuildMetaDataPadded%.nupkg"  

SignPackage.ps1 looks like this:

$currentDirectory = split-path $MyInvocation.MyCommand.Definition

# See if we have the ClientSecret available
if([string]::IsNullOrEmpty($env:SignClientSecret)){
    Write-Host "Client Secret not found, not signing packages"
    return;
}

# Setup Variables we need to pass into the sign client tool

$appSettings = "$currentDirectory\appsettings.json"

$appPath = "$currentDirectory\..\packages\SignClient\tools\SignClient.dll"

$nupkgs = ls $currentDirectory\..\*.nupkg | Select -ExpandProperty FullName

foreach ($nupkg in $nupkgs){
    Write-Host "Submitting $nupkg for signing"

    dotnet $appPath 'zip' -c $appSettings -i $nupkg -s $env:SignClientSecret -n 'Zeroconf' -d 'Zeroconf' -u 'https://github.com/onovotny/zeroconf' 

    Write-Host "Finished signing $nupkg"
}

Write-Host "Sign-package complete"

The parameters to the signing client are as follows. There are two modes, file for a single file and zip for a zip-type archive:

usage: SignClient <command> [<args>]

    file    Single file
    zip     Zip-type file (NuGet, etc)

File mode:

usage: SignClient file [-c <arg>] [-i <arg>] [-o <arg>] [-h <arg>]
                  [-s <arg>] [-n <arg>] [-d <arg>] [-u <arg>]

    -c, --config <arg>            Full path to config json file
    -i, --input <arg>             Full path to input file
    -o, --output <arg>            Full path to output file. May be same
                                  as input to overwrite. Defaults to
                                  input file if omitted
    -h, --hashmode <arg>          Hash mode: either dual or Sha256.
                                  Default is dual, to sign with both
                                  Sha-1 and Sha-256 for files that
                                  support it. For files that don't
                                  support dual, Sha-256 is used
    -s, --secret <arg>            Client Secret
    -n, --name <arg>              Name of project for tracking
    -d, --description <arg>       Description
    -u, --descriptionUrl <arg>    Description Url
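
For example, signing a single binary might look something like this (the paths, project name, and URL here are illustrative; the secret comes from the environment variable set up earlier):

dotnet .\packages\SignClient\tools\SignClient.dll file -c appsettings.json -i .\bin\Release\MyLibrary.dll -s %SignClientSecret% -n "MyLibrary" -d "MyLibrary" -u "https://github.com/example/mylibrary"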

Zip-type archive mode, including NuGet:

usage: SignClient zip [-c <arg>] [-i <arg>] [-o <arg>] [-h <arg>]
                  [-f <arg>] [-s <arg>] [-n <arg>] [-d <arg>] [-u <arg>]

    -c, --config <arg>            Full path to config json file
    -i, --input <arg>             Full path to input file
    -o, --output <arg>            Full path to output file. May be same
                                  as input to overwrite
    -h, --hashmode <arg>          Hash mode: either dual or Sha256.
                                  Default is dual, to sign with both
                                  Sha-1 and Sha-256 for files that
                                  support it. For files that don't
                                  support dual, Sha-256 is used
    -f, --filter <arg>            Full path to file containing paths of
                                  files to sign within an archive
    -s, --secret <arg>            Client Secret
    -n, --name <arg>              Name of project for tracking
    -d, --description <arg>       Description
    -u, --descriptionUrl <arg>    Description Url

Contributing

I’m very much open to any collaboration and contributions to this tool to enable additional scenarios. Pull requests are welcome, though please open an issue to discuss first. Security reviews are also much appreciated!

Project.json all the things

February 8, 2016


One of the lesser-known features of Visual Studio 2015 is that it is possible to use project.json with any project type, not just “modern” PCLs, UWP projects, or xproj projects. Read on to learn why you’d want to switch and how to update your existing solution.

Background

Since the beginning of NuGet, installed packages have been tracked in a file named packages.config placed alongside the project file. The package installation process goes something like this:

  1. Determine the full list of packages to install, walking the tree of all dependent packages
  2. Download all of those packages to a \packages directory alongside your solution file
  3. Update your project file with the correct libraries from each package (looking at \lib\<TFM>)
    • If the package contains a build directory, add any appropriate props or targets files found
  4. Create or update a packages.config file alongside the project that lists each package along with the current target framework (an example follows this list)
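
For reference, a packages.config produced by this process looks roughly like the following (the package id, version, and target framework are just illustrative):

<?xml version="1.0" encoding="utf-8"?>
<packages>
  <package id="Newtonsoft.Json" version="8.0.2" targetFramework="net452" />
</packages>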

Terms

  • TFM – Target Framework Moniker. The name that represents a specific platform (e.g., .NET Framework 4.6, MonoTouch, UWP)
  • Short Moniker – a short way of referring to a TFM in a NuGet file (e.g., net46). The full list is here.
  • Full Moniker – a longer way of specifying the TFM (e.g., .NETPortable,Version=v4.5,Profile=Profile111). The easiest way to determine this is to compile and let the NuGet error message tell you what to add (see below).

Limitations

The above steps are roughly the same for NuGet up to and including the 2.x series. While this works for basic projects, larger and more complex projects quickly run into issues. I do not consider the raw number of packages a project has to be an issue by itself – that merely shows oodles of reuse and componentization of packages into small functional units. What does become an issue is the UI and the time it takes to update everything.

As mentioned, because NuGet modifies the project file with the relative location of the references, every time you update, it has to edit the project file. This is slow and can lead to merge conflicts across branches.

Furthermore, the system is unable to pivot on different compile-time needs. With many projects needing to provide some native support, NuGet v2.0 had no way of providing different dependencies based on build configuration.

One more issue surfaces with the use of “bait and switch” PCLs. Some packages provide a PCL for reference purposes (the bait) and also provide platform-specific implementations with the same external surface area (the switch). This enables libraries to take advantage of platform-specific functionality that isn’t available in a portable class library alone. The catch with these packages is that, to work correctly in a multi-project solution containing a PCL and an application, the application must also add a NuGet reference to every package its PCL libraries use, to ensure that the platform-specific version winds up in the output directory. If you forget, you’ll likely get a runtime error because an incomplete reference assembly is being used.

NuGet v3 and Project.json to the rescue

NuGet 3.x introduces a number of new features aimed at addressing the above limitations:

  • Project files are no longer modified to contain the library location. Instead, an MSBuild task and targets file get auto-included by the build system. This task creates reference and content-file items at build time, enabling the metadata values to be calculated rather than baked into the project file.
    • Per-platform files can exist by using the runtimes directories. See the native light-up section in the docs for the details.
  • Packages are now stored in a per-user cache instead of alongside the solution. This means that common packages do not have to be re-downloaded since they’ll already be present on your machine. Very handy for those packages you use in many different solutions. The MSBuild task enables this as the location is no longer baked into the project file.
  • Reference assemblies are now more formalized with a new ref top-level directory. This would be the “bait” assembly, one that could target a wide range of frameworks via a portable-, dotnet, or netstandard TFM. The implementation library would then reside in \lib\TFM. The version in the ref directory is used as the compile-time reference, while the version in the lib directory is placed in the output location.
  • Transitive references. This is a biggie. Now only the top-level packages you require are listed. The full chain of packages is still downloaded (to the shared per-user cache), but it’s hidden in the tooling and doesn’t get in your way. You can continue to focus on the packages you care about. This also works with project-to-project references. If I have a bait-and-switch package reference in my portable project, and I have an application that references that portable library, the full package list will be evaluated for output in the application and the per-architecture, per-platform assemblies will get put in the output directories. You no longer have to reference each package again in the application.
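
To make the ref/lib/runtimes split concrete, here is a rough sketch of how such a package might be laid out (the file names and TFMs are illustrative):

MyLibrary.nupkg
  ref\netstandard1.0\MyLibrary.dll    <- reference ("bait") assembly used at compile time
  lib\net45\MyLibrary.dll             <- desktop implementation copied to the output directory
  lib\uap10.0\MyLibrary.dll           <- UWP implementation
  runtimes\win\native\MyNative.dll    <- per-runtime native light-up assets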

It is important to note that these features only work when a project is using the new project.json format of package management. Having NuGet v3 alone isn’t enough. The good news is that we can use project.json in any project type with a few manual steps.

Using project.json in your current solution

You can use project.json in your current solution. There are a couple of small caveats here:

  1. Only Visual Studio 2015 with Update 1 currently supports project.json. Xamarin Studio does not yet support it, but support is planned. That said, Xamarin projects in Visual Studio do support project.json.
    • If you’re using TFS Team Build, you need TFS 2015 Update 1 on the build agent in addition to VS 2015 Update 1.
  2. Some packages that rely on content files being placed into the project may not work correctly. project.json has a different mechanism for this, so the package would need to be updated. The workaround is to manually copy the content into your project.
  3. All projects in your solution would need to be updated for the transitive references to resolve correctly. That’s to say that an application using NuGet v2/packages.config won’t pull in the correct transitive references of a portable project reference that’s using project.json.

With that out of the way, let’s get started. If you’d like to skip this and see some examples, look at the projects referenced later in this post (Zeroconf and xUnit for Devices), which have already been converted. These are libraries with a combination of reference assemblies, platform-specific implementations, test applications, and unit tests, so the full spectrum of scenarios is covered there.

One last note before diving deep: make sure your .gitignore file contains the following entries:

  • *.lock.json
  • *.nuget.props
  • *.nuget.targets

These files should not generally be checked in. In particular, the .nuget.props/targets files will contain a per-user path to the NuGet cache. These files are created by calling NuGet restore on your solution file.

Diving deep

As you start, have the following blank project.json handy as you’ll need it later:

{
    "dependencies": {        
    },
    "frameworks": {        
        "net452": { }
    },
    "runtimes": {
        "win": { }
    } 
}

This represents an empty project.json for a project targeting .NET 4.5.2. I’m using the short moniker here, but you can also use the full one. The framework string is the part you’ll most likely have trouble with. Fortunately, when you get it wrong and try to build, you’ll get what’s probably the most helpful error message of all time:

Your project is not referencing the “.NETPortable,Version=v4.5,Profile=Profile111” framework. Add a reference to “.NETPortable,Version=v4.5,Profile=Profile111” in the “frameworks” section of your project.json, and then re-run NuGet restore.

The error literally tells you how to fix it. Awesome! The fix is to put .NETPortable,Version=v4.5,Profile=Profile111 in your frameworks section to wind up with something like:

{
    "dependencies": {        
    },
    "frameworks": {        
        ".NETPortable,Version=v4.5,Profile=Profile111": { }
    },
    "supports": { }
}

The eagle-eyed reader will notice that the first example had a runtimes section with win in it. This is required for desktop .NET Framework projects and for projects where CopyNuGetImplementations is set to true, like your application (we’ll come back to that in a bit), but it is not required for other library project types. If you have a runtimes section, there’s rarely, if ever, a reason to also have a supports section.

The easiest way to think about this:

  • For library projects, use supports and not runtimes
  • For your application project, (.exe, .apk, .appx, .ipa, website) use runtimes and not supports
  • If it’s a desktop .NET Framework project, use runtimes for both class libraries and your application
  • If it’s a unit test library executing in-place and you need references copied to its output directory, use runtimes and not supports

Now, take note of the packages and versions you already have installed. You might want to copy/paste your packages.config file into a temporary editor window.

The next step is to remove all of your existing packages from your project. There are two ways to do this: via the NuGet package manager console or by hand.

Using the NuGet Package Manager Console

Pull up the NuGet Package Manager Console and ensure the drop-down is set to the project you’re working on. Uninstall each package in the project with the following command:
Uninstall-Package <package name> -Force -RemoveDependencies
Repeat this for each package until they’re all gone.

By Hand

Delete your packages.config file, save the project file, then right-click the project and choose “Unload Project”. Now right-click the project and select “Edit”. We need to clean up a few things in the project file.

  • At the top of the project file, remove any .props files that were added by NuGet (look for the ones pointing to a \packages directory).
  • Find any <Reference> element whose HintPath points to a NuGet package library. Remove all of them.
  • At the bottom of the file, remove any .targets files that NuGet added. Also remove any NuGet targets or tasks that NuGet added (such as a target that starts with <Target Name="EnsureNuGetPackageBuildImports" BeforeTargets="PrepareForBuild">). A sketch of these elements follows this list.
  • If you have any packages that contain Roslyn analyzers, make sure to remove any analyzer items that come from them.
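
As a rough illustration, the leftover elements typically look something like this (the package name, version, and paths are made up; yours will differ):

<Import Project="..\packages\SomePackage.1.0.0\build\SomePackage.props" Condition="Exists('..\packages\SomePackage.1.0.0\build\SomePackage.props')" />

<Reference Include="SomePackage">
  <HintPath>..\packages\SomePackage.1.0.0\lib\net45\SomePackage.dll</HintPath>
</Reference>

<Import Project="..\packages\SomePackage.1.0.0\build\SomePackage.targets" Condition="Exists('..\packages\SomePackage.1.0.0\build\SomePackage.targets')" />

<Target Name="EnsureNuGetPackageBuildImports" BeforeTargets="PrepareForBuild">
  <Error Condition="!Exists('..\packages\SomePackage.1.0.0\build\SomePackage.props')" Text="This project references NuGet package(s) that are missing on this computer." />
</Target>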

Save your changes, right click the project in the solution explorer and reload the project.

Adding the project.json

In your project, add a new blank project.json file using one of the templates above. Ensure that its Build Action is set to None (which should be the default). Once it’s present, you may need to unload and reload the project for NuGet to recognize it, so save your project, then right-click it in Solution Explorer to unload and reload it.

Now you can either use the Manage NuGet Packages UI to re-add your packages or add them to the project.json by hand. Remember, you don’t have to re-add every package, only the top-level ones. For example, if you use Reactive Extensions, you only need Rx-Main, not the four other packages it pulls in.
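
A hand-edited project.json for that Rx example might look something like this (the version shown is illustrative; use whatever your packages.config previously listed):

{
    "dependencies": {
        "Rx-Main": "2.2.5"
    },
    "frameworks": {
        "net452": { }
    },
    "runtimes": {
        "win": { }
    }
}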

Build your project. If there are any errors related to NuGet, the error messages should guide you to the answer. Your project should build.

What you’ll notice for projects other than desktop .NET executables or UWP appx’s is that the output directory no longer contains every referenced library. This saves disk space and speeds up the build by eliminating extra file copying. If you want the files in the output directory, such as for unit test libraries that need to execute in place or for an application itself, there are two extra steps to take:

  1. Unload the project once more and edit it to add the following to the first <PropertyGroup> at the top of the project file: <CopyNuGetImplementations>true</CopyNuGetImplementations>. This tells NuGet to copy all required implementation files to the output directory.
  2. Save and reload the project file. You’ll next need to add the runtimes section from above. The exact contents depend on your project type; rather than list them all here, please see Zeroconf or xUnit for Devices for full examples.
    • For an AnyCPU desktop .NET project, win is sufficient
    • For Windows Store projects, you’ll need more (a sketch follows this list)
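
For example, a Windows Store/UWP application’s runtimes section typically lists each architecture, roughly like this (this matches the set the standard UWP templates generate, including the -aot entries, but verify against your own project):

"runtimes": {
    "win10-arm": { },
    "win10-arm-aot": { },
    "win10-x86": { },
    "win10-x86-aot": { },
    "win10-x64": { },
    "win10-x64-aot": { }
}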

Once you repeat this for all of your projects, you’ll hopefully still have a working build(!) but now one where the projects are using the rich NuGet v3 capabilities. If you have a CI build system, you need to ensure that you’re using the latest nuget.exe to call restore on your solution prior to build. My preference is to always download the latest stable version from the dist link here: https://dist.nuget.org/win-x86-commandline/latest/nuget.exe.

Edge Cases

There may be some edge cases you hit when it comes to the transitive references. If you need to prevent any of the automatic project-to-project propagation of dependencies, the NuGet Docs can help.

In some rare cases, if you start getting compile errors due to missing System references, you may be hitting this bug, currently scheduled to be fixed in the upcoming 3.4 release. This happens if a NuGet package contains a <frameworkAssembly /> dependency that contains a System.* assembly. The workaround for now is to add <IncludeFrameworkReferencesFromNuGet>false</IncludeFrameworkReferencesFromNuGet> to your project file.
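
If you hit that bug, the workaround element just goes into a PropertyGroup in your project file, for example:

<PropertyGroup>
  <IncludeFrameworkReferencesFromNuGet>false</IncludeFrameworkReferencesFromNuGet>
</PropertyGroup>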

What this doesn’t do

There is often confusion between project.json and its relation to the DNX/CLI project tooling that enables cross-compilation to different sets of targets. Visual Studio 2015 uses a new project type (.xproj) as a wrapper for these. This post isn’t about enabling an existing .csproj or .vbproj project type (the ones most people have been using for “regular” projects) to start cross-compiling. Converting an existing project to use .xproj is a topic for another day, and not all project types are supported by .xproj.

What this does do is enable the NuGet v3 features to be used by the existing project types today. If you have a .NET 4.6 desktop project, this will not change that. Likewise if your project is using the Xamarin Android 6 SDK, this won’t alter that either. It’ll simply make package management easier.

Acknowledgments

I would like to thank Andrew Arnott for his persistence in figuring out how to make this all work. He explained it to me as he was figuring it out and then recently helped to review this post. Thanks Andrew! A shout out is also due to Scott Dorman and Jason Malinowski for their valuable feedback reviewing this post.

Enabling source code debugging for your NuGet packages with GitLink

September 23, 2015


Recently on Twitter, someone was complaining that their CI builds were failing due to SymbolSource.org either being down or rejecting their packages. Fortunately, there’s a better way than using SymbolSource if you’re using a public Git repo (like GitHub) to host your project — GitLink.

Symbols, SymbolSource and NuGet

Hopefully by now, most of you know that you need to create symbols (PDB’s) for your release libraries in addition to your debug builds. Having symbols helps your users troubleshoot issues that may crop up when they’re using your library. Without symbols, you need to rely on hacks, like using dotPeek as a Symbol Server. It’s a hack because the generated source code usually doesn’t match the original, and it certainly doesn’t include any helpful comments (you do comment your code, right?)

So you’ve updated your project build properties to create symbols for release; now you need someplace to put them so your users can get them. Until recently, the easiest way has been to publish them on SymbolSource. You’d include the pdb files in your NuGet NuSpec and then run nuget pack MyLibrary.nuspec -symbols. NuGet then creates two packages: one with your library and one with just the symbols. If you then run nuget push MyLibrary.1.0.0.nupkg and a symbols package is sitting alongside it, NuGet will push the symbols package to SymbolSource rather than NuGet.org. If you’re lucky, things will just work. However, sometimes SymbolSource doesn’t like your PDB’s and your push will fail.

The issues

While SymbolSource is a great tool, there are some shortcomings:
* It requires manual configuration by the library consumer, who has to know to go into VS and add the SymbolSource URL to the symbol search path.
* It slows down your debugging experience. By default, VS checks every configured symbol server for matching PDB’s, which leads many people to either disable symbol loading entirely or load symbols selectively. Even with selective loading, the load is still slow because VS has no way to know which symbol server a PDB might be on and must check all of them.
* It doesn’t enable source code debugging. PDB’s can be indexed to map original source code file metadata into them (the file location, not the contents). If you’ve source-indexed your PDB’s and the user has source server support enabled, VS will automatically download the matching source code. This is great for OSS projects with their code on GitHub.

GitLink to the Rescue

GitLink provides us an elegant solution. When GitLink is run after your build step, it detects the current commit (assuming the sln is in a git repo clone), detects the provider (BitBucket and GitHub are currently supported) and indexes the PDB’s to point to the exact source location online. Of course, there are options to specify commits, remote repo location URLs, etc if you need to override the defaults.

After running GitLink, just include the PDB files in your nuspec/main nupkg alongside your dll files and you’re done. Upload that whole package to NuGet (and don’t use the -symbols parameter with nuget pack). This also means that users don’t need to configure a symbol server as the source-indexed PDB’s will be alongside the dll — the location VS will auto-load them from.
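
In the nuspec, that just means listing the pdb next to the dll in the files section; a minimal sketch (the file names and target framework here are illustrative):

<files>
  <file src="bin\Release\MyLibrary.dll" target="lib\net45" />
  <file src="bin\Release\MyLibrary.pdb" target="lib\net45" />
</files>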

An example

Over at xUnit and xUnit for Devices, we’ve implemented GitLink as part of our builds. The xUnit builds are set up to run msbuild on an “outer” .msbuild project with high-level targets; we have a GitLink target that runs after our main build target.

As we want the build to be fully automated and not rely on exe’s external to the project, we “install” the GitLink NuGet package on build if necessary.

Here’s the gist of our main CI target, which we invoke with msbuild xunit.msbuild /t:CI (abbreviated for clarity):

<PropertyGroup>
  <SolutionName Condition="'$(SolutionName)' == ''">xunit.vs2015.sln</SolutionName>
  <SolutionDir Condition="'$(SolutionDir)' == '' Or '$(SolutionDir)' == '*Undefined*'">$(MSBuildProjectDirectory)</SolutionDir>
  <NuGetExePath Condition="'$(NuGetExePath)' == ''">$(SolutionDir)\.nuget\nuget.exe</NuGetExePath>
</PropertyGroup>

<Target Name="CI" DependsOnTargets="Clean;PackageRestore;GitLink;Build;Packages" />

<Target Name="PackageRestore" DependsOnTargets="_DownloadNuGet">
  <Message Text="Restoring NuGet packages..." Importance="High" />
  <Exec Command="&quot;$(NuGetExePath)&quot; install gitlink -SolutionDir &quot;$(SolutionDir)&quot; -Verbosity quiet -ExcludeVersion -pre" Condition="!Exists('$(SolutionDir)\packages\gitlink\')" />
  <Exec Command="&quot;$(NuGetExePath)&quot; restore &quot;$(SolutionDir)\$(SolutionName)&quot; -NonInteractive -Source @(PackageSource) -Verbosity quiet" />
</Target>

<Target Name='GitLink'>
  <Exec Command='packages\gitlink\lib\net45\GitLink.exe $(MSBuildThisFileDirectory) -f $(SolutionName) -u https://github.com/xunit/xunit' IgnoreExitCode='true' />
</Target>

<Target Name='Packages'>
  <Exec Command='"$(NuGetExePath)" pack %(NuspecFiles.Identity) -NoPackageAnalysis -NonInteractive -Verbosity quiet' />
</Target>

There are a few things to note from the snippet:
* When installing GitLink, I use the -ExcludeVersion switch so it’s easier to call later in the script without having to update the path each time the version changes.
* I’m currently using -pre as well, since a number of bugs have been fixed since the last stable release.

The end result

If you use xUnit 2.0+ or xUnit for Devices and have source server support enabled in your VS debug settings, VS will let you step into xUnit code seamlessly.

If you do this for your library, your users will thank you.

UWP NuGet Package Dependencies

August 29, 2015


[Updated: 9/15/15 on the NuGet package contents at the end]

In my last post, Targeting .NET Core, I mentioned that NuGet packages targeting .NET Core and using the dotnet TFM need to list their dependencies. What may not be immediately obvious, as this is new behavior for UWP projects, is that UWP packages need to list their BCL dependencies too, not just “regular” NuGet references.

The reason for this is that UWP projects also use .NET Core and may elect to use newer BCL package versions than the default. While the uap10.0 TFM does imply BCL + Windows Runtime, it doesn’t really say what version of the dependencies you get. Instead, that’s in your project.json file, which by default includes the Microsoft.NETCore.UniversalWindowsPlatform v5.0.0 “meta-package”, which pulls in most of the .NET Core libraries at a particular version. But what happens if newer BCL packages are published? Right now, the OSS BCL .NET Core packages are being worked on and they’re a higher version – System.Runtime is 4.0.21-beta*.
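
For context, the default project.json that the UWP templates generate looks roughly like this (a sketch; the runtimes list in particular may differ in your project):

{
  "dependencies": {
    "Microsoft.NETCore.UniversalWindowsPlatform": "5.0.0"
  },
  "frameworks": {
    "uap10.0": { }
  },
  "runtimes": {
    "win10-arm": { },
    "win10-arm-aot": { },
    "win10-x86": { },
    "win10-x86-aot": { },
    "win10-x64": { },
    "win10-x64-aot": { }
  }
}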

In Windows 8.1 and Windows 8, this wasn’t an issue because those platforms each had a fixed set of BCL references; you knew for sure which BCL versions you’d get on each of them. With UWP, that’s no longer true, so you need to specify them.

Fortunately, you don’t have to figure out all of the dependencies by hand. Instead, you can use my handy NuSpec.ReferenceGenerator tool (NuGet|GitHub) to add those dependencies to your NuSpec file.

The ReadMe is fairly detailed, but for the majority of projects, if you have a NuSpec file whose filename matches your project name (like MyAwesomeLibrary.csproj with a MyAwesomeLibrary.nuspec sitting somewhere under the .sln dir), adding the reference should be all you need.

For a UWP class library package, you’ll need the following:

  • A dependency group in your NuSpec with the uap10.0 TFM (a sketch follows this list)
  • In your project’s build options for Release mode, check “generate library layout”
  • Copy the entire directory structure of the output to your package’s \lib\uap10.0 directory
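
A hand-written sketch of what that dependency group might look like; in practice the package ids and versions are generated by NuSpec.ReferenceGenerator from your project’s actual references, so your list will be longer and the versions will differ:

<dependencies>
  <group targetFramework="uap10.0">
    <dependency id="System.Runtime" version="4.0.20" />
    <dependency id="System.Linq" version="4.0.0" />
  </group>
</dependencies>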