Announcing Reactive Extensions for .NET 4.0 Preview 1

May 27, 2017 Coding 1 comment

I am happy to announce that the first preview of Rx.NET 4.0 is now available. This release addresses a number of issues and contains several enhancements.

The biggest enhancement is the consolidation of the existing packages into a single main NuGet package, System.Reactive. This eliminates most of the binding-redirect pain caused by #205. If you are using Rx 4.0 and need to use libraries built against Rx 3.x, you also need to install the compatibility package, System.Reactive.Compatibility. That package contains facades with type forwarders to the new assembly so that types unify correctly. You only need the compatibility package if you are consuming a library built against 3.x; you do not need it otherwise.
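
For example, a project consuming Rx 4.0 alongside a library built against Rx 3.x might reference both packages. This is a sketch; the version numbers shown are illustrative:

<ItemGroup>
  <!-- The consolidated Rx package -->
  <PackageReference Include="System.Reactive" Version="4.0.0-preview1" />
  <!-- Facades with type forwarders; only needed when consuming libraries built against Rx 3.x -->
  <PackageReference Include="System.Reactive.Compatibility" Version="4.0.0-preview1" />
</ItemGroup>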

If you’re interested in the background behind the version numbers, I suggest reading the thread as it contains the gory details. While the idea was technically sound, it did mean that binding redirects were required for every .NET Framework consumer. We heard the feedback loud and clear that this was really painful and took steps to fix it in 4.0.

The fix was to consolidate the previous set of packages into a single System.Reactive package. With the single package, binding redirects are no longer required and the platforms will get the correct Rx package version.

Please try it out and let us know if you encounter any issues at our repo. The full release notes are there too.

Multi-targeting the world: a single project to rule them all

January 4, 2017 Coding 7 comments

Starting with Visual Studio 2017, you can use a single project to build platform-specific libraries for all project types. This post explores why you might want to do this, how to do it, and workarounds for some point-in-time issues in the tooling.

Intro

Since the beginning of .NET Core, the project.json format has enabled multi-targeting, that is, compiling to multiple target frameworks in parallel and creating an output for each. With ASP.NET Core, it’s common to target both net45 and netcoreapp1.0 so you can deploy the site either to the desktop framework, which runs on Windows, or to the CoreCLR, which runs cross-platform. Multi-targeting is nothing more than compiling the same code multiple times, once per target platform. Each target can specify its own dependencies and ifdef‘s, so you can easily tailor the code to the specific platform.

As another example, a library may target netstandard1.0, netstandard1.3, and net45 to enable different levels of functionality depending on the available surface area.

While it was also possible to target UWP, Win8, or profile-based PCLs using project.json, doing so required hacks like private copies of all reference assemblies, WinMD files, and more. Beyond that, some things didn’t work correctly, as some platforms require additional targets to generate extra outputs, like .pri files on UWP for resource lookup. So while technically possible, full multi-targeting was brittle and required you to stay on a very narrow path, avoiding things like resources or GUI elements that require the full tool-chain to process.

Enter MSBuild

With the move to MSBuild as part of the .NET Core Tooling direction change, the picture gets much better, so much so that with VS 2017 RC2, you can correctly multi-target all platform types, including UWP, profile-based PCLs, and Xamarin iOS/Android. Not only that, but by conditionally including/excluding directories based on globs, you can reduce the need for ifdef‘s in many cases.

As part of being open sourced and enabled to run cross-platform, the build targets and tasks required to actually do the build were combined into an SDK. This went along with a drastic simplification of the csproj file to a minimal footprint that will get even smaller over time, like this:

<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <TargetFramework>netcoreapp1.0</TargetFramework>
  </PropertyGroup>
  <ItemGroup>
    <PackageReference Include="Microsoft.NETCore.App" Version="1.0.1" />
  </ItemGroup>
</Project>

Microsoft’s blog details all of the improvements in this area. For lack of a better term, I’ll call projects based on these new tools “SDK style.” The easiest way to identify these “SDK style” projects is to look for the Sdk attribute in the top-level Project element.

Multi-targeting vs. .NET Standard Libraries vs. PCLs

Before we go further, let’s answer a question many people have asked: why would you want to multi-target instead of just using a single portable library, whether that’s .NET Standard or an older profile-based PCL?

There are several answers to that question. First, if your code can all fit within a single .NET Standard-based library, then there’s no reason to multi-target. If you’re using a legacy profile-based PCL, at the very least consider moving up to the equivalent .NET Standard version. Don’t make more work for yourself. The decision to multi-target falls out of a need to use functionality that doesn’t exist within a .NET Standard version, or a need to target an earlier platform that doesn’t support the .NET Standard version you require. A common example is that many libraries still need to support .NET 4.5. Despite the significant amount of functionality available in .NET Standard 1.3, that .NET Standard version only supports .NET 4.6+. Chances are, though, that the code would work “just fine” on .NET 4.5, so it’s easy to multi-target to both net45 and netstandard1.3.

The other main reason to multi-target is to use platform-specific code within your library. For example, on iOS you might want to use SecKeyChain for saved credentials, on Android use its Context to access shared services like preferences, and on Windows the Credential Manager. You might have a common method called GetCredential that other code uses to get the data. Today you might use dependency injection or reflection to access a “.Platform” library with a specific implementation that your common code uses. Instead, you can choose to multi-target and access the platform code directly.

How to multi-target

Let me start by saying that the methods here are based on the new “SDK-style” projects that VS 2017 provides. They orchestrate using the existing project types that are installed by Visual Studio. As such, the build itself won’t work on a box without the other tools installed (so you’re building on a Windows box, much like you probably are today). Some of these may work on a Mac with Visual Studio for Mac but I have not tested that in any way. When you install Visual Studio 2017, make sure to install all of the tools for the project types you need (Xamarin, UWP, etc) and also the .NET Core Tooling.

There’s no UI in VS for adding additional target frameworks, but I have some samples that show what to do.

First, create a new .NET Core Class Library project. If you don’t see the following option, make sure to install the .NET Core workload in the VS Installer.

[Screenshots: the New Project dialog with the .NET Core Class Library template, and the .NET Core workload in the VS Installer]

Right-click the project and select “Edit project file…”. This is new in VS 2017 – the ability to edit the project file while it’s open and have changes instantly reflected.

In the editor, after noticing how much less boilerplate there is now, look for the TargetFramework property, which looks like this: <TargetFramework>netstandard1.3</TargetFramework>. Change it to <TargetFrameworks>netstandard1.3;net45</TargetFrameworks> to target both .NET Standard 1.3 and .NET 4.5. You can add as many targets as you want to that semicolon-delimited list. It’s subtle, but note the difference between the property names: TargetFramework vs. the plural TargetFrameworks. It’s easy to miss.
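
After the change, a minimal multi-targeting class library might look something like this sketch:

<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <!-- Note the plural property name: one TFM per semicolon-separated entry -->
    <TargetFrameworks>netstandard1.3;net45</TargetFrameworks>
  </PropertyGroup>
</Project>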

For some frameworks, like .NET 4.5, that’s all you need to do. However, targeting .NET Standard and .NET 4.x is far from “the world.” We can do better! You would think it should be as easy as adding additional TFMs like uap10.0, xamarin.ios10 or MonoAndroid70 to the list, and hopefully by the time the tools RTM it will be, but for now we need to add extra properties to the project file to tell MSBuild what to do with them.

Fortunately, and here’s the real secret, the “SDK-style” build system has a LanguageTargets property that you can specify per TFM to import the targets for that project type instead of the vanilla Microsoft.CSharp.targets import. That means we can use the “Windows Xaml”, Android, iOS, or any other platform tool-chain we need.
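
For example, for uap10.0 the inner-loop import might look something like the sketch below. The LanguageTargets path is the import you’d find in a standard UWP csproj; the platform-version values are illustrative, and the samples show the full set of properties needed:

<PropertyGroup Condition="'$(TargetFramework)' == 'uap10.0'">
  <!-- Use the Windows Xaml tool-chain instead of the vanilla C# targets -->
  <LanguageTargets>$(MSBuildExtensionsPath)\Microsoft\WindowsXaml\v$(VisualStudioVersion)\Microsoft.Windows.UI.Xaml.CSharp.targets</LanguageTargets>
  <TargetPlatformIdentifier>UAP</TargetPlatformIdentifier>
  <TargetPlatformVersion>10.0.14393.0</TargetPlatformVersion>
  <TargetPlatformMinVersion>10.0.10240.0</TargetPlatformMinVersion>
</PropertyGroup>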

Xamarin Example

In the example here, I have a class library that multi-targets to net45, uap10.0, netstandard1.3, Xamarin.iOS10 and MonoAndroid70. In this contrived library, I have a Greeter class calling a Hello() method that needs platform-specific code. I’m using a pattern with a directory per TFM, where code in that directory is only included for that target, so no ifdef‘s are needed. For Android, Resources are supported if you need them. While the example doesn’t currently use them, you could use plists, xibs, or storyboards on iOS, Pages on UWP, or any other “native” file type supported by the platform.
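
The conditional includes follow this shape (a sketch; the Platforms directory name and TFM strings are assumptions, so check the sample for the real layout):

<!-- Remove platform code from the default globs; keep it visible in Solution Explorer as None -->
<ItemGroup>
  <Compile Remove="Platforms\**\*.cs" />
  <None Include="Platforms\**\*.cs" />
</ItemGroup>

<!-- Re-include only the directory matching the current inner-loop TFM -->
<ItemGroup Condition="'$(TargetFramework)' == 'monoandroid70'">
  <Compile Include="Platforms\MonoAndroid70\**\*.cs" />
</ItemGroup>
<ItemGroup Condition="'$(TargetFramework)' == 'xamarin.ios10'">
  <Compile Include="Platforms\Xamarin.iOS10\**\*.cs" />
</ItemGroup>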

Win81/WP8/PCL/Wpa81/Xamarin/Net45 Example

As a more realistic example, one of my libraries, Zeroconf, an mDNS discovery library, targets “the world.” It currently has concrete implementations for wp8, Wpa81, Win8, portable-Wpa81+Win81, uap10.0, net45, and netstandard1.3 (which supports Xamarin and CoreCLR). In addition to the concrete implementations, it provides a netstandard1.0 façade to support being used in portable libraries. The different concrete implementations are required due to differences between the various Windows networking stacks. For now, the uap10.0 version cannot use the netstandard1.3 version until NetworkInformation is fully supported by the platform, so it continues to use the WinRT variant. You can see the platform-specific code in the platforms directory and how it’s conditionally included by the csproj in the ItemGroups.

The property groups at the top contain the LanguageTargets and other properties needed. For portable-Wpa81+Win81, two extra items are required, as that special PCL profile also supports WinRT. The ItemGroup here has two TargetPlatform items to pull in the correct .winmd references.

Building

You can build the libraries either in VS 2017 or on the command line. If you use the command line, run the following from a VS 2017 Developer Command Prompt: msbuild /t:restore followed by msbuild /t:build. If you want to create a NuGet package, run msbuild /t:pack. It’s important to note that you must currently use msbuild, the desktop version in the VS 2017 path, to build these, not dotnet build. The reason is that while dotnet build calls MSBuild, it currently uses a CoreCLR version even though the desktop version is present in your VS installation. The engineering team is aware of this, and in the future dotnet build will be smart enough to call the desktop version of MSBuild when present. The “regular” targets files we’re using to support the platform-specific features are designed for desktop MSBuild; they do not yet support CoreCLR tasks. Bottom line, as of the current release: if your targets use build tasks, you need to provide both CoreCLR and desktop versions of the task library in order to support both “regular” MSBuild and dotnet build.

Common gotchas

There are several bugs in the tool-chain currently that are in the process of being fixed:

  • Some Project-to-project (p2p) references aren’t resolving correctly. Whereas they should resolve to the “best” match, they are resolving to the first TFM in the list.
  • Another bug is preventing a “legacy” csproj from doing a p2p reference with a “Portable Library can only reference other portable library” error.
  • Files that are conditionally included won’t show up in the Solution Explorer. As a workaround, include all files with None as the first item group (see example).
  • For iOS (and possibly Android), you need to set DebugType to full, as the Xamarin ConvertPdb2Mdb task doesn’t yet support the new Portable PDB format generated by this tool-chain.
  • Win8, Win81, and uap10.0 aren’t correctly understood by the NuGet targets today. As a workaround, you need to include the NugetTargetMoniker property set to the full TFM, as shown here and sketched just after this list. Similarly, legacy PCL targets require Version=v0.0 in the NugetTargetMoniker here. These should hopefully be fixed by GA.
  • Windows assemblies that use resources need a .pri file alongside them. They’re currently missing from the generated NuGet. Workaround is to use your own .NuSpec for now until the bug is fixed.
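
Here is a sketch of the NugetTargetMoniker workaround mentioned in the list above. The PCL profile shown is an assumption for Wpa81+Win81; check the linked issues for the monikers your targets need:

<PropertyGroup Condition="'$(TargetFramework)' == 'uap10.0'">
  <NugetTargetMoniker>UAP,Version=v10.0</NugetTargetMoniker>
</PropertyGroup>
<PropertyGroup Condition="'$(TargetFramework)' == 'portable-win81+wpa81'">
  <NugetTargetMoniker>.NETPortable,Version=v0.0,Profile=Profile32</NugetTargetMoniker>
</PropertyGroup>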

Into the weeds, how it all works

This is by no means an official explanation; it’s what I’ve found from exploring the SDK build targets. Some of the terminology and concepts may change over time.

The “SDK style” projects consist of a set of targets/tasks that are pre-installed with MSBuild (and the CLI tools). You can see them in the following directory: C:\Program Files (x86)\Microsoft Visual Studio\2017\<sku>\MSBuild\Sdks where <sku> is Community, Professional, or Enterprise, depending on what you installed. The two SDK’s you’re likely to use directly are Microsoft.NET.Sdk and Microsoft.NET.Sdk.Web.

The Sdk attribute causes an Sdk.props and Sdk.targets within the specified SDK’s \Sdk directory to be imported before and after the project file. The Microsoft.NET.Sdk SDK’s targets define an “outer” and an “inner” build. The “outer-loop” is what your project file directly defines, including several TFMs in the TargetFrameworks property. If you only have a single build, with a TargetFramework property defined, then there’s only an “inner-loop.”
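
Conceptually, the Sdk attribute is shorthand for a pair of explicit imports wrapping your project content, roughly like this:

<Project>
  <Import Project="Sdk.props" Sdk="Microsoft.NET.Sdk" />

  <!-- your properties and items go here -->

  <Import Project="Sdk.targets" Sdk="Microsoft.NET.Sdk" />
</Project>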

For an “outer-loop” build, the SDK targets import props/targets in a buildCrossTargeting directory (soon to be renamed buildMultiTargeting). Those get auto-included before and after the main project file (props before, targets after). The “outer-loop” targets will eventually loop through each of the TargetFrameworks, calling msbuild again in an “inner-loop” with TargetFramework set to a single TFM. This “inner-loop” build is what we currently have in today’s “normal” project types. The “inner-loop” build provides an extension point for supplying your language-specific targets (the Import that was at the bottom of your old csproj) in place of the “vanilla” one it includes by default. By providing a LanguageTargets property for the “inner-loop,” conditioned by TFM, we can use the “original” targets that invoke the full tool-chain for the target platform. See here, here, and here for UWP, iOS, and Android, respectively.

Within each conditionally defined property group, we can set properties that are specific to a particular “inner-loop.” These correspond to the properties in your existing platform-specific project file and are used by the platform-specific targets specified.

One thing you give up currently is any UI in VS for configuring these properties. Perhaps it will return sometime in the future. For now, one thing I’ve found helpful is to maintain a few “dummy” projects where I can edit settings in the UI, see the resulting values, and then copy them into my multi-targeting csproj.

Looking forward

As of today (January 4, 2017), the tooling is in a fairly rough state. The .NET Core tooling is rightfully in an “alpha” state. The MSBuild SDK is under active development and things will change before GA. There are a number of issues in the tooling that can make it hard to use today, but I expect those to be fixed soon. Most of the bugs I’ve found are slated to be fixed in the RC3 time-frame, and I’d expect things to be better with that release.

As to whether or not to take the plunge today: I’d suggest that if you have a tolerance for figuring this out and reporting the issues you’ll encounter, then go for it. If you have a complex project today that already multi-targets a different way (most likely by using multiple “head” projects and shared-code project types), I would recommend trying this out in a branch to see how far you get. I’ll be happy to help, just give me a shout. The more the community bangs on this stuff up front, the more issues can be addressed prior to GA.

Acknowledgments

Many thanks to Brad Wilson, Joe Morris, and Daniel Plaisted for reviewing this post and providing feedback.

Authenticode Signing Service and Client

September 12, 2016 Coding 1 comment

Last night I published a new project on GitHub to make it easier to integrate Authenticode signing into a CI process by providing a secured API for submitting artifacts to be signed by a code signing cert held on the server. It uses Azure AD with two application entries for security:

  1. One registration for the service itself
  2. One registration to represent each code signing client you want to allow

Azure AD was chosen as it makes it easy to restrict access to a single application/user in a secure way. Azure App Services also provide a secure location to store certificates, so the combination works well.

The service currently supports either individual files, or a zip archive that contains supported files to sign (works well for NuGet packages). The service code is easy to extend if additional filters or functionality is required.

Supported File Types

  • .msi, .msp, .msm, .cab, .dll, .exe, .sys, .vxd and any PE file (via SignTool)
  • .ps1 and .psm1 via Set-AuthenticodeSignature

Deployment

You will need an Azure AD tenant. These are free if you don’t already have one. In the “old” Azure Portal, you’ll need to create two application entries: one for the server and one for your client.

Azure AD Configuration

Server

Create a new application entry for a web/API application. Use whatever you want for the sign-on URI and App ID URI (but remember what you use for the App ID URI, as you’ll need it later). On the application properties, edit the manifest to add an application role.

In the appRoles element, add something like the following:

{
  "allowedMemberTypes": [
    "Application"
  ],
  "displayName": "Code Sign App",
  "id": "<insert guid here>",
  "isEnabled": true,
  "description": "Application that can sign code",
  "value": "application_access"
}

After updating the manifest, you’ll likely want to edit the application configuration to enable “user assignment.” This means that only assigned users and applications can get an access token to/for this service. Otherwise, anyone who can authenticate in your directory can call the service.

Client

Create a new application entry to represent your client application. The client will use the “client credentials” flow to log in to Azure AD and access the service as itself. For the application type, also choose “web/api” and use anything you want for the App ID URI and sign-on URL.

Under application access, click “Add application” and browse for your service (you might need to hit the circled check to show all). Choose your service app and select the application permission.



Finally, create a new client secret and save the value for later (along with the client id of your app).

Server Configuration

Create a new App Service on Azure (I used a B1 for this as it’s not high-load). Build/deploy the service however you see fit. I used VSTS connected to this GitHub repo along with a Release Management build to auto-deploy to my site.

In the Azure App Service, in the certificates area, upload your code signing certificate and take note of the thumbprint id. In the Azure App Service, go to the settings section and add the following setting entries:

  • CertificateInfo:Thumbprint: thumbprint of the cert to sign with
  • CertificateInfo:TimeStampUrl: URL of the timestamp server
  • WEBSITE_LOAD_CERTIFICATES: thumbprint of your cert; this exposes the cert’s private key to your app in the user store
  • Authentication:AzureAd:Audience: App ID URI of your service from its application entry
  • Authentication:AzureAd:ClientId: client ID of your service app from its application entry
  • Authentication:AzureAd:TenantId: Azure AD tenant ID, either the GUID or a name like mydirectory.onmicrosoft.com

Enable “always on” if you’d like, disable PHP, then save your changes. Your service should now be configured.

Client Configuration

The client is distributed via NuGet and uses both a JSON config file and command-line parameters. Common settings, like the client ID and service URL, are stored in the config file, while per-file parameters and the client secret are passed on the command line.

You’ll need to create an appsettings.json similar to the following:

{
  "SignClient": {
    "AzureAd": {
      "AADInstance": "https://login.microsoftonline.com/",
      "ClientId": "<client id of your client app entry>",
      "TenantId": "<guid or domain name>"
    },
    "Service": {
      "Url": "https://<your-service>.azurewebsites.net/",
      "ResourceId": "<app id uri of your service>"
    }
  }
}

Then, somewhere in your build, you’ll need to call the client tool. I use AppVeyor and have the following in my yml:

environment:
  SignClientSecret:
    secure: <the encrypted client secret using the appveyor secret encryption tool>

install: 
  - cmd: appveyor DownloadFile https://dist.nuget.org/win-x86-commandline/v3.5.0-rc1/NuGet.exe
  - cmd: nuget install SignClient -Version 0.5.0-beta3 -SolutionDir %APPVEYOR_BUILD_FOLDER% -Verbosity quiet -ExcludeVersion -pre

build: 
 ...

after_build:
  - cmd: nuget pack nuget\Zeroconf.nuspec -version "%GitVersion_NuGetVersion%-bld%GitVersion_BuildMetaDataPadded%" -prop "target=%CONFIGURATION%" -NoPackageAnalysis
  - ps: '.\SignClient\SignPackage.ps1'
  - cmd: appveyor PushArtifact "Zeroconf.%GitVersion_NuGetVersion%-bld%GitVersion_BuildMetaDataPadded%.nupkg"  

SignPackage.ps1 looks like this:

$currentDirectory = split-path $MyInvocation.MyCommand.Definition

# See if we have the ClientSecret available
if([string]::IsNullOrEmpty($env:SignClientSecret)){
    Write-Host "Client Secret not found, not signing packages"
    return;
}

# Setup Variables we need to pass into the sign client tool

$appSettings = "$currentDirectory\appsettings.json"

$appPath = "$currentDirectory\..\packages\SignClient\tools\SignClient.dll"

# Find all of the nupkg files to submit
$nupkgs = ls $currentDirectory\..\*.nupkg | Select -ExpandProperty FullName

foreach ($nupkg in $nupkgs){
    Write-Host "Submitting $nupkg for signing"

    dotnet $appPath 'zip' -c $appSettings -i $nupkg -s $env:SignClientSecret -n 'Zeroconf' -d 'Zeroconf' -u 'https://github.com/onovotny/zeroconf' 

    Write-Host "Finished signing $nupkg"
}

Write-Host "Sign-package complete"

The parameters to the signing client are as follows. There are two modes, file for a single file and zip for a zip-type archive:

usage: SignClient <command> [<args>]

    file    Single file
    zip     Zip-type file (NuGet, etc)

File mode:

usage: SignClient file [-c <arg>] [-i <arg>] [-o <arg>] [-h <arg>]
                  [-s <arg>] [-n <arg>] [-d <arg>] [-u <arg>]

    -c, --config <arg>            Full path to config json file
    -i, --input <arg>             Full path to input file
    -o, --output <arg>            Full path to output file. May be same
                                  as input to overwrite. Defaults to
                                  input file if omitted
    -h, --hashmode <arg>          Hash mode: either dual or Sha256.
                                  Default is dual, to sign with both
                                  Sha-1 and Sha-256 for files that
                                  support it. For files that don't
                                  support dual, Sha-256 is used
    -s, --secret <arg>            Client Secret
    -n, --name <arg>              Name of project for tracking
    -d, --description <arg>       Description
    -u, --descriptionUrl <arg>    Description Url

Zip-type archive mode, including NuGet:

usage: SignClient zip [-c <arg>] [-i <arg>] [-o <arg>] [-h <arg>]
                  [-f <arg>] [-s <arg>] [-n <arg>] [-d <arg>] [-u <arg>]

    -c, --config <arg>            Full path to config json file
    -i, --input <arg>             Full path to input file
    -o, --output <arg>            Full path to output file. May be same
                                  as input to overwrite
    -h, --hashmode <arg>          Hash mode: either dual or Sha256.
                                  Default is dual, to sign with both
                                  Sha-1 and Sha-256 for files that
                                  support it. For files that don't
                                  support dual, Sha-256 is used
    -f, --filter <arg>            Full path to file containing paths of
                                  files to sign within an archive
    -s, --secret <arg>            Client Secret
    -n, --name <arg>              Name of project for tracking
    -d, --description <arg>       Description
    -u, --descriptionUrl <arg>    Description Url

Contributing

I’m very much open to any collaboration and contributions to this tool to enable additional scenarios. Pull requests are welcome, though please open an issue to discuss first. Security reviews are also much appreciated!

Targeting .NET Core

July 29, 2015 Coding 7 comments

Problem

Since DNX was announced, library authors have been inundated with requests to support .NET Core and the CoreCLR. Up until now, the only real option was to use the DNX-based project.json build system with the Visual Studio xproj project type. Adding these project types to an existing project that already supports a wide range of platform targets can be challenging. There are a few issues with the current approach:

  • Not all project types can be built with project.json.
  • It’s been a moving target, as DNX is rightfully still in beta.
  • Without proper guidance, authors have been targeting dnxcore50 in packages intended for .NET Core instead of dotnet.
  • To be fair, dotnet is a recent addition that has been little publicized.

Starting today though, there’s a better way. Just make sure to install the Windows developer tooling as it includes this new functionality.

Terminology

If we go back to the .NET Core presentation from November, you may remember this diagram:

In terms of terminology, .NET Core should be your target; CoreCLR is just a runtime. Referring to the diagram, the dnxcore50 Target Framework Moniker refers to the box in the upper right: the ASP.NET 5 app model, that is, the BCL plus DNX-specific libraries. Similarly, uap10.0 is the Windows Universal app model: the BCL plus the Windows Runtime.

Many (most?) libraries do not actually need the DNX or WinRT dependencies. All they really need are the BCL libraries. What, then, is the right target? The answer is dotnet. By using dotnet, you instead specify your dependencies in your NuGet package, and your package will then run on any supported runtime, including CoreCLR, .NET Native, and .NET 4.6 (assuming you’re using the newest BCL packages).

Existing Libraries

What has been lost in the commotion around DNX, CoreCLR, and .NET Core is the fact that “Profile 259”-compatible Portable Class Libraries (class libraries that target a minimum of .NET 4.5, Windows 8, and Windows Phone 8) can run on CoreCLR as-is. You do not need to create a new project or target newer contract/BCL references. All you need to do is put your existing library into \lib\dotnet in your NuGet package, in addition to the \lib\portable-* directory it is in now, and list your dependencies in the package.

The only time you might need a new project is if you have platform-specific code. In that case, the new UWP tools for Windows 10 offer a better option: “modern PCLs.” Once you install the UWP tools, create a new Class Library (Portable) in your solution and make sure only .NET 4.6, Windows Universal 10, and ASP.NET 5 are checked. When you do that, you’ll get a modern PCL that uses project.json and pulls in the newest .NET Core packages as references. You can then use linked files, shared projects, and your existing techniques to build a class library that targets .NET Core. Then put that in your \lib\dotnet directory and create the dependencies element for it. No magic needed. Using this technique, I was able to adapt several OSS libraries to support .NET Core in very little time.

NuGet Dependencies – the heart of dotnet

As I described in my previous post, the key to making dotnet work is specifying all of your dependencies. This can be a tedious and error-prone process, so I’ve built a tool, NuSpec.ReferenceGenerator, that automates creation of the dependency element for the majority of cases. The tool works with both existing compatible PCL projects and the new “modern PCL” projects.

Just add the NuSpec.ReferenceGenerator NuGet package to your project and build. I won’t go over all of the docs here; you can find those on the project site.

At build time, the tool will read the references your assembly requires, determine the source NuGet package and version, and create the <dependencies> element in the NuSpec.
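
The generated element looks something like this. The package ids and versions here are illustrative; the tool emits whatever your assembly actually references:

<dependencies>
  <group targetFramework="dotnet">
    <dependency id="System.Runtime" version="4.0.20" />
    <dependency id="System.Linq" version="4.0.0" />
    <dependency id="System.Threading.Tasks" version="4.0.10" />
  </group>
</dependencies>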

Call To Action

  • If you maintain a library, review any areas where you are currently targeting dnxcore50 and update your NuGet package to put those bits in dotnet. If you are not using any Microsoft.Dnx references, and the majority of libraries do not, then there’s no reason to target dnxcore50 when dotnet reaches a far broader set of targets.
    • Bonus: by using the “Modern PCL” projects and/or reusing your existing PCL, your dependencies will be the stable versions, not pre-release. That means your package can be stable too, without waiting until Q1 2016!
  • If you currently have a library that’s a “System.Runtime”-based PCL, one that’s at least portable-win8+net45+wp8, then simply add a copy of the binary to your NuGet package in the dotnet directory (see the sketch after this list). Adding it to \lib\dotnet and leaving a copy in lib\portable-win8+net45+wp8 allows it to work with .NET Core and the existing NuGet v2 clients.
  • Ensure your NuGet package lists all of its dependencies in a <dependencies targetFramework="dotnet"> element. Use the stable package versions, not the DNX pre-release versions. If you don’t want to create and maintain this by hand, use my ReferenceGenerator.
  • Last, but most importantly, make sure your nuget.exe version is up-to-date by running nuget update -self. Version 2.8.6 or later is required to properly package dotnet.
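
As a sketch of the packaging layout described above (file and package names are placeholders), the same binary is shipped in both locations:

<files>
  <!-- Existing location, understood by NuGet v2 clients -->
  <file src="bin\Release\MyLibrary.dll" target="lib\portable-win8+net45+wp8" />
  <!-- New location for .NET Core-aware NuGet v3 clients -->
  <file src="bin\Release\MyLibrary.dll" target="lib\dotnet" />
</files>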

Demystifying PCLs, .NET Core, DNX and UWP (Redux)

June 16, 2015 Coding 1 comment

[Disclaimer: Many of the things I talk about here may not work in the RC of Visual Studio 2015. The information is taken from Microsoft’s public repos on GitHub and from conversations with members of the .NET team. The information herein is accurate at the time of writing but as with everything pre-release, things may change!]

Intro

A few days ago, I posted an article trying to explain my current understanding of how the new .NET Core libraries fit into the existing ecosystem. Since then, I’ve had more conversations with a few people on the .NET team (many thanks to David Kean and Eric St. John!) that clarified the meaning of the dotnet target framework and how the pieces all fit together. This post will attempt to explain further.

TL;DR

dotnet is not a specific target framework—it means “I’m compatible with any target framework that my dependencies are compatible with.” Read on for more.

Let’s start at the very beginning (a very good place to start!)

To help explain where things are going, it helps to have some background for context. Before we had any such thing as Portable Class Libraries (PCLs), if we wanted to use a library on multiple frameworks, we had to compile it multiple times. The figure below illustrates the state of the world circa 2010.
Before PCLs

The only real strategy for code sharing was to use linked files and many #ifdefs, as there were wide differences in capabilities between the frameworks. A solution would contain multiple projects, one per target framework. Each project would contain platform-specific references and would generate a binary compatible only with its target platform. This situation was not scalable as future frameworks and platforms would only lead to even more file linking.

The birth of PCLs

In early 2011, Microsoft released the first version of Portable Class Libraries as a toolset for Visual Studio 2010. These tools allowed creation of a single binary targeting the .NET Framework, Silverlight, Windows Phone 7, and Xbox 360. They accomplished this by finding the lowest common denominator of functionality shared among the target frameworks. The available functionality changed to match your selection:
PCL target framework dialog

From this early start, the tools grew over time. Visual Studio 2012 included support for PCLs without the need for an add-in. The list of target frameworks and versions increased; now you could choose .NET Framework 4 or 4.5. You could choose Silverlight 4 or Silverlight 5. Windows Phone gained options for 7.5, 8 and 8.1. We saw support added for additional platforms like Windows 8 and 8.1 Store applications. In 2013, Windows Phone App 8.1 made its first appearance. In early 2014 Xamarin added support for Portable Class Libraries, providing additional target frameworks for their iOS and Android platforms.

Making the sausage

They say that if you enjoy eating sausage, you should never see how it’s made. I personally don’t find ignorance to be bliss and strive to understand how things are made. The same could be said for PCLs—don’t look under the covers unless you’re prepared for what you may see! As one might imagine, there’s quite a bit going on to enable PCLs. In the current system, there are really two main components: contract assemblies and profiles.

Contract Assemblies

Contract assemblies are a special kind of assembly that contains types/metadata but no actual implementation. Think of this as a compile-time reference. A library can reference one or more contract assemblies and the compiler will use the type information in the file. At runtime, when a type is requested from the contract assembly, the loader sees either a TypeForwarder pointing to a concrete implementation or assembly metadata indicating redirection is allowed for the library. The indirection enables types to live in different assemblies in the implementation (think Silverlight vs .NET) but be referenced from a single dll. It also enables the runtime to substitute one type for another even if the assembly versions don’t match.

The best way to think of a contract assembly is like a promise that a specified surface area is present. Your library can reference that assembly and then it’ll run on any target framework that implements that contract. Not all target frameworks support all versions of a particular contract. When working with a least-common-denominator based system, like PCLs, you’ll see fewer types available when you check more/older target frameworks. What Microsoft has done is pre-generate all of the permutations of those checkboxes so that you have a contract assembly for each possible option.

Profiles

That leads us squarely into PCL profiles. These are the things like Profile259 or Profile78 that people most associate with PCLs. In order to support every permutation of target frameworks that you, as a library author, want to choose, Microsoft pre-computed over fifty profiles to date. The profiles are collections of contract assemblies that represent the intersections of the public surface area from the targets. What people really mean by saying Profile259 is that they’re targeting .NET 4.5, Windows 8, Windows Phone 8 Silverlight and Windows Phone 8.1. The number is just a shorthand for spelling out each target framework. It was never really the intent for the profiles to be what people talked about, it was always supposed to be about the target platforms.

What each profile represents, then, is a set of contract assemblies supported by a set of target frameworks. The profiles, in sum, represent every combination of possible contract assemblies. Taken one step further, what ultimately matters to a library isn’t the target framework; rather, what matters to a library are the contracts available to it through the selected set of target frameworks. The profile itself is just a transitive way to get that set of contracts.

Enter the NuGet

It’s not possible to have a complete discussion about PCLs without mentioning NuGet. In parallel to the rise of PCLs, community support was growing around using NuGet (and, by extension, its package format) as the de facto way of distributing library components. One of NuGet’s key features is the ability to support multiple target platform versions within a single package. NuGet accomplishes this by using Target Framework Monikers (TFMs) that represent each platform. For example, net means .NET Framework, wp is Windows Phone, and netcore is Windows Store. NuGet adds a version number to the TFM so that we get the common usage: net45, wp8, and netcore451, which translate to .NET 4.5, Windows Phone 8, and .NET Core 4.5.1 (Windows 8.1), respectively. PCLs are supported in NuGet by using the portable TFM combined with the set of supported TFMs that the library targets. Using our earlier example of PCL Profile259, that would be portable-net45+netcore45+wpa81+wp8 inside a NuGet package.

The breaking point

There are two breaking points in this system: 1) library authors need to update their NuGet packages to specify compatible targets, and 2) using pre-computed contracts for PCLs is not scalable. This summer, two new runtimes, CoreCLR and .NET Native, are being introduced; the desktop .NET Framework has a new 4.6 version coming out too. At the same time, a new application platform, the .NET Execution Environment (DNX), on which ASP.NET 5 is based, and a new version of the Windows “modern” platform, the Universal Windows Platform (UWP), are set to appear. It was time for a change. Adding support for UWP and DNX in combination with CoreCLR, desktop .NET, and .NET Native would be untenable with pre-computed contracts. Further, with .NET Core becoming open source and moving to GitHub, .NET 4.6, CoreCLR, and .NET Native would support an application-local Base Class Library (BCL). The surface area available to those newer platforms was poised to explode.

To make the issue concrete, let’s look at an example. Most people are likely familiar with the Newtonsoft.Json NuGet package for working with JSON data. The library, Json.NET, aims to support every .NET platform available. In addition to compiling the code many different times with #ifdefs to accommodate older platforms, as new platforms appear, the Json.NET author needs to update the NuGet package too. That means that as new platforms like UWP and DNX appear, despite targeting a set of contract libraries (remember, all libraries really reference contracts, not platforms), the author needs to keep updating packages to add each new platform to the supported platform list.

What we’re experiencing here is an impedance mismatch between what the library cares about and what NuGet supports. The mismatch highlights, as fundamentally broken, a model that puts the onus on each library author to keep up-to-date with the available platforms and contract-to-platform support matrix. Libraries that would otherwise work on a target platform may not be understood as compatible by NuGet. While it is true that NuGet has a set of heuristics to accommodate additional platforms, the heuristics are also not scalable as they’re hard-coded into each NuGet client version.

Fixing the impedance mismatch: dotnet to the rescue

Over the past year, as “One Microsoft” has taken hold, you’ve started to see the NuGet and .NET CLR teams work much more closely together. Based on community feedback, NuGet was chosen as the de facto mechanism to deliver future versions of .NET that can run as self-contained, app-local packages. In order to support the ever-increasing complexity placed upon it, NuGet had to evolve. You can read more about NuGet’s evolution to 3.0 on the NuGet team blog in posts from April through November 2014.

One of the most recent changes to NuGet, and by extension the .NET ecosystem, is support for the dotnet TFM. The meaning of dotnet wasn’t clear at first; as reflected in my earlier blog post, it seemed like the new target for the “new” portable .NET packages being published to NuGet and consumed by DNX and UWP. The reality isn’t quite like that, and is far more interesting. Rather than dotnet representing a particular target like netcore45, dnxcore5, or net46, it really means “I’m compatible with any targets that my dependencies are; check those.” It gets NuGet out of the platform-guessing game and instead walks the dependency graph.

Practically speaking, the most common set of dependencies for any package will be its contracts – the assemblies referenced at build time. Today, with the platform-TFMs, those contracts don’t need to be listed in the NuGet package as they’re implied by the TFM. With the dotnet-based TFM, NuGet packages will have to specify their dependencies, even system ones. You can see this today with the project.json file that DNX projects use. By explicitly listing the dependencies (which may be CLR contracts), the mismatch between target framework and supported contracts is removed. Instead, each contract package declares its own support by way of its implementation.

The way this is done is beyond the scope of this post, but you can get a sense of it by looking at the layout of the System.IO.FileSystem package below.
System.IO.FileSystem package layout

In the package, you can see two assemblies in the ref folder, called design-time façades: one for .NET 4.6 and one for everything else (CoreCLR, .NET Native, etc.). The surface area is identical, but they function a bit differently. The façades are used at build time to enable portable assemblies that were built against contracts (System.Runtime-based) to resolve those types against the desktop reference assemblies (mscorlib-based). This lets an mscorlib-based assembly pass its version of string, which lives in mscorlib, to an API in a PCL that takes a string from System.Runtime. The same façades are used at runtime as well. This can usually be considered trivia, as most people need not concern themselves with the minutiae.

The package contains three implementations of the contract: one for dnxcore50, one for net46, and one for netcore50 (UWP). When I said earlier that the new .NET Core packages would only support the newer platforms, this is the how/what/why. One last thing to note in the picture above: System.IO.FileSystem itself declares many other dependencies. This is expected; with small, granular libraries, the end result is that you pull in only what you need, not the whole framework.

None of this is to say that dotnet explicitly means the newer platforms, though. Microsoft may release the existing contract assemblies, the ones currently in the Profile* directories, as NuGet packages. If they do, then a library that “targets” dotnet could target .NET 4.5/Win8 as well. The key is that the version number of each existing dependency would be lower than the new ones; the new .NET Core libraries, and their contracts, would all have higher version numbers than the existing contracts.

This drives home the point that what dotnet really means is “check my dependencies and I’ll run on any platform my dependencies do.”

The fact that the new .NET Core libraries use this mechanism is actually orthogonal to dotnet’s meaning. dotnet adds its value today with existing code and libraries by changing the question of “what platforms does my library support” to “what dependencies does my library require?”

Coming back to the earlier example of Json.NET, if it were to use dotnet, it would also declare the contracts, with its version, that it needs. It would not have to know or care about what platforms are currently supported by those contracts. In the future, if some new unicorn platform were to appear, so long as newer versions of the contracts were published that supported the unicorn platform, Json.NET would happily run there without any foreknowledge.

Contracts or Dependencies?

Throughout this discussion, I’ve used the terms contracts and dependencies. From the perspective of a library author or consumer, these terms are often used interchangeably, but there is a difference. Contracts are one type of dependency: specifically crafted reference assemblies. Contracts are useful if you need multiple implementations of a library for different platforms. Aside from the built-in system reference assemblies, the other place you see contracts is in libraries that use the “bait and switch” PCL technique. The vast majority of libraries can be implemented without any platform-specific references and are thus simply dependencies. If this sounds confusing, don’t worry too much about it. This is an advanced technique that most packages don’t need to consider; the only takeaway is that, whether contract or “regular” library, both appear as dependencies in a package.

Wrapping it all up

At first glance, it’s easy to think “whoa, this is complicated!” Upon stepping back, though, hopefully the initial complexity melts away with the newfound understanding that what’s happening here is that a layer is being removed: the platform. Up until NuGet v3, we were trying to cram a round peg into a square hole. We’d gather up an intersection of target frameworks and call it a profile. We’d calculate the contract assemblies for those, and the compiler would reference them, but they stayed firmly in the background. Visual Studio intentionally hides the references behind a single .NET entry in a PCL project’s references. This led to the platform support list being encoded within the NuGet package structure, leaving package authors scrambling to update their packages should a new platform emerge. In many cases, the existing code was already compatible, but a package update was still required. NuGet v3 eliminates this problem by removing the platform layer and going “direct to the dependencies.” This is an opt-in approach for packages that use the new dotnet TFM. Packages can contain both dotnet and the existing TFMs; they are not mutually exclusive.

The new version of .NET Core is dependent on these dependency-driven, framework agnostic packages, but the existing PCL profiles could fit into the model too. That said, dotnet doesn’t mean .NET Core any more than it means any other platform. They’re different things.

Demystifying PCLs, .NET Core, DNX and UWP

June 9, 2015 Coding 6 comments

Since the announcement of .NET Core there’s been confusion around what that means for Portable Class Libraries, runtime support, NuGet support and how these “new” libraries relate to the existing PCLs. At least I was confused.

As ASP.NET 5 started taking shape, we started hearing about new target frameworks for NuGet, like dnxcore50. Other posts mentioned that the new Windows 10 Universal Windows Platform (UWP) would use the new .NET Core 5 libraries too, but that led to the question: what do we call it in NuGet? dnxcore5 is clearly the wrong one, as that refers to the .NET Execution Environment (DNX).

Current NuGet conventions don’t make things any clearer. Today we have the following target framework names:

  • Win: Windows 8 and Windows 8.1
  • Net: .NET Framework
  • Wpa: Windows Phone App 8.1
  • NetCore: also refers to Windows 8 and 8.1
    • NetCore and Win are used interchangeably and mean the same thing

So far, NuGet has added the following over the course of ASP.NET vNext:

  • dnx: the .NET Execution Environment (DNX) on the .NET Framework
  • dnxcore: the .NET Execution Environment (DNX) on the .NET Core CLR

Over the past few days, it seems the .NET Core team has been busy updating the target names, changing dnxcore5 to something new called dotnet. More confusion ensued.

Brice Lambson was kind enough to explain it this afternoon, and it finally all makes sense. So here is my current understanding (don’t take this as official advice!). The new world cleanly distinguishes between the platform (.NET Framework/CoreCLR) and the app model (desktop/ASP.NET/UWP).

  • dotnet: the new .NET Core, for packages that don’t have any app-model requirements
  • net: the existing .NET Framework platform
  • netcore: for UWP apps, based on dotnet plus app-model specifics
  • dnx: ASP.NET 5 apps based on the .NET Framework
  • dnxcore: ASP.NET 5 apps based on the .NET Core framework

These are the targets you’ll most likely care about going forward. Most libraries will want to target dotnet to reach the widest range of consuming apps. dotnet will run on the .NET 4.6 Framework. If you need specific UWP functionality (like XAML in your library), then you’ll need netcore5. If you need ASP.NET-specific items, then you’ll need dnxcore5. If you need something that’s only part of the full .NET Framework, then you’ll need either net46 or dnx46.

This ties into the existing PCL structure by being a new platform. Today you have libraries that support multiple platforms, like portable-net45+netcore45+wpa81. If you want to also include dotnet, it simply becomes portable-net45+netcore45+wpa81+dotnet. If you can afford to target just Windows 10, .NET 4.6, and ASP.NET 5, then keeping the older platforms severely limits your available surface area. In that case, it’s better to target just dotnet, which can then be consumed by all of the modern platforms.

What does this all mean?

The table below should help explain things. The columns represent target frameworks and the rows are platforms/apps. That is, if your library targets x it’ll run on y.