
Continuous Deployment of Cloud Services with VSTS

October 18, 2017 · Coding

In my last blog post, I showed how you can use ASP.NET Core with an Azure Cloud Service Web Role. The next step is to enable CI/CD for it, since you really shouldn’t be using “Publish” within Visual Studio for deployment.

As part of this, I wanted to configure the Cloud Service settings per environment in VSTS and not have any configuration checked-in to source control. Cloud Services’ configuration mechanism makes this a bit challenging due to the way it stores configuration, but with a few extra steps, it’s possible to make it work.

What you’ll need

To follow along, you’ll need the following:

  • Cloud Service: the code can live in GitHub, VSTS, or many other locations; VSTS can build from any of them.
  • Azure Key Vault: we'll use Azure Key Vault to store the secrets. Creating a Key Vault is easy, and the standard tier will work.
  • VSTS: this guide uses Visual Studio Team Services, so you'll need an account there. Accounts are free for up to five users, plus any number of users with MSDN licenses.

What we’re going to do

The gist here is that we’ll create a build definition that publishes the output of the Cloud Service project as an artifact. Then, we’ll create a release management process that takes the output of the build and deploys it to the cloud service in Azure. To handle the configuration, we’ll tokenize the checked-in configuration, then use a release management task to read configuration values stored in Key Vault and replace the matching tokenized values before the Azure deployment.

Moving the configuration into Key Vault

Create a new Key Vault to hold your configuration. You should have one Key Vault per environment that you intend to release to, since the secret names will translate directly to variables within VSTS. For each setting you need, create a secret with a name like CustomSetting-Setting1 or CustomSetting-Setting2 and set its value. Next, in your ServiceConfiguration.Cloud.cscfg, set the values to __CustomSetting-Setting1__ and __CustomSetting-Setting2__. The __ marks the token start/end, and the name in between identifies which VSTS variable should replace it.
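
For example, the ConfigurationSettings section of the .cscfg ends up looking something like this (the CustomSetting__Setting1 setting names follow the convention from the previous post; substitute your own):

<ConfigurationSettings>
  <Setting name="CustomSetting__Setting1" value="__CustomSetting-Setting1__" />
  <Setting name="CustomSetting__Setting2" value="__CustomSetting-Setting2__" />
</ConfigurationSettings>

At release time, the tokenizer task will swap each __...__ token for the value of the matching VSTS variable, which is in turn backed by the Key Vault secret of the same name.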

One tip: if you have Password Encryption certificates or SSL endpoints configured, the .cscfg will have the certificates' SHA-1 thumbprints encoded in it. If you want to configure these per environment, replace them with sentinel values. The configuration checker enforces that each value looks like a thumbprint, so use values like:

  • ABCDEF01234567ABCDEF01234567ABCDEF012345
  • BACDEF01234567ABCDEF01234567ABCDEF012345

Those sentinel values will be replaced with tokens during the build process and those tokens can be replaced with variable values.

We’ll use these in the build task later on.

The build definition

  1. Start with a new Empty build definition.
  2. On the process tab, choose the Hosted VS2017 Agent queue and give your build definition a name.
  3. Select Get Sources and point to your repository. This could be VSTS, GitHub or virtually any other location.
  4. Add the tasks we'll need: Visual Studio Build (three times) and Publish Build Artifacts (once).
  5. For the first Visual Studio Build task, set the following values:
    Display name: Restore solution
    Solution: AspNetCoreCloudService.sln
    Visual Studio Version: Visual Studio 2017
    MSBuild Arguments: /t:restore
    Platform: $(BuildPlatform)
    Configuration: $(BuildConfiguration)
  6. For the second Visual Studio Build task, use the following values:

    Display name: Build solution
    Solution: AspNetCoreCloudService.sln
    Visual Studio Version: Visual Studio 2017
    MSBuild Arguments: (none)
    Platform: $(BuildPlatform)
    Configuration: $(BuildConfiguration)
  7. And the third Visual Studio Build task should be set as:

    Display name: Publish Cloud Service
    Solution: TheCloudService\TheCloudService.ccproj
    Visual Studio Version: Visual Studio 2017
    MSBuild Arguments: /t:Publish /p:OutputPath=$(Build.ArtifactStagingDirectory)\
    Platform: $(BuildPlatform)
    Configuration: $(BuildConfiguration)
  8. If you are using sentinel certificate values, add a PowerShell task. Configure it by selecting "Inline Script", expanding Advanced, and setting the working folder to the publish directory (like $(Build.ArtifactStagingDirectory)\app.publish), then use the following script:

    $file = "ServiceConfiguration.Cloud.cscfg"
    # Read file
    $content = Get-Content -Path $file
    # substitute values
    $content = $content.Replace("ABCDEF01234567ABCDEF01234567ABCDEF012345", "__SslCertificateSha1__")
    $content = $content.Replace("BACDEF01234567ABCDEF01234567ABCDEF012345", "__PasswordEncryption__")
    # Save
    [System.IO.File]::WriteAllText($file, $content)
    

    This replaces the fake SHA-1 thumbprints with tokens that release management will use. Be sure to define variables in release management that match the names you use.

  9. Finally, set the Publish Artifact step to:

    Display name: Publish Artifact: Cloud Service
    Path to Publish: $(Build.ArtifactStagingDirectory)\app.publish
    Artifact Name: TheCloudService
    Artifact Type: Server
  10. Go to the Variables tab and add two variables:

    BuildConfiguration: Release
    BuildPlatform: Any CPU
  11. Hit Save & Queue to save the definition and start a new build. It should complete successfully. If you go to the build artifacts folder, you should see TheCloudService with the .cspkg file in it.

Deploying the build to Azure

This release process depends on one external extension that handles the tokenization, the Release Management Utility Tasks. Install it from the marketplace into your VSTS account before starting this section.

  1. In VSTS, switch to the Releases tab and create a new release definition using the “Azure Cloud Service Deployment” template.
  2. Give the environment a name, like “Cloud Service – Prod”.
  3. Click the “Add artifact” box and select your build definition.

    If you want continuous deployment, click the “lightning bolt” icon and enable the CD trigger.
  4. Click on the Tasks tab and specify an Azure subscription, storage account, service name, and location. If you need to link your existing Azure subscription, click the “Manage” link. If you need a new storage account to hold the deployment artifacts, you can create that in the portal as well; just make sure to create a “Classic” storage account.
  5. Go to the Variables tab and select “Variable groups”, then “Manage variable groups.” Add a new variable group, give it a name like “AspNetCloudService Production Configuration”, select your subscription (click Manage to link one), and select the Key Vault we created earlier to hold the config. Press the Authorize button if prompted.

    Finally, click Add to select which secrets from Key Vault should be added to this variable group.

    It’s important to note that it does not copy the values at this point. The secret values are always read on use, so they’re always current. Save the variable group and return to the release definition. At this point, you can select “Link variable group” and link the one we just created.
  6. Add a Tokenize with XPath/Regular Expressions task before the Azure Deployment task.
  7. In the Tokenizer task, browse to the ServiceConfiguration.Cloud.cscfg file, something like $(System.DefaultWorkingDirectory)/AspNetCoreCloudService-CI/TheCloudService/ServiceConfiguration.Cloud.cscfg depending on what you call your artifacts.
  8. Ensure that the Azure Deployment task is last, and you should be all set.
  9. Create a new release and it should deploy successfully. If you view your cloud service configuration in the Azure Portal, you should see the real values, not the __Tokenized__ placeholders.

Summary

That’s it: you now have an ASP.NET Core Cloud Service deployed to Azure with CI/CD through VSTS. If you want additional environments, simply add another key vault and linked variable group for each one, clone the existing environment configuration in the Release Management editor, and set the appropriate environment-specific values. Variable groups are defined at the release definition level, so for multiple environments you can use a suffix in your variable names and update the PowerShell script in step 8 of the build definition to emit per-environment tokens (__MyVariable-Prod__), etc.

Using ASP.NET Core with Azure Cloud Services

October 16, 2017 · Coding

Overview

Cloud Services may be the old-timer of Azure’s offerings, but there are still some cases where it is useful. For example, today it is the only available PaaS way to run a Windows Server 2016 workload in Azure. Sure, you can run a Windows container with Azure Container Service, but that’s not really PaaS to me. You still have to be fully aware of Kubernetes, DC/OS, or Swarm, and, as with any container, you are responsible for patching the underlying OS image with security updates.

In developing my Code Signing Service, I stumbled upon a hard dependency on Server 2016: the API I needed to Authenticode-sign a file using Azure Key Vault’s signing methods only exists in that version of Windows. That meant Azure App Services was out, as it uses Server 2012 (based on the version numbers from its command line), which left Cloud Service Web Roles as the sole remaining option if I wanted PaaS. I could also have used a B-series VM, which is perfect for this type of workload, but I really don’t want to maintain a VM.

If you have tried to use ASP.NET Core with a Cloud Service Web Role, you’ve probably come away disappointed, as Visual Studio doesn’t let you do this… until now. Never one to accept no for an answer, I found a way to make this work, and with a few workarounds, you can too.

The solution presented here handles deployment of an MVC & API application, along with its config settings, and installs the ASP.NET Core Windows Hosting Module. The VS Cloud Service tooling still works for making changes to config and publishing to cloud services (though please use CI/CD in VSTS!).

Many thanks to Scott Hunter‘s team, Jaques Eloff and Catherine Wang in particular, for figuring out a workaround for some gotchas when installing the Windows Hosting Module.

Pieces to the puzzle

You can see the sample solution here, and it may be helpful to clone and follow along in VS.

There are a few pieces to making this work:

  1. TheWebsite: the ASP.NET Core MVC site. Nothing significantly special here, just an ordinary site.
  2. TheCloudService: the Cloud Service project. Contains the configuration files and service definition.
  3. TheWebRole: an ASP.NET 4.6 project that contains the Web Role startup scripts and “references” TheWebsite. This is where the tricks are.

At a high level, the Cloud Service “sees” TheWebRole as the configured website. The cloud service doesn’t know anything about ASP.NET Core. The trick is to get the ASP.NET Core site published and running “in” an ASP.NET site.

Doing this yourself

The Projects

In a new solution, create a new ASP.NET Core 2 project; it doesn’t really matter which template you use. For the descriptions here, I’ll call it TheWebsite. Build and run the site; it should debug and run normally in IIS Express.

Next, create a new Cloud Service (File -> Add -> New Project -> Cloud -> Azure Cloud Service). I’ll call the cloud service TheCloudService, and on the next dialog, add a single ASP.NET Web Role. I called mine TheWebRole.

Finally, on the ASP.NET Template selection, choose “Empty” and continue.

Right now, we have an ASP.NET Core website and an Azure Cloud Service with a single ASP.NET 4.6 Web Role. Next up is to clear out almost everything from TheWebRole, since it won’t actually contain any ASP.NET code. Delete the packages.config and Web.config files.

Save the project, then select “Unload” from the project’s context menu. Right-click again and select “Edit TheWebRole.csproj”. We need to delete the packages brought in by NuGet along with the imported props and targets. There are three areas to delete: the Import of the NuGet props at the top, all Reference elements with a HintPath pointing to ..\packages\, and the Target at the bottom.

At this point, your project file should look similar to the cleaned-up version in the sample, and you can view the complete diff.

Magic

Now comes the special sauce — we need a way to have TheWebRole build TheWebsite and include TheWebsite‘s publish output as Content. Doing this ensures that TheCloudService Package contains the correct folder layout. Add the following snippet to the bottom of TheWebRole‘s project file to call Publish on our website before the main build step.

<Target Name="BeforeBuild">
  <MSBuild Projects="..\TheWebsite\TheWebsite.csproj" Targets="Publish" Properties="Configuration=$(Configuration)" />
</Target>

Then, add the following ItemGroup to include TheWebsite‘s publish output as Content in the TheWebRole project:

<ItemGroup>
  <Content Include="..\TheWebsite\bin\$(Configuration)\netcoreapp2.0\publish\**\*.*" Link="%(RecursiveDir)%(Filename)%(Extension)" />
</ItemGroup>

Save the csproj file, then right-click TheWebRole and click Reload. You can test that the cloud service package is created correctly by right-clicking TheCloudService and selecting Package. After choosing a build configuration and hitting “Package,” the project should build and the output directory should pop up.

The .cspkg is really a zip file, so extract it and you’ll see the guts of a cloud service package. Look for the .cssx file and extract that as well (again, just a zip file).

Inside, open the approot folder; that is the root of your website. If the previous steps were done correctly, you should see TheWebsite.dll, TheWebsite.PrecompiledViews.dll, wwwroot, and the rest of your files from TheWebsite.

Congratulations, you’ve now created a cloud service that packages up and deploys an ASP.NET Core website! This alone won’t let the site run, though, since the Cloud Service images don’t include the Windows Hosting Module.

Installing .NET Core 2 onto the Web Role

Installing additional components onto a Web Role typically involves a startup script, and .NET Core 2 is no different. There is one complication though: the installer downloads files into the TEMP folder, and Cloud Services has a 100MB hard limit on that folder. We need to specify an alternate folder to use as TEMP with a higher quota (this is what Jaques and Catherine figured out).

In TheCloudService, expand Roles, right click TheWebRole and hit properties. Go to Local Storage and add a new location called CustomTempPath with a 500MB limit (or whatever else your app might need).
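
If you prefer to edit the XML directly, that local resource shows up in ServiceDefinition.csdef as something like:

<LocalResources>
  <LocalStorage name="CustomTempPath" cleanOnRoleRecycle="false" sizeInMB="500" />
</LocalResources>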

Next, we need the startup script. Go to TheWebRole, add a new folder called Startup, and add the startup files from the sample (startup.cmd plus the PowerShell script it runs). Ensure that each file’s Build Action is set to Content and Copy to Output Directory is set to Copy if newer. Finally, we need to configure the cloud service to invoke our startup task. Open the ServiceDefinition.csdef file and add the following XML in the WebRole node to define the startup task:

<Startup>
  <Task commandLine="Startup\startup.cmd" executionContext="elevated" taskType="simple">
    <Environment>
    <Variable name="IsEmulated">
      <RoleInstanceValue xpath="/RoleEnvironment/Deployment/@emulated" />
    </Variable>
    </Environment>
  </Task>
</Startup>
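
The linked files aren’t reproduced here, but the PowerShell side of the startup task boils down to something like this sketch (the hosting bundle URL and installer switches are placeholders; use the actual files from the sample):

# Invoked elevated via startup.cmd. Skip the install when running in the
# local emulator (IsEmulated comes from the csdef above).
if ($env:IsEmulated -eq "true") { exit 0 }

# Point TEMP/TMP at the larger CustomTempPath resource so the installer
# doesn't hit the 100MB TEMP quota.
$customTemp = [Microsoft.WindowsAzure.ServiceRuntime.RoleEnvironment]::GetLocalResource("CustomTempPath").RootPath
$env:TEMP = $customTemp
$env:TMP = $customTemp

# Download and quietly install the .NET Core Windows Server Hosting bundle,
# then restart IIS so the new module is loaded.
$installer = Join-Path $customTemp "DotNetCore.WindowsHosting.exe"
Invoke-WebRequest -Uri "<hosting bundle download URL>" -OutFile $installer
Start-Process $installer -ArgumentList "/quiet", "/install" -Wait
iisreset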

Now we finally have a cloud service that can be deployed, installs .NET Core, and runs the website. The first time you publish, it will take a few minutes for the role instance to become available, since it has to install the hosting module and restart IIS.

Note: I leave creating a cloud service instance in the Azure Portal as an exercise for the reader.

Configuration

There are many ways of getting configuration into an ASP.NET Core application. If you know you’ll only be running in Cloud Services, you may consider taking a direct dependency on the Cloud Services libraries and using the RoleEnvironment types to populate your configuration. Alternatively, you could write a configuration provider that funnels the RoleEnvironment configuration into the ASP.NET Core configuration system.
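
If you go the provider route, a minimal sketch might look something like this (RoleEnvironmentConfigurationProvider is hypothetical; it assumes a reference to Microsoft.WindowsAzure.ServiceRuntime and a known list of setting names):

using System.Collections.Generic;
using Microsoft.Extensions.Configuration;
using Microsoft.WindowsAzure.ServiceRuntime;

public class RoleEnvironmentConfigurationProvider : ConfigurationProvider, IConfigurationSource
{
    private readonly IEnumerable<string> _keys;

    public RoleEnvironmentConfigurationProvider(IEnumerable<string> keys)
    {
        _keys = keys;
    }

    // IConfigurationSource: hand the builder this provider instance
    public IConfigurationProvider Build(IConfigurationBuilder builder) => this;

    public override void Load()
    {
        foreach (var key in _keys)
        {
            // Cloud Service setting names use __ where ASP.NET Core expects :
            Data[key.Replace("__", ":")] = RoleEnvironment.GetConfigurationSettingValue(key);
        }
    }
}

You would then register it in ConfigureAppConfiguration with builder.Add(new RoleEnvironmentConfigurationProvider(keys)).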

In my original case, I didn’t want my ASP.NET Core website to have any awareness of Cloud Services, so I came up with another way: in the startup script, I copy the values from the RoleEnvironment into environment variables that the default configuration settings pick up. The key to making this transparent is knowing that a double underscore (__) translates into a colon (:) when a setting is read from an environment variable. This means you can define a setting like CustomSetting__Setting1 and then access it with Configuration["CustomSetting:Setting1"], or similar mechanisms.

To bridge this gap, we can add this to the startup script (complete script):

$keys = @(
  "CustomSetting__Setting1",
  "CustomSetting__Setting2"
)

foreach($key in $keys){
  [Environment]::SetEnvironmentVariable($key, [Microsoft.WindowsAzure.ServiceRuntime.RoleEnvironment]::GetConfigurationSettingValue($key), "Machine")
}

This copies the settings from the Cloud Service Role Environment into environment variables on the host, and from there, the default ASP.NET Core configuration adds them into configuration.

Considerations

  • Session affinity: if you need session affinity for session state, you’ll need to configure that.
  • Data Protection API: unlike Azure App Services, Cloud Services doesn’t have any default synchronization for the keys, so you’ll need a solution for this. If anyone comes up with a reusable solution, I’ll happily mention it here. More info on configuring the Data Protection API is here.
  • Local debugging: due to the way local debugging of cloud services works (it directly uses TheWebRole as a startup project in IIS Express), directly debugging the cloud service does not work with the current patterns. Instead, you can set TheWebsite as the startup project and debug that normally. The underlying issue is that TheWebRole includes TheWebsite as Content and does not copy the published files to TheWebRole‘s directory. It may be possible to achieve this, though you’d likely want additional .gitignore rules to prevent those files from being committed. In my case, I did not want my service to have any direct dependency on Cloud Services, so this wasn’t an issue; I simply needed a Server 2016 web host.

CI / CD with VSTS

It is possible to automate build/deploy of these cloud service web role projects using VSTS. My next blog post will show how to set that up.

Update October 18: The post is live

Connecting SharePoint to Azure AD B2C

September 8, 2016 · Coding

Overview

This post will describe how to use Azure AD B2C as an authentication mechanism for SharePoint on-prem/IaaS sites. It assumes a working knowledge of identity and authentication protocols, WS-Federation (WsFed) and OpenID Connect (OIDC). If you need a refresher on those, there are some great resources out there, including Vittorio Bertocci’s awesome book.

Background

Azure AD B2C is a hyper-scalable, standards-based authentication and user storage mechanism typically aimed at consumer or customer scenarios. It is a separate product from “regular” Azure AD: whereas “regular” Azure AD is normally meant to house identities for a single organization, B2C is designed to host identities of external users. In my opinion, it’s the best alternative to writing your own authentication mechanism (which no one should ever do!).

For one client, we had a scenario where we needed to enable external users to access specific site collections within SharePoint. Azure AD wasn’t a good fit, even with the B2B functionality, as we needed to collect additional information during user sign-up. Out of the box, B2C doesn’t yet support WsFed or SAML 1.1, and SharePoint doesn’t support OpenID Connect. This leaves us needing a tool that can bridge B2C to SharePoint by acting as an OIDC relying party (RP) to B2C and a WsFed identity provider (IdP) to SharePoint.

The Solution

Fortunately, identity gurus Dominick Baier and Brock Allen created just such a tool with IdentityServer 3. From the docs:

IdentityServer is a framework and a hostable component that allows implementing single sign-on and access control for modern web applications and APIs using protocols like OpenID Connect and OAuth2.

IdentityServer has plugins to support additional functionality, like acting as a WsFed IdP. This means we can use IdentityServer as a bridge from OIDC to WsFed. We’ll register an application in B2C for IdentityServer and then create an entry in IdentityServer for SharePoint.

Here’s how the pieces fit together: SharePoint trusts IdentityServer as a WsFed IdP, and IdentityServer in turn signs users in against B2C over OIDC.

While you can use IdentityServer to act as an IdP to multiple clients, in the model we used, we considered IdentityServer “part of SharePoint.” That is, SharePoint is the only client, and in B2C, the single visible application entry is called “SharePoint.” I mention this because B2C allows applications to choose different policies/flows for sign-up/sign-in, password reset, and more. In our solution, we’ve configured IdentityServer to use a particular set of policies that meet SharePoint’s needs; it may not meet the needs of other applications.

Diving deep

As mentioned, IdentityServer isn’t so much a “drop-in product” as a framework that needs customization. The rest of this post will look at how we customized and configured B2C, IdentityServer, and SharePoint to enable the end-to-end flow.

B2C

Let’s start with B2C. As far as B2C is concerned, we register a new web application and create a couple of policies: sign-up/sign-in and password reset. When you register the application, enter the redirect URIs you’ll need for IdentityServer (localhost and/or your real URL). You don’t need a client secret for these flows.

IdentityServer

Follow the IdentityServer getting started guide to create a blank ASP.NET 4.6 MVC site and install/configure IdentityServer. ASP.NET Core on CoreCLR is not an option yet, as .NET Core doesn’t have the XML cryptography libraries needed for WsFed (that support will come as part of .NET Standard 2.0). After installing the IdentityServer3 NuGet package, you’ll need to install the WsFed plugin NuGet package.

The key here is that we don’t need a local user store, as IdentityServer won’t be acting as the user database. We just need to configure B2C as an identity provider and the WsFed plugin to act as an IdP. IdentityServer won’t maintain any state; it’s simply a pass-through, validating JWTs and issuing SAML tokens.

Below, I’ll explain some of the core snippets; the full set of files is available here.

Inventory

There are several areas of IdentityServer that need to either be configured or have custom code added:

  • Identity Provider: B2C, via the standard OIDC OWIN middleware
  • WS-Federation plugin: the IdentityServer plugin for acting as a WsFed IdP
  • Relying parties: an entry or two for your WsFed/SAML client (SharePoint, or a test app configured with WsFed auth)
  • User Service: the IdentityServer component for mapping external auth to users

Identity Provider

We need to configure B2C as OIDC middleware in IdentityServer. Due to the way B2C works, we need some additional code to handle the different policies; it’s not enough to configure a single OIDC endpoint. For normal flows, it defaults to the policy specified in “SignInPolicyId”. Where it gets tricky is in handling password reset.

First, let’s look at the normal “happy path” flow, where a user either signs up or signs in.

In B2C, password reset is a separate policy and thus requires a specific call to the /authorize endpoint specifying the password reset policy to use. The “combined sign up/sign in” policy, which is recommended as it’s the most styleable, provides a link for “password reset”. What that link does, however, is return a specific error code to the app that started the sign-in flow; it’s up to the app to start the password reset flow. Then, once the password reset flow is complete, despite the user appearing to be authenticated (as defined by having had a signed JWT returned), B2C’s SSO mechanisms won’t consider the user signed in. You’ll notice this if you try to use a profile edit flow, or any other flow where SSO should have signed the user in without additional prompting. The guidance from the B2C team is that after the password reset flow completes, an app should immediately trigger the sign-in flow again. Logically, this makes sense: the user started the password reset from the sign-in screen, so once the password is reset, they should resume there to actually sign in.


Implementing all this with IdentityServer requires a little extra code. Unfortunately, we cannot simply add individual OIDC middleware instances for each policy as we would in a normal web app, because IdentityServer would see them as different providers and present an identity provider selection screen. To avoid this, we configure only a single identity provider and pass the policy as an authentication parameter. The B2C samples provide a PolicyConfigurationManager class that can retrieve and cache the OIDC metadata for each of the policies (sign-up/sign-in and password reset).

Here’s an example from Startup.Auth.B2C.cs:

ConfigurationManager = new PolicyConfigurationManager(
    string.Format(CultureInfo.InvariantCulture, B2CAadInstance, B2CTenant, "/v2.0", OIDCMetadataSuffix),
    new[] { SignUpPolicyId, ResetPasswordPolicyId }),

The main work in getting IdentityServer to handle the B2C flows is in handling the OpenID Connect events RedirectToIdentityProvider, AuthenticationFailed, and SecurityTokenValidated. By handling these three, we can bounce between the flows.

In the Startup.Auth.B2C.cs file, the OnRedirectToIdentityProvider event handler looks for the policy authentication parameter and ensures the correct /authorize endpoint is used. As IdentityServer handles the initial auth call, we cannot specify a policy parameter, so we assume it’s a sign-in. IdentityServer tracks some state for the sign in request, and we’ll need access to it in case the user needs to do a password reset later, so we store it in a short-lived, encrypted, session cookie.

Once the B2C flow comes back, we need to handle both the failed and validated events. On failure, we look for the specific error codes and take the appropriate action. On success, we check whether the token came from a password reset and, if so, bounce back to sign-in to complete the journey.
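
For a flavor of what that looks like, here is a condensed sketch of the redirect handler, modeled on the B2C OWIN samples; the full version, along with the failed/validated handlers and the state cookie, is in Startup.Auth.B2C.cs:

private async Task OnRedirectToIdentityProvider(
    RedirectToIdentityProviderNotification<OpenIdConnectMessage, OpenIdConnectAuthenticationOptions> notification)
{
    // Use the policy passed as an authentication parameter; the initial
    // call from IdentityServer carries none, so assume sign-in.
    var policy = notification.OwinContext.Get<string>("Policy") ?? SignUpPolicyId;

    // Ask the PolicyConfigurationManager for that policy's metadata and
    // point the request at the matching /authorize endpoint.
    var mgr = (PolicyConfigurationManager)notification.Options.ConfigurationManager;
    var config = await mgr.GetConfigurationByPolicyAsync(CancellationToken.None, policy);
    notification.ProtocolMessage.IssuerAddress = config.AuthorizationEndpoint;
}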

WS-Federation plugin

Configuring IdentityServer to act as a WS-Federation IdP is fairly simple: install the plugin package and provide the plugin configuration in Startup.cs. As an aside, don’t forget to either provide your own certificate or alter the logic to pull the cert from somewhere else!

The main WsFed configuration is a list of Relying Parties, seen in RelyingParties.cs. I’ve hard-coded it, but you can generate this data however you see fit.

Relying parties

Within the relying party configuration, you can specify the required WsFed parameters, including Realm, ReplyUrl, and PostLogoutRedirectUris. The final thing you need is a map from the OIDC claims to the SAML claim types returned.
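
A single entry looks roughly like this (the shape follows the IdentityServer3 WsFed samples; the realm, URL, and claim list are illustrative):

new RelyingParty
{
    Realm = "urn:sharepoint",
    Enabled = true,
    ReplyUrl = "https://sharepoint.example.com/_trust/",
    TokenType = TokenTypes.Saml11TokenProfile11,

    // Map the OIDC claims coming from B2C to the SAML claim types SharePoint expects
    ClaimMappings = new Dictionary<string, string>
    {
        { "email", ClaimTypes.Email },
        { "given_name", ClaimTypes.GivenName },
        { "family_name", ClaimTypes.Surname }
    }
}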

User Service

The User Service is what IdentityServer uses to match external claims to internal identities. For our use, we don’t have any internal identities; we simply pass the claims through, as you can see in AadUserService.cs. The main thing we do is extract a few specific claims and tell IdentityServer to use those for name, subject, issuer, and authentication method.

WsFed Client (or SharePoint)

Adding a WsFed client should be fairly easy at this point. Configure the realm and reply URLs as required and point the client to the metadata address. For IdentityServer, this is https://localhost:44352/wsfed/metadata by default (or whatever your hostname is).

ASP.NET 4.6

I find it useful to have a basic ASP.NET MVC site for testing that authenticates and prints out the claims; it helps isolate me from difficult SharePoint issues.

With ASP.NET MVC on 4.6, add the Microsoft.Owin.Security.WsFederation NuGet package and use this in your Startup class, where realm is the configured realm and adfsMetadata is the IdentityServer metadata endpoint:

public void ConfigureAuth(IAppBuilder app)
{
    app.SetDefaultSignInAsAuthenticationType(CookieAuthenticationDefaults.AuthenticationType);

    app.UseCookieAuthentication(new CookieAuthenticationOptions());

    app.UseWsFederationAuthentication(
        new WsFederationAuthenticationOptions
        {
            Wtrealm = realm,
            MetadataAddress = adfsMetadata
        });
}
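
For this setup, those two values are just the realm you registered in RelyingParties.cs and the IdentityServer metadata endpoint, e.g.:

private static string realm = "urn:sharepoint";
private static string adfsMetadata = "https://localhost:44352/wsfed/metadata";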

SharePoint

I will readily confess that I am not a SharePoint expert. I’ll happily leave that to others, like Bob German, a colleague and SharePoint MVP. From Bob:

The Microsoft documentation is fine, but is oriented toward a connection with Active Directory via AD FS, so it includes claims attributes such as the SID value, which won’t exist in this scenario. The only real claims SharePoint needs are email address, first name, and last name. Any role claims passed in are available for setting permissions in SharePoint. Follow the relevant portions of the documentation, but only map the claims that make sense.

For example,

$emailClaimMap = New-SPClaimTypeMapping -IncomingClaimType "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/emailaddress" -IncomingClaimTypeDisplayName "EmailAddress" -SameAsIncoming
$firstNameClaimMap = New-SPClaimTypeMapping -IncomingClaimType "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/givenname" -IncomingClaimTypeDisplayName "FirstName" -SameAsIncoming
$lastNameClaimMap = New-SPClaimTypeMapping -IncomingClaimType "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/surname" -IncomingClaimTypeDisplayName "LastName" -SameAsIncoming
$roleClaimMap = New-SPClaimTypeMapping -IncomingClaimType "http://schemas.microsoft.com/ws/2008/06/identity/claims/role" -IncomingClaimTypeDisplayName "Role" -SameAsIncoming

New-SPTrustedIdentityTokenIssuer -Name <somename> -Description <somedescription> -realm <realmname> -ImportTrustCertificate <token signing cert> -ClaimsMappings $emailClaimMap,$roleClaimMap,$firstNameClaimMap,$lastNameClaimMap -IdentifierClaim $emailClaimMap.InputClaimType

You can pass in additional claims attributes and SharePoint’s STS will pass them along to you, but you can only access them server-side via Thread.CurrentPrincipal; for example,

IClaimsPrincipal claimsPrincipal = Thread.CurrentPrincipal as IClaimsPrincipal;
if (claimsPrincipal != null)
{
    IClaimsIdentity claimsIdentity = (IClaimsIdentity)claimsPrincipal.Identity;
    foreach (Claim c in claimsIdentity.Claims)
    {
        // Do something
    }
}

With this scenario, you can assign permissions based on an individual user using the email claim, or based on a role using the role claim. However, SharePoint’s people picker isn’t especially helpful in this case. Since it has no way to look up and resolve the claims attribute value, it will let users type anything they want. Type something and then hover over the people picker; you’ll see a list of claims. Select the Role claim to grant permission based on a role, or the Email claim to grant permission to an individual user based on their email address.

SharePoint does not use WsFed metadata, so you need to provide the signing certificate’s public key directly and specify the WsFed sign-in URL. For the scenario here, that’s https://localhost:44352/wsfed.
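
Getting that certificate into SharePoint is a couple of lines of PowerShell (the path and name here are illustrative); the resulting certificate object is also what you pass to -ImportTrustCertificate above:

$cert = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2("C:\certs\idsrv-signing.cer")
New-SPTrustedRootAuthority -Name "IdentityServer Signing" -Certificate $cert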

Conclusion

While not without its challenges, it is possible to use B2C with a system that only knows WsFed. One thing I have not yet done is implement a profile edit flow; I need to give more thought to how that would work and interact. I’m open to ideas if you have them, and I’ll blog a follow-up once that’s done.

External Auth in ASP.Net MVC/SPA/Web API apps

July 31, 2014 · Coding

Out of the box, Visual Studio 2013 comes with a number of templates for ASP.Net to cover most scenarios. The issue is that if you want registered users, you have a choice between Individual User Accounts, Organizational Accounts or Windows Authentication.

VS 2013 ASP.Net Options

While “Individual User Accounts” sounds like the only option for apps that aren’t using Azure AD or Windows Auth, the trouble is that it brings in all of ASP.Net Identity, which includes a local user database. If you look at the description above, it seems like this is the option to use for external/social logins, but as we’ll discuss shortly, it’s a red herring.

There are a few reasons why it’s not a good idea to mix your user database with your main app:

  • Single Responsibility Principle: this is the “S” in SOLID, and it applies to your application/service as a whole, not just your classes. Your application should be great at what it does. Your services should be granular and each do a single thing. Your app is the best thing since sliced bread. What it doesn’t need is to manage users.
  • Security is hard: if you look at the template code generated by the ASP.Net wizards for “Individual User Accounts,” you’ll see tons of code in the AccountController class. Go ahead, take a look; I’ll wait. Back yet? That code is now on you to understand and work with, as after creation, it’s no longer a library. As you adjust the templates and views, you need to make sure you don’t accidentally introduce any issues. And you will need to touch that code, as the templates are not complete; they’re a getting-started point.
  • Higher risk: even though the password field in ASP.Net Identity is salted and hashed, there’s still a lot of juicy information in your users table that a 1337 h@x0r would love to get their hands on. You don’t want that 3am phone call.

Now that you’re hopefully thoroughly convinced that you do not want to implement your own authentication, you’re wondering what you can do instead. The answer is to use an external authentication provider: Facebook is one, Microsoft Account another, and Google yet another; there’s no shortage of external authentication providers these days. If you’re thinking that “Individual User Accounts” with ASP.Net Identity does this for you (it’s in the description and template code!), don’t lose sight of the fact that it’s really creating new identities in your app and simply linking external logins. See Brock Allen’s detailed description of how it all works. It’s the same core issue: your app is now managing identities.

Returning to that initial ASP.Net authentication options dialog, there’s one option that’s conspicuously absent: External Identity Provider. This option really needs to be present in the dialog: a set of templates geared around validating external claims that aren’t part of Azure AD.

For Web API apps validating a Bearer token, this comes down to a single line of code (with Thinktecture’s OWIN component):

app.UseJsonWebToken({issuerId}, {audience}, {appSecret});

With that one line, your Web API can rely on true external security.
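
In context, that line lives in an OWIN Startup class; a minimal sketch using the placeholders from above:

using Owin;

public class Startup
{
    public void Configuration(IAppBuilder app)
    {
        // Validate incoming bearer tokens against the external issuer;
        // no local identity system required.
        app.UseJsonWebToken("{issuerId}", "{audience}", "{appSecret}");
    }
}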

To be clear, relying on an external provider does not mean that you need to give up the idea of having “local users,” or multiple social logins like Facebook either. If you don’t want to be tied to a single provider, you can use a service like Auth0 or a configurable identity app like IdentityServer. Either of those options provides an appropriate architectural separation of concerns and lets the experts handle the authentication. Your app becomes a relying party to the external authentication service, receiving, and validating, a set of claims (such as an email address or a unique user id in their system).

The key difference between ASP.Net Identity’s assumptions and templates and using an external provider is who owns the claims. ASP.Net Identity translates external claims into its own identity system and issues its own new set of claims and tokens. My recommendation is that this extra layer is unnecessary, and your application should primarily use externally provided claims.

There’s one area that I haven’t covered yet: what if you have your Web API and MVC content (even a SPA) in the same site? You’d like your login mechanism to work for both your web UI and your API. This is an area that has caused much confusion in current incarnations, as Web API and MVC each do authentication a bit differently. For web browsers, you need some form of cookie auth while still having bearer tokens available for your API. This is exactly the situation I was in when I fell into the ASP.Net Identity trap. I’ll talk more about what happened and how I solved it next time.

To sum up, if you’re not sure whether to use ASP.Net Identity, the flow chart boils down to this: unless your app truly must own user identities, use an external provider.

Special thanks to Barry Dorrans for reviewing this content prior to release.