<?xml version="1.0" encoding="utf-8"?>
<?xml-stylesheet type="text/xsl" href="https://blog.dangl.me/rss/xslt"?>
<rss xmlns:a10="http://www.w3.org/2005/Atom" version="2.0">
  <channel>
    <title>Dangl.Blog();</title>
    <link>https://blog.dangl.me/</link>
    <description>Blogging about .Net, DevOps, Networking and BIM. Home of the free GAEB Converter.</description>
    <generator>Articulate, blogging built on Umbraco</generator>
    <item>
      <guid isPermaLink="false">1436</guid>
      <link>https://blog.dangl.me/archive/signing-electron-apps-before-bundling-with-azure-key-vault-and-ev-code-signing-certificates/</link>
      <category>Continuous Integration</category>
      <title>Signing Electron Apps before Bundling with Azure Key Vault and EV Code Signing Certificates</title>
      <description>&lt;p&gt;Just a few days ago, I've blogged about &lt;a data-udi="umb://document/97c1f03c1f504869b17557ed82d8069a" href="/archive/running-fully-automated-e2e-tests-in-electron-in-a-docker-container-with-playwright/" title="Running Fully Automated E2E Tests in Electron in a Docker Container with Playwright"&gt;running E2E tests for an Electron app&lt;/a&gt;. But, once tested and verified, the next step is deployment.&lt;/p&gt;
&lt;p&gt;If you've ever shipped an application for Windows users, you might be aware of the Windows SmartScreen filter. It's basically a check that might pop up and warn your users in case Microsoft doesn't have any data on your app being trustworthy. That's actually a great feature, since it encourages application developers to sign their programs and therefore increase overall app security. Except for when your build pipeline becomes compromised, as in the recent SolarWinds attack...&lt;/p&gt;
&lt;p&gt;However, I'd like to focus on the technical aspects. For code signing certificates, there are two options: either a regular certificate, which has quite a low bar for verification and is available for less than €100 per year, or an &lt;em&gt;Extended Validation&lt;/em&gt; (EV) certificate. The EV ones are much pricier and require the issuer to perform a more in-depth check before handing them out. You pay more, but you get more: in contrast to regular certificates, which need to build up a reputation through a certain number of installs before Windows SmartScreen recognizes them as trustworthy, EV certificates work out of the box and don't show any warnings to your users. So, for anything commercial, you probably want to go with an EV certificate.&lt;/p&gt;
&lt;p&gt;But it also comes with a catch: you usually need a hardware device to secure it, often a USB dongle or a &lt;em&gt;Hardware Security Module&lt;/em&gt; (HSM). That's simply a consequence of the heightened security requirements of EV certificates, but it also makes them harder to integrate into automated build workflows. You don't want to end up with a dedicated build machine in your office into which you have to manually insert a USB dongle to generate a release.&lt;/p&gt;
&lt;p&gt;Here comes Azure Key Vault: it's a service in the Azure Cloud which just kind of does everything around securing stuff, hence the name. It's typically used for secret management, e.g. to handle configuration for cloud services and the like, or for plain SSL certificate management, e.g. with Let's Encrypt via &lt;a href="https://github.com/shibayan/keyvault-acmebot" title="GitHub - shibayan - keyvault-acmebot"&gt;shibayan's keyvault-acmebot&lt;/a&gt;. The part that's interesting for this article, however, is that the premium tier of Azure Key Vault allows you to store certificates on an HSM, securely on Microsoft infrastructure.&lt;/p&gt;
&lt;p&gt;When going that route, you give up the option of exporting the certificate, e.g. temporarily to your build server to sign applications. That would introduce too much of a security risk, so you need to use Azure Key Vault's remote signing feature to sign your apps.&lt;/p&gt;
&lt;p&gt;And this is where this blog post is headed: when building an Electron app, you've got two steps. First, you build &amp;amp; publish the app, then it gets packaged. Now, you want to sign both outputs: the actual &lt;span class="Code"&gt;*.exe&lt;/span&gt; files in the bundle as well as the installer itself. So, I'll show you some code that performs the build, calls a hook between the publish and package steps to sign everything, and then finally signs the installer itself, all via remote signing in Azure Key Vault.&lt;/p&gt;
&lt;p&gt;I'm a big fan of &lt;a data-udi="umb://document/6ecb73fad5484466b0fc73c1a80f79c3" href="/archive/escalating-automation-the-nuclear-option/" title="Escalating Automation - The Nuclear Option"&gt;the NUKE build system&lt;/a&gt;, so I'm using that for the actual build automation. The concept should be easily translatable to whatever system you're using. So, let's finally look at the code!&lt;/p&gt;
&lt;p&gt;First, we're starting with this build script:&lt;/p&gt;
&lt;p&gt;&lt;code data-gist-id="f3f82bd3d865e9c66466c38c247d6584" data-gist-file="BuildElectron.cs"&gt;&lt;/code&gt;&lt;/p&gt;
&lt;p&gt;The part above is the target in NUKE that generates the Electron bundle. In this specific example, we're &lt;a data-udi="umb://document/9476ba07bd354057aece5444330de399" href="/archive/transform-your-aspnet-core-website-into-a-single-file-executable-desktop-app/" title="Transform your ASP.NET Core Website into a Single File Executable Desktop App"&gt;even using Electron.NET&lt;/a&gt;, since the app itself is really just a website that some customers want to use as a desktop application. I won't go into too much detail here, since the build process should be pretty straightforward. It's really just a wrapper around electron-builder with something injected here and there.&lt;/p&gt;
&lt;p&gt;&lt;code data-gist-id="f3f82bd3d865e9c66466c38c247d6584" data-gist-file="electron.manifest.json"&gt;&lt;/code&gt;&lt;/p&gt;
&lt;p&gt;This second part is a bit more interesting. The &lt;span class="Code"&gt;electron.manifest.json&lt;/span&gt; is your app definition; here we're configuring, for example, that the Windows target should use the &lt;span class="Code"&gt;nsis&lt;/span&gt; installer. But the real fun is on &lt;span class="Code"&gt;line 11&lt;/span&gt;: we're defining a JavaScript file that is called during the build, in our case right after the built-in signing task has completed.&lt;/p&gt;
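&lt;p&gt;To give you an idea of the shape, here's a minimal sketch of such a manifest - the values are placeholders, and I'm assuming the hook is wired up via electron-builder's &lt;span class="Code"&gt;afterSign&lt;/span&gt; option, since it runs after the built-in signing step:&lt;/p&gt;

```json
{
  "name": "my-app",
  "build": {
    "win": {
      "target": "nsis"
    },
    "afterSign": "./electronAfterPackHook.js"
  }
}
```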
&lt;p&gt;&lt;code data-gist-id="f3f82bd3d865e9c66466c38c247d6584" data-gist-file="electronAfterPackHook.js"&gt;&lt;/code&gt;&lt;/p&gt;
&lt;p&gt;&lt;span class="Code"&gt;electronAfterPackHook.js&lt;/span&gt;, again, is a simple script. It's really just calling back out to the NUKE build system to run the &lt;span class="Code"&gt;SignExecutables&lt;/span&gt; target and passes the current output directory as an argument. The magic here is that this is now called during your build, after the executable has been created and patched with the icon, but before it's bundled into the installer.&lt;/p&gt;
&lt;p&gt;&lt;code data-gist-id="f3f82bd3d865e9c66466c38c247d6584" data-gist-file="BuildSign.cs"&gt;&lt;/code&gt;&lt;/p&gt;
&lt;p&gt;Finally, we're using the &lt;span class="Code"&gt;AzureSign&lt;/span&gt; package to sign our executable with the certificate in Azure Key Vault.&lt;/p&gt;
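&lt;p&gt;For reference, a roughly equivalent call with the standalone AzureSignTool CLI looks like this - the vault URL, credentials and certificate name are placeholders, and whether the NUKE package wraps exactly this tool is an assumption on my part:&lt;/p&gt;

```shell
azuresigntool sign \
  --azure-key-vault-url "https://my-vault.vault.azure.net" \
  --azure-key-vault-client-id "$CLIENT_ID" \
  --azure-key-vault-client-secret "$CLIENT_SECRET" \
  --azure-key-vault-certificate "MyEvCodeSigningCert" \
  --timestamp-rfc3161 "http://timestamp.digicert.com" \
  --file-digest sha256 \
  "MyApp.exe"
```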
&lt;p&gt;After everything's done here, the build continues. At the end, the parent NUKE process signs the installers and we're done. The next step would be publishing; for us that means generating the documentation and deploying the artifact.&lt;/p&gt;
&lt;p&gt;So, you've seen we're really doing something like in the movie &lt;em&gt;Inception&lt;/em&gt;: we're calling our build system from a hook inside a build process that was itself started by the build system. It feels a bit Rube-Goldberg-y, but it works fine and is pretty simple to set up &amp;amp; understand.&lt;/p&gt;
&lt;p&gt;Happy signing!&lt;/p&gt;</description>
      <pubDate>Wed, 18 Aug 2021 20:42:54 Z</pubDate>
      <a10:updated>2021-08-18T20:42:54Z</a10:updated>
    </item>
    <item>
      <guid isPermaLink="false">1435</guid>
      <link>https://blog.dangl.me/archive/running-fully-automated-e2e-tests-in-electron-in-a-docker-container-with-playwright/</link>
      <category>Continuous Integration</category>
      <title>Running Fully Automated E2E Tests in Electron in a Docker Container with Playwright</title>
      <description>&lt;p&gt;If you've read some of my previous articles, you probably know that I'm kind of obsessed with rigorous testing - I'm advocating for having a good, automated quality control for every project I encounter. It's probably one of the biggest time savers there is in software development, whether it's because you catch regressions early or you've got lots of infrastructure set up so that reproducing bugs is a matter of minutes instead of hours.&lt;/p&gt;
&lt;p&gt;For one of our apps, we've got a bit of a complex setup. The backend is a traditional ASP.NET Core application with the usual moving parts: a relational database, some blob storage and the compute part of the app itself. That's easy to test on its own.&lt;/p&gt;
&lt;p&gt;The frontend part deviates a bit from our regular architectures. Due to the nature of the app, it's built and deployed as an Electron application. Electron is a great tool that &lt;a data-udi="umb://document/9476ba07bd354057aece5444330de399" href="/archive/transform-your-aspnet-core-website-into-a-single-file-executable-desktop-app/" title="Transform your ASP.NET Core Website into a Single File Executable Desktop App"&gt;I've blogged about previously&lt;/a&gt;, but it comes at a cost. It gets the most flak for being inefficient, whether due to its large footprint (both on disk and in memory) or for being slower than native applications. However, the upsides are great - since Electron is really just a wrapper around Chromium, it allows you to build desktop applications with all tried &amp;amp; tested web technologies. This makes it a natural choice when your team usually builds web applications and can therefore transition seamlessly to an Electron project.&lt;/p&gt;
&lt;p&gt;However, Electron sometimes feels a bit like building on sand - it works really well, but there are a lot of moving parts under the hood. One obstacle we've encountered was testing the full application in an end-to-end (E2E) way. To ensure such tests run in a controlled environment, independent of whatever host is currently executing them, &lt;a data-udi="umb://document/97c1f03c1f504869b17557ed82d8069a" href="/archive/running-fully-automated-e2e-tests-in-electron-in-a-docker-container-with-playwright/" title="Running Fully Automated E2E Tests in Electron in a Docker Container with Playwright" data-anchor="#"&gt;we're using a small Docker setup&lt;/a&gt; to spin up the entire infrastructure, test it and then tear it down again. That works pretty well with regular web apps, but not so much with Electron.&lt;/p&gt;
&lt;p&gt;When we first started setting this up, we ran into lots of problems. Maybe the biggest was the lack of community information around this topic - it seemed like not a lot of people were running such tests, so there was little to be found except small bits here and there. We managed to get it working in the end, but even an external consultant specializing in that area struggled to set it up. One of the major hurdles was that while there are lots of tools for automating browsers that work fine in Docker, there are not a lot of options when it comes to Electron.&lt;/p&gt;
&lt;p&gt;Then came last week, when we updated the Electron version. That immediately broke our build. It turned out that &lt;a href="https://github.com/electron-userland/spectron" title="GitHub - Spectron"&gt;Spectron, the test runner for Electron&lt;/a&gt;, doesn't have any active maintainers left. And it also looked like &lt;a href="https://github.com/electron-userland/spectron/issues/1021" title="GitHub - Spectron - Issue #1021"&gt;we're not the only ones affected by this&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;Luckily, there's a new player around, trying to offer a modern and stable way to do browser automation: &lt;a href="https://playwright.dev/" title="Playwright.dev"&gt;Playwright by Microsoft&lt;/a&gt;. It has official Electron support, is actively maintained and, being just a bit over a year old, was able to adopt a modern and easy-to-use API.&lt;/p&gt;
&lt;p&gt;After spending some hours trying to get the original test setup fixed, we decided to give Playwright a try. To our amazement, it just worked! It took about an hour to set everything up and migrate the tests, and our E2E tests were green again🚀&lt;/p&gt;
&lt;p&gt;I'll try to give you a condensed summary of what we did to get it running, and how it was set up. In reality, we're also spinning up the other services, like the backend, database and blob storage, and put all the Docker containers in the same network, simulating the production environment as closely as possible. But, let's take a look at it file by file:&lt;/p&gt;
&lt;p&gt;&lt;code data-gist-id="f2d65986085da494020edf07a5a6b96f" data-gist-file="Dockerfile"&gt;&lt;/code&gt;&lt;/p&gt;
&lt;p&gt;We start with a Dockerfile. That's a rather simple one, the few things worth mentioning are:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;We're building from the &lt;span class="Code"&gt;node&lt;/span&gt; image, since a lot of dependencies are already present there&lt;/li&gt;
&lt;li&gt;Some display-related tools need to be installed, most importantly &lt;span class="Code"&gt;xvfb&lt;/span&gt;, which will act as a virtual display inside the Docker container&lt;/li&gt;
&lt;li&gt;A custom entrypoint for the container is provided to execute a script at container start&lt;/li&gt;
&lt;/ul&gt;
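&lt;p&gt;As a rough sketch (the base image tag and package names are assumptions and may need adjusting for your distribution):&lt;/p&gt;

```dockerfile
FROM node:16

# xvfb provides a virtual framebuffer so Electron can open a "window"
# inside the headless container; the remaining packages are common
# runtime dependencies of Chromium/Electron
RUN apt-get update && apt-get install -y \
    xvfb libgtk-3-0 libnss3 libasound2 libxss1 \
 && rm -rf /var/lib/apt/lists/*

# A custom entrypoint starts the virtual display before the test command
COPY entrypoint.sh /entrypoint.sh
RUN chmod +x /entrypoint.sh
ENTRYPOINT ["/entrypoint.sh"]
```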
&lt;p&gt;&lt;code data-gist-id="f2d65986085da494020edf07a5a6b96f" data-gist-file="entrypoint.sh"&gt;&lt;/code&gt;&lt;/p&gt;
&lt;p&gt;The entrypoint itself isn't complicated, either. When looking around for how to run &lt;span class="Code"&gt;xvfb&lt;/span&gt; in Docker, you'll often find similar snippets. Ours was, in fact, also mostly copied from various sources. It's really just making sure that a virtual display is available before the actual command is run.&lt;/p&gt;
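&lt;p&gt;A typical variant of the pattern looks like this - a sketch of the common approach, not our exact script:&lt;/p&gt;

```shell
#!/bin/bash
set -e

# Start a virtual framebuffer on display :99 so Electron has a
# "screen" to render to inside the headless container
Xvfb :99 -screen 0 1280x1024x24 &
export DISPLAY=:99

# Hand control over to whatever command was passed to the container
exec "$@"
```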
&lt;p&gt;&lt;code data-gist-id="f2d65986085da494020edf07a5a6b96f" data-gist-file="common-setup.ts"&gt;&lt;/code&gt;&lt;/p&gt;
&lt;p&gt;The &lt;span class="Code"&gt;common-setup.ts&lt;/span&gt; file is a base for all our tests. Here, we're using Playwright to launch a new instance of the app for every test. As you can see, the API could probably not be simpler, yet it works flawlessly.&lt;/p&gt;
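&lt;p&gt;The core of it looks roughly like this - the entry point path is a placeholder, but &lt;span class="Code"&gt;_electron&lt;/span&gt; is the Electron support Playwright actually ships:&lt;/p&gt;

```typescript
import { _electron as electron } from 'playwright';

// Launches a fresh instance of the app for a single test.
// 'main.js' stands in for your compiled Electron entry point.
export async function launchApp() {
  const electronApp = await electron.launch({ args: ['main.js'] });
  // The first BrowserWindow the app opens - regular Playwright
  // page assertions work against it
  const window = await electronApp.firstWindow();
  return { electronApp, window };
}
```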
&lt;p&gt;&lt;code data-gist-id="f2d65986085da494020edf07a5a6b96f" data-gist-file="maine2e.ts"&gt;&lt;/code&gt;&lt;/p&gt;
&lt;p&gt;Finally, some tests. They're shortened, but give you a small indication of what's possible with Playwright, or automated E2E tests in general.&lt;br /&gt;We usually aim for two things with such tests:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;We want quick feedback on whether the app is generally working: is it able to start, do we get any script errors in the console, can users log in, and so on&lt;/li&gt;
&lt;li&gt;Additionally, we usually have full E2E tests for the &lt;em&gt;critical paths&lt;/em&gt; in our applications. For example, &lt;em&gt;can new users register, confirm their email, login, start a trial and then upgrade to a paid subscription&lt;/em&gt;? Generally, things a QA department would do but that are easy to automate&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;It's sometimes tedious to write such tests, and you most likely won't ever have your full app covered with them. But they're a huge confidence boost, especially when your pipeline automatically deploys right into production.&lt;/p&gt;
&lt;p&gt;&lt;code data-gist-id="f2d65986085da494020edf07a5a6b96f" data-gist-file="startup.sh"&gt;&lt;/code&gt;&lt;/p&gt;
&lt;p&gt;Finally, to actually run the tests, we're just spinning up the Docker container with a script like the one above. The important part is that we're now calling &lt;span class="Code"&gt;docker run&lt;/span&gt; to execute the end-to-end tests in the container. We're doing some more optimizations, like using &lt;span class="Code"&gt;tmpfs&lt;/span&gt; for the &lt;span class="Code"&gt;node_modules&lt;/span&gt; folder and mounting the app directory into the container. But in the end, the tests run and we get a test results file that can be processed by CI systems.&lt;/p&gt;
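&lt;p&gt;Boiled down, the invocation looks something like this - the image name, mount paths and npm script are placeholders for illustration:&lt;/p&gt;

```shell
# Mount the app, keep node_modules on a fast tmpfs inside the
# container, and run the E2E test script
docker run --rm \
  -v "$(pwd):/app" \
  --tmpfs /app/node_modules:exec \
  -w /app \
  my-e2e-image \
  npm run test:e2e
```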
&lt;p&gt;So, happy testing!&lt;/p&gt;</description>
      <pubDate>Thu, 12 Aug 2021 07:03:26 Z</pubDate>
      <a10:updated>2021-08-12T07:03:26Z</a10:updated>
    </item>
    <item>
      <guid isPermaLink="false">1433</guid>
      <link>https://blog.dangl.me/archive/lets-use-nuke-to-quickly-deploy-an-app-to-azure-via-zip-deployment/</link>
      <category>Continuous Integration</category>
      <title>Let's use NUKE to Quickly Deploy an App to Azure via Zip Deployment</title>
      <description>&lt;p&gt;It's no secret I'm a big fan of the &lt;a href="http://www.nuke.build/" title="NUKE Build"&gt;NUKE build system&lt;/a&gt;, it's making my life just so much more convenient! So here's a small real world example that demonstrates how easy it is to deploy a static website to an Azure App Service using NUKE and &lt;a href="https://github.com/projectkudu/kudu/wiki/Deploying-from-a-zip-file-or-url" title="Kudu Zip Deployment"&gt;Kudu Zip Deployment&lt;/a&gt;. You can &lt;a href="https://github.com/GeorgDangl/antlr-calculator" title="GitHub - GeorgDangl - antlr-calculator"&gt;check out the repository here&lt;/a&gt; if you want to see it all.&lt;/p&gt;
&lt;p&gt;Recently, I was going through some old repositories on &lt;a href="https://github.com/GeorgDangl" title="GitHub - GeorgDangl"&gt;my GitHub account&lt;/a&gt; for the occasional cleaning - just checking if something needs an update, or is no longer working. &lt;a href="https://github.com/GeorgDangl/antlr-calculator" title="GitHub - GeorgDangl - antlr-calculator"&gt;My antlr-calculator project&lt;/a&gt; hadn't been updated in a while, and the demo site was still hosted on a virtual server. So, a great task to kill an evening was found! I decided to update the build system from a handful of CLI commands to NUKE, and to additionally move the hosting to an Azure App Service.&lt;/p&gt;
&lt;p&gt;Many think that you need a .NET project to use NUKE to automate your build. But while NUKE leverages C# to set up your build, you can actually use it to automate any task you want. For example, here's how you'd build an NPM package:&lt;/p&gt;
&lt;p&gt;&lt;code data-gist-id="53b51501765672cb4ccc60973b4068ce" data-gist-file="Build.cs" data-gist-line="76-94"&gt;&lt;/code&gt;&lt;/p&gt;
&lt;p&gt;Or, in this case, build &amp;amp; deploy a static website to Azure. Let's go through this step by step. First, we define the &lt;span class="Code"&gt;Target DeployDemo&lt;/span&gt; in Nuke. It's set up to invoke the &lt;span class="Code"&gt;Clean&lt;/span&gt; target and specifies some required parameters:&lt;/p&gt;
&lt;p&gt;&lt;code data-gist-id="53b51501765672cb4ccc60973b4068ce" data-gist-file="Build.cs" data-gist-line="96-102"&gt;&lt;/code&gt;&lt;/p&gt;
&lt;p&gt;Next, we use NPM to build the site and update a placeholder in &lt;span class="Code"&gt;index.html&lt;/span&gt; for the version:&lt;/p&gt;
&lt;p&gt;&lt;code data-gist-id="53b51501765672cb4ccc60973b4068ce" data-gist-file="Build.cs" data-gist-line="104-108"&gt;&lt;/code&gt;&lt;/p&gt;
&lt;p&gt;Then we get the Base64 encoded authentication header value to deploy to Azure and zip the output into a Zip file:&lt;/p&gt;
&lt;p&gt;&lt;code data-gist-id="53b51501765672cb4ccc60973b4068ce" data-gist-file="Build.cs" data-gist-line="110-112"&gt;&lt;/code&gt;&lt;/p&gt;
&lt;p&gt;Finally, we POST this zip file to the Kudu API and wait for a response:&lt;/p&gt;
&lt;p&gt;&lt;code data-gist-id="53b51501765672cb4ccc60973b4068ce" data-gist-file="Build.cs" data-gist-line="114-120"&gt;&lt;/code&gt;&lt;/p&gt;
&lt;p&gt;That's it! Now you've deployed a small website to Azure. With build automation servers, &lt;a href="https://github.com/GeorgDangl/antlr-calculator/blob/develop/Jenkinsfile" title="GitHub - GeorgDangl - antlr-calculator - Jenkinsfile"&gt;like Jenkins&lt;/a&gt;, GitHub Actions or Azure Pipelines, you can configure your build scripts to run on every commit.&lt;/p&gt;
&lt;p&gt;Happy automating!&lt;/p&gt;</description>
      <pubDate>Mon, 12 Oct 2020 20:15:01 Z</pubDate>
      <a10:updated>2020-10-12T20:15:01Z</a10:updated>
    </item>
    <item>
      <guid isPermaLink="false">1432</guid>
      <link>https://blog.dangl.me/archive/running-sql-server-integration-tests-in-net-core-projects-via-docker/</link>
      <category>Continuous Integration</category>
      <title>Running SQL Server Integration Tests in .NET Core Projects via Docker</title>
      <description>&lt;p&gt;I'm a huge fan of test driven development. It helps me deliver fewer defects in my apps and gives me confidence that my changes don't break any existing features.&lt;/p&gt;
&lt;p&gt;While I mostly write unit tests that run quickly, I try to also have lots of integration tests for all API endpoints in my web applications. This typically involves code that hits a database, so I had to find a way to easily spin up database instances for my tests. I'm a firm believer that running tests should only involve a single command, so relying on any existing database on the build agent was a hard no for me. Additionally, multiple builds might run in parallel on the same machine, e.g. when different branches are built at the same time.&lt;/p&gt;
&lt;p&gt;My initial approach had always involved using a SQLite in-memory database, actually a fresh one for every single test. This worked well, was fast enough and dead easy to set up. However, it had one huge problem: I'm not running SQLite in production, so I wasn't really testing the system under the same circumstances, and I couldn't test features that involved database-specific behavior.&lt;/p&gt;
&lt;p&gt;Typically in such a situation, most developers would use a containerized database, such as Microsoft SQL Server. But most such approaches I've seen were a bit ugly: the database usually had to be spun up manually before running the tests, or at least via a build script. But I wanted a way that works independently of how the tests are run - whether via my build script, by running &lt;span class="Code"&gt;dotnet test&lt;/span&gt; or just by clicking &lt;em&gt;Run&lt;/em&gt; in the Visual Studio Test Explorer.&lt;/p&gt;
&lt;p&gt;Luckily, I found the great &lt;a href="https://github.com/dotnet/Docker.DotNet" title="GitHub - Docker.DotNet"&gt;Docker.DotNet library&lt;/a&gt;, which allows you to access the Docker daemon directly from your C# code, e.g. in test setup and teardown methods. The setup for such integration tests is then pretty straightforward, but requires a bit of code to get it working reliably. Just follow the snippets below and you should be able to set up something similar!&lt;/p&gt;
&lt;p&gt;&lt;code data-gist-id="20ecb3873e78053abfc31c0d0458dfb2" data-gist-file="DockerSqlDatabaseUtilities.cs"&gt;&lt;/code&gt;&lt;/p&gt;
&lt;p&gt;The class &lt;span class="Code"&gt;DockerSqlDatabaseUtilities&lt;/span&gt; is where Docker.DotNet is referenced - its purpose is to ensure that a Microsoft SQL Server Docker image is running and ready for connections.&lt;/p&gt;
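&lt;p&gt;Here's a condensed sketch of what such a class does with Docker.DotNet - the image tag, password and port handling are placeholders, not our exact implementation:&lt;/p&gt;

```csharp
var client = new DockerClientConfiguration().CreateClient();

// Make sure the SQL Server image is available locally
await client.Images.CreateImageAsync(
    new ImagesCreateParameters
    {
        FromImage = "mcr.microsoft.com/mssql/server",
        Tag = "2019-latest"
    },
    null,
    new Progress<JSONMessage>());

// Start a container; PublishAllPorts maps 1433 to a random host port,
// which keeps parallel builds on the same machine from colliding
var container = await client.Containers.CreateContainerAsync(new CreateContainerParameters
{
    Image = "mcr.microsoft.com/mssql/server:2019-latest",
    Env = new[] { "ACCEPT_EULA=Y", "SA_PASSWORD=yourStrong(!)Password" },
    HostConfig = new HostConfig { PublishAllPorts = true }
});
await client.Containers.StartContainerAsync(container.ID, new ContainerStartParameters());
```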
&lt;p&gt;&lt;code data-gist-id="20ecb3873e78053abfc31c0d0458dfb2" data-gist-file="SqlServerDockerCollectionFixture.cs"&gt;&lt;/code&gt;&lt;/p&gt;
&lt;p&gt;Next, &lt;span class="Code"&gt;SqlServerDockerCollectionFixture&lt;/span&gt; is a test fixture that essentially just orchestrates the Docker setup and implements XUnit's &lt;span class="Code"&gt;IAsyncLifetime&lt;/span&gt; interface.&lt;/p&gt;
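&lt;p&gt;The fixture itself is tiny; a sketch (the utility method name here is hypothetical):&lt;/p&gt;

```csharp
public class SqlServerDockerCollectionFixture : IAsyncLifetime
{
    // Runs once before the first test in the collection
    public async Task InitializeAsync()
        => await DockerSqlDatabaseUtilities.EnsureSqlServerContainerIsRunningAsync();

    // Runs after the last test has finished - the container could be
    // stopped and removed here
    public Task DisposeAsync() => Task.CompletedTask;
}
```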
&lt;p&gt;&lt;code data-gist-id="20ecb3873e78053abfc31c0d0458dfb2" data-gist-file="AssemblyInfo.cs"&gt;&lt;/code&gt;&lt;/p&gt;
&lt;p&gt;Further down, we're specifying a custom &lt;span class="Code"&gt;TestFramework&lt;/span&gt; in &lt;span class="Code"&gt;AssemblyInfo.cs&lt;/span&gt;, since we'll run the test setup from the previous fixture class only once per assembly, and that's not supported out of the box in XUnit. You could refactor the code to internally track whether the setup has already been performed, or simply use the &lt;a href="https://github.com/tomaszeman/Xunit.Extensions.Ordering" title="GitHub - Xunit.Extensions.Ordering"&gt;Xunit.Extensions.Ordering&lt;/a&gt; package as the test framework😀&lt;/p&gt;
&lt;p&gt;&lt;code data-gist-id="20ecb3873e78053abfc31c0d0458dfb2" data-gist-file="TestHelper.cs"&gt;&lt;/code&gt;&lt;/p&gt;
&lt;p&gt;Another class I'm using is &lt;span class="Code"&gt;TestHelper&lt;/span&gt;. As you can guess from the name, it offers supporting methods during testing, such as providing an in-memory instance of our &lt;a href="https://www.dangl-it.com/products/danglidentity/" title="Dangl IT GmbH - Dangl.Identity Product Site"&gt;OpenID service Dangl.Identity&lt;/a&gt; and performing the actual setup of our in-memory ASP.NET Core backend, which is what we actually want to test.&lt;/p&gt;
&lt;p&gt;&lt;code data-gist-id="20ecb3873e78053abfc31c0d0458dfb2" data-gist-file="IntegrationTestBase.cs"&gt;&lt;/code&gt;&lt;/p&gt;
&lt;p&gt;Finally, we've got an abstract base class &lt;span class="Code"&gt;IntegrationTestBase&lt;/span&gt; that all our tests inherit from. &lt;span class="Code"&gt;TestHelper&lt;/span&gt; is provided as a dedicated instance per test, which means we can run all tests in parallel, yet still fully isolated from each other.&lt;/p&gt;
&lt;p&gt;Does this approach work for you? Tell me in the comments!&lt;/p&gt;
&lt;p&gt;Happy testing!&lt;/p&gt;</description>
      <pubDate>Tue, 22 Sep 2020 19:51:36 Z</pubDate>
      <a10:updated>2020-09-22T19:51:36Z</a10:updated>
    </item>
    <item>
      <guid isPermaLink="false">1431</guid>
      <link>https://blog.dangl.me/archive/simple-and-quick-way-to-backup-jenkins-to-azure-blob-storage/</link>
      <category>Continuous Integration</category>
      <title>Simple and Quick Way to Backup Jenkins to Azure Blob Storage</title>
      <description>&lt;p&gt;For years, I've been a happy user of &lt;a href="https://www.jenkins.io/" title="Jenkins"&gt;Jenkins&lt;/a&gt; to automate all our Continuous Integration &amp;amp; Continuous Delivery CI/CD steps. Just recently, I've been evaluating some other, more modern platforms. While trying out GitHub Actions and Azure DevOps, I've been generally satisfied, but found that both don't fit perfectly for what I want. GitHub Actions is still in it's early stages and lacks many features, and with Azure DevOps I've felt the setup to be a bit too complicated and tightly integrated with Azure services. Jenkins just could do everything quite well while giving you lots of freedom.&lt;/p&gt;
&lt;p&gt;These findings resulted in me not changing the existing setup too much - two servers, one running Windows and one Linux, had worked fine so far. However, I was a bit unsatisfied with the servers' performance. The instances were on virtual servers approaching their four-year mark, so I decided to switch to two up-to-date, dedicated machines.&lt;/p&gt;
&lt;p&gt;This led me to rethink the backup process. Previously, a cronjob backed up all the Jenkins data daily to a network share provided by the hosting company, which felt a bit outdated and made accessing backups more complicated than necessary. So I decided to just back up the data to Azure Blob Storage. However, there was no ready-made solution that I could find, so I had to roll my own.&lt;/p&gt;
&lt;p&gt;Luckily, Jenkins uses plain file storage in its &lt;span class="Code"&gt;JENKINS_HOME&lt;/span&gt; directory for all configuration, from users to jobs to plugins, so backing up the configuration is actually pretty easy - just copy the parts you want to keep. For this, I decided to leverage Jenkins itself to run the backup jobs, with a simple script that backs up the data to a Zip file and uploads it to the cloud.&lt;/p&gt;
&lt;p&gt;Essentially, it boils down to two parts, both of which are &lt;a href="https://github.com/GeorgDangl/JenkinsBackup" title="GitHub - GeorgDangl - JenkinsBackup"&gt;available directly on GitHub&lt;/a&gt;. First, the &lt;span class="Code"&gt;Jenkinsfile&lt;/span&gt; which configures the job itself:&lt;/p&gt;
&lt;p&gt;&lt;code data-gist-id="aa7b522d0d889e99cd45e77448b04a6e" data-gist-file="Jenkinsfile"&gt;&lt;/code&gt;&lt;/p&gt;
&lt;p&gt;It's really just doing two things: specify a trigger that runs once a month and then invoke the actual backup script. If your master is not a Windows machine, just call &lt;span class="Code"&gt;./build.sh BackupInstance&lt;/span&gt; instead of the PowerShell version.&lt;/p&gt;
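&lt;p&gt;As a sketch, such a Jenkinsfile can be as small as this - the agent label is a placeholder:&lt;/p&gt;

```groovy
pipeline {
    agent { label 'windows' }
    triggers {
        // Run once a month, at a Jenkins-chosen time in the first week
        cron('H H 1-7 * *')
    }
    stages {
        stage('Backup') {
            steps {
                powershell './build.ps1 BackupInstance'
            }
        }
    }
}
```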
&lt;p&gt;The script itself leverages the &lt;a href="http://www.nuke.build/" title="nuke.build"&gt;NUKE build automation tool&lt;/a&gt;, which itself is an awesome asset that &lt;a data-udi="umb://document/6ecb73fad5484466b0fc73c1a80f79c3" href="/archive/escalating-automation-the-nuclear-option/" title="Escalating Automation - The Nuclear Option"&gt;I've blogged about previously&lt;/a&gt;. It's an engine that lets you write build scripts in .NET and execute them anywhere. This backup script is a great example - it doesn't require anything preinstalled, you just run &lt;span class="Code"&gt;build.cmd&lt;/span&gt; and it works! &lt;a href="https://github.com/GeorgDangl/JenkinsBackup/blob/master/build/Build.cs#L116" title="GitHub - GeorgDangl - JenkinsBackup - Build.cs" data-anchor="#L116"&gt;You can view the full script here&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;Happy Automating!&lt;/p&gt;</description>
      <pubDate>Sun, 14 Jun 2020 14:52:13 Z</pubDate>
      <a10:updated>2020-06-14T14:52:13Z</a10:updated>
    </item>
    <item>
      <guid isPermaLink="false">1372</guid>
      <link>https://blog.dangl.me/archive/set-up-private-nuget-feeds-with-proget-v5/</link>
      <category>Continuous Integration</category>
      <title>Set Up Private NuGet Feeds with ProGet v5</title>
      <description>&lt;p&gt;I've &lt;a data-udi="umb://document/fccc13e4cad34696ab2643bc74a57451" href="/archive/set-up-private-nuget-feeds-with-proget-and-jenkins/" title="Set Up Private NuGet Feeds with ProGet and Jenkins"&gt;previously blogged&lt;/a&gt; about setting up ProGet and then written about the &lt;a data-udi="umb://document/17cae157557f45a291123bcd6647aa2b" href="/archive/set-up-private-nuget-feeds-with-proget-v4plus/" title="Set Up Private NuGet Feeds with ProGet v4+"&gt;changes in ProGet Version 4&lt;/a&gt;. This post is a small update to them. While nothing from the setup side has changed, &lt;a rel="noopener" href="https://inedo.com/support/documentation/proget/feeds/nuget#legacy" target="_blank" title="ProGet - Legacy (Quirks) NuGet Feeds"&gt;there have been updates&lt;/a&gt; to the way feeds work and you have to migrate legacy feeds to the new format.&lt;/p&gt;
&lt;p&gt;Previously, I recommended configuring an API key that is used for pushing packages. Such a key made unauthorized operations on the feed impossible: every request needed either the key itself or user credentials. Requests using the API key were considered to come from the &lt;span class="Code"&gt;Anonymous&lt;/span&gt;, meaning unauthenticated, user. This should still be the case when you upgrade and don't migrate legacy feeds, but you have probably configured package publishing permissions for the "Anonymous" user when you worked with API keys.&lt;/p&gt;
&lt;p&gt;This becomes a problem once you do upgrade! Feeds no longer have a dedicated API key, but the permission for Anonymous to publish packages stays in place. I've found no mentioning of it in the docs, so please &lt;strong&gt;be careful when you migrate feeds from the legacy format&lt;/strong&gt; to check if you have any open permissions left.&lt;/p&gt;
&lt;p&gt;With the latest feeds, I also no longer recommend using an API key but instead to create a dedicated user with push access. ProGet stores API keys in plaintext, which is something you should avoid to use if possible. With an user account, you can use it's &lt;span class="Code"&gt;username:password&lt;/span&gt; instead of the key. To do so, create an user via the admin interface and then switch to &lt;em&gt;Tasks&lt;/em&gt; where you should assign both &lt;span class="Code"&gt;Publish Packages&lt;/span&gt; and &lt;span class="Code"&gt;View &amp;amp; Download Packages&lt;/span&gt; permissions to the user:&lt;img id="__mcenew" style="display: block; margin-left: auto; margin-right: auto;" src="https://blog.dangl.me/media/1174/progetv5permissions.png" alt="ProGet Version 5 Task Permission Management" data-udi="umb://media/132744ae9b9e49eaaaf439a8a38274ad" /&gt;&lt;/p&gt;
&lt;p&gt;The &lt;span class="Code"&gt;View &amp;amp; Download Packages&lt;/span&gt; permission is not required in all cases, but some (older) NuGet clients do make a &lt;span class="Code"&gt;GET&lt;/span&gt; request to the feed before they publish a package and hence may need permission to view packages.&lt;/p&gt;
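&lt;p&gt;As a sketch, pushing a package with such a dedicated user could look like the following. The feed URL, user name and password below are placeholders for your own values:&lt;/p&gt;

```shell
# Hypothetical feed URL and credentials - replace with your own values.
FEED_URL="https://proget.example.com/nuget/internal"
CREDENTIALS="deploy-user:secret"

# The username:password pair is used in place of an API key, so a push
# command is composed like this (echoed here instead of executed):
echo "nuget push MyPackage.1.0.0.nupkg -Source $FEED_URL -ApiKey $CREDENTIALS"

# Clients that authenticate via HTTP Basic auth send the same pair,
# base64-encoded, in the Authorization header:
printf '%s' "$CREDENTIALS" | base64
```

&lt;p&gt;The last command prints the value that would follow &lt;span class="Code"&gt;Basic&lt;/span&gt; in the request's Authorization header.&lt;/p&gt;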
&lt;p&gt;Happy publishing!&lt;/p&gt;</description>
      <pubDate>Fri, 16 Feb 2018 10:27:19 Z</pubDate>
      <a10:updated>2018-02-16T10:27:19Z</a10:updated>
    </item>
    <item>
      <guid isPermaLink="false">1371</guid>
      <link>https://blog.dangl.me/archive/integration-testing-in-memory-compiled-code-with-roslyn-visual-studio-2017-edition/</link>
      <category>DotNet</category>
      <category>Continuous Integration</category>
      <title>Integration Testing In Memory Compiled Code with Roslyn - Visual Studio 2017 Edition</title>
      <description>&lt;p&gt;Have you ever written code to generate code? &lt;em&gt;Probably&lt;/em&gt;.&lt;/p&gt;
&lt;p&gt;Have you ever written proper integration tests for that? &lt;em&gt;Probably not&lt;/em&gt;.&lt;/p&gt;
&lt;blockquote&gt;This is a follow up post to &lt;a data-udi="umb://document/84cf1e72fc89438baafc4096c5beb264" href="/archive/integration-testing-in-memory-compiled-code-with-roslyn/" title="Integration Testing In Memory Compiled Code with Roslyn"&gt;last year's article&lt;/a&gt;, which was still about the now deprecated &lt;span class="Code"&gt;project.json&lt;/span&gt; format.&lt;/blockquote&gt;
&lt;p&gt;Everyone's done it - write code that writes code. Be it for converting from an esoteric Domain Specific Language, generating from a set of known data, or for any other reason. It happens a lot, it's useful a lot, but it's not tested a lot. Testing such code has always been inconvenient: it involved a lot of disk IO and quite often relied on scripts and manual work. Luckily, with &lt;a rel="noopener noreferrer" href="https://github.com/dotnet/roslyn" target="_blank" title="GitHub - dotnet/roslyn"&gt;.NET's Roslyn compiler&lt;/a&gt;, there are NuGet packages for code analysis and compilation available that make all of this easy!&lt;/p&gt;
&lt;p&gt;The &lt;a rel="noopener noreferrer" href="https://github.com/GeorgDangl/InMemoryCompilation" target="_blank" title="GitHub - GeorgDangl/InMemoryCompilation"&gt;complete sample repository is available at GitHub&lt;/a&gt;. Here are the interesting parts:&lt;/p&gt;
&lt;p&gt;
&lt;script src="https://gist.github.com/GeorgDangl/069a42d3dc1f5c2f9172d2e80e83170b.js?file=CodeGenerator.cs"&gt;&lt;/script&gt;
&lt;/p&gt;
&lt;p&gt;It starts with the &lt;span class="Code"&gt;CodeGenerator&lt;/span&gt;, which does what its name implies - it creates code. In this case, it's a simple class that has one method: &lt;span class="Code"&gt;AddIntegers()&lt;/span&gt;.&lt;/p&gt;
&lt;p&gt;
&lt;script src="https://gist.github.com/GeorgDangl/069a42d3dc1f5c2f9172d2e80e83170b.js?file=InMemoreCompilation.Tests.csproj"&gt;&lt;/script&gt;
&lt;/p&gt;
&lt;p&gt;This &lt;span class="Code"&gt;InMemoreCompilation.Tests.csproj&lt;/span&gt; is a regular, xUnit-enabled project configuration file. There is a reference to the &lt;span&gt;&lt;span class="Code"&gt;Microsoft.CodeAnalysis.CSharp&lt;/span&gt; package, which brings the Roslyn API into your project.&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;
&lt;script src="https://gist.github.com/GeorgDangl/069a42d3dc1f5c2f9172d2e80e83170b.js?file=CodeGeneratorTests.cs"&gt;&lt;/script&gt;
&lt;/p&gt;
&lt;p&gt;The actual integration test class is quite big, so let's break it down into its functional units:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;span class="Code"&gt;GenerateCode()&lt;/span&gt; does just that - it gets the string representation of the sourcecode&lt;/li&gt;
&lt;li&gt;&lt;span class="Code"&gt;CreateCompilation()&lt;/span&gt; takes the sourcecode, adds assembly references and turns it into a &lt;span class="pl-en"&gt;&lt;span class="Code"&gt;CSharpCompilation&lt;/span&gt; object&lt;/span&gt;&lt;span&gt;&lt;/span&gt;&lt;/li&gt;
&lt;li&gt;&lt;span class="pl-en"&gt;&lt;span class="Code"&gt;CompileAndLoadAssembly()&lt;/span&gt; now invokes the Roslyn API to compile onto a MemoryStream, then loads it and makes the content available&lt;/span&gt;&lt;/li&gt;
&lt;li&gt;&lt;span class="pl-en"&gt;Finally, &lt;span class="Code"&gt;CallCalculatorMethod()&lt;/span&gt; uses reflection on the newly generated assembly and invokes the &lt;span class="Code"&gt;AddIntegers()&lt;/span&gt; method&lt;/span&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;This is a really interesting approach to tackling code generation. While I got the initial bits to set this up from &lt;a rel="noopener noreferrer" href="http://www.tugberkugurlu.com/archive/compiling-c-sharp-code-into-memory-and-executing-it-with-roslyn" target="_blank" title="Tugberk Ugurlu - Compiling C# Code Into Memory and Executing It with Roslyn"&gt;Tugberk Ugurlu's&lt;/a&gt; blog, I've had &lt;a rel="noopener noreferrer" href="https://github.com/GeorgDangl/XmlTools" target="_blank" title="GitHub - GeorgDangl/XmlTools"&gt;a project of my own&lt;/a&gt; where I did some heavy code generation (for correcting Xml documents) and needed exactly this. The code there works both in .Net Core and the full .Net Framework.&lt;/p&gt;
&lt;p&gt;Happy compiling!&lt;/p&gt;</description>
      <pubDate>Sat, 03 Feb 2018 16:12:43 Z</pubDate>
      <a10:updated>2018-02-03T16:12:43Z</a10:updated>
    </item>
    <item>
      <guid isPermaLink="false">1370</guid>
      <link>https://blog.dangl.me/archive/using-webdeploy-via-nuke-build-scripts/</link>
      <category>Continuous Integration</category>
      <title>Using WebDeploy via Nuke Build Scripts</title>
      <description>&lt;p&gt;Some days ago, I read about &lt;a rel="noopener noreferrer" href="http://www.nuke.build/" target="_blank" title="NUKE build"&gt;NUKE&lt;/a&gt;, which is kind of like &lt;a rel="noopener noreferrer" href="https://www.gnu.org/software/make/" target="_blank" title="GNU Make"&gt;Make&lt;/a&gt;, &lt;a rel="noopener noreferrer" href="https://cakebuild.net/" target="_blank" title="Cake Build"&gt;Cake&lt;/a&gt; and &lt;a rel="noopener noreferrer" href="https://fake.build/" target="_blank" title="FAKE Build"&gt;FAKE&lt;/a&gt;. Besides breaking the rhyme, it's looking really promising and takes an interesting approach for build configuration.&lt;/p&gt;
&lt;blockquote&gt;&lt;strong&gt;Update:&lt;/strong&gt; I've now published the package on &lt;a rel="noopener" href="https://www.nuget.org/packages/Nuke.WebDeploy/" target="_blank" title="NuGet - Nuke.WebDeploy"&gt;NuGet: Nuke.WebDeploy&lt;/a&gt;&lt;/blockquote&gt;
&lt;p&gt;I'm mostly using PowerShell scripts for all my automation, and have also tinkered a bit with dotnet script. PowerShell scripts are really easy to use, but complex tasks have a tendency to become brittle and, since I'm nowhere near being an expert, always require a bit of trial and error. Scripting with C# is great, but is often not easy to set up. Now there's NUKE, which lets you write build scripts in regular .Net projects. Overall, the experience with it was quite good for me, and I've migrated one of our projects at work already.&lt;/p&gt;
&lt;p&gt;One of the missing bits in NUKE is the support for Microsoft WebDeploy. I'm using it a lot, both for deploying to Azure as well as IIS. Unfortunately, there's no NuGet distribution available for the command line utility; it must be installed on the machine executing the build. However, there's a &lt;a rel="noopener noreferrer" href="https://www.nuget.org/packages/Microsoft.Web.Deployment/" target="_blank" title="NuGet - Microsoft.Web.Deployment"&gt;Microsoft.Web.Deployment package on NuGet&lt;/a&gt; which exposes the complete WebDeploy API. I'm working on releasing a NuGet package for NUKE, based on &lt;a rel="noopener noreferrer" href="https://github.com/SharpeRAD/Cake.WebDeploy" target="_blank" title="GitHub - SharpeRAD - Cake.WebDeploy"&gt;Cake.WebDeploy&lt;/a&gt;. In the meantime, you can simply use the code on &lt;strong&gt;&lt;a rel="noopener noreferrer" href="https://gist.github.com/GeorgDangl/5521ef94bc6cae85793f8842f64fa3c5" target="_blank" title="GitHub Gists - GeorgDangl - Nuke.WebDeploy"&gt;this Gist&lt;/a&gt;&lt;/strong&gt; to deploy to Azure or IIS with WebDeploy using NUKE. This snippet shows a shortened build config. The &lt;span class="Code"&gt;Deploy&lt;/span&gt; target executes WebDeploy:&lt;/p&gt;
&lt;p&gt;
&lt;script src="https://gist.github.com/GeorgDangl/d69fd27941f563848d4d06f215fe3063.js?file=Build.cs"&gt;&lt;/script&gt;
&lt;/p&gt;
&lt;p&gt;You can simply build, publish and deploy your website with PowerShell, for example in a Jenkins build:&lt;/p&gt;
&lt;p&gt;
&lt;script src="https://gist.github.com/GeorgDangl/d69fd27941f563848d4d06f215fe3063.js?file=jenkins.ps1"&gt;&lt;/script&gt;
&lt;/p&gt;
&lt;p&gt;Happy building!&lt;/p&gt;</description>
      <pubDate>Sat, 27 Jan 2018 19:04:11 Z</pubDate>
      <a10:updated>2018-01-27T19:04:11Z</a10:updated>
    </item>
    <item>
      <guid isPermaLink="false">1365</guid>
      <link>https://blog.dangl.me/archive/creating-correct-source-code-links-with-docfx-in-jenkins/</link>
      <category>Continuous Integration</category>
      <title>Creating Correct Source Code Links with DocFX in Jenkins</title>
      <description>&lt;p&gt;If you haven't used it yet, you should check out &lt;a rel="noopener noreferrer" href="https://dotnet.github.io/docfx/index.html" target="_blank" title="DocFX"&gt;DocFX&lt;/a&gt; to automate your documentation generation. It's working great with .Net Core projects, is simple to use and easily customizable. It's also got a feature to link directly from your documentation to source code on GitHub, but this comes with a small catch: When running the build in Jenkins, DocFX &lt;a rel="noopener noreferrer" href="https://dotnet.github.io/docfx/tutorial/docfx_getting_started.html#4-use-docfx-with-a-build-server" target="_blank" title="DocFX Documentation"&gt;uses an environment parameter&lt;/a&gt; &lt;span class="Code"&gt;GIT_BRANCH&lt;/span&gt; set by Jenkins. This leads to broken links, since the variable is, for example, &lt;span class="Code"&gt;origin/dev&lt;/span&gt; when it should only be &lt;span class="Code"&gt;dev&lt;/span&gt;. This also prevents the links from being versioned, so documentation of old versions won't have the historic source code available.&lt;/p&gt;
&lt;p&gt;But this being Jenkins, there's nothing we can't solve! In every build, there's an environment variable &lt;span class="Code"&gt;GIT_COMMIT&lt;/span&gt; provided which, as the name suggests, is the full hash of the checked out commit.&lt;/p&gt;
&lt;p&gt;All you need to do is run this small PowerShell script to fix the links in your generated Html files:&lt;/p&gt;
&lt;script src="https://gist.github.com/GeorgDangl/192a067d9812ceacdeda6360a76eaea0.js"&gt;&lt;/script&gt;
&lt;p&gt;If you're looking for a bit more info on automating documentation generation, have a look at these two projects:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;The &lt;a rel="noopener noreferrer" href="https://github.com/GeorgDangl/LightQuery" target="_blank" title="GitHub - GeorgDangl - LightQuery"&gt;LightQuery&lt;/a&gt; package is configured to generate docs on build. Most importantly, there's a &lt;span class="Code"&gt;GenerateAndDeployDocs.ps1&lt;/span&gt; script at the root doing all the work.&lt;/li&gt;
&lt;li&gt;For publishing, I've created &lt;a rel="noopener noreferrer" href="https://github.com/GeorgDangl/WebDocu" target="_blank" title="GitHub - GeorgDangl - WebDocu"&gt;WebDocu&lt;/a&gt;. It's an Asp.Net Core app that lets you host online documentation with versioning and user management for non-public projects. You can &lt;a rel="noopener noreferrer" href="https://docs.dangl-it.com" target="_blank" title="DanglDocs"&gt;take a look at it right here&lt;/a&gt;, a few of my projects have their docs publicly available.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Happy documenting!&lt;/p&gt;</description>
      <pubDate>Sun, 12 Nov 2017 22:06:50 Z</pubDate>
      <a10:updated>2017-11-12T22:06:50Z</a10:updated>
    </item>
    <item>
      <guid isPermaLink="false">1361</guid>
      <link>https://blog.dangl.me/archive/publish-azure-functions-via-jenkins-ci-continuously-to-azure/</link>
      <category>Continuous Integration</category>
      <title>Publish Azure Functions via Jenkins CI Continuously to Azure</title>
      <description>&lt;p&gt;We're not exactly heavily oriented towards a microservice architecture, but we do use them a fair bit for some resource intensive methods. I'm pretty happy with Azure Functions, especially with the parts that I can automate: I'll describe how to setup a simple Jenkins job that builds your function and deploys it automatically to Azure Functions.&lt;/p&gt;
&lt;p&gt;In your projects repository, you need a build script to compile your function, like this:&lt;/p&gt;
&lt;script src="https://gist.github.com/GeorgDangl/e3e7e0462e82e09abed40bf0d8d8aacb.js?file=PublishFunction.ps1"&gt;&lt;/script&gt;
&lt;p&gt;It's not too complicated, but I want to highlight these points:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Following best practices, &lt;span class="Code"&gt;$deployUsername&lt;/span&gt; and &lt;span class="Code"&gt;$deployPassword&lt;/span&gt; are passed as parameters to the script so we're not storing any sensitive data in source control&lt;/li&gt;
&lt;li&gt;&lt;span class="Code"&gt;dotnet build&lt;/span&gt; doesn't yet work with Azure Functions, so the script is using MSBuild. It's easiest to get by simply installing the Visual Studio 2017 Community Edition on your CI server (which I recommend anyway, since it comes with all the SDKs you'll ever need)&lt;/li&gt;
&lt;li&gt;You have to insert your own values for the placeholders, &lt;span class="Code"&gt;&amp;lt;ProjectFolder&amp;gt;&lt;/span&gt; and &lt;span class="Code"&gt;&amp;lt;FunctionName&amp;gt;&lt;/span&gt;. This could also be parameterized if you intend to have a more complex deployment scenario&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;In Jenkins, you simply call this script and provide the required parameters:&lt;/p&gt;
&lt;p&gt;&lt;img id="__mcenew" style="display: block; margin-left: auto; margin-right: auto;" src="https://blog.dangl.me/media/1171/jenkinsazurefunctionsdeploysetup.png" alt="Jenkins Azure Functions Deployment with WebDeploy Configuration" data-udi="umb://media/26a10d44fbbf4476ba5d515fe2e6f823" /&gt;&lt;/p&gt;
&lt;p&gt;I really encourage you to use something like the &lt;a rel="noopener noreferrer" href="https://wiki.jenkins.io/display/JENKINS/Credentials+Binding+Plugin" target="_blank" title="Jenkins Credentials Binding Plugin"&gt;Credentials Binding Plugin&lt;/a&gt; to provide your secrets. Otherwise, it's as simple as a setup can be. You can set up &lt;a data-udi="umb://document/5fd6ac1692164a5da6c5eff165f6de38" href="/archive/configure-git-hooks-on-bonobo-git-server-in-windows/" title="Configure Git Hooks on Bonobo Git Server in Windows"&gt;Git Hooks&lt;/a&gt; or use the GitHub plugin to automatically build whenever you push new code.&lt;/p&gt;
&lt;p&gt;If you're wondering where to actually get your deployment credentials, they're included in the Publishing Profile that you can download for the function in the Azure portal.&lt;/p&gt;
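&lt;p&gt;If you want to script that step, the profile is plain XML and the credentials sit in its &lt;span class="Code"&gt;userName&lt;/span&gt; and &lt;span class="Code"&gt;userPWD&lt;/span&gt; attributes. A minimal sketch, with made-up values standing in for a real downloaded file:&lt;/p&gt;

```shell
# One fragment of a downloaded .PublishSettings file, with made-up values;
# a real file contains the full publishProfile XML element.
profile='publishProfile userName="$my-function" userPWD="abc123" publishMethod="MSDeploy"'

# Pull out the deployment credentials with sed capture groups:
user=$(printf '%s' "$profile" | sed 's/.*userName="\([^"]*\)".*/\1/')
pass=$(printf '%s' "$profile" | sed 's/.*userPWD="\([^"]*\)".*/\1/')

echo "$user"   # prints: $my-function
echo "$pass"   # prints: abc123
```

&lt;p&gt;Azure deployment user names really do start with a &lt;span class="Code"&gt;$&lt;/span&gt;, so keep the value quoted when you pass it on to other tools.&lt;/p&gt;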
&lt;p&gt;Happy deploying!&lt;/p&gt;</description>
      <pubDate>Mon, 06 Nov 2017 17:13:21 Z</pubDate>
      <a10:updated>2017-11-06T17:13:21Z</a10:updated>
    </item>
    <item>
      <guid isPermaLink="false">1356</guid>
      <link>https://blog.dangl.me/archive/publish-and-deploy-aspnet-core-applications-from-jenkins-to-iis-2017-edition/</link>
      <category>Continuous Integration</category>
      <title>Publish and Deploy Asp.Net Core Applications from Jenkins to IIS - 2017 Edition</title>
      <description>&lt;p&gt;If you have an Asp.Net Core application that you want to continuously deploy to either Azure or IIS on a Windows Server, you should follow the following, simple steps.&lt;/p&gt;
&lt;blockquote&gt;This is a follow up post to &lt;a data-udi="umb://document/ead686a9f85649ec933540671bb80269" href="/archive/publish-and-deploy-asp-net-core-applications-from-jenkins-to-iis/" title="Publish and Deploy Asp.Net Core Applications from Jenkins to IIS"&gt;last years article&lt;/a&gt;. It's updated for the Visual Studio 2017 project format.&lt;/blockquote&gt;
&lt;h2&gt;Get the Required Tools on the CI Server&lt;/h2&gt;
&lt;p&gt;Just like with&lt;span&gt; &lt;/span&gt;&lt;a rel="noopener noreferrer" data-id="1172" href="/archive/unit-testing-and-code-coverage-with-jenkins-and-net-core/" target="_blank" title="Unit Testing and Code Coverage with Jenkins and .NET Core"&gt;unit testing in a CI environment&lt;/a&gt;, you’ll need the .Net Core SDK to be available to Jenkins. It’s available at&lt;span&gt; &lt;/span&gt;&lt;a rel="noopener noreferrer" href="https://www.microsoft.com/net/download#core" target="_blank" title=".Net Core Official Download Site"&gt;the official download site&lt;/a&gt;. Make sure you've got matching SDK and runtime versions, as you might encounter issues with xUnit if you have a later SDK installed with no matching runtime.&lt;/p&gt;
&lt;p&gt;Additionally, if you're using IIS instead of an Azure Web App, you need the .Net Core Windows Server hosting bundle on the hosting server, available on the same site. See the&lt;span&gt; &lt;/span&gt;&lt;a rel="noopener noreferrer" href="https://docs.asp.net/en/latest/publishing/iis.html#install-the-net-core-windows-server-hosting-bundle" target="_blank" title=".Net Core Documentation for Installing Windows Server Hosting Bundle"&gt;official documentation&lt;/a&gt; for how to install it.&lt;/p&gt;
&lt;p&gt;You will also need to install Node.js and npm on the server that is hosting Jenkins so you can run all the prepublish commands for an Asp.Net Core application. To install Node.js, go to its&lt;span&gt; &lt;/span&gt;&lt;a rel="noopener noreferrer" href="https://nodejs.org/en/download/" target="_blank" title="Node.js Official Download Site"&gt;download site&lt;/a&gt;&lt;span&gt; &lt;/span&gt;and grab the latest Windows installer. When installing, make sure that&lt;span&gt; &lt;/span&gt;&lt;span class="Code"&gt;Add to PATH&lt;/span&gt; is enabled in the install options (it is by default), so that the npm command is added to the PATH environment variable and therefore directly recognized in the command line interface.&lt;/p&gt;
&lt;p&gt;Make sure to&lt;span&gt; &lt;/span&gt;&lt;strong&gt;restart Jenkins&lt;/strong&gt;&lt;span&gt; &lt;/span&gt;after you've added the tools, as the PATH variable is only read at startup and not watched for changes!&lt;/p&gt;
&lt;blockquote&gt;&lt;strong&gt;Tip:&lt;/strong&gt;&lt;span&gt; &lt;/span&gt;It's not required to globally install node modules like bower or gulp on the build server. That should be handled by node.js’ devDependencies and proper pre- and postpublish scripts in your deployment process.&lt;/blockquote&gt;
&lt;p&gt;Finally, the &lt;a rel="noopener noreferrer" href="https://www.iis.net/downloads/microsoft/web-deploy" target="_blank" title="Microsoft - Web Deploy Download"&gt;Web Deploy&lt;/a&gt; tool should be installed on your server to easily deploy to IIS and Azure from the command line.&lt;/p&gt;
&lt;h2&gt;Configure Jenkins to Build the Project&lt;/h2&gt;
&lt;p&gt;&lt;span&gt;In Jenkins, just create a new project and configure the source code management, for example by pulling from a Git repository. To build, publish and deploy, the following commands are necessary. They're explained in detail in the next section.&lt;/span&gt;&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;&lt;span&gt;&lt;strong&gt;Windows Batch&lt;br /&gt;&lt;/strong&gt;&lt;span class="Code"&gt;cd src\&amp;lt;ProjectFolder&amp;gt;&lt;br /&gt;dotnet publish -c Production -o publish&lt;/span&gt;&lt;br /&gt;The &lt;span class="Code"&gt;dotnet publish&lt;/span&gt; command restores, builds and publishes the web application. With &lt;span class="Code"&gt;-o publish&lt;/span&gt;, we're specifying the output folder to be &lt;span class="Code"&gt;/publish&lt;/span&gt;, relative to the project directory.&lt;br /&gt;&lt;/span&gt;&lt;/li&gt;
&lt;li&gt;&lt;span&gt;&lt;strong&gt;Windows PowerShell (optional)&lt;/strong&gt;&lt;br /&gt;&lt;span class="Code"&gt;cd src\&amp;lt;ProjectFolder&amp;gt;&lt;/span&gt;&lt;br /&gt;&lt;span class="Code"&gt;./TransformWebConfigForProduction.ps1&lt;/span&gt;&lt;br /&gt;There's currently no built-in support in the dotnet CLI for transforming &lt;span class="Code"&gt;web.config&lt;/span&gt; files, so we're calling a custom PowerShell script to do it for us before deploying.&lt;/span&gt;&lt;/li&gt;
&lt;li&gt;&lt;span&gt;&lt;strong&gt;Windows Batch&lt;/strong&gt;&lt;span class="Code"&gt;&lt;br /&gt;"C:\Program Files (x86)\IIS\Microsoft Web Deploy V3\msdeploy.exe" -verb:sync -source:IisApp='%WORKSPACE%\src\&amp;lt;ProjectFolder&amp;gt;\publish' -dest:iisapp='&amp;lt;SiteName&amp;gt;',computerName='&amp;lt;WebDeployEndpoint&amp;gt;/msdeploy.axd?site=&amp;lt;SiteName&amp;gt;',authType='basic',username='&amp;lt;DeployUsername&amp;gt;',password='&amp;lt;DeployPassword&amp;gt;' -enableRule:AppOffline&lt;/span&gt;&lt;br /&gt;Uses Web Deploy to copy the &lt;span class="Code"&gt;publish&lt;/span&gt; folder to the server. You might have to adjust the path to &lt;span class="Code"&gt;msdeploy.exe&lt;/span&gt; if it's different on your machine.&lt;/span&gt;&lt;/li&gt;
&lt;/ol&gt;
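&lt;p&gt;Strung together as a single sketch (the project folder, site name and endpoint below are placeholders, and the optional transform step is omitted), the commands the Jenkins job composes look like this:&lt;/p&gt;

```shell
# Placeholder values - substitute your own project folder, site and server.
PROJECT_DIR="src/MyProject"
SITE_NAME="MySite"
# 8172 is the default port of the Web Deploy service on IIS.
ENDPOINT="https://myserver:8172/msdeploy.axd?site=$SITE_NAME"

# Step 1: restore, build and publish into ./publish
echo "dotnet publish -c Production -o publish"

# Step 3: sync the publish folder to the server via Web Deploy
echo "msdeploy.exe -verb:sync -source:IisApp='$PROJECT_DIR/publish' -dest:iisapp='$SITE_NAME',computerName='$ENDPOINT',authType='basic' -enableRule:AppOffline"
```

&lt;p&gt;The commands are only echoed here for illustration; in the Jenkins job they run as the batch steps shown above, with the credentials supplied via Jenkins credentials bindings.&lt;/p&gt;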
&lt;h2&gt;&lt;span&gt;Publish the Project&lt;/span&gt;&lt;/h2&gt;
&lt;p&gt;&lt;span&gt;The first build step switches to the project directory and then calls &lt;span class="Code"&gt;dotnet publish -c Production -o publish&lt;/span&gt;.&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;To make sure your pre- and postpublish scripts run, include them in your &lt;span class="Code"&gt;package.json&lt;/span&gt; (for npm) and &lt;span class="Code"&gt;project.csproj&lt;/span&gt; (for dotnet) files, for example like this:&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;
&lt;script src="https://gist.github.com/GeorgDangl/86ea4cacaefaf5d74f10afbfcfbc5be0.js?file=project.csproj"&gt;&lt;/script&gt;
&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;This combination would effectively run the &lt;/span&gt;&lt;span class="Code"&gt;npm install&lt;/span&gt;&lt;span&gt; command first, followed by &lt;/span&gt;&lt;span class="Code"&gt;bower install&lt;/span&gt;&lt;span&gt;, &lt;/span&gt;&lt;span class="Code"&gt;gulp clean&lt;/span&gt;&lt;span&gt; and &lt;/span&gt;&lt;span class="Code"&gt;gulp min&lt;/span&gt;&lt;span&gt;, then compile and publish the application via dotnet CLI. This example is from the &lt;/span&gt;&lt;a rel="noopener noreferrer" href="https://github.com/GeorgDangl/WebDocu" target="_blank" title="GitHub - GeorgDangl - WebDocu"&gt;WebDocu&lt;/a&gt;&lt;span&gt; project, a very simple MVC website. You can get a bit more creative for your apps, for example by using the Angular CLI to build your Single Page App front end bundles and copy them into your &lt;span class="Code"&gt;wwwroot&lt;/span&gt; folder before publishing.&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;The &lt;span class="Code"&gt;WebConfigTransformRunner&lt;/span&gt; dependency is optional. It pulls in the console tool via NuGet, which you'll need if you intend to run web.config transformations in the next step.&lt;/span&gt;&lt;/p&gt;
&lt;h2&gt;&lt;span&gt;Run web.config Transformations (optional)&lt;/span&gt;&lt;/h2&gt;
&lt;p&gt;To run web.config transformations from the command line, make sure the &lt;span&gt;WebConfigTransformRunner tool is referenced in your project and call this script in your project directory:&lt;/span&gt;&lt;/p&gt;
&lt;script src="https://gist.github.com/GeorgDangl/86ea4cacaefaf5d74f10afbfcfbc5be0.js?file=TransformWebConfigForProduction.ps1"&gt;&lt;/script&gt;
&lt;p&gt;&lt;span&gt;If you want to read more about manually running web.config transformation, &lt;/span&gt;&lt;span&gt;&lt;a rel="noopener noreferrer" data-id="1178" href="/archive/applying-web-config-transformations-without-msbuild/" target="_blank" title="Applying web.config Transformations without MSBuild"&gt;see here&lt;/a&gt;.&lt;/span&gt;&lt;/p&gt;
&lt;h2&gt;&lt;span&gt;Publish to IIS with Web Deploy&lt;/span&gt;&lt;/h2&gt;
&lt;p&gt;&lt;span&gt;For the next step, you should have set up Web Deploy on both the source and target server. There’s only one step needed, and that is calling &lt;span class="Code"&gt;msdeploy.exe&lt;/span&gt; to publish the app to IIS or Azure.&lt;br /&gt;When doing that, you must supply valid credentials for the user that has publishing rights in IIS. In Azure, you can simply download the publish profile in the Web App settings and extract the deployment username and password.&lt;/span&gt;&lt;/p&gt;
&lt;blockquote&gt;&lt;span&gt;Since it’s &lt;strong&gt;never&lt;/strong&gt; a good idea to &lt;strong&gt;include plain text&lt;/strong&gt; user &lt;strong&gt;credentials&lt;/strong&gt; in a script or build step, you should set up credentials in Jenkins and use these as variables in your build step.&lt;/span&gt;&lt;/blockquote&gt;
&lt;p&gt;&lt;span&gt;The command for Web Deploy is rather long:&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span class="Code"&gt;"C:\Program Files (x86)\IIS\Microsoft Web Deploy V3\msdeploy.exe" -verb:sync -source:IisApp='%WORKSPACE%\src\&amp;lt;&lt;strong&gt;ProjectFolder&lt;/strong&gt;&amp;gt;\publish' -dest:iisapp='&amp;lt;&lt;strong&gt;SiteName&lt;/strong&gt;&amp;gt;',computerName='&amp;lt;&lt;strong&gt;WebDeployEndpoint&lt;/strong&gt;&amp;gt;/msdeploy.axd?site=&amp;lt;&lt;strong&gt;SiteName&lt;/strong&gt;&amp;gt;',authType='basic',username='&amp;lt;&lt;strong&gt;DeployUsername&lt;/strong&gt;&amp;gt;',password='&amp;lt;&lt;strong&gt;DeployPassword&lt;/strong&gt;&amp;gt;' -enableRule:AppOffline&lt;/span&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;ProjectFolder&lt;br /&gt;&lt;/strong&gt;That's simply where the project is inside your Jenkins workspace.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;SiteName&lt;/strong&gt;&lt;span&gt;&lt;br /&gt;The name for the site. This is rather important since the WebDeploy service will determine whether or not the user has access to the service depending on the site name that is specified here.&lt;/span&gt;&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;WebDeployEndpoint&lt;/strong&gt;&lt;span&gt;&lt;br /&gt;You should just specify the Url endpoint for the WebDeploy service. You either configure it manually in IIS or just download the publish information for an Azure Web App. Here's &lt;a data-udi="umb://document/5ca9babbb4d743c2b7104738125bf53e" href="/archive/install-and-configure-web-deploy-for-an-iis-installation/" title="Install and Configure Web Deploy for an IIS Installation"&gt;a post about configuring Web Deploy&lt;/a&gt; on IIS.&lt;/span&gt;&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;DeployUsername&lt;/strong&gt;&lt;span&gt;&lt;br /&gt;The username for the user who you gave access to WebDeploy. You should really use Jenkins credentials for this instead of pasting the actual username and password in your Jenkins job.&lt;/span&gt;&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;DeployPassword&lt;/strong&gt;&lt;span&gt;&lt;br /&gt;Should be clear – provide the user account's password.&lt;/span&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;The &lt;span class="Code"&gt;authType=’Basic’&lt;/span&gt; parameter is not strictly required. If your happen to be in an Active Directory domain, you could also use Windows Authentication and completely omit any credentials in the Web Deploy process.&lt;/p&gt;
&lt;p&gt;&lt;span&gt;Specyfing &lt;span&gt;&lt;span class="Code"&gt;-enableRule:AppOffline&lt;/span&gt; means that your app is taken offline during the deployment process to avoid file locks.&lt;/span&gt;&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;&lt;span class="Code"&gt;verb:sync&lt;/span&gt; makes sure to sync our publish folder with the server, so this is the operation (“verb”) to perform&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;If you want to dig further into msdeploy, these three links from TechNet contain a lot of info:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a rel="noopener noreferrer" href="https://technet.microsoft.com/de-de/library/dd568992(v=ws.10).aspx" target="_blank" title="Technet Web Deploy Rules"&gt;Web Deploy Rules&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a rel="noopener noreferrer" href="https://technet.microsoft.com/de-de/library/dd569089(v=ws.10).aspx" target="_blank" title="Technet Web Deploy Operation Settings"&gt;Web Deploy Operation Settings&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a rel="noopener noreferrer" href="https://technet.microsoft.com/de-de/library/dd569001(v=ws.10).aspx" target="_blank" title="Technet Web Deploy Provider Settings"&gt;Web Deploy Provider Settings&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;Summary&lt;/h2&gt;
&lt;p&gt;&lt;span&gt;Now you should be able to click on “Build now” and let the magic happen: continuous deployment for your .Net Core website. It's good practice to configure triggers for your job. I often use branches for workflow control, e.g. I have a job that deploys a website to a staging (or testing) environment whenever a push happens to the &lt;span class="Code"&gt;develop&lt;/span&gt; branch.&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;Happy Deploying!&lt;/span&gt;&lt;/p&gt;
</description>
      <pubDate>Sun, 22 Oct 2017 14:11:38 Z</pubDate>
      <a10:updated>2017-10-22T14:11:38Z</a10:updated>
    </item>
    <item>
      <guid isPermaLink="false">1352</guid>
      <link>https://blog.dangl.me/archive/publish-net-core-code-coverage-results-with-reportgenerator-in-jenkins/</link>
      <category>Continuous Integration</category>
      <title>Publish .Net Core Code Coverage Results with ReportGenerator in Jenkins</title>
      <description>&lt;p&gt;I've &lt;a data-udi="umb://document/0af7dc1507fd4c188880e6c98a47fe4d" href="/archive/unit-testing-and-code-coverage-with-jenkins-and-net-core-2017-edition/" title="Unit Testing and Code Coverage with Jenkins and .NET Core - 2017 Edition"&gt;blogged before&lt;/a&gt; about automating your .Net Core test, deploy and coverage work flow with Jenkins. When visualizing coverage reports, I'm usually sticking to what's built into Jenkins. I've &lt;a data-udi="umb://document/40b91f3434c14e7185f30e29dd868f9e" href="/archive/unit-testing-and-code-coverage-with-jenkins-and-net-core/" title="Unit Testing and Code Coverage with Jenkins and .NET Core"&gt;dabbled with the ReportGenerator before&lt;/a&gt;, but was content with the well-known &lt;a rel="noopener noreferrer" href="https://wiki.jenkins.io/display/JENKINS/Cobertura+Plugin" target="_blank" title="Jenkins Cobertura Plugin"&gt;Cobertura Plugin&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;Currently, the only working solution for capturing code coverage that supports both .Net and .Net Core, &lt;a rel="noopener noreferrer" href="https://github.com/OpenCover/opencover" target="_blank" title="GitHub - OpenCover"&gt;OpenCover&lt;/a&gt;, doesn't output the Cobertura format by default and thus needs the &lt;a rel="noopener noreferrer" href="https://github.com/danielpalme/OpenCoverToCoberturaConverter" target="_blank" title="GitHub - OpenCoverToCoberturaConverter"&gt;OpenCoverToCoberturaConverter&lt;/a&gt;. Recently, its author, &lt;a rel="noopener noreferrer" href="http://www.palmmedia.de" target="_blank" title="Daniel Palme"&gt;Daniel Palme&lt;/a&gt;, suggested I try out the new version of the ReportGenerator. Since he's the author of a library I use in literally every single project, both private and at work, I was excited to check out the &lt;a rel="noopener noreferrer" href="https://github.com/danielpalme/ReportGenerator" target="_blank" title="GitHub - ReportGenerator"&gt;ReportGenerator&lt;/a&gt; again. In short: It's great! The visualization looks modern and chic, and it's enriched with additional metadata such as cyclomatic complexity and the famous &lt;a rel="noopener noreferrer" href="https://testing.googleblog.com/2011/02/this-code-is-crap.html" target="_blank" title="Google Testing Blog - This Code is CRAP"&gt;CRAP score&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;It's also really simple to set up. Please check out &lt;a rel="noopener noreferrer" href="https://github.com/GeorgDangl/ReportGeneratorExample" target="_blank" title="GitHub - GeorgDangl - ReportGeneratorExample"&gt;this sample project&lt;/a&gt;, which you can either clone or download directly, to get a basic starter for a .Net Standard library with .Net Core unit tests, including code coverage. It's a solid foundation for expanding your code base. You only need two pieces to get more confidence in your code evolution:&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;A PowerShell script&lt;/strong&gt;, &lt;span class="Code"&gt;TestsAndCoverage.ps1&lt;/span&gt;, which runs the tests, gathers coverage data and builds the Html report:&lt;/p&gt;
&lt;script src="https://gist.github.com/GeorgDangl/df180003c6a227bb795986679a1a0455.js"&gt;&lt;/script&gt;
&lt;p&gt;&lt;strong&gt;A Jenkins instance&lt;/strong&gt; configured to check out your code, run the script and publish both your test results and the generated report:&lt;/p&gt;
&lt;p&gt;&lt;img id="__mcenew" style="display: block; margin-left: auto; margin-right: auto;" src="https://blog.dangl.me/media/1168/jenkinsconfiguration.png" alt="Report Generator Configuration in Jenkins" data-udi="umb://media/8593625ad4cd4bcb83a17ad0bb3370ef" /&gt;&lt;/p&gt;
&lt;p&gt;After your first run, there'll be a &lt;em&gt;Code Coverage Report&lt;/em&gt; entry in your Jenkins job that gives you a detailed breakdown of your coverage. You can even easily configure additional features such as keeping a history with ReportGenerator.&lt;/p&gt;
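&lt;p&gt;Keeping a history, for example, only takes one additional parameter on the command line. A rough sketch of the invocation, with placeholder paths (check the ReportGenerator documentation for the exact parameters of your version):&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;ReportGenerator.exe "-reports:OpenCover.coverageresults" "-targetdir:CoverageReport" "-historydir:CoverageHistory"&lt;/code&gt;&lt;/pre&gt;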
&lt;p&gt;As always, spend some minutes at the beginning of your project to configure automatic testing, coverage and reporting on every code push, and you'll save &lt;em&gt;future-you&lt;/em&gt; a ton of trouble later, once the code base has grown and technical debt starts to accumulate.&lt;/p&gt;
&lt;p&gt;Happy Analyzing!&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt; &lt;/p&gt;</description>
      <pubDate>Wed, 11 Oct 2017 16:24:11 Z</pubDate>
      <a10:updated>2017-10-11T16:24:11Z</a10:updated>
    </item>
    <item>
      <guid isPermaLink="false">1312</guid>
      <link>https://blog.dangl.me/archive/target-and-visualize-results-of-multiple-frameworks-in-net-unit-tests-with-jenskins-ci/</link>
      <category>Continuous Integration</category>
      <title>Target and Visualize Results of Multiple Frameworks in .Net Unit Tests with Jenkins CI</title>
      <description>&lt;p&gt;In the &lt;a rel="noopener noreferrer" href="https://github.com/GeorgDangl/Dangl.Common" target="_blank" title="GitHub - GeorgDangl/Dangl.Common"&gt;example project&lt;/a&gt;, the unit test project is configured with multiple &lt;span&gt;&lt;span class="Code"&gt;TargetFrameworks&lt;/span&gt;. This ensures that tests are repeated on all target platforms and that differing runtime behavior is caught early. There are a few cases where platforms actually behave differently, so it's best practice to run your tests in as many configurations as possible. See the project file below for how to set up such a project.&lt;/span&gt;&lt;/p&gt;
&lt;script src="https://gist.github.com/GeorgDangl/0f3280bfa22b3f5549234f7d57d070c7.js?file=Dangl.Common.csproj"&gt;&lt;/script&gt;
&lt;p&gt;&lt;span&gt;With &lt;a rel="noopener noreferrer" data-udi="umb://document/0af7dc1507fd4c188880e6c98a47fe4d" href="/archive/unit-testing-and-code-coverage-with-jenkins-and-net-core-2017-edition/" target="_blank" title="Unit Testing and Code Coverage with Jenkins and .NET Core - 2017 Edition"&gt;xUnit and Jenkins&lt;/a&gt;, you ideally have a configuration that tests your code on every commit, so you'll have test results available that look like this:&lt;/span&gt;&lt;/p&gt;
&lt;p&gt; &lt;img id="__mcenew" style="display: block; margin-left: auto; margin-right: auto;" src="https://blog.dangl.me/media/1143/testresults_before.png" alt="Test results before" data-udi="umb://media/cd83e659fe9549ec9757961a18541587" /&gt;&lt;/p&gt;
&lt;p&gt;When you take a look at the detailed results, you see that your tests are repeated for every target framework. This is great, since it means that all tests are actually run, but you're not getting information about the framework in which a failure occurred. You've just got the same test repeated, with the exact same name. Luckily, if you launch &lt;span class="Code"&gt;dotnet xunit&lt;/span&gt; and don't specify an explicit framework, it will by default run tests across all specified frameworks and amend the output files with the framework name.&lt;/p&gt;
&lt;script src="https://gist.github.com/GeorgDangl/0f3280bfa22b3f5549234f7d57d070c7.js?file=Tests.ps1"&gt;&lt;/script&gt;
&lt;p&gt;This makes it easy to add information about the test framework before publishing in Jenkins with the &lt;a rel="noopener noreferrer" href="https://wiki.jenkins.io/display/JENKINS/xUnit+Plugin?focusedCommentId=57181754" target="_blank" title="Jenkins CI - xUnit Plugin"&gt;xUnit plugin&lt;/a&gt;: Get the framework name from the test results filename and prepend it to the type attribute of every run test case. The above test script calls into a second helper script to do just that:&lt;/p&gt;
&lt;script src="https://gist.github.com/GeorgDangl/0f3280bfa22b3f5549234f7d57d070c7.js?file=AppendxUnitFramework.ps1"&gt;&lt;/script&gt;
&lt;p&gt;Suddenly, you have detailed information about every executed test case and can identify and reproduce problems much more easily when they arise:&lt;/p&gt;
&lt;p&gt; &lt;img id="__mcenew" src="https://blog.dangl.me/media/1145/testresults_after.png" alt="Test results after" data-udi="umb://media/238dbd9966e14a5ca4d16bc9af53e49e" /&gt;&lt;/p&gt;
&lt;p&gt;Happy testing!&lt;/p&gt;</description>
      <pubDate>Mon, 02 Oct 2017 10:56:30 Z</pubDate>
      <a10:updated>2017-10-02T10:56:30Z</a10:updated>
    </item>
    <item>
      <guid isPermaLink="false">1307</guid>
      <link>https://blog.dangl.me/archive/continuous-deployment-from-jenkins-to-a-nuget-feed/</link>
      <category>Continuous Integration</category>
      <title>Continuous Deployment from Jenkins to a NuGet Feed</title>
      <description>&lt;p&gt;Code sharing via separate assemblies is an ideal way of separating concerns and benefiting from code reuse. For this, it's great to have continuous deployment of development builds when you're iterating fast and only creating manual releases when deemed necessary.&lt;/p&gt;
&lt;p&gt;Personally, I'm hosting my private projects on a &lt;a rel="noopener noreferrer" data-udi="umb://document/17cae157557f45a291123bcd6647aa2b" href="/archive/set-up-private-nuget-feeds-with-proget-v4plus/" target="_blank" title="Set Up Private NuGet Feeds with ProGet v4+"&gt;ProGet&lt;/a&gt; instance and the public ones are either on &lt;a rel="noopener noreferrer" href="https://www.nuget.org/profiles/GeorgDangl" target="_blank" title="NuGet - Georg Dangl"&gt;NuGet&lt;/a&gt;, or for dev releases on &lt;a rel="noopener noreferrer" href="https://www.myget.org/gallery/dangl" target="_blank" title="MyGet - Georg Dangl"&gt;MyGet&lt;/a&gt;. Configuring your project is easy:&lt;/p&gt;
&lt;script src="https://gist.github.com/GeorgDangl/13e025ea12cec5324138557a31d70f97.js?file=Dangl.Common.csproj"&gt;&lt;/script&gt;
&lt;p&gt;Instead of using the &lt;span class="Code"&gt;Version&lt;/span&gt; attribute, you're setting the &lt;span class="Code"&gt;VersionPrefix&lt;/span&gt; to the next higher one since your last release. Additionally, a conditional &lt;span class="Code"&gt;VersionSuffix&lt;/span&gt; is appended if the current branch is &lt;span class="Code"&gt;origin/dev&lt;/span&gt; (or whatever development branch you're using). This means your package version will be something like &lt;span class="Code"&gt;1.4.1-build-19&lt;/span&gt; in development and when you finally merge to master, it's &lt;span class="Code"&gt;1.4.1&lt;/span&gt;. Both &lt;span&gt;&lt;span class="Code"&gt;BUILD_NUMBER&lt;/span&gt; and &lt;span class="Code"&gt;GIT_BRANCH&lt;/span&gt; are environment variables provided by Jenkins.&lt;/span&gt;&lt;/p&gt;
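&lt;p&gt;To make the logic concrete, here's how the effective version is assembled, mirrored in shell purely for illustration (MSBuild evaluates the condition for you; the values are examples):&lt;/p&gt;

```shell
# Environment variables provided by Jenkins during a build
GIT_BRANCH="origin/dev"
BUILD_NUMBER="19"

# VersionPrefix from the csproj; the suffix only applies on the dev branch
VERSION_PREFIX="1.4.1"
if [ "$GIT_BRANCH" = "origin/dev" ]; then
    VERSION="$VERSION_PREFIX-build-$BUILD_NUMBER"
else
    VERSION="$VERSION_PREFIX"
fi

echo "$VERSION"   # 1.4.1-build-19 here, plain 1.4.1 on master
```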
&lt;p&gt;Having a project like this requires then a bit of configuration in Jenkins to deploy to your package feed, MyGet in this example:&lt;/p&gt;
&lt;p&gt;&lt;img id="__mcenew" src="https://blog.dangl.me/media/1142/jenkinssetup.png" alt="" data-udi="umb://media/e4c7c530a39c407697b90b8567ad470c" /&gt;&lt;img id="__mcenew" style="display: block; margin-left: auto; margin-right: auto;" src="https://blog.dangl.me/media/1142/jenkinssetup.png" alt="" data-udi="umb://media/e4c7c530a39c407697b90b8567ad470c" /&gt;&lt;/p&gt;
&lt;p&gt;I'm using the &lt;a rel="noopener noreferrer" href="https://www.google.de/search?client=opera&amp;amp;q=jenkins+credentials+plugin&amp;amp;sourceid=opera&amp;amp;ie=UTF-8&amp;amp;oe=UTF-8" target="_blank" title="Jenkins Credentials Plugin"&gt;Credentials Plugin&lt;/a&gt; to inject my MyGet API key into the build. The project is configured to auto-package when built in the &lt;span class="Code"&gt;Release&lt;/span&gt; configuration. Then, the package is located and pushed to my feed via the NuGet command line utility. Now, on every build, I get a new package pushed to my feed that's available for consumption downstream right away.&lt;/p&gt;
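&lt;p&gt;The push step itself boils down to a single command. A sketch, assuming the API key is injected as an environment variable and the feed URL follows MyGet's usual pattern (package name, variable name and feed URL are placeholders):&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;nuget push MyPackage.1.4.1-build-19.nupkg -ApiKey %MYGET_API_KEY% -Source https://www.myget.org/F/your-feed/api/v2/package&lt;/code&gt;&lt;/pre&gt;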
&lt;p&gt;When you push to &lt;span class="Code"&gt;master&lt;/span&gt; and want to make your package available directly via NuGet, just download the package from MyGet and release on NuGet. It's not getting any easier!&lt;/p&gt;
&lt;p&gt;Happy deploying!&lt;/p&gt;</description>
      <pubDate>Fri, 01 Sep 2017 20:18:48 Z</pubDate>
      <a10:updated>2017-09-01T20:18:48Z</a10:updated>
    </item>
    <item>
      <guid isPermaLink="false">1299</guid>
      <link>https://blog.dangl.me/archive/unit-testing-and-code-coverage-with-jenkins-and-net-core-2017-edition/</link>
      <category>Continuous Integration</category>
      <title>Unit Testing and Code Coverage with Jenkins and .NET Core - 2017 Edition</title>
      <description>&lt;p&gt;Last year, &lt;a data-udi="umb://document/40b91f3434c14e7185f30e29dd868f9e" href="/archive/unit-testing-and-code-coverage-with-jenkins-and-net-core/" title="Unit Testing and Code Coverage with Jenkins and .NET Core"&gt;I've blogged&lt;/a&gt; about setting up Jenkins with the then just released .Net Core ecosystem. This is an updated post to target the latest dotnet CLI and MSBuild-based csproj format that was introduced with Visual Studio 2017.&lt;/p&gt;
&lt;p&gt;&lt;span&gt;This post shows you how to use &lt;strong&gt;Jenkins&lt;/strong&gt; as your Continuous Integration system, a project that supports &lt;strong&gt;netstandard&lt;/strong&gt;, .&lt;strong&gt;Net Core&lt;/strong&gt; or the full &lt;strong&gt;.Net Framework&lt;/strong&gt;, &lt;strong&gt;xUnit&lt;/strong&gt; as test runner and &lt;strong&gt;OpenCover&lt;/strong&gt; to generate the coverage reports. Jenkins will also report any compiler warnings and open tasks you may have in your project (such as &lt;span class="Code"&gt;// TODO&lt;/span&gt; comments). You will need a &lt;strong&gt;Windows&lt;/strong&gt; environment on your CI server, otherwise &lt;a rel="noopener noreferrer" data-udi="umb://document/4e3dc17d230c47e7bc6290f3cf425cd3" href="/archive/net-core-and-typescript-continuous-integration-testing-with-jenkins-on-linux/" target="_blank" title=".Net Core and TypeScript Continuous Integration Testing with Jenkins on Linux"&gt;check here&lt;/a&gt; for instructions on &lt;strong&gt;Linux&lt;/strong&gt;.&lt;/span&gt;&lt;/p&gt;
&lt;h2&gt;&lt;span&gt;Required Tools&lt;/span&gt;&lt;/h2&gt;
&lt;p&gt;&lt;span&gt;Obviously, you have to install the .NET Core SDK. It is available &lt;/span&gt;&lt;a rel="noopener noreferrer" href="https://www.microsoft.com/net/download#core" target="_blank" title=".NET Core Official Download Site"&gt;at the official dot.net site&lt;/a&gt;&lt;span&gt; and installs the dotnet CLI tool, registering it on the PATH variable for all users. If you intend to run tests for the full .Net Framework, you also have to install the matching SDKs; I found simply installing Visual Studio 2017 with all the .Net SDKs to be the easiest way to do that.&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;In Jenkins, four plugins are required:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a rel="noopener noreferrer" href="https://wiki.jenkins-ci.org/display/JENKINS/xUnit+Plugin" target="_blank" title="Jenkins CI xUnit Plugin"&gt;xUnit Plugin&lt;/a&gt;&lt;span&gt; &lt;/span&gt;to evaluate test results&lt;/li&gt;
&lt;li&gt;&lt;a rel="noopener noreferrer" href="https://wiki.jenkins-ci.org/display/JENKINS/Cobertura+Plugin" target="_blank" title="Jenkins CI Cobertura Plugin"&gt;Cobertura Plugin&lt;/a&gt;&lt;span&gt; &lt;/span&gt;for the code coverage data&lt;/li&gt;
&lt;li&gt;&lt;a rel="noopener noreferrer" href="https://wiki.jenkins.io/display/JENKINS/Warnings+Plugin" target="_blank" title="Jenkins CI Warnings Plug-in"&gt;Warnings Plug-in&lt;/a&gt; to scan for compiler warnings&lt;/li&gt;
&lt;li&gt;&lt;a rel="noopener noreferrer" href="https://wiki.jenkins.io/display/JENKINS/Task+Scanner+Plugin" target="_blank" title="Jenkins CI Task Scanner Plugin"&gt;Task Scanner Plugin&lt;/a&gt; to scan for open tasks&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Since we're using OpenCover, the code coverage results must be transformed to the Cobertura format before they're published in Jenkins. The &lt;span class="Code"&gt;OpenCoverToCoberturaConverter&lt;/span&gt; NuGet package should be included as a build-time dependency (&lt;span class="Code"&gt;&amp;lt;PrivateAssets&amp;gt;All&amp;lt;/PrivateAssets&amp;gt;&lt;/span&gt;) in your test project. OpenCover works for both &lt;strong&gt;.Net Core&lt;/strong&gt; and the full &lt;strong&gt;.Net Framework&lt;/strong&gt;. It slows the process down quite a bit, though, so consider splitting larger projects into unit test jobs with code coverage and plain integration or acceptance test jobs without OpenCover.&lt;/p&gt;
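&lt;p&gt;The relevant part of the test project file looks roughly like this (the version numbers are just examples):&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;&amp;lt;ItemGroup&amp;gt;
  &amp;lt;PackageReference Include="OpenCover" Version="4.6.519"&amp;gt;
    &amp;lt;PrivateAssets&amp;gt;All&amp;lt;/PrivateAssets&amp;gt;
  &amp;lt;/PackageReference&amp;gt;
  &amp;lt;PackageReference Include="OpenCoverToCoberturaConverter" Version="0.2.6"&amp;gt;
    &amp;lt;PrivateAssets&amp;gt;All&amp;lt;/PrivateAssets&amp;gt;
  &amp;lt;/PackageReference&amp;gt;
&amp;lt;/ItemGroup&amp;gt;&lt;/code&gt;&lt;/pre&gt;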
&lt;p&gt;Your &lt;span class="Code"&gt;TestProject.csproj&lt;/span&gt; should look like this:&lt;/p&gt;
&lt;script src="https://gist.github.com/GeorgDangl/6181c271e3dbdc562a414c3e298de0aa.js?file=Sample.Tests.csproj"&gt;&lt;/script&gt;
&lt;p&gt;The &lt;span class="Code"&gt;xunit&lt;/span&gt;, &lt;span class="Code"&gt;xunit.runner.visualstudio&lt;/span&gt; and &lt;span class="Code"&gt;Microsoft.NET.Tests.Sdk&lt;/span&gt; packages are required for xUnit, while the &lt;span&gt;&lt;span class="Code"&gt;dotnet-xunit&lt;/span&gt; CLI tool allows us to run tests directly with &lt;span class="Code"&gt;dotnet xunit&lt;/span&gt; and supply xUnit parameters instead of going through &lt;span class="Code"&gt;dotnet test&lt;/span&gt;. &lt;span class="Code"&gt;OpenCover&lt;/span&gt; and &lt;span class="Code"&gt;OpenCoverToCoberturaConverter&lt;/span&gt; are build time dependencies, both are included in the project. During package restore, the tools are copied to your local &lt;span class="Code"&gt;.nuget&lt;/span&gt; folder. The &lt;span class="Code"&gt;TargetFrameworks&lt;/span&gt; are set to whatever frameworks you want to test. I suggest to always test .Net Core and the full .Net Framework if your project targets netstandard, since sometimes there are slight differences that may crop up by testing against multiple supported platforms. For example, the &lt;span class="Code"&gt;System.Text.Encoding&lt;/span&gt; package behaves different between these two.&lt;/span&gt;&lt;/p&gt;
&lt;h2&gt;&lt;span&gt;Configure your Project under Test&lt;/span&gt;&lt;/h2&gt;
&lt;p&gt;&lt;span&gt;To allow OpenCover to record coverage, it needs to have access to the actual source code. The easiest way is to configure your project under test to output debug symbols (pdb files) when compiling for debug. &lt;strong&gt;This is only required in your actual project, not in your test projects&lt;/strong&gt;. If your Code Lens support in Visual Studio seems broken and you can't navigate to test cases from the Test Explorer, you might accidentally be generating symbols for your test projects, too. This currently messes with xUnit / the Visual Studio test discovery engine and inhibits Code Lens support and the mapping from reported test cases to actual code. Configure your project under test like this:&lt;/span&gt;&lt;/p&gt;
&lt;script src="https://gist.github.com/GeorgDangl/6181c271e3dbdc562a414c3e298de0aa.js?file=Sample.csproj"&gt;&lt;/script&gt;
&lt;p&gt;&lt;span&gt;At the root of your project, you need a PowerShell script to run your tests. I like the script approach, since it gives me a reproducible, single-click way (&lt;a rel="noopener noreferrer" href="https://www.joelonsoftware.com/2000/08/09/the-joel-test-12-steps-to-better-code/" target="_blank" title="Joel On Software - The Joel Test"&gt;#2 on the Joel Test&lt;/a&gt;) to test my project.&lt;/span&gt;&lt;/p&gt;
&lt;blockquote&gt;&lt;span&gt;&lt;strong&gt;Tip&lt;/strong&gt;: &lt;a rel="noopener noreferrer" href="https://gist.github.com/GeorgDangl/6181c271e3dbdc562a414c3e298de0aa#file-tests-ps1" target="_blank" title="GitHub - Georg Dangl - PowerShell script to automate xUnit tests"&gt;Here's a simpler script&lt;/a&gt; if you only want to run tests and no code coverage.&lt;/span&gt;&lt;/blockquote&gt;
&lt;script src="https://gist.github.com/GeorgDangl/6181c271e3dbdc562a414c3e298de0aa.js?file=TestsAndCoverage.ps1"&gt;&lt;/script&gt;
&lt;p&gt;&lt;span&gt;It's doing three things:&lt;/span&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;span&gt;Locate the OpenCover tools via the NuGet cache&lt;/span&gt;&lt;/li&gt;
&lt;li&gt;&lt;span&gt;Run xUnit tests with attached OpenCover&lt;/span&gt;&lt;/li&gt;
&lt;li&gt;&lt;span&gt;Convert OpenCover results to Cobertura for later processing&lt;/span&gt;&lt;/li&gt;
&lt;/ul&gt;
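&lt;p&gt;Conceptually, the three steps map to invocations like the following; paths and parameters are placeholders, the gist above contains the exact script:&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;REM 1. Locate the tools in the local NuGet cache after package restore, e.g.
REM    %USERPROFILE%\.nuget\packages\opencover\...\OpenCover.Console.exe

REM 2. Run the xUnit tests with OpenCover attached
OpenCover.Console.exe -target:"dotnet.exe" -targetargs:"xunit" -output:"OpenCover.coverageresults" -oldStyle -register:user

REM 3. Convert the OpenCover results to Cobertura
OpenCoverToCoberturaConverter.exe -input:"OpenCover.coverageresults" -output:"Cobertura.coverageresults" -sources:"."&lt;/code&gt;&lt;/pre&gt;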
&lt;h2&gt;&lt;span&gt;Configure the Job in Jenkins&lt;/span&gt;&lt;/h2&gt;
&lt;p&gt;&lt;span&gt;In Jenkins, there's only one simple &lt;span class="Code"&gt;Execute Windows Batch Command&lt;/span&gt; step required:&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span class="Code"&gt;powershell.exe -NoProfile -ExecutionPolicy Bypass ./TestsAndCoverage.ps1&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;After the NuGet packages have been restored, the &lt;span class="Code"&gt;dotnet xunit&lt;/span&gt; command is available and your tests can run. Every configured test project is tested against all targeted frameworks while the coverage is being recorded. It's not strictly necessary with this script, but it's good advice to configure a job to clean the workspace before each checkout whenever results are written to disk, to be sure no previous runs contaminate your results.&lt;/span&gt;&lt;/p&gt;
&lt;h2&gt;&lt;span&gt;Evaluate Results in Jenkins&lt;/span&gt;&lt;/h2&gt;
&lt;p&gt;&lt;span&gt;In total, four post build actions are required to evaluate all results in Jenkins. Add and configure the tasks as shown below for a great CI experience.&lt;/span&gt;&lt;/p&gt;
&lt;h3&gt;&lt;span&gt;Scan Workspace for Open Tasks&lt;/span&gt;&lt;/h3&gt;
&lt;p&gt;&lt;span&gt;This is great for finding all those &lt;span class="Code"&gt;// TODO&lt;/span&gt; and &lt;span class="Code"&gt;// HACK&lt;/span&gt; things that accumulate in every project. It is part of measuring &lt;a rel="noopener noreferrer" href="https://www.techopedia.com/definition/27913/technical-debt" target="_blank" title="Techopedia - Technical Debt"&gt;technical debt&lt;/a&gt; and allows you to visually track when you should commit to your backlog and choose refactoring over new features. Or, at least, it gives you a bad feeling when the graph rises...&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;&lt;img id="__mcenew" style="display: block; margin-left: auto; margin-right: auto;" src="https://blog.dangl.me/media/1136/scanworkspaceforopentasks.png" alt="Jenkins - Scan workspace for open Tasks" data-udi="umb://media/a49ea86f277448c69087f6d674661039" /&gt;&lt;/span&gt;&lt;/p&gt;
&lt;h3&gt;&lt;span&gt;Scan for Compiler Warnings&lt;/span&gt;&lt;/h3&gt;
&lt;p&gt;&lt;span&gt;Similar to Open Tasks, but usually more severe. This task captures all warnings reported during the &lt;span class="Code"&gt;dotnet build&lt;/span&gt; (or MSBuild) process. I'm a big fan of a zero-warnings tolerance, and this report certainly shows you when it's out of hand. In my experience, a lot of compiler warnings directly translate to a lot of bugs and runtime errors, so treat these seriously.&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;&lt;img id="__mcenew" style="display: block; margin-left: auto; margin-right: auto;" src="https://blog.dangl.me/media/1139/scanforcompilerwarnings.png" alt="Jenkins - Scan for compiler warnings" data-udi="umb://media/dc286fc6919d4c60ad112b937722fdf7" /&gt;&lt;/span&gt;&lt;/p&gt;
&lt;h3&gt;&lt;span&gt;Publish Cobertura Coverage Report&lt;/span&gt;&lt;/h3&gt;
&lt;p&gt;&lt;span&gt;The PowerShell script generates these &lt;span class="Code"&gt;*.coverageresults&lt;/span&gt; files in the workspace directory; you just need to pick them up. But be wary of overly strict code coverage metrics: I've often seen goals that require a certain coverage percentage. In my experience, the most important thing is to have a solid base to start from and to not let coverage decline as the code base grows. This ensures that your changes and additions are properly covered by tests, without wasting your time writing tests just to have every conditional hit.&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;&lt;img id="__mcenew" style="display: block; margin-left: auto; margin-right: auto;" src="https://blog.dangl.me/media/1137/publishcoberturareport.png" alt="Jenkins - Publish Cobertura report" data-udi="umb://media/b34385386beb40269c6d647b2201c846" /&gt;&lt;/span&gt;&lt;/p&gt;
&lt;h3&gt;&lt;span&gt;Publish xUnit Test Result Report&lt;/span&gt;&lt;/h3&gt;
&lt;p&gt;&lt;span&gt;Probably the most important task, checking if the code actually does what it's supposed to do. You can configure custom thresholds for when a build is considered a failure, but I can't imagine a case where more than zero would be an acceptable answer here.&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;&lt;img id="__mcenew" style="display: block; margin-left: auto; margin-right: auto;" src="https://blog.dangl.me/media/1138/publishxunittestreport.png" alt="Jenkins - Publish xUnit test results report" data-udi="umb://media/3f649cf908f34de69b1c738112be33a3" /&gt;&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;Don't forget to add an &lt;em&gt;E-mail notification&lt;/em&gt; action at the end, being informed in detail and on time is the most important aspect of CI!&lt;/span&gt;&lt;/p&gt;
&lt;h2&gt;&lt;span&gt;View the Results in Jenkins&lt;/span&gt;&lt;/h2&gt;
&lt;p&gt;&lt;span&gt;Your job view in Jenkins will now display four graphs on the right side for a quick visual overview of your project's health, with more information being available via the menu options. This configuration has been working well for me on all projects I've encountered, and if you want, you can even easily &lt;a rel="noopener noreferrer" data-udi="umb://document/7309eddfa9584ae0972c850e05a62c81" href="/archive/netcoreheroes-angular2-with-net-core-in-visual-studio-2015-part-iv/" target="_blank" title="NetCoreHeroes: Angular 2 with .Net Core in Visual Studio 2015, Part IV"&gt;integrate your front end JavaScript (or TypeScript) tests&lt;/a&gt; into your job if what you're building is a web project.&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;&lt;img id="__mcenew" style="display: block; margin-left: auto; margin-right: auto;" src="https://blog.dangl.me/media/1140/jenkinsresult.png" alt="Jenkins Job Results" data-udi="umb://media/37e497d7aba845f7bcb81a4bd8493dc3" /&gt;&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;Happy Testing!&lt;/span&gt;&lt;/p&gt;</description>
      <pubDate>Sat, 19 Aug 2017 18:14:42 Z</pubDate>
      <a10:updated>2017-08-19T18:14:42Z</a10:updated>
    </item>
    <item>
      <guid isPermaLink="false">1264</guid>
      <link>https://blog.dangl.me/archive/net-core-and-typescript-continuous-integration-testing-with-jenkins-on-linux/</link>
      <category>Continuous Integration</category>
      <category>Linux</category>
      <title>.Net Core and TypeScript Continuous Integration Testing with Jenkins on Linux</title>
      <description>&lt;p&gt;I've previously blogged about the &lt;a rel="noopener noreferrer" href="https://github.com/GeorgDangl/NetCoreHeroes" target="_blank" title="GitHub - GeorgDangl/NetCoreHeroes"&gt;NetCoreHeroes &lt;/a&gt;project, an Asp.Net Core demo app that features an Angular frontend. It's been my learning project for getting a nice WebDev experience in Visual Studio and Jenkins, and recently &lt;a rel="noopener noreferrer" href="/archive/unit-testing-and-code-coverage-with-jenkins-and-net-core/#comment-2870892160" target="_blank" title="Dangl Blog - Unit Testing and Code Coverage with Jenkins and .Net Core"&gt;there's been interest&lt;/a&gt; in getting it to work in Linux.&lt;/p&gt;
&lt;p&gt;Being a traditional Windows and .Net guy, I've never done a lot with Linux, but I wanted to see how easy it actually is. As it turns out, it's pretty easy: getting everything up and running on Linux took me only two evenings. I've used a Hyper-V virtual machine with Ubuntu Server 16.04, but results should be comparable across distributions.&lt;/p&gt;
&lt;h2&gt;Install Hyper-V&lt;/h2&gt;
&lt;p&gt;On a Windows host system, you've got to install the Hyper-V role. I'm running it on an Intel NUC with an Atom processor. It sits somewhere behind the TV in the living room, is ridiculously slow, but gets the job done and costs about 300,- €.&lt;/p&gt;
&lt;p&gt;&lt;img id="__mcenew" style="display: block; margin-left: auto; margin-right: auto;" src="https://blog.dangl.me/media/1124/01_install-hyperv-role-on-windows.png" alt="Windows Server 2012 R2 - Install Hyper-V Role" data-udi="umb://media/73b9e1b249d743e89029a15918cef2c0" /&gt;&lt;/p&gt;
&lt;p&gt;From the Server Manager, go to &lt;em&gt;Add Roles &amp;amp; Features&lt;/em&gt;, select the &lt;em&gt;Hyper-V&lt;/em&gt; role, confirm the GUI and PowerShell management tools and click on next until you’re in the configuration tab for &lt;em&gt;Virtual Switches&lt;/em&gt;. Since the VM will need internet access, you’ll want to select a network interface through which you’ll connect the virtual machines to the outside world.&lt;/p&gt;
&lt;p&gt;&lt;img id="__mcenew" style="display: block; margin-left: auto; margin-right: auto;" src="https://blog.dangl.me/media/1125/02_select-virtual-switch-for-hyperv.png" alt="Hyper-V Virtual Switch Configuration" data-udi="umb://media/f1589b55d5084c7eaed4b854a82210fd" /&gt;&lt;/p&gt;
&lt;p&gt;In an ideal world, your virtualization host has multiple LAN ports, with a dedicated one for the virtual machines. However, the NUC behind the TV doesn't have that, so we’ll accept the risk and move on. This is mostly a problem for connectivity, since misconfiguration might make your machine unavailable.&lt;br /&gt;You don’t need to add support for Live Migrations for this example and can go with the default storage locations for VM configuration and hard disks.&lt;/p&gt;
&lt;h2&gt;Setup the Virtual Machine&lt;/h2&gt;
&lt;h3&gt;Download Ubuntu Server&lt;/h3&gt;
&lt;p&gt;Go to &lt;a href="https://www.ubuntu.com/download/server"&gt;https://www.ubuntu.com/download/server&lt;/a&gt; and download the current Long Term Support (LTS) version of Ubuntu Server. At the time of writing, that’s 16.04.2.&lt;/p&gt;
&lt;h3&gt;Provision the VM&lt;/h3&gt;
&lt;p&gt;In the Hyper-V manager, create a new virtual machine. I've assigned it 2GB of RAM and a 20GB hard disk, which should be plenty for a simple Jenkins instance and one Asp.Net Core demo site. Generation 2 VMs offer the newer features, with the drawback of not being compatible with hosts prior to Windows Server 2012.&lt;/p&gt;
&lt;p&gt;&lt;img id="__mcenew" style="display: block; margin-left: auto; margin-right: auto;" src="https://blog.dangl.me/media/1126/03_createvm.png" alt="Provision a Virtual Machine in Hyper-V" data-udi="umb://media/5495b803526b486fbc183edeeeab0946" /&gt;&lt;/p&gt;
&lt;h3&gt;Install Ubuntu Server &amp;amp; Jenkins&lt;/h3&gt;
&lt;p&gt;When provisioning the VM, I’ve selected to install an operating system from an image and chose the &lt;em&gt;*.iso&lt;/em&gt; that was downloaded earlier for Ubuntu Server. After the VM has been provisioned, you need to change its firmware settings to deactivate &lt;em&gt;Secure Boot&lt;/em&gt; since that’s not compatible with Ubuntu Server. After that, simply connect to it and power it on.&lt;/p&gt;
&lt;p&gt;&lt;img id="__mcenew" style="display: block; margin-left: auto; margin-right: auto;" src="https://blog.dangl.me/media/1127/04_hyperv-ubuntu-installation-welcome-screen.png" alt="Ubuntu Server Installation Welcome Screen" data-udi="umb://media/1c9cd55350e54ec38a9812cdb11d98f4" /&gt;&lt;/p&gt;
&lt;p&gt;This tutorial doesn't cover how to install &amp;amp; set up Ubuntu Server itself. The setup is quite self-explanatory (which I would not have expected from Linux if you had asked me a week ago).&lt;/p&gt;
&lt;p&gt;For the Jenkins installation, &lt;a href="https://www.digitalocean.com/community/tutorials/how-to-install-jenkins-on-ubuntu-16-04"&gt;DigitalOcean has a great tutorial&lt;/a&gt; that you should follow. However, I configured the firewall to only allow incoming traffic to Jenkins from my private home network's IP range: &lt;span class="Code"&gt;sudo ufw allow from 192.168.1.0/24 to any port 8080&lt;/span&gt;. In case your firewall is disabled in Ubuntu (&lt;span class="Code"&gt;ufw status&lt;/span&gt; returning &lt;span class="Code"&gt;inactive&lt;/span&gt;), run &lt;span class="Code"&gt;sudo ufw enable&lt;/span&gt; to activate it. After the initial setup, Jenkins is running in your Linux VM:&lt;/p&gt;
&lt;p&gt;&lt;img id="__mcenew" style="display: block; margin-left: auto; margin-right: auto;" src="https://blog.dangl.me/media/1128/05_jenkinswelcome.png" alt="Jenkins Welcome Screen on Ubuntu Server" data-udi="umb://media/6e2331f8288f40548c5d7bffb2efa0e4" /&gt;&lt;/p&gt;
&lt;h2&gt;Install Development Environment &amp;amp; Tools&lt;/h2&gt;
&lt;p&gt;On the build server, you’ll need dotnet and npm available.&lt;/p&gt;
&lt;p&gt;First, follow the official &lt;a href="https://www.microsoft.com/net/core#linuxubuntu"&gt;Microsoft .Net Core Linux installation instructions&lt;/a&gt; to enable dotnet support. To install npm, &lt;a href="https://nodejs.org/en/download/package-manager/#debian-and-ubuntu-based-linux-distributions"&gt;run these commands&lt;/a&gt;:&lt;/p&gt;
&lt;p&gt;&lt;span class="Code"&gt;curl -sL https://deb.nodesource.com/setup_7.x | sudo -E bash -&lt;br /&gt;&lt;/span&gt;&lt;span class="Code"&gt;sudo apt-get install -y nodejs&lt;/span&gt;&lt;/p&gt;
&lt;h2&gt;Jenkins Plugins&lt;/h2&gt;
&lt;p&gt;This one is easy. There’s already support for JUnit test results in Jenkins, meaning the TypeScript tests are already supported. For the xUnit based dotnet tests, install the &lt;a href="https://plugins.jenkins.io/xunit"&gt;xUnit plugin in Jenkins&lt;/a&gt; and you’re good to go.&lt;/p&gt;
&lt;h2&gt;Configure Jenkins&lt;/h2&gt;
&lt;p&gt;The NetCoreHeroes project has both .Net and Angular unit tests that are going to be run in Jenkins.&lt;/p&gt;
&lt;p&gt;The project's SCM git repository is located at &lt;a rel="noopener noreferrer" href="https://github.com/GeorgDangl/NetCoreHeroes/" target="_blank" title="GitHub - GeorgDangl/NetCoreHeroes"&gt;https://github.com/GeorgDangl/NetCoreHeroes/&lt;/a&gt;. You can take a look at the project to get a starting point for your own CI builds. For Linux, I've had to deviate a bit from my standard configuration. The awesome &lt;a rel="noopener noreferrer" href="https://github.com/mmanela/chutzpah" target="_blank" title="GitHub - Chutzpah"&gt;Chutzpah&lt;/a&gt; JavaScript test runner is only available for Windows, so I had to switch to &lt;a rel="noopener noreferrer" href="https://karma-runner.github.io/1.0/index.html" target="_blank" title="Karma Runner GitHub Pages"&gt;Karma&lt;/a&gt; for the Linux environment. Karma is actually much more widely used, but it doesn't integrate as seamlessly into Visual Studio. In the &lt;a rel="noopener noreferrer" data-udi="umb://document/de1e9b998b584fc6b63002a9dbb02ae5" href="/archive/netcoreheroes-angular2-with-net-core-in-visual-studio-2015-part-i/" target="_blank" title="NetCoreHeroes: Angular 2 with .Net Core in Visual Studio 2015, Part I"&gt;NetCoreHeroes post series&lt;/a&gt;, I've described the complete path from setting up the project to fully automated testing and deployment on Windows.&lt;/p&gt;
&lt;p&gt;Back to Linux, here's the Jenkins build configuration:&lt;/p&gt;
&lt;p&gt;&lt;img id="__mcenew" style="display: block; margin-left: auto; margin-right: auto;" src="https://blog.dangl.me/media/1129/06_jenkinsbuildconfiguration.png" alt="NetCoreHeroes Jenkins Build Configuration for Linux" data-udi="umb://media/2e784c9b9fc3421abf5bbee181eacaba" /&gt;&lt;/p&gt;
&lt;p&gt;And the post build actions to publish the test results:&lt;/p&gt;
&lt;p&gt;&lt;img id="__mcenew" style="display: block; margin-left: auto; margin-right: auto;" src="https://blog.dangl.me/media/1130/07_jenkinspostbuildactions.png" alt="Jenkins Post Build Actions for NetCoreHeroes with Angular and Asp.Net Core on Linux" data-udi="umb://media/c2b1e6245634435b831423c0f28fe0b0" /&gt;&lt;/p&gt;
&lt;p&gt;Save the config, click on &lt;em&gt;Build Now&lt;/em&gt; and take a look at what your unit tests have to tell you:&lt;/p&gt;
&lt;p&gt;&lt;span class="Code"&gt;HeroService.HeroService getHeroes should have expected fake heroes (from PhantomJS 2.1.1 (Linux 0.0.0))&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;.Net Core and Linux really IS easy!&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;Happy platform-independent testing!&lt;/span&gt;&lt;/p&gt;</description>
      <pubDate>Mon, 29 May 2017 20:52:08 Z</pubDate>
      <a10:updated>2017-05-29T20:52:08Z</a10:updated>
    </item>
    <item>
      <guid isPermaLink="false">1253</guid>
      <link>https://blog.dangl.me/archive/integration-testing-in-memory-compiled-code-with-roslyn/</link>
      <category>DotNet</category>
      <category>Continuous Integration</category>
      <title>Integration Testing In Memory Compiled Code with Roslyn</title>
      <description>&lt;p&gt;Have you ever written code to generate code? &lt;em&gt;Probably&lt;/em&gt;.&lt;/p&gt;
&lt;p&gt;Have you ever written proper integration tests for that? &lt;em&gt;Probably not&lt;/em&gt;.&lt;/p&gt;
&lt;blockquote&gt;There is now a &lt;a data-udi="umb://document/285ced6c2ffc498aacd5d15f5bf00c39" href="/archive/integration-testing-in-memory-compiled-code-with-roslyn-visual-studio-2017-edition/" title="Integration Testing In Memory Compiled Code with Roslyn - Visual Studio 2017 Edition"&gt;follow up article&lt;/a&gt; that uses the latest .Net project format in &lt;strong&gt;Visual Studio 2017&lt;/strong&gt;. This post is still displaying the now deprecated &lt;span class="Code"&gt;project.json&lt;/span&gt; configuration&lt;/blockquote&gt;
&lt;p&gt;Everyone's done it - write code that writes code. Be it for converting from an esoteric Domain Specific Language, generating from a set of known data or for any other reason. It happens a lot, it's useful a lot, but it's not tested a lot. Testing such code has always been inconvenient: it involved a lot of disk IO and quite often relied on scripts and manual work. Luckily, with &lt;a rel="noopener noreferrer" href="https://github.com/dotnet/roslyn" target="_blank" title="GitHub - dotnet/roslyn"&gt;dotnet's Roslyn compiler&lt;/a&gt;, there are NuGet packages for code analysis and compilation available that make all of this so easy!&lt;/p&gt;
&lt;p&gt;The &lt;a rel="noopener noreferrer" href="https://github.com/GeorgDangl/InMemoryCompilation" target="_blank" title="GitHub - GeorgDangl/InMemoryCompilation"&gt;complete sample repository is available at GitHub&lt;/a&gt;. Here are the interesting parts:&lt;/p&gt;
&lt;p&gt;
&lt;script src="https://gist.github.com/GeorgDangl/4a9982a3b520f056a9e890635b3695e0.js?file=CodeGenerator.cs"&gt;&lt;/script&gt;
&lt;/p&gt;
&lt;p&gt;It starts with the &lt;span class="Code"&gt;CodeGenerator&lt;/span&gt;, which does what its name implies - it creates code. In this case, it's a simple class that has one method: &lt;span class="Code"&gt;AddIntegers()&lt;/span&gt;.&lt;/p&gt;
&lt;p&gt;
&lt;script src="https://gist.github.com/GeorgDangl/4a9982a3b520f056a9e890635b3695e0.js?file=project.json"&gt;&lt;/script&gt;
&lt;/p&gt;
&lt;p&gt;This &lt;span class="Code"&gt;project.json&lt;/span&gt; is a regular, xUnit enabled, configuration file. There is a reference to the &lt;span&gt;&lt;span class="Code"&gt;Microsoft.CodeAnalysis.CSharp&lt;/span&gt; package. That one will bring the Roslyn API to your project.&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;
&lt;script src="https://gist.github.com/GeorgDangl/4a9982a3b520f056a9e890635b3695e0.js?file=CodeGeneratorTests.cs"&gt;&lt;/script&gt;
&lt;/p&gt;
&lt;p&gt;The actual integration test class is quite big, so let's break it down into its functional units:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;span class="Code"&gt;GenerateCode()&lt;/span&gt; does just that - it gets the string representation of the sourcecode&lt;/li&gt;
&lt;li&gt;&lt;span class="Code"&gt;CreateCompilation()&lt;/span&gt; takes the sourcecode, adds assembly references and turns it into a &lt;span class="pl-en"&gt;&lt;span class="Code"&gt;CSharpCompilation&lt;/span&gt; object&lt;/span&gt;&lt;span&gt;&lt;/span&gt;&lt;/li&gt;
&lt;li&gt;&lt;span class="pl-en"&gt;&lt;span class="Code"&gt;CompileAndLoadAssembly()&lt;/span&gt; now invokes the Roslyn API to compile onto a MemoryStream, then loads it and makes the content available&lt;/span&gt;&lt;/li&gt;
&lt;li&gt;&lt;span class="pl-en"&gt;Finally, &lt;span class="Code"&gt;CallCalculatorMethod()&lt;/span&gt; uses reflection on the newly generated assembly and invokes the &lt;span class="Code"&gt;AddIntegers()&lt;/span&gt; method&lt;/span&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;This is a really interesting approach to tackling code generation. While I got the initial bits to set this up from &lt;a rel="noopener noreferrer" href="http://www.tugberkugurlu.com/archive/compiling-c-sharp-code-into-memory-and-executing-it-with-roslyn" target="_blank" title="Tugberk Ugurlu - Compiling C# Code Into Memory and Executing It with Roslyn"&gt;Tugberk Ugurlu's&lt;/a&gt; blog, I've had &lt;a rel="noopener noreferrer" href="https://github.com/GeorgDangl/XmlTools" target="_blank" title="GitHub - GeorgDangl/XmlTools"&gt;a project of my own&lt;/a&gt; with some heavy code generation (for correcting Xml documents) where I needed exactly that. The code works both in .Net Core and the full .Net framework.&lt;/p&gt;
&lt;p&gt;Happy compiling!&lt;/p&gt;</description>
      <pubDate>Fri, 07 Apr 2017 19:20:49 Z</pubDate>
      <a10:updated>2017-04-07T19:20:49Z</a10:updated>
    </item>
    <item>
      <guid isPermaLink="false">1231</guid>
      <link>https://blog.dangl.me/archive/set-up-private-nuget-feeds-with-proget-v4plus/</link>
      <category>Continuous Integration</category>
      <title>Set Up Private NuGet Feeds with ProGet v4+</title>
<description>&lt;p&gt;In an &lt;a data-id="1121" href="/archive/set-up-private-nuget-feeds-with-proget-and-jenkins/" title="Set Up Private NuGet Feeds with ProGet and Jenkins"&gt;earlier post&lt;/a&gt;, I described how to get started with &lt;a href="http://inedo.com/proget" target="_blank" title="Inedo - ProGet"&gt;Inedo's ProGet&lt;/a&gt;. It's a nice tool that lets you self-host your own packages with a lot of supported repository types, like NuGet or npm. Lately, they've transitioned to version 4 with big changes to the UI, so I thought I'd post a small update focusing on what has changed.&lt;/p&gt;
&lt;h2&gt;Install &amp;amp; Configure ProGet&lt;/h2&gt;
&lt;p&gt;Installation and initial configuration is nicely covered by their documentation, so I won’t go into much detail here. There was just one bit missing that caused some trouble for me: When you’re using MS SQL Server, it’s possible that your default collation differs from the one ProGet tries to set during database initialization. Make sure to create the ProGet database with &lt;span class="Code"&gt;SQL_Latin1_General_CP1_CI_AS&lt;/span&gt; collation, otherwise ProGet might fail to apply its database schema during installation.&lt;/p&gt;
&lt;p&gt;To keep your feeds private, go to the settings page, then navigate to &lt;span class="Code"&gt;Manage Users &amp;amp; Tasks&lt;/span&gt;, select the &lt;span class="Code"&gt;Tasks&lt;/span&gt; and remove the &lt;span class="Code"&gt;Anonymous&lt;/span&gt; group from &lt;span class="Code"&gt;View &amp;amp; Download Packages&lt;/span&gt;. I've not yet discovered how to completely lock down v4 of ProGet, but this setting disables any access for non-authenticated users.&lt;/p&gt;
&lt;p&gt;&lt;img id="__mcenew" style="display: block; margin-left: auto; margin-right: auto;" src="https://blog.dangl.me/media/1106/01-disable-anonymous-proget-access.png" alt="ProGet Task Authorization Menu" data-id="1233"&gt;&lt;/p&gt;
&lt;p&gt;To access feeds, I'd recommend creating a regular user and then adding a permission for this user to &lt;span class="Code"&gt;View &amp;amp; Download Packages&lt;/span&gt;. This way, you'll have a login that works to privately access your repository feeds.&lt;/p&gt;
&lt;h2&gt;Enable NuGet Package Publishing with an API Key&lt;/h2&gt;
&lt;p&gt;Usually, you can simply set an API key for NuGet package publishing in the feed's options &lt;a href="http://inedo.com/support/kb/1112/api-keys-in-proget" target="_blank" title="ProGet Documentation - Api Keys in NuGet"&gt;according to the documentation&lt;/a&gt;. You do need to grant the Anonymous user the package publish right then, however. Since the built-in task comes with quite a lot more privileges (like viewing the feed and pulling packages), it's best to create a new task that only has the &lt;span class="Code"&gt;Add Package&lt;/span&gt; privilege for publishing. Some NuGet versions have a quirk where they make a GET request before the actual push and run into authentication issues - NuGet 3.3.0 shows this behavior, for example, while 3.4.4 does not. If there's some reason you can't upgrade the NuGet version you're using for publishing, you'll have to create a user for publishing, additionally grant it the &lt;span class="Code"&gt;View Feed&lt;/span&gt; right and then use default NuGet authentication in addition to the API key when publishing packages.&lt;/p&gt;
&lt;p&gt;To create the task, click on &lt;span class="Code"&gt;Customize Tasks&lt;/span&gt; in the tasks menu, select the &lt;span class="Code"&gt;Add Package&lt;/span&gt; and &lt;span class="Code"&gt;View Feed&lt;/span&gt; (the latter only if you're using a designated user for publishing) privileges under &lt;span class="Code"&gt;Feeds&lt;/span&gt;, then save the task and assign the &lt;span class="Code"&gt;Anonymous&lt;/span&gt; user to it (or your publisher!).&lt;/p&gt;
&lt;p&gt;&lt;img id="__mcenew" style="display: block; margin-left: auto; margin-right: auto;" src="https://blog.dangl.me/media/1107/02-create-package-publisher-task.png" alt="ProGet - Create Package Publisher Task" data-id="1234"&gt;&lt;/p&gt;
&lt;blockquote&gt;&lt;strong&gt;Tip:&lt;/strong&gt; Use this command to locally save user credentials in NuGet: &lt;span class="Code"&gt;nuget sources add -Name &amp;lt;NameOfTheStoredCredentials&amp;gt; -Source &amp;lt;YourNuGetFeedUrl&amp;gt; -UserName &amp;lt;Username&amp;gt; -Password &amp;lt;Password&amp;gt;&lt;/span&gt;&lt;/blockquote&gt;
&lt;p&gt;If you want to continue reading, &lt;a data-id="1121" href="/archive/set-up-private-nuget-feeds-with-proget-and-jenkins/" title="Set Up Private NuGet Feeds with ProGet and Jenkins"&gt;the old post&lt;/a&gt; covers a bit more about creating packages and deploying them via a Jenkins CI solution.&lt;/p&gt;</description>
      <pubDate>Thu, 20 Oct 2016 20:27:09 Z</pubDate>
      <a10:updated>2016-10-20T20:27:09Z</a10:updated>
    </item>
    <item>
      <guid isPermaLink="false">1193</guid>
      <link>https://blog.dangl.me/archive/publish-and-deploy-asp-net-core-applications-from-jenkins-to-iis/</link>
      <category>Continuous Integration</category>
      <title>Publish and Deploy Asp.Net Core Applications from Jenkins to IIS</title>
      <description>&lt;p&gt;&lt;strong&gt;Update:&lt;/strong&gt; There's &lt;a data-udi="umb://document/dd50212926f24610a80f24d70e344616" href="/archive/publish-and-deploy-aspnet-core-applications-from-jenkins-to-iis-2017-edition/" title="Publish and Deploy Asp.Net Core Applications from Jenkins to IIS - 2017 Edition"&gt;a follow up post&lt;/a&gt; that's using the new Visual Studio 2017 project format.&lt;/p&gt;
&lt;h2&gt;Get the Required Tools on the CI Server&lt;/h2&gt;
&lt;p&gt;Just like with &lt;a rel="noopener noreferrer" data-id="1172" href="/archive/unit-testing-and-code-coverage-with-jenkins-and-net-core/" target="_blank" title="Unit Testing and Code Coverage with Jenkins and .NET Core"&gt;unit testing in a CI environment&lt;/a&gt;, you’ll need the .Net Core SDK to be available to Jenkins. It’s available at &lt;a rel="noopener noreferrer" href="https://www.microsoft.com/net/download#core" target="_blank" title=".Net Core Official Download Site"&gt;the official download site&lt;/a&gt;. Additionally, on the hosting server, you need the .Net Core Windows Server hosting bundle, available on the same site. See the &lt;a rel="noopener noreferrer" href="https://docs.asp.net/en/latest/publishing/iis.html#install-the-net-core-windows-server-hosting-bundle" target="_blank" title=".Net Core Documentation for Installing Windows Server Hosting Bundle"&gt;official documentation&lt;/a&gt; for how to install it. You will also need to install Node.js  and npm on the server that is hosting Jenkins so you can run all the prepublish commands for an Asp.Net Core application.&lt;br&gt;To install Node.js, go to its &lt;a rel="noopener noreferrer" href="https://nodejs.org/en/download/" target="_blank" title="Node.js Official Download Site"&gt;download site&lt;/a&gt; and grab the latest Windows installer. When installing, make sure that &lt;span class="Code"&gt;Add to PATH&lt;/span&gt; is enabled in the install options (it is by default), so that the npm command is added to the PATH environment variable and therefore directly recognized in the command line interface.&lt;/p&gt;
&lt;p&gt;Make sure to &lt;strong&gt;restart Jenkins&lt;/strong&gt; after you've added the tools, as the PATH variable is only read at startup and not watched for changes!&lt;/p&gt;
&lt;blockquote&gt;&lt;strong&gt;Tip:&lt;/strong&gt; It's not required to globally install node modules like bower or gulp on the build server. That should be handled by node.js’ devDependencies and proper pre- and postpublish scripts in your deployment process.&lt;/blockquote&gt;
&lt;h2&gt;Configure Jenkins to Build the Project&lt;/h2&gt;
&lt;p&gt;&lt;span&gt;In Jenkins, just create a new project and configure the source code management, for example by pulling from a Git repository. To build, publish and deploy, the following Windows batch command steps are necessary:&lt;/span&gt;&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;&lt;span&gt;&lt;span class="Code"&gt;dotnet restore&lt;/span&gt;&lt;br&gt; Restores NuGet packages for all projects below the workspace folder.&lt;br&gt;Note that right now, private (password protected) NuGet feeds are not working with the dotnet cli, see &lt;a rel="noopener noreferrer" href="https://github.com/dotnet/cli/issues/3174" target="_blank" title="Dotnet CLI NuGet Restore for Private Feeds Fails GitHub Issue"&gt;this issue on GitHub&lt;/a&gt;. You can resolve that by storing the password in plain text in &lt;span class="Code"&gt;NuGet.config&lt;/span&gt;.&lt;/span&gt;&lt;/li&gt;
&lt;li&gt;&lt;span&gt;&lt;span class="Code"&gt;cd "%WORKSPACE%\&amp;lt;PathToProject&amp;gt;"&lt;/span&gt;&lt;br&gt; &lt;span class="Code"&gt;dotnet publish -c EnvironmentName&lt;/span&gt;&lt;br&gt; Changes directory to the actual project folder and runs the publish command. Replace the &lt;span class="Code"&gt;EnvironmentName&lt;/span&gt; with whatever you want to use for publishing.&lt;br&gt;&lt;/span&gt;&lt;/li&gt;
&lt;/ol&gt;
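&lt;p&gt;As a workaround for the private feed issue mentioned in the first step, the credentials can go into the &lt;span class="Code"&gt;NuGet.config&lt;/span&gt; - keep in mind that the password is then stored in plain text (the feed name, URL and values below are placeholders):&lt;/p&gt;
&lt;pre class="brush: xhtml;"&gt;&amp;lt;configuration&amp;gt;
  &amp;lt;packageSources&amp;gt;
    &amp;lt;add key="MyPrivateFeed" value="https://your-feed-url/nuget/" /&amp;gt;
  &amp;lt;/packageSources&amp;gt;
  &amp;lt;packageSourceCredentials&amp;gt;
    &amp;lt;MyPrivateFeed&amp;gt;
      &amp;lt;add key="Username" value="builduser" /&amp;gt;
      &amp;lt;add key="ClearTextPassword" value="secret" /&amp;gt;
    &amp;lt;/MyPrivateFeed&amp;gt;
  &amp;lt;/packageSourceCredentials&amp;gt;
&amp;lt;/configuration&amp;gt;&lt;/pre&gt;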
&lt;p&gt;&lt;span&gt;To make sure your pre- and postpublish scripts run, include them in your &lt;span class="Code"&gt;package.json&lt;/span&gt; (for npm) and &lt;span class="Code"&gt;project.json&lt;/span&gt; (for dotnet) files, for example like this:&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;project.json:&lt;/strong&gt;&lt;/p&gt;
&lt;pre class="brush: javascript;"&gt;"scripts": {
    "prepublish": [ "npm install", "npm run prepublish" ],
    "postpublish": [ "dotnet publish-iis --publish-folder %publish:OutputPath% --framework %publish:FullTargetFramework%" ]
  }&lt;/pre&gt;
&lt;p&gt;&lt;span&gt;&lt;/span&gt;&lt;strong&gt;package.json:&lt;/strong&gt;&lt;/p&gt;
&lt;pre class="brush: javascript;"&gt;"scripts": {
      "prepublish": "bower install &amp;amp;&amp;amp; tsc &amp;amp;&amp;amp; gulp clean &amp;amp;&amp;amp; gulp copyClientDeps &amp;amp;&amp;amp; gulp min" 
    }&lt;/pre&gt;
&lt;p&gt;&lt;span&gt;This combination would effectively run the &lt;span class="Code"&gt;npm install&lt;/span&gt; and &lt;span class="Code"&gt;npm run prepublish&lt;/span&gt; commands first, then compile and publish the application via dotnet cli and finally run the &lt;span class="Code"&gt;dotnet-publish-iis&lt;/span&gt; command.&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;Additionally, before deploying, you might want to &lt;a rel="noopener noreferrer" data-id="1178" href="/archive/applying-web-config-transformations-without-msbuild/" target="_blank" title="Applying web.config Transformations without MSBuild"&gt;run a web.config transformation as described here&lt;/a&gt;.&lt;/span&gt;&lt;/p&gt;
&lt;h2&gt;&lt;span&gt;Publish to IIS with Web Deploy&lt;/span&gt;&lt;/h2&gt;
&lt;p&gt;&lt;span&gt;For the following step, you should have set up Web Deploy on both the source and target server. There’s only one more step needed, and that is calling &lt;span class="Code"&gt;msdeploy.exe&lt;/span&gt; to publish the app to IIS.&lt;br&gt;When doing that, you must supply valid credentials for the user that has publishing rights in IIS. Since it’s never a good idea to include plain text user credentials in a script or build step, you have to save them manually to the Windows credentials store. For that, log into your CI server with the user account for Jenkins, open a command prompt and execute the following command:&lt;/span&gt;&lt;/p&gt;
&lt;pre&gt;&lt;span class="Code"&gt;"C:\Program Files (x86)\IIS\Microsoft Web Deploy V3\msdeploy.exe" -verb:dump -source:Iisapp=&lt;strong&gt;YourSiteName&lt;/strong&gt;,computerName='https://&lt;strong&gt;WebDeployUrl&lt;/strong&gt;:8172/msdeploy.axd?site=&lt;strong&gt;YourSiteName&lt;/strong&gt; ',userName='&lt;strong&gt;Username&lt;/strong&gt;',password='&lt;strong&gt;Password&lt;/strong&gt;',authType='Basic',storeCredentials='&lt;strong&gt;StorageName&lt;/strong&gt;'&lt;/span&gt;&lt;/pre&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;YourSiteName&lt;/strong&gt;&lt;span&gt;&lt;br&gt; The name for the site. This is rather important since the WebDeploy service will determine whether or not the user has access to the service depending on the site name that is specified here.&lt;/span&gt;&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;WebDeployUrl&lt;/strong&gt;&lt;span&gt;&lt;br&gt; The Url endpoint for the WebDeploy service (that was configured earlier). If you don’t have a valid SSL certificate for that hostname (you’re free to use any dns name for the MSDeploy service on port 8172), you can add the parameter “-allowUntrusted”, which ignores SSL errors. But you should really try not to ignore SSL errors in any setting, especially in a CI environment where it’s never going to be looked at again…&lt;/span&gt;&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Username&lt;/strong&gt;&lt;span&gt;&lt;br&gt; The username for the user you gave access to WebDeploy. You might have to include the network domain name in the form of “domain\user” or, if it’s a local user, the server’s computer name as “serverComputerName\user”.&lt;/span&gt;&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Password&lt;/strong&gt;&lt;span&gt;&lt;br&gt; Should be clear – provide the user account’s password&lt;/span&gt;&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;StorageName&lt;/strong&gt;&lt;span&gt;&lt;br&gt; This is the key under which the credentials will be stored in Windows' Credentials Manager, you’ll later retrieve them by going like “use credentials for StorageName”&lt;/span&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;span&gt;The &lt;span class="Code"&gt;authType='Basic'&lt;/span&gt; parameter is not strictly required. If you happen to be in an Active Directory domain, you could also use Windows Authentication and completely omit any credentials in the Web Deploy process.&lt;br&gt; The &lt;span class="Code"&gt;verb:dump&lt;/span&gt; will make msdeploy list the contents of the remote location (upon successful authorization), so you know you've got it right when you see that (or rather see no error messages, since we haven’t deployed any files yet).&lt;br&gt; As per the msdeploy documentation, the credentials will be stored even if the response is a 401 – Unauthorized, so keep that in mind if you're just testing stuff. Also, the path to msdeploy.exe may vary on your installation if you chose not to install it in the default location.&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;Now that we've stored the credentials we can finally approach the last line of commands we’re going to need for the Jenkins job:&lt;/span&gt;&lt;/p&gt;
&lt;pre&gt;&lt;span class="Code"&gt;"C:\Program Files (x86)\IIS\Microsoft Web Deploy V3\msdeploy.exe" -verb:sync -source:IisApp='%WORKSPACE%\publish -dest:iisapp='YourSiteName',computerName='https://WebDeployUrl:8172/msdeploy.axd?site=YourSiteName',getCredentials='StorageName',authType='Basic' -enableRule:DoNotDeleteRule - -enablerule:AppOffline&lt;/span&gt;&lt;/pre&gt;
&lt;ul&gt;
&lt;li&gt;&lt;span&gt;&lt;span class="Code"&gt;verb:sync&lt;/span&gt;&lt;br&gt; We want to sync our publish folder with the server, so this is the operation (“verb”) to perform&lt;/span&gt;&lt;/li&gt;
&lt;li&gt;&lt;span class="Code"&gt;source:IisApp='%WORKSPACE%\publish&lt;/span&gt;&lt;span&gt;&lt;br&gt; The publish directory, will be by default like &lt;span class="Code"&gt;bin\EnvironmentName\Platform\publish&lt;/span&gt; below the project folder&lt;/span&gt;&lt;/li&gt;
&lt;li&gt;&lt;span&gt;&lt;span class="Code"&gt;dest&lt;/span&gt;&lt;br&gt; The destination where we want to deploy to. &lt;span class="Code"&gt;YourSiteName&lt;/span&gt; and the &lt;span class="Code"&gt;WebDeployUrl&lt;/span&gt; should be adjusted to your needs, as well as the &lt;span class="Code"&gt;StorageName&lt;/span&gt; under which you stored the credentials. It’s the same command you used to store the credentials earlier.&lt;/span&gt;&lt;/li&gt;
&lt;li&gt;&lt;span&gt;&lt;span class="Code"&gt;enableRule:DoNotDeleteRule&lt;/span&gt;&lt;br&gt; This is playing on the safe side, the setting prevents files from being deleted on the server if they’re not present in the source. It’s not required for most applications and might cause troubles when your website folder contains old files. So say, for example, you only want to keep your AppData folder where you store local files for your site, then just swap this command with &lt;br&gt; &lt;em&gt;&lt;span class="Code"&gt;-skip:Directory=\\App_Data&lt;/span&gt;&lt;br&gt; &lt;/em&gt;You can chain multiple rules, so you can always exclude additional folders from the deployment.&lt;/span&gt;&lt;/li&gt;
&lt;li&gt;&lt;span class="Code"&gt;enablerule:AppOffline&lt;/span&gt;&lt;br&gt; &lt;span&gt;This will copy an app_offline.html file before the operation and remove it afterwards, telling IIS to shut the site down during the deployment process. It’s so you don’t get any conflicts where DLLs are still loaded and can’t be overwritten.&lt;/span&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;span&gt;If you want to dig further into msdeploy, these three links from the technet contain a lot of info:&lt;/span&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a rel="noopener noreferrer" href="https://technet.microsoft.com/de-de/library/dd568992(v=ws.10).aspx" target="_blank" title="Technet Web Deploy Rules"&gt;Web Deploy Rules&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a rel="noopener noreferrer" href="https://technet.microsoft.com/de-de/library/dd569089(v=ws.10).aspx" target="_blank" title="Technet Web Deploy Operation Settings"&gt;Web Deploy Operation Settings&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a rel="noopener noreferrer" href="https://technet.microsoft.com/de-de/library/dd569001(v=ws.10).aspx" target="_blank" title="Technet Web Deploy Provider Settings"&gt;Web Deploy Provider Settings&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;Summary&lt;/h2&gt;
&lt;p&gt;&lt;span&gt;Here’s a screenshot of all three required build steps:&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;&lt;img id="__mcenew" style="display: block; margin-left: auto; margin-right: auto;" src="https://blog.dangl.me/media/1087/allbuildsteps.png" alt="Jenkins .Net Core Build Steps for Publishing to IIS" data-id="1195"&gt;&lt;/span&gt;&lt;/p&gt;
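&lt;p&gt;&lt;span&gt;For reference, these are the same three steps as plain commands, using the placeholder names from this post - adjust the project path, site name and stored credentials to your setup:&lt;/span&gt;&lt;/p&gt;
&lt;pre&gt;&lt;span class="Code"&gt;dotnet restore
cd "%WORKSPACE%\&amp;lt;PathToProject&amp;gt;"
dotnet publish -c EnvironmentName
"C:\Program Files (x86)\IIS\Microsoft Web Deploy V3\msdeploy.exe" -verb:sync -source:IisApp='%WORKSPACE%\publish' -dest:iisapp='YourSiteName',computerName='https://WebDeployUrl:8172/msdeploy.axd?site=YourSiteName',getCredentials='StorageName',authType='Basic' -enableRule:DoNotDeleteRule -enableRule:AppOffline&lt;/span&gt;&lt;/pre&gt;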
&lt;p&gt; &lt;/p&gt;
&lt;p&gt;&lt;span&gt;Now you should be able to click on “Build now” and let the magic happen, continuous deployment for your .Net Core website. Have fun!&lt;/span&gt;&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;</description>
      <pubDate>Thu, 23 Jun 2016 20:42:58 Z</pubDate>
      <a10:updated>2016-06-23T20:42:58Z</a10:updated>
    </item>
    <item>
      <guid isPermaLink="false">1179</guid>
      <link>https://blog.dangl.me/archive/setting-asp-net-core-environment-variables-via-web-config-in-iis/</link>
      <category>Continuous Integration</category>
      <title>Setting Asp.Net Core Environment Variables via web.config in IIS</title>
      <description>&lt;p&gt;Since RC2, hosting .Net Core apps within IIS &lt;a href="https://github.com/aspnet/IISIntegration/issues/105" target="_blank" title="Change from HttpPlatformHandler to ASP.NET Core Module Announcement"&gt;is no longer using the common HttpPlatformHandler&lt;/a&gt; as with any other processes but instead the &lt;a href="https://www.microsoft.com/net/download#core" target="_blank" title=".Net Core Official Downloads"&gt;ASP.NET Core Module for Windows Server Hosting&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;While not a lot has changed, the Xml schema for the web.config elements is new, so you need to specify environment variables the following way in your main web.config file:&lt;/p&gt;
&lt;pre class="brush: xhtml;"&gt;&amp;lt;?xml version="1.0" encoding="utf-8"?&amp;gt;
&amp;lt;configuration&amp;gt;
  &amp;lt;system.webServer&amp;gt;
    &amp;lt;handlers&amp;gt;
      &amp;lt;add name="aspNetCore" path="*" verb="*" modules="AspNetCoreModule" resourceType="Unspecified" /&amp;gt;
    &amp;lt;/handlers&amp;gt;
    &amp;lt;aspNetCore processPath="%LAUNCHER_PATH%" arguments="%LAUNCHER_ARGS%" forwardWindowsAuthToken="false" stdoutLogEnabled="true" &amp;gt;
      &amp;lt;environmentVariables&amp;gt;
        &amp;lt;environmentVariable name="ASPNETCORE_ENVIRONMENT" value="Production" /&amp;gt;
      &amp;lt;/environmentVariables&amp;gt;
    &amp;lt;/aspNetCore&amp;gt;
  &amp;lt;/system.webServer&amp;gt;
&amp;lt;/configuration&amp;gt;&lt;/pre&gt;
&lt;p&gt;Alternatively, &lt;a data-id="1178" href="/archive/applying-web-config-transformations-without-msbuild/" target="_blank" title="Applying web.config Transformations without MSBuild"&gt;you can still use regular web.config transformations&lt;/a&gt; for publishing to IIS (there's currently no support for that in dotnet-publish-iis), for example in your web.Production.config:&lt;/p&gt;
&lt;pre class="brush: xhtml;"&gt;&amp;lt;?xml version="1.0" encoding="utf-8"?&amp;gt;
&amp;lt;configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform"&amp;gt;
  &amp;lt;system.webServer&amp;gt;
    &amp;lt;aspNetCore&amp;gt;
      &amp;lt;environmentVariables xdt:Transform="Insert"&amp;gt;
        &amp;lt;environmentVariable name="ASPNETCORE_ENVIRONMENT" value="Production" /&amp;gt;
      &amp;lt;/environmentVariables&amp;gt;
    &amp;lt;/aspNetCore&amp;gt;
  &amp;lt;/system.webServer&amp;gt;
&amp;lt;/configuration&amp;gt;&lt;/pre&gt;
&lt;p&gt; &lt;/p&gt;</description>
      <pubDate>Thu, 09 Jun 2016 23:32:44 Z</pubDate>
      <a10:updated>2016-06-09T23:32:44Z</a10:updated>
    </item>
    <item>
      <guid isPermaLink="false">1178</guid>
      <link>https://blog.dangl.me/archive/applying-web-config-transformations-without-msbuild/</link>
      <category>Continuous Integration</category>
      <title>Applying web.config Transformations without MSBuild</title>
      <description>&lt;p&gt;There are certain cases where you deploy a website to IIS, rely on a web.config file but don't have MSBuild in the build process, like deploying .Net Core or php apps.&lt;/p&gt;
&lt;p&gt;&lt;span&gt;&lt;a href="https://www.nuget.org/packages/WebConfigTransformRunner/" target="_blank" title="ASP.Net Web Config Transform Runner NuGet"&gt;ASP.Net Web Config Transform Runner&lt;/a&gt; to the rescue! That's a nice little command-line wrapper around the &lt;a href="https://www.nuget.org/packages/Microsoft.Web.Xdt/" target="_blank" title="Microsoft.Web.Xdt NuGet"&gt;Microsoft.Web.Xdt&lt;/a&gt; package. I'm using it to deploy .Net Core and reverse proxy configurations to IIS where I don't have any MSBuild step in the deployment process.&lt;/span&gt;&lt;/p&gt;
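&lt;p&gt;&lt;span&gt;The runner itself takes three arguments - the base web.config, the transformation file to apply and the output path - so a single manual invocation looks roughly like this (the file names are hypothetical):&lt;/span&gt;&lt;/p&gt;
&lt;pre&gt;WebConfigTransformRunner.exe web.config web.Production.config web.transformed.config&lt;/pre&gt;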
&lt;h2&gt;&lt;span&gt;Applying the Transformations&lt;/span&gt;&lt;/h2&gt;
&lt;p&gt;&lt;span&gt;The PowerShell script below can be used in Continuous Integration systems like Jenkins or simply as part of a scripted deployment.&lt;/span&gt;&lt;/p&gt;
&lt;pre class="brush: powershell;"&gt;$configuration = "Production"                  # The name of the configuration, e.g. "Release" to apply transformations from web.Release.config
$wwwRoot = $ENV:WORKSPACE + "\PublishFolder"   # The folder under which to recursively look for web.config files
$deleteAfterTransformation = $true             # $true if web.*.config files should be deleted after the process
# Get path to WebConfigTransformationRunner
# INFO: Manually set $latestWebConfigTranformator to the full path of "WebConfigTransformRunner.exe" if it isn't restored via "dotnet restore"
$webConfigTransformatorPackages = Join-Path -Path $env:USERPROFILE -ChildPath "\.nuget\packages\WebConfigTransformRunner"
$latestWebConfigTranformator = Join-Path -Path ((Get-ChildItem -Path $webConfigTransformatorPackages | Sort-Object Fullname -Descending)[0].FullName) -ChildPath "Tools\WebConfigTransformRunner.exe"
# Find all directories containing web.config files
$webConfigDirs = Get-ChildItem -Path $wwwRoot -Recurse -Filter "web*.config" | Select -Property Directory -Unique
ForEach ($directory in $webConfigDirs.Directory){
    # Check if file for transformation is present
    $transformationSource = (Get-ChildItem -Path $directory -Filter ("web." + $configuration + ".config"))
    if ($transformationSource) {
        # Found a file to apply transformations
        $guid = [Guid]::NewGuid().ToString()
        $transformArguments = @("""" + (Join-Path -Path $directory -ChildPath "web.config") + """",`
                                """" + $transformationSource[0].FullName + """",`
                                """" + (Join-Path -Path $directory -ChildPath $guid) + """")
        $transformationProcess = Start-Process -FilePath $latestWebConfigTranformator -ArgumentList $transformArguments -Wait -PassThru -NoNewWindow
        if ($transformationProcess.ExitCode -ne 0) {
            "Exiting due to web.config transformation tool having returned an error, exit code: " + $transformationProcess.ExitCode
            exit $transformationProcess.ExitCode
        }
        # Delete original web.config and rename the created one
        Remove-Item -Path (Join-Path -Path $directory -ChildPath "web.config")
        Rename-Item -Path (Join-Path -Path $directory -ChildPath $guid) -NewName "web.config"
    }
    if ($deleteAfterTransformation) {
        # Delete all web.*.config files
        ForEach ($file in (Get-ChildItem -Path $directory -Filter "web.*.config")){
            Remove-Item -Path $file.FullName
        }
    }
}&lt;/pre&gt;
&lt;ul&gt;
&lt;li&gt;The &lt;span class="Code"&gt;$configuration&lt;/span&gt; variable determines which sources for the transformations to use, e.g. setting it to &lt;span class="Code"&gt;Production&lt;/span&gt; will apply transformations from &lt;span class="Code"&gt;web.Production.config&lt;/span&gt; files&lt;/li&gt;
&lt;li&gt;&lt;span class="Code"&gt;$wwwroot&lt;/span&gt; specifies which folder to search for &lt;span class="Code"&gt;web.config&lt;/span&gt; files. All child folders are searched as well&lt;/li&gt;
&lt;li&gt;&lt;span class="Code"&gt;$deleteAfterTransformation&lt;/span&gt; indicates if &lt;span class="Code"&gt;web.*.config&lt;/span&gt; files should be deleted after the transformation has been applied. Set it to true to remove unnecessary files for publishing.&lt;/li&gt;
&lt;/ul&gt;
&lt;blockquote&gt;&lt;strong&gt;Attention:&lt;/strong&gt; This script automatically searches for the latest &lt;span class="Code"&gt;WebConfigTransformRunner.exe&lt;/span&gt; in the default path where dotnet caches NuGet packages on the system. If you download the tool manually, change &lt;span class="Code"&gt;$latestWebConfigTranformator&lt;/span&gt; to the full file path of the exe.&lt;/blockquote&gt;
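&lt;p&gt;If you want to call the tool manually, it takes the same three positional arguments the script above constructs: the web.config to transform, the transformation file, and the output path. The paths below are just placeholders:&lt;/p&gt;
&lt;pre class="brush: powershell;"&gt;# Placeholder paths, adjust them to your environment
.\WebConfigTransformRunner.exe "C:\Site\web.config" "C:\Site\web.Production.config" "C:\Site\web.transformed.config"&lt;/pre&gt;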
&lt;p&gt; &lt;/p&gt;</description>
      <pubDate>Thu, 09 Jun 2016 22:29:21 Z</pubDate>
      <a10:updated>2016-06-09T22:29:21Z</a10:updated>
    </item>
    <item>
      <guid isPermaLink="false">1172</guid>
      <link>https://blog.dangl.me/archive/unit-testing-and-code-coverage-with-jenkins-and-net-core/</link>
      <category>Continuous Integration</category>
      <title>Unit Testing and Code Coverage with Jenkins and .NET Core</title>
      <description>&lt;h2&gt;&lt;strong&gt;Update 19.08.2017:&lt;/strong&gt;&lt;/h2&gt;
&lt;p&gt;&lt;strong&gt;There's &lt;a data-udi="umb://document/0af7dc1507fd4c188880e6c98a47fe4d" href="/archive/unit-testing-and-code-coverage-with-jenkins-and-net-core-2017-edition/" title="Unit Testing and Code Coverage with Jenkins and .NET Core - 2017 Edition"&gt;an updated version of this post&lt;/a&gt; that's targeted to the most recent tools and frameworks. It's simplified and uses the current dotnet CLI *.csproj format. &lt;/strong&gt;&lt;/p&gt;
&lt;hr&gt;
&lt;p&gt;&lt;strong&gt;This post is describing a rather generic way of unit testing .Net Core projects and I've published quite a long script to achieve that. If you're happy with a really short and simple solution, you can &lt;a data-id="1203" href="/archive/netcoreheroes-angular2-with-net-core-in-visual-studio-2015-part-iv/" title="NetCoreHeroes: Angular 2 with .Net Core in Visual Studio 2015, Part IV"&gt;read here how to set up basic unit testing for .Net Core and TypeScript&lt;/a&gt;.&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;This post is assuming a &lt;strong&gt;Windows&lt;/strong&gt; environment. &lt;a data-udi="umb://document/4e3dc17d230c47e7bc6290f3cf425cd3" href="/archive/net-core-and-typescript-continuous-integration-testing-with-jenkins-on-linux/" title=".Net Core and TypeScript Continuous Integration Testing with Jenkins on Linux"&gt;Check here&lt;/a&gt; for instructions on &lt;strong&gt;Linux&lt;/strong&gt;.&lt;/p&gt;
&lt;p&gt;Now that .NET Core RC2 has been released (this tutorial still works in the released versions, for example 1.0 and 1.1), there have been quite a few changes to the tooling around the stack. I've converted a small sample application (&lt;a rel="noopener noreferrer" href="https://github.com/GeorgDangl/WebDocu" target="_blank" title="GitHub Dangl.WebDocumentation Project Site"&gt;available at GitHub&lt;/a&gt;) and tried to set up the CI system for continuous testing and deployment. In this article, I'll describe how to perform unit testing and code coverage analysis on the new platform.&lt;/p&gt;
&lt;h2&gt;Setup Required Tools&lt;/h2&gt;
&lt;p&gt;In order to successfully run unit tests and code coverage analysis for a .NET Core RC2 project, you first have to install the .NET Core Preview 1 SDK. It is available &lt;a rel="noopener noreferrer" href="https://www.microsoft.com/net/download#core" target="_blank" title=".NET Core Official Download Site"&gt;at the official dot.net site&lt;/a&gt;, installs the dotnet CLI tool and adds it to the PATH variable for all users.&lt;/p&gt;
&lt;p&gt;In Jenkins, you'll need the following plugins:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a rel="noopener noreferrer" href="https://wiki.jenkins-ci.org/display/JENKINS/PowerShell+Plugin" target="_blank" title="Jenkins CI PowerShell Plugin"&gt;PowerShell Plugin&lt;/a&gt;, since later there'll be a script that takes care of running the tests and collecting the data&lt;/li&gt;
&lt;li&gt;&lt;a rel="noopener noreferrer" href="https://wiki.jenkins-ci.org/display/JENKINS/xUnit+Plugin" target="_blank" title="Jenkins CI xUnit Plugin"&gt;xUnit Plugin&lt;/a&gt; to evaluate test results&lt;/li&gt;
&lt;li&gt;&lt;a rel="noopener noreferrer" href="https://wiki.jenkins-ci.org/display/JENKINS/Cobertura+Plugin" target="_blank" title="Jenkins CI Cobertura Plugin"&gt;Cobertura Plugin&lt;/a&gt; for the code coverage data&lt;/li&gt;
&lt;li&gt;&lt;a rel="noopener noreferrer" href="https://wiki.jenkins-ci.org/display/JENKINS/HTML+Publisher+Plugin" target="_blank" title="Jenkins CI Html Publisher Plugin"&gt;Html Publisher Plugin&lt;/a&gt; to show additional coverage reports&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Additionally, add the following three dependencies to the project.json of your unit test project(s): &lt;/p&gt;
&lt;pre class="brush: javascript;"&gt;"dependencies": {
    "OpenCover": "4.6.261-rc",
    "OpenCoverToCoberturaConverter": "0.2.4",
    "ReportGenerator": "2.4.5"
}&lt;/pre&gt;
&lt;blockquote&gt;&lt;strong&gt;Tip:&lt;/strong&gt; If you're having trouble with the xUnit test runner, make sure to have at least version &lt;span class="Code"&gt;1.0.0-rc2-build10025&lt;/span&gt; of the &lt;span class="Code"&gt;dotnet-test-xunit&lt;/span&gt; package referenced.&lt;/blockquote&gt;
&lt;p&gt;&lt;span class="Code"&gt;OpenCover&lt;/span&gt; is the process that wraps around the actual &lt;span class="Code"&gt;dotnet test&lt;/span&gt; runner and collects coverage analysis. &lt;span class="Code"&gt;OpenCoverToCoberturaConverter&lt;/span&gt;  translates the coverage reports into the Cobertura format for which there are Jenkins plugins. Additionally, &lt;span class="Code"&gt;ReportGenerator&lt;/span&gt; is a tool that creates some Html pages to visualize code coverage.&lt;/p&gt;
&lt;h2&gt;Configure the Job in Jenkins&lt;/h2&gt;
&lt;p&gt;Create an &lt;span class="Code"&gt;Execute Windows Batch Command&lt;/span&gt; step as the first build step with the single command &lt;span class="Code"&gt;dotnet restore&lt;/span&gt;, so that all projects within the workspace have their dependencies restored.&lt;/p&gt;
&lt;blockquote&gt;&lt;strong&gt;Tip&lt;/strong&gt;: If you're using private NuGet feeds as package sources, the NuGet version that is bundled with the RC2 tooling does not support encrypted passwords but only plain text passwords in a NuGet.config file. See the &lt;a rel="noopener noreferrer" href="https://docs.nuget.org/consume/nuget-config-settings#credentials-for-package-source" target="_blank" title="NuGet Source Credentials Configuration Documentation"&gt;NuGet documentation&lt;/a&gt; for how to set up clear text passwords for private repositories.&lt;/blockquote&gt;
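&lt;p&gt;For reference, a NuGet.config with clear text credentials for a private feed might look like the sketch below. The feed name, feed Url, user name and password are of course just placeholders:&lt;/p&gt;
&lt;pre class="brush: xml;"&gt;&amp;lt;configuration&amp;gt;
  &amp;lt;packageSources&amp;gt;
    &amp;lt;add key="MyPrivateFeed" value="https://nuget.example.com/api/feed" /&amp;gt;
  &amp;lt;/packageSources&amp;gt;
  &amp;lt;packageSourceCredentials&amp;gt;
    &amp;lt;MyPrivateFeed&amp;gt;
      &amp;lt;add key="Username" value="buildUser" /&amp;gt;
      &amp;lt;add key="ClearTextPassword" value="secret" /&amp;gt;
    &amp;lt;/MyPrivateFeed&amp;gt;
  &amp;lt;/packageSourceCredentials&amp;gt;
&amp;lt;/configuration&amp;gt;&lt;/pre&gt;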
&lt;p&gt;The second step is to execute the following PowerShell script (sorry, it's quite long):&lt;/p&gt;
&lt;pre class="brush: powershell;"&gt;$testProjects = @("src\Dangl.WebDocumentation.Tests")                                  # Array of test projects containing the xUnit tests you want to run, relative to the workspace directory
$filterRootNamespace = "Dangl"                                                         # If you only want to get coverage for a specific root namespace, e.g. "MyCompany". Leave empty otherwise
$reportGeneratorHistoryPath = "C:\BuildCoverageReportHistories\Dangl.WebDocumentation" # Path where the history of the code coverage Html reports is stored - Must be outside the workspace so it doesn't get deleted when the workspace is cleaned. Leave empty if no history is wanted
# Modify the values below here if you need to - Standards should be fine
$dotnetPath = "C:\Program Files\dotnet\dotnet.exe"
$jenkinsWorkspace = $ENV:WORKSPACE # Use $ENV:WORKSPACE in Jenkins
$codeCoverageHtmlReportDirectory = "CodeCoverageHtmlReport"
$xUnitResultName = "xUnitResults.testresults"
$openCoverResultPath = Join-Path -Path $jenkinsWorkspace -ChildPath "OpenCoverCoverageReport.coverage"
$coberturaResultPath = Join-Path -Path $jenkinsWorkspace -ChildPath "CoberturaCoverageReport.coberturacoverage"
# Get the most recent ReportGenerator NuGet package from the dotnet nuget packages
$nugetReportGeneratorPackage = Join-Path -Path $env:USERPROFILE -ChildPath "\.nuget\packages\ReportGenerator"
$latestReportGenerator = Join-Path -Path ((Get-ChildItem -Path $nugetReportGeneratorPackage | Sort-Object Fullname -Descending)[0].FullName) -ChildPath "tools\ReportGenerator.exe"
# Get the most recent OpenCover NuGet package from the dotnet nuget packages
$nugetOpenCoverPackage = Join-Path -Path $env:USERPROFILE -ChildPath "\.nuget\packages\OpenCover"
$latestOpenCover = Join-Path -Path ((Get-ChildItem -Path $nugetOpenCoverPackage | Sort-Object Fullname -Descending)[0].FullName) -ChildPath "tools\OpenCover.Console.exe"
# Get the most recent OpenCoverToCoberturaConverter from the dotnet nuget packages
$nugetCoberturaConverterPackage = Join-Path -Path $env:USERPROFILE -ChildPath "\.nuget\packages\OpenCoverToCoberturaConverter"
$latestCoberturaConverter = Join-Path -Path (Get-ChildItem -Path $nugetCoberturaConverterPackage | Sort-Object Fullname -Descending)[0].FullName -ChildPath "tools\OpenCoverToCoberturaConverter.exe"
# Run unit tests with OpenCover attached for each test project
ForEach ($testProject in $testProjects){
    $testProjectPath = Join-Path -Path $jenkinsWorkspace -ChildPath $testProject
    # Create a unique output file name for the xUnit result
    $xUnitOutputCommand = "-xml \""" + (Join-Path -Path $jenkinsWorkspace -ChildPath ([Guid]::NewGuid().ToString() + "_" + $xUnitResultName)) + "\"""
    # Construct OpenCover arguments
    $openCoverArguments = New-Object System.Collections.ArrayList
    [void]$openCoverArguments.Add("-register:user")
    [void]$openCoverArguments.Add("-target:""" + $dotnetPath + """")
    [void]$openCoverArguments.Add("-targetargs:"" test " + "\""" + $testProjectPath + "\project.json\"" " + $xUnitOutputCommand + """") # dotnet test arguments
    [void]$openCoverArguments.Add("-output:""" + $openCoverResultPath + """") # OpenCover result output
    [void]$openCoverArguments.Add("-returntargetcode") # Force OpenCover to return an erroneous exit code if the xUnit runner returns one
    [void]$openCoverArguments.Add("-mergeoutput") # Needed if there are multiple test projects
    [void]$openCoverArguments.Add("-oldstyle") # Necessary until https://github.com/OpenCover/opencover/issues/595 is resolved
    if(!([System.String]::IsNullOrWhiteSpace($filterRootNamespace))) {
        [void]$openCoverArguments.Add("-filter:""+[" + $filterRootNamespace + "*]*""") # Check only defined namespaces if specified
    }
    # Run OpenCover with the dotnet test command
    "Running OpenCover tests with the dotnet test command"
    $openCoverProcess = Start-Process -FilePath $latestOpenCover -ArgumentList $openCoverArguments -Wait -PassThru -NoNewWindow
    if ($openCoverProcess.ExitCode -ne 0) {
        "Exiting due to OpenCover process having returned an error, exit code: " + $openCoverProcess.ExitCode
        exit $openCoverProcess.ExitCode
    }
}
# Converting coverage reports to Cobertura format
$coberturaConverterArguments = New-Object System.Collections.ArrayList
[void]$coberturaConverterArguments.Add("-input:""" + $openCoverResultPath + """")
[void]$coberturaConverterArguments.Add("-output:""" + $coberturaResultPath + """")
[void]$coberturaConverterArguments.Add("-sources:""" + $jenkinsWorkspace + """")
$coberturaConverterProcess = Start-Process -FilePath $latestCoberturaConverter -ArgumentList $coberturaConverterArguments -Wait -PassThru -NoNewWindow
if ($coberturaConverterProcess.ExitCode -ne 0) {
    "Exiting due to OpenCoverToCoberturaConverter process having returned an error, exit code: " + $coberturaConverterProcess.ExitCode
    exit $coberturaConverterProcess.ExitCode
} else {
    "Finished running OpenCoverToCoberturaConverter"
}
"Creating the Html report for code coverage results"
# Creating the path for the Html code coverage reports
$codeCoverageHtmlReportPath = Join-Path -Path $jenkinsWorkspace -ChildPath $codeCoverageHtmlReportDirectory
if (-Not (Test-Path -Path $codeCoverageHtmlReportPath -PathType Container)) {
    New-Item -ItemType directory -Path $codeCoverageHtmlReportPath | Out-Null
}
# Create arguments to be passed to the ReportGenerator executable
$reportGeneratorArguments = New-Object System.Collections.ArrayList
[void]$reportGeneratorArguments.Add("-reports:""" + $openCoverResultPath + """")
[void]$reportGeneratorArguments.Add("-targetdir:""" + $codeCoverageHtmlReportPath + """")
if(!([System.String]::IsNullOrWhiteSpace($reportGeneratorHistoryPath))) {
    "Using history for ReportGenerator with directory: " + $reportGeneratorHistoryPath
    [void]$reportGeneratorArguments.Add("-historydir:""" + $reportGeneratorHistoryPath + """") # Use the history directory if specified
} else {
    "Not using history for ReportGenerator"
}
# Run ReportGenerator
$reportGeneratorProcess = Start-Process -FilePath $latestReportGenerator -ArgumentList $reportGeneratorArguments -Wait -PassThru -NoNewWindow
if ($reportGeneratorProcess.ExitCode -ne 0) {
    "Exiting due to ReportGenerator process having returned an error, exit code: " + $reportGeneratorProcess.ExitCode
    exit $reportGeneratorProcess.ExitCode
} else {
    "Finished running ReportGenerator"
}
"Finished running unit tests and code coverage"&lt;/pre&gt;
&lt;blockquote&gt;&lt;strong&gt;Attention:&lt;/strong&gt; The &lt;span class="Code"&gt;-oldstyle&lt;/span&gt; argument is currently necessary due to a &lt;a rel="noopener noreferrer" href="https://github.com/OpenCover/opencover/issues/595" target="_blank" title="GitHub - OpenCover Issue 595"&gt;renaming of the mscorlib.dll in .Net Core&lt;/a&gt;.&lt;/blockquote&gt;
&lt;p&gt;This script calls the OpenCover tool to consecutively run all test projects and aggregate the results. These are then converted into the Cobertura format so that Jenkins can pick them up. Finally, the ReportGenerator tool is called to create an additional visualization of the coverage analysis.&lt;/p&gt;
&lt;p&gt;Since the resulting files are being stored in the workspace, you need to either configure the job to clean before each build or add another step to delete all earlier results from previous runs. Otherwise, you might run into trouble with duplicated result data.&lt;/p&gt;
&lt;h2&gt;Evaluate Results in Jenkins&lt;/h2&gt;
&lt;p&gt;The following three images show the necessary post build steps to publish the results in Jenkins. The file patterns are actually from the generated files in the PowerShell script above, so if you're modifying the script to use other filenames don't forget to adjust your post build actions!&lt;/p&gt;
&lt;p&gt;&lt;img id="__mcenew" style="display: block; margin-left: auto; margin-right: auto;" src="https://blog.dangl.me/media/1076/publishxunittestresults.png" alt="Jenkins xUnit Result Publishing Configuration" data-id="1176"&gt;&lt;/p&gt;
&lt;p&gt;&lt;img id="__mcenew" style="display: block; margin-left: auto; margin-right: auto;" src="https://blog.dangl.me/media/1074/publishcoberturacoveragereport.png" alt="Jenkins Cobertura Code Coverage Analysis Report" data-id="1174"&gt;&lt;/p&gt;
&lt;p&gt;&lt;img id="__mcenew" style="display: block; margin-left: auto; margin-right: auto;" src="https://blog.dangl.me/media/1075/publishhtmlreport.png" alt="Jenkins Custom Html Report Publishing Configuration" data-id="1175"&gt;&lt;/p&gt;
&lt;h2&gt;View the Results in Jenkins&lt;/h2&gt;
&lt;p&gt;Now when you run the job, you'll have the following graphs available on your job's main page:&lt;/p&gt;
&lt;p&gt;&lt;img id="__mcenew" style="display: block; margin-left: auto; margin-right: auto;" src="https://blog.dangl.me/media/1077/testsandcoverageresultsinjenkins.png" alt="Jenkins CI Unit Test and Code Coverage Results" data-id="1177"&gt;&lt;/p&gt;
&lt;p&gt;There's also an entry called &lt;span class="Code"&gt;Code Coverage Source Visualization&lt;/span&gt; in the menu on the left of the job overview where you'll see a visualized code coverage analysis for your project.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Update 25.09.2016:&lt;/strong&gt; Added info about how to resolve zero code coverage reports due to rename of mscorlib.dll&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Update 31.05.2017:&lt;/strong&gt; Added link to Linux tutorial&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Update 19.08.2017:&lt;/strong&gt; Added link to updated edition for the current csproj format&lt;/p&gt;</description>
      <pubDate>Mon, 30 May 2016 22:34:28 Z</pubDate>
      <a10:updated>2016-05-30T22:34:28Z</a10:updated>
    </item>
    <item>
      <guid isPermaLink="false">1152</guid>
      <link>https://blog.dangl.me/archive/configure-git-hooks-on-bonobo-git-server-in-windows/</link>
      <category>Continuous Integration</category>
      <title>Configure Git Hooks on Bonobo Git Server in Windows</title>
      <description>&lt;p&gt;&lt;span&gt;This guide will show you how to configure a post receive hook on a Bonobo Git Server that notifies a Jenkins CI server about repository changes. This means that whenever a new commit is pushed to the Git repository, the cURL command line tool will be launched to initiate a Http request telling Jenkins to trigger a job. Bonobo Git Server is a really nice way of setting up private, remote repositories. &lt;a href="https://bonobogitserver.com" target="_blank" title="Bonobo Git Server Project Site"&gt;You can get it at the project site.&lt;/a&gt;&lt;/span&gt;&lt;/p&gt;
&lt;h2&gt;&lt;span&gt;Making cURL Available to the Git Server&lt;/span&gt;&lt;/h2&gt;
&lt;p&gt;&lt;span&gt;cURL is a nice little command line utility that does all things Http for you. &lt;br&gt; In order to have cURL available on your server, you need to perform the following steps:&lt;/span&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;span&gt;Download cURL and extract the executable and the &lt;span class="Code"&gt;ca-bundle.crt&lt;/span&gt; certificate store to a folder of your choice, e.g. &lt;span class="Code"&gt;C:\cURL\&lt;/span&gt; &lt;strong&gt;(1)&lt;/strong&gt;&lt;/span&gt;&lt;/li&gt;
&lt;li&gt;Add the folder you have chosen to the Windows Path environment variable. To do this, open the Windows Explorer, right click on &lt;span class="Code"&gt;This Computer&lt;/span&gt; &lt;strong&gt;(2)&lt;/strong&gt;, select &lt;span class="Code"&gt;Properties&lt;/span&gt; and then select &lt;span class="Code"&gt;Advanced Settings&lt;/span&gt; &lt;strong&gt;(3)&lt;/strong&gt;. Now click on &lt;span class="Code"&gt;Environment Variables&lt;/span&gt; &lt;strong&gt;(4)&lt;/strong&gt;. In the following dialog, find the system variable called &lt;span class="Code"&gt;Path&lt;/span&gt; &lt;strong&gt;(5)&lt;/strong&gt;, select it and click on edit. Add the path to the folder containing cURL at the end of the variable &lt;strong&gt;(6)&lt;/strong&gt;. Remember to separate it from the previous entry with a semicolon.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;img id="__mcenew" style="display: block; margin-left: auto; margin-right: auto;" src="https://blog.dangl.me/media/1057/addcurlpathvariable.png" alt="Add an entry to the Windows Path variable" data-id="1154"&gt;&lt;/p&gt;
&lt;h2&gt;&lt;span&gt;Setting Up the Git Hook to Notify Jenkins&lt;/span&gt;&lt;/h2&gt;
&lt;p&gt;&lt;span&gt;Navigate to the repository in Windows Explorer (by default, it’s relative to the IIS Website path in &lt;span class="Code"&gt;~\App_Data\Repositories\&amp;lt;RepositoryName&amp;gt;&lt;/span&gt;)&lt;br&gt; Go to the &lt;span class="Code"&gt;hooks&lt;/span&gt; subdirectory and create an empty file with the name &lt;span class="Code"&gt;post-receive&lt;/span&gt; (yes, no extension!)&lt;br&gt; Open the file with a text editor and paste the following content:&lt;/span&gt;&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;pre class="brush: bash;"&gt;#!/bin/sh
echo Notifying Jenkins Server
curl https://&amp;lt;YourJenkinsServer&amp;gt;/git/notifyCommit?url=&amp;lt;YourGitRepositoryUrl&amp;gt;&lt;/pre&gt;
&lt;p&gt;&lt;span&gt;&lt;img id="__mcenew" style="display: block; margin-left: auto; margin-right: auto;" src="https://blog.dangl.me/media/1058/gitpostreceivehook.png" alt="Git Post Receive Hook in Notepad++" data-id="1155"&gt;&lt;/span&gt;&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt;&lt;span&gt;This script will be automatically executed whenever you make a push to the server, and the three lines mean the following:&lt;br&gt; The first line tells Git what type of script the file contains (/bin/sh here meaning a regular shell script)&lt;br&gt; The second line echoes a message to indicate that it’s doing work (you can view the whole console output in your Git response)&lt;br&gt; On the third line, curl is executed to perform a Http GET request to &lt;span class="Code"&gt;&amp;lt;YourJenkinsServer&amp;gt;&lt;/span&gt; that passes the Url of &lt;span class="Code"&gt;&amp;lt;YourGitRepositoryUrl&amp;gt;&lt;/span&gt; as a query parameter, so Jenkins knows which repository it should check for updates.&lt;/span&gt;&lt;/p&gt;
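&lt;p&gt;With concrete (made-up) server and repository addresses filled in, the request issued by the hook would look like this:&lt;/p&gt;
&lt;pre class="brush: bash;"&gt;curl https://jenkins.example.com/git/notifyCommit?url=https://git.example.com/MyProject.git&lt;/pre&gt;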
&lt;p&gt;Jenkins is always listening on that endpoint. If a request hits, Jenkins will start all jobs that are configured for this Git repository.&lt;/p&gt;</description>
      <pubDate>Thu, 12 May 2016 21:02:55 Z</pubDate>
      <a10:updated>2016-05-12T21:02:55Z</a10:updated>
    </item>
    <item>
      <guid isPermaLink="false">1148</guid>
      <link>https://blog.dangl.me/archive/basic-jenkins-configuration-for-dotnet-continuous-integration/</link>
      <category>Continuous Integration</category>
      <title>Basic Jenkins Configuration for .Net Continuous Integration</title>
      <description>&lt;p&gt;This article will cover how to do a basic setup for .Net projects using MSBuild, MSTest and Git in a Jenkins Continuous Integration server. It doesn't cover the Jenkins installation, so I'll assume you've got a working installation ready. Otherwise, you can read &lt;a data-id="1092" href="/archive/installing-and-configuring-jenkins-on-windows-server-with-iis/" title="Installing and Configuring Jenkins on Windows Server with IIS"&gt;this post about how to install Jenkins on a Windows Server&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;If you follow this tutorial, you'll see how to create automatic tests and builds, triggered after committing code to a Git repository.&lt;/p&gt;
&lt;h2&gt;Windows Setup&lt;/h2&gt;
&lt;p&gt;&lt;span&gt;Before turning your attention to Jenkins, you need to set up some prerequisites on your server. Some tools and settings are required for a great DevOps environment.&lt;/span&gt;&lt;/p&gt;
&lt;blockquote&gt;&lt;span&gt;&lt;strong&gt;Reminder:&lt;/strong&gt; When you configure system settings such as Windows’ PATH variable or create tool installations, make sure that the user account under which your Jenkins installation is running has access to them.&lt;/span&gt;&lt;/blockquote&gt;
&lt;ul&gt;
&lt;li&gt;&lt;span&gt;Install Git on the server&lt;br&gt; Jenkins will rely on an external Git installation, so just download the latest release &lt;a rel="noopener noreferrer" href="https://git-scm.com/downloads" target="_blank" title="Git SCM Downloads"&gt;at the official site&lt;/a&gt; and install it on the server. If you’re using a self-signed certificate, make sure to follow the steps to add the certificate to Git’s trusted certificates as described in another post.&lt;/span&gt;&lt;/li&gt;
&lt;li&gt;&lt;span&gt;Agents for Microsoft Visual Studio 2015 (If you don’t have Visual Studio installed)&lt;br&gt; Go to &lt;a rel="noopener noreferrer" href="https://www.microsoft.com/en-us/download/details.aspx?id=48152" target="_blank" title="Microsoft Build Agents for Visual Studio 2015"&gt;the official download site&lt;/a&gt; and download the Agents for Microsoft Visual Studio 2015. That’s a package intended for running MSTest without having Visual Studio installed. It’s got that typical Visual Studio installer with its extra-long install duration, so better download and start the install right now, before you reach the point in this tutorial where it’s required 😉&lt;/span&gt;&lt;/li&gt;
&lt;li&gt;&lt;span&gt;Microsoft Build Tools&lt;br&gt;Go to &lt;a rel="noopener noreferrer" href="https://www.visualstudio.com/downloads/download-visual-studio-vs" target="_blank" title="VisualStudio.com Download Section"&gt;the download section at VisualStudio.com&lt;/a&gt; and navigate to the downloads section. Under &lt;span class="Code"&gt;Tools for Visual Studio 2015&lt;/span&gt;, you’ll find a download for Microsoft Build Tools that you need to grab. &lt;span&gt;The picture below shows the correct download. &lt;/span&gt;&lt;img id="__mcenew" style="display: block; margin-left: auto; margin-right: auto;" src="https://blog.dangl.me/media/1056/visualstudiocombuildtoolsdownload.png" alt="VisualStudio.com Microsoft Build Tools Download Page" data-id="1151"&gt;&lt;/span&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;blockquote&gt;&lt;strong&gt;Update 2017-11-06:&lt;/strong&gt; I'd actually recommend now to go for &lt;strong&gt;Visual Studio 2017&lt;/strong&gt;. You can get it at &lt;a rel="noopener noreferrer" href="https://www.visualstudio.com/downloads/" target="_blank" title="Visual Studio Downloads"&gt;the official download site&lt;/a&gt; and if you select the workloads you need, you'll get all the required SDKs without having to worry about each single one.&lt;/blockquote&gt;
&lt;blockquote&gt;&lt;strong&gt;Tip:&lt;/strong&gt; If you’re having a lot of different target frameworks, you might actually be better off to go along with Visual Studio Community Edition instead of the separate Build Tools &amp;amp; Agents. It’s more heavy on the install side, but it comes with all different .Net SDKs and features installed that you’ll need (except the Dotnet CLI environment, but that’s in another post).&lt;/blockquote&gt;
&lt;ul&gt;
&lt;li&gt;NuGet&lt;br&gt;You'll be fine by getting the latest executable from the &lt;a rel="noopener noreferrer" href="https://docs.nuget.org/consume/installing-nuget" target="_blank" title="NuGet Documentation and Download Section"&gt;NuGet documentation&lt;/a&gt; and place it in a directory that's accessible to your Jenkins service.&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;&lt;span&gt;Git Windows Style Line Ending Setting&lt;/span&gt;&lt;/h2&gt;
&lt;p&gt;&lt;span&gt;You’re probably familiar with the issue of different operating systems using different defaults for representing line endings, such as &lt;span class="Code"&gt;\r\n&lt;/span&gt; in Windows or just &lt;span class="Code"&gt;\n&lt;/span&gt; in Linux. Since Git is not a native Windows tool, it defaults to the behavior of normalizing line endings to &lt;span class="Code"&gt;\n&lt;/span&gt; when checking out code. &lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;However, imagine a case where you’re performing some integration testing of a library that creates text and uses C#’s &lt;span class="Code"&gt;Environment.NewLine&lt;/span&gt; as line endings. In Windows, this will result in a &lt;span class="Code"&gt;\r\n&lt;/span&gt; newline. Now when you compare text created in your tool in an integration test, you might run into issues where &lt;span class="Code"&gt;Expected Line #1\r\nExpected Line #2&lt;/span&gt; is different from &lt;span class="Code"&gt;Expected Line #1\nExpected Line #2&lt;/span&gt;. You can handle this in your tests by ignoring line endings, but you can also easily reconfigure Git to use standard Windows line endings. I prefer the latter since I’m in a pure Windows environment and don’t need to deal with cross-OS normalization.&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;Logged in as the user under which Jenkins runs, issue the following command in Git to ensure that on checking out, Windows line endings are used:&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span class="Code"&gt;Git config --global core.autocrlf true&lt;/span&gt;&lt;/p&gt;
&lt;h2&gt;&lt;span&gt;Required Plugins&lt;/span&gt;&lt;/h2&gt;
&lt;p&gt;&lt;span&gt;To make Jenkins work with Git and the .Net stack, we need to add a few plugins in the &lt;span class="Code"&gt;Manage Jenkins&lt;/span&gt; -&amp;gt; &lt;span class="Code"&gt;Manage Plugins&lt;/span&gt; section:&lt;/span&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;span&gt;&lt;a rel="noopener noreferrer" href="https://wiki.jenkins-ci.org/display/JENKINS/Git+Plugin" target="_blank" title="Jenkins Git Plugin"&gt;Git Plugin&lt;/a&gt;&lt;br&gt; Provides support for Git repositories&lt;/span&gt;&lt;/li&gt;
&lt;li&gt;&lt;span&gt;&lt;a rel="noopener noreferrer" href="https://wiki.jenkins-ci.org/display/JENKINS/MSBuild+Plugin" target="_blank" title="Jenkins MSBuild Plugin"&gt;MSBuild Plugin&lt;/a&gt;&lt;br&gt; We need that to build our .Net projects&lt;/span&gt;&lt;/li&gt;
&lt;li&gt;&lt;span&gt;&lt;a rel="noopener noreferrer" href="https://wiki.jenkins-ci.org/display/JENKINS/VsTestRunner+Plugin" target="_blank" title="Jenkins VSTest Runner Plugin"&gt;VSTest Runner plugin&lt;/a&gt;&lt;br&gt; This will use vstest.console.exe to run unit tests. That’s the unit test execution engine Visual Studio uses. You’ll want this since I’ve encountered issues with MSTest, for example the &lt;span class="Code"&gt;ExpectedException&lt;/span&gt; attribute does not work with the MSTest that’s currently included in the Agents for Visual Studio and there might be more issues. It might work for you, but if you’re relying on MSTest then better take the VSTest Runner&lt;/span&gt;&lt;/li&gt;
&lt;li&gt;&lt;span&gt;&lt;a rel="noopener noreferrer" href="https://wiki.jenkins-ci.org/display/JENKINS/MSTest+Plugin" target="_blank" title="Jenkins MSTest Plugin"&gt;MSTest plugin&lt;/a&gt;&lt;br&gt; To convert MSTest results to the JUnit Xml format&lt;/span&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;span&gt;Install these plugins and restart Jenkins.&lt;/span&gt;&lt;/p&gt;
&lt;h2&gt;&lt;span&gt;Plugin Configuration&lt;/span&gt;&lt;/h2&gt;
&lt;h3&gt;&lt;span&gt;Git Server Credentials&lt;/span&gt;&lt;/h3&gt;
&lt;p&gt;&lt;span&gt;You have to create a login on your Git server so that Jenkins can access the repositories. When you have created a user with access rights to the repository, go back to Jenkins, navigate to &lt;span class="Code"&gt;Manage Jenkins&lt;/span&gt; -&amp;gt; &lt;span class="Code"&gt;Manage Credentials&lt;/span&gt; -&amp;gt; &lt;span class="Code"&gt;Add Credentials&lt;/span&gt; and then select &lt;span class="Code"&gt;Username with password&lt;/span&gt; in the combobox.&lt;br&gt; Enter the credentials you have created on the Git server here so that they can be used whenever Jenkins needs to access the Git server.&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;&lt;img id="__mcenew" style="display: block; margin-left: auto; margin-right: auto;" src="https://blog.dangl.me/media/1062/addgitcredentials.png" alt="Entering Git Server Credentials in Jenkins" data-id="1159"&gt;&lt;/span&gt;&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt;&lt;span&gt;If you haven’t done it yet, create a new Git repository containing a simple C# (or VB.Net or...) class library and an MSTest unit test project.&lt;/span&gt;&lt;/p&gt;
&lt;h3&gt;&lt;span&gt;MSBuild Configuration&lt;/span&gt;&lt;/h3&gt;
&lt;p&gt;MSBuild should already be present on your system. If it is not, you can manually download the .Net framework or use the server manager to add the ASP.Net features to your IIS installation and have the framework automatically come along. The folder shown is the standard location of MSBuild (for the currently latest .Net version). This configuration section is again found in &lt;span class="Code"&gt;Manage Jenkins&lt;/span&gt; -&amp;gt; &lt;span class="Code"&gt;Global Tool Configuration&lt;/span&gt;.&lt;/p&gt;
&lt;p&gt;&lt;img id="__mcenew" style="display: block; margin-left: auto; margin-right: auto;" src="https://blog.dangl.me/media/1069/configuremsbuild.png" alt="Jenkins MSBuild Plugin Configuration" data-id="1166"&gt;&lt;/p&gt;
&lt;h3&gt;&lt;span&gt;VSTest Configuration&lt;/span&gt;&lt;/h3&gt;
&lt;p&gt;This is similar to setting up MSBuild. You’ll need to point the plugin to the Visual Studio directory that contains &lt;span class="Code"&gt;vstest.console.exe&lt;/span&gt; (it’s also installed with the Agents for Microsoft Visual Studio 2015). I named it after the Visual Studio version.&lt;/p&gt;
&lt;p&gt;&lt;img id="__mcenew" style="display: block; margin-left: auto; margin-right: auto;" src="https://blog.dangl.me/media/1070/configurevstest.png" alt="Jenkins VSTest Runner Plugin" data-id="1167"&gt;&lt;/p&gt;
&lt;h2&gt;&lt;span&gt;Create your first job&lt;/span&gt;&lt;/h2&gt;
&lt;p&gt;In Jenkins, go to the main menu and click &lt;span class="Code"&gt;New Item&lt;/span&gt; in the upper left corner. Give the job a name and select &lt;span class="Code"&gt;Freestyle Project&lt;/span&gt;, then click OK.&lt;/p&gt;
&lt;p&gt;&lt;img id="__mcenew" style="display: block; margin-left: auto; margin-right: auto;" src="https://blog.dangl.me/media/1071/jenkinscreatejob.png" alt="Create a Freestyle Project Job in Jenkins" data-id="1168"&gt;&lt;/p&gt;
&lt;p&gt;Now you are in the job configuration page. Here, go to the &lt;span class="Code"&gt;Source Code Management&lt;/span&gt; section and tell Jenkins to use Git and specify the Url of the repository. Select the login credentials that you’ve configured earlier for the Git server. Also click on &lt;span class="Code"&gt;Add Additional Behaviours&lt;/span&gt; and select &lt;span class="Code"&gt;Clean before checkout&lt;/span&gt;, otherwise the results from previous builds will not be cleaned up and you get more and more test results with every build=)&lt;/p&gt;
&lt;p&gt;&lt;img id="__mcenew" style="display: block; margin-left: auto; margin-right: auto;" src="https://blog.dangl.me/media/1061/sourcecontrolconfiguration.png" alt="Jenkins Git Repository Source Code Management" data-id="1158"&gt;&lt;/p&gt;
&lt;p&gt;In &lt;span class="Code"&gt;Build Triggers&lt;/span&gt;, just select &lt;span class="Code"&gt;Poll SCM&lt;/span&gt; without entering a schedule. It’ll warn you that it will never run, but that’s wrong, it will run whenever you push to the Git server. It’s important to leave &lt;span class="Code"&gt;Ignore post-commit hooks&lt;/span&gt; unchecked!&lt;/p&gt;
&lt;p&gt;&lt;img id="__mcenew" style="display: block; margin-left: auto; margin-right: auto;" src="https://blog.dangl.me/media/1068/buildtriggers.png" alt="Jenkins Git Poll SCM Build Trigger Configuration" data-id="1165"&gt;&lt;/p&gt;
&lt;p&gt;Now we configure two build steps. The first will make the build using MSBuild. It’s pretty straightforward, we’re just referencing the *.sln file via its relative path in the repository (It’s in the root folder for this sample project).&lt;/p&gt;
&lt;p&gt;&lt;img id="__mcenew" style="display: block; margin-left: auto; margin-right: auto;" src="https://blog.dangl.me/media/1066/buildstep_build.png" alt="Jenkins MSBuild Build Step Configuration" data-id="1163"&gt;&lt;/p&gt;
&lt;p&gt;Another build step is configured to execute VSTest. The steps are done consecutively, so it’ll wait until the build step is done to execute the tests. Two parameters are specified: The dll containing the unit tests and the name of the file in which to store the unit test results. Note that you could enter multiple unit test dlls here if necessary. You can leave the advanced settings as they are.&lt;/p&gt;
&lt;p&gt;&lt;img id="__mcenew" style="display: block; margin-left: auto; margin-right: auto;" src="https://blog.dangl.me/media/1067/buildstep_test.png" alt="Jenkins VSTest MSTest Runner Build Step" data-id="1164"&gt;&lt;/p&gt;
&lt;blockquote&gt;&lt;strong&gt;Tip:&lt;/strong&gt; If you have any NuGet packages referenced in your project, you most likely don’t have them in source control (which is good!). Then you need to restore them before you can build the project. Hence, add the following build step as the FIRST step of your build actions (and set the paths according to your local config):&lt;img id="__mcenew" style="display: block; margin-left: auto; margin-right: auto;" src="https://blog.dangl.me/media/1072/nugetrestorestep.png" alt="Jenkins Build NuGet Restore Step" data-id="1169"&gt;&lt;/blockquote&gt;
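&lt;p&gt;&lt;span&gt;The restore step itself is just a command-line call; as a Windows batch build step it might look like this (the NuGet path and solution name are examples, &lt;span class="Code"&gt;%WORKSPACE%&lt;/span&gt; is set by Jenkins):&lt;/span&gt;&lt;/p&gt;

```
"C:\BuildTools\NuGet\nuget.exe" restore "%WORKSPACE%\YourSolution.sln"
```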
&lt;p&gt;After the build we tell Jenkins to do two things: Read that *.trx MSTest results file from the VSTest Runner and convert it to a JUnit output (this allows Jenkins to display the unit test results) and additionally we’ll set up an email notification for when things go wrong.&lt;/p&gt;
&lt;p&gt;&lt;img id="__mcenew" style="display: block; margin-left: auto; margin-right: auto;" src="https://blog.dangl.me/media/1060/postbuild.png" alt="Jenkins Post Build Actions to Analyze Tests and Email Notification" data-id="1157"&gt;&lt;/p&gt;
&lt;p&gt;Now click &lt;span class="Code"&gt;Save&lt;/span&gt; and you’re in the detail view of your created Job. Click on &lt;span class="Code"&gt;Build Now&lt;/span&gt; on the left to start your first build! In a few moments, the &lt;span class="Code"&gt;Build History&lt;/span&gt; will show you the job. You can hover over the date entry and expand the sub menu. Now select &lt;span class="Code"&gt;Console Output&lt;/span&gt; to see what’s going on.&lt;/p&gt;
&lt;p&gt;&lt;img id="__mcenew" style="display: block; margin-left: auto; margin-right: auto;" src="https://blog.dangl.me/media/1065/buildrunning.png" alt="Jenkins Job Overview with Completed Job" data-id="1162"&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;If everything went right, the last line in the console output should say: &lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span class="Code"&gt;Finished: SUCCESS&lt;/span&gt;&lt;/p&gt;
&lt;blockquote&gt;&lt;strong&gt;Tip:&lt;/strong&gt; If the test failed, the console output should provide you with information on what went wrong.&lt;/blockquote&gt;
&lt;h2&gt;&lt;span&gt;Testing the Git hook&lt;/span&gt;&lt;/h2&gt;
&lt;p&gt;&lt;span&gt;If you've already set up a remote Git repository and configured a Git hook to notify Jenkins on updates (&lt;a data-id="1152" href="/archive/configure-git-hooks-on-bonobo-git-server-in-windows/" title="Configure Git Hooks on Bonobo Git Server in Windows"&gt;here's a post describing how to do that with a self hosted Bonobo Git Server&lt;/a&gt;), now would be the time to trigger an automatic build.&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;When you look at Jenkins after having done a commit on your remote Git repository, &lt;span class="Code"&gt;Build History&lt;/span&gt; should now have a second entry.&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;&lt;img id="__mcenew" style="display: block; margin-left: auto; margin-right: auto;" src="https://blog.dangl.me/media/1059/secondbuildhistoryentry.png" alt="Jenkins Build History with Second Job Entry" data-id="1156"&gt;&lt;/span&gt;&lt;/p&gt;
&lt;h2&gt;&lt;span&gt;Break the Build&lt;/span&gt;&lt;/h2&gt;
&lt;p&gt;This rarely ever happens in real life, but legend has it that builds break. Let’s add some faulty code and make the compiler unable to build the project. Then commit. Refresh the page...&lt;/p&gt;
&lt;p&gt;&lt;img id="__mcenew" style="display: block; margin-left: auto; margin-right: auto;" src="https://blog.dangl.me/media/1063/brokenbuild.png" alt="Broken Build in Jenkins Build History" data-id="1160"&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;… and face disaster! &lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;Luckily, you've also been alerted by email of the failing build. You can go to the build and view the console output (that’s also what’s been sent to you by email). It’ll show you a log that will list some compiler errors near the bottom.&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;Let’s fix that and make another commit. You’ll get another email telling you that the build is back to normal.&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;To summarize, whenever there is a change that makes a build fail, you’ll be notified by email. As soon as the build starts working again, you’ll get another email. Quite simple yet so convenient!&lt;/span&gt;&lt;/p&gt;</description>
      <pubDate>Thu, 12 May 2016 20:22:13 Z</pubDate>
      <a10:updated>2016-05-12T20:22:13Z</a10:updated>
    </item>
    <item>
      <guid isPermaLink="false">1121</guid>
      <link>https://blog.dangl.me/archive/set-up-private-nuget-feeds-with-proget-and-jenkins/</link>
      <category>Continuous Integration</category>
      <title>Set Up Private NuGet Feeds with ProGet and Jenkins</title>
      <description>&lt;p&gt;&lt;span&gt;This post’s goal is to have a continuous integration server (Jenkins) pull an MSBuild project from a remote repository, build it and upload it to a private NuGet server. The NuGet feed is hosted via ProGet with an authorization mechanism to restrict user access. Since I’m only using the free version of ProGet, this tutorial won’t cover setting up different feeds for different users.&lt;/span&gt;&lt;/p&gt;
&lt;h2&gt;&lt;span&gt;Install &amp;amp; Configure &lt;/span&gt;ProGet&lt;/h2&gt;
&lt;p&gt;&lt;span&gt;There’s a &lt;a rel="noopener noreferrer" href="http://inedo.com/proget" target="_blank" title="ProGet Website"&gt;free edition of ProGet&lt;/a&gt; available that can host private NuGet (and more) feeds. Installation and initial configuration is nicely covered by their documentation, so I won’t go into much detail here. There was just one bit missing in the documentation that caused some troubles for me: When you’re using MS SQL Server, it’s possible that your default collation is different to the one ProGet will try to set during the database initialization, so make sure to create the ProGet database with &lt;span class="Code"&gt;SQL_Latin1_General_CP1_CI_AS&lt;/span&gt; collation and you’ll not run into issues, otherwise ProGet might fail to apply its database schema during installation.&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;To keep your feeds private, go into the settings page of &lt;span class="Code"&gt;ProGet&lt;/span&gt; -&amp;gt; &lt;span class="Code"&gt;Assign Privileges&lt;/span&gt; and remove the one that grants anonymous users &lt;span class="Code"&gt;View Only&lt;/span&gt; rights.&lt;/span&gt;&lt;/p&gt;
&lt;h2&gt;&lt;span&gt;Set Up a Private Feed on the Server&lt;/span&gt;&lt;/h2&gt;
&lt;p&gt;&lt;span&gt;In ProGet’s configuration section, go to &lt;span class="Code"&gt;Manage Feeds&lt;/span&gt; -&amp;gt; &lt;span class="Code"&gt;Create Feed&lt;/span&gt; -&amp;gt; &lt;span class="Code"&gt;NuGet&lt;/span&gt; and assign a name for that feed. The only thing you’ll want to set here is the &lt;span class="Code"&gt;NuGet API Key&lt;/span&gt;, since we’re going to push packages via a Jenkins job and we don’t want to include user credentials in a build job in plain text. Leave the symbols server option turned on.&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: center;"&gt;&lt;span&gt;&lt;img id="__mcenew" src="https://blog.dangl.me/media/1034/progetfeedconfiguration.png" alt="ProGet NuGet Feed Configuration" data-id="1123"&gt;&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: center;"&gt; &lt;/p&gt;
&lt;p style="text-align: left;"&gt;Now if you go back to the feed overview, your feed will be listed along the default feed with its Url.&lt;/p&gt;
&lt;p style="text-align: center;"&gt;&lt;img id="__mcenew" src="https://blog.dangl.me/media/1035/progetfeedsoverview.png" alt="ProGet NuGet Feeds Overview" data-id="1124"&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;Admittedly, in the free edition, creating a feed isn’t really necessary at all since there’s no way to customize access based on feeds. It’s all or nothing in the free edition.&lt;/span&gt;&lt;/p&gt;
&lt;h2&gt;&lt;span&gt;Enable NuGet Package Upload with an ApiKey&lt;/span&gt;&lt;/h2&gt;
&lt;p&gt;&lt;span&gt;Later, you’ll want to use the ApiKey to simplify uploading packages via the command line. For this, you have to create a new role with the task &lt;span class="Code"&gt;Feeds_AddPackage&lt;/span&gt; in the &lt;span class="Code"&gt;Manage Roles&lt;/span&gt; section:&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;&lt;img id="__mcenew" style="display: block; margin-left: auto; margin-right: auto;" src="https://blog.dangl.me/media/1036/progetcreateprivilege.png" alt="ProGet User Privilege Creation" data-id="1125"&gt;&lt;/span&gt;&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt;&lt;span&gt;Then, assign that role to the &lt;span class="Code"&gt;Anonymous&lt;/span&gt; user in the &lt;span class="Code"&gt;Manage Privileges&lt;/span&gt; section:&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;&lt;img id="__mcenew" style="display: block; margin-left: auto; margin-right: auto;" src="https://blog.dangl.me/media/1037/progetcreaterole.png" alt="ProGet User Role Creation" data-id="1126"&gt;&lt;/span&gt;&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt;&lt;span&gt;Now, to upload a package, only the ApiKey is required (when one is configured for the feed).&lt;/span&gt;&lt;/p&gt;
&lt;h2&gt;&lt;span&gt;Configure your Project for NuGet Packaging&lt;/span&gt;&lt;/h2&gt;
&lt;p&gt;&lt;span&gt;Creating a NuGet package is really easy. At first, you must find out where your NuGet.exe is located, or simply download the latest executable from &lt;a href="https://www.nuget.org"&gt;https://www.nuget.org&lt;/a&gt;. Then, in your project dir, run the &lt;span class="Code"&gt;nuget.exe spec&lt;/span&gt; command. &lt;/span&gt;&lt;span&gt;A *.nuspec file is created with the same name as your *.csproj file and this default content:&lt;/span&gt;&lt;/p&gt;
&lt;pre class="brush: xhtml;"&gt;&amp;lt;?xml version="1.0"?&amp;gt;
&amp;lt;package &amp;gt;
  &amp;lt;metadata&amp;gt;
    &amp;lt;id&amp;gt;$id$&amp;lt;/id&amp;gt;
    &amp;lt;version&amp;gt;$version$&amp;lt;/version&amp;gt;
    &amp;lt;title&amp;gt;$title$&amp;lt;/title&amp;gt;
    &amp;lt;authors&amp;gt;$author$&amp;lt;/authors&amp;gt;
    &amp;lt;owners&amp;gt;$author$&amp;lt;/owners&amp;gt;
    &amp;lt;licenseUrl&amp;gt;http://LICENSE_URL_HERE_OR_DELETE_THIS_LINE&amp;lt;/licenseUrl&amp;gt;
    &amp;lt;projectUrl&amp;gt;http://PROJECT_URL_HERE_OR_DELETE_THIS_LINE&amp;lt;/projectUrl&amp;gt;
    &amp;lt;iconUrl&amp;gt;http://ICON_URL_HERE_OR_DELETE_THIS_LINE&amp;lt;/iconUrl&amp;gt;
    &amp;lt;requireLicenseAcceptance&amp;gt;false&amp;lt;/requireLicenseAcceptance&amp;gt;
    &amp;lt;description&amp;gt;$description$&amp;lt;/description&amp;gt;
    &amp;lt;releaseNotes&amp;gt;Summary of changes made in this release of the package.&amp;lt;/releaseNotes&amp;gt;
    &amp;lt;copyright&amp;gt;Copyright 2016&amp;lt;/copyright&amp;gt;
    &amp;lt;tags&amp;gt;Tag1 Tag2&amp;lt;/tags&amp;gt;
  &amp;lt;/metadata&amp;gt;
&amp;lt;/package&amp;gt;&lt;/pre&gt;
&lt;p&gt;&lt;span&gt;All the fields enclosed in &lt;span class="Code"&gt;$&lt;/span&gt; tokens will be automatically populated when you create the package, but you can change them to hard-coded values or have your CI system populate them. Just make sure that for every placeholder you keep in the file, your assembly actually defines the corresponding attribute. For example, if in your project settings in Visual Studio there is no Description for the assembly, packaging will fail because the &lt;span class="Code"&gt;$description$&lt;/span&gt; token has no content.&lt;/span&gt;&lt;/p&gt;
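&lt;p&gt;&lt;span&gt;For reference, the tokens are typically filled from the assembly attributes in Properties/AssemblyInfo.cs; a sketch of the relevant attributes (all values are examples, and note that &lt;span class="Code"&gt;$id$&lt;/span&gt; comes from the assembly name itself):&lt;/span&gt;&lt;/p&gt;

```csharp
using System.Reflection;

[assembly: AssemblyTitle("Your.Namespace")]                // fills $title$
[assembly: AssemblyDescription("A useful shared library")] // fills $description$
[assembly: AssemblyCompany("Your Company")]                // fills $author$
[assembly: AssemblyVersion("1.0.0.0")]                     // fills $version$
```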
&lt;p&gt;&lt;span&gt;You can do a test run to create a package via &lt;span class="Code"&gt;nuget.exe pack &amp;lt;PathToYourCsprojFile&amp;gt; -Prop Configuration=Release -Symbols&lt;/span&gt;&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;This will generate a *.nupkg and a *.symbols.nupkg file. The symbols file does contain your *.pdb files.&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;Next, before you can deploy you need to set the NuGet ApiKey for your ProGet server via this command &lt;span class="Code"&gt;nuget setapikey &amp;lt;YourApiKey&amp;gt; -source &amp;lt;UrlOfYourFeed&amp;gt;&lt;/span&gt;&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;This will locally (for the current user, so execute the command also with the user that is configured for your Jenkins service) store the ApiKey for that specific feed for pushing packages.&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;Now pushing the package is really easy with &lt;/span&gt;&lt;span class="Code"&gt;nuget push &amp;lt;PathToYourPackage&amp;gt; -source &amp;lt;PathToYourFeed&amp;gt;&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;If you do want to include symbols, then upload the *.symbols.nupkg instead of the *.nupkg one. ProGet does not have different endpoints for symbol and regular packages; it will only accept a single package per project and version and, if this is a symbol package, additionally offer the *.pdb files in its symbol feed while stripping them from the regular feed.&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;If everything went smooth, your Package is now available on your private NuGet server.&lt;/span&gt;&lt;/p&gt;
&lt;h2 id="AccessNugetFeeds"&gt;&lt;span&gt;Accessing the Feeds&lt;/span&gt;&lt;/h2&gt;
&lt;p&gt;&lt;span&gt;Since the feeds are now private and protected, it would be advisable to create a user with the &lt;span class="Code"&gt;View_Only&lt;/span&gt; role for accessing the feeds in ProGet.&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;Once that’s done, you need to run the following command on every machine and for every user account that will access the feed:&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;&lt;span class="Code"&gt;nuget source add -Name &amp;lt;NameOfTheStoredCredentials&amp;gt; -Source &amp;lt;YourNuGetFeedUrl&amp;gt; -User &amp;lt;Username&amp;gt; -Pass &amp;lt;Password&amp;gt;&lt;/span&gt;&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;By default, this will store the credentials in a config file in the current user’s AppData directory, and they will be automatically retrieved for this source. The package source will then also be known for all package restores under your account through NuGet, so you don’t have to specify your custom NuGet feed in Visual Studio. In fact, Visual Studio will be aware of your package source after you’ve entered this command. To search for packages there, set this as your package source in the upper right corner of your NuGet package manager view in Visual Studio. &lt;em&gt;Please note that you can overwrite global feed sources via a project specific NuGet.config file.&lt;/em&gt;&lt;/span&gt;&lt;/p&gt;
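&lt;p&gt;&lt;span&gt;For reference, the stored source ends up in the user-level NuGet.config (under %AppData%\NuGet\NuGet.config); the resulting entry looks roughly like this (feed name and Url are placeholders, and the password is stored in encrypted form):&lt;/span&gt;&lt;/p&gt;

```
&amp;lt;configuration&amp;gt;
  &amp;lt;packageSources&amp;gt;
    &amp;lt;add key="MyCompanyFeed" value="http://your-proget-server/nuget/YourFeed" /&amp;gt;
  &amp;lt;/packageSources&amp;gt;
  &amp;lt;packageSourceCredentials&amp;gt;
    &amp;lt;MyCompanyFeed&amp;gt;
      &amp;lt;add key="Username" value="FeedUser" /&amp;gt;
      &amp;lt;add key="Password" value="...encrypted value..." /&amp;gt;
    &amp;lt;/MyCompanyFeed&amp;gt;
  &amp;lt;/packageSourceCredentials&amp;gt;
&amp;lt;/configuration&amp;gt;
```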
&lt;p&gt;&lt;span&gt;Try it out with &lt;span class="Code"&gt;nuget list -source &amp;lt;YourNuGetFeedUrl&amp;gt;&lt;/span&gt;. If it works, you've got time to fetch a coffee while it lists all the packages (your server also relays the official NuGet feed). You could also press &lt;span class="Code"&gt;Ctrl + C&lt;/span&gt; in the cmd window to abort=)&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;Storing the credentials like this makes it easy to integrate the private feed in your CI system as well as your IDE, since all you have to do is specify a Url and you don’t have to fiddle with the credentials yourself.&lt;/span&gt;&lt;/p&gt;
&lt;h2&gt;&lt;span&gt;Automate NuGet Package Deployment with Jenkins&lt;/span&gt;&lt;/h2&gt;
&lt;p&gt;&lt;span&gt;In Jenkins, you want two build actions:&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;First, build the project in release mode with the MSBuild plugin (NuGet should do that automatically, but I had issues with NuGet not being able to find the correct MSBuild version)&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;&lt;img id="__mcenew" style="display: block; margin-left: auto; margin-right: auto;" src="https://blog.dangl.me/media/1038/msbuildpackage.png" alt="Jenkins MSBuild Task Configuration" data-id="1127"&gt;&lt;/span&gt;&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt;&lt;span&gt;Then in the second step, you need the following PowerShell script (if you haven’t already, you should install the &lt;a rel="noopener noreferrer" href="https://wiki.jenkins-ci.org/display/JENKINS/PowerShell+Plugin" target="_blank" title="Jenkins CI PowerShell plugin"&gt;Jenkins PowerShell plugin&lt;/a&gt;):&lt;/span&gt;&lt;/p&gt;
&lt;pre class="brush: powershell;"&gt;$workspacePath = $ENV:WORKSPACE
$projectName = "Your.Namespace" # Name of the project. Omit ".csproj", this will be added automatically
$pathToNuGet = "C:\BuildTools\NuGet\nuget.exe"
$configuration = "Release" # The configuration you want to publish
$nuGetFeedTarget = "&amp;lt;YourNuGetFeed&amp;gt;"
# Find the project file
$pathToCsproj = Get-ChildItem $workspacePath -Recurse -Filter ($projectName + ".csproj")
# Create the NuGet package
"Creating the NuGet package"
$nuGetPackArguments = @(("pack"),`
                        ("""" + $pathToCsproj.FullName + """"),`
                        ("-Prop Configuration=" + $configuration),`
                        ("-Symbols"))
$nuGetPackArguments # Echo the arguments for the build log
$nuGetPackProcess = Start-Process -FilePath $pathToNuGet -ArgumentList $nuGetPackArguments -Wait -PassThru -NoNewWindow
if ($nuGetPackProcess.ExitCode -ne 0) {
    "Error during NuGet package creation, exiting"
    exit $nuGetPackProcess.ExitCode
} else {
    "Finished NuGet package creation"
}
# Find the package
$pathToPackage = Get-ChildItem $workspacePath -Recurse -Filter ($projectName + "*.symbols.nupkg")
# Upload the package
$nuGetPublishArguments = @(("push"),`
                            ("-NonInteractive"),`
                            ("""" + $pathToPackage.FullName + """"),`
                            ("-source " + $nuGetFeedTarget))
$nuGetPublishProcess =  Start-Process -FilePath $pathToNuGet -ArgumentList $nuGetPublishArguments -Wait -PassThru -NoNewWindow
if ($nuGetPublishProcess.ExitCode -ne 0) {
    "Error during NuGet package publishing, exiting"
    exit $nuGetPublishProcess.ExitCode
}
"Finished package publishing"&lt;/pre&gt;
&lt;p&gt;&lt;span&gt;Set the five variables at the top according to your project setup and then start a Jenkins build. When everything’s configured correctly, you should see something like this in the ProGet web view of your feed:&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;&lt;img id="__mcenew" style="display: block; margin-left: auto; margin-right: auto;" src="https://blog.dangl.me/media/1039/progetpackageview.png" alt="ProGet Package Overview" data-id="1128"&gt;&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;And in Visual Studio:&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;&lt;img id="__mcenew" style="display: block; margin-left: auto; margin-right: auto;" src="https://blog.dangl.me/media/1040/visualstudioprojectview.png" alt="Private NuGet Feed in Visual Studio" data-id="1129"&gt;&lt;/span&gt;&lt;/p&gt;
&lt;p&gt; &lt;/p&gt;
&lt;p&gt;&lt;span&gt;Now you’ve just drastically simplified using your own shared libraries!&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;Oh, and on a side note: you should consider when to run that Jenkins job. It might be worthwhile to leave it as a manual step so you don’t get a new NuGet package version with every single commit you do.&lt;/span&gt;&lt;/p&gt;
</description>
      <pubDate>Sat, 30 Apr 2016 22:59:23 Z</pubDate>
      <a10:updated>2016-04-30T22:59:23Z</a10:updated>
    </item>
  </channel>
</rss>