ASP.NET Core + PostgreSQL + Docker + Bitbucket = ♥

April 25, 2017


How do you build, test and deploy your ASP.NET Core application with a single commit and push? In this article I will answer that question and show you how to configure CI and CD with Docker and Bitbucket.

We will develop a simple ASP.NET Core application with a single API method that saves string values in a database, using PostgreSQL as the storage. The code will be hosted in a Bitbucket git repository, and we will configure Bitbucket Pipelines to build the application, create a Docker image and push it to Docker Hub every time we push code to the remote repository. After the image has been pushed to Docker Hub, a webhook on our “production” server will pull the uploaded image from Docker Hub and restart docker-compose with the new image.

Docker and docker-compose

Let’s start with some basic tools we are going to use in this article. Those who are already familiar with Docker and docker-compose can skip directly to the next chapter.

What is Docker? Here is the official answer. And a simple one for those who have never worked with containers before but have experience with virtual machines:

docker container – lightweight “virtual machine”

docker image – initial snapshot of the “vm”

A good explanation from Stack Overflow about the difference between a container and an image: “An instance of an image is called a container. You have an image, which is a set of layers as you describe. If you start this image, you have a running container of this image. You can have many running containers of the same image.” – Thomas Uhrig

Docker Hub – a cloud-based registry of Docker images. You can create your own image and push it to Docker Hub (similar to GitHub for your code)

docker cli tools – a set of tools to manage images and containers, as well as to pull and push images from Docker Hub and do many other things

Dockerfile – a file that contains instructions to create an image

docker-compose – a tool to define and run multiple containers as a single application
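As a quick hands-on illustration of the image/container distinction above, here is a hypothetical shell session (it assumes Docker is installed and uses the public hello-world image):

```shell
# Pull an image from Docker Hub (a read-only template; nothing runs yet)
docker pull hello-world

# Each "docker run" starts a new container from the same image
docker run hello-world
docker run hello-world

# List all containers, including exited ones: two containers, one image
docker ps -a
```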

ASP.NET Core application

Create WebAPI application:

mkdir test
cd test 
dotnet new webapi

Add PostgreSQL to your project (edit test.csproj file):

<Project Sdk="Microsoft.NET.Sdk.Web">

  <PropertyGroup>
    <TargetFramework>netcoreapp1.1</TargetFramework>
  </PropertyGroup>

  <ItemGroup>
    <Folder Include="wwwroot\" />
  </ItemGroup>

  <ItemGroup>
    <PackageReference Include="Microsoft.AspNetCore" Version="1.1.1" />
    <PackageReference Include="Microsoft.AspNetCore.Mvc" Version="1.1.2" />
    <PackageReference Include="Microsoft.Extensions.Logging.Debug" Version="1.1.1" />
    <PackageReference Include="Microsoft.EntityFrameworkCore" Version="1.1.1" />
    <PackageReference Include="Microsoft.EntityFrameworkCore.Tools" Version="1.1.0" />
    <PackageReference Include="Npgsql.EntityFrameworkCore.PostgreSQL" Version="1.1.0" />
    <PackageReference Include="Npgsql.EntityFrameworkCore.PostgreSQL.Design" Version="1.1.0" />
  </ItemGroup>

</Project>

Add AppDbContext.cs and Value.cs:

using Microsoft.EntityFrameworkCore;

namespace test
{
    public class AppDbContext : DbContext
    {
        public AppDbContext(DbContextOptions<AppDbContext> options) : base(options)
        {
        }

        public DbSet<Value> Values { get; set; }

        protected override void OnModelCreating(ModelBuilder builder)
        {
        }
    }
}

Value.cs:

namespace test
{
    public class Value
    {
        public int Id { get; set; }
        public string Date { get; set; }
    }
}
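The article's goal is a single API method that saves string values, but the controller itself is not shown. The default webapi template's ValuesController can be replaced with something like the following minimal sketch (the route comes from the template, but the controller body below is an assumption, not code from the original project):

```csharp
using System.Linq;
using Microsoft.AspNetCore.Mvc;

namespace test
{
    [Route("api/[controller]")]
    public class ValuesController : Controller
    {
        private readonly AppDbContext _context;

        // AppDbContext is injected by the DI container configured in Startup
        public ValuesController(AppDbContext context)
        {
            _context = context;
        }

        // GET api/values - return all saved values
        [HttpGet]
        public IActionResult Get()
        {
            return Ok(_context.Values.ToList());
        }

        // POST api/values - save a string value to the database
        [HttpPost]
        public IActionResult Post([FromBody] string value)
        {
            _context.Values.Add(new Value { Date = value });
            _context.SaveChanges();
            return Ok();
        }
    }
}
```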

Edit Startup.cs:

public void ConfigureServices(IServiceCollection services)
{
	// Add framework services.
	services.AddMvc();
	
	var sqlConnectionString = Configuration.GetConnectionString("DataAccessPostgreSqlProvider");

	services.AddDbContext<AppDbContext>(options =>
		options.UseNpgsql(sqlConnectionString)
	);
}

public void Configure(IApplicationBuilder app, IHostingEnvironment env, ILoggerFactory loggerFactory)
{
	loggerFactory.AddConsole(Configuration.GetSection("Logging"));
	loggerFactory.AddDebug();

	app.UseMvc();
	
	using (var context = app.ApplicationServices.GetService(typeof(AppDbContext)) as AppDbContext)
	{
		context.Database.Migrate();
		// Other db initialization code.
	}
}

And now, specify the connection string in appsettings.json:

{
  "Logging": {
    "IncludeScopes": false,
    "LogLevel": {
      "Default": "Warning"
    }
  },
  "ConnectionStrings": {
    "DataAccessPostgreSqlProvider": "User ID=test;Password=test;Host=testpostgres;Port=5432;Database=test;Pooling=true;"
  }
}

Remember the Host, User ID and Password values (Database should be the same as User ID, since the postgres image creates a default database named after the user); we will use those values during the PostgreSQL container configuration.

Restore, build and publish:

dotnet restore test.csproj
dotnet build test.csproj
dotnet publish -c Release -o publish_output test.csproj

If you try to start it right now, you will get an error because the PostgreSQL host is not reachable.

If you have PostgreSQL installed locally, you can change Host, User ID and Password to your local values and run the application again to test it.
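Note that context.Database.Migrate() in Startup only applies migrations that already exist in the project; to create the initial migration you can use the EF Core CLI. With the 1.x tooling this also requires a DotNetCliToolReference in test.csproj (the exact version number below is an assumption based on the 1.1.x package line):

```xml
<ItemGroup>
  <DotNetCliToolReference Include="Microsoft.EntityFrameworkCore.Tools.DotNet" Version="1.0.0" />
</ItemGroup>
```

After another dotnet restore, running `dotnet ef migrations add InitialCreate` generates the migration that Migrate() will apply at startup.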

Build Docker image

Now that we have the application artifacts in the publish_output folder, it is time to build our Docker image.

Create a Dockerfile in the project root:


FROM microsoft/aspnetcore:1.1
EXPOSE 80
COPY publish_output .
ENTRYPOINT ["dotnet", "test.dll"]

Here we define that our image is based on microsoft/aspnetcore:1.1 from Docker Hub, expose port 80, copy our application to the root of the container, and define the entry point (the command that will be executed when the container starts).

You can already test it by running:

docker build -t test-image .

This command will create an image named test-image from the Dockerfile.

You can run it:

docker run test-image

Create docker-compose.yml

In this file we will describe the dependencies between our application image and the official postgres image:

version: '2'

services:
  testpostgres:
     image: postgres
     restart: always
     environment:
         POSTGRES_USER: test
         POSTGRES_PASSWORD: test
     volumes:
       - pgdata:/var/lib/postgresql/data
  testapp:
    image: testapp
    restart: always
    build:
      context: .
      dockerfile: Dockerfile
    ports:
       - 5000:80
    depends_on:
       - "testpostgres"

volumes:
  pgdata:

In this file we describe two services, their parameters and the dependencies between them. As you can see, we use the standard postgres image from Docker Hub and pass it some parameters. The service name must be the same as the Host specified in the connection string, and the user and password must match as well. Then we define a Docker volume – persistent data storage for our postgres container.

In the second part we define our application service, specifying which Dockerfile Compose should use during the build and which ports we forward from host to container.

Our postgres instance will be reachable by its service name, so from our application we can connect to the database server via the testpostgres host.

To build the services:

docker-compose build

And to run it locally:

docker-compose up -d

Configure CI

Create an account and a repository on Docker Hub; in this example the user name will be username and the repository name testapp.

Now enable Pipelines on Bitbucket and create bitbucket-pipelines.yml in the root of your repo:

image: microsoft/aspnetcore-build:1.0-1.1

pipelines:
  default:
    - step:
        script: # Modify the commands below to build your repository.
          - dotnet restore test.csproj
          - dotnet build test.csproj
          - dotnet publish -c Release -v n -o ./publish_output test.csproj
          - docker login -u $DOCKER_USERNAME -p $DOCKER_PASSWORD
          - docker build -t username/testapp:dev .
          - docker push username/testapp:dev

options:
  docker: true

Here we have defined where to build our application (inside the microsoft/aspnetcore-build:1.0-1.1 image) and what to do during the build (the script section).

As you can see, in the last three steps we log in to Docker Hub, build our image and push it to the remote repo. The $DOCKER_USERNAME and $DOCKER_PASSWORD values should be defined as secured repository variables in your Bitbucket Pipelines settings.

Run on remote server

On our “production server” we can create similar docker-compose file to use images from Docker Hub:

version: '2'

services:
  testpostgres:
     image: postgres
     restart: always
     environment:
         POSTGRES_USER: test
         POSTGRES_PASSWORD: test
     volumes:
       - pgdata:/var/lib/postgresql/data
  testapp:
    image: username/testapp:dev
    restart: always
    ports:
       - 5000:80
    depends_on:
       - "testpostgres"

volumes:
  pgdata:

As you can see, it’s almost the same, but we have removed the build section and changed the image property to point at the previously pushed image from Docker Hub.

Run:

docker-compose pull
docker-compose up -d

Go to http://localhost:5000/api/Values to verify that the application is up and responding.
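The deployment can also be checked from the command line. The curl commands below assume a ValuesController following the standard webapi template routes; the POST body format depends on how your controller binds its parameter:

```shell
# GET the saved values
curl http://localhost:5000/api/Values

# POST a new string value (a JSON-encoded string in the body)
curl -H "Content-Type: application/json" -d '"hello"' http://localhost:5000/api/Values
```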

Done!
