Author Archives: druss

5 Simple Steps to Migrate Let’s Encrypt Certificates (certbot) to a New Server



This guide is for anyone who has decided to migrate a website to another web server and has SSL certificates from Let's Encrypt.

Note: This article describes the process for Ubuntu 18.04, but it can also be used for other Linux distros (possibly with some small changes). Also, replace divbyte.com with your own domain.

 

To successfully migrate your certificates, you need to complete these 5 simple steps:

  • Archive certificates on the old server
  • Move them to a new server
  • Extract to the correct location
  • Create symlinks
  • Redirect domain

Let's go through them in a bit more detail:

Archive SSL certificates

First of all, you should find the actual location of the certificates. You can check your nginx or Apache configuration to see where they are referenced:

cat /etc/nginx/sites-enabled/divbyte.com
...
 ssl_certificate /etc/letsencrypt/live/divbyte.com/fullchain.pem; # managed by Certbot
 ssl_certificate_key /etc/letsencrypt/live/divbyte.com/privkey.pem; # managed by Certbot
...

But this is not where the certificates actually live. These paths are symlinks; to see the real location, execute the following command:

sudo ls -l /etc/letsencrypt/live/divbyte.com
total 0
lrwxrwxrwx 1 root root 46 Mar 25 13:23 cert.pem -> /etc/letsencrypt/archive/divbyte.com/cert2.pem
lrwxrwxrwx 1 root root 47 Mar 25 13:24 chain.pem -> /etc/letsencrypt/archive/divbyte.com/chain2.pem
lrwxrwxrwx 1 root root 51 Mar 25 13:24 fullchain.pem -> /etc/letsencrypt/archive/divbyte.com/fullchain2.pem
lrwxrwxrwx 1 root root 49 Mar 25 13:24 privkey.pem -> /etc/letsencrypt/archive/divbyte.com/privkey2.pem

You also need to archive the renewal config for your website, which is the /etc/letsencrypt/renewal/<domain>.conf file. To archive everything, run the following:

sudo tar -chvzf certs.tar.gz /etc/letsencrypt/archive/divbyte.com /etc/letsencrypt/renewal/divbyte.com.conf
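
Optionally, list the archive contents to confirm that both the certificate files and the renewal config made it in:

sudo tar -tzf certs.tar.gz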

Now you can copy this archive to the website's document root, so you can download it to the new server in the next step:

cp certs.tar.gz /var/www/divbyte.com/html/

Move SSL certificates

This is a really simple step. Log in to the new server and download certificates:


Advertisement from Google

ssh sevennet.org
wget https://divbyte.com/certs.tar.gz
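
Important: the archive contains your private key, so it must not stay publicly downloadable. Once you have it on the new server, remove it from the old server's web root:

sudo rm /var/www/divbyte.com/html/certs.tar.gz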

Extract to the correct location

Now you need to extract the files to the correct location on the new server. Inside the archive we already have the correct folder structure, so you can extract it "as is" from the root folder:

cd /
sudo tar -xvf ~/certs.tar.gz

Note: If the new server runs a different Linux distro or has a custom Let's Encrypt installation, you may need to copy the files to the correct location manually.

Create symlinks

For everything to work correctly, you need to create symlinks in the live folder for your domain:

sudo ln -s /etc/letsencrypt/archive/divbyte.com/cert2.pem /etc/letsencrypt/live/divbyte.com/cert.pem
sudo ln -s /etc/letsencrypt/archive/divbyte.com/chain2.pem /etc/letsencrypt/live/divbyte.com/chain.pem
sudo ln -s /etc/letsencrypt/archive/divbyte.com/fullchain2.pem /etc/letsencrypt/live/divbyte.com/fullchain.pem
sudo ln -s /etc/letsencrypt/archive/divbyte.com/privkey2.pem /etc/letsencrypt/live/divbyte.com/privkey.pem
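
As an optional sanity check (this works for RSA certificates, which certbot issues by default), you can verify that the certificate and the private key match; the two hashes below should be identical:

sudo openssl x509 -noout -modulus -in /etc/letsencrypt/live/divbyte.com/cert.pem | openssl md5
sudo openssl rsa -noout -modulus -in /etc/letsencrypt/live/divbyte.com/privkey.pem | openssl md5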

Point domain to the new server

Update the nginx or Apache configuration to use the new certificates (nginx shown):

 ssl_certificate /etc/letsencrypt/live/divbyte.com/fullchain.pem; # managed by Certbot
 ssl_certificate_key /etc/letsencrypt/live/divbyte.com/privkey.pem; # managed by Certbot
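
Before switching DNS, validate the configuration and reload nginx to pick up the certificates:

sudo nginx -t
sudo systemctl reload nginx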

Go to your DNS manager and change the A record, so it is pointing to the new server.

Note: At this point, you should have all the content and database migrated to the new server, so you can safely switch your domain to the new server.

Switching the domain is also required for the test renewal below to succeed, since Let's Encrypt must be able to reach the new server to validate the domain:

sudo certbot renew --dry-run

You do not need to modify the cron tasks for certbot, since the default job renews all certificates:

sudo cat /etc/cron.d/certbot

SHELL=/bin/sh
PATH=/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin

0 */12 * * * root test -x /usr/bin/certbot -a \! -d /run/systemd/system && perl -e 'sleep int(rand(43200))' && certbot -q renew

That's it: the domain name is pointing to the new server, and the certificates can be renewed automatically.

Divbyte – Software Development Company

I would like to share with everyone the project I have been working on for some time.

I love solving problems, communicating with people, and programming (significantly more than 40 hours per week), and working on pet projects had become a bit boring, so I decided to start helping real people with real problems.

I had a choice: find a part-time freelance job, which I did a few times, or move to the next level and establish my own software development company where, together with other enthusiasts, I could start helping people on a larger scale. I discussed this idea with my developer friends and Tatiana, a top-notch business development specialist, and we decided to go for it!

That’s how Divbyte was born!

A group of hardcore software engineers and an excellent biz dev specialist: what could be better for a great start? Tatiana immediately found a way to offer our services on the market. From the engineering side, we deliver high-quality results on time, without exceeding the original budget.

You can see reviews from satisfied clients of our first projects on the website (we will publish case studies soon).
The team has grown since then and is ready to take on new, more significant projects. Today we are launching our website and want to share it with you! If you need a team of excellent developers, designers, architects, or QA engineers, don't hesitate to contact us!

We are ready to solve your problems.

ASP.NET Core + PostgreSQL + Docker + Bitbucket = ♥

How do you build, test, and deploy your ASP.NET Core application with a single click (commit & push)? In this article I will answer this question and show you how to configure CI and CD with Docker and Bitbucket.

We will develop a simple ASP.NET Core application with a single API method that saves string values to a database, using PostgreSQL as the storage. All code will be hosted in a Bitbucket git repository, and we will configure Bitbucket Pipelines to build the application, create a Docker image, and push it to Docker Hub every time we push code to the remote repository. After the image has been pushed to Docker Hub, a webhook on our "production" server will pull the uploaded image and restart docker-compose with the new image.

Docker and docker-compose

Let's start with some basic tools we are going to use in this article. Those who are already familiar with docker and docker-compose can skip directly to the next chapter.

What is docker? Here is an official answer. And a simple one for those who have never worked with containers before but have experience with virtual machines:

docker container – lightweight “virtual machine”

docker image – initial snapshot of the “vm”

Good explanation from Stack Overflow about difference between container and image: “An instance of an image is called a container. You have an image, which is a set of layers as you describe. If you start this image, you have a running container of this image. You can have many running containers of the same image.” Thomas Uhrig

Docker Hub – cloud-based registry of docker images. You can create your own image and then push it to Docker Hub (similar to GitHub for your code)

docker cli tools – set of tools to manage images and containers, as well as to pull and push images from Docker Hub and do many other things

Dockerfile – file that contains instructions to create an image

docker-compose – tool to define and run multiple containers as a single application
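
A few everyday commands to make these definitions concrete (postgres is used here just as a well-known example image):

docker pull postgres                                   # download an image from Docker Hub
docker images                                          # list images available locally
docker run -d -e POSTGRES_PASSWORD=secret postgres    # start a container from the image
docker ps                                              # list running containers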

ASP.NET Core application

Create WebAPI application:

mkdir test
cd test 
dotnet new webapi

Add PostgreSQL to your project (edit test.csproj file):

<Project Sdk="Microsoft.NET.Sdk.Web">

  <PropertyGroup>
    <TargetFramework>netcoreapp1.1</TargetFramework>
  </PropertyGroup>

  <ItemGroup>
    <Folder Include="wwwroot\" />
  </ItemGroup>

  <ItemGroup>
    <PackageReference Include="Microsoft.AspNetCore" Version="1.1.1" />
    <PackageReference Include="Microsoft.AspNetCore.Mvc" Version="1.1.2" />
    <PackageReference Include="Microsoft.Extensions.Logging.Debug" Version="1.1.1" />
	<PackageReference Include="Microsoft.EntityFrameworkCore" Version="1.1.1" />
    <PackageReference Include="Microsoft.EntityFrameworkCore.Tools" Version="1.1.0" />
	<PackageReference Include="Npgsql.EntityFrameworkCore.PostgreSQL" Version="1.1.0" />
    <PackageReference Include="Npgsql.EntityFrameworkCore.PostgreSQL.Design" Version="1.1.0" />
  </ItemGroup>

</Project>

Add AppDbContext.cs and Value.cs:

using Microsoft.EntityFrameworkCore;

namespace test
{
    public class AppDbContext : DbContext
    {
        public AppDbContext(DbContextOptions<AppDbContext> options) :base(options)
        {
        }

        public DbSet<Value> Values { get; set; }

        protected override void OnModelCreating(ModelBuilder builder)
        {
        }
    }
}
namespace test
{
    public class Value
    {
        public int Id { get; set; }
        public string Date { get; set; }
    }
}

Edit Startup.cs

public void ConfigureServices(IServiceCollection services)
{
	// Add framework services.
	services.AddMvc();
	
	var sqlConnectionString = Configuration.GetConnectionString("DataAccessPostgreSqlProvider");

	services.AddDbContext<AppDbContext>(options =>
		options.UseNpgsql(sqlConnectionString)
	);
}

public void Configure(IApplicationBuilder app, IHostingEnvironment env, ILoggerFactory loggerFactory)
{
	loggerFactory.AddConsole(Configuration.GetSection("Logging"));
	loggerFactory.AddDebug();

	app.UseMvc();
	
	using (var context = app.ApplicationServices.GetService(typeof(AppDbContext)) as AppDbContext)
	{
		context.Database.Migrate();
		// Other db initialization code.
	}
}

And now, specify the connection string in appsettings.json:

{
  "Logging": {
    "IncludeScopes": false,
    "LogLevel": {
      "Default": "Warning"
    }
  },
  "ConnectionStrings": {
    "DataAccessPostgreSqlProvider": "User ID=test;Password=test;Host=testpostgres;Port=5432;Database=test;Pooling=true;"
  }
}

Remember the Host, User ID, and Password values (Database should be the same as User ID); we will use them when configuring the PostgreSQL container.

Restore, build and publish:

dotnet restore test.csproj
dotnet build test.csproj
dotnet publish -c Release -o publish_output test.csproj

If you try to start it right now, you will get an error that the PostgreSQL port is not reachable.

If you have PostgreSQL installed locally, you can change Host, User ID, and Password to your local ones and run it again to test.

Build Docker image

Now that we have the application artifacts in the publish_output folder, it is time to build our Docker image.

Create a Dockerfile in the project root:

FROM microsoft/aspnetcore:1.1
EXPOSE 80
COPY publish_output .
ENTRYPOINT ["dotnet", "test.dll"]

Here we define that our image should be based on microsoft/aspnetcore:1.1 from Docker Hub, then we declare that we expose port 80, copy our application to the root of the container, and define the entry point (the command that will be executed when the container starts).

You can already test it by running:

docker build -t test-image .

This command will create an image named test-image from the Dockerfile.

You can run it:

docker run test-image

Create docker-compose.yml

In this file we will describe our application image, the official postgres image, and the dependencies between them:

version: '2'

services:
  testpostgres:
    image: postgres
    restart: always
    environment:
      POSTGRES_USER: test
      POSTGRES_PASSWORD: test
    volumes:
      - pgdata:/var/lib/postgresql/data
  testapp:
    image: testapp
    restart: always
    build:
      context: .
      dockerfile: Dockerfile
    ports:
      - 5000:80
    depends_on:
      - "testpostgres"

volumes:
  pgdata:

In this file we describe two services, their parameters, and the dependencies between them. As you can see, we use the standard postgres image from Docker Hub and pass some parameters. The service name must match the host specified in the connection string, and the user and password must match as well. Then we specify a docker volume, persistent data storage for our postgres container.

In the second part we define our application service, specifying which Dockerfile Compose should use during the build and which ports we forward from host to container.

Our postgres instance will be reachable by its service name, so our application can connect to the database server via testpostgres.

To build service:

docker-compose build

And to run it locally:

docker-compose up -d
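
Once it is up, you can check that both containers are running and that the API responds:

docker-compose ps
curl http://localhost:5000/api/values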

Configure CI

Create an account and a repository on Docker Hub; in the examples below the user name will be username and the repository name testapp.

Now you should enable Pipelines on Bitbucket and create bitbucket-pipelines.yml in the root of your repo:

image: microsoft/aspnetcore-build:1.0-1.1

pipelines:
  default:
    - step:
        script: # Modify the commands below to build your repository.
          - dotnet restore test.csproj
          - dotnet build test.csproj
          - dotnet publish -c Release -v n -o ./publish_output test.csproj
          - docker login -u $DOCKER_USERNAME -p $DOCKER_PASSWORD
          - docker build -t username/testapp:dev .
          - docker push username/testapp:dev

options:
  docker: true

Here we have defined where to build our application (inside the microsoft/aspnetcore-build:1.0-1.1 image) and what to do during the build (the script section).

As you can see, the last three steps log in to Docker Hub, build our image, and push it to the remote repo. The $DOCKER_USERNAME and $DOCKER_PASSWORD values should be defined as secured repository variables in your Bitbucket settings.

Run on remote server

On our "production" server we can create a similar docker-compose file that uses the image from Docker Hub:

version: '2'

services:
  testpostgres:
    image: postgres
    restart: always
    environment:
      POSTGRES_USER: test
      POSTGRES_PASSWORD: test
    volumes:
      - pgdata:/var/lib/postgresql/data
  testapp:
    image: username/testapp:dev
    restart: always
    ports:
      - 5000:80
    depends_on:
      - "testpostgres"

volumes:
  pgdata:

As you can see, it's almost the same, but we have removed the build section and changed the image property, so we now use the previously pushed image from Docker Hub.

Run:

 

docker-compose pull
docker-compose up -d

Go to http://localhost:5000/api/Values
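
The intro mentioned triggering the deployment from a Docker Hub webhook. A minimal sketch of the script such a webhook handler could run on the production server (the path and the handler itself are assumptions, not something configured above):

#!/bin/sh
# deploy.sh - hypothetical script invoked by a Docker Hub webhook listener
cd /srv/testapp || exit 1   # assumed directory containing docker-compose.yml
docker-compose pull         # fetch the newly pushed image
docker-compose up -d        # recreate containers whose image changed
docker image prune -f       # clean up superseded image layers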

Done!

Does anyone read LinkedIn?

LinkedIn is the most popular business-oriented social network. A lot of us have an account there, but not many write anything there. Most of the time you see posts from recruiters about a new "awesome" position at "the best" company in the world, but almost no articles about the technical aspects of the work.

I asked myself: is it worth sharing my IT-related articles on LinkedIn? Will anyone read them?

So, I made a post there ten days ago: "Hi, Guys! Can you somehow let me know if you see this post (like or comment). I want to see how many people are reading this feed. Just a small experiment. Thanks!"

After 10 days I had 10,014 views, 131 likes, and 13 comments, and that's with 1,140 connections.

Here are the detailed statistics about views:

Developers are the most common group in my network, so it is no surprise that they see these posts the most. Recruiters are in second place 🙂

Currently I am living in the Netherlands, close to Amsterdam, so a lot of views came from people living in the same area. I am surprised that Ukraine is not on the list.

But the next screen shows only Ukrainian companies. The biggest outsourcing companies in Ukraine, Luxoft and Ciklum, are missing…

And of course the most views came from my 2nd degree network:

I think LinkedIn is good enough for sharing work-related articles. You will get quite a lot of views, and most of them will be from people who share your professional interests.

P.S. Join my network: https://www.linkedin.com/in/ivan-derevianko-5a237239/

Enable bash on Windows 10

Finally! You can now use bash and almost any Linux program on your Windows 10 machine. You no longer need cygwin or MinGW! This gives you the opportunity to use a rich variety of tools available only on Linux, for example wrk, a great HTTP benchmarking tool which I plan to use for a new ASP.NET Core 1.1 benchmark.

The Linux user space has been available since Windows 10 version 1607, but it's disabled by default.

To enable it you should:

  1. Go to Settings -> Update & security -> For developers and switch to Developer mode
  2. Go to Control Panel -> Programs and Features -> Turn Windows features on or off -> enable Windows Subsystem for Linux (Beta)
  3. Restart the computer
  4. Run bash! (see the quick check below)
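
After the restart, bash can be launched straight from a Command Prompt (the first run offers to download the Ubuntu user space). A quick sanity check:

bash
uname -a    # should report a Linux kernel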

 

Nginx Server Blocks (Virtual Hosts) Example Template

Here is an example of the nginx server block (virtual host) configuration which you can use to host multiple websites on the same server.

Just replace example.com with your own domain name. Do not forget to create all required folders and set the right permissions.

Copy one of these templates to /etc/nginx/sites-available/example.com.
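
After copying a template, enable the site and reload nginx (standard Debian/Ubuntu sites-available layout assumed):

sudo ln -s /etc/nginx/sites-available/example.com /etc/nginx/sites-enabled/
sudo nginx -t && sudo systemctl reload nginx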

Proxy example

# Ivan Derevianko aka druss http://druss.co
# Redirect www to the non-www domain
server {
    server_name  www.example.com;
    listen 80;
    rewrite ^(.*) http://example.com$1 permanent;
}

# Main server
server {
    server_name  example.com;
    listen 80;
    root /var/example.com/www;
    access_log /var/example.com/logs/access.log;
    error_log /var/example.com/logs/error.log;

    location / {
        proxy_pass http://localhost:5000;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection keep-alive;
        proxy_set_header Host $host;
        proxy_cache_bypass $http_upgrade;
    }
}

Php-fpm example

# Ivan Derevianko aka druss http://druss.co
# Redirect www to the non-www domain
server {
    server_name  www.example.com;
    listen 80;
    rewrite ^(.*) http://example.com$1 permanent;
}

# Main server
server {
    server_name  example.com;
    listen 80;
    root /var/example.com/www;
    access_log /var/example.com/logs/access.log;
    error_log /var/example.com/logs/error.log;

    client_max_body_size 128M;

    index index.php index.html index.htm;
	
    location / {
        try_files $uri $uri/ /index.php$is_args$args;
    }
	
    location ~ \.php$ {
        include snippets/fastcgi-php.conf;
        fastcgi_pass unix:/run/php/php7.0-fpm.sock;
    }
}

Deploy and run .NET Core application without installed runtime. Self-contained applications.

The .NET Core framework provides one very useful feature: self-contained applications. You don't need to install any .NET runtime on the target computer; you can just copy your application to the host and run it. Furthermore, you can publish it for any supported platform (Windows, Linux, OSX)!

When you create a self-contained application, after publishing you will see the whole .NET runtime next to your application.

There are some advantages:

  • You do not need to have installed .NET Core on a target machine
  • You can run applications with different .NET Core versions at the same time

But also, disadvantages:

  • Bigger application size (around 50 MB for an empty web API project)
  • Limited list of target runtimes (see the RID catalog)

Let's create a simple web API application and deploy it to a clean Ubuntu 14.04 machine!

Developer prerequisites

  • Installed .NET Core
  • Editor (Visual Studio Code, Visual Studio 2015, Rider, notepad, etc…)
  • SFTP client (WinSCP)
  • SSH client (PuTTY)

Application

Create an empty application:

dotnet new

Edit project.json:

{
  "version": "1.0.0-*",
  "buildOptions": {
    "emitEntryPoint": true
  },
  
  "dependencies": {
    "Microsoft.NETCore.App": "1.0.0",
    "Microsoft.AspNetCore.Mvc": "1.0.0",
    "Microsoft.AspNetCore.Server.Kestrel": "1.0.0",
    "Microsoft.Extensions.Logging": "1.0.0",
    "Microsoft.Extensions.Logging.Console": "1.0.0",
    "Microsoft.Extensions.Logging.Debug": "1.0.0"
  },
  
  "frameworks": {
    "netcoreapp1.0": {
      "imports": "dnxcore50"
    }
  },
  
  "runtimeOptions": {
    "configProperties": {
      "System.GC.Server": true
    }
  },
  
  "runtimes": {
      "win10-x64": {},
      "ubuntu.14.04-x64": {}
  }
}

The important part is the runtimes section. You can find the supported runtimes in the RID catalog.

Also, you should not include the "type": "platform" option in the following section:

"Microsoft.NETCore.App": "1.0.0",

Now, edit Program.cs

using Microsoft.AspNetCore.Hosting;

namespace WebServiceApplication
{
    public class Program
    {
        public static void Main(string[] args)
        {
            var host = new WebHostBuilder()
                .UseKestrel()
                .UseUrls("http://+:5000")
                .UseStartup<Startup>()
                .Build();

            host.Run();
        }
    }
}

In this file we configure our hosting environment, Kestrel (a cross-platform web server), and specify which IP and port it will use.

Create Startup.cs

using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Hosting;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Logging;

namespace WebServiceApplication
{
    public class Startup
    {
        public void ConfigureServices(IServiceCollection services)
        {
            services.AddMvc();
        }

        public void Configure(IApplicationBuilder app, IHostingEnvironment env, ILoggerFactory loggerFactory)
        {
            loggerFactory.AddConsole();
            loggerFactory.AddDebug();

            app.UseMvc(routes =>
            {
                routes.MapRoute(
                    name: "default",
                    template: "{controller}/{action}/{id?}");
            });
        }
    }
}

Here we enabled MVC and Logging in our application.

Now the logic, LuckController.cs

using Microsoft.AspNetCore.Mvc;

namespace WebServiceApplication
{
    public class LuckController : Controller
    {
        public IActionResult Try()
        {
            return Ok(42);
        }
    }
}

Now let's restore all required packages. In the command line:

dotnet restore

And run the application:

dotnet run

If everything is OK, you can go to http://localhost:5000/Luck/Try and you will see 42.

Publish and run application on a remote machine

To create a self-contained application, run the following command:

dotnet publish --configuration Release --runtime ubuntu.14.04-x64

where the --runtime value is one of the runtimes elements from project.json.

The command output will show the folder where you can find the self-contained application.

Archive the folder and copy it to the target machine.
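
For example (the bin/Release/… path below is the usual publish output location for this project setup; replace user and target-server with your own):

tar -czf test.tar.gz -C bin/Release/netcoreapp1.0/ubuntu.14.04-x64/publish .
scp test.tar.gz user@target-server:~/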

On the target machine, extract the archive, navigate to the extracted folder, and make the binary executable:

chmod +x test

Run application:

./test
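
From another shell (or remotely, replacing localhost with the server address), you can check that the service responds:

curl http://localhost:5000/Luck/Try
# prints 42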

Your first self-contained ASP.NET Core application is ready!

Here is more information about different .NET Core application types

Speed up Selenium WebDriver’s page parsing time

If you are using Selenium WebDriver as a web crawler and think that it's too slow, welcome inside!

In this article, we will see how to make page parsing around 50 times faster.

As an example, I will parse the comments from another article on this blog. I will first parse it using the default WebDriver API (FindElement… methods) and then compare the result to CsQuery.

Here is WebDriver parsing code:

var driver = new ChromeDriver();
driver.Navigate().GoToUrl("http://druss.co/2014/07/fixed-setup-was-unable-to-create-a-new-system-partition-or-locate-an-existing-system-partition-during-installing-windows-8-18-7-vista-etc-from-usb/");

// start stopwatch
var comments = driver.FindElementsByCssSelector("li.comment");
foreach (var comment in comments)
{
    var parserComment = new Comment();

    parserComment.Author = comment.FindElement(By.CssSelector("cite.fn")).Text;
    parserComment.Date = comment.FindElement(By.TagName("time")).Text;
    parserComment.Content = comment.FindElement(By.ClassName("comment-content")).Text;
}
// stop stopwatch

And this is tooooo slow! There are around 225 comments there, and it took 26 seconds to parse their structure. All of WebDriver's search methods are very slow; I think this is because WebDriver does not cache the page's content and always makes a request to the browser.

The other approach I am going to test is a bit different. First, I will get the HTML code for the comments block and then parse it using the CsQuery library (a fast, open-source .NET jQuery implementation):

var driver = new ChromeDriver();
driver.Navigate().GoToUrl("http://druss.co/2014/07/fixed-setup-was-unable-to-create-a-new-system-partition-or-locate-an-existing-system-partition-during-installing-windows-8-18-7-vista-etc-from-usb/");
// start stopwatch
var html = driver.FindElementById("comments").GetAttribute("innerHTML");
CQ dom = html;
var comments = dom["li.comment"];

foreach (var comment in comments.Select(x => x.Cq()))
{
    var parserComment = new Comment();
    parserComment.Author = comment["cite.fn"].Text();
    parserComment.Date = comment["time"].Text();
    parserComment.Content = comment[".comment-content"].Text();
}
// stop stopwatch

This code took only 600 milliseconds.

You could also use other approaches such as Html Agility Pack, RegEx, IndexOf(), etc.

The main idea is to wait until the page content is loaded, grab the HTML using a FindElement* or similar method, and then parse it with more performance-friendly tools.

[Mono] Selenium with headless ChromeDriver on Ubuntu Server

If you want to run a C# application (mono) with Selenium ChromeDriver on Ubuntu Server 16.04 in headless mode, you should definitely read this article. We will use Xvfb as the X server; this lets us emulate "headless" mode because Xvfb performs all graphical operations in memory without showing any screen output.

Install mono

sudo apt-key adv --keyserver hkp://keyserver.ubuntu.com:80 --recv-keys 3FA7E0328081BFF6A14DA29AA6A19B38D3D831EF
echo "deb http://download.mono-project.com/repo/debian wheezy main" | sudo tee /etc/apt/sources.list.d/mono-xamarin.list
sudo apt-get update
sudo apt-get install mono-complete
sudo apt-get install referenceassemblies-pcl

(official docs)

Install Google Chrome

wget -q -O - https://dl-ssl.google.com/linux/linux_signing_key.pub | sudo apt-key add -
sudo sh -c 'echo "deb [arch=amd64] http://dl.google.com/linux/chrome/deb/ stable main" >> /etc/apt/sources.list.d/google.list'
sudo apt-get update 
sudo apt-get install google-chrome-stable

Install Xvfb

sudo apt-get install xvfb imagemagick

Run ChromeDriver in headless mode

We will use the following application for the test:

class Program
{
    static void Main(string[] args)
    {
        var driver = new ChromeDriver();
        driver.Navigate().GoToUrl("http://druss.co");

        Console.WriteLine("Started. Press Enter to exit");
        Console.ReadLine();
    }
}

Download the Linux chromedriver binary, put it next to your application, and make it executable:

chmod +x chromedriver

For example, suppose you compiled your app to /home/druss/Selenium/TestChromeDriver.exe:

xvfb-run --server-args='-screen 0, 1920x1080x24' --auth-file=/home/druss/.Xauth -a mono /home/druss/Selenium/TestChromeDriver.exe

You should see an output from the program.

To run your program in background mode:

xvfb-run --server-args='-screen 0, 1920x1080x24' --auth-file=/home/druss/.Xauth -a mono /home/druss/Selenium/TestChromeDriver.exe &> /dev/null &

Make screenshots

You can take a screenshot of the running process. Every time you start xvfb-run with the -a parameter, it searches for a free display (starting from 99) and uses it as the output.

To access this display later, you pass the same auth file argument:

--auth-file=/home/druss/.Xauth

After this you can take a screenshot:

DISPLAY=:99 import -window root testChromeDriver.png
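
To watch a long-running process, a simple loop (a convenience sketch of my own, not part of the original setup) can capture screenshots periodically:

while true; do
    DISPLAY=:99 import -window root "shot-$(date +%s).png"   # same display as above
    sleep 10
done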

ZohoPeopleTimeLogger v1.4 – Smart filtering

Changes:

  • Show vacations only for the currently logged-in user (previously, if you had access to another user's leave tracker, you would see their days off as your vacation)
  • Show only days off of the Holiday and Sick types (excludes 2-hour short leaves and others)
  • Display the leave type in the calendar (before it was always Vacation, now Holiday or Sick)

Download (GitHub)

ZohoPeopleClient

This C# library was also updated, to v1.0.2. There are two new methods available in the Fetch Record API:

public enum SearchColumn
{
     EMPLOYEEID,
     EMPLOYEEMAILALIAS
}

Task<List<dynamic>> IFetchRecordApi.GetByFormAsync(string formName, SearchColumn searchColumn, string searchValue);
Task<List<dynamic>> IFetchRecordApi.GetByViewAsync(string viewName, SearchColumn searchColumn, string searchValue);

GitHub