
Deploying Custom Rasa Core or Rasa NLU Code

The Rasa Stack (Core and NLU) is open source, and many Rasa Platform users have developed custom NLU components and Rasa Core policies. This page explains how to let your custom Rasa Core and NLU servers communicate with Rasa Platform. We will show you how to run them as Docker containers, but this is optional: you can skip Section 2 if you don’t want to use Docker.

Steps:
  1. Creating your custom Rasa NLU and Core servers
  2. Running these servers as Docker containers
  3. Modifying the docker-compose.yml
  4. Deploying the changes

1. Building Custom Rasa Stack Servers

Both the Rasa NLU server and the Rasa Core server require the correct endpoint configuration to connect to the other components. Please refer to the Rasa Core endpoint documentation and the Rasa NLU endpoint documentation for detailed instructions.
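For orientation, a Core endpoints file typically looks like the minimal sketch below. The hostnames and ports are placeholders rather than Platform defaults; in a docker-compose deployment they would usually correspond to the service names and ports of your own setup.

# endpoints.yml -- minimal sketch; hosts and ports are placeholders for your deployment
nlu:
  url: "http://nlu:5000"            # your custom Rasa NLU server
action_endpoint:
  url: "http://app:5055/webhook"    # your action server, if you use custom actions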

1.1 Custom Rasa NLU

Your custom version of Rasa NLU should launch an instance of the built-in server class RasaNLU. We recommend defining it in a script called nlu_server.py. This script should live in your custom Rasa NLU branch or directory, and can make use of any custom code you need.
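A minimal sketch of such a script is shown below. The exact constructor arguments of DataRouter and RasaNLU differ between Rasa NLU versions, so treat this only as a starting point and mirror the official rasa_nlu/server.py of the version your branch is based on; the model path and port are placeholders.

# nlu_server.py -- minimal sketch, modelled on the official rasa_nlu/server.py;
# constructor arguments vary between Rasa NLU versions.
from rasa_nlu.data_router import DataRouter
from rasa_nlu.server import RasaNLU

if __name__ == '__main__':
    # The DataRouter loads and serves your (custom) NLU models
    router = DataRouter('projects/')   # directory containing your trained models
    # RasaNLU is the built-in server class mentioned above
    rasa = RasaNLU(router)
    # Run the underlying klein app; 5000 is the customary Rasa NLU port
    rasa.app.run('0.0.0.0', 5000)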

1.2 Custom Rasa Core

The steps for your custom Rasa Core instance are the same as we saw above for NLU. Create a script called core_server.py that runs the serve_application() method defined in rasa_core.run.
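A minimal sketch is shown below. The serve_application() keyword arguments and the available input channels differ between Rasa Core versions, so mirror the rasa_core/run.py of the version your custom code is based on; the model path, channel and port here are placeholders.

# core_server.py -- minimal sketch; keyword arguments of serve_application()
# vary between Rasa Core versions.
from rasa_core.agent import Agent
from rasa_core.run import serve_application

if __name__ == '__main__':
    # Load the dialogue model trained with your custom policies
    agent = Agent.load('models/dialogue')
    # Expose the HTTP API so the Platform can talk to this Core instance
    serve_application(agent, channel='rest', port=5005, enable_api=True)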

2. Docker Containers for Custom Stack

2.1 Rasa NLU

The next step is to write a Dockerfile instructing Docker to install any necessary dependencies and run the script when the container is started. You can either extend the official Rasa NLU image or build your own from scratch, using the official Dockerfiles as a template.
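For example, a Dockerfile extending the official image could look roughly like the sketch below. The base image tag, directory layout and requirements file are placeholders; adapt them to your branch.

# Dockerfile -- sketch extending an official Rasa NLU image (tag is a placeholder)
FROM rasa/rasa_nlu:latest-full

# Copy your custom code, including nlu_server.py, into the image
COPY . /app/custom_nlu
WORKDIR /app/custom_nlu

# Install any extra dependencies your custom components need
RUN pip install -r requirements.txt

# Start the custom server when the container starts
ENTRYPOINT ["python", "nlu_server.py"]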

You can now build your Docker image. The following command creates the image and tags it as <YOUR_RASA_NLU_IMAGE>:

$ sudo docker build -t <YOUR_RASA_NLU_IMAGE> .

Note

This command requires your Dockerfile to be located in the root directory of the Rasa NLU version you wish to install.

You may push your image to your private Docker registry with:

$ sudo docker login -u USER -p PASSWORD https://my-private-docker-registry.com
$ sudo docker push <YOUR_RASA_NLU_IMAGE>

2.2 Rasa Core

You can either extend the official Rasa Core image or create an image from scratch, using the official Dockerfile as a template.
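Analogous to the NLU image, a sketch could look like this (again, the base image tag and paths are placeholders):

# Dockerfile -- sketch extending an official Rasa Core image (tag is a placeholder)
FROM rasa/rasa_core:latest

# Copy your custom code, including core_server.py, into the image
COPY . /app/custom_core
WORKDIR /app/custom_core

# Install any extra dependencies your custom policies need
RUN pip install -r requirements.txt

# Start the custom server when the container starts
ENTRYPOINT ["python", "core_server.py"]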

Note

As for NLU, the build command below requires your Dockerfile to be located in the root directory of the Rasa Core version you would like to install.

Finally, you can build your image with:

$ sudo docker build -t <YOUR_RASA_CORE_IMAGE> .

Pushing this image to your private Docker registry works as follows:

$ sudo docker login -u USER -p PASSWORD https://my-private-docker-registry.com
$ sudo docker push <YOUR_RASA_CORE_IMAGE>

3. Modifying docker-compose config

Case 1: Your custom servers are Docker images

Rasa NLU and Core can both run as standalone Docker containers, which Rasa Platform can use instead of the official builds. You have to make sure docker-compose points to the Docker images of your custom Rasa NLU and Core versions. To do this, edit or create docker-compose.override.yml and add entries for the nlu or core service (or both). Here’s an example in which both core and nlu point to custom images:

version: "3.4"
services:
  nlu:
    image: <YOUR_RASA_NLU_IMAGE>
  core:
    image: <YOUR_RASA_CORE_IMAGE>

Replace <YOUR_RASA_NLU_IMAGE> and <YOUR_RASA_CORE_IMAGE> with the image names of your custom NLU and Core versions. These could either be images that you’ve built locally on your server, or they could be URLs pointing to your private Docker registry.
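For instance, if both images are hosted in a private registry, the override could look like this (the registry host and image names are just examples, matching the ones used in Section 4 below):

version: "3.4"
services:
  nlu:
    image: my-private-docker-registry.com/my-custom-nlu:latest
  core:
    image: my-private-docker-registry.com/my-custom-core:latest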

Case 2: Your custom servers run natively in Python

If your custom code runs as plain Python code on another server, you will have to modify your docker-compose.override.yml to make sure the Platform does not run the default Core and NLU services, and instead runs dummy images that do nothing but print a “Hello” message. Your docker-compose.override.yml should contain the following:

version: "3.4"
services:
  nlu:
    image: hello-world
    restart: "no"
    volumes: []
    depends_on: []
  core:
    image: hello-world
    restart: "no"
    volumes: []
    depends_on: []
  app:
    depends_on: []
  logger:
    depends_on:
    - api
    - platform-ui
    - event-service
    - app
    - nginx
    - mongo
    - duckling
    - rabbit

4. Deploying the updated Platform

Start your Platform with your custom NLU or Core images:

$ cd /etc/rasaplatform
$ sudo docker login -u _json_key -p "$(cat gcr-auth.json)" https://gcr.io
$ sudo docker-compose pull
$ sudo docker-compose up -d

Note

If your images are hosted in a Docker registry, you need to pull them before starting the Platform. For example, if your NLU image is located at my-private-docker-registry.com/my-custom-nlu:latest, and your Core image at my-private-docker-registry.com/my-custom-core:latest, the commands are:

$ sudo docker login -u USER -p PASSWORD https://my-private-docker-registry.com
$ sudo docker pull my-private-docker-registry.com/my-custom-nlu:latest
$ sudo docker pull my-private-docker-registry.com/my-custom-core:latest