
Installation

Install Rasa Core to get started with the Rasa stack.

Note

You can also get started without installing anything by going to the Quickstart

Install Rasa Core

Stable (Most recent release)

The recommended way to install Rasa Core is using pip:

    
pip install rasa_core

If you already have rasa_core installed and want to update it, run:

    
pip install -U rasa_core

Unless you’ve already got numpy & scipy installed, we highly recommend that you install and use Anaconda.
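
For example, you could create and activate a fresh Anaconda environment before installing; the environment name and Python version below are just placeholders:

conda create -n rasa python=3.6
conda activate rasa
pip install rasa_core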

Note

If you want to run custom action code, please also take a look at Actions. You’ll need to install Rasa Core to train and use the model, and rasa_core_sdk to develop your custom action code.
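
As a rough sketch of what a custom action looks like with rasa_core_sdk (the class and action name here are purely illustrative):

# actions.py -- a minimal custom action
from rasa_core_sdk import Action


class ActionHelloWorld(Action):
    def name(self):
        # the name used to refer to this action in your domain and stories
        return "action_hello_world"

    def run(self, dispatcher, tracker, domain):
        # send a message back to the user
        dispatcher.utter_message("Hello World!")
        return []

You can then serve it with the action server, e.g. python -m rasa_core_sdk.endpoint --actions actions.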

Latest (Most recent GitHub)

If you want to use the bleeding-edge version of Rasa Core, install it from GitHub using setup.py:

git clone https://github.com/RasaHQ/rasa_core.git
cd rasa_core
pip install -r requirements.txt
pip install -e .

Development (github & development dependencies)

If you want to change the Rasa Core code and run the tests or build the documentation, you also need to install the development dependencies:

pip install -r dev-requirements.txt
pip install -e .
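
Once installed, you could for example run the test suite and build the docs; the commands below assume the repository's standard layout (tests under tests/, Sphinx docs under docs/) and may differ:

py.test tests            # run the test suite
cd docs && make html     # build the documentation with Sphinx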

Add Natural Language Understanding

We use Rasa NLU for intent classification & entity extraction. To get it, run:

pip install rasa_nlu[tensorflow]

Full instructions can be found in the NLU documentation.
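
As a rough sketch of how training fits in: with a pipeline configuration and some NLU training data, you can train a model from the command line. The file names below (nlu_config.yml, data/nlu.md) are just common conventions, and the project and model names match what the Docker Compose example further down expects:

python -m rasa_nlu.train \
    -c nlu_config.yml \
    --data data/nlu.md \
    -o models \
    --project current \
    --fixed_model_name nlu

where nlu_config.yml can be as small as:

language: en
pipeline: tensorflow_embedding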

You can also use other NLU services like wit.ai, Dialogflow, or LUIS. In fact, you don’t need to use NLU at all if your messaging app uses buttons rather than free text.

Build your first Rasa assistant!

After following the quickstart and installing Rasa Core, the next step is to build your first Rasa assistant yourself! To get you started, we have prepared a Rasa Stack starter-pack which has all the files you need to build your first custom chatbot. On top of that, the starter-pack includes a training data set ready for you to use.

Click the link below to get the Rasa Stack starter-pack:

Rasa Stack starter-pack

Let us know how you are getting on! If you have any questions about the starter-pack or about using Rasa Stack in general, post your questions on the Rasa Community Forum!

Using Docker Compose

Rasa provides all components as official Docker images which are continuously updated. To quickly run Rasa Core with the other components, you can use the provided Docker Compose file. This is useful for a quick local setup or if you want to host the Rasa components on cloud services.

Compose File Example

version: '3.0'

services:
  rasa_core:
    image: rasa/rasa_core:latest
    networks: ['rasa-network']
    ports:
    - "5005:5005"
    volumes:
    - "./rasa-app-data/models/current/dialogue:/app/model"
    - "./rasa-app-data/config:/app/config"
    - "./rasa-app-data/project:/app/project"
    command:
    - start
    - -d
    - ./model
    - -c
    - rest
    - -u
    - current/nlu
    - --endpoints
    - config/endpoints.yml

  action_server:
    image: rasa/rasa_core_sdk:latest
    networks: ['rasa-network']
    ports:
    - "5055:5055"
    volumes:
    - "./rasa-app-data/actions:/app/actions"

  rasa_nlu:
    image: rasa/rasa_nlu:latest-full
    networks: ['rasa-network']
    ports:
    - "5000:5000"
    volumes:
    - "./rasa-app-data/models/:/app/projects"
    - "./rasa-app-data/logs:/app/logs"

  duckling:
    image: rasa/duckling:latest
    networks: ['rasa-network']
    ports:
    - "8000:8000"

networks: {rasa-network: {}}
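
Putting the volume mounts together, the Compose file above assumes a host directory layout roughly like the following (the directories are created as you add configuration and train models):

rasa-app-data/
    actions/                  # custom action code for the action server
    config/
        endpoints.yml
        credentials.yml       # only needed for chat & voice connectors
    logs/                     # Rasa NLU logs
    models/
        current/
            dialogue/         # trained Rasa Core model
            nlu/              # trained Rasa NLU model
    project/                  # your Rasa Core project files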

Note

If you do not require components like Rasa NLU or Duckling, you can simply remove them from your Docker Compose file.

Running it

To run all components locally, execute docker-compose up. You can then interact with your chatbot using the HTTP API. For example:

curl -XPOST \
    --header 'content-type: application/json' \
    --data '{"message": "Hi Bot"}' \
    http://localhost:5005/webhooks/rest/webhook
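
If a trained model is loaded, the REST channel replies with a JSON array of bot messages, roughly of this shape (the reply text depends on your model):

[
  {"recipient_id": "default", "text": "Hey! How can I help you?"}
]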

To run commands inside a specific container, use docker-compose run <container name>. For example, to train the Rasa Core model:

docker-compose run rasa_core train

Volume Explanation

  • ./rasa-app-data/models/current/dialogue: This directory contains the trained Rasa Core models. You can also move previously trained models to this directory to load them within the Docker container.

  • ./rasa-app-data/config: This directory is for the configuration of the endpoints and of the different Chat & Voice platforms you can use Rasa Core with.

    • To connect other components with Rasa Core, this directory should contain a file endpoints.yml, which specifies how to reach these components. For the Docker Compose example shown above, the file should look like this:

      action_endpoint:
          url: 'http://action_server:5055/webhook'
      nlu:
          url: 'http://rasa_nlu:5000'
      
    • If you use connectors to Chat & Voice platforms, you have to configure the required credentials for these in a file credentials.yml (see the sketch after this list). Use the provided credentials by adding --credentials <path to your credentials file> to the run command of Rasa Core.

  • ./rasa-app-data/project: This directory contains your Rasa project and may be used to train a model.

  • ./rasa-app-data/models/: This directory contains the NLU project and its trained models. You can also move previously trained models to this directory to load them within the Docker container.
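
As an illustration of what credentials.yml might contain, here is a sketch for the Facebook Messenger connector; all values are placeholders you would replace with your own:

facebook:
  verify: "<your-verify-token>"
  secret: "<your-app-secret>"
  page-access-token: "<your-page-access-token>"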

Note

You can also use custom directory structures or port mappings. But don’t forget to reflect these changes in the Docker Compose file and in your endpoint configuration.

Have questions or feedback?

We have a very active support community on the Rasa Community Forum that is happy to help you with your questions. If you have any feedback for us or a specific suggestion for improving the docs, feel free to share it by creating an issue in the Rasa Core GitHub repository.