Rasa NLU Installation and Configuration

Introduction

  • Rasa NLU is an open-source machine learning and artificial intelligence tool written in Python, and it can be installed and configured on a standalone server.
  • It is compatible with other NLU services such as wit.ai, LUIS, or api.ai, so you can migrate your application data from those tools into Rasa NLU.
  • A common use case for Rasa NLU is building chatbot applications.

Set up and configure Rasa NLU

We will install the Rasa NLU tool inside a Docker container, so please install Docker on your Windows or Linux machine first (refer to the Docker Installation URL).
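To confirm that Docker is installed and working before continuing, you can run, for example:

docker --version
docker run hello-world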

Download the latest Rasa NLU project source code from the GitHub repository https://github.com/RasaHQ/rasa_nlu:

git clone https://github.com/RasaHQ/rasa_nlu

Build the Rasa NLU image:

cd rasa_nlu

Grant permissions on the rasa_nlu directory for the ec2-user (always be as specific as possible with permissions; in my case this is just a learning setup):

sudo chown -R $USER .
sudo chmod go+rw .

Copy the Dockerfile from the docker folder to the project root and build the Docker image:

sudo cp /home/ec2-user/rasa_nlu/docker/Dockerfile_full /home/ec2-user/rasa_nlu/Dockerfile

docker build -t rasa/rasa_nlu .

Verify whether the rasa/rasa_nlu image was created:

docker images

[Screenshot: rasa/rasa_nlu Docker image in the docker images output]
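If the build succeeded, the docker images output should contain a row for the new image, roughly like this (the image ID, creation time, and size will differ on your machine):

REPOSITORY          TAG       IMAGE ID       CREATED          SIZE
rasa/rasa_nlu       latest    1a2b3c4d5e6f   2 minutes ago    ...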

If the Rasa NLU Docker image and its supporting images have been created successfully, skip the section below and continue with the “Now, start the Docker container” section.

If the images were not created as shown in the screen above, it simply means there was some error during the build.

In that case, we will download a ready-made image from the https://hub.docker.com/r/rasa/rasa_nlu/ website; simply follow the steps below:

First, stop and remove any running containers and the existing faulty images:
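For example, something like the following (replace <container id> with the IDs shown by docker ps -a; this assumes the faulty image was tagged rasa/rasa_nlu, so adjust the names to match your own output):

docker ps -a
docker stop <container id>
docker rm <container id>
docker rmi rasa/rasa_nlu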

Then download the ready-made image using the command below; it was built from the Dockerfile_full file, so it has everything that we require:

docker pull rasa/rasa_nlu

docker images

If you can see the list of images as shown in the screenshot above, continue with the section below; if not, some troubleshooting is needed.

Now, start the Docker container:

docker run -p 5000:5000 rasa/rasa_nlu start
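If you do not want the container to occupy your terminal, you can start it in detached (background) mode instead by adding the -d flag, for example:

docker run -d -p 5000:5000 rasa/rasa_nlu start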

Verify whether the container has started:

docker ps
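In the docker ps output, look for a row using the rasa/rasa_nlu image whose STATUS column says “Up” and whose PORTS column shows the 5000 mapping, roughly like this (IDs, names, and timings will differ):

CONTAINER ID   IMAGE           COMMAND   CREATED         STATUS         PORTS                    NAMES
edc6fb6b876a   rasa/rasa_nlu   "..."     2 minutes ago   Up 2 minutes   0.0.0.0:5000->5000/tcp   ...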

When the Rasa NLU service is up and running, a default model is available. Only a few intents, such as hey and hello, are available in the default model; they are hard-coded inside the source code.

Check the screen below: there is no trained model yet, so whatever we query will be answered by the default model. Its response may look like the one shown, or it may differ slightly; either way, keep going to the next step.

curl localhost:5000/status | python -mjson.tool

[Screenshot: output of the model status command]

Let’s query the default model; check the screen below:

curl -XPOST localhost:5000/parse -d '{"q":"hey"}' | python -mjson.tool

[Screenshot: query against the default model]

Now, let’s install the dependent packages listed in “rasa_nlu/alt_requirements/requirements_full.txt” that are required to run Rasa NLU. There is a chance they are already installed, depending on the Dockerfile you chose; Dockerfile_full installs them already:

Get the container ID using the “docker ps” command:

docker ps

Replace <Docker id> (the hard-coded edc6fb6b876a below) with your latest container ID:

docker exec -it edc6fb6b876a pip install -r /app/alt_requirements/requirements_full.txt
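If you prefer not to copy the container ID by hand, you can capture it in a shell variable instead (this assumes only one container based on the rasa/rasa_nlu image is running):

CONTAINER_ID=$(docker ps -q --filter "ancestor=rasa/rasa_nlu")
docker exec -it "$CONTAINER_ID" pip install -r /app/alt_requirements/requirements_full.txt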

Now, install the language dependency package:

docker exec -it edc6fb6b876a python -m spacy.en.download all

Train a model with the default “./data/examples/rasa/demo-rasa.json” data file, which is configured in the “/sample_configs/config_spacy.json” file:

docker exec -it edc6fb6b876a python -m rasa_nlu.train -c /app/sample_configs/config_spacy.json
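For reference, the training data in demo-rasa.json follows the Rasa NLU JSON training-data format; a shortened, illustrative excerpt looks roughly like this (the actual examples and entity offsets in the file differ):

{
  "rasa_nlu_data": {
    "common_examples": [
      {
        "text": "show me indian restaurants in the centre",
        "intent": "restaurant_search",
        "entities": [
          { "start": 8, "end": 14, "value": "indian", "entity": "cuisine" }
        ]
      }
    ]
  }
}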

After the model training completes successfully, it returns the path where the trained model is kept, like below:

INFO:rasa_nlu.model:Successfully saved model into ‘/app/projects/default/model_20170730-133446’

Using a bash shell, let’s check the files at this path inside the Docker container:

docker exec -it edc6fb6b876a bash

cd /app/projects/default/model_20170730-133446

ls

Below is the output:

crf_model.pkl         intent_classifier.pkl  regex_featurizer.json

entity_synonyms.json  metadata.json          training_data.json

Return to the host machine:

exit

Restart the Docker container so the Rasa NLU service picks up the changes:

docker restart edc6fb6b876a

Now, let’s query the newly trained ‘default’ model:

curl -XPOST localhost:5000/parse -d '{"q":"central indian restaurant"}' | python -mjson.tool

Here is the output for the query we passed, with its intent details:

[Screenshot: query against the trained model]
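Your exact output will vary with the Rasa NLU version and the trained model, but the parse response should look roughly like this (the confidence value and entity offsets below are illustrative):

{
  "text": "central indian restaurant",
  "intent": {
    "name": "restaurant_search",
    "confidence": 0.87
  },
  "entities": [
    { "start": 8, "end": 14, "value": "indian", "entity": "cuisine" }
  ]
}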

As practice, you can query with complete or partial text strings, see which intent is returned, and get a feel for how Rasa NLU works. Try the queries below and also make up your own:

curl -XPOST localhost:5000/parse -d '{"q":"hey"}' | python -mjson.tool

curl -XPOST localhost:5000/parse -d '{"q":"howdy"}' | python -mjson.tool

curl -XPOST localhost:5000/parse -d '{"q":"good evening"}' | python -mjson.tool

 

Great job!!! You are done with the Rasa NLU setup and configuration. Thank you very much!!!

 

Please refer to the Rasa URL below for more details:

rasa-nlu.readthedocs.io

10 thoughts on “Rasa NLU Installation and Configuration”

  1. Sir, I would like you to thank for this wonderful article about installation. I was expecting this only. One doubt, why we are choosing Docker here? Without container it won’t run.. Please reply me back sir,I want to understand.. Trained models are used across the application?

    1. Actually, on the same Docker machine we hosted two services, the Rasa NLU service and its Rasa NLU UI application; this was as per our requirement. But I think you can install and configure it without Docker as well; just try to follow and understand all the dependencies written in the Dockerfile.

      1. Thanks for your reply. And one more, I’m using Docker to start rasa nlu server and whenever I started its creating a new instance name. I have to setup the spacy installation every time for the new instance or I can’t use the instance which previously created(a week before or past) because I already downloaded n setup the spacy but today I started the server it’s not working with spacy after the installation only its working. Can u Pls help me to understand, Can we use the existing rasanlu instance(Docker PS will give the container ID right that’s I pointed as instance here) ? Or have to setup always?

        1. Hi Pradeepan, thanks for the question. This happens because you shut down your machine or laptop each time; on a server that stays running, it will not happen.

  2. Hi,
    Thanks for the steps
    Could you please explain how to use RASA NLU, create a chatbot and deploy locally on windows without using docker?

    Best Regards,
    Suni

  3. Hi,
    I am totally new to dockers and RASA. I followed this article to setup RASA nlu in Windows 7 64 bit machine. I was able to build and create the rasa docker image When I start docker container using docker run -p 5000:5000 rasa/rasa_nlu start , it says “starting webserver at 5000” “starting service…” and then hangs for infinite time! Is it downloading or processing something while starting the container?
    If I close the docker quick start window, open it again , execute “docker ps” command , i am able to see the container id, time etc.
    Now when I try to access the default model “curl localhost:5000/status | python -mjson.tool” , it throws “connection refused at 5000 , needs value for the argument” error.. am I missing something?
    Note : Tried changing the network but no luck.

    Thanks.

  4. I am getting following error on creating docker image:
    “no matching manifest for windows/amd64 in the manifest list entries”
    Could you please help me in this?

  5. Hi after restarting the docker as mentioned above,
    Restart docker container or rasa nlu service to apply changes:
    I could not find the updated model there. Please help me in this regards
