Warning: Bocadillo is now UNMAINTAINED. Users are recommended to migrate to a supported alternative, such as Starlette or FastAPI. Please see #344 for more information.


You've installed Bocadillo and gotten up to speed with async in Python? Perfect! Let us take you on a guided tour of how to work with Bocadillo.

In this step-by-step tutorial, we will build a chatbot server together.

We'll go through many aspects of building apps with Bocadillo, including:

  • Generating a project using the Bocadillo CLI.
  • Using the built-in WebSocket support to handle multiple connections in real-time.
  • Creating REST endpoints using routing and views.
  • Using providers to inject reusable resources into views.
  • Testing an application using pytest.

We'll use the ChatterBot library to build Diego, a friendly conversational agent. Don't worry, this won't require any background in data science or chatbot technology!

Sounds exciting? Let's dive in! 🙌

Setting up the project

First things first: let's set up our project.

  1. Open up a terminal, and go to your favorite development directory. For example:
cd ~/dev
  2. Install the Bocadillo CLI globally:
pip install bocadillo-cli
  3. Use the CLI to generate a new project called chatbot:
bocadillo create chatbot
  4. Run cd chatbot, and you should have the following directory structure:
$ tree
├── .gitignore
├── README.md
├── requirements.txt
└── chatbot
    ├── __init__.py
    ├── app.py
    ├── asgi.py
    ├── providerconf.py
    └── settings.py
  5. Edit requirements.txt to add ChatterBot there:
bocadillo >= 0.14
chatterbot
pytz  # Required by ChatterBot
  6. Install dependencies. We're just using pip and a virtualenv here, but you can use any other dependency management solution:
python -m venv venv
source venv/bin/activate
pip install -r requirements.txt

Serving the application

Once this is all done, let's try and serve the app before we go any further. Run the following command:

uvicorn chatbot.asgi:app

If you go to http://localhost:8000 and get a 404 Not Found response, you're all good! Press Ctrl+C in your terminal to stop the server.

Writing the WebSocket endpoint

We're now ready to get to the meat of it! The first thing we'll build is the WebSocket endpoint.

If you're not familiar with WebSocket, don't worry — here's a one-line summary: it allows a server and a client to exchange messages in a bidirectional way. It's good old sockets, reinvented for the web.

Due to this bidirectional nature, WebSockets are very well suited to the kind of application we're building here: a conversation between a client and a server (i.e. our chatbot).

If you're interested in learning more about WebSockets in Python, I strongly recommend this talk: A beginner's guide to WebSockets.
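Before we write any WebSocket code, here's a tiny, purely illustrative sketch of the bidirectional idea using plain OS sockets from the standard library (no WebSocket framing involved): each endpoint can both send and receive.

```python
import socket

# Plain OS sockets, not WebSocket -- just to illustrate the bidirectional
# exchange that WebSocket brings to the web. socketpair() gives us two
# connected endpoints, playing the roles of a client and a server.
server_end, client_end = socket.socketpair()

client_end.sendall(b"Hi!")        # client -> server
received = server_end.recv(1024)  # the server reads the message...
server_end.sendall(received)      # ...and sends one back
echo = client_end.recv(1024)      # the client reads the reply
print(echo.decode())  # Hi!

server_end.close()
client_end.close()
```

WebSocket adds message framing, an HTTP-based handshake, and browser support on top of this basic two-way channel.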

Alright, so we're not going to plug the chatbot in just yet. Instead, let's make the server send back any message it receives — a behavior also known as an "echo" endpoint.

Add the following at the end of app.py:

# chatbot/app.py


@app.websocket_route("/conversation")
async def converse(ws):
    async for message in ws:
        await ws.send(message)

A few minimal explanations here, for the curious:

  • This defines a WebSocket endpoint accessible at ws://localhost:8000/conversation.
  • The async for message in ws: line iterates over messages received over the WebSocket.
  • Lastly, await ws.send(message) sends the received message as-is back to the client.
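If you're curious how `async for` can iterate over incoming messages, it works on any object implementing the async iterator protocol. Here's a toy stand-in (not Bocadillo's actual WebSocket class — just an assumption-free mock) that yields queued messages, so you can see what the echo loop does without running a server:

```python
import asyncio


class FakeWebSocket:
    """Toy stand-in for a WebSocket: yields queued messages, records sends."""

    def __init__(self, messages):
        self._messages = list(messages)
        self.sent = []

    def __aiter__(self):
        return self

    async def __anext__(self):
        if not self._messages:
            raise StopAsyncIteration  # connection "closed": stop iterating
        return self._messages.pop(0)

    async def send(self, message):
        self.sent.append(message)


async def converse(ws):
    # Same shape as the echo endpoint above.
    async for message in ws:
        await ws.send(message)


ws = FakeWebSocket(["Hi!", "Anyone here?"])
asyncio.run(converse(ws))
print(ws.sent)  # ['Hi!', 'Anyone here?']
```

Each message "received" from the fake socket is sent straight back, which is exactly the echo behavior we'll observe in a moment.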

Trying out the WebSocket endpoint

How about we try this out by creating a WebSocket client? Fear not — we don't need to write any JavaScript. We'll stick to Python and use the websockets library, which comes installed with Bocadillo.

Create a client.py file in the project root directory, and paste the following code there. It connects to the WebSocket endpoint and runs a simple REPL:

# client.py
import asyncio
from contextlib import suppress
import websockets

async def client(url: str):
    async with websockets.connect(url) as websocket:
        while True:
            message = input("> ")
            await websocket.send(message)
            response = await websocket.recv()
            print(response)


with suppress(KeyboardInterrupt):
    # Python 3.7+. See the asyncio docs for <3.7 usage.
    asyncio.run(client("ws://localhost:8000/conversation"))

Serve the app again and, in a separate terminal, run $ python client.py. You should be greeted with a > prompt. If so, start chattin'!

> Hi!
Hi!
> Is there anyone here?
Is there anyone here?

Pretty cool, isn't it? 🤓

Type Ctrl+C to exit the session and close the WebSocket connection.

Hello, Diego!

Now that we're able to make the server and a client communicate, how about we replace the echo implementation with an actual, intelligent and friendly chatbot?

This is where ChatterBot comes in! We'll create a chatbot named Diego.

Go ahead and create a bot.py file, and add Diego in there:

# chatbot/bot.py
from chatterbot import ChatBot
from chatterbot.trainers import ChatterBotCorpusTrainer

diego = ChatBot("Diego")

trainer = ChatterBotCorpusTrainer(diego)
trainer.train("chatterbot.corpus.english")

(ChatterBot's chatbots are quite dumb out of the box, so the code above trains Diego on an English corpus to make him a bit smarter.)

At this point, you can try out the chatbot in a Python interpreter:

$ python
>>> from chatbot.bot import diego  # Be patient — this may take a few seconds to load!
>>> diego.get_response("Hi, there!")
<Statement text:There should be one-- and preferably only one --obvious way to do it.>

(Hmm. Interesting response! 🐍)

Let's now plug Diego into the WebSocket endpoint: each time we receive a new message, we'll give it to Diego and send his response back.

# chatbot/app.py
from .bot import diego


@app.websocket_route("/conversation")
async def converse(ws):
    async for message in ws:
        response = diego.get_response(message)
        await ws.send(str(response))

If you run the server/client setup from earlier, you can now see that Diego converses with us over the WebSocket!

> Hi there!
I am a chat bot. I am the original chat bot. Did you know that I am incapable of error?
> Where are you?
I am on the Internet.

Looks like Diego is a jokester. 😉

Refactoring the chatbot as a provider

Clients are now able to chat with Diego over a WebSocket connection. That's great!

However, there are a few non-functional issues with our current setup. If you think about it, Diego is a resource — ideally, it should only be made available to the WebSocket endpoint at the time of processing a connection request. Instead, we load it as soon as the app module gets imported. Plus, it is injected as a global dependency which makes the code hard to test and less readable.

So, there must be a better way… and there is: providers. ✨

Providers are quite unique to Bocadillo. They were inspired by pytest fixtures and offer an elegant, modular and flexible way to manage and inject resources into web views.
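To get a feel for the underlying idea, here's a minimal, hypothetical sketch of "inject a resource by matching parameter names" — the same trick pytest fixtures use. None of this is Bocadillo's actual implementation; the `registry` and `call_with_providers` names are made up for illustration:

```python
import inspect

# Hypothetical registry mapping provider names to resources.
registry = {"diego": "chatbot-instance", "clients": set()}


def call_with_providers(func):
    # Inspect the function's signature and inject any parameter
    # whose name matches a registered provider.
    params = inspect.signature(func).parameters
    kwargs = {name: registry[name] for name in params if name in registry}
    return func(**kwargs)


def view(diego):
    return f"injected: {diego}"


print(call_with_providers(view))  # injected: chatbot-instance
```

The view never imports the resource; it simply declares a parameter, and the framework resolves it. That's the decoupling providers give us.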

Let's use them to fix the code, shall we?

Open the providerconf.py module that was generated by the CLI, and add the following code:

# chatbot/providerconf.py
from bocadillo import provider

@provider(scope="app")
async def diego():
    from .bot import diego

    return diego

This code declares a diego provider which lazily loads the chatbot on app startup (hence scope="app").

We can now inject Diego into the WebSocket view. All we have to do is declare it as a parameter to the view. Let's do just that by updating the app.py script, which is listed here in full:

# chatbot/app.py
from bocadillo import App

app = App()

@app.websocket_route("/conversation")
async def converse(ws, diego):  # <-- 👋, Diego!
    async for message in ws:
        response = diego.get_response(message)
        await ws.send(str(response))

No imports required — Diego will automagically get injected in the WebSocket view when processing the WebSocket connection request. ✨

Ready to try things out?

  1. Run the server. You should see additional logs corresponding to Bocadillo setting up Diego on startup:
$ uvicorn chatbot.asgi:app
INFO: Started server process [29843]
INFO: Waiting for application startup.
[nltk_data] Downloading package averaged_perceptron_tagger to
[nltk_data]     /Users/Florimond/nltk_data...
[nltk_data]   Package averaged_perceptron_tagger is already up-to-
[nltk_data]       date!
[nltk_data] Downloading package punkt to /Users/Florimond/nltk_data...
[nltk_data]   Package punkt is already up-to-date!
[nltk_data] Downloading package stopwords to
[nltk_data]     /Users/Florimond/nltk_data...
[nltk_data]   Package stopwords is already up-to-date!
Training greetings.yml: [####################] 100%
Training conversations.yml: [####################] 100%
INFO: Uvicorn running on http://127.0.0.1:8000 (Press CTRL+C to quit)
  2. Run the client.py script again, and start chatting! You shouldn't see any difference from before. In particular, Diego responds just as fast.
> Hello!
> I would like to order a sandwich
Yes it is.

There you go! Beautiful, modular and flexible dependency injection with Bocadillo providers.

Keeping track of clients

Let's go one step further. True, we have quite elegantly implemented conversation with a chatbot over WebSocket. Now, how about we keep track of how many clients are currently talking to the chatbot?

If you were wondering, the answer is yes — we can implement this with providers too!

  1. Add a clients provider to providerconf.py:
# chatbot/providerconf.py
from bocadillo import provider


@provider(scope="app")
async def clients():
    return set()
  2. Add another provider which returns a context manager that takes care of adding the ws connection to the set of clients. FYI, this is an example of a factory provider, but you don't really need to understand the whole code at this point.
# chatbot/providerconf.py
from contextlib import contextmanager
from bocadillo import provider


@provider
async def save_client(clients):
    @contextmanager
    def _save(ws):
        clients.add(ws)
        try:
            yield ws
        finally:
            clients.remove(ws)

    return _save
  3. In the WebSocket view, use the new save_client provider to register the WebSocket client:
# chatbot/app.py


@app.websocket_route("/conversation")
async def converse(ws, diego, save_client):
    with save_client(ws):
        async for message in ws:
            response = diego.get_response(message)
            await ws.send(str(response))

That's it! While the client is chatting with Diego, it will be present in the set of clients.
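The add-on-enter, remove-on-exit pattern behind save_client can be isolated into a tiny standalone sketch (illustrative only, using just the standard library):

```python
from contextlib import contextmanager

clients = set()


@contextmanager
def track(client):
    # Add on enter; the finally clause guarantees removal on exit,
    # even if the chat loop raises or the connection drops.
    clients.add(client)
    try:
        yield client
    finally:
        clients.discard(client)


with track("alice"):
    count_inside = len(clients)  # 1: "alice" is registered

print(count_inside, len(clients))  # 1 0
```

Because cleanup lives in a finally block, a crashed conversation can never leave a stale entry in the set.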

How about we do something with this information?

Exposing clients count via a REST endpoint

As a final feature, let's step aside from WebSocket for a moment and go back to the good old HTTP protocol. We'll create a simple REST endpoint to view the number of currently connected clients.

Go back to app.py and add the following code:

# chatbot/app.py


@app.route("/client-count")
async def client_count(req, res, clients):
    res.json = {"count": len(clients)}

Hopefully this code shouldn't come as a surprise. All we do here is send the number of clients (obtained from the clients provider) in a JSON response.

Go ahead! Run uvicorn chatbot.asgi:app, and a few python client.py instances, and check out how many clients are connected by opening http://localhost:8000/client-count in a web browser. Press Ctrl+C for one of the clients, and see the client count go down!

Did it work? Congrats! ✨


Testing the application

We're mostly done in terms of the features we wanted to cover together. We have some ideas you can explore as exercises, of course, but before getting to that, let's write some tests.

One of Bocadillo's design principles is to make it easy to write high-quality applications. As such, Bocadillo has all the tools built-in to write tests for this chatbot server.

You can write those with your favorite test framework. We'll choose pytest for the purpose of this tutorial. Let's install it first:

pip install pytest

Next, create a tests package:

mkdir tests

We can now setup our testing environment:

  • We'll write a pytest fixture that sets up a test client. The test client exposes a Requests-like API as well as helpers to test WebSocket endpoints.
  • We don't actually need to test the chatbot here, so we'll override the diego provider with an "echo" mock — this will have the nice side effect of greatly speeding up the tests.

Go ahead and create a conftest.py script and place the following in there:

# tests/conftest.py
import pytest
from bocadillo import provider, create_client

from chatbot.asgi import app

@provider
async def diego():
    class EchoDiego:
        def get_response(self, query):
            return query

    return EchoDiego()


@pytest.fixture
def client():
    return create_client(app)

Now is the time to write some tests! Create a test_app.py file in the tests package:

touch tests/test_app.py

First, let's test that we can connect to the WebSocket endpoint, and that we get a response from Diego if we send a message:

# tests/test_app.py

def test_connect_and_converse(client):
    with client.websocket_connect("/conversation") as ws:
        ws.send_text("Hello!")
        assert ws.receive_text() == "Hello!"

Now, let's test the incrementation of the client counter when clients connect to the WebSocket endpoint:

# tests/test_app.py

def test_client_count(client):
    assert client.get("/client-count").json() == {"count": 0}

    with client.websocket_connect("/conversation"):
        assert client.get("/client-count").json() == {"count": 1}

    assert client.get("/client-count").json() == {"count": 0}

Run these tests using:

pytest

And, well, guess what?

==================== test session starts =====================
platform darwin -- Python 3.7.2, pytest-4.3.1, py-1.8.0, pluggy-0.9.0
rootdir: ..., inifile: pytest.ini
collected 2 items

test_app.py ..                                         [100%]

================== 2 passed in 0.08 seconds ==================

Tests pass! ✅🎉

Wrapping up

If you've made it so far — congratulations! You've just built a chatbot server powered by WebSocket, ChatterBot and Bocadillo.

Together, we've seen how to:

  • Set up a project using the Bocadillo CLI and edit its contents.
  • Write WebSocket and HTTP endpoints.
  • Use providers to decouple resources from their consumers.
  • Test WebSocket and HTTP endpoints using pytest and Bocadillo's testing helpers.

The complete code for this tutorial is available on the GitHub repo: get the code.

Next steps

Obviously, we've only scratched the surface of what you can do with Bocadillo. The goal of this tutorial was to take you through the steps of building your First Meaningful Application.

You can easily iterate on the chatbot server we've built together. We'd be happy to see what you come up with!

Want to challenge yourself? Here are a few ideas:

  • Add a home page rendered with templates. The web browser should connect to the chatbot server via a JavaScript program. You'll probably also need to serve static files to achieve this.
  • Train Diego to answer questions like "How many people are you talking to currently?"
  • Currently, all clients talk to the same instance of Diego. Yet, it would be nice if each client had their own Diego to ensure a bespoke conversation. You may want to investigate cookie-based sessions and factory providers to implement this behavior.

You're now ready to dive into the details of the framework. Have fun working with Bocadillo! 🥪