This repository has been archived by the owner on Oct 19, 2023. It is now read-only.

Merge pull request #112 from jina-ai/slackbot-readme
docs: add slackbot readme
deepankarm authored Jun 23, 2023
2 parents d08ea04 + da00b4a commit 5be0d81
Showing 7 changed files with 80 additions and 23 deletions.
Binary file added .github/images/slack-thread-1.png
Binary file added .github/images/slack-thread-2.png
Binary file added .github/images/slack-thread-3.png
Binary file added .github/images/slack-thread-4.png
54 changes: 47 additions & 7 deletions README.md
@@ -12,7 +12,7 @@

[Jina](https://github.com/jina-ai/jina) is an open-source framework for building scalable multimodal AI apps in production. [LangChain](https://python.langchain.com/en/latest/index.html) is another open-source framework for building applications powered by LLMs.

**langchain-serve** helps you deploy your LangChain apps on Jina AI Cloud in just a matter of seconds. You can now benefit from the scalability and serverless architecture of the cloud without sacrificing the ease and convenience of local development. OR you can also deploy your LangChain apps on your own infrastructure making sure your data remains private.
**langchain-serve** helps you deploy your LangChain apps on Jina AI Cloud in a matter of seconds. You can benefit from the scalability and serverless architecture of the cloud without sacrificing the ease and convenience of local development. And if you prefer, you can also deploy your LangChain apps on your own infrastructure to ensure data privacy. With langchain-serve, you can craft REST/Websocket APIs, spin up LLM-powered conversational Slack bots, or wrap your LangChain apps into FastAPI packages on cloud or on-premises.

> Give us a :star: and tell us what more you'd like to see!
@@ -188,15 +188,17 @@ langchain-serve currently wraps following apps as a service to be deployed on Ji

### 🎉 LLM Apps on production

- 👉 **[Define your API using `@serving` decorator](#-rest-apis-using-serving-decorator)** OR,
- 👉 **[Bring your own FastAPI app](#-bring-your-own-fastapi-app)** !
- 👉 **[Define your API using `@serving` decorator](#-rest-apis-using-serving-decorator)**
- 👉 **[Build, deploy & distribute Slack bots using `@slackbot` decorator](#-build-deploy--distribute-slack-bots-built-with-langchain)**
- 👉 **[Bring your own FastAPI app](#-bring-your-own-fastapi-app)**

### 🔥 Secure, Scalable, Serverless, Streaming REST/Websocket APIs on [Jina AI Cloud](https://cloud.jina.ai/).

- 🌎 Globally available REST/Websocket APIs with automatic TLS certs.
- 🌊 Stream LLM interactions in real-time with Websockets.
- 👥 Enable human-in-the-loop for your agents.
- 🔑 Protect your APIs with [API authorization](#-authorize-your-apis) using Bearer tokens.
- 💬 Build, deploy & distribute Slack bots built with langchain.
- 🔑 Protect your APIs with [API authorization](#-authorize-your-apis) using Bearer tokens.
- 📄 Swagger UI, and OpenAPI spec included with your APIs.
- ⚡️ Serverless, autoscaling apps that scale automatically with your traffic.
- 📁 Persistent storage (EFS) mounted on your app for your data.
Expand Down Expand Up @@ -538,6 +540,14 @@ curl -X 'POST' \

---

## 🤖💬 Build, Deploy & Distribute Slack bots built with LangChain

langchain-serve exposes a `@slackbot` decorator to quickly build, deploy & distribute LLM-powered Slack bots without worrying about the infrastructure. It provides a simple interface to turn any LangChain app into a Slack bot and makes it accessible to users on a platform they're already comfortable with.

✨ Ready to dive in? There's a [step-by-step guide in the repository](lcserve/apps/slackbot/) to help you build your own bot.

---

## 🔐 Authorize your APIs

To add an extra layer of security, we can integrate any custom API authorization by adding an `auth` argument to the `@serving` decorator.
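As a sketch of how such an authorization callback might look: `serving_stub` below is a hypothetical, self-contained stand-in for the `@serving` decorator so the snippet runs without the library, and the exact callback signature is an assumption.

```python
# Self-contained sketch: `serving_stub` stands in for the real @serving
# decorator (assumption: the real one accepts an `auth` callable similarly).
def serving_stub(auth=None):
    def wrap(fn):
        def guarded(*args, token=None, **kwargs):
            # reject the call when the auth callback declines the token
            if auth is not None and not auth(token):
                raise PermissionError("invalid bearer token")
            return fn(*args, **kwargs)
        return guarded
    return wrap

def check_token(token):
    # assumption: validate the Bearer token however your app requires
    return token == "secret-token"

@serving_stub(auth=check_token)
def ask(question: str) -> str:
    return f"You asked: {question}"

print(ask("hi", token="secret-token"))  # → You asked: hi
```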
@@ -742,7 +752,8 @@ Applications hosted on JCloud are priced in two categories:
### Examples
**Example 1:**
<details>
<summary><b>Example 1</b></summary>
Consider an HTTP application that has served requests for `10` minutes in the last hour and uses a custom config:
```
@@ -760,7 +771,11 @@ Serving credits = 20 * 10/60 = 3.33
Total credits per hour = 0.208 + 3.33 = 3.538
```
**Example 2:**
</details>
<details>
<summary><b>Example 2</b></summary>
Consider a WebSocket application that had active connections for 20 minutes in the last hour and uses the default configuration.
```
@@ -778,6 +793,8 @@ Serving credits = 10 * 20/60 = 3.33
Total credits per hour = 10.104 + 3.33 = 13.434
```
</details>
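The arithmetic in both examples follows the same pattern and can be sketched as follows (base credits and serving rates are the values from the examples above):

```python
def hourly_credits(base_credits, serving_rate, active_minutes):
    # serving credits are prorated by active time within the hour,
    # rounded to two decimals as in the examples above
    serving = round(serving_rate * active_minutes / 60, 2)
    return round(base_credits + serving, 3)

# Example 1: custom config, 10 active minutes in the last hour
print(hourly_credits(0.208, 20, 10))   # → 3.538
# Example 2: default config, 20 active minutes in the last hour
print(hourly_credits(10.104, 10, 20))  # → 13.434
```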
# ❓ Frequently Asked Questions
- [`lc-serve` command not found](#lc-serve-command-not-found)
@@ -788,19 +805,31 @@ Total credits per hour = 10.104 + 3.33 = 13.434
### `lc-serve` command not found
<details>
<summary><b>Expand</b></summary>
The `lc-serve` command is registered during `langchain-serve` installation. If you get a `command not found: lc-serve` error, replace `lc-serve` with `python -m lcserve` and retry.
</details>
### My client connecting to the JCloud-hosted app times out, what should I do?
<details>
<summary><b>Expand</b></summary>
If you make long-running HTTP/WebSocket requests, the default timeout value (2 minutes) might not be suitable for your use case. You can provide a custom timeout value during JCloud deployment by using the `--timeout` argument.
Additionally, for HTTP, you may also experience timeouts due to limitations in the open-source components used in `langchain-serve`. While we work to permanently address this issue, we recommend using HTTP/1.1 in your client as a temporary workaround.
For WebSocket, please note that the connection will be closed if idle for more than 5 minutes.
</details>
### How to pass environment variables to the app?
<details>
<summary><b>Expand</b></summary>
We provide 2 options to pass environment variables:
1. Use `--env` during app deployment to load env variables from a `.env` file. For example, `lc-serve deploy jcloud app --env some.env` will load all env variables from `some.env` file and pass them to the app. These env variables will be available in the app as `os.environ['ENV_VAR_NAME']`.
@@ -816,12 +845,21 @@ We provide 2 options to pass environment variables:
}
```
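For instance, after deploying with `--env some.env`, the variables can be read inside the app the standard way (a sketch; the variable name is illustrative, not required by the library):

```python
import os

# variables from some.env are injected into the app's environment at deploy
# time; read them as usual ("OPENAI_API_KEY" is just an example name)
api_key = os.environ.get("OPENAI_API_KEY", "")
```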
</details>
### JCloud deployment failed at pushing image to Jina Hubble, what should I do?
<details>
<summary><b>Expand</b></summary>
Please use `--verbose` and retry to get more information. If you are operating on a computer with `arm64` architecture, retry with `--platform linux/amd64` so the image can be built correctly.
</details>
### Debug babyagi playground request/response for external integration
<details>
<summary><b>Expand</b></summary>
1. Start the textual console in a terminal (exclude the following groups to reduce logging noise)
```bash
@@ -834,6 +872,8 @@ Please use `--verbose` and retry to get more information. If you are operating o
lc-serve playground babyagi --verbose
```
</details>
# 📣 Reach out to us
Want to deploy your LLM apps on your own infrastructure with all capabilities of Jina AI Cloud?
43 changes: 29 additions & 14 deletions lcserve/apps/slackbot/README.md
@@ -1,16 +1,32 @@
# Langchain Slack Bots on [Jina AI Cloud](https://cloud.jina.ai/)
# LangChain Slack Bots on [Jina AI Cloud](https://cloud.jina.ai/)

In addition to deploying scalable APIs for your LLM applications, `langchain-serve` can also be used to deploy conversational bots on Slack using langchain components. This is a step-by-step guide to deploy and configure a demo bot on Slack.
Complementing its capacity to deploy robust APIs for your LangChain applications, `langchain-serve` also brings you the ability to launch conversational bots on Slack using LangChain components. This is a step-by-step guide to build, deploy and distribute a Slack bot using `langchain-serve`.

### Step 1: Install Langchain Serve
<table align="center">
<tr>
<td><img src="../../../.github/images/slack-thread-1.png" width="200"/></td>
<td><img src="../../../.github/images/slack-thread-2.png" width="200"/></td>
<td><img src="../../../.github/images/slack-thread-3.png" width="200"/></td>
<td><img src="../../../.github/images/slack-thread-4.png" width="200"/></td>
</tr>
<tr>
<td align="center">1</td>
<td align="center">2</td>
<td align="center">3</td>
<td align="center">4</td>
</tr>
</table>


### 👉 Step 1: Install langchain-serve

Let's start by installing langchain-serve if you haven't already:

```bash
pip install langchain-serve
```

### Step 2: Create the app manifest
### 👉 Step 2: Create the app manifest

Slack apps can be created from scratch or from a manifest. We have a command to generate the manifest for you.

@@ -52,7 +68,7 @@ settings:
token_rotation_enabled: false
```
### Step 3: Create the app and configure it
### 👉 Step 3: Create the app and configure it
- Go to [slack apps](https://api.slack.com/apps?new_app=1) page.
- Choose `From an app manifest` and pick the workspace you want to install the app in.
@@ -73,7 +89,7 @@ You will be redirected to the app configuration page. Your app needs 2 tokens to
- You can find the token under `OAuth & Permissions` -> `OAuth Tokens for Your Workspace`. Copy it and save it somewhere safe.
- It will be used as `SLACK_BOT_TOKEN` in the next step.

### Step 4: Deploy the demo langchain bot on Jina AI Cloud
### 👉 Step 4: Deploy the demo langchain bot on Jina AI Cloud

Create a `.env` file with the following content. Replace the values with the ones you got in the previous step. Without these, the bot won't be able to authenticate itself with Slack.

@@ -114,11 +130,11 @@ After the deployment is complete, you will see `Slack Events URL` in the output,

</details>

### Step 5: Configure the app to use the deployed endpoint
### 👉 Step 5: Configure the app to use the deployed endpoint

Go to `Event Subscriptions` -> `Request URL` and set it to the Events URL you got in the previous step. Upon saving, Slack will send a request to the URL to verify it. If everything is configured correctly, you will see a green Verified checkmark. If you see an error instead, check the logs of the deployment on [Jina AI Cloud](https://cloud.jina.ai/user/flows).

### Step 6: Use the bot on Slack
### 👉 Step 6: Use the bot on Slack

There are 2 ways to interact with the bot.

@@ -136,7 +152,7 @@ There are 2 ways to interact with the bot.

---

### Step 7: Enhance the bot to suit your application
### 👉 Step 7: Enhance the bot to suit your application

Let's dig deep into the demo bot code and see how it works. This example uses Agents with Tools & Chat conversation memory to answer questions from Slack threads.

@@ -184,7 +200,7 @@ def agent(
reply(agent_executor.run(message))
```

We define a decorator `@slackbot` to mark this function as a slackbot. This is used by the `lcserve` CLI to identify the slackbot function. The function takes the following arguments:
We define a decorator `@slackbot` to mark this function as the entrypoint to the bot, used by the `lc-serve` CLI to identify the slackbot function. Functions wrapped with `@slackbot` take the following arguments:

| Parameter | Type | Description |
|---|---|:---:|
@@ -217,22 +233,21 @@ Every bot deployed on Jina AI Cloud gets a persistent storage path. This can be
The `reply` function is used to send a reply back to the user. It takes a single argument - `message` - which is the reply message to be sent to the user.
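The message-in/reply-out contract can be sketched as follows. `slackbot_stub` is a hypothetical stand-in for the `@slackbot` decorator so the snippet runs without the library; only the `message` and `reply` parameters described above are used.

```python
# Self-contained sketch of the message/reply contract described above.
# `slackbot_stub` stands in for the real @slackbot decorator.
def slackbot_stub(fn):
    return fn

@slackbot_stub
def echo_bot(message: str, reply):
    # `reply` sends a message back into the Slack thread
    reply(f"You said: {message}")

# simulate the framework invoking the bot with an incoming Slack message
replies = []
echo_bot("hello", replies.append)
print(replies[0])  # → You said: hello
```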


### Step 8: Deploy your customized bot on Jina AI Cloud
### 👉 Step 8: Deploy your customized bot on Jina AI Cloud

After customizing the bot to suit your application, you can deploy it on Jina AI Cloud & use the new Slack Events URL in the app configuration page.

```bash
lc-serve deploy jcloud app --env .env
```

### Step 9: Distribute your bot to the world
### 👉 Step 9: Distribute your bot to the world

Once you have a bot that works for your application, you can go to `Manage Distribution` -> `Add to Slack` to get shareable links for your bot. You can read more about distributing your bot to the world [here](https://api.slack.com/start/distributing).


## What's next?
## 👀 What's next?

- [Learn more about LangChain](https://python.langchain.com/docs/)
- [Learn more about langchain-serve](https://github.com/jina-ai/langchain-serve)
- Have questions? [Join our Discord community](https://discord.jina.ai/)

6 changes: 4 additions & 2 deletions lcserve/backend/slackbot/template.yml
@@ -1,8 +1,10 @@
display_information:
name: langchain-bot
name: Langchain Bot
description: Slack bot built with Langchain, hosted on Jina AI Cloud
background_color: "#323336"
features:
bot_user:
display_name: langchain-bot
display_name: Langchain Bot
always_online: true
oauth_config:
redirect_urls:
