### Setting Up a Custom GPT with `server.py`, `instructions.txt`, and `openai.yaml`

This guide walks you through setting up OpenAI's Custom GPT feature against your own backend: you host an API endpoint with `server.py` running under `uvicorn`, then expose it publicly with `ngrok`.
#### 1. **Place Your Files**

Ensure the following files are properly set up:

- **`server.py`** – Your FastAPI-based server that serves as the backend for the GPT actions.
- **`instructions.txt`** – Custom instructions that define how the GPT should behave.
- **`openai.yaml`** – The configuration file that specifies API endpoint details.

Keep these files together in a dedicated project directory. A minimal `server.py` sketch is shown below.
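If you are starting from scratch, the following is a minimal sketch of what `server.py` might contain. The `/search` route and its request/response shapes are illustrative assumptions, not part of this guide's files:

```python
# server.py -- minimal FastAPI backend sketch for a Custom GPT action.
# The /search route and its payload shape are illustrative assumptions.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="Custom GPT Actions API")

class SearchRequest(BaseModel):
    query: str

@app.post("/search")
def search(req: SearchRequest):
    # Replace this stub with your real action logic.
    return {"query": req.query, "results": ["example result 1", "example result 2"]}
```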
#### 2. **Run `server.py` Locally**

Start your FastAPI server with `uvicorn`:

1. Install the dependencies (if not already installed):

   ```bash
   pip install fastapi uvicorn
   ```

2. Run the server:

   ```bash
   uvicorn server:app --host 0.0.0.0 --port 8000
   ```

This starts the API server on port 8000; locally you can reach it at `http://127.0.0.1:8000`.
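Before exposing the server, you can confirm it responds. FastAPI automatically serves its OpenAPI schema at `/openapi.json` (and interactive docs at `/docs`), so a quick standard-library check looks like this:

```python
# check_server.py -- confirm the local FastAPI server is reachable.
import json
from urllib.request import urlopen

with urlopen("http://127.0.0.1:8000/openapi.json") as resp:
    schema = json.load(resp)

print("API title:", schema["info"]["title"])
print("Paths:", list(schema["paths"].keys()))
```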
#### 3. **Expose the Server Using Ngrok**

To make your local server reachable from OpenAI's side, use `ngrok`:

1. Install `ngrok` (if not installed):

   ```bash
   brew install ngrok       # macOS
   sudo apt install ngrok   # Debian/Ubuntu
   ```

   Or download it from [ngrok.com](https://ngrok.com/).

2. Start `ngrok` to expose port 8000:

   ```bash
   ngrok http 8000
   ```

This generates a public URL such as `https://xyz.ngrok.io/`.
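The public URL is also available from ngrok's local inspection API at `http://127.0.0.1:4040/api/tunnels`, which is handy if you want to script the `openai.yaml` update. A small sketch, assuming the default inspection port 4040 and a running tunnel:

```python
# get_ngrok_url.py -- read the current public URL from ngrok's local inspection API.
import json
from urllib.request import urlopen

with urlopen("http://127.0.0.1:4040/api/tunnels") as resp:
    tunnels = json.load(resp)["tunnels"]

# Prefer an HTTPS tunnel if both http and https are listed.
https_urls = [t["public_url"] for t in tunnels if t["public_url"].startswith("https")]
print(https_urls[0] if https_urls else tunnels[0]["public_url"])
```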
#### 4. **Update `openai.yaml`**

Edit your `openai.yaml` file and set the `base_url` to the ngrok-generated URL:

```yaml
base_url: "https://xyz.ngrok.io"
```

Replace `https://xyz.ngrok.io` with your actual `ngrok` URL.
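Note that if your `openai.yaml` is an OpenAPI 3 schema (the format Custom GPT actions consume), the server address typically lives under a `servers` block rather than a top-level `base_url`; adapt to whichever layout your file actually uses. An illustrative fragment:

```yaml
# Illustrative OpenAPI 3 fragment -- only relevant if openai.yaml is an OpenAPI schema.
servers:
  - url: "https://xyz.ngrok.io"
```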
#### 5. **Upload `openai.yaml` and `instructions.txt`**

- Open OpenAI's [Custom GPT](https://platform.openai.com/gpts) page.
- Click **Create** or **Edit** a Custom GPT.
- Upload `instructions.txt` and `openai.yaml` under **Advanced Settings**.
#### 6. **Test Your Custom GPT**

- After uploading, test interactions in the OpenAI interface.
- Confirm that the GPT's action calls reach your `server.py` and return the expected responses.
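A quick end-to-end check is to request the same OpenAPI schema endpoint through the public ngrok URL. Replace the URL with yours; the `/openapi.json` path assumes the FastAPI defaults mentioned above:

```python
# check_public.py -- confirm the ngrok tunnel reaches the FastAPI server.
import json
from urllib.request import urlopen

PUBLIC_URL = "https://xyz.ngrok.io"  # replace with your actual ngrok URL

with urlopen(f"{PUBLIC_URL}/openapi.json") as resp:
    schema = json.load(resp)

print("Reachable via ngrok, API title:", schema["info"]["title"])
```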
#### Notes

- Each time you restart `ngrok`, the public URL changes, so update `openai.yaml` accordingly.
- Consider an ngrok account with a reserved (static) domain so the public URL stays the same across restarts.

This setup lets your Custom GPT call external actions via your locally hosted FastAPI server. 🚀