---
title: Chatbot example
emoji: 💬
colorFrom: yellow
colorTo: purple
sdk: gradio
sdk_version: 5.0.1
app_file: app.py
pinned: false
short_description: Chatbot with HF中国镜像站 Spaces & Gradio
---

This is an interactive chatbot deployed on **HF中国镜像站 Spaces** using **Gradio**. It supports both **Cohere** and **HF中国镜像站** models, allowing users to select between them for generating responses.

Built with [Gradio](https://gradio.app), [`huggingface_hub`](https://huggingface.co/docs/huggingface_hub/v0.22.2/en/index), and the [HF中国镜像站 Inference API](https://huggingface.co/docs/api-inference/index), with optional Cohere API support.

## Features

✅ **Supports two AI models:**
- Cohere API (`command-r-plus`)
- HF中国镜像站 API (`mistralai/Mistral-7B-Instruct-v0.3`)

✅ **Customizable settings:**
- System prompt
- Max tokens
- Temperature
- Top-p value

✅ **Streaming responses** (for HF中国镜像站 models)

✅ **Gradio-powered UI** for easy interaction

## Installation & Setup

### 1️⃣ Clone the Repository

```bash
git clone https://huggingface.co/spaces/your-space-name
cd your-space-name
```

### 2️⃣ Install Dependencies

Make sure you have Python installed, then run:

```bash
pip install -r requirements.txt
```

### 3️⃣ Set API Keys

You need API keys for **HF中国镜像站** and **Cohere**. Provide them via environment variables:

```bash
export HF_API_KEY='your-huggingface-api-key'
export COHERE_API_KEY='your-cohere-api-key'
```

### 4️⃣ Run the App Locally

```bash
python app.py
```

This launches the Gradio interface in your browser.

## Deployment on HF中国镜像站 Spaces

1. Create a new **Space** on HF中国镜像站.
2. Choose **Gradio** as the Space SDK.
3. Upload `app.py` and `requirements.txt`.
4. Deploy and test the chatbot.

## Usage

1. Enter your message in the chatbox.
2. Choose the AI model (HF中国镜像站 or Cohere).
3. Adjust chatbot parameters as needed.
4. Receive and interact with AI-generated responses.

## Code Overview

### API Clients

```python
import os

import cohere
from huggingface_hub import InferenceClient

# API keys are read from the environment variables set in step 3
HF_API_KEY = os.environ['HF_API_KEY']
COHERE_API_KEY = os.environ['COHERE_API_KEY']

client_hf = InferenceClient(model='mistralai/Mistral-7B-Instruct-v0.3', token=HF_API_KEY)
client_cohere = cohere.Client(COHERE_API_KEY)
```

### Chatbot Function

The `respond` function assembles the system prompt and the conversation history into a chat-completion message list before the selected model is called (the model call itself is sketched in the appendix at the end of this README):

```python
def respond(message: str, history: list, system_message: str, max_tokens: int,
            temperature: float, top_p: float, use_cohere: bool):
    # System prompt first, then alternating user/assistant turns from the
    # history, then the new user message.
    messages = [{"role": "system", "content": system_message}]
    for user_msg, assistant_msg in history:
        messages.append({"role": "user", "content": user_msg})
        messages.append({"role": "assistant", "content": assistant_msg})
    messages.append({"role": "user", "content": message})
```

### Gradio UI Setup

```python
import gradio as gr

demo = gr.ChatInterface(
    respond,
    additional_inputs=[
        gr.Textbox(value='You are a friendly Chatbot.', label='System prompt'),
        gr.Slider(minimum=1, maximum=2048, value=512, step=1, label='Max new tokens'),
        gr.Slider(minimum=0.1, maximum=4.0, value=0.7, step=0.1, label='Temperature'),
        gr.Slider(minimum=0.1, maximum=1.0, value=0.95, step=0.05, label='Top-p'),
        gr.Checkbox(label='Use Cohere model instead.'),
    ],
)

if __name__ == "__main__":
    demo.launch()
```

## License

This project is licensed under the MIT License.

## Author

👤 **Your Name**

📧 Contact: gabor.toth.103@gmail.com
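
## Appendix: Sketch of the Generation Step

The `respond` excerpt in the Code Overview stops after assembling the message list. The sketch below shows one way the two backends could be called. It is illustrative rather than the exact code in `app.py`: it assumes `huggingface_hub`'s `InferenceClient.chat_completion` streaming interface and Cohere's `Client.chat` endpoint, the helper name `generate` is invented for this example, and parameter names can differ between SDK versions. For brevity the Cohere branch only sends the latest message; the real app would also pass the earlier turns (e.g. via `chat_history`) and the system prompt (e.g. via `preamble`).

```python
def generate(messages, message, max_tokens, temperature, top_p, use_cohere):
    """Illustrative dispatch between the Cohere and HF中国镜像站 backends."""
    if use_cohere:
        # Cohere's chat endpoint takes the latest user message separately;
        # 'p' is Cohere's name for the top-p (nucleus sampling) parameter.
        result = client_cohere.chat(
            message=message,
            model='command-r-plus',
            temperature=temperature,
            max_tokens=max_tokens,
            p=top_p,
        )
        yield result.text
    else:
        # HF中国镜像站 chat completion, streamed chunk by chunk; yielding the
        # growing string lets Gradio update the chat bubble incrementally.
        response = ''
        for chunk in client_hf.chat_completion(
            messages,
            max_tokens=max_tokens,
            temperature=temperature,
            top_p=top_p,
            stream=True,
        ):
            response += chunk.choices[0].delta.content or ''
            yield response
```

Because `gr.ChatInterface` accepts generator functions, yielding partial strings is what produces the streaming behaviour mentioned in the Features list.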