runtime error
Exit code: 1.

Locals at the point of failure:
    MODEL           = 'meta-llama/Meta-Llama-3.1-8B-Instruct'
    OLLAMA_BASE_URL = None
    OPENAI_BASE_URL = None
    TOKEN_INDEX     = 1
    TOKENIZER_ID    = None
    VLLM_BASE_URL   = None
    math            = <module 'math' from '/usr/local/lib/python3.10/lib-dynload/math.cpyt…
    random          = <module 'random' from '/usr/local/lib/python3.10/random.py'>

Exception: Error loading InferenceEndpointsLLM: You are trying to access a
gated repo. Make sure to have access to it at
https://huggingface.co/meta-llama/Meta-Llama-3.1-8B-Instruct.
401 Client Error. (Request ID: Root=1-67818566-4dbaad342487c99a5dce4279;032326b2-915c-450a-b0b7-2bb272746e7e)
Cannot access gated repo for url
https://huggingface.co/meta-llama/Meta-Llama-3.1-8B-Instruct/resolve/main/config.json.
Access to model meta-llama/Llama-3.1-8B-Instruct is restricted. You must have
access to it and be authenticated to access it. Please log in.
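The 401 is not a code bug: the model repo is gated, so the process needs a Hugging Face token from an account that has been granted access on the model page. A minimal sketch of the credential check, assuming the token is passed into the container via the conventional HF_TOKEN (or HUGGING_FACE_HUB_TOKEN) environment variable:

```python
import os

def resolve_hf_token() -> str:
    """Return the Hugging Face token, or fail loudly like the traceback above.

    Assumption: the runtime forwards environment variables into the
    container; HF_TOKEN and HUGGING_FACE_HUB_TOKEN are the variable
    names the Hugging Face client libraries conventionally read.
    """
    token = os.environ.get("HF_TOKEN") or os.environ.get("HUGGING_FACE_HUB_TOKEN")
    if not token:
        # Without a token, any request to the gated repo's files
        # (e.g. resolve/main/config.json) returns 401, as seen above.
        raise RuntimeError(
            "No Hugging Face token found. Request access to "
            "meta-llama/Meta-Llama-3.1-8B-Instruct on its model page, "
            "then set HF_TOKEN before starting the pipeline."
        )
    return token

os.environ["HF_TOKEN"] = "hf_xxx"  # placeholder value for illustration only
print(resolve_hf_token())
```

After access is granted and the token is set, the same InferenceEndpointsLLM load should get past the 401.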