Configure OpenRouter BYOK and local AI models
Free OpenRouter model IDs (no-cost tier, ":free" suffix):
google/gemini-2.0-flash-exp:free
mistralai/mistral-small-3.1-24b-instruct:free
Paid OpenRouter model IDs:
anthropic/claude-sonnet-4
google/gemini-2.5-flash
deepseek/deepseek-chat-v3-0324
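OpenRouter exposes an OpenAI-compatible chat-completions API, so any of the model IDs above can be used by posting to its `/chat/completions` endpoint with your own key (that is the BYOK part). A minimal sketch of the request shape, assuming a placeholder key; the endpoint URL and header format are OpenRouter's documented ones:

```python
import json

OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions"

def build_openrouter_request(api_key: str, model: str, prompt: str):
    """Build the URL, headers, and JSON body for an OpenRouter chat completion."""
    headers = {
        "Authorization": f"Bearer {api_key}",  # your own OpenRouter key (BYOK)
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return OPENROUTER_URL, headers, body

# Example with one of the free-tier model IDs above;
# "OPENROUTER_API_KEY" is a placeholder, not a real key.
url, headers, body = build_openrouter_request(
    "OPENROUTER_API_KEY",
    "google/gemini-2.0-flash-exp:free",
    "Say hello",
)
```

The same request shape works for the paid IDs; only the `model` string changes.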
Other model IDs (provider APIs or local runtimes):
codestral-2501
deepseek-r1-distill-qwen-7b
Ollama (local model runtime): pull a model, then the server is available at its default endpoint:
ollama pull codellama
Default API endpoint: http://localhost:11434
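Once a model is pulled, the Ollama server at http://localhost:11434 accepts generation requests on its `/api/generate` endpoint. A minimal sketch of the request payload, assuming the `codellama` model pulled above; the prompt text is illustrative:

```python
import json

OLLAMA_URL = "http://localhost:11434/api/generate"

def build_ollama_request(model: str, prompt: str):
    """Build the URL and JSON body for Ollama's /api/generate endpoint."""
    body = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # return a single JSON object instead of a token stream
    }).encode()
    return OLLAMA_URL, body

url, body = build_ollama_request("codellama", "Write a hello-world in C")
```

Setting `"stream": False` is convenient for one-shot scripts; editors and chat UIs usually leave streaming on.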
vLLM (local OpenAI-compatible server, listens on port 8000 by default):
pip install vllm
python -m vllm.entrypoints.openai.api_server --model codellama/CodeLlama-7b-Instruct-hf
(The plain vllm.entrypoints.api_server is a legacy demo server; the openai.api_server entrypoint is the one that speaks the OpenAI API that editors expect.)
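Because vLLM's server speaks the OpenAI chat-completions protocol, clients target it just like OpenRouter, only with a local base URL and the Hugging Face model ID. A minimal sketch, assuming the default port 8000 and the CodeLlama model served above:

```python
import json

# vLLM's OpenAI-compatible server listens on localhost:8000 by default.
VLLM_BASE = "http://localhost:8000/v1"

def build_vllm_request(model: str, prompt: str):
    """Build the URL and JSON body for vLLM's OpenAI-compatible endpoint."""
    url = f"{VLLM_BASE}/chat/completions"
    body = json.dumps({
        "model": model,  # must match the --model flag the server was started with
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return url, body

url, body = build_vllm_request("codellama/CodeLlama-7b-Instruct-hf", "Say hi")
```

No API key is required for a default local vLLM server, which is the main practical difference from the hosted OpenRouter setup.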