
Noelia Molina
@silverostrich706292 · Jan 12, 2026

I forked Andrej Karpathy's LLM Council and added a modern UI & settings page, multi-AI API support, and web search providers

Hey everyone!
I recently spent a couple of weekends improving Karpathy's excellent open-source LLM Council project.
The [original project](https://github.com/karpathy/llm-council) was brilliant but lacked usability and flexibility imho.
**What I added:**
* Web search integration (DuckDuckGo, Tavily, Brave, Jina AI)
* A clean, modern UI with a settings page
* Support for multiple API providers (OpenRouter, Anthropic, OpenAI, Google, etc.)
* Customizable system prompts and temperature controls (the custom prompts open up tons of use cases beyond a "council")
* Export & Import of councils, prompts, and settings (for backup and even sharing)
* Adjustable council size (from 1 to 8; the original only supported 3)
* Full Ollama support for local models
* "I'm Feeling Lucky" random model selector
* Filter to free-only models on OpenRouter (although rate limits can be an issue)
* Control over the process: simple parallel querying of multiple models (Chat Only); Chat & peer rating, where models rate each other's responses; or full end-to-end deliberation, where the Chairman model makes the final decision on the best answer
You can compare up to 8 models simultaneously.
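For anyone curious how the three process modes relate, they map onto stages of one pipeline: parallel querying, then peer rating, then chairman deliberation. Here is a minimal Python sketch of that flow; every name in it (including the stub `ask`) is illustrative, not the fork's actual API, and real model calls stand in for flat scores:

```python
def ask(model, prompt):
    # Stand-in for a real API call (OpenRouter, Ollama, etc.);
    # a real version would hit the provider's chat endpoint.
    return f"{model}: answer to {prompt!r}"

def chat_only(models, question):
    # Stage 1 (Chat Only): ask every council member the same question.
    # The fork would do this in parallel; sequential here for clarity.
    return {m: ask(m, question) for m in models}

def peer_rate(models, answers):
    # Stage 2 (Chat & peer rating): each model scores every *other*
    # model's answer. This stub awards a flat 1.0; a real version
    # would prompt each rater to score the answer.
    ratings = {}
    for rater in models:
        for author, answer in answers.items():
            if author != rater:
                ratings.setdefault(author, []).append(1.0)
    return {a: sum(s) / len(s) for a, s in ratings.items()}

def full_deliberation(models, chairman, question):
    # Stage 3 (full deliberation): the Chairman sees all answers and
    # ratings and produces the final decision.
    answers = chat_only(models, question)
    scores = peer_rate(models, answers)
    best = max(scores, key=scores.get)
    return ask(chairman, f"confirm best answer: {answers[best]!r}")
```

Each mode just stops the pipeline earlier: Chat Only is stage 1, peer rating adds stage 2, and full deliberation runs all three.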
