
Current LLM tech stack

Nov 29, 2024

because people ask, not because I recommend

November 2024:

For coding, I use Cody in my local IDE (VS Code) and whatever is on offer in cloud IDEs (recently Copilot and Replit AI).

For getting my code to talk with LLMs, I use Simon Willison's llm.
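For the curious, a call through llm's Python API looks roughly like this. This is just a sketch: it assumes an OpenAI key has already been configured (for example with `llm keys set openai`), and the prompt and system text are placeholders rather than anything I actually run.

```python
import llm

# Pick a model by name; llm resolves it to the configured provider.
model = llm.get_model("gpt-4o-mini")

# System prompt sets the context; the user prompt is the actual question.
response = model.prompt(
    "Explain this traceback in one short paragraph.",
    system="You are a concise debugging assistant.",
)

print(response.text())
```

The same tool works from the command line (`llm -m gpt-4o-mini "..."`), which is mostly how I use it.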

For running models in the cloud, I use Replicate. I've used Ollama locally, but honestly I haven't touched a local model in recent months. I use Draw Things to explore image-making locally (generative, but not an LLM).
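Calling a Replicate-hosted model from Python looks something like the sketch below. It assumes the `replicate` client is installed and a `REPLICATE_API_TOKEN` environment variable is set; the model slug and prompt are illustrative, not a recommendation.

```python
import replicate  # pip install replicate; reads REPLICATE_API_TOKEN from the environment

# Run a hosted language model; for text models Replicate streams back
# chunks of output, so join them into a single string.
output = replicate.run(
    "meta/meta-llama-3-8b-instruct",
    input={"prompt": "Summarise what a webhook is in two sentences."},
)

print("".join(output))
```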

For general use (including explaining code and learning new things), I currently use msty, which provides a chat-style interface to several AIs via their APIs. This lets me set up the context for a chat with system prompts, and handily means I can pay as I go, which is rather cheaper for me than a monthly subscription.

I'm currently using Claude 3.5 Sonnet, GPT-4o and GPT-4o mini via their APIs.
