Engineering · Mar 27, 2026

Staying Current: Deprecating Old Haiku Models in Sabine

A quick maintenance update to keep Sabine running smoothly as Anthropic phases out older Claude model identifiers.

We shipped a small but important fix to Sabine this week: removing references to deprecated Claude Haiku model IDs. It's the kind of maintenance work that doesn't make headlines but keeps the platform reliable.

What Changed

Anthropic periodically updates their model identifiers as they improve and version their Claude models. Older model IDs get deprecated—they still work for a while, but eventually they're turned off. We identified several references to deprecated Haiku model identifiers in Sabine's orchestration layer and replaced them with current equivalents.
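The mechanics of a swap like this are simple. As a sketch (the mapping below and the `resolve_model_id` helper are illustrative, not the actual code in sabine-super-agent), the fix amounts to routing any lingering deprecated IDs to their current equivalents:

```python
# Hypothetical mapping from deprecated Haiku model IDs to current
# equivalents. The real IDs touched in sabine-super-agent may differ.
DEPRECATED_MODEL_MAP = {
    "claude-3-haiku-20240307": "claude-3-5-haiku-20241022",
}

def resolve_model_id(model_id: str) -> str:
    """Return the current equivalent of a possibly-deprecated model ID.

    IDs that are not deprecated pass through unchanged.
    """
    return DEPRECATED_MODEL_MAP.get(model_id, model_id)
```

In practice we removed the deprecated IDs entirely rather than aliasing them at runtime, but a mapping like this is a useful safety net during the transition.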

This isn't just about keeping up with API changes. When model IDs are deprecated, they're on borrowed time. Waiting until they fail in production means user-facing disruptions. Proactively updating them keeps Sabine running smoothly and ensures we're using the most current, optimized versions of Claude's models.

A Quick Note on Architecture

Sabine is our AI partnership platform—a consumer product separate from Strug Works. Under the hood, Sabine uses Strug Works as its orchestration backend, which means changes to Sabine's agent layer happen in the sabine-super-agent repository. This particular fix touched the model selection logic that determines which Claude variant handles different types of reasoning tasks.
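To make "model selection logic" concrete, here is a minimal sketch of the pattern: a table keyed by task type, with a capable default. The task categories and model choices below are assumptions for illustration; the actual routing in sabine-super-agent is more involved.

```python
# Illustrative task-to-model table. Fast, cheap tasks go to Haiku;
# heavier reasoning goes to Sonnet. Categories are hypothetical.
DEFAULT_MODEL = "claude-3-5-sonnet-20241022"

TASK_MODELS = {
    "classification": "claude-3-5-haiku-20241022",  # low latency, low cost
    "summarization": "claude-3-5-haiku-20241022",
    "planning": "claude-3-5-sonnet-20241022",       # deeper reasoning
}

def select_model(task_type: str) -> str:
    """Pick the Claude variant for a task, falling back to the default."""
    return TASK_MODELS.get(task_type, DEFAULT_MODEL)
```

Centralizing the table like this is what made the fix small: updating a deprecated ID is a one-line change rather than a hunt through call sites.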

User Impact

For users, nothing changes visibly. Responses remain consistent, latency stays the same, and workflows continue uninterrupted. That's exactly the goal: infrastructure upgrades that are invisible because they're handled before they become problems.

What's Next

We're continuing to monitor Anthropic's model release cadence and deprecation schedule. As new Claude variants become available, we'll evaluate them for improved performance or cost efficiency in different parts of the Sabine reasoning pipeline.

Longer term, we're exploring more sophisticated model routing logic that can automatically select the best-fit model based on task complexity, latency requirements, and cost constraints. Maintenance work like this lays the groundwork for that kind of intelligence.
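One way that routing could look, as a sketch only (the capability scores, latency budgets, and selection rule below are invented for illustration, not a design we've committed to): pick the cheapest model that satisfies the task's requirements.

```python
# Hypothetical model catalog: (model_id, relative_cost,
# relative_latency, capability). Numbers are illustrative.
MODELS = [
    ("claude-3-5-haiku-20241022", 1, 1, 1),
    ("claude-3-5-sonnet-20241022", 4, 2, 3),
]

def route(required_capability: int, max_latency: int) -> str:
    """Choose the cheapest model meeting capability and latency needs."""
    candidates = [
        m for m in MODELS
        if m[3] >= required_capability and m[2] <= max_latency
    ]
    if not candidates:
        # Nothing fits the latency budget: fall back to the most
        # capable model rather than failing the request.
        return max(MODELS, key=lambda m: m[3])[0]
    return min(candidates, key=lambda m: m[1])[0]
```

A router like this only works if the model catalog is current, which is exactly why deprecation hygiene matters.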

Shipping fast means staying current. This update keeps Sabine aligned with Anthropic's evolving API surface and ensures we're always running the most reliable, performant model versions available.