Fix exponential backoff for provider: {anthropic,openai}#12421

Open
leegarrett wants to merge 1 commit into continuedev:main from leegarrett:main

Conversation

@leegarrett leegarrett commented May 15, 2026

Exponential backoff was broken for the anthropic provider. The reason is that `ParseError()` returned a plain `Error` with no `.response` property, while `withExponentialBackoff()` only triggers the backoff logic when `error.response?.status == 429`. Without that property, it immediately rethrew the error. With this fix, it correctly backs off on an HTTP 429 response.
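The failure mode can be illustrated with a minimal sketch. `ApiError` is a hypothetical name for illustration, not the actual class in the Continue codebase:

```typescript
// Before the fix: a plain Error carries no `.response`, so
// `error.response?.status` evaluates to undefined and the retry wrapper
// rethrows immediately instead of backing off on a 429.
const plain = new Error("HTTP 429: rate limited");

// After the fix: the thrown error exposes the response, so the retry
// wrapper can detect the 429 status code and back off.
class ApiError extends Error {
  constructor(
    message: string,
    public response: { status: number },
  ) {
    super(message);
  }
}

const withStatus = new ApiError("rate limited", { status: 429 });
```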

`provider: openai` bypasses `BaseLLM.fetch()` and uses the openai SDK.

Fix the tests so that they pass again.

Some APIs have a per-minute rate limit, so increase the max retries in both codepaths to 8, so that it retries for up to 127.5 s before failing.

I have built the extension and verified both codepaths now work with the rate-limited API in question.
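For context, a hedged sketch of the retry wrapper described above, assuming a 500 ms initial delay that doubles on each retry (the names and defaults here are illustrative, not the exact Continue implementation):

```typescript
// Illustrative sketch of an exponential-backoff wrapper that only retries
// on HTTP 429. With maxRetries = 8 and a 500 ms initial delay, the waits
// are 0.5s, 1s, 2s, 4s, 8s, 16s, 32s, 64s — 127.5s in total.
async function withExponentialBackoff<T>(
  fn: () => Promise<T>,
  maxRetries = 8,
  initialDelayMs = 500,
): Promise<T> {
  for (let attempt = 0; ; attempt++) {
    try {
      return await fn();
    } catch (error: any) {
      // Only back off on HTTP 429; rethrow everything else immediately.
      // This is why the error must carry a `.response` property.
      if (error.response?.status !== 429 || attempt >= maxRetries) {
        throw error;
      }
      const delay = initialDelayMs * 2 ** attempt;
      await new Promise((resolve) => setTimeout(resolve, delay));
    }
  }
}
```

Note the total retry budget: the delays form a geometric series, 500 ms × (2^8 − 1) = 127,500 ms = 127.5 s, matching the figure above.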

Description

[ What changed? Feel free to be brief. ]

AI Code Review

  • Team members only: AI review runs automatically when PR is opened or marked ready for review
  • Team members can also trigger a review by commenting @continue-review

Checklist

  • I've read the contributing guide
  • The relevant docs, if any, have been updated or created
  • The relevant tests, if any, have been updated or created

Screen recording or screenshot

[ When applicable, please include a short screen recording or screenshot - this makes it much easier for us as contributors to review and understand your changes. See this PR as a good example. ]

Tests

[ What tests were added or updated to ensure the changes work as expected? ]

@leegarrett leegarrett requested a review from a team as a code owner May 15, 2026 22:40
@leegarrett leegarrett requested review from sestinj and removed request for a team May 15, 2026 22:40
@dosubot dosubot Bot added the size:XS This PR changes 0-9 lines, ignoring generated files. label May 15, 2026

github-actions Bot commented May 15, 2026

All contributors have signed the CLA ✍️ ✅
Posted by the CLA Assistant Lite bot.


@cubic-dev-ai cubic-dev-ai Bot left a comment


No issues found across 3 files


@leegarrett
Author

I have read the CLA Document and I hereby sign the CLA.

@leegarrett
Author

recheck

@leegarrett
Author

recheck

