Abort when coroutine is cancelled #1020


Merged (2 commits, Sep 15, 2023)

Conversation

@rucyang (Contributor) commented Sep 12, 2023

There is a scenario: while running async for result in AsyncLLMEngine.generate(...), the caller may cancel the coroutine at a higher level (e.g. the Sanic framework cancels the request-handling coroutine when the transport connection is disconnected). Should we catch asyncio.CancelledError and abort the background token generation?
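The scenario above can be sketched as a minimal, self-contained example. The names here (generate, handle_request, abort_log, "req-0") are hypothetical stand-ins for AsyncLLMEngine.generate() and the engine's abort path, not the actual vLLM code:

```python
import asyncio

# A minimal sketch of the scenario above, with hypothetical names
# (generate() stands in for AsyncLLMEngine.generate(); it is NOT vLLM code).
async def generate(request_id: str, abort_log: list):
    try:
        for token in ["Hello", "world"]:
            await asyncio.sleep(0.05)  # background generation step
            yield token
    except (Exception, asyncio.CancelledError):
        # On Python 3.8+, CancelledError subclasses BaseException, so a
        # bare `except Exception` would skip this cleanup on cancellation.
        abort_log.append(request_id)  # stands in for engine.abort(request_id)
        raise

async def main():
    abort_log = []

    async def handle_request():
        # A framework like Sanic would stream these tokens to the client.
        async for _token in generate("req-0", abort_log):
            pass

    task = asyncio.create_task(handle_request())
    await asyncio.sleep(0.08)  # let the first token through
    task.cancel()              # simulates the client disconnecting
    try:
        await task
    except asyncio.CancelledError:
        pass
    return abort_log

abort_log = asyncio.run(main())
print(abort_log)  # ['req-0'] -- cleanup ran despite the cancellation
```

Cancelling the consumer task raises CancelledError at the await point inside the generator, which is exactly where the abort hook needs to run.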

@@ -385,8 +385,8 @@ async def generate(
             async for request_output in stream:
                 yield request_output
-        except Exception as e:
+        except (Exception, asyncio.CancelledError) as e:
             # If there is an exception, abort the request.
Member:
Why is this needed? I believe Exception should already capture all exceptions?

Collaborator:

Changed in version 3.8: CancelledError is now a subclass of BaseException rather than Exception.

Apparently not; I guess I tripped over that too.
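The docs quote can be verified directly (a quick check, assuming Python 3.8 or later):

```python
import asyncio

# On Python 3.8+, asyncio.CancelledError derives from BaseException rather
# than Exception, so `except Exception` alone will not catch a cancellation.
caught_by_exception = issubclass(asyncio.CancelledError, Exception)
caught_by_base = issubclass(asyncio.CancelledError, BaseException)
print(caught_by_exception, caught_by_base)  # False True on Python 3.8+
```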

@Yard1 (Collaborator) left a comment

lgtm

@rucyang rucyang requested a review from zhuohan123 September 14, 2023 02:02
@zhuohan123 (Member) left a comment

LGTM! Thanks for the fix!

@zhuohan123 zhuohan123 merged commit b9fe461 into vllm-project:main Sep 15, 2023
hongxiayang pushed a commit to hongxiayang/vllm that referenced this pull request Feb 13, 2024
pi314ever pushed a commit to pi314ever/vllm that referenced this pull request Apr 12, 2025
…vllm-project#1020)

Twin PR: HabanaAI/vllm-hpu-extension#135

APC will use our implementation of attn

---------

Co-authored-by: Michał Kuligowski <[email protected]>

3 participants