Reduce log spam when client stops downloading media while it is being streamed to them (ConsumerStopProducingError) (#18699)

The case where a consumer stops downloading media that is currently
being streamed to them can now be handled explicitly.
That scenario isn't really an error; it is expected behaviour.

This PR adds a custom exception which allows us to drop the log level
for this specific case from `WARNING` to `INFO`.
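
To illustrate the pattern outside of Synapse, here is a minimal, self-contained sketch
(the `FakeSender` / `stream` helpers and the `__main__` driver are hypothetical and for
illustration only; the real change is in the diff below): the producer fails the in-flight
write with the dedicated exception when the consumer asks it to stop, and the caller
catches that case before the generic `except Exception` handler so it can be logged quietly.

```python
import logging

from twisted.internet import defer

logger = logging.getLogger(__name__)


class ConsumerRequestedStopError(Exception):
    """A consumer asked us to stop producing."""


class FakeSender:
    """Illustrative stand-in for a push producer streaming a file to a consumer."""

    def __init__(self) -> None:
        self.deferred: defer.Deferred = defer.Deferred()

    def stopProducing(self) -> None:
        # Fail the in-flight write with the dedicated exception rather than a
        # bare `Exception`, so callers can tell "client went away" apart from
        # genuine errors.
        if not self.deferred.called:
            self.deferred.errback(
                ConsumerRequestedStopError("Consumer asked us to stop producing")
            )

    async def write_to_consumer(self) -> None:
        await self.deferred


async def stream(sender: FakeSender) -> None:
    try:
        await sender.write_to_consumer()
    except ConsumerRequestedStopError as e:
        # Expected behaviour: the client stopped downloading mid-stream.
        logger.debug("Failed to write to consumer: %s %s", type(e), e)
    except Exception as e:
        # Genuinely unexpected failures still get a warning.
        logger.warning("Failed to write to consumer: %s %s", type(e), e)


if __name__ == "__main__":
    sender = FakeSender()
    sender.stopProducing()  # simulate the client going away mid-download
    # The deferred has already failed, so the coroutine completes synchronously.
    defer.ensureDeferred(stream(sender))
```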


### Pull Request Checklist

<!-- Please read
https://element-hq.github.io/synapse/latest/development/contributing_guide.html
before submitting your pull request -->

* [X] Pull request is based on the develop branch
* [X] Pull request includes a [changelog
file](https://element-hq.github.io/synapse/latest/development/contributing_guide.html#changelog).
The entry should:
  - Be a short description of your change which makes sense to users.
    "Fixed a bug that prevented receiving messages from other servers."
    instead of "Moved X method from `EventStore` to `EventWorkerStore`.".
  - Use markdown where necessary, mostly for `code blocks`.
  - End with either a period (.) or an exclamation mark (!).
  - Start with a capital letter.
  - Feel free to credit yourself, by adding a sentence "Contributed by
    @github_username." or "Contributed by [Your Name]." to the end of the
    entry.
* [X] [Code
style](https://element-hq.github.io/synapse/latest/code_style.html) is
correct (run the
[linters](https://element-hq.github.io/synapse/latest/development/contributing_guide.html#run-the-linters))

---------

Co-authored-by: Eric Eastwood <erice@element.io>
Devon Hudson 2025-07-21 20:11:46 +00:00 committed by GitHub
parent 11a11414c5
commit 4e118aecd0
2 changed files with 18 additions and 9 deletions

changelog.d/18699.misc (new file, 1 addition)

@@ -0,0 +1 @@
Reduce log spam when client stops downloading media while it is being streamed to them.


@@ -380,12 +380,13 @@ async def respond_with_multipart_responder(
    try:
        await responder.write_to_consumer(multipart_consumer)
    except ConsumerRequestedStopError as e:
        logger.debug("Failed to write to consumer: %s %s", type(e), e)
        # Unregister the producer, if it has one, so Twisted doesn't complain
        if request.producer:
            request.unregisterProducer()
    except Exception as e:
        # The majority of the time this will be due to the client having gone
        # away. Unfortunately, Twisted simply throws a generic exception at us
        # in that case.
        logger.warning("Failed to write to consumer: %s %s", type(e), e)
        # Unregister the producer, if it has one, so Twisted doesn't complain
        if request.producer:
            request.unregisterProducer()
@@ -426,12 +427,13 @@ async def respond_with_responder(
    add_file_headers(request, media_type, file_size, upload_name)
    try:
        await responder.write_to_consumer(request)
    except ConsumerRequestedStopError as e:
        logger.debug("Failed to write to consumer: %s %s", type(e), e)
        # Unregister the producer, if it has one, so Twisted doesn't complain
        if request.producer:
            request.unregisterProducer()
    except Exception as e:
        # The majority of the time this will be due to the client having gone
        # away. Unfortunately, Twisted simply throws a generic exception at us
        # in that case.
        logger.warning("Failed to write to consumer: %s %s", type(e), e)
        # Unregister the producer, if it has one, so Twisted doesn't complain
        if request.producer:
            request.unregisterProducer()
@@ -674,6 +676,10 @@ def _parseparam(s: bytes) -> Generator[bytes, None, None]:
        s = s[end:]


class ConsumerRequestedStopError(Exception):
    """A consumer asked us to stop producing"""


@implementer(interfaces.IPushProducer)
class ThreadedFileSender:
    """
@@ -751,7 +757,9 @@ class ThreadedFileSender:
        self.wakeup_event.set()

        if not self.deferred.called:
            self.deferred.errback(Exception("Consumer asked us to stop producing"))
            self.deferred.errback(
                ConsumerRequestedStopError("Consumer asked us to stop producing")
            )

    async def start_read_loop(self) -> None:
        """This is the loop that drives reading/writing"""