
Add lossy queue to queue library module #51586

Closed
bpb mannequin opened this issue Nov 17, 2009 · 4 comments
Assignees
Labels
stdlib (Python modules in the Lib dir) · type-feature (A feature request or enhancement)

Comments


bpb mannequin commented Nov 17, 2009

BPO 7337
Nosy @rhettinger

Note: these values reflect the state of the issue at the time it was migrated and might not reflect the current state.


GitHub fields:

assignee = 'https://github.com/rhettinger'
closed_at = <Date 2010-04-03.03:17:50.785>
created_at = <Date 2009-11-17.10:40:39.579>
labels = ['type-feature', 'library']
title = 'Add lossy queue to queue library module'
updated_at = <Date 2010-04-03.03:17:50.783>
user = 'https://bugs.python.org/bpb'

bugs.python.org fields:

activity = <Date 2010-04-03.03:17:50.783>
actor = 'rhettinger'
assignee = 'rhettinger'
closed = True
closed_date = <Date 2010-04-03.03:17:50.785>
closer = 'rhettinger'
components = ['Library (Lib)']
creation = <Date 2009-11-17.10:40:39.579>
creator = 'bpb'
dependencies = []
files = []
hgrepos = []
issue_num = 7337
keywords = []
message_count = 4.0
messages = ['95374', '95394', '95415', '102217']
nosy_count = 2.0
nosy_names = ['rhettinger', 'bpb']
pr_nums = []
priority = 'normal'
resolution = 'rejected'
stage = None
status = 'closed'
superseder = None
type = 'enhancement'
url = 'https://bugs.python.org/issue7337'
versions = ['Python 3.2']


bpb mannequin commented Nov 17, 2009

Many applications would benefit from 'connectionless' queues, i.e. queues
whose writers don't need to care whether anything is reading from the
other end. With the current queue module classes this is not practical,
because the choice is between unbounded memory consumption (an unlimited
queue) and blocking on put (a bounded one). I propose adding a
'LossyQueue' class to the queue module which would allow bounded memory
consumption without blocking on put: beyond a certain limit, the oldest
items are dropped in FIFO order. In my view this is at least as natural
an extension as the PriorityQueue and LifoQueue classes already in that
module.

Outline as follows:

from collections import deque
from queue import Queue

class LossyQueue(Queue):
    "Queue subclass which drops the oldest items on overflow"
    def _init(self, maxsize):
        if maxsize > 0:
            # a bounded deque silently discards items from the
            # opposite end once maxlen is reached
            self.queue = deque(maxlen=maxsize)
        else:
            # unbounded, same as a normal Queue instance
            self.queue = deque()
        # the deque alone enforces maxsize, so we report none
        # to the Queue machinery and put() never blocks
        self.maxsize = 0

If there is interest in this I will offer a proper patch with docs and
tests.
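As a quick sanity check, the outline above can be exercised directly. This is a sketch against Python 3's queue module; the class body is repeated so the snippet is self-contained:

```python
from collections import deque
from queue import Queue

class LossyQueue(Queue):
    "Queue subclass which drops the oldest item on overflow (sketch)."
    def _init(self, maxsize):
        # a bounded deque discards from the far end when full
        self.queue = deque(maxlen=maxsize) if maxsize > 0 else deque()
        # the deque enforces the bound, so Queue's own check is disabled
        self.maxsize = 0

q = LossyQueue(maxsize=3)
for i in range(5):
    q.put(i)          # never blocks, even with no consumer attached
print(list(q.queue))  # -> [2, 3, 4]; the two oldest items were dropped
```

Note that Queue.__init__ calls _init() and then the Queue's internal bookkeeping sees maxsize as 0, so put() takes the unbounded fast path while the deque does the dropping.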

@bpb bpb mannequin added stdlib Python modules in the Lib dir type-feature A feature request or enhancement labels Nov 17, 2009
@rhettinger
Contributor

I'm curious about your "many applications would benefit from
'connectionless' queues". What do you have in mind? Is there any
reason those apps cannot use collections.deque() directly?
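For reference, collections.deque(maxlen=...) on its own already provides the bounded, lossy producer behaviour; what it lacks is queue.Queue's blocking consumer API. A minimal sketch of the difference:

```python
from collections import deque

# deque with maxlen gives lossy, bounded, thread-safe appends:
events = deque(maxlen=3)
for i in range(5):
    events.append(i)   # never blocks; the oldest entries fall off
print(list(events))    # -> [2, 3, 4]

# but the consumer side differs from queue.Queue:
# popleft() on an empty deque raises IndexError instead of blocking
try:
    deque(maxlen=3).popleft()
except IndexError:
    print("an empty deque raises rather than blocking")
```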

@rhettinger rhettinger self-assigned this Nov 17, 2009

bpb mannequin commented Nov 18, 2009

'Connectionless' comes from the analogy I see with UDP (vs. TCP). As for
why not just use a deque: it is primarily about having the same API. A
client (getter) of the queue shouldn't know or care whether it is a
'lossy' queue or a normal queue. I guess most uses of a normal queue
(excepting the 'task' functions) could use a deque directly, but it
wouldn't feel natural.

Use cases: non-critical event/status reporting is my canonical example.
Specific examples:

  • a program which executes a long-running process in a thread and wants
    to update a GUI progress bar or similar, which must occur in a different
    thread because of the GUI model. By using a LossyQueue, the server
    thread is simplified; it doesn't have to care whether anything is
    listening on the other end, allowing greater decoupling (e.g. no changes
    are required if there isn't a GUI). LossyQueues become part of the
    interface, which can be used or not as required.
  • emulating/providing wrapper around UDP sockets
  • many application protocols support a get/set/report style interface
    with the addition of asynchronous events (e.g. SNMP, Netconf, SCPI). In
    these types of application a suitable abstraction might be normal
    Queues for the standard commands and a LossyQueue for the events
    (which some applications might not care about). The point is that to
    the user of this abstraction, the two interfaces look the same.

The 'server' doesn't care whether a client is listening or not (it won't
block and it won't use unlimited memory). The 'client', if it chooses to
use the queue, doesn't know that it isn't a normal queue (same API).
This decouples the server and client tasks.
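The first use case can be sketched concretely. This is a hypothetical progress reporter reusing the LossyQueue outline from the opening comment; the names and the maxsize=1 "latest value only" choice are illustrative, not part of the proposal:

```python
import threading
from collections import deque
from queue import Queue

class LossyQueue(Queue):
    "Queue subclass which drops the oldest item on overflow (sketch)."
    def _init(self, maxsize):
        self.queue = deque(maxlen=maxsize) if maxsize > 0 else deque()
        self.maxsize = 0  # the deque enforces the bound itself

progress = LossyQueue(maxsize=1)  # keep only the latest status

def worker():
    # long-running task; reports progress without caring whether
    # any GUI (or anything else) is listening on the other end
    for pct in range(0, 101, 20):
        progress.put(pct)  # never blocks, never grows unbounded

t = threading.Thread(target=worker)
t.start()
t.join()

# an optional consumer (e.g. a GUI timer) polls the most recent value
print(progress.get_nowait())  # -> 100; earlier updates were dropped
```

If no consumer ever attaches, the queue simply holds the single most recent value, which is the decoupling the comment describes.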

@rhettinger
Contributor

I think it would be better to put this in the ASPN recipes cookbook to let it mature and gather a following. Right now, it is not at all clear that this is the right thing to do.

@ezio-melotti ezio-melotti transferred this issue from another repository Apr 10, 2022