
Comments (5)

nitsanw commented on May 22, 2024

For SPSC you cannot avoid an ordered write to the array, otherwise you lose the required ordering. You can delay it to make a bunch of elements visible at once, but I doubt there's much to win for SPSC.
On the MP/MC side you can limit contention and reduce overheads by batching inserts/reads, but this comes at an increased risk of bubbles in the queue, so it can hurt latency. You would have to test and see if it helps particular applications. It is an interesting option for the alternative interfaces set out under experimental.
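To illustrate the "delay it to make a bunch of elements visible" idea, here is a minimal single-producer sketch (illustrative only, not JCTools code, and a different design from the JCTools SPSC queues): the producer stores a batch into the array with plain writes and then publishes all of them with one release write of the producer index.

```java
import java.util.concurrent.atomic.AtomicLong;

// Hypothetical sketch: a Lamport-style SPSC ring buffer where a batch of
// plain element stores is published by a single ordered (release) write of
// the producer index. The release write orders the preceding plain stores,
// so the consumer sees fully written elements once it observes the new index.
final class BatchSpscBuffer<E> {
    private final Object[] buffer;
    private final int mask;
    private final AtomicLong producerIndex = new AtomicLong();
    private final AtomicLong consumerIndex = new AtomicLong();

    BatchSpscBuffer(int capacityPow2) {
        buffer = new Object[capacityPow2];
        mask = capacityPow2 - 1;
    }

    /** Offers up to batch.length elements; returns the number actually stored. */
    int offerBatch(E[] batch) {
        long p = producerIndex.get();
        long limit = consumerIndex.get() + buffer.length; // free-space bound
        int n = 0;
        while (n < batch.length && p < limit) {
            buffer[(int) (p & mask)] = batch[n]; // plain store, not yet visible
            p++;
            n++;
        }
        producerIndex.lazySet(p); // one ordered write publishes the whole batch
        return n;
    }

    @SuppressWarnings("unchecked")
    E poll() {
        long c = consumerIndex.get();
        if (c >= producerIndex.get()) return null; // empty
        int idx = (int) (c & mask);
        E e = (E) buffer[idx];
        buffer[idx] = null;
        consumerIndex.lazySet(c + 1);
        return e;
    }
}
```

The trade-off is exactly the one described above: elements sitting in an unpublished batch are invisible to the consumer, so batching trades latency for fewer ordered writes.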

On 22 Feb 2015, at 19:06, ChristianWulf [email protected] wrote:

Could we save some volatile index and array updates by using a batched/buffered add for a producer in a high-throughput context? Ideally, we could just wrap/decorate an existing SpSc queue and adapt the element-adding behavior.



from jctools.

ChristianWulf commented on May 22, 2024

Yes, what you call delay is what I mean. Why do you doubt a throughput improvement for SPSC? I have scenarios where 1,000-100,000 elements per second are passed through an SPSC queue.


nitsanw commented on May 22, 2024

I recommend you test and measure rather than speculate. On current hardware the SPSC queue has been measured to deliver throughput of 350-470M messages per second, so 1K-100K per second is not going to be an issue.
For SPSC the cost of offer/poll is down to 2-3ns; there's really very little left to shave, and I doubt there are many real-world use cases where an improvement would make a difference to an application. I'm very happy to be proved wrong though.
As I said, there's value here for the MP/MC cases. The interface needs to be considered carefully, and it would be easier to deliver as part of the new queue interfaces under experimental.
Note that a 'decorator' pattern doesn't really work in these cases, as the decorator has no way to claim a batch of slots. This will require some internal knowledge/access, so you are looking at extending the existing classes.
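To make the decorator limitation concrete, here is a hypothetical buffering decorator (illustrative, not a JCTools class): it can amortise call overhead by flushing a local array, but each element still goes through offer() on the delegate, so it cannot claim a contiguous block of slots or merge the per-element ordered writes.

```java
import java.util.ArrayDeque;
import java.util.Queue;

// Hypothetical decorator: buffers elements locally and flushes in batches.
// It saves per-call overhead, but the delegate queue still performs one
// offer() (with its ordered write) per element - no batch slot claim is possible
// from the outside.
final class BufferingProducer<E> {
    private final Queue<E> delegate;
    private final Object[] buf;
    private int count;

    BufferingProducer(Queue<E> delegate, int batchSize) {
        this.delegate = delegate;
        this.buf = new Object[batchSize];
    }

    @SuppressWarnings("unchecked")
    void add(E e) {
        buf[count++] = e;
        if (count == buf.length) flush(); // auto-flush when the batch is full
    }

    @SuppressWarnings("unchecked")
    void flush() {
        for (int i = 0; i < count; i++) {
            delegate.offer((E) buf[i]); // still one offer per element
        }
        count = 0;
    }
}
```

Claiming a batch of slots in one step requires access to the queue's internal index arithmetic, which is why extending the existing classes is suggested instead.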


nitsanw commented on May 22, 2024

Batch produce/consume interfaces which leave out the batch size are harder to reason about. The issues I see are around commitment:

  • If the consumer cannot handle the current element it is stuck: the element has already been removed from the queue and there's nowhere to return it to. It can notify the queue that it will be unable to handle the next element, though.
  • We have a catch-22 with the producer: if we take an element from the producer first, we can't return it, but we may not be able to actually put it in the queue. If we first claim a slot and then take from the producer, we may find the producer has nothing to give us.

With a declared batch size, in either case it is fine when the queue can only fulfil part of the batch, because the caller has committed either to producing enough elements to fill the claimed slots or to consuming the full size of the batch.
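The claim-then-pull resolution of the producer catch-22 can be sketched as a callback-style contract, in the spirit of what later became the MessagePassingQueue fill/drain API (the names and the trivial single-threaded backing below are illustrative, not JCTools code): the queue claims slots first and only then pulls elements from a supplier, so the caller's commitment is "I can produce on demand".

```java
import java.util.ArrayDeque;
import java.util.function.Consumer;
import java.util.function.Supplier;

// Sketch of a batch contract where the queue, not the caller, drives the
// element transfer. fill() claims free slots and then pulls exactly that
// many elements; drain() removes up to limit elements and hands each to
// the consumer, which has committed to accepting them.
interface BatchQueue<E> {
    int fill(Supplier<E> s, int limit);
    int drain(Consumer<E> c, int limit);
}

// Trivial single-threaded backing, just to demonstrate the contract.
final class DequeBatchQueue<E> implements BatchQueue<E> {
    private final ArrayDeque<E> q = new ArrayDeque<>();
    private final int capacity;

    DequeBatchQueue(int capacity) {
        this.capacity = capacity;
    }

    public int fill(Supplier<E> s, int limit) {
        int n = Math.min(capacity - q.size(), limit); // claim first...
        for (int i = 0; i < n; i++) {
            q.add(s.get()); // ...then pull on demand, never more than claimed
        }
        return n;
    }

    public int drain(Consumer<E> c, int limit) {
        int n = Math.min(q.size(), limit);
        for (int i = 0; i < n; i++) {
            c.accept(q.poll());
        }
        return n;
    }
}
```

Because the supplier is only invoked for slots that were successfully claimed, no element is ever taken from the producer without a home, and partial fulfilment is reported through the return value.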


nitsanw commented on May 22, 2024

Resolved with the MessagePassingQueue API.

