
Add support for FIFO queues #124

Closed · TheFarmer1 opened this issue Sep 6, 2018 · 8 comments

@TheFarmer1

When receiving messages from FIFO queues in batches, the handleMessage function is called in parallel for each message in the batch. For FIFO queues, messages should be processed serially to maintain the correct processing order. Ideally, the MessageGroupId could be used so that messages in different groups are still processed in parallel.

If batchSize is set to 1, FIFO queues naturally work correctly.
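
For illustration, here's a rough sketch of the kind of handling I have in mind (hypothetical code, not part of sqs-consumer; handleFifoBatch and handleMessage are placeholder names, and it assumes MessageGroupId was requested via AttributeNames when receiving):

// Hypothetical sketch: process each MessageGroupId serially to preserve
// FIFO order, while running distinct groups in parallel.
async function handleFifoBatch(messages, handleMessage) {
  // Bucket the batch by MessageGroupId (an SQS FIFO system attribute;
  // only present on a message if requested in the receive call).
  const groups = new Map();
  for (const message of messages) {
    const groupId = (message.Attributes && message.Attributes.MessageGroupId) || 'default';
    if (!groups.has(groupId)) groups.set(groupId, []);
    groups.get(groupId).push(message);
  }
  // Within a group: strictly in order. Across groups: concurrently.
  await Promise.all(
    [...groups.values()].map(async (group) => {
      for (const message of group) {
        await handleMessage(message);
      }
    })
  );
}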

nspragg (Contributor) commented Oct 2, 2018

We're not looking to support FIFO queues at this time.

nspragg closed this as completed Oct 2, 2018

tnolet commented Jan 23, 2019

@TheFarmer1 @nspragg I'm working on a PR to add MessageGroupId-based handling of FIFO queues. It's not that much work, as far as I can see now.

The speed gains are pretty good though.

OrKoN commented Mar 11, 2019

@tnolet did you figure out what is needed to support FIFO queues?

tnolet commented Mar 11, 2019

@OrKoN I think so, but I still need to run some production load through it for a while. I don't have a clear estimate.

@robbie-thompson

@nspragg Still no plans to support FIFO?

@VrushabhGore

@tnolet Did you make any progress on this?

SAGARACH65 commented Jan 10, 2021

@tnolet Any update on this?

@matiasgarcia

sqs-consumer supports FIFO queues out of the box; you just need to use it in conjunction with your SQS queue configuration.

One option would be setting batchSize: 1, although that would be terrible for throughput (one receive round trip per message).

Another option is using:

  • batchSize > 1
  • handleMessageBatch
  • visibilityTimeout
  • heartbeatInterval

As long as you configure these correctly, you shouldn't have major issues.

Here is a snippet:

const { Consumer } = require('sqs-consumer');
const AWS = require('aws-sdk');

// Assumed setup for this example: a LocalStack-style endpoint matching the
// queueUrl below, plus small helpers to simulate per-message work.
const sqs = new AWS.SQS({ endpoint: 'http://localhost:4599', region: 'us-east-1' });
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));
const rand = () => Math.floor(Math.random() * 1000);

const app = Consumer.create({
  queueUrl: "http://localhost:4599/000000000000/fifo-queue.fifo",
  batchSize: 10,
  waitTimeSeconds: 0,
  /*
    Use visibilityTimeout in conjunction with heartbeatInterval to avoid
    having an in-flight message group reassigned to another consumer.
  */
  visibilityTimeout: 5,
  heartbeatInterval: 2,
  handleMessageBatch: async (messages) => {
    const messagesToAcknowledge = [];
    try {
      for (const message of messages) {
        await sleep(1000 + rand()); // simulate a variable workload
        console.log('Done consuming message:', message.Body, new Date().toISOString());
        messagesToAcknowledge.push(message);
      }
    } catch (err) {
      // Processing one message failed, so stop here to avoid handling the
      // rest of the message group out of order; the unacknowledged messages
      // become visible again and are retried.
      console.error('message processing failed', err);
    }
    // Acknowledge (delete) only the messages that were processed successfully.
    return messagesToAcknowledge;
  },
  sqs,
});

app.start();
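
One detail worth calling out (my reading of the options above, so verify against the sqs-consumer docs): keep heartbeatInterval smaller than visibilityTimeout, since the heartbeat is what extends the visibility of the in-flight batch. If the visibility timeout lapses mid-batch, SQS can hand the message group to another consumer and the ordering guarantee is lost.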
