How to Use the BatchOp Enumeration to Manage Message Batching in FIX Antenna HFT

Introduction

The FIX Antenna HFT library supports sending messages in batches, which allows for efficient transmission by grouping multiple messages into a single memory chunk. These batched messages are stored as a contiguous vector in memory and passed to the OS socket API in one send operation. This guide will teach you how to use the Engine::PutMessageOptions::BatchOp enumeration to control batching operations when working with the Session::push method.

Understanding Message Batching

General Concept

Batching involves grouping multiple raw FIX messages together in memory and sending them as a single contiguous chunk. This reduces the number of memory allocations and network calls involved in transmitting messages, improving performance.

Batch Control

You control batching by setting the appropriate fields of Engine::PutMessageOptions (defined in B2BITS_Session.h) when calling the Session::push function. The library supports the following batching operations (a minimal usage sketch follows the list):

  1. Starting a batch: Initiates a new batch operation.

  2. Adding messages: Appends messages to an ongoing batch.

  3. Sending the batch: Finalizes the current batch and sends it.
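The sketch below shows the general shape of such a call. It is only an illustration: the exact Session::push signature, the message type (shown here as Engine::FIXMessage), and the enumerator qualification are assumptions based on the descriptions in this guide; consult B2BITS_Session.h for the actual declarations.

```cpp
#include <B2BITS_Session.h>   // Engine::Session, Engine::PutMessageOptions (per this guide)

// Hypothetical helper: 'session' is a connected Engine::Session* and 'msg' is a
// prepared FIX message in whatever form Session::push accepts.
void pushWithBatchOp(Engine::Session* session, Engine::FIXMessage* msg)
{
    Engine::PutMessageOptions options;

    // Choose the batching operation for this call:
    // NotBatched, Begin, Add, or Send (detailed in the sections below).
    options.batchOp_ = Engine::PutMessageOptions::BatchOp::Begin;

    // Size of the pre-allocated batch buffer, used when a batch is started.
    options.batchBufferSize_ = 64 * 1024;

    session->push(msg, &options);
}
```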

Non-Batched Message Handling

  • If you want to send an individual message (i.e., not batched), set PutMessageOptions::batchOp_ to BatchOp::NotBatched.

  • Error Condition: If a non-batched message is sent when a batch is already active, the push function will throw a std::runtime_error. Always ensure no batch is in progress when using NotBatched.

How to Use the BatchOp Enumeration for Batching Operations

The BatchOp enumeration provides four values for batching operations: NotBatched, Begin, Add, and Send. Here is how to use each one effectively:

1. Starting a Batch

Using BatchOp::Begin

  • Purpose: Explicitly starts a new batch while pre-allocating a memory buffer.

  • Behavior:

    • The value of PutMessageOptions::batchBufferSize_ determines the size of the pre-allocated buffer, reducing the number of memory allocations needed while adding messages to the batch.

  • Error Condition:

    • If Begin is passed when a batch is already active, std::runtime_error is thrown.

    • Solution: Complete or cancel the current batch before initiating a new one with Begin.

Using BatchOp::Add

  • BatchOp::Add can automatically start a new batch. If no batch is active, calling Add will begin a batch and initialize the batch memory buffer based on the value of PutMessageOptions::batchBufferSize_.

Best Practice for Starting a Batch:
Explicitly call Begin to start a batch when you want better control over pre-allocation and initialization.
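A minimal sketch of starting a batch explicitly, under the same naming assumptions as the sketch above; whether the message passed together with Begin is itself placed into the batch is also an assumption here:

```cpp
Engine::PutMessageOptions options;

// Start a new batch and pre-allocate roughly 16 KB for it; the buffer is
// reallocated later only if the accumulated messages outgrow this size.
options.batchOp_ = Engine::PutMessageOptions::BatchOp::Begin;
options.batchBufferSize_ = 16 * 1024;

// Assumed here: the message passed with Begin becomes the first message of the batch.
session->push(firstMsg, &options);
```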

2. Adding Messages to a Batch

Using BatchOp::Add

  • Purpose: Adds the current message to the ongoing batch.

  • Behavior:

    • The message is appended to the batch's memory buffer.

    • If the current buffer does not have enough space, it will be reallocated to accommodate the additional message.

  • Automatic Batch Start:

    • If no batch is active, using Add will automatically start a new batch.

Key Advantage: Add provides flexibility by handling both batch addition and automatic starting in a single operation.
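A short sketch of appending messages, under the same assumptions as above; 'msgs' stands for an already prepared collection of messages and is purely illustrative:

```cpp
Engine::PutMessageOptions options;
options.batchOp_ = Engine::PutMessageOptions::BatchOp::Add;
options.batchBufferSize_ = 16 * 1024;   // used only if Add has to auto-start the batch

for (Engine::FIXMessage* msg : msgs) {
    // Each call appends one message to the current batch; the first call
    // starts a batch automatically if none is active.
    session->push(msg, &options);
}
```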

3. Sending the Batch

Using BatchOp::Send

  • Purpose: Finalizes the current batch, appends the message being passed, and sends all accumulated messages in the batch as a single memory chunk.

  • Behavior:

    • Sends the entire message buffer (including the message passed in the current push call) to the OS socket API.

    • After sending, the current batch is completed. To start a new batch, call Begin or use Add.

  • Error Condition:

    • If no batch is currently in progress, Send will throw a std::runtime_error.

    • Solution: Ensure that a batch is active before calling Send.

Asynchronous Batch Sending

  • Set PutMessageOptions::asyncSend_ to have the library send the batch asynchronously (see the sketch below).
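A hedged sketch of finishing the batch, assuming asyncSend_ is a simple flag and that batch-related errors surface as std::runtime_error as described above:

```cpp
Engine::PutMessageOptions options;

// Finalize the batch: the message passed here is appended and the whole
// buffer is handed to the OS socket API in one send operation.
options.batchOp_ = Engine::PutMessageOptions::BatchOp::Send;
options.asyncSend_ = true;   // assumed flag: request asynchronous delivery

try {
    session->push(lastMsg, &options);
} catch (const std::runtime_error&) {
    // Thrown if no batch is currently active; start one with Begin or Add first.
}
```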

4. Sending Non-Batched Messages

Using BatchOp::NotBatched

  • Purpose: Sends a single message without batching.

  • Behavior:

    • The message is immediately sent as a standalone message.

  • Error Condition:

    • If there is an active batch in progress, using NotBatched will throw a std::runtime_error.

    • Solution: Always finish or cancel the current batch before sending standalone messages (see the sketch below).
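For completeness, a sketch of sending a standalone message under the same assumptions as the previous sketches:

```cpp
Engine::PutMessageOptions options;

// Send immediately, outside of any batch. Make sure no batch is in progress,
// otherwise push throws std::runtime_error (per the error condition above).
options.batchOp_ = Engine::PutMessageOptions::BatchOp::NotBatched;
session->push(msg, &options);
```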

Summary of BatchOp Options and Behavior

| BatchOp Value | Description | Behavior | Error Condition |
| --- | --- | --- | --- |
| NotBatched | Sends a single non-batched message immediately. | Sends the message without batching. | Throws std::runtime_error if a batch is in progress. |
| Begin | Starts a new batch and pre-allocates memory for it. | Initializes a new batch for message grouping. | Throws std::runtime_error if a batch is already in progress. |
| Add | Adds a message to the current batch; starts a new batch if none exists. | Appends the message to the batch, reallocating the memory buffer if needed. | None. |
| Send | Finalizes the current batch and sends all accumulated messages. | Sends the batch buffer through the OS socket API and ends the current batch. | Throws std::runtime_error if no batch is active. |

By using these options effectively, you can manage message batching with precision, improve performance, and handle messages according to the specific requirements of your application.