There are other examples of handling batched log events. The JDBC appender
supports batching via its manager class: <
https://github.com/apache/logging-log4j2/blob/master/log4j-core/src/main/java/org/apache/logging/log4j/core/appender/db/AbstractDatabaseManager.java
>

This makes me think it might be worthwhile to extract a common set of
AbstractBatchedAppender/Manager type classes that handle an internal
buffer of log events.
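As a rough illustration, such a shared manager could buffer events and flush either when the buffer fills or when an event is marked end-of-batch. This is only a sketch under assumed names: BatchingManager and its append/flush signatures are hypothetical, not existing Log4j API, and the boolean parameter stands in for what LogEvent.isEndOfBatch() would supply.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Hypothetical sketch of a batching manager: buffers events and flushes
// them downstream either when the buffer is full or when the current
// event is flagged as the end of a batch.
class BatchingManager<E> {
    private final int bufferSize;
    private final Consumer<List<E>> downstream;
    private final List<E> buffer = new ArrayList<>();

    BatchingManager(int bufferSize, Consumer<List<E>> downstream) {
        this.bufferSize = bufferSize;
        this.downstream = downstream;
    }

    // In Log4j, endOfBatch would come from LogEvent.isEndOfBatch().
    synchronized void append(E event, boolean endOfBatch) {
        buffer.add(event);
        if (endOfBatch || buffer.size() >= bufferSize) {
            flush();
        }
    }

    // Hands the buffered events downstream as one batch and clears the buffer.
    synchronized void flush() {
        if (!buffer.isEmpty()) {
            downstream.accept(new ArrayList<>(buffer));
            buffer.clear();
        }
    }
}
```

A concrete appender would subclass or delegate to something like this, supplying the downstream write (file, database, network) as the consumer.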

On 9 January 2018 at 10:28, Mikael Ståldal <mi...@apache.org> wrote:

> I guess that you are supposed to use LogEvent.isEndOfBatch() to know when
> to flush log events to final destination.
>
> Our file and stream based appenders do that. See
> https://github.com/apache/logging-log4j2/blob/master/log4j-core/src/main/java/org/apache/logging/log4j/core/appender/RandomAccessFileAppender.java#L156
>
>
>
> On 2018-01-09 14:46, Apache wrote:
>
>> The Logging API only allows you to log a single event at a time, so it
>> doesn’t make sense for an appender to have a method that accepts multiple
>> events since it can’t happen. That said, appenders can queue the events and
>> send them downstream in batches. I believe some of the appenders do that
>> now.
>>
>> Is there some use case I am not aware of where this method could be
>> called?
>>
>> Ralph
>>
>> On Jan 9, 2018, at 6:02 AM, Jochen Wiedmann <jochen.wiedm...@gmail.com>
>>> wrote:
>>>
>>> Hi,
>>>
>>> currently writing my first appender, and wondering about the following:
>>>
>>> The Appender interface specifies a method for logging a single event.
>>> However, my custom Appender would greatly benefit in terms of
>>> performance, if I could implement an additional method
>>> append(LogEvent[] events). Now, I wonder if such batching has been
>>> omitted from the API intentionally. Or, alternatively, if I might
>>> replace my custom logger with a modified instance of AsyncAppender?
>>>
>>


-- 
Matt Sicker <boa...@gmail.com>
