bluedenim / log4j-s3-search-samples

Sample programs linking to log4j-s3-search libs

Issue with the log4j s3 appender

Siddartha9889 opened this issue · comments

Hello, @bluedenim. I tried to transmit logs to AWS S3 using the S3 appender, and I completed all of the configuration steps mentioned in the documentation, including adding the dependency to the pom.xml file. However, I was not able to connect to S3. Could you kindly help me determine whether the configuration I am using is correct?

<Log4j2Appender name="Log4j2Appender">
    <PatternLayout pattern="%d{HH:mm:ss,SSS} [%t] %-5p %c{36} - %m%n"/>
    <verbose>false</verbose>

    <!-- Examples of optional tags to attach to entries (applicable only to SOLR & Elasticsearch) -->
    <tags>TEST,ONE,TWO;THREE</tags>

    <!-- Number of messages (lines of log) to buffer before publishing out -->
    <stagingBufferSize>2</stagingBufferSize>

    <s3Bucket>xxxxx</s3Bucket>
    <s3Path>xxxxxx/</s3Path>
    <s3Region>us-west-2</s3Region>
    <s3AwsKey>xxxxxxxxxxxxx</s3AwsKey>
    <s3AwsSecret>xxxxxxxxxxxxxxxx</s3AwsSecret>
    <s3PathStyleAccess>true</s3PathStyleAccess>
    <s3CannedAcl>BucketOwnerFullControl</s3CannedAcl>

    <!-- Uncomment below to use SSE for S3. NOTE: SSE_S3 is the only supported option right now -->
    <!-- <s3SseKeyType>SSE_S3</s3SseKeyType> -->

    <!-- Uncomment below to apply GZIP compression on content sent to S3 -->
    <!-- <s3Compression>true</s3Compression> -->
</Log4j2Appender>

Dependency:

<dependency>
    <groupId>com.therealvan</groupId>
    <artifactId>appender-log4j2</artifactId>
    <version>4.0.0</version>
</dependency>

Do I need to add any additional configuration in the log4j2 file? I tried different scenarios but couldn't get it to work. Can you please assist me with how we can achieve this in Mule 4? Thanks.

Hi @bluedenim. I tried increasing the staging buffer size to 200, but it still could not connect to S3.
This is my log4j2.xml file, where I made all the configurations mentioned in the documentation.

<?xml version="1.0" encoding="utf-8"?>
<Configuration status="INFO">
    <Appenders>
        <Console name="ConsoleAppender" target="SYSTEM_OUT">
            <PatternLayout pattern="%d{HH:mm:ss,SSS} [%t] %-5p %c{36} - %m%n"/>
        </Console>
        <Log4j2Appender name="Log4j2Appender">
            <PatternLayout pattern="%d{HH:mm:ss,SSS} [%t] %-5p %c{36} - %m%n"/>
            <verbose>false</verbose>

            <!-- Examples of optional tags to attach to entries (applicable only to SOLR & Elasticsearch) -->
            <tags>TEST,ONE,TWO;THREE</tags>

            <!-- Number of messages (lines of log) to buffer before publishing out -->
            <stagingBufferSize>200</stagingBufferSize>

            <s3Bucket>bau-abc-app</s3Bucket>
            <s3Path>log4js3/</s3Path>
            <s3Region>us-west-2</s3Region>
            <s3AwsKey>-----------------</s3AwsKey>
            <s3AwsSecret>-------------</s3AwsSecret>
            <s3PathStyleAccess>true</s3PathStyleAccess>
            <s3CannedAcl>BucketOwnerFullControl</s3CannedAcl>

            <!-- Uncomment below to use SSE for S3. NOTE: SSE_S3 is the only supported option right now -->
            <!-- <s3SseKeyType>SSE_S3</s3SseKeyType> -->

            <!-- Uncomment below to apply GZIP compression on content sent to S3 -->
            <!-- <s3Compression>true</s3Compression> -->
        </Log4j2Appender>
    </Appenders>
    <Loggers>
        <Root level="INFO">
            <AppenderRef ref="ConsoleAppender"/>
            <AppenderRef ref="Log4j2Appender"/>
        </Root>
        <Logger name="com.van.logging" level="debug" additivity="false">
            <AppenderRef ref="ConsoleAppender"/>
        </Logger>
    </Loggers>
</Configuration>

Here "bau-abc-app" is the bucket name, and "log4js3/" is a folder inside the bucket.
These are the configurations I used in my log4j2.xml file. Do I need to add any new configurations or remove any? Please guide me on these configurations and let me know if I need to make minor changes or uncomment anything. Your feedback will be much appreciated.

Thanks in advance @bluedenim

Hello. Did you see any error messages that may explain why it wasn't able to connect to S3?

I suggest starting with the sample repo at https://github.com/bluedenim/log4j-s3-search-samples/tree/master/appender-log4j2-sample. It is a setup that I know works.

Once you've pulled it down, modify the bucket and credentials as you need, and see how that works.

Hello, @bluedenim. I tried the log4j2 files from that repo, modified with my S3 bucket credentials, but at first I was unable to send the logs to S3. There were some missing packages; once they were added, everything worked great.

Hi @bluedenim, there are a few things I need clarity on from you.

  1. Is it possible to specify the file type of the uploaded file? For example, if JSON content is uploaded, the file would have a JSON type and download as a JSON file; if the type were zip, clicking the download button would save a zip file. I haven't noticed any file-type configuration in the log4j2 file; are there any properties we can specify explicitly?

  2. The files in S3 include the machine name, and I am not sure where it comes from; we would prefer not to include the machine name in the file name.
    For example: 20240212163043_MY-MACHINE-NAME_2cf52a167c3d48hj7t6edfcbb65213da
    How can we remove the machine name from the file name?

Your inputs would be much appreciated. Thanks @bluedenim

Hi @bluedenim, can you provide some input on these questions? We are doing a POC on this.

  1. Is it possible to specify the file type of the uploaded file? For example, if JSON content is uploaded, the file would have a JSON type and download as a JSON file; if the type were zip, clicking the download button would save a zip file. I haven't noticed any file-type configuration in the log4j2 file; are there any properties we can specify explicitly?

Typically, logs are uploaded as text files. They correspond to how the logs normally appear on the console, each line formatted by the PatternLayout you use for your logger. If you use a pattern that emits a JSON object, that is what will be uploaded.
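For instance, a PatternLayout whose pattern emits one JSON object per line would produce a JSON Lines file in S3 (a sketch; the field names here are illustrative, and Log4j 2's `%enc{...}{JSON}` converter is used to escape the message):

```xml
<Log4j2Appender name="Log4j2Appender">
    <!-- Each log event becomes one JSON object per line; %enc{%m}{JSON} escapes
         quotes and newlines in the message so the JSON stays well-formed. -->
    <PatternLayout pattern='{"time":"%d{ISO8601}","level":"%p","logger":"%c","message":"%enc{%m}{JSON}"}%n'/>
    <!-- remaining s3* settings as in the configurations above -->
</Log4j2Appender>
```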

If you set the "s3Compression" property to "true" in the log4j2 config, the contents will be gzipped, and the S3 key will end in ".gz" to hint that the content is compressed.
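Enabling it is a one-line change inside the appender configuration (based on the commented-out line in the configs above):

```xml
<!-- Gzip the staged log content before upload; the S3 key will end in ".gz" -->
<s3Compression>true</s3Compression>
```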

  2. The files in S3 include the machine name, and I am not sure where it comes from; we would prefer not to include the machine name in the file name.
    For example: 20240212163043_MY-MACHINE-NAME_2cf52a167c3d48hj7t6edfcbb65213da
    How can we remove the machine name from the file name?

The machine name is used to correlate and group the uploaded log entries in case we need to scroll through them. Without it, multiple machines running this logger and uploading to the same bucket would produce indistinguishable file names.

I don't believe there is an option to skip the machine name currently. However, if we need to remove the host name, these lines are the ones to change: https://github.com/bluedenim/log4j-s3-search/blob/master/appender-log4j2/src/main/java/com/van/logging/log4j2/Log4j2AppenderBuilder.java#L266-L268

I'll look into adding a way to do that in the next minor release.

I added the option to specify the hostName value in the config. If you give it a value, then it will use it instead of the real host name.
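Assuming the new property is set as a child element like the other appender settings, the configuration would look something like this (a sketch; "my-app" is an illustrative value):

```xml
<Log4j2Appender name="Log4j2Appender">
    <!-- Use "my-app" in uploaded file names instead of the real machine name -->
    <hostName>my-app</hostName>
    <!-- remaining settings as before -->
</Log4j2Appender>
```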

This is in release 5.1.0.