Log Forwarding Service is a feature of Cloud Edge Cloud Console (CECC) that delivers detailed CECC statistics and data. It is a paid service for Cloud Edge partners who would like to further analyze detection events or network traffic.
Logs from the CECC Log Forwarding Service are downloaded by the LFS client, a download tool that runs as a service on a Linux machine. CECC logs are saved to a local folder in CSV format. With the downloaded files, partners can perform their own analysis and generate useful reports for their customers.
This article shows how to set up the LFS client to download logs for Cloud Edge Log Forwarding Service.
- Assign the correct Log Forwarding Service license depending on the box model.
NABU partners have CE50, CESB, and CESBW licenses.
- Contact Trend Micro Technical Support or your sales representative and submit a Log Forwarding Service provisioning request.
- Once the service is ready, you will be provided with the following:
- LFS-Client-5.0.1580-0.x86_64.rpm - RPM package of LFS client, where 5.0.1580 is the version
- Credentials - Text file containing important information about the AWS resources
- Make sure that the machine meets the hardware and OS specifications recommended by Trend Micro:
- CPU: >= 2 cores
- Memory: >= 2 GB
- Disk: >= 30 GB
- OS: CentOS 7.0 64-bit
- Copy the RPM package to the machine where the client will be running.
- Install the RPM package using the rpm command.
- Copy the provided credential file to the /root folder on the machine where the client is installed.
- Run the lfs_setup command to set up the LFS client.
- Start the client by running the daemonctl.sh start command.
The client is expected to work well with the default configuration, so do not modify the configuration files unless necessary. If you do need to change them, all configuration files are located in /etc/lfs.
For example, if you have a small disk and want to prevent it from filling up, set "keep_days" to a smaller value. You can also change the folder where the downloaded CSV files are saved.
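The purge behavior described above can be sketched in Python. This is an illustrative model only, not the actual LFS purge program: the function name `purge_exports` and its internals are assumptions; only the keep_days semantics come from the article.

```python
import os
import time

def purge_exports(export_path, keep_days):
    """Delete exported CSV files older than keep_days.

    Illustrative sketch of the purge behavior: files whose modification
    time is more than keep_days days in the past are removed.
    """
    cutoff = time.time() - keep_days * 86400  # 86400 seconds per day
    removed = []
    for name in sorted(os.listdir(export_path)):
        path = os.path.join(export_path, name)
        if os.path.isfile(path) and os.path.getmtime(path) < cutoff:
            os.remove(path)
            removed.append(name)
    return removed
```

With keep_days set to 10 (the default shown below in lfs_setting.conf), any exported file older than ten days would be deleted on each purge cycle.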
Below is a sample of lfs_setting.conf, which is the main configuration file:
```
[AWS]
# this is the AWS S3 bucket where all logs are saved
bucket:
# this is the AWS SQS queue that notifies of newly arriving logs
sqs:

[On-demand]
# this is the path where on-demand mode saves its temp results
tmp_path: /var/lfs/data/download/cli
# this is the path where on-demand mode saves its exported csv files
export_path: /var/lfs/data/export/cli

[Daemon]
# this is the path where daemon mode saves its temp results
tmp_path: /var/lfs/data/download/daemon
# this is the path where daemon mode saves its exported csv files
export_path: /var/lfs/data/export/daemon
# this is the interval in seconds at which the daemon checks SQS for messages
check_interval_seconds: 10

[Purge]
# this is the interval in minutes at which the purge program checks the daemon
# export folder; 0 disables purging
purge_interval_minutes: 1
# this is the number of days the purge program keeps the exported results
keep_days: 10

[CSV]
# this is the delimiter (default comma) used in exported csv files to
# separate columns
delimiter: ,
# when true, all string-type columns are surrounded
# by a pair of quotation marks
quote_string: true
```
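Because lfs_setting.conf uses INI-style sections and `key: value` pairs, its settings can be read with a standard INI parser. A minimal Python sketch, assuming only standard INI syntax (the embedded sample below is a subset of the file shown above, not the full file):

```python
from configparser import ConfigParser

# Subset of lfs_setting.conf, inlined here for illustration.
SAMPLE = """\
[Purge]
purge_interval_minutes: 1
keep_days: 10

[CSV]
delimiter: ,
quote_string: true
"""

parser = ConfigParser()
parser.read_string(SAMPLE)

# Typed accessors convert the raw strings to int/bool.
keep_days = parser.getint("Purge", "keep_days")
quote_string = parser.getboolean("CSV", "quote_string")
delimiter = parser.get("CSV", "delimiter")
```

To read the real file instead of the inline sample, you would call `parser.read("/etc/lfs/lfs_setting.conf")`.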