I have set up a multiline parser for Fluent Bit. I tried the multiline parser with a base config on the local CLI and it works there; however, when I add the parser to my production config, the final output does not join the continuation lines into a single record. Below is the configuration that is not working. What am I missing?

Config I tried locally:
```
[SERVICE]
    flush             1
    log_level         info
    parsers_file      parsers_multiline.conf

[INPUT]
    name              tail
    path              test.log
    read_from_head    true
    multiline.parser  multiline-regex-java

[OUTPUT]
    name              stdout
    match             *
```
where parsers_multiline.conf contains the multiline parser definition.
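For reference, a sketch of that parsers_multiline.conf, assuming it defines the same multiline-regex-java parser that appears in the production parsers.conf further down:

```
[MULTILINE_PARSER]
    name           multiline-regex-java
    type           regex
    flush_timeout  1000
    # rules |   state name  | regex pattern                  | next state
    # ------|---------------|--------------------------------------------
    rule      "start_state"   "/^\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d{3} \[.*\] .* \[.*\] .*/"      "next_state"
    rule      "next_state"    "/^(?!\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d{3} \[.*\] .* \[.*\] .*).*/"  "cont"
    rule      "cont"          "/^\s*at\s+/"                   "cont"
    rule      "cont"          "/^\s*Caused by:/"              "cont"
    rule      "cont"          "/^\s*.*common frames omitted/" "cont"
```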
Production config (deployed as a Kubernetes ConfigMap):
```yaml
apiVersion: v1
kind: ConfigMap
metadata:
  name: fluent-bit-config
  namespace: loggly
  labels:
    k8s-app: fluent-bit
data:
  filter-kubernetes.conf: |
    [FILTER]
        Name                 kubernetes
        Match                kube.*
        Kube_URL             https://kubernetes.default.svc.cluster.local:443
        Merge_Log            On
        K8S-Logging.Parser   On
        Keep_Log             Off
        K8S-Logging.Exclude  Off
        Annotations          Off
        Labels               Off

    [FILTER]
        Name          nest
        Match         kube.*
        Operation     lift
        Nested_under  kubernetes
        Add_prefix    kubernetes_

    [FILTER]
        Name          nest
        Match         kube.*
        Operation     lift
        Nested_under  kubernetes_labels
        Add_prefix    kubernetes_labels_

    [FILTER]
        Name    modify
        Match   kube.*
        Rename  log MESSAGE
        Rename  kubernetes.var.log.containers.name pod_name

    [FILTER]
        name                   multiline
        match                  kube.*
        multiline.key_content  MESSAGE
        multiline.parser       multiline-regex-java, python, go

    [FILTER]
        Name    modify
        Match   kube.*
        Remove  kubernetes_container_hash
        Remove  kubernetes_docker_id
        Remove  kubernetes_pod_id
        Remove  logtag
        Remove  stream

  fluent-bit.conf: |
    [SERVICE]
        Flush         1
        Log_Level     info
        Daemon        off
        Parsers_File  parsers.conf
        HTTP_Server   Off

    @INCLUDE input-kubernetes.conf
    @INCLUDE filter-kubernetes.conf
    @INCLUDE output-loggly.conf

  input-kubernetes.conf: |
    [INPUT]
        Name              tail
        Tag               kube.*
        Exclude_Path      /var/log/containers/fluent-bit-*
        Path              /var/log/containers/*.log
        Parser            cri
        DB                /var/log/flb_kube.db
        Mem_Buf_Limit     50MB
        Skip_Long_Lines   On
        Refresh_Interval  10

  output-loggly.conf: |
    [OUTPUT]
        Name              http
        Match             *
        Host              ${LOGGLY_HOSTNAME}
        Port              443
        Tls               On
        URI               /bulk/${LOGGLY_TOKEN}/tag/${LOGGLY_TAG}/
        Format            json_lines
        Json_Date_Key     timestamp
        Json_Date_Format  iso8601
        Retry_Limit       False

    [OUTPUT]
        Name    stdout
        Match   *
        Format  json_lines

  parsers.conf: |
    [PARSER]
        Name         docker
        Format       json
        Time_Key     time
        Time_Format  %Y-%m-%dT%H:%M:%S.%L
        Time_Keep    On
        # Command         | Decoder | Field | Optional Action
        # ================|=========|=======|=================
        Decode_Field_As     escaped   log

    [PARSER]
        Name         syslog
        Format       regex
        Regex        ^\<(?<pri>[0-9]+)\>(?<time>[^ ]* {1,2}[^ ]* [^ ]*) (?<host>[^ ]*) (?<ident>[a-zA-Z0-9_\/\.\-]*)(?:\[(?<pid>[0-9]+)\])?(?:[^\:]*\:)? *(?<message>.*)$
        Time_Key     time
        Time_Format  %Y-%m-%dT%H:%M:%S.%L

    [PARSER]
        Name         cri
        Format       regex
        Regex        ^(?<time>[^ ]+) (?<stream>stdout|stderr) (?<logtag>[^ ]*) (?<log>.*)$
        Time_Key     time
        Time_Format  %Y-%m-%dT%H:%M:%S.%L%z

    [MULTILINE_PARSER]
        name           multiline-regex-java
        type           regex
        flush_timeout  1000
        #
        # Regex rules for multiline parsing
        # ---------------------------------
        #
        # configuration hints:
        #
        #  - first state always has the name: start_state
        #  - every field in the rule must be inside double quotes
        #
        # rules |   state name  | regex pattern                  | next state
        # ------|---------------|--------------------------------------------
        rule      "start_state"   "/^\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d{3} \[.*\] .* \[.*\] .*/"      "next_state"
        rule      "next_state"    "/^(?!\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d{3} \[.*\] .* \[.*\] .*).*/"  "cont"
        rule      "cont"          "/^\s*at\s+/"                   "cont"
        rule      "cont"          "/^\s*Caused by:/"              "cont"
        rule      "cont"          "/^\s*.*common frames omitted/" "cont"
```