"Invalid field format" errors when writting data to InfluxDB #62

Open

spins006 opened this issue Jun 7, 2017 · 11 comments

@spins006

spins006 commented Jun 7, 2017

  • Version: InfluxDB 1.2.4, Logstash 5.3.0, Metricbeat 5.3.0
  • Operating System: Linux
  • Config File:
input {
  beats {
    port => 5044
  }
}

filter {
}

output {
  influxdb {
    codec => json
    data_points => {}
    host => "influxdb"
    db => "logstash"
  }
  stdout {codec => json}
}
  • Steps to Reproduce: When Logstash tries to write Metricbeat data to InfluxDB, "invalid field format" errors appear in the Logstash logs:
 "2017-06-06T17:29:41.676000+0000", :message=>"Error writing to InfluxDB", :response=>#<Manticore::Response:0x2dea8269 
@message="Bad Request", @callback_result="{\"error\":\"unable to parse 'logstash  1496770179090': 
invalid field format\\nunable to parse 'logstash  1496770179090': invalid field format\\nunable to parse 'logstash  1496770179091': 
invalid field format\\nunable to parse 'logstash  1496770179091': invalid field format\\nunable to parse 'logstash  1496770179091': 
invalid field format\\nunable to parse 'logstash  1496770179091': invalid field format\\nunable to parse 'logstash  1496770179091': 
...

{"@timestamp":"2017-06-06T17:29:39.091Z","type":"metricsets","system":{"process":{"ppid":2,"memory":{"size":0,"rss":{"bytes":0,"pct":0.000000},"share":0},
"state":"sleeping","fd":{"open":0,"limit":{"hard":4096,"soft":1024}},"pid":809,"username":"root","cpu":{"total":{"pct":0.000000},
"start_time":"2016-02-28T21:38:32.000Z"},"pgid":0,"name":"jbd2/dm-2-8"}},"metricset":{"module":"system","name":"process","rtt":106444},
"beat":{"name":"host_name","hostname":"host_name","version":"5.3.0"},"@version":"1",
"host":"host_name","tags":["beats_input_raw_event"]}{"type":"metricsets","system":{"process":{"name":"ext4-dio-unwrit","pgid":0,
"cpu":{"total":{"pct":0.000000},"start_time":"2016-02-28T21:38:32.000Z"},"fd":{"open":0,"limit":{"soft":1024,"hard":4096}},"ppid":2,"state":"sleeping",
"memory":{"size":0,"rss":{"bytes":0,"pct":0.000000},"share":0},"username":"root","pid":810}},"metricset":{"module":"system","name":"process","rtt":106444},
"beat":{"version":"5.3.0","name":"host_name","hostname":"host_name"},"@timestamp":"2017-06-06T17:29:39.091Z",
"@version":"1","host":"host_name","tags":["beats_input_raw_event"]}{"@timestamp":"2017-06-06T17:29:39.091Z","type":"metricsets",
"system":{"process":{"username":"root","memory":{"share":0,"size":0,"rss":{"bytes":0,"pct":0.000000}},"cpu":{"total":{"pct":0.000000},
"start_time":"2016-02-28T21:38:32.000Z"},"pid":849,"ppid":2,"fd":{"open":0,"limit":{"hard":4096,"soft":1024}},"pgid":0,"name":"kauditd","state":"sleeping"}},
"metricset":{"rtt":106444,"module":"system","name":"process"},"beat":{"name":"host_name","hostname":"host_name","version":"5.3.0"},
"@version":"1","host":"host_name","tags":["beats_input_raw_event"]}{"beat":{"version":"5.3.0","name":"host_name",
"hostname":"host_name"},"@timestamp":"2017-06-06T17:29:39.091Z","type":"metricsets","system":{"process":{"state":"sleeping","memory":
...
@bazron

bazron commented Jun 29, 2017

I had the same problem. I use Logstash 5.1.2. I downgraded logstash-output-influxdb to version 5.0.0 and it started to work for me. I also checked logstash-output-influxdb version 4.0.0 and it worked.
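For anyone who wants to try the same downgrade, here is a minimal sketch using the Logstash plugin manager, run from the Logstash install directory (the pinned version is just the one reported to work above):

bin/logstash-plugin remove logstash-output-influxdb
bin/logstash-plugin install --version 5.0.0 logstash-output-influxdb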

@Transrian

Transrian commented Jul 10, 2017

Got the same problem, any ideas?
InfluxDB 1.2.4, Logstash 5.4.1, logstash-output-influxdb 5.0.1

Output part:

output {
  if [type] == "xxx" {
    influxdb {
      host => "xx.xx.xx.xx"
      port => 8086
      user => "root"
      password => "root"
      db => "mmm"
      use_event_fields_for_data_points => false
    }
  }
}

I also tried with this configuration but got the same result:

output {
  if [type] == "xxx" {
    influxdb {
      host => "xx.xx.xx.xx"
      port => 8086
      user => "root"
      password => "root"
      db => "mmm"
      codec => "json"
      use_event_fields_for_data_points => false
      data_points =>  {
      }
    }
  }
}

Log:

[2017-07-10T08:05:32,507][WARN ][logstash.outputs.influxdb] Non recoverable exception while writing to InfluxDB {:exception=>#<InfluxDB::Error: {"error":"unable to parse 'logstash  1499666732498': invalid field format"}

@bazron

bazron commented Jul 10, 2017

From what I was able to figure out, the problem is that it's trying to insert blank data. In 'logstash  1499666732498', "logstash" is a default setting (I think the measurement) and 1499666732498 is the timestamp the plugin adds to the data when it calls the InfluxDB HTTP API. Here's a log from InfluxDB for an insert that succeeded:

[I] 2017-07-09T12:13:44Z Write body received by handler: system.cpu,host=xxx,env=xxx,cores=4 system.pct=0.0434,softirq.pct=0.006,idle.pct=0.8249,steal.pct=0.0025,irq.pct=0.0001,iowait.pct=0.0004,user.pct=0.0012,nice.pct=0.1215 1499602423 service=httpd

So my measurement is system.cpu and the timestamp is 1499602423. It's passing all the correct data; in your log, it's not passing anything.
I don't know why this happens. Try the version I put above.
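For reference, InfluxDB line protocol expects measurement[,tag_set] field_set [timestamp], with at least one field between the measurement and the timestamp. A minimal illustration (the field name "value" is made up, not taken from this thread), first a line that parses:

logstash value=1 1499666732498

and the kind of line the plugin sends when it has no fields to write, with a double space where the field set should be:

logstash  1499666732498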

@Transrian

Transrian commented Jul 10, 2017

I tried with the 5.0.0 version of the plugin ... with no more success.

The message is the same:

[2017-07-10T13:48:21,525][WARN ][logstash.outputs.influxdb] Non recoverable exception while writing to InfluxDB {:exception=>#<InfluxDB::Error: {"error":"unable to parse 'logstash  1499687301516': invalid field format"}

And here are some sample requests from the InfluxDB log:

Jul 10 11:44:51 influxdb influxd[6205]: [httpd] 10.68.95.103 - root [10/Jul/2017:11:44:51 +0000] "POST /write?db=mmm&p=%5BREDACTED%5D&precision=ms&rp=autogen&u=root HTTP/1.1" 400 76 "-" "Ruby" 767e233e-6564-11e7-8011-000000000000 143
Jul 10 11:39:51 influxdb influxd[6205]: [httpd] 10.68.95.103 - root [10/Jul/2017:11:39:51 +0000] "POST /write?db=mmm&p=%5BREDACTED%5D&precision=ms&rp=autogen&u=root HTTP/1.1" 400 76 "-" "Ruby" 7c73260b-6564-11e7-8012-000000000000 201

I really have no idea what could be happening.

Edit:

I still get some data in the database, even though errors are displayed in the Logstash logs.

I will investigate a little more to see if anything else looks strange.

@Transrian

Transrian commented Jul 10, 2017

Well, from what I see, the problem seems to be an empty data_points hash.

If I add one parameter, such as 'environnement', in this config:

output {
  if [type] == "xxx" {
    influxdb {
      host => "xx.xx.xx.xx"
      port => 8086
      user => "root"
      password => "root"
      db => "mmm"
      codec => "json"
      use_event_fields_for_data_points => false
      data_points =>  {
        "environnement" => "%{[environnement]}"
      }
    }
  }
}

It works, data is inserted into InfluxDB, and I get no error message.

It would be great if someone could verify this point!

@khanga6tm

I tested and I confirm that it does not work.
I am facing this issue and trying to find another way. Logstash to InfluxDB is awesome, but it does not work for me now.

@harissutanrafiq

Hello, I use this configuration and it works:

output {
  stdout { codec => rubydebug }
  influxdb {
    use_event_fields_for_data_points => true
    data_points => {}
    host => "localhost"
    exclude_fields => ["@timestamp", "@version", "sequence", "message", "type", "path"]
    db => "stock"
  }
}


@khanga6tm

I hoped it would work for me and proceeded to test immediately. But it does not work; maybe it's because my input is Packetbeat and yours is Metricbeat.

But thanks for sharing!

@Hivlaher

Hello,
I had the same problem. Here is how I fixed it:
On the Filebeat configuration side:

- type: log
  paths:
    - /var/log/blahblah
  fields:
    field01: "blahblah"
    field02: "blahblah"
  fields_under_root: true
  scan_frequency: 10s

On the Logstash side:

input {
  beats {
    port => 1234
  }
}

filter {
  if "beats_input_codec_plain_applied" in [tags] {
    mutate {
      remove_tag => ["beats_input_codec_plain_applied"]
      remove_field => ["beat"]
      remove_field => ["prospector"]
      remove_field => ["source"]
      remove_field => ["host"]
    }
  }
}

output {
  influxdb {
    db => "DBNAME"
    host => "127.0.0.1"
    measurement => "measurementname"
    codec => "json"
    use_event_fields_for_data_points => true
    send_as_tags => ["fieldXXX", "fieldXXX2"]
  }
}

As far as I can understand from the errors in the logs, the Logstash InfluxDB plugin sends the prospector field as a boolean while the field is a string.
The above configuration is working on my production setup with the below versions of influxdb and logstash:
influxdb.x86_64 1.4.2-1
logstash.noarch 1:6.1.2-1

I hope this is helpful,

Fotis

@mosyang

mosyang commented Jan 22, 2019

In my case (logspout to Logstash 6.5.4), the only event field is "message", which causes the empty data_points issue below. Check the default value of exclude_fields in the plugin documentation.

Non recoverable exception while writing to InfluxDB {:exception=>#<InfluxDB::Error: {"error":"unable to parse 'users 1548122267685': invalid field format"}

I finally figured out two workable configuration choices. Both will send "message" as a data point field.

data_points => {
  "message" => "%{[message]}"
}
exclude_fields => ["@timestamp", "@version"]

or

use_event_fields_for_data_points => true
exclude_fields => ["@timestamp", "@version"]
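In context, the second option might look like the sketch below (the host and database name are placeholders, not from this thread):

output {
  influxdb {
    host => "127.0.0.1"
    db => "mydb"
    use_event_fields_for_data_points => true
    exclude_fields => ["@timestamp", "@version"]
  }
}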

@orsius

orsius commented Oct 11, 2019

Hi, the Logstash output plugin can be used with the coerce_values parameter to convert data point values to the appropriate type before posting.
e.g. coerce_values => {'column_name' => 'datatype'}

plugins-outputs-influxdb - coerce_values

ℹ️ currently supported datatypes are integer and float

📌 Please have a look at issue 11143 to see a full config file example.
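A minimal sketch of how coerce_values could be combined with the options discussed above (the field name response_time, the host, and the database name are made-up placeholders, and this is not the full example from issue 11143):

output {
  influxdb {
    host => "127.0.0.1"
    db => "mydb"
    use_event_fields_for_data_points => true
    # coerce the hypothetical response_time field to a float before posting
    coerce_values => { "response_time" => "float" }
  }
}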
