[IN PROGRESS] InfluxDB Integration

Honestly, I was rather surprised by the amount of extra CPU used compared to Gladys without the service. Can an outgoing HTTP request really be that CPU-intensive?

Wouldn't we then have to do some kind of batching, based on a number of events or a delay? For example: every 10 values or every 5 seconds.

Isn’t your InfluxDB container hogging the CPU?

No, actually.

  • Container for the InfluxDB server => on another dedicated server
  • Gladys container with your InfluxDB "client" service => on the RPi with Z2M and Gladys Prod.

And I notice that this latter container (your build) easily uses 4x more CPU than the Gladys build without the service.

[EDIT] They're only spikes up to 60% CPU, but on a very large installation this could become unstable.

OK I didn’t understand and the

Nothing apparent, and I must be on 32-bit, I think (installed before the 64-bit version was available).

Got it!

I found errors:

Raw logs
2022-11-08T13:37:16+0100 <warn> handleMqttMessage.js:101 () Zigbee2mqtt device capteurQualitéDeLair, feature formaldehyd not configured in Gladys.
ERROR: Write to InfluxDB failed. d [HttpError]: failure writing points to database: partial write: field type conflict: input field "value" on measurement "switch" is type boolean, already exists as type float dropped=1
    at IncomingMessage.<anonymous> (/src/server/services/influxdb/node_modules/@influxdata/influxdb-client/src/impl/node/NodeHttpTransport.ts:323:13)
    at IncomingMessage.emit (events.js:412:35)
    at endReadableNT (internal/streams/readable.js:1333:12)
    at processTicksAndRejections (internal/process/task_queues.js:82:21) {
  statusCode: 422,
  statusMessage: 'Unprocessable Entity',
  body: '{"code":"unprocessable entity","message":"failure writing points to database: partial write: field type conflict: input field \\"value\\" on measurement \\"switch\\" is type boolean, already exists as type float dropped=1"}',
  contentType: 'application/json; charset=utf-8',
  json: {
    code: 'unprocessable entity',
    message: 'failure writing points to database: partial write: field type conflict: input field "value" on measurement "switch" is type boolean, already exists as type float dropped=1'
  },
  code: 'unprocessable entity',
  _retryAfter: 0
}
2022-11-08T13:37:16+0100 <error> index.js:15 (process.<anonymous>) unhandledRejection catched: Promise {
  <rejected> Error422:
      at /src/server/services/influxdb/lib/influxdb.writeBinary.js:32:15
      at runMicrotasks (<anonymous>)
      at processTicksAndRejections (internal/process/task_queues.js:95:5) {
    status: 422,
    code: 'UNPROCESSABLE_ENTITY',
    properties: 'InfluxDB API - Unprocessable entity, maybe datatype problem'
  }
}
2022-11-08T13:37:16+0100 <error> index.js:16 (process.<anonymous>) Error422:
    at /src/server/services/influxdb/lib/influxdb.writeBinary.js:32:15
    at runMicrotasks (<anonymous>)
    at processTicksAndRejections (internal/process/task_queues.js:95:5) {
  status: 422,
  code: 'UNPROCESSABLE_ENTITY',
  properties: 'InfluxDB API - Unprocessable entity, maybe datatype problem'
}
ERROR: Write to InfluxDB failed. d [HttpError]: failure writing points to database: partial write: field type conflict: input field "value" on measurement "switch" is type boolean, already exists as type float dropped=1
    at IncomingMessage.<anonymous> (/src/server/services/influxdb/node_modules/@influxdata/influxdb-client/src/impl/node/NodeHttpTransport.ts:323:13)
    at IncomingMessage.emit (events.js:412:35)
    at endReadableNT (internal/streams/readable.js:1333:12)
    at processTicksAndRejections (internal/process/task_queues.js:82:21) {
  statusCode: 422,
  statusMessage: 'Unprocessable Entity',
  body: '{"code":"unprocessable entity","message":"failure writing points to database: partial write: field type conflict: input field \\"value\\" on measurement \\"switch\\" is type boolean, already exists as type float dropped=1"}',
  contentType: 'application/json; charset=utf-8',
  json: {
    code: 'unprocessable entity',
    message: 'failure writing points to database: partial write: field type conflict: input field "value" on measurement "switch" is type boolean, already exists as type float dropped=1'
  },
  code: 'unprocessable entity',
  _retryAfter: 0
}
2022-11-08T13:37:16+0100 <error> index.js:15 (process.<anonymous>) unhandledRejection catched: Promise {
  <rejected> Error422:
      at /src/server/services/influxdb/lib/influxdb.writeBinary.js:32:15
      at runMicrotasks (<anonymous>)
      at processTicksAndRejections (internal/process/task_queues.js:95:5) {
    status: 422,
    code: 'UNPROCESSABLE_ENTITY',
    properties: 'InfluxDB API - Unprocessable entity, maybe datatype problem'
  }
}

Procedure:

  • Start Gladys with the built-in InfluxDB service
  • Run a scene from a wireless button
  • Error

[EDIT]
It looks like an HTTP 422 returned by my InfluxDB server because of an unsupported data type.

I get this one too; it's on my to-do list (I think I'll stop writing booleans and store them as floats, since a switch can carry other data types).
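For illustration, coercing booleans to floats could look roughly like this, using the Point API from @influxdata/influxdb-client. The names buildPoint, feature, state and writeApi are made up for the example, not the actual Gladys service code:

const { Point } = require('@influxdata/influxdb-client');

// Illustrative sketch: always write 'value' as a float, never as a boolean,
// to avoid the "field type conflict" 422 seen in the logs above.
function buildPoint(feature, state) {
  const numericValue =
    typeof state.value === 'boolean' ? (state.value ? 1 : 0) : Number(state.value);

  return new Point(feature.category)          // e.g. 'switch'
    .tag('device', feature.device_name)
    .floatField('value', numericValue)
    .timestamp(new Date(state.created_at));
}

// writeApi comes from influxDB.getWriteApi(...)
writeApi.writePoint(buildPoint(feature, state));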

I've redone the build; I'm processing the data a bit differently.

You need to delete your InfluxDB bucket and recreate it with this new image.

Tested as requested, and on my mini PC with an i5 this time.

Same behavior: the container's CPU usage sits between 6% and 30%, while Gladys without the InfluxDB service stays at 0%.

The logs show a lot of HTTP request activity, which I imagine uses quite a bit of CPU.
My CO2 and VOC sensor is very, very verbose (all data is updated every second).

Finally, there is still an error about a data type, but not the same one this time.

Had you deleted the bucket?

[quote="lmilcent, post:49, topic:7454"]
My CO2

Yes!

I once had a similar case: a task that was very resource-intensive because it had to process a lot of data.
Batching that task completely solved the problem.

I see it like this (rough sketch after the list):

  1. Collect events to process for 10 seconds
  2. Send the last 10 seconds of data all at once
  3. Repeat indefinitely
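As a minimal time-based sketch (the queue, enqueue and the 10-second interval are made up for illustration, not the actual service code):

// Time-based batching sketch: accumulate points, flush every 10 seconds.
const queue = [];

function enqueue(point) {
  queue.push(point); // 1. collect events as they arrive
}

setInterval(() => {
  if (queue.length === 0) return;
  const batch = queue.splice(0, queue.length); // 2. take everything collected so far
  writeApi.writePoints(batch);                 //    and send it in one call
}, 10 * 1000);                                 // 3. repeat indefinitely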

I'm going to see what the Node.js client offers (there's apparently a notion of batching, so we'll find a solution).

For the data type I’ll have to push only floats, I think.

Great! If it's already possible in the client being used, I hope it will help optimize resource usage. With only a few devices at my place I already notice the CPU increase; I can't imagine what it would be like for someone with the same sensor as me, but in every room!

I also noticed something odd: docker stats shows your container consuming much more RAM than Gladys.

Connected into your container, htop shows:

The same thing, in the Gladys container this time:

At first glance, no difference; is it docker stats that’s acting up?

This seems absurd to me, but I have no idea what's going on (I only use the InfluxDB client).

We’ll see if batch mode changes anything :thinking:

+1 for batching calls, it’s really too much to send them one by one in my opinion :grinning_face_with_smiling_eyes:

You can either batch by time (every 10 seconds you flush your queue) or batch by count (every 100 items in the queue, you flush it).
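Both triggers can also be combined, flushing on whichever condition is hit first. Again a hypothetical sketch with arbitrary thresholds, building on the queue from the earlier example:

// Flush when either the count threshold or the time threshold is reached.
const MAX_ITEMS = 100;         // flush every 100 queued items...
const MAX_WAIT_MS = 10 * 1000; // ...or at most every 10 seconds

let lastFlush = Date.now();

// Call maybeFlush() after each enqueue and from a periodic timer.
function maybeFlush() {
  if (queue.length >= MAX_ITEMS || Date.now() - lastFlush >= MAX_WAIT_MS) {
    writeApi.writePoints(queue.splice(0, queue.length));
    lastFlush = Date.now();
  }
}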

I found this, I need to understand the example and implement it.

Okay, I just realized that the client handles the flush itself (for writes only). Before, I was doing it for each point.

I just set the following configuration:

  • Flush every 20 lines (I don't know exactly what a line corresponds to; it's not clear in the docs)
  • And/or flush every 60 seconds.
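For reference, those two settings map to the writeOptions accepted by getWriteApi in @influxdata/influxdb-client: batchSize counts line-protocol lines (a point normally serializes to one line, so 20 lines is roughly 20 points) and flushInterval is in milliseconds. The snippet below only illustrates those options; url, token, org, bucket and point are placeholders, not the actual Gladys configuration:

const { InfluxDB } = require('@influxdata/influxdb-client');

const writeOptions = {
  batchSize: 20,        // flush once 20 line-protocol lines are buffered
  flushInterval: 60000, // and/or flush at least every 60 seconds
};

const writeApi = new InfluxDB({ url, token })
  .getWriteApi(org, bucket, 'ms', writeOptions);

// writePoint() now only buffers; the client sends the HTTP request
// when one of the two thresholds above is reached.
writeApi.writePoint(point);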


I see a huge difference!

The Gladys InfluxDB container barely exceeds 6% CPU, and it's generally right in line with the original container. I can hardly notice any difference in consumption anymore.

For me the service is functional, as long as you add front-end messages like "configuration saved" or "error".

Yes, the UI is in progress.

Then the tests.
In 6 months it should be ready :face_savoring_food:
