Gladys unavailable after reboot

Hi everyone,
Second topic of the day for me, I must be jinxed today…
I tried to follow the docs to install OwnTracks.
I requested the API key in Gladys.
Everything is fine on the OwnTracks side, so I go to Gladys to check that I can see myself.
The Gladys UI is not available…
I connect to the Pi, run a docker ps -a and do see the 2 containers (gladysassistant and watchtower).
I reboot to be sure of what I'm seeing.

Since then the UI has been unavailable and the gladysassistant container keeps restarting… so I have no access to anything at all.

Do you know where this issue could come from?

pi@gladys:~ $ docker ps -a
CONTAINER ID        IMAGE                                   COMMAND                  CREATED             STATUS              PORTS               NAMES
47a53ec43523        gladysassistant/gladys:4.0.0-beta-arm   "docker-entrypoint.s…"   5 days ago          Up About a minute                       gladys
b9b0c0475788        containrrr/watchtower:armhf-latest      "/watchtower --clean…"   3 months ago        Up 17 minutes       8080/tcp            watchtower
pi@gladys:~ $ docker ps -a
CONTAINER ID        IMAGE                                   COMMAND                  CREATED             STATUS                         PORTS               NAMES
47a53ec43523        gladysassistant/gladys:4.0.0-beta-arm   "docker-entrypoint.s…"   5 days ago          Restarting (1) 3 seconds ago                       gladys
b9b0c0475788        containrrr/watchtower:armhf-latest      "/watchtower --clean…"   3 months ago        Up 22 minutes                  8080/tcp            watchtower
pi@gladys:~ $ docker ps -a
CONTAINER ID        IMAGE                                   COMMAND                  CREATED             STATUS              PORTS               NAMES
47a53ec43523        gladysassistant/gladys:4.0.0-beta-arm   "docker-entrypoint.s…"   5 days ago          Up 5 seconds                            gladys
b9b0c0475788        containrrr/watchtower:armhf-latest      "/watchtower --clean…"   3 months ago        Up 22 minutes       8080/tcp            watchtower
pi@gladys:~ $ 


pi@gladys:~ $ lsb_release -a
No LSB modules are available.
Distributor ID:	Raspbian
Description:	Raspbian GNU/Linux 10 (buster)
Release:	10
Codename:	buster
pi@gladys:~ $

Here are the Gladys logs, obtained with the command:

docker logs --details 47a53ec43523

 gladys-server@ start:prod /src/server
 cross-env NODE_ENV=production node index.js
     
     Initialising OpenZWave 1.6.1051 binary addon for Node.JS.
     	OpenZWave Security API is ENABLED
     	ZWave device db    : /usr/local/etc/openzwave
     	User settings path : /src/server/services/zwave/node_modules/openzwave-shared/build/Release/../../
     	Option Overrides : --Logging false --ConsoleOutput false --SaveConfiguration true
     2020-09-06T16:15:11+0200 <info> index.js:20 (Object.start) Starting Open Weather service
     2020-09-06T16:15:11+0200 <info> index.js:19 (Object.start) Starting telegram service
     2020-09-06T16:15:11+0200 <info> index.js:13 (Object.start) Starting usb service
     2020-09-06T16:15:11+0200 <info> index.js:16 (Object.start) Starting zwave service
     2020-09-06T16:15:23+0200 <info> service.start.js:16 (Service.start) Service telegram is not configured, so it was not started.
     2020-09-06T16:15:31+0200 <warn> connect.js:50 (MqttClient.<anonymous>) Error while connecting to MQTT - Error: getaddrinfo ENOTFOUND m14cloudmqtt.com
     events.js:292
           throw er; // Unhandled 'error' event
           ^
     
     Error: getaddrinfo ENOTFOUND m14cloudmqtt.com
         at GetAddrInfoReqWrap.onlookup [as oncomplete] (dns.js:66:26)
     Emitted 'error' event on MqttClient instance at:
         at TLSSocket.streamErrorHandler (/src/server/services/mqtt/node_modules/mqtt/lib/client.js:333:12)
         at TLSSocket.emit (events.js:327:22)
         at emitErrorNT (internal/streams/destroy.js:92:8)
         at emitErrorAndCloseNT (internal/streams/destroy.js:60:3)
         at processTicksAndRejections (internal/process/task_queues.js:84:21) {
       errno: 'ENOTFOUND',
       code: 'ENOTFOUND',
       syscall: 'getaddrinfo',
       hostname: 'm14cloudmqtt.com'
     }
     npm ERR! code ELIFECYCLE
     npm ERR! errno 1
     npm ERR! gladys-server@ start:prod: `cross-env NODE_ENV=production node index.js`
     npm ERR! Exit status 1
     npm ERR! 
     npm ERR! Failed at the gladys-server@ start:prod script.
     npm ERR! This is probably not a problem with npm. There is likely additional logging output above.
     
     npm ERR! A complete log of this run can be found in:
     npm ERR!     /root/.npm/_logs/2020-09-06T14_15_32_372Z-debug.log

And then it restarts…

So my 2 topics of the day seem to be related…

You have a problem with your MQTT server; please post a screenshot of its configuration.

I can't access the UI…

I just checked the page of the MQTT node module,

and apparently it's a known issue:

As a breaking change, by default an error handler is built into the MQTT.js client, so if any errors are emitted and the user has not created an event handler on the client for errors, the client will not break as a result of unhandled errors.

So there are 2 options: either upgrade the MQTT version, or add a listener for MQTT errors.

Thanks @AlexTrovato

But how can I upgrade the MQTT version of the image running in Docker?

This will need to be fixed in the next version of Gladys… you can already open a ticket on GitHub; it shouldn't take much to fix.

Yes, I'll do that. On the other hand, that means my install has to be redone… :sob:

Once it's fixed, you shouldn't have the issue anymore… while waiting for the new version of Gladys… otherwise yes, you'd have to start from scratch with your current version…

Thanks for the bug report @Fabichou!

Actually, you made a typo in the domain of your MQTT server.

You entered:

m14cloudmqtt.com

I assume it should be:

m14.cloudmqtt.com

?

So this error apparently crashes the MQTT service, which in turn crashes Gladys.

That's a critical bug, thanks for finding it!

By the way, isn't that exactly what's not supposed to happen? V4 was meant to prevent one service from crashing the whole application.

In theory, no; in practice, it can still happen and there's nothing we can do about it (short of running the services in separate processes).

Several cases can crash a service:

  • A critical error in a dependency written in native code. Example: the Z-Wave service relies on C++ bindings (open-zwave). If the C++ side crashes, everything crashes and there is no way to recover.
  • In the MQTT case, it's an "event emitter" design pattern whose error handling is quite particular in Node.js: a try {} catch {} does not catch these errors.

Example:

const thirdPartyModule = require("thirdPartyModule");

// `await` is only valid inside an async function
async function run() {
  try {
    await thirdPartyModule.doIt("some", "params");
  } catch (err) {
    console.log("Ok so the module call failed. Let's try something else here, but don't abort my script please!");
  }
}

If your thirdPartyModule.doIt function emits an 'error' event, it will not be caught by the catch.

On the other hand, here we could add a:

process.on('uncaughtException', (err) => {
    console.log('UNCAUGHT EXCEPTION - keeping process alive:', err);
});

at the root of Gladys, which would catch unhandled 'error' events and would have avoided the crash in this case.

I'm going to implement that to improve overall stability. For point 1) about Z-Wave, however, I don't think there's a solution: when it crashes, it's fatal.

OK I see, thanks for the explanations, that's valuable for a non-dev (by trade). :+1:


Hi @Fabichou, I really need more information about your installation before you trash it: which version of Gladys were you on? Do you remember when you installed this Raspbian image?

I just tested on my end, and the bug doesn't occur. The error is properly caught; everything works.

I think you were on an old version of Gladys 4. Now the question is: why didn't Gladys update itself automatically?

My take: your SD card was full because the partition had never been expanded. That's expected: the current Gladys 4 beta image didn't include automatic partition expansion.

So that means your image isn't toast at all; you can recover it fairly easily.

You can check my theory by looking at the available space on your Pi with:

df -h

Then you can expand the partition with:

sudo raspi-config --expand-rootfs

and then reboot your Pi.

As a routine check, you can run:

docker ps

to verify that you have 2 containers running (gladys + watchtower). If that's the case, you're good.

Normally, after that, Gladys should manage to update itself on its own, and your bug should be resolved :slight_smile:

I know, I'm asking you for a lot of technical manipulation in this post. Rest assured, these steps won't be needed once the RC is released; it's just that we're still in beta and the Raspbian image wasn't yet designed to maintain itself.

That's the hot project of the moment, taking all of our attention with @VonOx :slight_smile:

Hi, double-check, because I did a fresh install this weekend, also added the zigbee2mqtt container on port 81, and then I had to expand my partition myself… via raspi-config.

Hi :slight_smile:
So, my install was 3 months old.
It's permanently connected… I don't understand why it didn't update itself.
Yet when I checked, everything seemed OK. And if I'm not mistaken, I was on beta 0.14.

Here are the logs of the actions:

    pi@gladys:~ $ docker ps -a
    CONTAINER ID        IMAGE                                   COMMAND                  CREATED             STATUS              PORTS               NAMES
    47a53ec43523        gladysassistant/gladys:4.0.0-beta-arm   "docker-entrypoint.s…"   6 days ago          Up 49 seconds                           gladys
    b9b0c0475788        containrrr/watchtower:armhf-latest      "/watchtower --clean…"   3 months ago        Up 25 hours         8080/tcp            watchtower

pi@gladys:~ $ df -h
Filesystem      Size  Used Avail Use% Mounted on
/dev/root       3.5G  2.3G 1000M  70% /
devtmpfs        213M     0  213M   0% /dev
tmpfs           217M     0  217M   0% /dev/shm
tmpfs           217M   12M  206M   6% /run
tmpfs           5.0M  4.0K  5.0M   1% /run/lock
tmpfs           217M     0  217M   0% /sys/fs/cgroup
/dev/mmcblk0p1  253M   53M  200M  21% /boot
tmpfs            44M     0   44M   0% /run/user/1000

pi@gladys:~ $ sudo raspi-config --expand-rootfs

Welcome to fdisk (util-linux 2.33.1).
Changes will remain in memory only, until you decide to write them.
Be careful before using the write command.


Command (m for help): Disk /dev/mmcblk0: 29.7 GiB, 31914983424 bytes, 62333952 sectors
Units: sectors of 1 * 512 = 512 bytes
Sector size (logical/physical): 512 bytes / 512 bytes
I/O size (minimum/optimal): 512 bytes / 512 bytes
Disklabel type: dos
Disk identifier: 0x6c586e13

Device         Boot  Start     End Sectors  Size Id Type
/dev/mmcblk0p1        8192  532479  524288  256M  c W95 FAT32 (LBA)
/dev/mmcblk0p2      532480 7862271 7329792  3.5G 83 Linux

Command (m for help): Partition number (1,2, default 2): 
Partition 2 has been deleted.

Command (m for help): Partition type
   p   primary (1 primary, 0 extended, 3 free)
   e   extended (container for logical partitions)
Select (default p): Partition number (2-4, default 2): First sector (2048-62333951, default 2048): Last sector, +/-sectors or +/-size{K,M,G,T,P} (532480-62333951, default 62333951): 
Created a new partition 2 of type 'Linux' and of size 29.5 GiB.
Partition #2 contains a ext4 signature.

Command (m for help): 
Disk /dev/mmcblk0: 29.7 GiB, 31914983424 bytes, 62333952 sectors
Units: sectors of 1 * 512 = 512 bytes
Sector size (logical/physical): 512 bytes / 512 bytes
I/O size (minimum/optimal): 512 bytes / 512 bytes
Disklabel type: dos
Disk identifier: 0x6c586e13

Device         Boot  Start      End  Sectors  Size Id Type
/dev/mmcblk0p1        8192   532479   524288  256M  c W95 FAT32 (LBA)
/dev/mmcblk0p2      532480 62333951 61801472 29.5G 83 Linux

Command (m for help): The partition table has been altered.
Syncing disks.

Please reboot
pi@gladys:~ $ sudo reboot
Connection to 192.168.1.148 closed by remote host.
Connection to 192.168.1.148 closed.

pi@gladys:~ $ docker ps -a
CONTAINER ID        IMAGE                                   COMMAND                  CREATED             STATUS              PORTS               NAMES
47a53ec43523        gladysassistant/gladys:4.0.0-beta-arm   "docker-entrypoint.s…"   6 days ago          Up 9 seconds                            gladys
b9b0c0475788        containrrr/watchtower:armhf-latest      "/watchtower --clean…"   3 months ago        Up 3 seconds        8080/tcp            watchtower

pi@gladys:~ $ df -h
Filesystem      Size  Used Avail Use% Mounted on
/dev/root        29G  2.3G   26G   9% /
devtmpfs        213M     0  213M   0% /dev
tmpfs           217M     0  217M   0% /dev/shm
tmpfs           217M  3.2M  214M   2% /run
tmpfs           5.0M  4.0K  5.0M   1% /run/lock
tmpfs           217M     0  217M   0% /sys/fs/cgroup
/dev/mmcblk0p1  253M   53M  200M  21% /boot
tmpfs            44M     0   44M   0% /run/user/1000
pi@gladys:~ $ docker ps -a
CONTAINER ID        IMAGE                                   COMMAND                  CREATED             STATUS              PORTS               NAMES
47a53ec43523        gladysassistant/gladys:4.0.0-beta-arm   "docker-entrypoint.s…"   6 days ago          Up 12 seconds                           gladys
b9b0c0475788        containrrr/watchtower:armhf-latest      "/watchtower --clean…"   3 months ago        Up 2 minutes        8080/tcp            watchtower
pi@gladys:~ $ 

That's everything I just did on the Pi, in order.
The Gladys UI is still unavailable and, as you can see, the gladysassistant container keeps restarting.

Can you run a docker inspect gladys and post the result here?

To see which image you're running.

pi@gladys:~ $ docker inspect gladys
[
    {
        "Id": "47a53ec435230fc75738269763e18e3c5914e63ab9333c4c49870d8f52a712f3",
        "Created": "2020-08-31T14:33:07.083927534Z",
        "Path": "docker-entrypoint.sh",
        "Args": [
            "npm",
            "run",
            "start:prod"
        ],
        "State": {
            "Status": "running",
            "Running": true,
            "Paused": false,
            "Restarting": false,
            "OOMKilled": false,
            "Dead": false,
            "Pid": 32507,
            "ExitCode": 0,
            "Error": "",
            "StartedAt": "2020-09-08T20:25:48.312039788Z",
            "FinishedAt": "2020-09-08T20:25:45.290902568Z"
        },
        "Image": "sha256:463491239870b46ccd747f39f021a9f0ab0f1e084a46db567b37f09bb19b8983",
        "ResolvConfPath": "/var/lib/docker/containers/47a53ec435230fc75738269763e18e3c5914e63ab9333c4c49870d8f52a712f3/resolv.conf",
        "HostnamePath": "/var/lib/docker/containers/47a53ec435230fc75738269763e18e3c5914e63ab9333c4c49870d8f52a712f3/hostname",
        "HostsPath": "/var/lib/docker/containers/47a53ec435230fc75738269763e18e3c5914e63ab9333c4c49870d8f52a712f3/hosts",
        "LogPath": "/var/lib/docker/containers/47a53ec435230fc75738269763e18e3c5914e63ab9333c4c49870d8f52a712f3/47a53ec435230fc75738269763e18e3c5914e63ab9333c4c49870d8f52a712f3-json.log",
        "Name": "/gladys",
        "RestartCount": 978,
        "Driver": "overlay2",
        "Platform": "linux",
        "MountLabel": "",
        "ProcessLabel": "",
        "AppArmorProfile": "",
        "ExecIDs": null,
        "HostConfig": {
            "Binds": [
                "/var/run/docker.sock:/var/run/docker.sock",
                "/var/lib/gladysassistant:/var/lib/gladysassistant",
                "/dev:/dev"
            ],
            "ContainerIDFile": "",
            "LogConfig": {
                "Type": "json-file",
                "Config": {}
            },
            "NetworkMode": "host",
            "PortBindings": {},
            "RestartPolicy": {
                "Name": "always",
                "MaximumRetryCount": 0
            },
            "AutoRemove": false,
            "VolumeDriver": "",
            "VolumesFrom": null,
            "CapAdd": null,
            "CapDrop": null,
            "Capabilities": null,
            "Dns": [],
            "DnsOptions": [],
            "DnsSearch": [],
            "ExtraHosts": null,
            "GroupAdd": null,
            "IpcMode": "private",
            "Cgroup": "",
            "Links": null,
            "OomScoreAdj": 0,
            "PidMode": "",
            "Privileged": true,
            "PublishAllPorts": false,
            "ReadonlyRootfs": false,
            "SecurityOpt": [
                "label=disable"
            ],
            "UTSMode": "",
            "UsernsMode": "",
            "ShmSize": 67108864,
            "Runtime": "runc",
            "ConsoleSize": [
                0,
                0
            ],
            "Isolation": "",
            "CpuShares": 0,
            "Memory": 0,
            "NanoCpus": 0,
            "CgroupParent": "",
            "BlkioWeight": 0,
            "BlkioWeightDevice": [],
            "BlkioDeviceReadBps": null,
            "BlkioDeviceWriteBps": null,
            "BlkioDeviceReadIOps": null,
            "BlkioDeviceWriteIOps": null,
            "CpuPeriod": 0,
            "CpuQuota": 0,
            "CpuRealtimePeriod": 0,
            "CpuRealtimeRuntime": 0,
            "CpusetCpus": "",
            "CpusetMems": "",
            "Devices": [],
            "DeviceCgroupRules": null,
            "DeviceRequests": null,
            "KernelMemory": 0,
            "KernelMemoryTCP": 0,
            "MemoryReservation": 0,
            "MemorySwap": 0,
            "MemorySwappiness": null,
            "OomKillDisable": false,
            "PidsLimit": null,
            "Ulimits": null,
            "CpuCount": 0,
            "CpuPercent": 0,
            "IOMaximumIOps": 0,
            "IOMaximumBandwidth": 0,
            "MaskedPaths": null,
            "ReadonlyPaths": null
        },
        "GraphDriver": {
            "Data": {
                "LowerDir": "/var/lib/docker/overlay2/992c5fd9dc41fb1355ca834aaa5babab7e809c70fa4e4dab6ca5d6159394c512-init/diff:/var/lib/docker/overlay2/05b7b195c774881f58cba0533f79d5aabe67cbe453eb7e13913b14285584620d/diff:/var/lib/docker/overlay2/a69b9cc32e2e61eff29845edd2884c307cd4a38cdb5bc294419e981cff4da4e3/diff:/var/lib/docker/overlay2/baa67559087c3eb037808a8b083a5b75d46dd0b074b82e8979281ca8d51bf566/diff:/var/lib/docker/overlay2/836bc8df4f2751c0bd26db5bb9c17091d9cbda2c3417c201600329ed27cfc90b/diff:/var/lib/docker/overlay2/a0cacc132fd00698b656494e375d740d738bac737942331db65cfb73ac0e2355/diff:/var/lib/docker/overlay2/474741e4205465961843cf4bbe8ba977dc2a4feaa694b8e74055202d50dee40f/diff:/var/lib/docker/overlay2/575ba7eb2685aa3fd4553aa5b60773350bfa591142ffabca4f5a1ac43c49fcce/diff:/var/lib/docker/overlay2/71b53704530522c8e2597216a2aa49810a9fa027ed17ea44320485304af3f7ab/diff:/var/lib/docker/overlay2/4818cc93aa35c471133e21fa1230e89b670af40f63b7fcd9a4e67edaae2e0f39/diff:/var/lib/docker/overlay2/b7f5e62be5c2edeaa423a45f4b58ad98107bd77f2ec6584dbf2aa260ae71da91/diff:/var/lib/docker/overlay2/fcb0a411a09aeef9b88438efdf94faf3577e712a77eeee52435626a0518ac682/diff",
                "MergedDir": "/var/lib/docker/overlay2/992c5fd9dc41fb1355ca834aaa5babab7e809c70fa4e4dab6ca5d6159394c512/merged",
                "UpperDir": "/var/lib/docker/overlay2/992c5fd9dc41fb1355ca834aaa5babab7e809c70fa4e4dab6ca5d6159394c512/diff",
                "WorkDir": "/var/lib/docker/overlay2/992c5fd9dc41fb1355ca834aaa5babab7e809c70fa4e4dab6ca5d6159394c512/work"
            },
            "Name": "overlay2"
        },
        "Mounts": [
            {
                "Type": "bind",
                "Source": "/dev",
                "Destination": "/dev",
                "Mode": "",
                "RW": true,
                "Propagation": "rprivate"
            },
            {
                "Type": "bind",
                "Source": "/var/lib/gladysassistant",
                "Destination": "/var/lib/gladysassistant",
                "Mode": "",
                "RW": true,
                "Propagation": "rprivate"
            },
            {
                "Type": "bind",
                "Source": "/var/run/docker.sock",
                "Destination": "/var/run/docker.sock",
                "Mode": "",
                "RW": true,
                "Propagation": "rprivate"
            }
        ],
        "Config": {
            "Hostname": "raspberrypi",
            "Domainname": "",
            "User": "",
            "AttachStdin": false,
            "AttachStdout": false,
            "AttachStderr": false,
            "ExposedPorts": {
                "80/tcp": {}
            },
            "Tty": false,
            "OpenStdin": false,
            "StdinOnce": false,
            "Env": [
                "TZ=Europe/Paris",
                "SQLITE_FILE_PATH=/var/lib/gladysassistant/gladys-production.db",
                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                "NODE_VERSION=12.18.3",
                "YARN_VERSION=1.22.4",
                "NODE_ENV=production",
                "SERVER_PORT=80"
            ],
            "Cmd": [
                "npm",
                "run",
                "start:prod"
            ],
            "Image": "gladysassistant/gladys:4.0.0-beta-arm",
            "Volumes": null,
            "WorkingDir": "/src/server",
            "Entrypoint": [
                "docker-entrypoint.sh"
            ],
            "OnBuild": null,
            "Labels": {
                "org.label-schema.build-date": "",
                "org.label-schema.version": ""
            }
        },
        "NetworkSettings": {
            "Bridge": "",
            "SandboxID": "7b36f9c0d6d6e2b25bc782553f55571a7ac15b7d616bac20c91dfaea86a9a439",
            "HairpinMode": false,
            "LinkLocalIPv6Address": "",
            "LinkLocalIPv6PrefixLen": 0,
            "Ports": {},
            "SandboxKey": "/var/run/docker/netns/default",
            "SecondaryIPAddresses": null,
            "SecondaryIPv6Addresses": null,
            "EndpointID": "",
            "Gateway": "",
            "GlobalIPv6Address": "",
            "GlobalIPv6PrefixLen": 0,
            "IPAddress": "",
            "IPPrefixLen": 0,
            "IPv6Gateway": "",
            "MacAddress": "",
            "Networks": {
                "host": {
                    "IPAMConfig": null,
                    "Links": null,
                    "Aliases": null,
                    "NetworkID": "52cf186cb21192c3c9aa88cb4c94bb770b3deff48b02aa4392285f1cf4504c39",
                    "EndpointID": "be3f8a328fbee5ba2fede973c370d9f0e4a01a202221bbab8d2e56144f8ea4f7",
                    "Gateway": "",
                    "IPAddress": "",
                    "IPPrefixLen": 0,
                    "IPv6Gateway": "",
                    "GlobalIPv6Address": "",
                    "GlobalIPv6PrefixLen": 0,
                    "MacAddress": "",
                    "DriverOpts": null
                }
            }
        }
    }
]
pi@gladys:~ $

OK, so your image seems recent; the container was created: "Created": "2020-08-31T14:33:07.083927534Z"

Could you grab the Gladys SQLite DB located at /var/lib/gladysassistant/gladys-production.db? Your bug is really critical and we need to understand what happened so we can fix it :slight_smile: You can then open the file with TablePlus, for example. Look at what's in the "t_variable" table and show us the fields related to MQTT. (Careful: remember to blur your sensitive information: passwords, API keys, etc.)

@pierre-gilles
You're going to be disappointed; here is the content of t_variable (sensitive values are truncated…):

[
{
    "created_at": "2020-06-06 14:16:00.122 +00:00",
    "id": "",
    "name": "GLADYS_INSTANCE_CLIENT_ID",
    "service_id": "",
    "updated_at": "2020-06-06 14:16:00.122 +00:00",
    "user_id": "",
    "value": ""
},
{
    "created_at": "2020-06-06 14:16:11.142 +00:00",
    "id": "",
    "name": "DEVICE_STATE_HISTORY_IN_DAYS",
    "service_id": "",
    "updated_at": "2020-06-06 14:16:11.142 +00:00",
    "user_id": "",
    "value": "-1"
},
{
    "created_at": "2020-06-06 14:18:00.564 +00:00",
    "id": "",
    "name": "GLADYS_GATEWAY_REFRESH_TOKEN",
    "service_id": "",
    "updated_at": "2020-06-06 14:18:00.564 +00:00",
    "user_id": "",
    "value": ""
},
{
    "created_at": "2020-06-06 14:18:00.858 +00:00",
    "id": "",
    "name": "GLADYS_GATEWAY_RSA_PRIVATE_KEY",
    "service_id": "",
    "updated_at": "2020-06-06 14:18:00.858 +00:00",
    "user_id": "",
    "value": ""
},
{
    "created_at": "2020-06-06 14:18:01.025 +00:00",
    "id": "",
    "name": "GLADYS_GATEWAY_ECDSA_PRIVATE_KEY",
    "service_id": "",
    "updated_at": "2020-06-06 14:18:01.025 +00:00",
    "user_id": "",
    "value": ""
},
{
    "created_at": "2020-06-06 14:18:01.174 +00:00",
    "id": "",
    "name": "GLADYS_GATEWAY_RSA_PUBLIC_KEY",
    "service_id": "",
    "updated_at": "2020-06-06 14:18:01.174 +00:00",
    "user_id": "",
    "value": ""
},
{
    "created_at": "2020-06-06 14:18:01.325 +00:00",
    "id": "",
    "name": "GLADYS_GATEWAY_ECDSA_PUBLIC_KEY",
    "service_id": "",
    "updated_at": "2020-06-06 14:18:01.325 +00:00",
    "user_id": "",
    "value": ""
},
{
    "created_at": "2020-06-06 14:18:01.462 +00:00",
    "id": "",
    "name": "GLADYS_GATEWAY_USERS_KEYS",
    "service_id": "",
    "updated_at": "2020-07-04 12:25:38.274 +00:00",
    "user_id": "",
    "value": ""
},
{
    "created_at": "2020-06-06 14:18:01.610 +00:00",
    "id": "",
    "name": "GLADYS_GATEWAY_BACKUP_KEY",
    "service_id": "",
    "updated_at": "2020-06-06 14:18:01.610 +00:00",
    "user_id": "",
    "value": ""
},
{
    "created_at": "2020-06-12 15:33:17.145 +00:00",
    "id": "",
    "name": "ZWAVE_DRIVER_PATH",
    "service_id": "",
    "updated_at": "2020-07-24 13:12:41.506 +00:00",
    "user_id": "",
    "value": "/dev/ttyACM1"
},
{
    "created_at": "2020-06-15 17:20:50.897 +00:00",
    "id": "",
    "name": "DARKSKY_API_KEY",
    "service_id": "",
    "updated_at": "2020-06-15 17:20:50.897 +00:00",
    "user_id": "",
    "value": ""
},
{
    "created_at": "2020-07-29 08:40:33.833 +00:00",
    "id": "",
    "name": "OPENWEATHER_API_KEY",
    "service_id": "",
    "updated_at": "2020-07-29 08:42:38.827 +00:00",
    "user_id": "",
    "value": ""
},
{
    "created_at": "2020-08-27 15:31:17.003 +00:00",
    "id": "",
    "name": "TIMEZONE",
    "service_id": "",
    "updated_at": "2020-08-27 15:31:17.003 +00:00",
    "user_id": "",
    "value": "Europe/Paris"
},
{
    "created_at": "2020-08-27 15:31:17.021 +00:00",
    "id": "",
    "name": "TIMEZONE",
    "service_id": "",
    "updated_at": "2020-08-27 15:31:17.021 +00:00",
    "user_id": "",
    "value": "Europe/Paris"
},
{
    "created_at": "2020-08-27 15:31:17.227 +00:00",
    "id": "",
    "name": "TIMEZONE",
    "service_id": "",
    "updated_at": "2020-08-27 15:31:17.227 +00:00",
    "user_id": "",
    "value": "Europe/Paris"
}

]