I’m puzzled. I’ve got a DHT22 sensor, initially configured like
sensor:
  - platform: dht
    pin: GPIO03
    temperature:
      name: "Temperature"
    humidity:
      name: "Humidity"
    update_interval: 10s
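One variant worth trying (a sketch on my part, not something I have tested yet): pin the sensor model explicitly instead of relying on auto-detection, since a mis-detected model decodes the raw bits differently and could produce strange values.

```yaml
sensor:
  - platform: dht
    pin: GPIO03
    # Force DHT22 decoding instead of auto-detection; a DHT11-style
    # decode of DHT22 data yields wrong temperatures.
    model: DHT22
    temperature:
      name: "Temperature"
    humidity:
      name: "Humidity"
    update_interval: 10s
```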
The sensor reads
[22:47:26][D][dht:048]: Got Temperature=26.6°C Humidity=43.0%
The temperature value is already 6.6 °C too warm.
I correct it with an offset filter.
  - platform: dht
    pin: GPIO03
    temperature:
      name: "Temperature"
      filters:
        - offset: -5.5
    humidity:
      name: "Humidity"
    update_interval: 10s
After recompiling, the sensor reads
[22:56:13][D][sensor:093]: 'Temperature': Sending state 23.30000 °C with 1 decimals of accuracy
[22:56:13][D][sensor:093]: 'Humidity': Sending state 38.00000 % with 0 decimals of accuracy
From my primary school days, I remember that 26.6 - 5.5 = 21.1.
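For what it's worth, the subtraction I expect could also be written as a lambda filter (a sketch, assuming the offset filter really is plain addition of the given value):

```yaml
    temperature:
      name: "Temperature"
      filters:
        # Should behave the same as "- offset: -5.5":
        # x is the raw reading in °C, so 26.6 should become 21.1.
        - lambda: return x - 5.5;
```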
I’ve also noticed several times that after a power cycle, the sensor starts reporting completely different values, well below zero.
[23:13:34][D][sensor:093]: 'Temperature': Sending state -18.30000 °C with 1 decimals of accuracy
[23:13:34][D][sensor:093]: 'Humidity': Sending state 24.90000 % with 0 decimals of accuracy
I live in an old house, but it is not that cold. Usually the temperature inside the house is close to the outside temperature …
I’m puzzled by the initial inaccurate reading, which I would expect to be within a few tenths of a degree of the real temperature. And secondly, why doesn’t the offset make mathematical sense?
If I remove the offset, the temperature goes back to 26.6C.
The leads are very short and I use a 4.7k pull-up resistor. I’m using the RX pin (GPIO03) on a basic Sonoff for the input. This pin is used for the initial programming, but this configuration worked in the past, before my HA install corrupted itself.
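Since GPIO03 doubles as UART0 RX, my understanding is that serial logging on the same UART can interfere with a sensor wired to that pin. Disabling log output over the serial port (logs stay available over the network/API) is one thing I could try:

```yaml
# Disable logging over UART0 (TX=GPIO01, RX=GPIO03) so the logger
# does not drive the pin the DHT22 is wired to.
logger:
  baud_rate: 0
```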