It just so happens that I've recently completed a driver for the Bosch BME680 environmental sensor, as well as a separate module for the Bosch BSEC sensor fusion library. I haven't yet submitted the patches, so I don't know if the approach I chose would be accepted, but I'd like to add my own thoughts to the discussion. First, let me give a brief background on the sensor and the BSEC library.
The BME680 contains sensors for temperature, humidity, pressure and gas. To take a reading that includes the gas sensor, you first trigger a measurement (the sensor measures the first three parameters and then turns on the gas sensor heater for a period of time), wait a specified amount of time (approximately 200ms), and then read the results. The BSEC library requires that the sensor be read at precise intervals (it contains profiles for 3s and 300s periods); successive readings are combined to perform compensation, background calibration and air quality measurement. Bosch states that the sensor is not really meant to be used standalone, but rather in conjunction with the BSEC library.

I decided to have the sensor driver layer perform only the trigger-wait-read process, so my sensor_read function triggers a measurement, calls os_time_delay to wait, and then reads and returns the result (roughly sketched below). Since the BSEC library is a bit higher level, I packaged it separately: that package spawns its own thread which takes care of the timing requirements for reading, and it exposes its own API for retrieving the processed results. So in summary, the BSEC package uses the Mynewt sensor API to access the BME680, while my application code uses a separate custom API to access the processed data. I did it this way because I didn't feel the sensor API was meant to be so high level that it would spawn its own background threads, hence the separate package.

In your case, then, I would simply have implemented the sensor read as part of the sensor API, but created a separate package that is responsible for regularly reading from the sensor and providing an "always ready" result to the application. I guess the "right way" depends on how high level the sensor API is intended to be; I've taken the assumption that the API aims to cover simply taking a single measurement from the sensor, rather than providing a fully processed and always-up-to-date result.
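To make that concrete, the read path in my driver looks roughly like this. It's a trimmed sketch rather than the actual patch: bme680_trigger_forced_mode(), bme680_read_results() and struct bme680_result stand in for the real register access code.

    #include "os/os.h"
    #include "sensor/sensor.h"

    static int
    bme680_sensor_read(struct sensor *sensor, sensor_type_t type,
                       sensor_data_func_t data_func, void *data_arg,
                       uint32_t timeout)
    {
        struct bme680_result res;
        int rc;

        /* Trigger a forced-mode measurement: T/P/H first, then the
         * gas sensor heater runs for the configured duration. */
        rc = bme680_trigger_forced_mode(sensor);
        if (rc) {
            return rc;
        }

        /* Block until the measurement is complete (~200ms for the
         * heater profile I use). */
        os_time_delay((200 * OS_TICKS_PER_SEC) / 1000);

        rc = bme680_read_results(sensor, &res);
        if (rc) {
            return rc;
        }

        /* The real driver translates res into the per-type
         * sensor_*_data structs before calling data_func; elided. */
        return data_func(sensor, data_arg, &res, type);
    }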
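The wrapper package is shaped more or less as follows. Again, the names are illustrative rather than real: struct bsec_output, bsec_process() and bsec_get_latest() are placeholders for the actual BSEC glue.

    #include "os/os.h"
    #include "sensor/sensor.h"

    #define POLL_PERIOD     (3 * OS_TICKS_PER_SEC)  /* BSEC 3s profile */

    static struct os_task poll_task;
    static os_stack_t poll_stack[OS_STACK_ALIGN(256)];
    static struct os_mutex cache_mtx;
    static struct bsec_output cached;   /* latest processed result */

    static int
    poll_data_cb(struct sensor *sensor, void *arg, void *data,
                 sensor_type_t type)
    {
        /* Run the raw reading through BSEC and cache the output. */
        os_mutex_pend(&cache_mtx, OS_TIMEOUT_NEVER);
        bsec_process(data, type, &cached);
        os_mutex_release(&cache_mtx);
        return 0;
    }

    static void
    poll_task_func(void *arg)
    {
        struct sensor *sensor;
        os_time_t start;

        sensor = sensor_mgr_find_next_bydevname("bme680_0", NULL);
        while (1) {
            start = os_time_get();
            sensor_read(sensor, SENSOR_TYPE_ALL, poll_data_cb, NULL,
                        OS_TIMEOUT_NEVER);
            /* Sleep out the remainder of the period so the reads stay
             * evenly spaced, which is what BSEC cares about. (A real
             * implementation should guard against the read overrunning
             * the period.) */
            os_time_delay(POLL_PERIOD - (os_time_get() - start));
        }
    }

    /* The "always ready" call the application uses instead of the
     * sensor API. */
    int
    bsec_get_latest(struct bsec_output *out)
    {
        os_mutex_pend(&cache_mtx, OS_TIMEOUT_NEVER);
        *out = cached;
        os_mutex_release(&cache_mtx);
        return 0;
    }

    void
    bsec_pkg_init(void)
    {
        os_mutex_init(&cache_mtx);
        os_task_init(&poll_task, "bsec", poll_task_func, NULL, 200,
                     OS_WAIT_FOREVER, poll_stack, OS_STACK_ALIGN(256));
    }

The mutex is there because bsec_get_latest() runs in the application task while the poll task updates the cache behind its back.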
On Sat, 30 Jun 2018 at 00:02, Kevin Townsend <[email protected]> wrote:
>
> I'm working on some sensor drivers for Mynewt 1.4, and have run into an
> issue that I'm not sure has a single perfect answer, but should perhaps
> be addressed or further discussed.
>
> Most sensors have specific timing constraints, and in certain instances
> these can change dynamically. The TSL2561 light sensor, for example, can
> have its integration time set from 13ms to 402ms, depending on the
> level of light sensitivity required. The TSL2591 driver I'm writing
> (since the TSL2561 is EOL) has a variable integration time from
> 100..600ms.
>
> The 'problem' is that there is no concept of minimum time between sample
> reads in the sensor API at present (to my knowledge, feel free to
> correct me!), and I'm not sure the best way to insert this delay between
> valid reads so that the data we get back can be considered reliable or
> fresh.
>
> If a sensor has a 300ms delay between valid samples, for example, we can
> still request data every 10ms but the response is undetermined in the
> sense that each sensor will handle this differently. In the case of the
> TSL2561 and TSL2591 the first sample requested will likely be invalid
> since a single valid integration cycle hasn't finished, and then it will
> buffer and continue to return values until the NEXT valid sample is
> available. This is visible in the following sequence where the first IR
> reading is completely out of range, and some subsequent values are
> actually cached entries that might not reflect current light levels
> since they happen before the next integration time elapses:
>
> 011023 compat> tsl2591 r 10
> 011799 Full: 30
> 011799 IR: 61309
> 011799 Full: 30
> 011800 IR: 13
> 011801 Full: 30
> 011801 IR: 13
> 011801 Full: 30
> 011801 IR: 13
> 011802 Full: 30
> 011802 IR: 13
> 011802 Full: 30
> 011803 IR: 13
> 011803 Full: 30
> 011803 IR: 13
> 011804 Full: 30
> 011804 IR: 13
> 011804 Full: 30
> 011804 IR: 13
> 011805 Full: 30
> 011805 IR: 13
>
> I'm not sure what the best way to handle this is, though.
>
> Some options are:
>
> * Add a blocking delay in the read task to take into account the
>   current minimum delay between valid samples (at the risk of causing
>   problems on the I2C bus if other devices perform transactions in
>   between)
> * Add a concept of 'minimum time' between sample reads at the sensor
>   API level and enforce this at a higher level, with one of the
>   following consequences for read requests that occur before this
>   delay: (*Keep in mind that this min value can change dynamically
>   based on sensor config or auto-ranging!)
>     o Return an appropriate error value
>     o Return the previous cached value with the sample still marked as
>       valid
>     o Return the previous cached value with the sample marked as invalid
>     o Other?
>
> There are other solutions, but I was hoping to get some feedback on this
> to hear what other people think of the issue of the current disparity
> between the sensor API and real-world timing constraints of the sensors
> themselves. An argument could be made that the end user should know and
> work with the constraints of their HW, but it seems like this could also
> be handled with some small API additions as well?
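P.S. If the "cached value marked invalid" option were adopted for the TSL2591, my instinct would be to enforce the minimum interval inside the driver itself, something like the sketch below. The tsl2591 struct fields (min_interval, last_read, cached_sld) and tsl2591_read_raw() are made up for illustration.

    static int
    tsl2591_sensor_read(struct sensor *sensor, sensor_type_t type,
                        sensor_data_func_t data_func, void *data_arg,
                        uint32_t timeout)
    {
        struct tsl2591 *dev = (struct tsl2591 *)SENSOR_GET_DEVICE(sensor);
        struct sensor_light_data sld;
        int rc;

        /* Has the integration period elapsed since the last bus read? */
        if (os_time_get() - dev->last_read < dev->min_interval) {
            /* Hand back the cached sample, flagged as stale. Returning
             * SYS_EAGAIN here instead would be the "error" variant. */
            sld = dev->cached_sld;
            sld.sld_full_is_valid = 0;
            sld.sld_ir_is_valid = 0;
            return data_func(sensor, data_arg, &sld, type);
        }

        rc = tsl2591_read_raw(dev, &sld);   /* actual bus access */
        if (rc) {
            return rc;
        }
        dev->last_read = os_time_get();
        dev->cached_sld = sld;

        return data_func(sensor, data_arg, &sld, type);
    }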
