I just noticed that most of the ADC implementations providing their own adc_res_t do not cover all values. The API documentation states that adc_sample() should return -1 on unsupported resolutions. This indicates that all possible resolutions have to be defined in any case, so that a user could check at run time which resolutions are provided.
However: Wouldn't it make more sense to only provide the enum values actually supported? This would have two advantages:
1. Currently, every place where adc_res_t is defined needs to be updated whenever new resolutions are added, resulting in ongoing maintenance effort.
2. Defining only the resolutions that are actually supported would turn unsupported uses into compile-time errors, which are much easier to spot and debug than run-time errors.
Additionally, use cases where users need to determine the available resolutions could be addressed by e.g. defining HAVE_ADC_RES_10BIT whenever ADC_RES_10BIT is supported. On top of that, ADC_RES_MAX could be provided for the highest supported resolution enum value, and ADC_RES_MAX_BITS for its number of bits. This would allow determining the resolution at compile time, reducing both run-time and memory overhead.
But: Since the current approach of probing available resolutions would already cause compile-time errors on most implementations (when testing for resolutions not covered in the enum), maybe nobody actually relies on it?
Kind regards, Marian