Analog channel not reading a correct 12-bit value
Hello everyone,
I'm working with a PIC24EP256GP206 that needs to periodically read 12 analog pins. No particular speed is required, and the Microchip documentation for a peripheral as simple as an ADC is inexplicably complex, so I'm just reading each channel in series, one at a time.
Given those premises, I want to be as precise as possible, so I read every channel in 12-bit mode. This works well except for a couple of channels, namely AN12 and AN13: where I should be reading the maximum value of 4095, I instead get values around ~3000, so it seems the whole scale tops out about 1000 counts lower than it should.
I can confirm the analog pins themselves work, because by switching those particular channels to 10-bit mode I get the full 0-1023 scale.
I could not find any mention of this behavior in the datasheet, FRM, or errata. Here are my setup and reading functions:
#include <xc.h>
// FCY (instruction clock, Hz) must be defined before this include for __delay_us()
#include <libpic30.h>

#define NUM_SAMPLES 8           // samples averaged per reading (actual value doesn't matter much)

typedef enum {
    ADC_TEMP1  = 9,
    ADC_TEMP2  = 0,
    ADC_PRESS1 = 1,
    ADC_PRESS2 = 7,
    ADC_10V1   = 8,
    ADC_10V2   = 11,
    ADC_420MA1 = 12,
    ADC_420MA2 = 13,
    ADC_420MA3 = 14,
    ADC_420MA4 = 15,
} adc_input_num_t;
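(For completeness: the pins themselves are configured as analog inputs elsewhere, roughly like the sketch below. The ANx-to-port mapping in it is from memory, so treat it as illustrative and check the pinout table.)

// Illustrative sketch only: each scanned ANx pad must be analog (ANSELx = 1)
// and an input (TRISx = 1). The RB12/RB13 mapping below is assumed -- check
// the PIC24EP256GP206 pinout for the real ANx-to-port assignment.
void init_adc_pins(void)
{
    ANSELBbits.ANSB12 = 1;      // AN12 as analog (port bit assumed)
    TRISBbits.TRISB12 = 1;
    ANSELBbits.ANSB13 = 1;      // AN13 as analog (port bit assumed)
    TRISBbits.TRISB13 = 1;
    // ...and likewise for the other channels in adc_input_num_t
}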
void init_adc(void)
{
    // ASAM disabled; ADDMABM disabled; ADSIDL disabled; DONE disabled; SIMSAM Sequential; FORM Absolute decimal result, unsigned, right-justified; SAMP disabled; SSRC Clearing sample bit ends sampling and starts conversion; AD12B 12-bit; ADON enabled; SSRCG disabled;
    AD1CON1 = 0x8400;
    // CSCNA disabled; VCFG0 AVDD; VCFG1 AVSS; ALTS disabled; BUFM disabled; SMPI Generates interrupt after completion of every sample/conversion operation; CHPS 1 Channel;
    AD1CON2 = 0x00;
    // SAMC 0; ADRC FOSC/2; ADCS 0;
    AD1CON3 = 0x00;
    // CH0SA AN0; CH0SB AN0; CH0NB AVSS; CH0NA AVSS;
    AD1CHS0 = 0x00;
    // CSS26 disabled; CSS25 disabled; CSS24 disabled; CSS31 disabled; CSS30 disabled;
    AD1CSSH = 0x00;
    // CSS2 disabled; CSS1 disabled; CSS0 disabled; CSS8 disabled; CSS7 disabled; CSS6 disabled; CSS5 disabled; CSS4 disabled; CSS3 disabled;
    AD1CSSL = 0x00;
    // DMABL Allocates 1 word of buffer to each analog input; ADDMAEN disabled;
    AD1CON4 = 0x00;
    // CH123SA disabled; CH123SB CH1=OA2/AN0,CH2=AN1,CH3=AN2; CH123NA disabled; CH123NB CH1=VREF-,CH2=VREF-,CH3=VREF-;
    AD1CHS123 = 0x00;

    IFS0bits.AD1IF = 0;
}
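(Side note on timing: with ADRC = 0 and ADCS = 0 the conversion clock is TAD = TCY, and 12-bit mode has a minimum TAD, around 118 ns on this family if I read the electrical specs right; at my clock TAD works out to 120 ns, so it just barely fits. If the clock ever goes up, I'd derive the divider with something like the sketch below; the 118 ns figure is something to verify against the datasheet, not gospel.)

// Minimal sketch: pick the smallest ADCS such that TAD = TCY * (ADCS + 1)
// meets the 12-bit minimum. TAD_MIN_NS is an assumption -- check the
// "ADC Module Specifications" table in the datasheet for your revision.
#define TAD_MIN_NS 118UL

static void set_adc_clock(void)
{
    unsigned long tcy_ns = 1000000000UL / FCY;  // TCY in ns (FCY = instruction clock)
    AD1CON3bits.ADRC = 0;                       // derive the ADC clock from TCY, not the internal RC
    AD1CON3bits.ADCS = (TAD_MIN_NS + tcy_ns - 1UL) / tcy_ns - 1UL;  // ceil(TADmin/TCY) - 1
}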
unsigned int read_adc_input(adc_input_num_t channel)
{
    unsigned int value = 0, i;

    AD1CON1bits.AD12B = 1;          // 12-bit, single-channel mode
    AD1CON1bits.ADON = 0;           // module off while switching the input mux
    AD1CHS0bits.CH0SA = channel;
    AD1CON1bits.ADON = 1;
    __delay_us(20);                 // stabilization time after ADON

    for (i = 0; i < NUM_SAMPLES; i++) {
        AD1CON1bits.SAMP = 1;       // start sampling
        __delay_us(2);              // Tad is 120 ns
        AD1CON1bits.SAMP = 0;       // end sampling, start conversion
        __delay_us(5);
        // Of course the DONE bit does not work properly, so I need a workaround
        while (IFS0bits.AD1IF == 0) {
            __delay_us(1);
        }
        IFS0bits.AD1IF = 0;         // clear the flag here, or the next wait falls straight through
        value += ADC1BUF0;          // 16-bit accumulator: fine for NUM_SAMPLES <= 16
    }
    AD1CON1bits.ADON = 0;
    return value / NUM_SAMPLES;
}
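(About that workaround: with SSRC = 0 I never saw DONE behave, hence polling AD1IF. The variant below uses auto-convert, SSRC = 7, where the internal counter ends sampling and hardware sets DONE as documented; the SAMC value in it is a guess on my part, not something I've validated.)

// Sketch of the same read using auto-convert: the internal counter ends
// sampling after SAMC TADs and DONE flags the finished conversion.
// SAMC = 14 is an assumed sampling time, not a measured requirement.
unsigned int read_adc_input_auto(adc_input_num_t channel)
{
    AD1CON1bits.ADON = 0;           // module off while reconfiguring
    AD1CON1bits.AD12B = 1;          // 12-bit mode
    AD1CON1bits.SSRC = 7;           // internal counter ends sampling, starts conversion
    AD1CON3bits.SAMC = 14;          // auto-sample time in TADs (assumed value)
    AD1CHS0bits.CH0SA = channel;
    AD1CON1bits.ADON = 1;
    __delay_us(20);                 // stabilization after ADON

    AD1CON1bits.SAMP = 1;           // begin sampling; hardware handles the rest
    while (!AD1CON1bits.DONE);      // DONE is set by hardware in this mode
    AD1CON1bits.ADON = 0;
    return ADC1BUF0;
}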
I'd be very grateful if anyone could point me to the detail I've probably missed in the convoluted mess that Microchip calls "documentation".