
Analog channel not reading a correct 12 bit value

Author
Maldus
Starting Member
  • Total Posts : 56
  • Joined: 2016/08/17 09:55:57
  • Status: offline
2019/06/27 02:51:52 (permalink)


Hello everyone,
I'm working on a PIC24EP256GP206 device that needs to periodically read 12 analog pins. No particular speed is required, and the Microchip documentation for a peripheral as simple as an ADC is inexplicably complex, so I'm just going to read each channel in series, one at a time.
Given these premises I want to be as precise as possible and read every channel in 12-bit mode. This works well except for a couple of channels, namely AN12 and AN13. On those, where I should be reading the maximum value of 4095 I instead get values around ~3000, so it would seem the whole scale tops out about 1000 counts lower than it should.
I can confirm the analog pins themselves work, because if I set those particular channels to 10 bits instead I get the full 0-1023 scale.
 
I could not find any mention of special behavior in the datasheet, FRM or errata. This is my setup and analog reading function:
 

typedef enum {
    ADC_TEMP1 = 9,
    ADC_TEMP2 = 0,
    ADC_PRESS1 = 1,
    ADC_PRESS2 = 7,
    ADC_10V1 = 8,
    ADC_10V2 = 11,
    ADC_420MA1 = 12,
    ADC_420MA2 = 13,
    ADC_420MA3 = 14,
    ADC_420MA4 = 15,            
} adc_input_num_t;
 

void init_adc(void)
{
    // ASAM disabled; ADDMABM disabled; ADSIDL disabled; DONE disabled; SIMSAM Sequential; FORM Absolute decimal result, unsigned, right-justified; SAMP disabled; SSRC Clearing sample bit ends sampling and starts conversion; AD12B 12-bit; ADON enabled; SSRCG disabled;
    AD1CON1 = 0x8400;
    // CSCNA disabled; VCFG0 AVDD; VCFG1 AVSS; ALTS disabled; BUFM disabled; SMPI Generates interrupt after completion of every sample/conversion operation; CHPS 1 Channel;
    AD1CON2 = 0x00;
    // SAMC 0; ADRC FOSC/2; ADCS 0;
    AD1CON3 = 0x00;
    // CH0SA AN0; CH0SB AN0; CH0NB AVSS; CH0NA AVSS;
    AD1CHS0 = 0x00;
    // CSS26 disabled; CSS25 disabled; CSS24 disabled; CSS31 disabled; CSS30 disabled;
    AD1CSSH = 0x00;
    // CSS2 disabled; CSS1 disabled; CSS0 disabled; CSS8 disabled; CSS7 disabled; CSS6 disabled; CSS5 disabled; CSS4 disabled; CSS3 disabled;
    AD1CSSL = 0x00;
    // DMABL Allocates 1 word of buffer to each analog input; ADDMAEN disabled;
    AD1CON4 = 0x00;
    // CH123SA disabled; CH123SB CH1=OA2/AN0,CH2=AN1,CH3=AN2; CH123NA disabled; CH123NB CH1=VREF-,CH2=VREF-,CH3=VREF-;
    AD1CHS123 = 0x00;
    IFS0bits.AD1IF = 0;
}

unsigned int read_adc_input(adc_input_num_t channel) {
    unsigned int value = 0, i, tmp;

    AD1CON1bits.AD12B = 1;
    AD1CON1bits.ADON = 0;
    AD1CHS0bits.CH0SA = channel;
    AD1CON1bits.ADON = 1;
    __delay_us(20);

    for (i = 0; i < NUM_SAMPLES; i++) {
        AD1CON1bits.SAMP = 1;
        __delay_us(2); // Tad is 120 ns
        AD1CON1bits.SAMP = 0;
        __delay_us(5);
        // Of course the DONE bit does not work properly, so I need a workaround
        while (IFS0bits.AD1IF == 0) {
            __delay_us(1);
            IFS0bits.AD1IF = 0;
        }
        tmp = ADC1BUF0;
        value += tmp;
    }
    AD1CON1bits.ADON = 0;

    return value / NUM_SAMPLES;
}
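
For completeness, this is more or less how I call it from my main loop (a sketch only; NUM_SAMPLES is defined elsewhere in my project, and which channels I read where is just for illustration):

void read_all_inputs(void) {
    // Sketch of the caller; AN12 and AN13 are the two misbehaving channels.
    unsigned int ma1 = read_adc_input(ADC_420MA1); // AN12, tops out around ~3000
    unsigned int ma2 = read_adc_input(ADC_420MA2); // AN13, same problem
    unsigned int t1  = read_adc_input(ADC_TEMP1);  // AN9, reads the full scale fine
    (void)ma1; (void)ma2; (void)t1; // values are consumed by the application here
}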

 
I'd be very grateful if anyone could point me to the detail I've probably missed from the convoluted mess that Microchip calls "documentation".
#1
Maldus
Starting Member
  • Total Posts : 56
  • Joined: 2016/08/17 09:55:57
  • Status: offline
Re: Analog channel not reading a correct 12 bit value 2019/07/15 23:58:17 (permalink)
I believe I found the problem.
First, there was a typo in the waiting loop. I was waiting for the conversion with:


__delay_us(5);
while (IFS0bits.AD1IF == 0) {
    __delay_us(1);
    IFS0bits.AD1IF = 0;
}

This doesn't make sense, because I clear AD1IF only while it is already 0 (a no-op). AD1IF was probably already set and never cleared, so the loop exited immediately, which meant I was really only waiting 5 microseconds for the ADC conversion; that is enough for a 10-bit conversion, but not for 12 bits.
 
I solved the problem by replacing it with

 
while (!IFS0bits.AD1IF) {
    __delay_us(1);
}
IFS0bits.AD1IF = 0;
 

 
Initially, however, this didn't work. The debugger would not stop in my code and I could not understand what was going on. I believe this was caused by the AD1CON3bits.ADCS bits, which were set to 0. Any larger value solved the problem.
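 
For reference, this is roughly what I ended up with. The numbers are my own assumptions (FCY = 60 MHz, and a minimum TAD of ~118 ns for 12-bit mode from my reading of the datasheet), so check them against your own part and clock setup:

// With ADRC = 0 the ADC clock is derived from the system clock:
//     TAD = TCY * (ADCS + 1), where TCY = 1 / FCY
// Assuming FCY = 60 MHz, TCY ~= 16.7 ns, so ADCS = 0 gave TAD ~= 16.7 ns,
// far below the ~118 ns minimum the datasheet specifies for 12-bit mode.
AD1CON3bits.ADRC = 0; // ADC clock from the system clock, not the internal RC
AD1CON3bits.ADCS = 7; // TAD = 8 * TCY ~= 133 ns, above the minimum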
 
Now, the FRM is ambiguous about the "Conversion Clock" settings. My understanding is that they set the speed of the conversion process, with higher ADCS values yielding slower conversions. What I don't get is the point of a lower speed; does the precision increase? Is there a maximum speed I should stay below? The documentation mentions the top conversion speed being "up to" 500 ksps in 12-bit mode, but that sounds like a limit the device won't surpass, not something I should be wary of.
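 
For anyone finding this later, here is my back-of-the-envelope reading of those numbers (assuming the FRM's figure of 14 TAD per 12-bit conversion; verify against your FRM revision):

// conversion time = 14 * TAD ~= 14 * 118 ns ~= 1.65 us at the minimum TAD
// sample time     ~= 0.35 us (roughly 3 TAD)
// total           ~= 2 us per sample  ->  ~500 ksps

If that arithmetic is right, the 500 ksps figure is simply the throughput at the minimum allowed TAD, and a slower conversion clock just lowers it; the constraint to watch is the minimum TAD, not a maximum speed.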
#2