
Porting g8ulib to PIC18 // How to Start/Stop/SendByte on K42 architecture

Author
Roze Doyanawa
New Member
  • Total Posts : 22
  • Reward points : 0
  • Joined: 2009/10/27 07:56:40
  • Location: 0
  • Status: offline
2020/09/30 08:35:49 (permalink)
0

Porting g8ulib to PIC18 // How to Start/Stop/SendByte on K42 architecture

Hi,
I've been working on porting g8ulib to run on my PIC18F27K42. This includes stripping out everything unnecessary, since my available program space is not great. That, and the fact that XC8 takes over 100 GB of RAM when compiling it.
 
However, I've run into a bit of a dilemma. The I2C handling is fully abstracted by the library, and it all boils down to four functions:
i2c_init(), i2c_start(slave), i2c_send_byte(singleByte) and i2c_stop(). However, from my reading of the datasheet for this controller, there is no way to manually control Start/Stop conditions, and the library does not send a byte count ahead of time.
 
Of course, I could "buffer" the data and send everything on _stop, but that would waste a lot of cycles.
 
Any suggestions for a workaround, or for how to do it properly? The old architecture had SEN and PEN, but I can't find any equivalents.
 
//Regards
 

void u8g_i2c_init(uint8_t options) {
    // ?
}

uint8_t u8g_i2c_wait(uint8_t mask, uint8_t value, uint8_t pos) {
    // ?
    return 1;
}

uint8_t u8g_i2c_start(uint8_t sla) {
    // ?
    return 1;
}

uint8_t u8g_i2c_send_byte(uint8_t data) {
    // ?
    return 1;
}

void u8g_i2c_stop(void) {
}
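
If buffering turns out to be the only way, a sketch of that fallback could look like this (hypothetical; hw_transmit() is a stand-in for whatever one-shot transfer routine the hardware ends up needing, and the buffer size is arbitrary):

```c
#include <stdint.h>

#define I2C_BUF_MAX 64   /* arbitrary; must cover the longest message */

static uint8_t i2c_buf[I2C_BUF_MAX];
static uint8_t i2c_len;
static uint8_t i2c_sla;

/* stand-in for one complete hardware transfer (address + all bytes, CNT = len) */
static void hw_transmit(uint8_t sla, const uint8_t *data, uint8_t len) {
    (void)sla; (void)data; (void)len;   /* the real version drives the peripheral */
}

uint8_t u8g_i2c_start(uint8_t sla) {
    i2c_sla = sla;
    i2c_len = 0;
    return 1;
}

uint8_t u8g_i2c_send_byte(uint8_t data) {
    if (i2c_len >= I2C_BUF_MAX)
        return 0;                       /* buffer full: report failure */
    i2c_buf[i2c_len++] = data;
    return 1;
}

void u8g_i2c_stop(void) {
    hw_transmit(i2c_sla, i2c_buf, i2c_len);  /* flush the whole message at once */
    i2c_len = 0;
}
```

The cost is the RAM for the buffer and one extra copy per byte, which is what the "waste so many cycles" concern above is about.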

post edited by Roze Doyanawa - 2020/09/30 08:59:06
#1

11 Replies

    mbrowning
    USNA79
    • Total Posts : 1811
    • Reward points : 0
    • Joined: 2005/03/16 14:32:56
    • Location: Melbourne, FL
    • Status: offline
    Re: Porting g8ulib to PIC18 // How to Start/Stop/SendByte on K42 architecture 2020/09/30 10:11:38 (permalink)
    0
    Three possibilities I can think of:
    1 - change the g8ulib I2C implementation to better match the K42 I2C interface - probably too hard
    2 - use a bit-banged I2C interface - probably too slow
    3 - probably just right:
       a. initialize ABD=1 and CNT=255, and enable the module (assumes fewer than 255 bytes to be sent)
       b. ignore the start command (return success)
       c. on each send_byte, write the data to TXB; on the first write, a Start condition will be generated
       d. on the stop command, disable the I2C module and bit-bang a Stop condition. Go to a.
     
    The datasheet gives no examples with ABD=1 and I haven't used this mode, but it should work. I have state-machine, interrupt, and DMA versions of K42 I2C master code working. DMA is cool - no software involvement at all beyond starting a packet, an interrupt on Stop, and, for reads, an interrupt in the middle to turn the bus around from write to read.
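
    Sketched against the library's four hooks, option 3 might look like this (untested; assumes XC8 and the K42 SFR names, and that with ABD=1 the first write to TXB carries the address and triggers the Start, per the datasheet's address-buffer-disable description):

```c
#include <xc.h>      /* K42 SFR declarations */
#include <stdint.h>

void u8g_i2c_init(uint8_t options) {
    (void)options;
    I2C1CON2bits.ABD = 1;       /* address buffers disabled: address goes out via TXB */
    I2C1CON0bits.MODE = 0b100;  /* 7-bit address master mode */
    I2C1CON0bits.EN = 1;
}

uint8_t u8g_i2c_start(uint8_t sla) {
    I2C1CNT = 255;              /* fake a long message; we end it ourselves in stop */
    I2C1TXB = sla;              /* assumes sla already includes the R/W bit;
                                   first TXB write makes the hardware emit the Start */
    return 1;
}

uint8_t u8g_i2c_send_byte(uint8_t data) {
    while (!I2C1STAT1bits.TXBE)  /* wait for the transmit buffer to drain */
        ;                        /* a real version needs a timeout here */
    I2C1TXB = data;
    return 1;
}

void u8g_i2c_stop(void) {
    while (!I2C1STAT1bits.TXBE)
        ;
    I2C1CON0bits.EN = 0;        /* abandon the fake-length message... */
    /* ...then bit-bang a Stop on SCL/SDA before re-enabling for the next packet */
}
```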
     
    #2
    Mysil
    Super Member
    • Total Posts : 3809
    • Reward points : 0
    • Joined: 2012/07/01 04:19:50
    • Location: Norway
    • Status: online
    Re: Porting g8ulib to PIC18 // How to Start/Stop/SendByte on K42 architecture 2020/09/30 11:07:34 (permalink)
    0
    Hi,
    The I2C peripheral in the K42 isn't directly suited to the abstraction expected by your library,
    as the hardware is intended to handle the transfer of whole messages rather than single bytes,
    and expects interrupts or DMA to feed it bytes.
     
    I think, however, it should be possible to make something work.
    The message length can possibly be faked by setting a large transfer length,
    and then clearing it when the last byte has been sent.
     
    There is some example code on MPLAB Xpress:
    https://mplabxpress.microchip.com/mplabcloud/example/details/519
    It is written by the same author as the Application Note TB3191, I2C Master Mode, for the K42 device.
     
    There is a control bit to initiate a Start sequence;
    be aware that the address of the target slave may be expected to be set up before the Start bit is set:
        I2C1PIR = 0;                        /* Clear all flags, Mysil. */
        I2C1ERR &= 0x0F;                    /* Mysil. */
        I2C1ADB1 = (uint8_t)(address<<1);
        wait4BusFree();
        I2C1CNT = 255;
        I2C1CON0bits.S = 1;
        wait4Start();  


    Then it is possible to make a function to transfer one byte at a time:
    void sendByte(uint8_t data)
    {
        uint8_t delayCounter = 255;     /* Timeout counter. */
        if(lastError == I2C1_GOOD)
        {
            while(--delayCounter)
            {
                if(I2C1STAT1bits.TXBE)
                {
                    I2C1TXB = data;
                    return;
                }
                else
                {
                    __delay_us(1);
                }
            }
            lastError = I2C1_FAIL_TIMEOUT;
        }
    }  

     
    When the last byte has been transferred, try to clear the transfer counter,
    {   I2C1CNT = 0;
        wait4Stop();
    }  

    and see what happens.
    MCC will also generate some example code for PIC18F__K42 devices.
     
    Here is another thread that may be related.
    https://www.microchip.com/forums/FindPost/1127343
     
        Mysil
    #3
    Roze Doyanawa
    New Member
    • Total Posts : 22
    • Reward points : 0
    • Joined: 2009/10/27 07:56:40
    • Location: 0
    • Status: offline
    Re: Porting g8ulib to PIC18 // How to Start/Stop/SendByte on K42 architecture 2020/09/30 13:02:47 (permalink)
    0
    Thank you for your replies. I'll report back once I've tested; I need to wait for the PCB to arrive before I can actually test the code against the PIC.
    #4
    Roze Doyanawa
    New Member
    • Total Posts : 22
    • Reward points : 0
    • Joined: 2009/10/27 07:56:40
    • Location: 0
    • Status: offline
    Re: Porting g8ulib to PIC18 // How to Start/Stop/SendByte on K42 architecture 2020/10/12 08:57:11 (permalink)
    +1 (1)
    I'm starting to detest XC8 as a compiler... Whenever I try to use indirect function calls, I get warnings like these:
    ...
    u8g/u8g_com_api.c:68:: warning: (1471) indirect function call via a NULL pointer ignored
    u8g/u8g_com_api.c:68:: warning: (759) expression generates no code
    ...
    and the compiler just refuses to generate the call, even though at some point the pointer will not be NULL.
     
    #5
    davea
    Super Member
    • Total Posts : 389
    • Reward points : 0
    • Joined: 2016/01/28 13:12:13
    • Location: Tampa Bay FL USA
    • Status: offline
    Re: Porting g8ulib to PIC18 // How to Start/Stop/SendByte on K42 architecture 2020/10/12 09:14:35 (permalink)
    0
    What is on line 68?
    #6
    Mysil
    Super Member
    • Total Posts : 3809
    • Reward points : 0
    • Joined: 2012/07/01 04:19:50
    • Location: Norway
    • Status: online
    Re: Porting g8ulib to PIC18 // How to Start/Stop/SendByte on K42 architecture 2020/10/12 09:33:53 (permalink)
    0 (2)
    Hi,
    Even tho at some point it will not be null

    If you write code like that, code that is never actually called from anywhere,
    then the compiler will skip it, and tell you.
    The compiler cannot know what you will do in future revisions of the code.
     
    Do not try to make a call through a pointer that has not been initialized.
    You can test whether the pointer has a NULL value.
    You can initialize a pointer with a function address in the definition of the pointer variable.
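
    A minimal standalone illustration of both suggestions (not library code; the names are made up): the pointer gets a harmless default at its definition, and the call site still guards against NULL:

```c
#include <stddef.h>
#include <stdint.h>

static uint8_t com_null(uint8_t b) { (void)b; return 0; }       /* harmless default */
static uint8_t com_i2c(uint8_t b)  { return (uint8_t)(b + 1); } /* dummy "real" handler */

/* initialized at its definition, so it is never NULL and the
   compiler can see at least one real target in the call graph */
static uint8_t (*com_fn)(uint8_t) = com_null;

uint8_t com_call(uint8_t b) {
    if (com_fn == NULL)     /* belt and braces if the pointer is ever reassigned */
        return 0;
    return com_fn(b);
}
```

    Assigning com_fn = com_i2c; later swaps in the real handler without the call site changing.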
     
        Mysil
    #7
    NKurzman
    A Guy on the Net
    • Total Posts : 18976
    • Reward points : 0
    • Joined: 2008/01/16 19:33:48
    • Location: 0
    • Status: offline
    Re: Porting g8ulib to PIC18 // How to Start/Stop/SendByte on K42 architecture 2020/10/12 10:43:19 (permalink)
    +1 (1)
    I guess Omniscient Code Generation does not include predictions of the future.
    XC8 has a compiled stack. It needs to generate a call tree for every call, even indirect ones.
    If you treat these underpowered 8-bit CPUs like a 32-bit one, you will have problems.
    If you want to use a classic-style stack, it is a compiler option, but you will pay a performance penalty for using it.
    If the compiler is giving you a warning, you should review it. It is usually right.
    #8
    Roze Doyanawa
    New Member
    • Total Posts : 22
    • Reward points : 0
    • Joined: 2009/10/27 07:56:40
    • Location: 0
    • Status: offline
    Re: Porting g8ulib to PIC18 // How to Start/Stop/SendByte on K42 architecture 2020/10/12 11:08:07 (permalink)
    0
    How do I enable the traditional stack? I'm curious whether that makes the warning go away.
     
    //Regards
    #9
    ric
    Super Member
    • Total Posts : 28691
    • Reward points : 0
    • Joined: 2003/11/07 12:41:26
    • Location: Australia, Melbourne
    • Status: online
    Re: Porting g8ulib to PIC18 // How to Start/Stop/SendByte on K42 architecture 2020/10/12 12:34:35 (permalink)
    +3 (3)
    Look in the User Guide for "5.2.4.2.2 Software Stack Operation"
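
    On the xc8-cc command line this corresponds to the -mstack option described in that section; something like the following should select the software stack (from memory, so verify the exact spelling against the guide for your XC8 version):

```
xc8-cc -mcpu=18F27K42 -mstack=software main.c
```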

    I also post at: PicForum
    Links to useful PIC information: http://picforum.ric323.co...opic.php?f=59&t=15
    NEW USERS: Posting images, links and code - workaround for restrictions.
    To get a useful answer, always state which PIC you are using!
    #10
    Roze Doyanawa
    New Member
    • Total Posts : 22
    • Reward points : 0
    • Joined: 2009/10/27 07:56:40
    • Location: 0
    • Status: offline
    Re: Porting g8ulib to PIC18 // How to Start/Stop/SendByte on K42 architecture 2020/10/13 13:32:49 (permalink)
    0
    Thank you. That works. Now all the indirect references work, despite a 20% larger code size and noticeably slower execution. But the main point now is to get anything out of the display...
    I'm still not getting I2C to work... I keep getting Stop conditions and the hardware throws me out of the transfer.
     
    Any ideas? And no, I'm not using MCC, because the code it generates is one heck of a state machine, which I can't really use for this.
     
    void I2C_Transmit(char address, int length, unsigned char *data) {
        I2C1ADB1 = address;
        I2C1CNT = length & 0xFF;
        while (length-- > 0) {
            I2C1TXB = *data++;
            while (!I2C1STAT1bits.TXBE) {
                // Getting stuck here. PCIF gets set and MMA gets cleared, even tho CNT is not 0
            }
        }
        while (!I2C1PIRbits.PCIF) {}
        I2C1PIRbits.PCIF = 0;
        I2C1PIRbits.SCIF = 0;
        I2C1STAT1bits.CLRBF = 1;
    }

     
    Registers at point of stuck:
    I2C1ADB1    SFR    0x3D6E    0x78    
    I2C1BTO    SFR    0x3D7C    0x00    
    I2C1CLK    SFR    0x3D7B    0x03    
    I2C1CNT    SFR    0x3D6C    0x19    
    I2C1CON0    SFR    0x3D73    0x84    
    I2C1CON1    SFR    0x3D74    0x80    
    I2C1CON2    SFR    0x3D75    0x20    
    I2C1ERR    SFR    0x3D76    0x00    
    I2C1PIE    SFR    0x3D7A    0x00    
    I2C1PIR    SFR    0x3D79    0x00     
    I2C1SCLPPS    SFR    0x3AE1    0x13    
    I2C1SDAPPS    SFR    0x3AE2    0x14    
    I2C1STAT0    SFR    0x3D77    0x80    
    I2C1STAT1    SFR    0x3D78    0x00   

     
    #11
    Roze Doyanawa
    New Member
    • Total Posts : 22
    • Reward points : 0
    • Joined: 2009/10/27 07:56:40
    • Location: 0
    • Status: offline
    Re: Porting g8ulib to PIC18 // How to Start/Stop/SendByte on K42 architecture 2020/10/14 16:59:20 (permalink)
    +1 (1)
    I figured out my issue. The I2C code works; it was a protocol misconception. The slave only accepted 1 or 2 bytes, depending on the command, before forcing a Stop, so the bytes had to be grouped accordingly.
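
    In case anyone else lands here: the grouping amounts to splitting the byte stream into transactions no longer than the slave will accept before it forces a Stop. A hypothetical sketch (the real group size depends on the command, and I2C_Transmit here is only a counting stub, not the routine from post #11):

```c
#include <stdint.h>

/* stub standing in for the real transfer routine; here it only counts calls */
static int transfers_done;
static void I2C_Transmit(uint8_t address, int length, const uint8_t *data) {
    (void)address; (void)data; (void)length;
    transfers_done++;
}

/* split a byte stream into transactions of at most `group` bytes,
   returning how many transactions were issued */
static int send_grouped(uint8_t address, const uint8_t *data, int length, int group) {
    int count = 0;
    while (length > 0) {
        int n = (length < group) ? length : group;
        I2C_Transmit(address, n, data);
        data += n;
        length -= n;
        count++;
    }
    return count;
}
```

    For example, 5 command bytes with a group size of 2 go out as three transactions (2 + 2 + 1 bytes).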
    #12