• AVR Freaks

Micro sec timer generation using code configuration

Author
SanjyotKadu
New Member
  • Total Posts : 9
  • Reward points : 0
  • Joined: 2020/02/10 04:24:00
  • Location: 0
  • Status: offline
2020/10/26 23:38:45 (permalink)
0

Micro sec timer generation using code configuration

Hi all,
 
I am working on a project where I need to interface a DHT22 sensor to a PIC16F18877 controller.
For this I need to be able to count time in microseconds. I am planning to use Timer 2 for this purpose, but I am not able to generate a timer in microseconds. The max time I am able to generate is 60 microseconds (refer to the attachment); I want it to be at least 10 microseconds. I am already using Timer 0 for another task.
 
Please let me know how to measure this time in microseconds, or any other way to interface the DHT22 with the PIC16F18877.
 
thanks 
sanjyot
 
 
 
 

Attached Image(s)

#1
ric
Super Member
  • Total Posts : 28943
  • Reward points : 0
  • Joined: 2003/11/07 12:41:26
  • Location: Australia, Melbourne
  • Status: online
Re: Micro sec timer generation using code configuration 2020/10/27 00:17:01 (permalink)
0
SanjyotKadu
...
The max time I am able to generate is 60 microseconds (refer to the attachment); I want it to be at least 10 microseconds.

Huh?
If max is 60, and you want 10, what is the problem?
How do you intend decoding the signal?
Can you disable interrupts while you are receiving (if you use them at all)?
 
 
 

I also post at: PicForum
Links to useful PIC information: http://picforum.ric323.co...opic.php?f=59&t=15
NEW USERS: Posting images, links and code - workaround for restrictions.
To get a useful answer, always state which PIC you are using!
#2
TomasParrado
New Member
  • Total Posts : 25
  • Reward points : 0
  • Joined: 2019/12/27 06:21:04
  • Location: 0
  • Status: offline
Re: Micro sec timer generation using code configuration 2020/10/27 04:49:24 (permalink)
0
EDIT: This reply was stuck in pending, so it may no longer be relevant.

It looks like you've set up an 8-bit timer to count in 2 us increments, which tallies with the period slider options from 2 to 512 us. The period is the value against which the timer is compared (p. 415 in the datasheet). When the timer reaches the period, typically you have it do something. To have it count in 1 us increments, set one of your prescalers to 1:1. You will then only be able to count up to 256 us; 10 us should be no problem.

If it was a typo and you meant 10 ms, then set your prescaler to 1:128. You'll then count in 128 us increments, giving a range of ~0.13 ms to ~32.8 ms. If you want both range and precision, use one of the odd-numbered timers (1, 3, 5); they look to be 16-bit.
post edited by TomasParrado - 2020/10/27 05:26:17
#3
SanjyotKadu
New Member
  • Total Posts : 9
  • Reward points : 0
  • Joined: 2020/02/10 04:24:00
  • Location: 0
  • Status: offline
Re: Micro sec timer generation using code configuration 2020/10/27 05:18:53 (permalink)
0
aaha,

If max is 60, and you want 10, what is the problem? ---> I mean the maximum delay I can produce using MCC is 60 microseconds. If I try to reduce it, to say 50 microseconds, the program hangs. I guess it is stopping all the interrupts.

I am already using Timer 0 (with interrupts) to generate a 10 millisecond tick, and from this 10 ms tick I run all my tasks (basically I made a scheduler out of it).

Now I need to measure the time between digital input state changes (DHT22 protocol). These times are in microseconds, e.g. 18 microseconds, so I require another means of measuring time and want to use Timer 2 to generate a microsecond time base.

Yes, I am using an interrupt to collect the time. My plan was to make a module which runs like a stopwatch in microseconds using Timer 2, like:
/********************************************/
enum { STOP, START };

volatile unsigned int  timer2count;
volatile unsigned char option = STOP;

void stopwatch(void)
{
    switch (option)
    {
        case START:
            timer2count++;
            break;      /* break was missing: START fell through into STOP */

        case STOP:
            timer2count = 0;
            break;
    }
}

/* Timer 2 period interrupt handler */
void timer2_interrupt(void)
{
    stopwatch();
    timer2IF = 0;       /* clear the Timer 2 interrupt flag */
}

unsigned int getStopWatchTime(void)
{
    return timer2count;
}
/*********************************************************/
 
Now, as soon as I use MCC to reduce the time to 10 microseconds (actually I want 1 microsecond), the program hangs (I guess it is also killing the interrupts from Timer 0, i.e. it stops the scheduler).

I tried using higher internal clock frequencies.

Can you disable interrupts while you are receiving (if you use them at all)? -------------> I did not do this. Can you explain more on this, i.e. how to do it?
#4
ric
Super Member
  • Total Posts : 28943
  • Reward points : 0
  • Joined: 2003/11/07 12:41:26
  • Location: Australia, Melbourne
  • Status: online
Re: Micro sec timer generation using code configuration 2020/10/27 05:22:56 (permalink)
0
Assuming you are running the PIC at a 32 MHz clock, that's 8 instructions per microsecond. Do you really think you could service an interrupt in fewer than 8 assembly instructions?
You're simply expecting too much from the PIC, and are taking the wrong approach.

I also post at: PicForum
Links to useful PIC information: http://picforum.ric323.co...opic.php?f=59&t=15
NEW USERS: Posting images, links and code - workaround for restrictions.
To get a useful answer, always state which PIC you are using!
#5
NorthGuy
Super Member
  • Total Posts : 6404
  • Reward points : 0
  • Joined: 2014/02/23 14:23:23
  • Location: Northern Canada
  • Status: offline
Re: Micro sec timer generation using code configuration 2020/10/27 06:02:28 (permalink)
4.5 (2)
Use CCP to measure time between edges.
 
Your 10 us interrupt will be disabled for the whole time other interrupts are being serviced, so whatever you measure will be highly inaccurate.
#6
oliverb
Super Member
  • Total Posts : 342
  • Reward points : 0
  • Joined: 2009/02/16 13:12:38
  • Location: 0
  • Status: offline
Re: Micro sec timer generation using code configuration 2020/10/27 06:11:24 (permalink)
0
Can you have the PIC work exclusively on reading the port? If so, then use software timing with interrupts disabled.
You need to detect the data pin being low, then detect the low-to-high transition. Then wait a fixed time (40-50 us ought to do it) and sample the data state at that point. If the line has already returned to low you have a zero; if it is still high, you have a one. Repeat until all bits are received. The initial "wait for low" loop could have a crude timeout, such as a maximum loop count, since the exact time isn't needed.

Incidentally, I see that PIC does have CLC blocks, so it may be possible to implement a "digital monostable" to recover the clock, in which case the data stream could be fed to an MSSP. But I'd normally only try that AFTER a successful "bit bang" implementation.
 
post edited by oliverb - 2020/10/27 06:15:06
#7
vexorg
Super Member
  • Total Posts : 253
  • Reward points : 0
  • Status: offline
Re: Micro sec timer generation using code configuration 2020/10/27 06:28:43 (permalink)
0
At the limit, assembly would be better, so you know exactly how long each step will take.
#8
© 2020 APG vNext Commercial Version 4.5