PIC processor interrupt for delayed output

Bill Bowden

What is the best approach to generate a delayed output from a trigger
input to a PIC microcontroller? The idea is to have a constant, known
time from trigger input to output, so I don't want to poll the logic
level on some pin because of the jitter that would occur. I thought
about using the comparator module to generate an interrupt, but it
seems complicated setting and clearing all the various bits and
reference levels.

Is there an easier way?

-Bill
 
"Bill Bowden" <wrongaddress@att.net> wrote in message
news:77a2a704-f080-4e01-897d-cee2c1f2eb28@h1g2000prh.googlegroups.com...

You can use the interrupt on change facility. However, you can't poll or
write to the GPIO when you are doing this, because if you happen to be
accessing it, the interrupt flag will not be set. Arrgh. Very annoying bug
in the chip.

See GPIO INTERRUPT in section 12.4.3 in the PIC12F683 manual for more
information.
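
One thing to watch in the service routine: you have to read GPIO once to end
the mismatch condition before you clear GPIF, or the flag just sets itself
again. Roughly this shape (from memory, so check it against the data sheet;
the W/STATUS save is omitted here):

isr:
        movf    GPIO,W          ; read GPIO to end the mismatch condition
        bcf     INTCON,GPIF     ; now the change flag can be cleared safely
        ; ... start the delay and drive the output here ...
        retfie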

Regards,
Bob Monsen
 
On Apr 21, 7:40 pm, "Bob Monsen" <rcmon...@gmail.com> wrote:
"Bill Bowden" <wrongaddr...@att.net> wrote in message

news:77a2a704-f080-4e01-897d-cee2c1f2eb28@h1g2000prh.googlegroups.com...

What is the best approach to generate a delayed output from a trigger
input to a PIC microcontroller? The idea is to have a constant known
time from trigger input to output, so I don't want to poll the logic
level on some pin due to the jitter that would occur. Thought about
using the comparator module to generate an interrupt, but it seems
complicated setting and clearing all the various bits and reference
levels.

Is there an easier way?

-Bill

You can use the interrupt on change facility. However, you can't poll or
write to the GPIO when you are doing this, because if you happen to be
accessing it, the interrupt flag will not be set. Arrgh. Very annoying bug
in the chip.

See GPIO INTERRUPT in section 12.4.3 in the PIC12F683 manual for more
information.

Regards,
 Bob Monsen

Yes, I notice in the data sheet for the PIC16F628 that an on-change
interrupt can be generated from any of the four upper pins of PORTB
(RB4-RB7). Any change will set the RBIF interrupt flag. But it's not
clear how to set it up so it jumps to the service routine when a change
takes place. I imagine the GIE bit needs to be set, and maybe some
others. There is a note that the change may be missed if the input is
too short, so I guess it requires a minimum-width input.
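
From the data sheet I'd guess the bare-bones setup is something like this
(untested, just my reading of it - RB4 standing in for whatever the trigger
pin ends up being):

        org     0x0000
        goto    start
        org     0x0004          ; interrupt vector
        goto    isr

start:
        bsf     STATUS,RP0      ; bank 1
        bsf     TRISB,4         ; RB4 as the trigger input
        bcf     STATUS,RP0      ; back to bank 0
        movf    PORTB,W         ; read PORTB so there's no pending mismatch
        bcf     INTCON,RBIF     ; clear any stale change flag
        bsf     INTCON,RBIE     ; enable the PORTB change interrupt (RB4-RB7)
        bsf     INTCON,GIE      ; global interrupt enable

wait:
        goto    wait            ; spin here until the change interrupt fires

isr:
        movf    PORTB,W         ; read PORTB to end the mismatch
        bcf     INTCON,RBIF     ; then clear the change flag
        ; ... delayed output would go here ...
        retfie

I suppose the unused RB4-RB7 pins need to be tied off so they don't cause
extra change interrupts.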

I might play around with that idea to see if it works, but it's hard
to simulate in MPLAB and single-step the program. How do I simulate
an interrupt so the program will single-step into the service routine?

-Bill
 
On Apr 22, 3:15 am, Bill Bowden <wrongaddr...@att.net> wrote:

You don't say what else you want the PIC to be doing while it's
monitoring the pin.

If you can run it in a tight loop I think you can test 8 input pins
(on one port) for a change in four cycles (16 clocks). If you unravel
the loop you can get that down to an average approaching 3 cycles at
the expense of code size.

You also don't say whether you're then going to wait in another loop or
whether you'll set up a timer to trigger an interrupt when the output
needs to change state. Finally, you don't say how long the delay needs
to be. Unless you can afford to have your interrupt routine trash W and
the STATUS register, or you can guarantee that the interrupt will never
trigger when anything other than a known bank is selected, there is
quite a lot of overhead you'll need in the interrupt routine, and that
will limit your minimum delay.
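
For what it's worth, the usual save/restore dance at the top and bottom of
the interrupt routine looks something like this (the standard mid-range
idiom; w_temp and status_temp need to sit at RAM addresses reachable from
any bank):

isr:
        movwf   w_temp          ; save W
        swapf   STATUS,W        ; swapf doesn't disturb the status flags
        movwf   status_temp     ; save STATUS (including the bank bits)

        ; ... the code that actually drives the output goes here ...

        swapf   status_temp,W
        movwf   STATUS          ; restore STATUS and the bank selection
        swapf   w_temp,F
        swapf   w_temp,W        ; restore W without touching the flags
        retfie

That's several instructions on the way in and out before you ever touch the
output pin, on top of the interrupt latency itself.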

Tim.
 
On Apr 23, 10:19 am, "goo...@woodall.me.uk" <goo...@woodall.me.uk> wrote:

If you can run it in a tight loop I think you can test 8 input pins
(on one port) for a change in four cycles (16 clocks). If you unravel
the loop you can get that down to an average approaching 3 cycles at
the expense of code size.

In fact, for a single pin I think you can get this down to 3 cycles,
or 2 cycles with the loop unravelled.

Tim.
 
On Apr 23, 2:19 am, "goo...@woodall.me.uk" <goo...@woodall.me.uk> wrote:
You don't say what else you want the PIC to be doing while it's
monitoring the pin.

It doesn't do anything, just waits for an interrupt in an infinite
loop.

I got it working, and the interrupt routine calls an output routine
that sets and resets a couple of pins according to a calculated delay.
This is what I want, but I'm still not sure what the jitter will be
without putting it on a scope and all that hassle. Using the on-change
interrupt requires the system to continually read the state of PORTB
and generate an interrupt when a change occurs. I don't know how many
clocks that takes, and if the change is too fast (say a 100 ns pulse),
it may not be seen at all. I'm going to try overclocking it at 25 MHz
to reduce the jitter percentage.
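
The output routine is essentially a cycle-counted delay between setting and
clearing a pin - in skeleton form something like this (delay_count being a
file register loaded beforehand from the calculated delay, and RB0 just a
stand-in for the actual output pin):

pulse_out:
        bsf     PORTB,0         ; output pin high
delay:
        decfsz  delay_count,F   ; 3 cycles per pass until the count hits zero
        goto    delay
        bcf     PORTB,0         ; output pin low again after the delay
        return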

Any idea how many clocks are involved in monitoring the state of PORTB
in a very tight loop of one line, where the program continually goes to
the same line and waits for an interrupt?

-Bill
 
On Apr 24, 2:11 am, Bill Bowden <wrongaddr...@att.net> wrote:
It's all in the datasheet - at least for the 16F627, which is the one
I've been playing with.

IIRC the pin is checked on every clock - i.e. provided your pulse is
at least 1 clock wide it should be picked up. The interrupt is triggered
on the next cycle (a cycle is four clocks) - it's not immediately obvious
from the datasheet whether the interrupt will trigger on the same clock
as the pin is sampled, so your jitter will be somewhere in
0 <= t <= 4 clocks (where one or other of those <= should really be <).

Using a tight polling loop:

loop:
        btfss   PORTB,pin       ; skip the goto once the pin goes high
        goto    loop            ; pin still low - keep polling

That's three cycles per pass, so your pulse needs to be at least 12 clocks
wide to guarantee picking it up, and 0 <= t < 12.

Unravelling the loop:

loop:
        btfsc   PORTB,pin       ; pin already high? jump out
        goto    gotit
        btfsc   PORTB,pin
        goto    gotit
        ...
        btfss   PORTB,pin       ; last test: fall through to gotit if high
        goto    loop
gotit:

Then it's usually two cycles per test, but there's one three-cycle case
(the wrap back to the top), so usually 0 <= t < 8 but worst case
0 <= t < 12 as before.


If you're really worrying about sub-microsecond jitter (a 20 MHz PIC
gives you 200 ns of jitter) then I'm not sure a PIC is the right tool
for the job.

A 20 MHz PIC gives you a 50 ns clock period, so I'd expect pulses longer
than that to be detected, assuming they rise fast enough at the pin.

Tim.
 
On Apr 23, 6:23 am, Bill Bowden <wrongaddr...@att.net> wrote:
I might play around with that idea to see if it works, but it's hard
to simulate in MPLAB and single-step the program. How do I simulate
an interrupt so the program will single-step into the service routine?

RTFM
 
