NerdKits - electronics education for a digital generation



Microcontroller Programming » How could I delay in microseconds instead of milliseconds?

January 31, 2011
by rboggs10

I want to be able to set a pin to oscillate high and low in microseconds instead of milliseconds. I know about the delay_ms() function, but I tried passing a decimal such as 0.01 as the parameter and that didn't work. How could I do this?

January 31, 2011
by Rick_S

Try delay_us() instead.
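For reference, a microsecond-scale toggle loop might look like the sketch below. The pin choice (PC4) and the delay_us() function from the NerdKits delay.h are assumptions about your setup; note that at these speeds the loop and function-call overhead eat into the timing, so the actual period will be somewhat longer than the two delays alone.

```c
// Sketch, assuming the NerdKits library's delay.h and pin PC4.
#include <avr/io.h>
#include "delay.h"

int main(void) {
    DDRC |= (1 << PC4);          // set PC4 as an output

    while (1) {
        PORTC |= (1 << PC4);     // drive pin high
        delay_us(10);
        PORTC &= ~(1 << PC4);    // drive pin low
        delay_us(10);            // ~20 us period, ~50 kHz minus loop overhead
    }
    return 0;
}
```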

February 01, 2011
by bretm

Depending on the nature of the oscillations you're trying to produce, there may be better ways than using delays. If you need a continuous oscillation at a specific frequency, you can use one of the pins connected to a timer and let the timer hardware toggle the pin for you.

February 01, 2011
by rboggs10

Thanks Rick_S, it worked. I can't believe I didn't think of that.

