FreeRTOS Support Archive
The FreeRTOS support forum is used to obtain active support directly from Real
Time Engineers Ltd. In return for using our top quality software and services for
free, we request you play fair and do your bit to help others too! Sign up
to receive notifications of new support topics then help where you can.
This is a read-only archive of threads posted to the FreeRTOS support forum.
The archive is updated every week, so it will not always contain the very latest posts.
Use these archive pages to search previous posts. Use the Live FreeRTOS Forum
link to reply to a post, or start a new support thread.
[FreeRTOS Home] [Live FreeRTOS Forum] [FAQ] [Archive Top] [April 2009 Threads]

How to delay in nanosecond
Posted by Sebastian Aslund on April 9, 2009

Hello all
I am new to the FreeRTOS system and I need a delay for initialising an LCD display. Since it only requires a delay of some nanoseconds, it seems a bit overkill to use vTaskDelay with its millisecond resolution. Is it possible to make vTaskDelay delay in nanoseconds, or is there another option? Right now I use asm to create a delay.
Regards
RE: How to delay in nanosecond
Posted by Adam Turowski on April 9, 2009

vTaskDelay uses the scheduler to create the delay. That means two things:
- the delay time can only be a multiple of the scheduler tick (usually a multiple of 1 ms)
- other tasks can run while the task is delayed
To use the scheduler for a delay, the kernel has to check the ready-task list and (if necessary) switch context. All of this takes some time: approximately 10 us for an ARM7TDMI running with a 48 MHz core clock.
The conclusion is that for very small delays you should use a loop (as you are doing now); for bigger delays (more than, let's say, two scheduler ticks) use vTaskDelay.
RE: How to delay in nanosecond
Posted by Adam Turowski on April 9, 2009

I forgot to add that if you need a delay somewhere between 10 us (the task-switching time) and 1 ms (the scheduler tick), you should use a timer interrupt or get your hardware redesigned.
RE: How to delay in nanosecond
Posted by Jaume Ribot on April 9, 2009

Try an asm("nop") loop strategy. Knowing the system clock, you can obtain very short delays. It bypasses all OS scheduling, but it gives you the low delay values you need.
Best regards Jaume
RE: How to delay in nanosecond
Posted by Sebastian Aslund on April 9, 2009

Thanks for your replies. I just assumed that using asm("nop") was a brutal way of doing things, but if that is the only option, I'll have to stick with it :)
Regards
Sebastian
RE: How to delay in nanosecond
Posted by incrediball on April 13, 2009

It might seem brutal and wasteful, but compared to two task switches (to another task and back again), wasting small amounts of time with NOPs is insignificant.
Depending on your microprocessor and its clock speed, some of the required delays may be shorter than a single instruction, so it is hard to get the timing exactly right unless you use a logic analyser.
Copyright (C) Amazon Web Services, Inc. or its affiliates. All rights reserved.