[...]
a __nop keyword almost makes sense, since, if anything, it can tell the
compiler not to optimize the instruction away (as opposed to the compiler
doing its best to figure out that "some function in another compilation
unit, accessed via a function pointer and performing a few global variable
assignments" is in fact a no-op).
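For instance, with a GCC-style compiler (a minimal sketch only; the function
names below are made up for illustration), an empty counting loop has no
observable effect and may be deleted outright by the optimizer, whereas a
volatile asm statement has to be emitted as written:
\code
/* Sketch only: contrasting a plain empty loop with an explicit NOP.
 * Assumes a GCC-style compiler with extended inline assembly.
 */

/* Has no observable effect, so the optimizer is free to remove the
 * whole loop (and typically does at -O1 and above). */
static void busy_wait_loop( unsigned int n )
{
    unsigned int i;

    for ( i = 0; i < n; i++ )
        ;
}

/* The volatile asm statement may not be deleted, so the compiler has
 * to keep the loop and emit one NOP per iteration. */
static void busy_wait_nops( unsigned int n )
{
    unsigned int i;

    for ( i = 0; i < n; i++ )
        __asm__ volatile ( "nop" );
}
\endcode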
I'm still trying to understand why this, or anything like it, would make
sense. What would be the benefit of forcing NOP instructions into the
object code generated by a C compiler?
I actually use NOP delays primarily in writing serial protocols that I
have to bit-bang myself. For example, if I'm running a
microcontroller at 32 MHz, I sometimes need to insert small NOP delays
to create an electrical signal that is slow enough to be detected by
the peripheral (in my case a temperature sensor). If I ran the
microcontroller at a slower clock speed, I wouldn't need to insert
delays. I personally haven't used anything like this outside the
embedded domain.
\code snippet
/*
 * Define the baseline for the delay according to the PIC18F2X1X/4X1X
 * oscillator frequency. Each instruction requires 4 clock cycles to
 * complete. The table below gives the measured time used by a NOP
 * instruction 'asm( "nop" )'.
 *
 *  Osc Freq | Delay
 * ----------+--------
 *    1 MHz  |   4 us
 *    2 MHz  |   2 us
 *    4 MHz  |   1 us
 *    8 MHz  | 500 ns
 *   16 MHz  | 250 ns
 *   32 MHz  | 125 ns
 */
#if defined(FOSC)
# if (FOSC == 1000000)
# define delay_4us asm( "nop" )
# elif (FOSC == 2000000)
# define delay_2us asm( "nop" )
# elif (FOSC == 4000000)
# define delay_1us asm( "nop" )
# define delay_2us delay_1us; delay_1us
# elif (FOSC == 8000000)
# define delay_500ns asm( "nop" )
# define delay_1us delay_500ns; delay_500ns
# define delay_2us delay_1us; delay_1us
# elif (FOSC == 16000000)
# define delay_250ns asm( "nop" )
# define delay_500ns delay_250ns; delay_250ns
# define delay_1us delay_500ns; delay_500ns
# define delay_2us delay_1us; delay_1us
# elif (FOSC == 32000000)
# define delay_125ns asm( "nop" )
# define delay_250ns delay_125ns; delay_125ns
# define delay_500ns delay_250ns; delay_250ns
# define delay_1us delay_500ns; delay_500ns
# define delay_2us delay_1us; delay_1us
# else
# error FOSC is not a valid clock speed
# endif
#else
# error delay.h requires FOSC to be defined
#endif
\endcode
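A delay_us( n ) helper for longer waits (like the 5 us pause used below) can
be built on top of the 1 us macro. A rough sketch only, not the exact
routine: it is approximate because of loop overhead, and it assumes FOSC is
at least 4 MHz so that delay_1us is defined.
\code
#include <stdint.h>

/* Rough sketch only: busy-wait for roughly 'microseconds' microseconds
 * by repeating the delay_1us macro. The loop overhead adds a little
 * extra time per iteration.
 */
static void delay_us( uint8_t microseconds )
{
    while ( microseconds-- )
    {
        delay_1us;
    }
}
\endcode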
\code snippet2
void sensibus_transmit_start( uint8_t sensor_id )
{
    /*       _____           ________
     * DATA:      |__________|
     *          ____         ____
     * CLK : __|    |_______|    |___
     */
    SENSIBUS_DATA_OUT( sensor_id );              /* drive the DATA pin as an output */
    SENSIBUS_CLK = 0;
    C_SET_BIT( SHT1X_SENSOR_PORT, sensor_id );   /* DATA high */
    delay_1us;
    SENSIBUS_CLK = 1;                            /* first clock pulse */
    delay_1us;
    C_CLEAR_BIT( SHT1X_SENSOR_PORT, sensor_id ); /* DATA low while CLK is high */
    delay_1us;
    SENSIBUS_CLK = 0;
    delay_us( 5 );
    SENSIBUS_CLK = 1;                            /* second clock pulse */
    C_SET_BIT( SHT1X_SENSOR_PORT, sensor_id );   /* DATA back high while CLK is high */
    delay_1us;
    SENSIBUS_CLK = 0;
}
\endcode
If I strip all of the delays out of sensibus_transmit_start while running
at 32 MHz, the temperature sensor can't keep up.
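For what it's worth, clocking out the command byte afterwards is just more
of the same pattern; a simplified sketch only (the acknowledge bit from the
sensor is omitted, and the pin names are assumed to match the snippet
above):
\code
/* Simplified sketch only: clock out one byte MSB first using the same
 * NOP-based delays. Ack handling is not shown, and the SENSIBUS_* and
 * C_*_BIT names are assumed to match the snippet above.
 */
static void sensibus_send_byte( uint8_t sensor_id, uint8_t value )
{
    uint8_t mask;

    for ( mask = 0x80; mask != 0; mask >>= 1 )
    {
        if ( value & mask )
            C_SET_BIT( SHT1X_SENSOR_PORT, sensor_id );
        else
            C_CLEAR_BIT( SHT1X_SENSOR_PORT, sensor_id );
        delay_1us;
        SENSIBUS_CLK = 1;    /* sensor samples DATA while CLK is high */
        delay_1us;
        SENSIBUS_CLK = 0;
        delay_1us;
    }
}
\endcode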
Best regards,
John D.