From: tanguy on
I'm trying to see the spectrum of a 100 Hz sine wave.
I'm using a Sine Wave block at 100 Hz, amplitude = 1, and sample time = 1/10000.
The output of the sine wave goes to a Spectrum Scope (dBm, buffer size = 128*128).
The output of the sine wave also goes to a running RMS block and a dBm conversion block (1 ohm reference), which gives 27 dBm, as expected.
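As a quick sanity check on that 27 dBm reading, here is a small sketch of the same computation (the amplitude and 1 ohm reference are taken from the model description above):

```python
import math

amplitude = 1.0   # sine amplitude (V), as set in the Sine Wave block
r_load = 1.0      # reference resistance for the dBm conversion (ohm)

# RMS of a sine of amplitude A is A / sqrt(2)
v_rms = amplitude / math.sqrt(2)

# Average power into the load, then convert watts to dBm (relative to 1 mW)
p_watts = v_rms ** 2 / r_load            # 0.5 W
p_dbm = 10 * math.log10(p_watts / 1e-3)  # ~26.99 dBm

print(round(p_dbm, 2))
```

This rounds to 27 dBm, matching the RMS/dBm block chain.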
To see something useful in the Spectrum Scope I need to run the simulation for 10 s. How can I reduce the simulation time while keeping good results around 100 Hz?
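For reference, the numbers already stated above (sample time 1/10000, buffer size 128*128) fix how much simulated time one spectrum update needs, assuming the scope computes one FFT per full buffer:

```python
fs = 10000.0             # sample rate = 1 / sample time
buffer_size = 128 * 128  # 16384 samples per spectrum update

buffer_duration = buffer_size / fs  # simulated seconds needed to fill one buffer
bin_spacing = fs / buffer_size      # FFT frequency resolution in Hz

print(buffer_duration)  # 1.6384 s per buffer
print(bin_spacing)      # ~0.61 Hz per bin
```

So each buffer alone takes about 1.64 s of simulation, and several buffers (e.g. for averaging) quickly add up toward the 10 s mentioned above.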
Reading the spectrum in the Spectrum Scope at 100 Hz, I measure about 25 dBm instead of the expected 27 dBm. Why? Is the power always calculated with a 1 ohm reference?
Thanks in advance.
Best regards
Eric