dBm = 10 * log10(measured power / 1 mW)

This formula means that dBm gives us the power of the signal relative to a standard 1 mW signal. Why not 1W?...convention. In fact, there is also dBW, which is power relative to, you guessed it, 1W!
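To make the two formulas concrete, here is a small Python sketch (the function names are mine, just for illustration):

```python
import math

def watts_to_dbm(p_watts):
    """Power in dBm: ratio to a 1 mW reference, on a log scale."""
    return 10 * math.log10(p_watts / 1e-3)

def watts_to_dbw(p_watts):
    """Power in dBW: same formula, but the reference is 1 W."""
    return 10 * math.log10(p_watts / 1.0)

print(watts_to_dbm(1e-3))  # 1 mW -> 0.0 dBm
print(watts_to_dbw(1.0))   # 1 W  -> 0.0 dBW
print(watts_to_dbm(1.0))   # 1 W is 30 dB above 1 mW
```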

So far so good... but what does it really mean? What confused me was: where is this 'measured power' that is compared to 1 mW actually measured? Let's imagine for a moment that we are a spectrum analyzer. Sitting inside our enclosure, we see the world through the connector on the front panel, i.e. the measured power must be measured inside the enclosure, not anywhere in the 'device under test' (DUT) to which the analyzer is connected! This is where the 'input impedance' of the analyzer comes into play. The input of my analyzer is labeled '50 Ohm'. What this means is that we can basically think of the entire expensive analyzer as a 50 Ohm resistor connected between the input and ground. In other words, when the analyzer is connected to the DUT, we actually load the DUT output with a 50 Ohm load.

Now it gets pretty straightforward: the measured power is the power dissipated in that 50 Ohm resistor. In other words, if we know the RMS voltage at the input of the analyzer, we can calculate the power in the resistor and understand the dBm readout.
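Under this model (the whole analyzer as a 50 Ohm resistor to ground), the readout follows directly from the RMS voltage at the connector. A minimal sketch, with an illustrative function name of my own:

```python
import math

R_IN = 50.0  # analyzer input impedance in Ohms

def dbm_readout(v_rms, r=R_IN):
    """dBm value implied by an RMS voltage across the input impedance."""
    p_watts = v_rms ** 2 / r          # P = V^2 / R
    return 10 * math.log10(p_watts / 1e-3)

# 1 mW in 50 Ohm corresponds to V_rms = sqrt(0.001 * 50) ~ 0.224 V
v_for_0dbm = math.sqrt(1e-3 * R_IN)
print(round(v_for_0dbm, 4))       # ~0.2236 V
print(dbm_readout(v_for_0dbm))    # ~0 dBm
```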

To validate this concept, I did a simple experiment: I hooked the 50 Ohm output of my waveform generator directly into the 50 Ohm input of the spectrum analyzer. I also hooked my oscilloscope into the output of the waveform generator via a BNC T-adapter. This is the circuit:

The 50 Ohm and 1 MOhm resistors and the 15 pF capacitor represent the internal impedances of the generator, the analyzer, and the oscilloscope: the oscilloscope input looks like 1 MOhm in parallel with a capacitance of about 15 pF.

(BTW: this great little schematic was made with iCircuit for iPad...a fantastic way to spend $10! Very easy to play with circuits to understand electronics concepts. The kicker: It does real time simulation while you play with circuit components - give it a try!)

This circuit illustrates that the voltage (amplitude) of the signal from the waveform generator is divided in half by the 50/50 voltage divider formed by the output impedance of the generator and the input impedance of the analyzer. How about the 1 MOhm/15 pF impedance of the oscilloscope? The 1 MOhm certainly does not factor in much, but how about the capacitor? My experiment runs at 10.7 MHz; calculating the reactance of the capacitor via 1/(2*pi*f*C) gives 992 Ohm at this frequency. Treating the cap as a 992 Ohm resistor in parallel with the 50 Ohm input of the analyzer gives a total of 47.6 Ohm, a change of about 5%. (Strictly speaking, the capacitive reactance combines with the 50 Ohm as a complex impedance, whose magnitude works out to about 49.9 Ohm, so the simple parallel-resistor figure is a worst-case estimate.) Not too impressive, but one definitely sees a small change in the dBm reading of the analyzer after hooking up the oscilloscope (it was about 0.2 dB).
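The loading calculation can be replayed numerically, both the simple parallel-resistor estimate and the proper complex-impedance version (values taken from the experiment above):

```python
# How much does the scope's 15 pF load the analyzer's 50 Ohm input at 10.7 MHz?
import math

f = 10.7e6          # Hz, test frequency
c = 15e-12          # F, oscilloscope input capacitance
r_analyzer = 50.0   # Ohm, analyzer input impedance

# Capacitive reactance magnitude: 1/(2*pi*f*C)
x_c = 1 / (2 * math.pi * f * c)
print(round(x_c))                  # ~992 Ohm

# Simple estimate: treat the reactance like a parallel resistor
r_simple = r_analyzer * x_c / (r_analyzer + x_c)
print(round(r_simple, 1))          # ~47.6 Ohm

# Proper treatment: combine as complex impedances
z_cap = 1 / (1j * 2 * math.pi * f * c)
z_par = (r_analyzer * z_cap) / (r_analyzer + z_cap)
print(round(abs(z_par), 1))        # ~49.9 Ohm, i.e. a much smaller effect
```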

My goal in the experiment was to verify that the analyzer indeed gives me a 0 dBm readout if the power dissipated in the 50 Ohm impedance of the analyzer is 1 mW.

This photo shows the readout of the spectrum analyzer after adjusting the waveform generator amplitude to achieve an exact 0.00 dBm value for the 10.7 MHz peak:

Here is the waveform generator screen:

This means that a 0.64V amplitude apparently delivers 1 mW (0 dBm) into the analyzer's 50 Ohm input. If the circuit above holds, we should see half this amplitude (0.32V) at the input of the analyzer. A look at the oscilloscope shows this:

The Rigol DS1052 apparently measures the full peak-to-peak swing of the waveform. It shows a value of 699 mV, corresponding to a true signal amplitude of 0.35 V, i.e. a bit more than half of the 0.64V shown on the function generator. The 'bit more' is probably a measurement problem; it cannot be explained by the oscilloscope impedance, which should have reduced the reading a bit. Pretty close, though! Let's calculate the power using the presumed 50% (0.32V) amplitude at the analyzer input:

The corresponding RMS amplitude is 0.32V/sqrt(2) = 0.226V. Using P = V^2/R to calculate the power, we get: P = 0.051/50 W = 0.00102 W = 1.02 mW. Pretty close! The 'mystery' of the spectrum analyzer readout seems to be solved!
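The whole chain, from generator setting to dBm readout, can be checked in a few lines (values from the experiment above):

```python
import math

v_gen = 0.64        # V, generator amplitude (peak)
r_source = 50.0     # Ohm, generator output impedance
r_load = 50.0       # Ohm, analyzer input impedance

# 50/50 voltage divider: half the amplitude appears at the analyzer input
v_peak = v_gen * r_load / (r_source + r_load)   # 0.32 V
v_rms = v_peak / math.sqrt(2)                   # ~0.226 V

p_watts = v_rms ** 2 / r_load                   # P = V^2 / R
p_dbm = 10 * math.log10(p_watts / 1e-3)

print(round(v_rms, 3))            # ~0.226 V
print(round(p_watts * 1000, 2))   # ~1.02 mW
print(round(p_dbm, 2))            # ~0.1 dBm, essentially the 0 dBm readout
```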

While reading up on spectrum analyzers, I learned that they actually work pretty much like an FM tuner. Like a tuner, they work best at a certain input power level. As long as the signal is strong enough, attenuating it with a calibrated BNC attenuator 'pad' by 20 dB or so seems to be a good idea to protect the fairly sensitive inputs of these devices. This is a bit like using a 1:10 oscilloscope probe as long as your voltages are not too low: it protects the oscilloscope input from excessive voltages. For further reading: there is a great (but also rather long) introduction to spectrum analyzers from Agilent, posted here: http://cp.literature.agilent.com/litweb/pdf/5952-0292.pdf