NF 5030 attenuator setting / wrong readings
I am using an NF 5030 to measure the magnetic field both in a typical office 'background' situation and in the vicinity of high-voltage overhead transmission lines.
My question is about choosing the correct attenuation factor. A search of posts on here suggests that to measure very high fields it will be necessary to choose 30 or 40 dB, and that for the very lowest fields 0 dB may be needed (i.e. I cannot just leave it on 'auto', which only switches between 10 and 20 dB).
I am confused, however, about how to tell which attenuation factor to use at a given moment. Here in the office environment, where I would expect magnetic fields of roughly 0.1 to 10 uT and probably not more than 100 uT (very roughly, depending on how close I get to various bits of equipment), I get readings of 20, 120 or 3000 uT depending on which attenuation I choose (0 dB, auto/10 dB, 40 dB). Which attenuation is correct, and (crucially) how do I know which attenuation to select in a particular circumstance? Incidentally, a much less sophisticated three-axis magnetic field meter records between 0.1 and 1 uT as I move around the office, and this reading accords much more closely with my expectations...
edit: this is using mostly the default "power" mode settings, 45-65Hz, 1300ms scan time, 3D magnetic field sensor.
edit2: actually I am doing this with an NF 5035 demo unit, not an NF 5030 (but I will be buying the latter), and I seem to have been sent a very out-of-date manual whose information doesn't match the unit and which says little about attenuation. Just to forestall any "read the bloody manual" replies.
Thanks for your help,
It's quite simple to check whether the right attenuator is in use: if it is, you can also switch to the next lower or higher attenuator and should get very nearly the same result.
But this will not work for very low fields (where the 0 dB attenuator is needed) or very high fields (where the 40 dB attenuator is needed), because there is no lower or higher setting to compare against.
E.g. suppose you are using the 20 dB attenuator. If you get nearly the same reading with the 10 dB and 30 dB attenuators, then 20 dB is the right one. If instead the readings change hugely, you are using the wrong attenuator.
You can also see it in the signal level, which should be very stable. If the signal level changes a lot from sweep to sweep and doesn't show a sharp peak, you are using the wrong attenuator.
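The neighbour-check rule above can be sketched as a small procedure. This is only an illustration: `read_field()` is a hypothetical stand-in for however your measurement software actually reads the meter (here it simulates a front end that clips when the input exceeds the headroom of the chosen attenuator and is noise-limited when the attenuation is set too high; the headroom numbers are made up, not NF specs).

```python
TRUE_FIELD_UT = 5.0      # the field we pretend is really present
NOISE_FLOOR_UT = 0.05    # reading never drops below instrument noise

# Assumed headroom per attenuator setting (illustrative numbers only).
HEADROOM_UT = {0: 1.0, 10: 10.0, 20: 100.0, 30: 1000.0, 40: 10000.0}

def read_field(atten_db):
    """Simulated reading: clipped at the headroom, floored at a noise
    level that grows with the attenuation setting."""
    reading = min(TRUE_FIELD_UT, HEADROOM_UT[atten_db])
    return max(reading, NOISE_FLOOR_UT * (1 + atten_db / 10))

def pick_attenuator(settings=(0, 10, 20, 30, 40), tol=0.1):
    """Return the first setting whose reading agrees with both its
    neighbours to within a relative tolerance, per the rule above."""
    for lo, mid, hi in zip(settings, settings[1:], settings[2:]):
        a, b, c = read_field(lo), read_field(mid), read_field(hi)
        if abs(a - b) <= tol * b and abs(c - b) <= tol * b:
            return mid
    return None  # no consistent triple found: field at an extreme

print(pick_attenuator())
```

With the simulated 5 uT field, the 0 dB reading clips, so the first triple that agrees is 10/20/30 dB and the procedure settles on 20 dB. In practice you would do the same thing by hand: step the attenuator and keep the setting whose neighbours confirm it.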
Update: I've attached a table and graphic showing what happens (analog input). Exactly the same happens with field strengths.
Thanks for your reply, that makes things clearer.
To avoid having to find the right attenuation by trial and error, is there a list of the magnetic field strength ranges each setting is best suited to?
e.g. 0-1 uT = 0 dB, 1-10 uT = 10 dB, or something like that? It would be a very useful table to include in a future version of the manual, I think.
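For what it's worth, the spacing of such a hypothetical table follows from general dB arithmetic (nothing NF-specific): an attenuation of A dB scales a field amplitude by 10^(A/20), so each 10 dB step is a factor of about 3.16 and the full 0-40 dB ladder spans a 100:1 amplitude range.

```python
# General dB-to-amplitude arithmetic (not specific to the NF units):
# an attenuation of A dB corresponds to an amplitude factor 10**(A/20).
def amplitude_factor(atten_db):
    return 10 ** (atten_db / 20)

for atten_db in (0, 10, 20, 30, 40):
    print(f"{atten_db:2d} dB -> amplitude factor {amplitude_factor(atten_db):.2f}")
```

So a table like the one I'm imagining would step through field ranges in factors of ~3.16 rather than factors of 10.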
No, because the sensitivity depends on the frequency.
Posting was updated with graphics...
"No, because the sensitivity depends on the frequency."
Given that we are interested in measuring 50 Hz fields the majority of the time, why is it not possible to supply such a list of recommended attenuator settings for 50 Hz electric and magnetic fields?
Because every unit is different.