From: dpb on
Rune Allnor wrote:
> On 12 apr, 15:25, dpb <n...(a)non.net> wrote:
....

>> I've seen others, in both computations and the literature, besides
>> the particular de facto 'standard' in ML...
>
> Examples?

After 40 years in the field, and having been retired for over 10 now,
tracking down which specific instrument used which convention would
take more effort than I suspect it's worth, but various HP (now
Agilent) and Tek instruments had their own idiosyncrasies in the
normalizations they utilized internally.

>> plus windowing corrections, etc., that may
>> be in the other software that aren't in ML w/o manual incorporation.
>
....

> To get the numbers to match up - e.g. Parseval's relation - you
> need to introduce some additional scaling constants. Those have
> nothing to do with the FFT, but with the periodogram. If you
> want the numbers to match up to the physical world, you need to
> introduce calibration constants. Which have nothing to do with
> the periodogram or the FFT, but with the sensors.

Some do, others don't... I've been in precisely the same position as
the OP on multiple occasions and found, empirically, factors like an
extra (or missing) "N", "sqrt(N)", or "N/2", and various other
combinations of the above and more, where one simply had to determine
empirically what the instrument returned as opposed to the "gold
standard" you seem to want to refer to.
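
For illustration, the sort of check I mean, in ML itself (the
"instrument" numbers here are simulated, of course -- in practice they
come off the hardware):

  N  = 1024;
  n  = 0:N-1;
  x  = sin(2*pi*8*n/N);              % unit-amplitude sine, exactly on a bin
  X  = fft(x);                       % ML's unscaled DFT
  pk = max(abs(X));                  % N/2 = 512 for a unit sine
  [pk, pk/N, 2*pk/N, pk/sqrt(N)]     % candidates: none, 1/N, 2/N, 1/sqrt(N)
  % Parseval with ML's convention needs the 1/N on the frequency side:
  [sum(x.^2), sum(abs(X).^2)/N]      % the two entries should agree

Whichever candidate reproduces what the instrument reports tells you
which normalization it buried inside.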

It's true that there's a specific definition of the FFT, and that TMW
uses perhaps even the most common one, but that doesn't mean others
haven't been used in various other places, either on purpose or
perhaps even out of ignorance.

> Again, all this is trivial. One only needs to know what one
> is talking about.

As well as what the instrument makers were talking about, which
sometimes required some empirical prodding to uncover, IME...

I've never had a case where the transducer calibrations weren't
already in the data returned from the instrument; only the other
normalizations for windowing, N vis-a-vis N/2, proper scaling for
frequency binning, etc. In the end these amounted to a constant, and
which factors were which could be determined by looking at the number
required to obtain consonance.
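
As a concrete example of the windowing part (assuming a Hann window
and an amplitude-flatness correction; other windows and corrections
behave analogously):

  N = 1024;
  n = (0:N-1)';
  x = sin(2*pi*50*n/N);              % unit-amplitude sine on a bin
  w = hanning(N);
  X = fft(w.*x);
  cg  = sum(w)/N;                    % coherent gain of the window (~0.5 for Hann)
  amp = 2*max(abs(X))/(N*cg)         % single-sided, window-corrected: ~1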

My wager is still that that's what the OP will find...

If your experience is different, so be it.

--
From: Rune Allnor on
On 12 apr, 16:29, dpb <n...(a)non.net> wrote:
> Rune Allnor wrote:
> > On 12 apr, 15:25, dpb <n...(a)non.net> wrote:
>
> ...
>
> >> I've seen others, in both computations and the literature, besides
> >> the particular de facto 'standard' in ML...
>
> > Examples?
>
> After 40 years in the field, and having been retired for over 10 now,
> tracking down which specific instrument used which convention would
> take more effort than I suspect it's worth, but various HP (now
> Agilent) and Tek instruments had their own idiosyncrasies in the
> normalizations they utilized internally.
>
> >> plus windowing corrections, etc., that may
> >> be in the other software that aren't in ML w/o manual incorporation.
>
> ...
>
> > To get the numbers to match up - e.g. Parseval's relation - you
> > need to introduce some additional scaling constants. Those have
> > nothing to do with the FFT, but with the periodogram. If you
> > want the numbers to match up to the physical world, you need to
> > introduce calibration constants. Which have nothing to do with
> > the periodogram or the FFT, but with the sensors.
>
> Some do, others don't... I've been in precisely the same position as
> the OP on multiple occasions and found, empirically, factors like an
> extra (or missing) "N", "sqrt(N)", or "N/2", and various other
> combinations of the above and more, where one simply had to determine
> empirically what the instrument returned as opposed to the "gold
> standard" you seem to want to refer to.

I have said nothing about a 'gold standard'. I only happen to have
some clue about DSP. The definition of the FFT/IFFT pair as used in
matlab is found in a number of textbooks:

Oppenheim & Schafer: "Digital Signal Processing" (1975) eqs. 3.5 &
3.6, p.89.
Rabiner & Gold: "Theory and applications of DSP" (1975) eqs. 2.132 &
2.136, p. 51.
Oppenheim & Schafer: "Discrete-Time Signal Procesing" (1989) eqs. 8.2
& 8.4, p. 542
Proakis & Manolakis: "DSP, Principles, Algorithms and
Applications" (1996)
eqs. 5.1.1 & 5.1.8, pp 394 & 395
Leland B. Jackson: "Digital Filters and Signal Processing" (1989),
eqs 7.1.4 and 7.1.6, p. 134

and those are just the books I have easily available.
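
Anyone who wants to check matlab against those definitions can do it
in a few lines; the brute-force textbook sum reproduces fft()
directly, with the 1/N sitting on the inverse:

  N = 8;
  x = randn(1,N);
  X = zeros(1,N);
  for k = 0:N-1
      X(k+1) = sum(x .* exp(-1i*2*pi*k*(0:N-1)/N));  % textbook forward DFT
  end
  max(abs(X - fft(x)))               % zero to machine precision
  max(abs(x - ifft(fft(x))))         % the 1/N is in ifft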

> It's true that there's a specific definition of the FFT, and that TMW
> uses perhaps even the most common one, but that doesn't mean others
> haven't been used in various other places, either on purpose or
> perhaps even out of ignorance.

Maybe not. But not knowing that what TMW uses is the de facto
standard can *only* be attributed to ignorance.

> > Again, all this is trivial. One only needs to know what one
> > is talking about.
>
> As well as what the instrument makers were talking about, which
> sometimes required some empirical prodding to uncover, IME...

Too bad that you were using sub-standard gear.

> I've never had a case where the transducer calibrations weren't
> already in the data returned from the instrument; only the other
> normalizations for windowing, N vis-a-vis N/2, proper scaling for
> frequency binning, etc. In the end these amounted to a constant, and
> which factors were which could be determined by looking at the number
> required to obtain consonance.

That might be the case in a dedicated instrument.

> My wager is still that that's what the OP will find...

He might find that if he gets access to the proprietary data formats
for the commercial package at hand. However, the prices he quotes
all but guarantee that those data formats will not be openly
available.

The open data formats like .wav do not contain calibration
information, only raw quantized data from the fixed-point ADC.
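
For example (the file name and the full-scale value below are made up;
the point is only that one constant is missing from the file):

  [x, fs] = wavread('measurement.wav');  % samples as fractions of full scale
  pa_per_fs = 31.6;                      % Pa at full scale -- invented here;
                                         % exactly what .wav does not carry
  p = x * pa_per_fs;                     % pressure in Pa, only if you know it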

No sane programmer will allow the software package to interact with
the sensor except by reading the output. If the software package is
worth the suggested price tag, the full instrument package - from
sensor through cables through PC and software license - will have
been calibrated by some independent body, saving the calibration
data for the set-up locally on the PC. One might imagine the same
sensors and cables being used with a different PC/SW combo, in which
case the previous calibrations will be invalid if the sensor settings
are messed with. Similarly, the ADC on a single PC might be used
with a number of sensors, in which case other calibrations will
be invalidated if the ADC settings are messed with.

As I said several days ago:

1) Calibration data will be stored independently of the measured
data.
2) Calibration corrections will be implemented as pre-processing
steps in the processing chain.
3) Calibration data will not be available to arbitrary users, only
to the internals of the commercial package. That restricted access
is *the* reason why SW vendors can charge $20k per license.

All of these are totally trivial instrumentation issues.

> If your experience is different, so be it.

What is different between the two of us is that I actually know
what I am talking about, both with respect to DSP and
instrumentation.

Rune
From: dpb on
Rune Allnor wrote:
....

> What is different between the two of us is that I actually know
> what I am talking about, both with respect to DSP and
> instrumentation.

Well, I am quite confident I'm aware of what I've done as well.

The difference appears to be in what you think the OP is getting
returned to him vs. what I presume he has, which is output _INCLUDING_
those internal transducer calibrations, etc., etc., ... but lacking
the scaling to engineering units.

If he instead simply has raw transducer ADC values, that is indeed
something else again: the compensation will not be a constant, and
he's likely hosed unless he can find the calibrations or derive them
by comparison on a bin-by-bin basis.
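
Something along these lines would settle which case he is in
(simulated here; in the real case Xpkg would be imported from the
other package, computed from the same time record):

  N = 1024;
  x = randn(N,1);                    % stand-in for the shared time record
  Xml  = abs(fft(x));
  Xpkg = (2/N)*Xml;                  % pretend the package applied an unknown 2/N
  k = 2:N/2;                         % skip DC and the mirrored half
  r = Xpkg(k)./Xml(k);
  plot(k, r)                         % flat line -> one constant; curve -> bin-by-bin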

--
From: Rune Allnor on
On 12 apr, 19:06, dpb <n...(a)non.net> wrote:

> The difference appears to be in what you think the OP is getting
> returned to him vs. what I presume he has, which is output _INCLUDING_
> those internal transducer calibrations, etc., etc., ... but lacking
> the scaling to engineering units.

The guy talks about "sound pressure levels", which can only be
interpreted as meaning he thinks he gets calibrated numbers.
If that is correct - that the package indeed produces the
sound pressure numbers - then the calibration data are missing
from the data he exports from that package.
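
To spell out what "sound pressure level" presumes: the samples must
already be in pascals. In matlab terms, the usual calibrator-tone
check:

  fs = 48000;
  t  = (0:fs-1)'/fs;
  p  = sqrt(2)*sin(2*pi*1000*t);     % 1 Pa rms at 1 kHz, a typical calibrator tone
  Lp = 20*log10(sqrt(mean(p.^2))/20e-6)   % ~94 dB re 20 uPa
  % fed uncalibrated ADC fractions instead, Lp comes out off by a
  % constant number of dB -- precisely the missing calibration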

> If he instead simply has raw transducer ADC values, that is indeed
> something else again: the compensation will not be a constant, and
> he's likely hosed unless he can find the calibrations or derive them
> by comparison on a bin-by-bin basis.

I see no reason why he would get anything but the ADC values
exported to a public format. The calibration data is the sole
reason why anyone could defend a price tag of several tens of
thousands of dollars for what are effectively simple spectrum
analyses. Anyone selling a package for $20k would protect that
chargeable core. If the calibrated data were exported, anyone
could import them to some other package (e.g. matlab) and
do the analysis there.

There is partially a business aspect to this - the SW vendor
losing sales - but far more importantly, a QA/QC aspect. The
serious SW vendor will have had their computational algorithms
tested and certified by some standardizing body, which has
a number of boring legal implications for what the SW can be
used for.

If a measurement is to be used for something with legal
implications, e.g. settle a quarrel between a blast contractor
who used too much dynamite and a house owner who discovered
cracks in the basement walls, one does *not* want arbitrary
people or programs in that loop. One wants to be certain that
any particular analysis result is dominated by the data, and
not screwed up by an amateur fiddling with stuff he does
not understand.

The obvious way for the SW vendor to protect himself from
amateurs going in and doing stupid analyses is to be able
to say that "we don't export the calibration information
with the non-proprietary data formats, so any analyses based
on data available on non-proprietary data formats is invalid."

That way the SW vendors cover their backs wrt a number of
legal nuisances, ensuring that only their own, certified
software - and trained, certified operators - can do the
legally relevant stuff.

Rune
From: Johannes Buechler on
Rune Allnor <allnor(a)tele.ntnu.no> wrote in message <3cd2aa66-1f00-4a4a-8ffd-6a81c20b303b(a)u21g2000yqc.googlegroups.com>...
>
> > It's true that there's a specific definition of the FFT, and that TMW
> > uses perhaps even the most common one, but that doesn't mean others
> > haven't been used in various other places, either on purpose or
> > perhaps even out of ignorance.
>
> Maybe not. But not knowing that what TMW uses is the de facto
> standard can *only* be attributed to ignorance.

Ok, if that makes you happy.

> > > Again, all this is trivial. One only needs to know what one
> > > is talking about.

Right, but if everybody knew, you wouldn't be special any more, so that'd be boring as well.

> He might find that if he gets access to the proprietary data formats
> for the commercial package at hand. However, the prices he quotes
> all but guarantee that those data formats will not be openly
> available.

You are right that the details of the data format are not available to the customer. This however is irrelevant for this thread, because the numerical information contained in an exported format such as .mat or .wav (with restrictions) or .asc is identical, apart from some minor rounding errors due to different data word representations.

Or how else would you explain that if I import data that I had formerly exported as .mat or .asc (pure uncompressed numbers, representing the unit of the respective channel - to stay with the example of this thread, Pascal), I get the exact same plots?

> No sane programmer will allow the software package to interact with
> the sensor except by reading the output. If the software package is

Am I wrong, or do you completely ignore the fact that there is an actual measurement frontend between the sensor and the software?

The software used to record the data, by the way, may be different from the software used to create the plots. I can measure with a frontend from a different manufacturer and still do the analysis in PAK. This is not standard of course, but it is done regularly in my daily work, with different companies and with different departments within one and the same company. Luckily this works; otherwise we'd have the same dilemma as with MS Word...

It is true, though, that plots don't look exactly alike between different software packages. However, there is for sure no 3-5 dB shift in anything; the differences are rather minor. We can usually reproduce analyses very well across different software packages, even when it comes to things like psychoacoustic parameters, modulation analyses and what not.

> worth the suggested price tag, the full instrument package - from
> sensor through cables through PC and software license - will have
> been calibrated by some independent body, saving the calibration
> data for the set-up locally on the PC. One might imagine the same
> sensors and cables being used with a different PC/SW combo, in which
> case the previous calibrations will be invalid if the sensor settings
> are messed with. Similarly, the ADC on a single PC might be used

Again, you obviously ignore the presence of a measurement frontend. Also, it appears that you have no clue at all about how multi-channel measurements are performed in real life. We have 120 parallel measurement channels: some record accelerometers, some record microphones, some record AHs, some record TTL signals, and some record data from a bus system like CAN (extracted from the data stream by the frontend). Each channel can have a sampling rate of up to 2^18 Hz - the TTL channels have even higher sampling rates; in return they store only the times of the edges, not the voltages themselves.

The whole setup looks different each time, because different kinds of sensors are applied to the object of interest and connected to the frontend just before the measurements are done. For instance, when I perform a measurement for a vehicle interior noise transfer path analysis on a dynamometer, I typically have around 90 accelerometer channels (30 triaxial sensors), some 10 microphones and some other stuff. The frontend just knows the sensitivity of each sensor and whether it needs an operating voltage or not. The sensitivity incorporates the calibration value, which is obtained before the measurement by the user by means of a calibrator (at least for the microphones).

The computer can be exchanged; it is just connected to the measurement frontend via a LAN cable. Its only purpose during the measurement is to store the data that the frontend delivers. It can also show some useful information on the display, like time data of one or several channels, or on-line analyses of the incoming data.

No specific information other than the sensitivity of a sensor is available, since TEDS information is not used (until a while ago it did not exist anyway). The sensors are assumed to have a flat frequency response within a specified frequency range. It is up to the engineer to keep this in mind while interpreting the data afterwards.

So what would this ominous "calibration data" be that you are talking about the whole time? I would really like to understand this.

If I import a Matlab data stream into the software, I have to be able to get the same FFT plot as in Matlab itself. This is the whole issue of this thread.
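
For the record, the convention I would try first when matching plots
(whether a package scales to peak or to rms, and how it treats the
window, is exactly the kind of detail that has to be pinned down case
by case):

  fs = 8192; N = 8192;
  t = (0:N-1)'/fs;
  p = sqrt(2)*sin(2*pi*1000*t);          % 1 Pa rms at 1 kHz, i.e. ~94 dB SPL
  X = fft(p)/N;
  A = 2*abs(X(1:N/2+1));                 % single-sided peak amplitudes
  A(1) = abs(X(1));                      % DC is not doubled
  f  = (0:N/2)'*fs/N;
  Lp = 20*log10((A/sqrt(2) + eps)/20e-6);  % peak -> rms -> dB SPL
  plot(f, Lp)                            % a ~94 dB point at 1 kHz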

> 1) Calibration data will be stored independently of the measured
> data.
> 2) Calibration corrections will be implemented as pre-processing
> steps in the processing chain.
> 3) Calibration data will not be available to arbitrary users, only
> to the internals of the commercial package. That restricted access
> is *the* reason why SW vendors can charge $20k per license.

Nobody said that each single license costs $20k. There are quite a few of us working with it here.

> All of these are totally trivial instrumentation issues.

Good to know; it doesn't solve the problem, though.

Joe/"the guy"/OP