Processing experimental data: how to do it?
maerten.sebastien at laposte.net
Tue Oct 5 10:34:32 PDT 2004
I've recently had to process what I'd call a huge amount of experimental
data at work, and I'd like to get your opinion on how to do this.
First, the story. I've had to use one of those shiny oscilloscopes
which run Windows and let you save data files of tens of MB, but which
do not average power spectra :(. The scope was available for a short time
only, so I grabbed the data as fast as possible, put it in nicely named
folders and saved it for later processing. Back from this "measurement
campaign" I found myself with 2.6 GB of raw data in the form of text
files, nice. Typically, one "waveform" is 500 k to 1 M points, and the
PC I have at work cannot handle that amount of data (using Matlab on
Windows). What I needed to do was go through the directories, compute an
FFT for each data file and, for each final case, average the power
spectra and plot them. What I've done is use a combination of Octave +
bash scripts, and it did the job :).
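For what it's worth, here's a minimal sketch of the per-file FFT and
spectrum-averaging step in plain Python (the recursive FFT and the
function names are my own illustration, not the Octave scripts actually
used; in practice you'd call Octave's fft on each file instead):

```python
import cmath

def fft(x):
    """Recursive radix-2 Cooley-Tukey FFT; len(x) must be a power of two."""
    n = len(x)
    if n == 1:
        return list(x)
    even = fft(x[0::2])
    odd = fft(x[1::2])
    out = [0j] * n
    for k in range(n // 2):
        t = cmath.exp(-2j * cmath.pi * k / n) * odd[k]
        out[k] = even[k] + t
        out[k + n // 2] = even[k] - t
    return out

def power_spectrum(x):
    """|FFT|^2 of one waveform."""
    return [abs(c) ** 2 for c in fft(x)]

def average_spectra(waveforms):
    """Average the power spectra of several same-length waveforms,
    one spectrum per data file, as the scope would not do it itself."""
    n = len(waveforms)
    acc = [0.0] * len(waveforms[0])
    for w in waveforms:
        for i, p in enumerate(power_spectrum(w)):
            acc[i] += p
    return [a / n for a in acc]
```

The averaging is the key point: each waveform is transformed separately,
so no single file's worth of points has to stay in memory for long.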
This situation is quite common for most of the physicists I know, and
I'd like to know how you all handle it. Am I missing something, or is
this always a real pain? Do some of you face the same situations?