[SAC-HELP] question about "do file wild *SAC"

Igor Stubailo stubailo at gmail.com
Tue Jan 26 14:58:28 PST 2010


This is an update to my original message about the "do file wild *SAC" problem.
Katie Boyle helped me to figure out what was happening.

So, the problem is SAC's memory limit. The script works fine with a few
files, but when I increase their number to 150, SAC fails with a segfault.
Apparently, "do file wild *SAC" tries to fit all the *SAC data into the
buffer and cannot.

The data files are 1-sps, 2-hour-long LHZ records. Each file is 29K (29,436 bytes) long.

My SAC version is the latest and greatest, 101.3b.
Is there a way to increase the total data size limit that SAC can use?

The workaround in my case is to read the SAC files one by one, through a
shell loop outside SAC.
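A minimal sketch of that workaround follows. It is an assumption, not my
actual script: "cat" stands in for the real "sac" binary so the sketch
runs without SAC installed, the dummy *.SAC files are placeholders for the
LHZ records, and the commands in the here-document are just an example
processing sequence. The point is the structure: each loop pass starts a
fresh SAC process that reads exactly one file, so only one trace ever sits
in SAC's buffer.

```shell
#!/bin/sh
# Workaround sketch: drive SAC from a shell loop, one file per
# invocation, instead of "do file wild *SAC" inside one SAC session.
set -e
workdir=$(mktemp -d)
cd "$workdir"
touch A.SAC B.SAC C.SAC   # dummy stand-ins for the real data files

# "cat" is a stand-in for "sac" here; with SAC installed you would
# pipe the same here-document into the sac binary.
out=$(for f in *SAC; do
cat <<EOF
read $f
rmean
write over
quit
EOF
done)
printf '%s\n' "$out"

cd ..
rm -rf "$workdir"
```

Because every file gets its own SAC process, memory use stays flat no
matter how many files the wildcard matches.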

Thanks,
Igor
