Thread: Re: question about "do file wild *SAC"

Started: 2010-01-27 06:58:28
Last activity: 2010-01-27 08:53:57
Topics: SAC Help
Igor
2010-01-27 06:58:28
This is an update to my original message about the "do file wild *SAC"
problem. Katie Boyle helped me figure out what was happening.

So, the problem is the SAC memory limit. The script works fine with a
few files, but when I increase the number to 150, SAC fails with a
segfault. Apparently, "do file wild *SAC" tries to fit the data from all
the *SAC files into its buffer and cannot.

The data files are 1-sps, two-hour-long LHZ traces. Each file is 29K
(29,436 bytes) long.
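
(As a sanity check, that size matches the SAC binary layout: a 632-byte
header plus 7201 four-byte samples, i.e. two hours at 1 sps counting
both endpoints, gives 632 + 7201*4 = 29436 bytes.)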

My SAC version is the latest and greatest, v101.3b.
Is there a way to increase the total data size limit that SAC can use?

The workaround in my case is to read the SAC files one by one through a
loop in the shell (tcsh), outside SAC.
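
A minimal sketch of that approach (plain sh here rather than tcsh, and
the rmean and write lines are just placeholders for the real per-file
processing):

#!/bin/sh
# Start a fresh SAC session per file so only one trace is in memory
# at a time, sidestepping the wildcard buffer limit.
for f in *SAC; do
sac <<EOF
read $f
rmean
write over
quit
EOF
done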

Thanks,
Igor

  • Arthur Snoke
    2010-01-27 03:04:59
    In the HISTORY file that accompanies the version of SAC you are using, it
    says ...


    - Known Bugs/deficiencies
    - SSS traveltimes command (endian issue)
    - do file wild segmentation fault with a large number of files

    This bug will be corrected in the next release.

    When I first ran into this bug some years ago, I found that the
    trigger was not the number of files but the cumulative number of
    characters in the filespecs. The way we have gotten around it for
    now is to break the list into several pieces:
    do file wild [A-J]*bhz
    enddo
    do file wild [K-N]*bhz
    enddo
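
    Concretely, a sketch of that approach (assuming the loops are saved
    in a macro file, here called split.m, a name made up for this
    example, since do/enddo is macro flow control; $file references the
    loop variable, and the rmean/write lines are placeholders for the
    real per-file processing):

    do file wild [A-J]*bhz
       read $file
       rmean
       write over
    enddo
    do file wild [K-N]*bhz
       read $file
       rmean
       write over
    enddo

    Running "macro split.m" at the SAC prompt then processes the two
    ranges in separate loops, keeping each wildcard expansion small.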

    I think this has been discussed before in sac-help exchanges.

    • Igor
      2010-01-27 08:53:57
      Arthur,

      Thanks for the clarification, and for one more way to work around
      this problem.

      -Igor
