
add read support for FOCMEC output files #2156

Merged

merged 19 commits into obspy:master from read_focmec on Feb 15, 2019

Conversation

@megies (Member) commented May 22, 2018

What does this PR do?

Add read support for focal mechanisms calculated using the program FOCMEC (out and lst files).

PR Checklist

  • Correct base branch selected? master for new features, maintenance_... for bug fixes
  • The PR is making changes to documentation (+DOCS).
  • All tests still pass.
  • Any new features or fixed regressions are covered by new tests.
  • Any new or changed features are fully documented.
  • Significant changes have been added to CHANGELOG.txt.

In [1]: cat = read_events('/path/to/focmec_8sta.lst')

In [2]: print(cat[0].focal_mechanisms[0])
FocalMechanism
	            resource_id: ResourceIdentifier(id="smi:local/ab869739-a73b-411f-b788-9d39e5edaf0a")
	           nodal_planes: NodalPlanes(nodal_plane_1=NodalPlane(strike=59.08, dip=76.43, rake=-64.23), nodal_plane_2=NodalPlane(strike=174.99, dip=28.9, rake=-150.97), preferred_plane=1)
	          azimuthal_gap: 236.7
	 station_polarity_count: 23
	                 misfit: 0.0
	          creation_info: CreationInfo(creation_time=UTCDateTime(2017, 9, 8, 14, 54, 58), version='FOCMEC')
	                   ---------
	               comments: 1 Elements

In [3]: print(cat[0].focal_mechanisms[0].comments[0].text)

     Dip,Strike,Rake     76.43    59.08   -64.23
     Dip,Strike,Rake     28.90   174.99  -150.97   Auxiliary Plane
     Lower Hem. Trend, Plunge of A,N     84.99    61.10   329.08    13.57
     Lower Hem. Trend, Plunge of P,T    358.82    51.71   128.90    26.95
     B trend, B plunge, Angle:  232.62  25.00 105.00

          Log10(Ratio)                              Ratio     S Polarity
     Observed  Calculated    Difference  Station     Type     Obs.  Calc. Flag
      0.8847      0.8950      -0.0103      BLA        SH       R      R       
      1.1785      1.0810       0.0975      COR        SH       R      R       
      0.6013      0.5442       0.0571      HRV        SH       R      R       
      0.3287      0.3666      -0.0379      KEV        SH       L      L       
      0.8291      0.9341      -0.1050      KIP        SH       R      R       
      0.8033      0.7815       0.0218      KIP        SV       B      B       
      1.0783      1.1857      -0.1074      PAS        SH       R      R       
      0.2576      0.2271       0.0305      TOL        SH       L      L       
     -0.2762     -0.4076       0.1314      TOL        SS       F      F    NUM
     -0.4283     -0.4503       0.0220      HRV        SS       F      F       
     -0.0830      0.0713      -0.1543      KEV        SS       B      B       

Total number of ratios used is  11
RMS error for the 11 acceptable solutions is 0.0852
  Highest absolute velue of diff for those solutions is  0.1543
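
For reference, the individual values in the printout can also be accessed as plain event attributes. A quick illustration (the commented values simply echo the numbers shown in the printout above):

fm = cat[0].focal_mechanisms[0]
np1 = fm.nodal_planes.nodal_plane_1
# nodal plane 1 as parsed from the lst file
print(np1.strike, np1.dip, np1.rake)          # 59.08 76.43 -64.23
print(fm.misfit, fm.station_polarity_count)   # 0.0 23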

@megies megies added the .io issues generally related to our read/write plugins label May 22, 2018
@megies megies added this to the 1.2.0 milestone May 22, 2018
@megies megies self-assigned this May 22, 2018
@krischer (Member) commented

Let us know once this is ready for review.

@megies (Member, Author) commented May 25, 2018

Ready for review. It's a bit ugly, but that mostly comes from the lst files output by FOCMEC not being very well constrained: they can look quite different depending on whether the input is polarities only, ratios only, or both, and also on the settings chosen when running FOCMEC. So for many pieces of information I have to search the whole header / focal mechanism block with regexes, because we cannot rely on the position of the information.

But there are several test files that look really different, and they all read OK, so that's good enough for me right now. Should people hit read errors, we can still fix things later.
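
To illustrate the kind of position-independent regex scanning described above, here is a minimal sketch (the function name and block handling are made up for illustration; this is not the actual reader code from the PR):

import re

# Find the "Dip,Strike,Rake" line wherever it occurs in a lst solution
# block, instead of relying on a fixed line position.
DIP_STRIKE_RAKE = re.compile(
    r'Dip,Strike,Rake\s+([-0-9.]+)\s+([-0-9.]+)\s+([-0-9.]+)')

def parse_nodal_plane(block_lines):
    """Return (strike, dip, rake) from the first matching line, or None."""
    for line in block_lines:
        match = DIP_STRIKE_RAKE.search(line)
        if match:
            dip, strike, rake = (float(x) for x in match.groups())
            return (strike, dip, rake)
    return None

Run against the example block above, this picks up the first "Dip,Strike,Rake" line (the preferred plane); the auxiliary plane line matches the same pattern as a second occurrence, and the trend/plunge lines would need analogous patterns.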

@megies (Member, Author) commented Oct 8, 2018

Rebased

@krischer krischer added this to Waiting for Review in Release 1.2.0 Feb 14, 2019
@megies (Member, Author) commented Feb 14, 2019

Rebased on current master and force-pushed so that we have fresh CI results tomorrow.

@ThomasLecocq (Contributor) left a review

Failed tests on Travis and AppVeyor seem unrelated.

Just check whether your datetime stuff is still needed.

Inline comment on:

    Event, FocalMechanism, NodalPlanes, NodalPlane, Comment, CreationInfo)

    # XXX some current PR was doing similar, should be merged to

@ThomasLecocq (Contributor): what about this? Is this still valid?

@megies (Member, Author): Yeah, I think so; it's just a note for future reference, I guess.



Inline comment on:

    # XXX some current PR was doing similar, should be merged to
    # XXX core/utcdatetime.py eventually..

@ThomasLecocq (Contributor): idem
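
For context, the "datetime stuff" and the XXX notes concern custom parsing of the run timestamp in the FOCMEC output header into a UTCDateTime (the creation_time shown in the example output above). A purely hypothetical sketch of such a helper, assuming an 'MM/DD/YYYY HH:MM:SS' header format (that format is an assumption for illustration, not necessarily what FOCMEC actually writes):

from obspy import UTCDateTime

def _parse_focmec_run_time(timestamp):
    """Hypothetical helper: parse an assumed 'MM/DD/YYYY HH:MM:SS'
    string into a UTCDateTime; for illustration only."""
    date_part, time_part = timestamp.split()
    month, day, year = (int(x) for x in date_part.split('/'))
    hour, minute, second = (int(x) for x in time_part.split(':'))
    return UTCDateTime(year, month, day, hour, minute, second)

# _parse_focmec_run_time('09/08/2017 14:54:58')
# -> UTCDateTime(2017, 9, 8, 14, 54, 58), matching the creation_info above

The XXX note suggests that this kind of ad-hoc date string handling could eventually be merged into core/utcdatetime.py rather than living in the plugin.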

@megies (Member, Author) commented Feb 15, 2019

Test failures are unrelated.

@megies megies merged commit 7546955 into obspy:master Feb 15, 2019
@megies megies deleted the read_focmec branch February 15, 2019 15:11
@megies megies moved this from Waiting for Review to Done in Release 1.2.0 Feb 15, 2019