
Feat / logfile analysis #1093

Merged: 10 commits merged into microsoft:master from WilliamHPNielsen/feat/log_analysis on Jul 23, 2018

Conversation

WilliamHPNielsen
Contributor

Changes proposed in this pull request:

  • Add a minimodule for logfile analysis
  • Add an example notebook with a crazy real-world example of an SR860 failing mid-run

@QCoDeS/core

@jenshnielsen
Collaborator

Mypy is complaining. I think this is because you need an explicit import of dateutil.parser; see https://stackoverflow.com/questions/23385003/attributeerror-when-using-import-dateutil-and-dateutil-parser-parse-but-no
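A minimal sketch of the fix the linked answer describes (the timestamp below is invented): `import dateutil` alone does not necessarily load the `parser` submodule, so mypy (and sometimes the runtime) reports `dateutil.parser` as missing; importing the submodule explicitly resolves it.

```python
# Explicitly import the submodule; a bare `import dateutil` can leave
# `dateutil.parser` unresolved for mypy and at runtime.
from dateutil import parser

# Hypothetical log timestamp; the comma decimal separator is replaced
# with a period before parsing.
ts = parser.parse("2018-05-15 12:34:56,789".replace(",", "."))
print(ts.isoformat())
```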

It would be interesting to see if there are any recovering slowdowns. Could you plot all time stamps larger than x, where x is normaltime * 1.1 or something like that?
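One way to pick out those entries (a sketch with made-up numbers; here `normaltime` is assumed to be the median inter-entry spacing, which the comment leaves open):

```python
import numpy as np

# Made-up inter-entry times in seconds; in the PR these would come from
# the logfile analysis minimodule.
timedeltas = np.array([0.1, 0.1, 5.0, 0.1, 0.12, 0.3, 0.1])

normaltime = np.median(timedeltas)            # estimate of the normal spacing
slow = timedeltas[timedeltas > normaltime * 1.1]
print(slow)  # candidates for the "recovering slowdowns" plot
```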

codecov bot commented May 15, 2018

Codecov Report

Merging #1093 into master will not change coverage.
The diff coverage is n/a.

@@           Coverage Diff           @@
##           master    #1093   +/-   ##
=======================================
  Coverage   79.78%   79.78%           
=======================================
  Files          48       48           
  Lines        6664     6664           
=======================================
  Hits         5317     5317           
  Misses       1347     1347

t0s = nfirsttimes.astype("datetime64[ns]")
t1s = nsecondtimes.astype("datetime64[ns]")
timedeltas = (t1s.values - t0s.values).astype('float')*1e-9

@sohailc (Member) commented on May 15, 2018

Consider rewriting the above as:

ntimes = np.zeros((2, len(firsttimes)).astype("datetime64[ns]")

for count, times in enumerate([firsttimes, secondtimes]): 
	
	if "," in times.iloc[0]:
		ntimes[count] = times.str.replace(",", ".")
	else:
		ntimes[count] = times 

timedeltas = np.diff(ntimes, axis=1).astype(float) * 1E-9

@WilliamHPNielsen (Contributor, Author) replied

I'm not sure... that snippet will not work as is. I modified it to work (spot three differences 😛 ):

    ntimes = np.zeros((2, len(firsttimes))).astype("datetime64[ns]")

    for count, times in enumerate([firsttimes, secondtimes]):
        if "," in times.iloc[0]:
            ntimes[count] = times.str.replace(",", ".")
        else:
            ntimes[count] = times 

    timedeltas = np.squeeze(np.diff(ntimes, axis=0).astype(float) * 1E-9)

which is actually slightly (2.5%) slower than the original. But I get your dissatisfaction with the two identical if statements.
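For what it's worth, one way to drop the duplicated branch entirely (a sketch with invented sample timestamps): `str.replace` is simply a no-op when no comma is present, so the check on `times.iloc[0]` isn't needed.

```python
import numpy as np
import pandas as pd

# Invented timestamps in the two formats under discussion: comma vs.
# period as the decimal separator.
firsttimes = pd.Series(["2018-05-15 12:00:00,100", "2018-05-15 12:00:01,350"])
secondtimes = pd.Series(["2018-05-15 12:00:00,600", "2018-05-15 12:00:02,000"])

# Normalising unconditionally removes the need for the `if "," in ...` branch.
t0s = pd.to_datetime(firsttimes.str.replace(",", ".", regex=False))
t1s = pd.to_datetime(secondtimes.str.replace(",", ".", regex=False))
timedeltas = (t1s - t0s).dt.total_seconds().to_numpy()
print(timedeltas)
```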

@WilliamHPNielsen
Contributor Author

Should we put this in? It's not great, but it's also not harmful in any way, and it does have some utility.

@astafan8 (Contributor) left a comment

agree, it's a nice util.

@WilliamHPNielsen WilliamHPNielsen merged commit 0f85dfd into microsoft:master Jul 23, 2018
giulioungaretti pushed a commit that referenced this pull request Jul 23, 2018
Merge: 8c009af e662958
Author: William H.P. Nielsen <whpn@mailbox.org>

    Merge pull request #1093 from WilliamHPNielsen/feat/log_analysis