
TODO: Andover release #155

Closed · 34 of 36 tasks
wblumberg opened this issue Dec 11, 2018 · 13 comments

Comments

wblumberg commented Dec 11, 2018

Primary Goals:

  • QC array functions for various params.py functions.
  • Make SHARPpy resistant to input soundings that have missing data at the very top (I think this causes a CAPE bug).
  • Run through archive files and ID the files that have extremely large height values below ground.
  • Make the debug output less noisy and write it to .sharppy/ so users can send the log to developers and developers don't tear their hair out trying to debug the program.
  • Use CI to include documentation when deploying the binaries.
  • Clean up the README file so it duplicates as little of the online documentation as possible.
  • Include a command-line tool, installed with setup.py, that launches the GUI (see https://gehrcke.de/2014/02/distributing-a-python-command-line-application/; a sketch of the entry-point approach follows this list).
  • Deploy SHARPpy to pip and conda using Travis-CI.
  • Finish writing documentation (include a command-line args section).
  • Incorporate the Travis CI fake X window to do GUI unit testing (https://docs.travis-ci.com/user/gui-and-headless-browsers/ ... use the xvfb-run wrapper; the other approaches don't work).
  • Build a test suite to run on the CI machine ( @tsupinie ).
  • Include sample data files (BUFKIT, SPC-format, PECAN-format; ensemble, deterministic) for users to mimic and incorporate their formats into the documentation.
  • Clean up the real-time soundings on SHARP (some are throwing errors related to incorrect ordering of the data).
  • Rebuild SHARP archive using new *.gem files.
  • Make font size increase with window size (some branches might have these modifications):
    • fire.py
    • winter.py
    • STP-EF
    • VROT
    • STP Stats
    • SHIP Stats
    • Temp Advection
    • Wind Speed w/ height
    • PHT
    • Storm Slinky
    • SR Wind w height
    • Theta-E w height
    • thermo.py
    • hodo.py
    • skew.py
    • kinematics.py
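
To make the command-line launcher item above concrete, here is a minimal sketch of the setup.py entry-point approach described at the link; the sharppy command name and the runsharp.full_gui:main target are illustrative assumptions, not the project's actual layout:

# setup.py (sketch) -- the entry point target assumes a hypothetical runsharp.full_gui.main()
from setuptools import setup, find_packages

setup(
    name="SHARPpy",
    packages=find_packages(),
    entry_points={
        "console_scripts": [
            # installs a `sharppy` command that launches the GUI
            "sharppy = runsharp.full_gui:main",
        ],
    },
)

After a pip install, running sharppy from the terminal would then start the GUI.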

Low Priority Goals:

  • Incorporate PyLint to do code testing (work on cleaning up code too).
  • Attempt to incorporate the IEM BUFKIT soundings into SHARPpy.
  • Create FV3 BUFKIT files on SHARP.
  • Append something to the version name when a binary is deployed so we can ID if the problem is with a binary when SHARPpy is in the wild.
  • Go through and make docstrings for sharppy.viz files.
wblumberg pinned this issue Dec 16, 2018
wblumberg commented Dec 21, 2018

Bug fix:

  • Find a workaround for archived GFS BUFKIT soundings: prior to 7/20/2017 the soundings were output every 3 hours; after that date they were output every hour (see the sketch below).
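
A minimal sketch of how a decoder could pick the expected profile spacing from that cutoff date (the function name and the subsampling idea are assumptions, not existing SHARPpy code):

from datetime import datetime

GFS_HOURLY_CUTOFF = datetime(2017, 7, 20)  # archived GFS BUFKIT output switched from 3-hourly to hourly here

def gfs_bufkit_step(run_time):
    """Expected spacing (hours) between profiles in an archived GFS BUFKIT file."""
    return 3 if run_time < GFS_HOURLY_CUTOFF else 1

A decoder could then subsample or label the profile times accordingly so old and new archives look the same downstream.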

wblumberg commented Jan 17, 2019

Status update for today:

Attempted to deploy v1.4.0a4 last night with @tsupinie. It seems the entire release pipeline went to hell:

  1. AppVeyor successfully built the Windows binary using Python 2.7 and 64-bit architecture, but failed while trying to build the Python 3.6 version on 64-bit architecture. The binary that was built successfully could not be deployed to GitHub Releases. Logs for both builds are attached to this issue.

py26_64_appveyor.txt
py36_64_appveyor.txt

  2. The macOS binary was built and successfully pushed to GitHub Releases, but will not load. This is likely because two additional dependencies were added to the program: python-dateutil and requests. It may also be related to the inability to build the Windows binary on AppVeyor. Both dependencies are needed to check GitHub Releases for a more recent version of SHARPpy.
  3. The pip package was successfully built and deployed to PyPI. Yay. Submission to the most unforgiving online package repository was the only thing that succeeded.
  4. Azure Pipelines did not launch. This is likely because the build configuration does not handle tags.
  5. The conda build failed to create the correct .bz2 packages, so they were not uploaded. I was able to reproduce the error below on my laptop:
Ws-MacBook-Pro:conda-recipe blumberg$ conda build .
No numpy version specified in conda_build_config.yaml.  Falling back to default numpy value of 1.11
WARNING:conda_build.metadata:No numpy version specified in conda_build_config.yaml.  Falling back to default numpy value of 1.11
Cloning into '/Users/blumberg/conda-bld/sharppy_1547684816468/work'...
done.
Checking out files: 100% (2249/2249), done.
checkout: 'andover'
Checking out files: 100% (837/837), done.
Branch 'andover' set up to track remote branch 'andover' from 'origin'.
Switched to a new branch 'andover'
==> git log -n1 <==

commit 5b1a3ecea303e7e3e6bc97c38a45e676d187feee
Author: Greg Blumberg <wblumberg@ou.edu>
Date:   Wed Jan 16 02:01:47 2019 -0500

    trying to fix deployment pipeline

==> git describe --tags --dirty <==

v1.4.0a4

==> git status <==

On branch andover
Your branch is up to date with 'origin/andover'.

nothing to commit, working tree clean

Adding in variants from internal_defaults
INFO:conda_build.variants:Adding in variants from internal_defaults
/Users/blumberg/anaconda3/lib/python3.6/site-packages/conda_build/environ.py:422: UserWarning: The environment variable 'CONDA_BLD_PATH' is being passed through with value /Users/blumberg/conda-bld.  If you are splitting build and test phases with --no-test, please ensure that this value is also set similarly at test time.
  UserWarning
/Users/blumberg/anaconda3/lib/python3.6/site-packages/conda_build/environ.py:422: UserWarning: The environment variable 'CONDA_BLD_PATH' is being passed through with value /Users/blumberg/conda-bld.  If you are splitting build and test phases with --no-test, please ensure that this value is also set similarly at test time.
  UserWarning
/Users/blumberg/anaconda3/lib/python3.6/site-packages/conda_build/environ.py:422: UserWarning: The environment variable 'CONDA_BLD_PATH' is being passed through with value /Users/blumberg/conda-bld.  If you are splitting build and test phases with --no-test, please ensure that this value is also set similarly at test time.
  UserWarning
Received dictionary as spec.  Note that pip requirements are not supported in conda-build meta.yaml.
Ws-MacBook-Pro:conda-recipe blumberg$ 

The reason for this issue is unclear. bioconda/bioconda-utils#157 seems to suggest that it's due to a pip requirement being used in the meta.yaml file, but there's no reference to pip in conda-recipes/meta.yaml. The deployment worked previously, so I'm not sure what's going on. I might need to look at the versions of conda being used on the CI services.

  6. The documentation build-and-deployment CI freezes when it gets to the step where sphinx et al. are downloaded and installed. This also worked previously, so I'm not sure what's going on here. It happens on the 4th VM Travis CI spins up, which uses Linux and Python 3.

Possible Solutions:

Tried this last night in: 5b1a3ec

  1. I tried to remove some of the packages shown in the install_requires list in setup.py. This didn't fix the Conda-build issue.

  2. Moved the deployment to PyPI in .travis.yml to be the last step. This way, if the deployment fails in any previous step (e.g., conda), we don't deploy to PyPI.

  3. "Turned off" deployment to PyPI by forcing it to only deploy if it's on the "andover-off" branch. This branch doesn't exist, so deployment to PyPI will not occur anymore. This should help with testing.

  4. Tried to install sphinx using conda install. It was slow; the problem may have been that my command wasn't correct. I changed the command in ci/build_docs.sh from conda install -q -c sphinx sphinx-gallery to conda install -q -c anaconda sphinx sphinx_rtd_theme and conda install -q -c conda-forge sphinx-gallery. I haven't tested this yet, but it may solve the docs deployment step - see 29f0700 for the commit in which I did this.

  5. Tried to add requests and dateutil.parser into the .spec file for macOS. The binary built locally on my machine, but it will not load - it looks like it's having problems with the requests/python-dateutil dependencies. See sharppy-out.txt for the results from the binary. Here is the commit: 1b33954. (A sketch of the hiddenimports approach follows this list.)

sharppy-out.txt
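
Since a PyInstaller .spec file is just Python, one way the two new dependencies could be forced into the bundle is via hiddenimports. This fragment is only a sketch; the entry script path and everything except the hiddenimports list are placeholders, not SHARPpy's actual spec:

# SHARPpy.spec fragment (sketch): Analysis() is provided by PyInstaller when the spec is run.
a = Analysis(
    ['runsharp/full_gui.py'],          # placeholder entry script
    hiddenimports=[
        'requests',                    # used for the GitHub Releases version check
        'dateutil',                    # python-dateutil
        'dateutil.parser',
    ],
)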

wblumberg commented Jan 18, 2019

Got the conda-build working again on my machine - see d142a7e. The error occurred because the NUMPY_VERSION environment variable was not being read or parsed correctly when meta.yaml was processed.

@wblumberg

ci/build_docs.sh is slow when the sphinx-gallery extension is downloaded from conda, but it installs successfully. See: 42e63db

wblumberg commented Jan 18, 2019

Regarding AppVeyor and the Windows binary: this doesn't seem to be working anymore. I've started building binaries on Azure Pipelines. The deployment of these binaries works a lot more smoothly than with AppVeyor - binaries actually get built and published to GitHub Releases as a draft, along with an automatically built CHANGELOG. I can also output the artifacts from PyInstaller from both the macOS and Windows builds for troubleshooting, in addition to screenshots of the GUI from Azure. Results from pytest also get published nicely in the browser. With some more modifications I may be able to build tagged releases.

A problem has arisen with PyInstaller that I've used the artifacts to better understand. On the VMs, none of the relevant conda packages (PySide, requests, python-dateutil) get packaged into the binaries. The sharppy packages do appear to get included, though. I am not sure how to get around this yet.

@wblumberg

Now running pyinstaller from within the runsharp directory. It successfully finds the full_gui import, but does not load any other packages (e.g., numpy). It does load the PySide hooks like it should, though. A problem is arising with the .nib file PySide needs to handle the Mac menu bars: for some reason the directory gets stripped out on Azure.
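
If the missing directory is PySide's qt_menu.nib bundle, one possible workaround (untested; the source path is a guess at a typical conda PySide layout, not a verified location) is to copy it into the app explicitly from the .spec:

# .spec fragment (sketch): bundle qt_menu.nib so PySide can build the Mac menu bar.
import os
import PySide

nib_src = os.path.join(os.path.dirname(PySide.__file__), 'qt_menu.nib')  # assumed location
if os.path.isdir(nib_src):
    a.datas += [(os.path.join('qt_menu.nib', f), os.path.join(nib_src, f), 'DATA')
                for f in os.listdir(nib_src)]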

@wblumberg

Got Azure to build the macOS version almost correctly: an error associated with writing the correct version into the .spec file using versioneer keeps preventing the binary from being built correctly. Not sure what's going on with the Windows version.

wblumberg commented Sep 13, 2019

Known issues right now:

  • SHARPpy binaries (at least macOS) will not load SPC Observed data, even though the drop-down box shows the correct times when data is available.
  • The SHARP server is 3 days behind in creating international soundings (but archiving still happens).

Still need to test:

  • Issue reported by Victor about inability to load BUFKIT files.
  • Are the SHARPpy Windows binaries being built correctly?
  • Conda package submission.

Todo before release:

  • Edit documentation to remove odd references.
  • Add info on finding versioning information on the Issues pin.

wblumberg commented Sep 14, 2019

Update on binaries:

  • The macOS binary built by Azure is acting strangely: getAvailableAtTime() within available.py throws a URLError when Observed is accessed, though the models work. I suspect a similar issue is what happened to the Xenia binaries. For the binary, available.py is stored within the binary as opposed to the typical location (~/.sharppy), and available.py works fine when it's not running inside the binary. I need to confirm whether this issue persists when the macOS binary is built locally on my computer. Otherwise the binary works for the most part. (See the path-resolution sketch after this list.)
  • The Windows binary won't even launch. Tried building on a Win 8 machine (SHARPpy-win8.spec) and the Azure machine (SHARPpy-win7-64.spec); the PySide2 hooks were found, though. Both binaries fail to launch for similar reasons: "qt.qpa.plugin: Could not find the Qt platform plugin "windows" in "". This application failed to start because no Qt platform plugin could be initialized. Reinstalling the application may fix this problem."
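
Regarding the bundled available.py: a common PyInstaller pattern for locating files that ship inside the binary is a helper like the following. This is only a sketch of how the lookup could be handled (the function name and the fallback to ~/.sharppy are assumptions, not SHARPpy's actual code):

import os
import sys

def resource_path(relative):
    """Resolve a data file whether running from source or from a PyInstaller bundle."""
    if getattr(sys, 'frozen', False):
        base = sys._MEIPASS              # temp dir PyInstaller unpacks the bundle into
    else:
        base = os.path.expanduser('~/.sharppy')
    return os.path.join(base, relative)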

Tried some things to solve the Windows problem:

Putting the platforms/ folder next to SHARPpy.exe, per the recommendation in the last comment of https://stackoverflow.com/questions/47468705/pyinstaller-could-not-find-or-load-the-qt-platform-plugin-windows, successfully allowed SHARPpy.exe to run. It did NOT have the available.py problem seen in the macOS binary.
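
An equivalent runtime workaround is to point Qt at that plugins directory before any PySide2 import. This is only a sketch, under the assumption that platforms/ ships next to the executable as in the fix above:

# Sketch: set the Qt plugin search path before PySide2 is imported anywhere.
import os
import sys

if getattr(sys, 'frozen', False):
    exe_dir = os.path.dirname(sys.executable)
    os.environ.setdefault('QT_QPA_PLATFORM_PLUGIN_PATH',
                          os.path.join(exe_dir, 'platforms'))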

Found out I can debug PyInstaller binaries using pyi-archive_viewer and by using the debug and console switches in the EXE (https://stackoverflow.com/questions/13765801/how-do-i-debug-a-non-functioning-pyinstaller-build).
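
Those switches live in the spec's EXE() call; a minimal fragment (other arguments elided) would look something like:

# .spec fragment (sketch): enable bootloader debug output and keep the console window.
exe = EXE(pyz,
          a.scripts,
          name='SHARPpy',
          debug=True,      # verbose import/bootstrap messages at startup
          console=True)    # console stays open so the messages are visible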

@wblumberg

Replies to my post on pyinstaller/pyinstaller#2857 led me to try building an Anaconda environment using the pip version of PySide2. This is commit bf7dc12, and it successfully built a binary that could be launched on Windows 8. However, this environment.yml caused the Linux CIs to fail. Maybe I can try using PySide2 5.12 from conda-forge.

Still need to put print statements everywhere in the Mac version to figure out what's going on there.

wblumberg unpinned this issue Mar 11, 2020