Database layer for scenario and execute list #266
Conversation
I tried to … Do you know what is going on with the …

@rouille give this a try. Update: try …

We cannot access postgres within the container?

We can. I think it's only needed for the dependencies to build psycopg2, so you don't actually have to run postgres. That should also get you tools like …
@jon-hagg I encountered the same failure as @rouille did above when I tried …

Cool, thanks for checking. I had an existing postgres installation, so I wasn't sure if it was needed. One thing about the libpq option: did you add the path to bashrc (or zshrc) and reload after …

No, I didn't do that. I simply tried …
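For context on the libpq route discussed above, the usual sequence on macOS with Homebrew looks roughly like the following. This is a sketch, not from the thread: libpq is keg-only, so its binaries are not on PATH by default, and whether you edit .bashrc or .zshrc depends on your shell.

```
brew install libpq
echo 'export PATH="$(brew --prefix libpq)/bin:$PATH"' >> ~/.zshrc
source ~/.zshrc           # reload so pg_config is found
pip install psycopg2      # should now build against libpq
```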
I spun up the local db container as described in the README and tried to run the tests. I got the following error message: … I believe I didn't create the database properly. Once I have the local db container running, how do I configure the test database according to the schema @jon-hagg mentioned in the gist?

@BainanXia sorry, I'll add some instructions to the readme.

If we have to install …
We should be able to use the dev server to test this. |
BainanXia left a comment
Thanks for the guidance. I finally got all tests to pass.
(PowerSimData) bxia PowerSimData (jon/postgres) $ pytest .
================================================================================================= test session starts ==================================================================================================
platform darwin -- Python 3.8.3, pytest-5.4.3, py-1.9.0, pluggy-0.13.1
rootdir: /Users/bainanxia/OneDrive - Gates Ventures/Documents/GitHub/PowerSimData, inifile: pytest.ini
collected 231 items
powersimdata/data_access/tests/test_execute_list_store.py ...... [ 2%]
powersimdata/data_access/tests/test_scenario_list_store.py ..... [ 4%]
powersimdata/data_access/tests/test_sql_store.py .... [ 6%]
powersimdata/design/tests/test_object_persistence.py ... [ 7%]
powersimdata/design/tests/test_resource_target_manager.py ............... [ 14%]
powersimdata/design/tests/test_scenario_info.py ........ [ 17%]
powersimdata/design/tests/test_strategies.py ................ [ 24%]
powersimdata/design/tests/test_target_manager_input.py ... [ 25%]
powersimdata/design/tests/test_transmission.py ............................................................. [ 52%]
powersimdata/input/tests/test_change_table.py ............................ [ 64%]
powersimdata/input/tests/test_grid.py ............................ [ 76%]
powersimdata/input/tests/test_transform_grid.py .............. [ 82%]
powersimdata/input/tests/test_transform_profile.py ................... [ 90%]
powersimdata/tests/test_mocks.py .......... [ 95%]
powersimdata/utility/tests/test_distance.py ... [ 96%]
powersimdata/utility/tests/test_helpers.py . [ 96%]
powersimdata/utility/tests/test_transfer_data.py ....... [100%]
============================================================================================ 231 passed in 77.36s (0:01:17) ============================================================================================
For people who would like to set up a database within a local container for testing purposes:

- Have docker engine and docker compose installed on the local machine.
- Run `docker-compose -f stack.yml up` in the directory of the file `stack.yml`.
- In a new terminal tab, create `schema.sql` and copy and paste what @jon-hagg has written in the gist.
- Run `psql -U postgres -h localhost`, using password 'example', to get into the psql shell of the container.
- Run `CREATE DATABASE psd;` then `\c psd`.
- In the database shell, run `\i schema.sql`; to check that the two tables were created in the psd database successfully, one could run `\dt`.
- With the database set up and running in the local container, run `pytest`.
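The steps above can be condensed into a copy-pasteable session. This is a sketch assuming `stack.yml` and `schema.sql` sit in the current directory and the container uses the password 'example' from the steps above:

```
# terminal 1: start the postgres container
docker-compose -f stack.yml up

# terminal 2: create the psd database and load the schema from the gist
psql -U postgres -h localhost -c "CREATE DATABASE psd;"
psql -U postgres -h localhost -d psd -f schema.sql
psql -U postgres -h localhost -d psd -c "\dt"   # verify the tables exist

# run the test suite
pytest
```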
I am testing this on the dev server. I have the db running in a container and can connect to the database via psql, confirming the db is running and set up. The tests fail saying the table does not exist: … Any suggestions on how to dig further?

@kasparm Did you create the two tables according to the …

@BainanXia yes. Feel free to check it out. The db container is running on the dev server and you should be able to connect.

@kasparm Try testing it again. It should work now.

What did you change?

@kasparm The two tables should be created in the psd database.

Thanks for the help. So I had created the tables in postgres rather than in the psd database.
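For anyone hitting the same "table does not exist" error, a quick sanity check from psql shows which database the tables actually landed in. A session sketch (exact table names depend on the gist's schema.sql):

```
psql -U postgres -h localhost
-- in the default "postgres" database, \dt lists tables created here by mistake
postgres=# \dt
-- switch to the psd database; the tests expect the tables there
postgres=# \c psd
psd=# \dt
```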
Running on the dev server is one option, though I was thinking we could automate the setup for running locally; that just needs a bit more work. I believe we can run the container in GitHub Actions as well.
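For GitHub Actions, postgres can run as a service container alongside the test job. A rough sketch only; the image tag, password, and install steps here are assumptions, not part of this PR:

```yaml
jobs:
  test:
    runs-on: ubuntu-latest
    services:
      postgres:
        image: postgres:12
        env:
          POSTGRES_PASSWORD: example
        ports:
          - 5432:5432
        # wait for postgres to accept connections before the steps run
        options: >-
          --health-cmd pg_isready
          --health-interval 10s
          --health-timeout 5s
          --health-retries 5
    steps:
      - uses: actions/checkout@v2
      - run: pip install -r requirements.txt
      - run: pytest
```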
Sorry guys, I am kind of lost. Would it be possible to have a step-by-step refresher on what to install and how to run it?

@rouille see if the readme I added is helpful, combined with the schema.sql file from the gist. If it's unclear or missing anything else (besides the gist, which will eventually be checked in), I'll update it.

@rouille I've written down the steps I went through to set up the local database and run the tests above.

Do you need to install …

@rouille Only …
Purpose
Implement a storage API for the scenario and execute lists using a postgres db, aiming to be consistent with the existing csv implementation. Provide a base class `SqlStore` to simplify usage via a context manager and reduce boilerplate query definitions in the `ScenarioTable` and `ExecuteTable` classes that inherit from it.

What the code does
The shared logic is in `sql_store.py`, and sql implementations have been added to `scenario_list.py` and `execute_list.py` alongside the csv version. Tests are probably the best way to see how these are used. The readme in `powersimdata/data_access` describes how to run a local db container which is used for tests and can be connected to manually (also see the note there: I haven't checked in the sql schema yet, so if you want to try it out, check this gist or message me and I will help).

What the code doesn't do is change how we currently store data. Nothing here is actually used yet, but the PR is getting on the larger side, so it's probably a good point to check in. Next steps will include having the schema created automatically, and setting up a container on the server to migrate existing data into, while having the code write to both csv and sql.
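To make the context-manager idea concrete, here is a minimal sketch of what such a base class could look like. The names `SqlStore` and `ScenarioTable` come from the PR description above, but every method name, and the use of sqlite3 in place of postgres/psycopg2 so the snippet runs anywhere, are illustrative assumptions rather than the PR's actual code.

```python
import sqlite3


class SqlStore:
    """Hypothetical base class owning the connection lifecycle, so
    subclasses only define their table shape and queries."""

    table = None     # subclasses set the table name
    columns = None   # and the selectable columns

    def __init__(self, db_path=":memory:"):
        self.db_path = db_path

    def __enter__(self):
        # sqlite3 stands in for psycopg2.connect(...) here
        self.conn = sqlite3.connect(self.db_path)
        return self

    def __exit__(self, exc_type, exc, tb):
        # commit on success, roll back on error, always close
        if exc_type is None:
            self.conn.commit()
        else:
            self.conn.rollback()
        self.conn.close()

    def select_all(self):
        cols = ", ".join(self.columns)
        return self.conn.execute(f"SELECT {cols} FROM {self.table}").fetchall()


class ScenarioTable(SqlStore):
    """Hypothetical subclass: only the table-specific pieces live here."""

    table = "scenario_list"
    columns = ["id", "name"]

    def create_table(self):
        self.conn.execute(
            f"CREATE TABLE IF NOT EXISTS {self.table} (id INTEGER, name TEXT)"
        )

    def add_entry(self, id_, name):
        self.conn.execute(f"INSERT INTO {self.table} VALUES (?, ?)", (id_, name))


if __name__ == "__main__":
    with ScenarioTable() as store:
        store.create_table()
        store.add_entry(1, "base_case")
        print(store.select_all())  # [(1, 'base_case')]
```

The `with` block means callers never manage commit/rollback/close by hand, which is the boilerplate reduction the description refers to.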
Time to review
~ 40 mins