Merged
2 changes: 1 addition & 1 deletion Pipfile
Expand Up @@ -14,4 +14,4 @@ paramiko = "==2.7.1"
pytest = "==5.4.3"
scipy = "==1.4.1"
tqdm = "==4.29.1"

psycopg2 = "*"
144 changes: 82 additions & 62 deletions Pipfile.lock

Some generated files are not rendered by default.

34 changes: 34 additions & 0 deletions powersimdata/data_access/README.md
@@ -0,0 +1,34 @@
## About
The `powersimdata.data_access` package contains storage implementations used
throughout the simulation framework. By providing a consistent API for any
given set of data, we can decouple the storage medium from the application
logic.

Currently, there are CSV and SQL implementations for the scenario list and the
execute list.
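
The decoupling described above can be sketched as a shared interface with
interchangeable backends. The class and method names below are hypothetical
stand-ins for illustration, not the package's actual API:

```python
# Minimal sketch (hypothetical names) of the idea behind
# powersimdata.data_access: one interface, interchangeable backends.
from abc import ABC, abstractmethod


class ScenarioStore(ABC):
    """Common API so callers never depend on the storage medium."""

    @abstractmethod
    def add_entry(self, scenario_id):
        ...

    @abstractmethod
    def get_status(self, scenario_id):
        ...


class InMemoryStore(ScenarioStore):
    """Stand-in backend; the real package provides CSV and SQL ones."""

    def __init__(self):
        self._data = {}

    def add_entry(self, scenario_id):
        # New scenarios start in the "created" state.
        self._data[scenario_id] = "created"

    def get_status(self, scenario_id):
        return self._data.get(scenario_id)


store = InMemoryStore()
store.add_entry("42")
print(store.get_status("42"))  # created
```

Application code written against `ScenarioStore` keeps working when the
backend is swapped, which is what allows the CSV and SQL implementations to
coexist here.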

## Usage
To try this out, use the `stack.yml` provided in the tests directory to run a
local instance of postgres plus an admin UI. The integration tests for the db
layer run against this instance, and you can also connect to it with `psql`,
the standard CLI tool for interacting with postgres.

Start the container using the following command, taken from the postgres
[docs](https://github.com/docker-library/docs/blob/master/postgres/README.md).
```
docker-compose -f stack.yml up
```

Note: the schema is not created automatically (or tracked in source control)
at this point, so to run the tests you'll need to create it manually.
Improvements on this are forthcoming.

## Schema creation
To get a working local database, run the container then do the following:

```
# connect to container, use password from stack.yml
psql -U postgres -h localhost
```

Once in the `psql` shell, create the database with `CREATE DATABASE psd;`,
then connect to it with `\c psd`. Make sure there is a file called
`schema.sql` in your current directory, then run `\i schema.sql`. If you now
run `\dt` you should see that the tables have been created, and you should be
able to run the integration tests.
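
The contents of `schema.sql` are not part of this change. Based on the
columns the new `ExecuteTable` class uses (`id` and `status`), a minimal
version might look like the following; the column types here are guesses, not
the project's actual schema:

```sql
-- Hypothetical sketch; the real schema.sql is not in source control yet.
CREATE TABLE execute_list (
    id INT PRIMARY KEY,
    status TEXT NOT NULL
);
```

A similar table would presumably be needed for the scenario list, but its
columns are not shown in this change, so it is omitted here.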
2 changes: 1 addition & 1 deletion powersimdata/data_access/__init__.py
@@ -1 +1 @@
__all__ = ["csv_store", "execute_list", "scenario_list"]
__all__ = ["csv_store", "sql_store", "execute_list", "scenario_list"]
60 changes: 59 additions & 1 deletion powersimdata/data_access/execute_list.py
@@ -1,9 +1,67 @@
from powersimdata.data_access.csv_store import CsvStore
from powersimdata.data_access.sql_store import SqlStore, to_data_frame
from powersimdata.utility import server_setup


class ExecuteTable(SqlStore):
    """Storage abstraction for execute list using sql database."""

    table = "execute_list"
    columns = ["id", "status"]

    def get_status(self, scenario_id):
        """Get status of scenario by scenario id.

        :param str scenario_id: the scenario id
        :return: (*pandas.DataFrame*) -- results as a data frame.
        """
        query = self.select_where("id")
        self.cur.execute(query, (scenario_id,))
        result = self.cur.fetchmany()
        return to_data_frame(result)

    def get_execute_table(self, limit=None):
        """Return the execute table as a data frame.

        :param int limit: maximum number of rows to return; return all if None.
        :return: (*pandas.DataFrame*) -- execute list as a data frame.
        """
        query = self.select_all()
        self.cur.execute(query)
        if limit is None:
            result = self.cur.fetchall()
        else:
            result = self.cur.fetchmany(limit)
        return to_data_frame(result)

    def add_entry(self, scenario_info):
        """Add entry to execute list.

        :param collections.OrderedDict scenario_info: entry to add
        """
        scenario_id, status = scenario_info["id"], "created"
        sql = self.insert()
        self.cur.execute(sql, (scenario_id, status))

    def update_execute_list(self, status, scenario_info):
        """Update status of scenario in execute list.

        :param str status: execution status.
        :param collections.OrderedDict scenario_info: entry to update
        """
        self.cur.execute(
            "UPDATE execute_list SET status = %s WHERE id = %s",
            (status, scenario_info["id"]),
        )

    def delete_entry(self, scenario_info):
        """Delete entry from execute list.

        :param collections.OrderedDict scenario_info: entry to delete
        """
        sql = self.delete("id")
        self.cur.execute(sql, (scenario_info["id"],))


class ExecuteListManager(CsvStore):
    """Storage abstraction for execute list using a csv file on the server.

    :param paramiko.client.SSHClient ssh_client: session with an SSH server.
    """
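
The `ExecuteTable` methods in this diff lean on query helpers inherited from
`SqlStore` (`select_where`, `select_all`, `insert`, `delete`), whose bodies
are not shown here. The following is an illustration of that query-builder
pattern only, emulated with sqlite3 so it runs anywhere; the real class
targets postgres via psycopg2 (which uses `%s` placeholders rather than
sqlite's `?`), and the helper implementations shown are guesses:

```python
# Illustration of the SqlStore query-builder pattern, not the actual
# powersimdata implementation. sqlite3 stands in for psycopg2 here.
import sqlite3


class ToyExecuteTable:
    table = "execute_list"
    columns = ["id", "status"]

    def __init__(self, conn):
        self.cur = conn.cursor()
        cols = ", ".join(f"{c} TEXT" for c in self.columns)
        self.cur.execute(f"CREATE TABLE IF NOT EXISTS {self.table} ({cols})")

    def insert(self):
        # Build a parameterized INSERT for this table's columns.
        placeholders = ", ".join("?" for _ in self.columns)
        return f"INSERT INTO {self.table} VALUES ({placeholders})"

    def select_where(self, key):
        # Build a parameterized SELECT filtered on one column.
        return f"SELECT * FROM {self.table} WHERE {key} = ?"

    def add_entry(self, scenario_info):
        # New entries start in the "created" state, as in the diff above.
        self.cur.execute(self.insert(), (scenario_info["id"], "created"))

    def get_status(self, scenario_id):
        self.cur.execute(self.select_where("id"), (scenario_id,))
        return self.cur.fetchall()


store = ToyExecuteTable(sqlite3.connect(":memory:"))
store.add_entry({"id": "42"})
print(store.get_status("42"))  # [('42', 'created')]
```

Keeping `table` and `columns` as class attributes lets one base class build
queries for any subclass, which is why `ExecuteTable` only declares those two
attributes plus its domain methods.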