I have added an "add remote host" form to the settings page for now.

I have two primary endpoints:

- `add_host` (will be called from the settings page)
- `add_host_to_task` (will be called from the job details page)

Once a request comes in to `api.py`'s `add_host`, what function should be invoked to add the host to the `dagobah_host` table?

I am a little confused about how I should structure my code. Should I create a new `Host(object)` class in `core.py`, and write a new `commit_host` function in `base.py` (which would end up in `sqlite.py` and `mongo.py`)? If I do all that, I will end up writing a parallel Job object. But then again, the remote host info is completely isolated from the job info, so does it make sense to keep these in two separate classes?
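For context, the question boils down to how those two handlers delegate downward. A minimal sketch, assuming the endpoints stay thin and push all the work onto a core object — `FakeCore` and its method names here are placeholders for illustration, not existing dagobah code:

```python
# Hypothetical shape of the two endpoints. Every name below (FakeCore,
# add_host, add_host_to_task) is an assumption used for illustration,
# not existing dagobah API.

class FakeCore(object):
    """Stand-in for the core Dagobah object."""

    def __init__(self):
        self.hosts = []

    def add_host(self, name, username):
        # A new host starts out unattached to any task.
        self.hosts.append({'name': name, 'username': username,
                           'task_id': None})

    def add_host_to_task(self, host_name, task_id):
        for host in self.hosts:
            if host['name'] == host_name:
                host['task_id'] = task_id

dagobah = FakeCore()

def add_host(form):
    """Endpoint body: called from the settings page."""
    dagobah.add_host(form['name'], form['username'])

def add_host_to_task(form):
    """Endpoint body: called from the job details page."""
    dagobah.add_host_to_task(form['host_name'], form['task_id'])

# Simulated requests hitting the two endpoints:
add_host({'name': 'worker1.example.com', 'username': 'deploy'})
add_host_to_task({'host_name': 'worker1.example.com', 'task_id': 7})
```

The point of the sketch is only the layering: `api.py` never touches the `dagobah_host` table directly; it hands everything to the core object, which owns persistence.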
The `DagobahHost` model:

```python
class DagobahHost(Base):
    __tablename__ = 'dagobah_host'

    id = Column(Integer, primary_key=True)
    task_id = Column(Integer, ForeignKey('dagobah_task.id'), index=True)
    name = Column(String(1000), nullable=False)
    username = Column(String(1000), nullable=False)
    password = Column(String(1000))
    key = Column(String(1000))

    def __init__(self, name, username):
        self.name = name
        self.username = username

    def __repr__(self):
        return "<SQLite:DagobahHost (%d)>" % self.id

    @property
    def json(self):
        return {'task_id': self.task_id,
                'name': self.name,
                'username': self.username,
                'password': self.password,
                'key': self.key}

    def update_from_dict(self, data):
        for key in ['task_id', 'name', 'username', 'password', 'key']:
            if key in data:
                setattr(self, key, data[key])
```

And in `DagobahTask(Base)`:

What do you suggest? How would you structure your code?
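As a quick sanity check, the model above can be exercised against an in-memory SQLite database. This is a trimmed sketch, assuming SQLAlchemy 1.4+: `DagobahTask` is stubbed down to a bare `id` column just so the foreign key resolves, and `__repr__`/`update_from_dict` are omitted for brevity.

```python
from sqlalchemy import Column, ForeignKey, Integer, String, create_engine
from sqlalchemy.orm import declarative_base, sessionmaker

Base = declarative_base()

class DagobahTask(Base):
    # Stub: the real dagobah_task table has many more columns.
    __tablename__ = 'dagobah_task'
    id = Column(Integer, primary_key=True)

class DagobahHost(Base):
    __tablename__ = 'dagobah_host'

    id = Column(Integer, primary_key=True)
    task_id = Column(Integer, ForeignKey('dagobah_task.id'), index=True)
    name = Column(String(1000), nullable=False)
    username = Column(String(1000), nullable=False)
    password = Column(String(1000))
    key = Column(String(1000))

    def __init__(self, name, username):
        self.name = name
        self.username = username

    @property
    def json(self):
        return {'task_id': self.task_id, 'name': self.name,
                'username': self.username, 'password': self.password,
                'key': self.key}

engine = create_engine('sqlite://')   # in-memory database
Base.metadata.create_all(engine)
session = sessionmaker(bind=engine)()

host = DagobahHost('worker1.example.com', 'deploy')
session.add(host)
session.commit()                      # id is assigned by autoincrement
```

After the commit, `host.id` is populated and `host.json` serializes the row, with `task_id` still `None` until the host is attached to a task.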
The Host class should live in the poorly-named components.py module.
The core Dagobah class should maintain its own list of Hosts, very similar to how it maintains the list of Jobs it knows about. Concretely:

- Add a `commit_host` method to all of the backends, and make the corresponding changes to related methods like `delete_dagobah` (basically, do for hosts whatever the backend already does for jobs).
- The Mongo backend should get a new collection to store the Hosts in (in addition to their place as an array on the main Dagobah collection).
- The SQLite backend will need to do its whole normalized thing, adding at least one additional table depending on how you want to structure the Hosts themselves.
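Concretely, the structure described above might look something like the skeleton below. It mirrors the suggestion — a `Host` in `components.py`, a `commit_host` on each backend, a `hosts` list on the core object — but every signature here is a guess for illustration, not dagobah's actual code:

```python
# Assumed skeleton: class and method names follow the suggestion in the
# comment above, but the details are illustrative, not dagobah's code.

class BaseBackend(object):
    """Stands in for backends/base.py."""

    def commit_host(self, host_json):
        raise NotImplementedError

    def delete_host(self, host_id):
        raise NotImplementedError

class SQLiteBackend(BaseBackend):
    """Stands in for backends/sqlite.py; would write to dagobah_host."""

    def __init__(self):
        self._table = {}      # host id -> row dict
        self._next_id = 1

    def commit_host(self, host_json):
        host_id = host_json.get('id') or self._next_id
        self._next_id = max(self._next_id, host_id + 1)
        self._table[host_id] = dict(host_json, id=host_id)
        return host_id

    def delete_host(self, host_id):
        self._table.pop(host_id, None)

class Host(object):
    """Would live in components.py, alongside Job."""

    def __init__(self, parent, name, username):
        self.parent = parent          # the core Dagobah object
        self.name = name
        self.username = username
        self.id = None

    def commit(self):
        # Persist through whichever backend the parent was built with.
        self.id = self.parent.backend.commit_host(
            {'id': self.id, 'name': self.name, 'username': self.username})

class Dagobah(object):
    """Core object keeping a list of Hosts, like its list of Jobs."""

    def __init__(self, backend):
        self.backend = backend
        self.hosts = []

    def add_host(self, name, username):
        host = Host(self, name, username)
        host.commit()
        self.hosts.append(host)
        return host

core = Dagobah(SQLiteBackend())
host = core.add_host('worker1.example.com', 'deploy')
```

The design point is that `Host` never knows which backend it is talking to; it reaches persistence only through its parent, exactly the way the existing Jobs do.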
Can you elaborate on what you meant by a parallel job object?
You're tackling a pretty big feature here and getting yourself into pretty much the entire code base, so let me know if you'd like me to help with anything. Thanks again.