This is the main issue that collects together all of the individual todo points we would like to accomplish. This issue and the Developer Package + Paper Revision Milestone go together, and this issue should be treated as a long form (markdown enabled) description of that milestone.
I have grouped together Paper Revisions and the Dev Package because the Dev Package is a crucial step to increase the accessibility of the NeuroCAAS Development workflow, and I'd like to have it streamlined before drawing lots of attention to our project again.
Topics
I have broken down the work items we will address into several topics:
Developer Package
Developer Package 1: Dockerization: Since submitting the paper, it has become clear that Docker would make the developer process much smoother. In particular, instead of developers spending most of their time configuring and saving an AWS instance that we host, they can develop and test a Docker image locally that is compatible with NeuroCAAS, and notify us when it is ready to be deployed. This process has the following workflow:
Package methods to set up/develop docker container to be compatible with NeuroCAAS (Docker Prototype #22)
Set up local testing and logging to mock what user would see in S3 bucket (Docker Prototype #22)
With dockerized analyses, it's more feasible to address some more reviewer comments like:
How would you handle custom preprocessing? (develop locally, run locally or set up PR to NeuroCAAS)
What about a local/cluster implementation (we don't have plans to do it, but it's more feasible now).
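The local testing and logging step above could be mocked with a small harness like the sketch below. The directory layout (`inputs/`, `results/`, `logs/`, `status.txt`) is an illustrative assumption, not the actual NeuroCAAS bucket schema:

```python
import json
import tempfile
from pathlib import Path

# Hypothetical prefixes mirroring what a user would see in the S3 bucket;
# the names are placeholders, not the real NeuroCAAS layout.
MOCK_PREFIXES = ["inputs", "configs", "results", "logs"]

def make_mock_bucket(root: Path) -> Path:
    """Create a local directory tree standing in for the job's S3 prefix."""
    for prefix in MOCK_PREFIXES:
        (root / prefix).mkdir(parents=True, exist_ok=True)
    return root

def log_status(root: Path, message: str) -> None:
    """Append a status line, mimicking the log file a user would poll in S3."""
    with open(root / "logs" / "status.txt", "a") as f:
        f.write(message + "\n")

def run_mock_job(root: Path) -> Path:
    """Pretend to run an analysis: read inputs, write a result, log progress."""
    log_status(root, "job started")
    inputs = sorted(p.name for p in (root / "inputs").iterdir())
    out = root / "results" / "output.json"
    out.write_text(json.dumps({"processed": inputs}))
    log_status(root, "job finished")
    return out

if __name__ == "__main__":
    root = make_mock_bucket(Path(tempfile.mkdtemp()))
    (root / "inputs" / "data.bin").write_bytes(b"\x00" * 16)
    print(run_mock_job(root).read_text())
```

A developer could point their container's input/output paths at a tree like this and diff the `results/` and `logs/` contents against what a real deployment would produce.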
Developer Package 2: Analysis Monitoring/Update: We currently have data about the usage of each analysis, the number of users, the number of active jobs, and total per-user usage, which we use to monitor costs and restrict usage as necessary. It would be good to process this information and make it available to developers through a simple interface. Likewise, as developers iterate on their analyses, we want to make updating easy. Most of the time this should be possible by simply updating the Docker image their analysis lives in and submitting another pull request.
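One minimal shape that developer-facing interface could take is a pair of aggregations over the job records we already collect. The record fields below (`user`, `analysis`, `duration_hrs`) are invented for illustration, not the real schema:

```python
from collections import defaultdict

# Hypothetical usage records; field names are assumptions for this sketch.
records = [
    {"user": "lab-a", "analysis": "caiman", "duration_hrs": 1.5},
    {"user": "lab-a", "analysis": "dlc", "duration_hrs": 0.5},
    {"user": "lab-b", "analysis": "caiman", "duration_hrs": 3.0},
]

def per_user_usage(records):
    """Total compute hours per user, for cost monitoring and usage caps."""
    totals = defaultdict(float)
    for r in records:
        totals[r["user"]] += r["duration_hrs"]
    return dict(totals)

def per_analysis_jobs(records):
    """Job counts per analysis: the kind of summary a developer would see."""
    counts = defaultdict(int)
    for r in records:
        counts[r["analysis"]] += 1
    return dict(counts)

print(per_user_usage(records))    # {'lab-a': 2.0, 'lab-b': 3.0}
print(per_analysis_jobs(records))  # {'caiman': 2, 'dlc': 1}
```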
Paper Revisions
Paper Revisions 1: Developer Perspective: One comment we heard from both reviewers was that we did not focus enough on the developer's perspective. To this end:
Come up with some good use cases that we can put into a figure (chaining together analyses, running two analyses on the same data), and add a corresponding section to results. (Paper Revisions: Developer Perspective #23)
Mention in text that users can put up their own preprocessing through our new dev workflow, or download a docker image that does the same processing locally. (Paper Revisions: Developer Perspective #23)
Feature the developer's workflow more prominently in Figure 2. (Figure 2 Revision #30)
Paper Revisions 2: Usage Metrics: We are now collecting usage metrics for NeuroCAAS via Google Analytics and through the records generated when users run analyses. These could be added to or replace the schematic currently included as Figure 3.
Quantify: 1) site visits, 2) the distribution of compute usage per user, and 3) the distribution of analysis duration per user. (Per-Analysis Developer Stats #32)
To make the point clearer, include quantifications of the difficulties involved with local installations (for example, run a CI-style operating-system and language matrix on the GitHub repos). (Add quantification of installation difficulties #29)
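The per-user distributions in points 2) and 3) above could be summarized with a few robust quantiles; the usage values below are placeholders, real numbers would come from the job records we already collect:

```python
import statistics

# Placeholder per-user compute usage in instance-hours (illustrative only).
usage_hours = [0.2, 0.5, 0.5, 1.0, 2.0, 2.5, 4.0, 8.0, 12.0, 40.0]

def summarize(values):
    """Median and quartiles: robust to the heavy tail typical of usage data."""
    q1, med, q3 = statistics.quantiles(values, n=4)
    return {"q1": q1, "median": med, "q3": q3, "max": max(values)}

print(summarize(usage_hours))
```

Reporting quartiles rather than means avoids a single heavy user dominating the figure.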
Paper Revisions 3: Grant Materials: Having put together material for grants gives us a lot of extra figures to work with.
Paper Revisions 4: Miscellaneous: We got a few comments on informal style and on the balance of content to motivation. Reviewers did not seem to like our three-part exposition of the contribution (which, to be fair, does read more like motivation). When addressing topics 1 and 2 here, let's focus on reducing motivation and adding more of our own quantifications and concrete examples.
Regarding use cases (chaining, running simultaneously), a schematic or just mentioning in the text might be fine. We don't want to get stuck in the weeds of the particular analysis comparison.
If we're going to make a schematic of use cases, or showcase an example, it would be good to choose something high throughput, where the benefits of NeuroCAAS really come into play.
Paper Revision 2:
There are two points here: we need to emphasize the ease of using NeuroCAAS (perhaps by including a video link, for example), as well as the difficulty of doing things locally.
Add a paragraph and/or adjust Figure 3 to describe installation difficulties. Consider again an example case: LocaNMF and PMD perform better but require code compilation, which will fail on a different operating system unless care is taken. It would be
Add a video link.
Talk to DLC users? How are people actually using the other proposed solutions?
Encourage people to use the local implementation and look inside the docker containers. Maybe this is also a good way to figure out all of the options that they would want to set, without implementing a new GUI wholesale.
Add a section that mentions all of the new analyses we can cover now.
One axis of the landscape figure should be (job scale * computing expertise): these two elements are definitely related.