Swipes for Science #51

akeshavan opened this Issue Sep 9, 2018 · 14 comments

akeshavan commented Sep 9, 2018

Project Lead: @akeshavan

Mentor: @derekhoward

Welcome to OL6, Cohort D! This issue will be used to track your project and progress during the program. Please use this checklist over the next few weeks as you start Open Leadership Training 🎉.


Before Week 1 (Sept 12): Your first mentorship call

  • Complete the OLF self-assessment (online, printable). If you're a group, each teammate should complete this assessment individually. This is here to help you set your own personal goals during the program. No need to share your results, but be ready to share your thoughts with your mentor.
  • Make sure you know when and how you'll be meeting with your mentor.

Before Week 2 (Sept 19): First Cohort Call (Open by Design)

Before Week 3 (Sept 26): Mentorship call

  • Look up two other projects and comment on their issues with feedback on their vision statement.
  • Complete your Open Canvas (instructions, canvas). Comment on this issue with a link to your canvas.
  • Start your Roadmap. Comment on this issue with your draft Roadmap.

Before Week 4 (Oct 3): Cohort Call (Build for Understanding)

  • Look up two other projects and comment on their issues with feedback on their open canvas.
  • Pick an open license for the work you're doing during the program.
  • Use your canvas to start writing a README, or landing page, for your project. Link to your README in a comment on this issue.

This issue is here to help you keep track of work during the first month of the program. Please refer to the OL6 Syllabus for more detailed weekly notes and assignments past week 4.

@acabunoc added this to the Cohort D milestone Sep 13, 2018


akeshavan commented Sep 18, 2018

[DRAFT] Vision or Mission Statement

I’m working with biomedical researchers to annotate their large datasets by engaging citizen scientists so that researchers and citizen scientists working together can accelerate scientific discovery.

I’m working openly because I believe that everyone can be a part of the scientific process, regardless of their educational background.

mekline commented Sep 18, 2018

I'd love to hear more about this! What kinds of datasets and annotations are you working with?

akeshavan commented Sep 18, 2018

Many! Primarily brain imaging research. I'll list some apps below:

  1. braindr.us -- this project studies pediatric mental health by analyzing thousands of brain MRI images. Kids move a lot in the MRI scanner, which results in poor-quality images. So braindr is like a Tinder app for brains -- swipe left to fail a bad-quality image! You can read about the whole experiment at http://results.braindr.us

  2. braindrles.us -- this is a collaboration with researchers at USC who are studying stroke lesions. They ran automated stroke segmentation on hundreds of brain images, and need to know which algorithm did the best job. So here, you swipe left if you see a poor-quality stroke segmentation.

  3. appstract.pub -- this is a text annotation application for scientific abstracts. I wanted to know the distribution of sample sizes in neuroimaging studies of autism, but no database indexes this information! So in appstract, you tap on the highlighted numbers that describe the sample size from abstract text. I plan on expanding this so that you can annotate info other than numbers.

  4. whaledr -- this is an app where you listen to a 5-second sound clip from the ocean (and see its spectrogram), and swipe right if you hear a whale. It's still very much a work in progress, and we are still working on creating a good tutorial! But you can check out the "WhaleChats" section to hear example sound clips :)

If you have any similar annotation needs for your research, we should chat! I want to work on making these apps more configurable/extensible so that anyone can use them.

Wentale commented Sep 20, 2018

Hi Anisha, nice to meet you at the cohort meeting this week. I like the vision statement: it keeps things broad enough while still including the key elements of your project. Also, thanks to whaledr for giving me the idea for the cohort name Whale Song!

akeshavan commented Sep 21, 2018

Thanks @Wentale ! I voted for Whale Song 👍

nerantzis commented Sep 24, 2018

I do agree. Every citizen and every student can participate in science!

kmahelona commented Sep 27, 2018

Yay citizen science! There are all sorts of challenges when crowdsourcing data annotation. We're doing this for quality control of te reo Māori readings, and because it's an endangered language we often struggle with questions like what counts as "correct" pronunciation. I'd be keen to learn about your challenges of engaging with citizen scientists and how you navigate them.

akeshavan commented Sep 28, 2018

@kmahelona how cool! I've only just started working with citizen science, so I'm still new to the challenges. In my proof of concept app (braindr.us) we found a way to remove bad-quality annotations with a machine learning algorithm -- you can read more about it at results.braindr.us. Basically, we had a few answers we knew were correct, and we weighted people's responses based on this ground truth.
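A minimal sketch of that weighting idea (hypothetical function and variable names -- this is not the actual braindr code, which used a more sophisticated machine learning approach, just an illustration of weighting raters by their accuracy on known-answer items):

```python
from collections import defaultdict

def rater_weights(gold, votes):
    """Weight each rater by accuracy on items with known answers.

    gold:  {item_id: true_label}  -- the small ground-truth set
    votes: [(rater, item_id, label), ...]  -- all swipes, label 1 = pass, 0 = fail
    """
    correct = defaultdict(int)
    seen = defaultdict(int)
    for rater, item, label in votes:
        if item in gold:
            seen[rater] += 1
            correct[rater] += int(label == gold[item])
    # Raters who never saw a gold item get a neutral weight of 0.5.
    return {r: (correct[r] / seen[r]) if seen[r] else 0.5
            for r in {v[0] for v in votes}}

def weighted_label(item, votes, weights):
    """Aggregate pass/fail swipes for one item as a weighted vote in [0, 1]."""
    score, total = 0.0, 0.0
    for rater, it, label in votes:
        if it == item:
            score += weights[rater] * label
            total += weights[rater]
    return score / total if total else None
```

So a rater who agrees with the ground truth most of the time counts for more in the aggregate score of the unlabeled images.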

p.s. Your project (#68) sounds awesome!

kmahelona commented Oct 2, 2018

Re: Open Canvas -- the problem/solution is nice and to the point. I feel like the contributor profile should include some info on the types of people we think would be good for our project (e.g. the obvious one is "likes to work on open source projects", or for us, we'd love to have other indigenous developers involved in #68).

akeshavan commented Oct 3, 2018

Thanks @kmahelona , I've updated my Open Canvas to be more specific on the contributor profile.

Also, here is my roadmap: SwipesForScience/SwipesForScience#16

mlbonatelli commented Oct 3, 2018

Here is my Open Canvas: https://docs.google.com/presentation/d/1ms4z2pXf3zPzrfGwQS876jgwklOYmYR6UxWee5J5l1c/edit?usp=sharing

First of all, I loved the name! ❤️ And regarding your Unique Value Proposition, I think it's great -- it sounds fun and entertaining! But annotating data is something that needs to be done carefully. Are you concerned about that?

akeshavan commented Oct 3, 2018

@mlbonatelli thanks! Yes, crowdsourcing annotations is hard. The first thing we need to do is make sure, as scientists, that we are effectively explaining our data and how to annotate it to a general audience. In Swipes for Science, I want to have a template that scientists fill in with text and images and that displays this information nicely. Another thing we can do is use machine learning to aggregate annotations by different users in a smart way -- we did this in the braindr.us project (you can read more about it at results.braindr.us)
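That scientist-facing template could look something like the following sketch (a guess at a plausible shape, not the real SwipesForScience schema; all field names are illustrative):

```python
# Hypothetical per-project template a scientist would fill in with
# text and example images, plus a small sanity check on its shape.
project = {
    "title": "braindr",
    "question": "Is this brain image good quality?",
    "tutorial": [
        {"text": "Swipe right if the image is sharp.", "image": "good_example.png"},
        {"text": "Swipe left if you see motion blur.", "image": "bad_example.png"},
    ],
}

def validate(cfg):
    """Check that a template has the fields the app would need to render it."""
    required = {"title", "question", "tutorial"}
    missing = required - cfg.keys()
    if missing:
        raise ValueError(f"template is missing: {sorted(missing)}")
    for step in cfg["tutorial"]:
        if "text" not in step:
            raise ValueError("each tutorial step needs explanatory text")
    return True
```

The point of a fixed template is that the app can render any project's tutorial the same way, while forcing the scientist to spell out the task for a general audience.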

mlbonatelli commented Oct 5, 2018

> @mlbonatelli thanks! Yes, crowdsourcing annotations is hard. The first thing we need to do is make sure, as scientists, that we are effectively explaining our data and how to annotate it to a general audience. In Swipes for Science, I want to have a template that scientists fill in with text and images and that displays this information nicely. Another thing we can do is use machine learning to aggregate annotations by different users in a smart way -- we did this in the braindr.us project (you can read more about it at results.braindr.us)

Really nice initiative @akeshavan ! If I can give another suggestion: after the scientists have finished the template, maybe run a trial version with other scientists and people from the general public, to see if they really understood the information?
