Hackathon Guidelines

COVID-19 Data Science Hackathon Contest Evaluation Guidelines

We encourage teams to select a topic from the list of recommended topics, although teams can also design their own topic and submit it by the posted deadline. All COVID-19 Data Science Hackathon topics should be rooted in a recently published paper with the potential to impact the prevention, diagnosis, treatment, or research of COVID-19 caused by SARS-CoV-2. A panel of data science experts (“judges”) will be asked to evaluate and help refine the submitted topics.

In the end, a successful hackathon team will be expected to develop answers to a particular COVID-19 hypothesis/problem using data science approaches, and will be judged against these evaluation criteria:

  • Does it involve a highly interesting hypothesis, phrased as a question? All good topics should be firmly grounded in at least one recent COVID-19 paper published in the biomedical literature.
  • Does it have the potential to advance public biomedical knowledge/understanding of COVID-19? The advancement may be defined broadly, from epidemiology to treatment, including basic, translational, clinical, and population impacts.
  • Does it involve the innovative use of data science tools from GitHub/GitLab? At least one of the team members should be skilled in deploying the software tools and designated as a technical contact in the team’s roster before the Hackathon.
  • Does it leverage existing publicly available data sets, particularly those on the U-BRITE platform? If additional data sets are needed beyond what is already available (see COVID-19 Featured Data), the team should communicate ahead of time with the Hackathon IT Architect (Jelai Wang), who will assess whether we can bring them to U-BRITE before the Hackathon (some of these data require non-trivial effort to stage).
  • Is the project feasible to complete with 50-80 hours of total team effort within the 2-day Hackathon period? The team should stay focused on collecting key evidence to address the hypothesis, with the understanding that future work may be needed to resolve all open questions.
  • Does the informatics approach adopted demonstrate rigor? Is the sample set too small or biased? What statistical techniques are used to quantify the biases or validity of conclusions made?

We will not evaluate hackathon projects based on the following factors:

  • Experimental validation of findings. In silico validation, using cross-validation, statistics, independent literature searches, or unbiased separate data sets, is sufficient.
  • Comprehensiveness of the problem examined. Due to time constraints, preliminary conclusions that help answer the question are adequate.
  • Publication-readiness. We understand that reaching the point of publishing findings requires additional effort, especially testing multiple alternative hypotheses.
Mentors will:

  • Provide scientific guidance on the overall project design, the data to be used, and the choice of analytical tools needed to complete the project on time
  • Facilitate constructive and critical discussions among the teams to improve the overall rigor and reproducibility of the approach
  • Coordinate tasks and different roles to be performed by different team members
  • Help overcome obstacles as they arise before and during the Hackathon
  • Help the team prepare a final presentation to compete in the contest
Teams will:

  • Form a team at least one week before the Hackathon and send us a list of team members
  • Discuss and pick a topic for the Hackathon
  • Prepare for the Hackathon by installing necessary software, attending training sessions, ensuring readiness, and working with organizers to bring in necessary public data sets or software tools into U-BRITE beforehand.
  • Divide up roles: data cleaning, programming, informatics approach design, data analysis, technical writing, etc.
  • Develop your solution during the Hackathon
  • Prepare and deliver a presentation about your work (see below). Start preparing after the Hackathon is complete so you can focus on your project during the two days of the Hackathon. Presentations will be recorded.
  • Attend presentations as assigned by organizers, either live via Zoom or by watching the recording.
Judges will:

  • Judges may also serve as mentors but will need to recuse themselves from evaluating teams they are mentoring.
  • Evaluate presentations based on scientific rigor, innovation, and presentation quality
  • Submit evaluations to organizers
Presentation guidelines:

  • Each team will have 10 minutes (fewer than ten slides) to report the problem they addressed, their approach, and their conclusions.
  • Prepare your presentation after (not during) the Hackathon
  • Team presentations will be held via Zoom on Friday, June 19, at 10:15 am.
  • Judges will rate the team based on the criteria specified above.
  • All code, results, and documentation must be deposited in U-BRITE. Participants agree to make their work open-source and openly accessible to U-BRITE users.

Reach out if you have questions.

We look forward to an innovative, productive, and fun hackathon!

Thanks to all mentors, judges, and participants!