Evaluation Manual

Guidelines and criteria for evaluating hackathons and code sprints conducted globally by developer-friendly companies

These are the current evaluation parameters that we use at HackerEarth.

Please feel free to fork this repo and add more aspects that help evaluate hackathons, security/CTF, cloud orchestration, and code sprint events across dimensions such as innovation, code, design, and user experience.

Four core parameters:

  • Idea
  • Code
  • Design
  • Overall Usability

Note: Each aspect is weighted 10 by default, but the weights can be rebalanced and changed as the event requires.
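For judges who want to automate the math, here is a minimal sketch of how per-parameter scores could be combined under such weights. The parameter keys, the 0–10 scale, and the aggregation method are assumptions for illustration, not part of any official HackerEarth tooling:

```python
# Minimal sketch of weighted scoring across the four core parameters.
# The 0-10 scale and the aggregation method are assumptions for
# illustration; they are not an official HackerEarth tool.

DEFAULT_WEIGHTS = {"idea": 10, "code": 10, "design": 10, "usability": 10}

def weighted_score(scores, weights=DEFAULT_WEIGHTS):
    """Combine per-parameter scores (each 0-10) into one 0-10 result."""
    total = sum(weights.values())
    return sum(scores[param] * w for param, w in weights.items()) / total

# Example: a design-focused code sprint might double the design weight.
design_heavy = {**DEFAULT_WEIGHTS, "design": 20}
print(weighted_score({"idea": 8, "code": 6, "design": 9, "usability": 7},
                     design_heavy))  # -> 7.8
```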

Idea/Conceptualization (Important for hackathons)

  • Creativity and innovation
  • Viability of the idea and its overall effectiveness in the relevant domain
  • Whether a solution to the proposed idea already exists
  • Problem-solution fit, business model, impact

Design/Wireframe/Blueprints

  • Elaborate, self-explanatory design
  • Interactivity and creativity from a design perspective
  • Subtle patterns / textures / elegant design / icons / typography
  • Use of current (2016) tech such as React, electron.atom.io, AngularJS, Material Design, flat icons, etc.
  • Use of UI/UX/CSS3 and HTML5 features in the case of web/mobile development

Code/Scalability

  • Coding standards; clean, well-maintained code; external libraries
  • Algorithms, web APIs, creativity, logic, frameworks, network communication
  • Use of version control, a demo URL, and deployment tools that ease development operations
  • Documentation, collaborative tools, and DevOps automation
  • See Data Science for the sub-parameters used to evaluate data science submissions

Overall Usability

  • Overall functionality of the submission
  • Sane labels/text/context/design that make sense to the user
  • Added effort/features/value in the overall work (video, presentation, etc.)
  • Staying true to the original innovation/idea/task
  • Fewer bugs

PS: This document was created for, and is used at, HackerEarth to evaluate hackathons and hiring challenges.

Note: To learn more about the criteria for 🛡 security events, please consider the security manual on top of the base evaluation criteria.

Love it? Say hi to me at ma yur (@) hackerearth (dot) com (no space).
