Revamped codebase of EvalAI Frontend


EvalAI is an open-source web application that helps researchers, students, and data scientists create, collaborate on, and participate in AI challenges organized around the globe.

In recent years, it has become increasingly difficult to compare an algorithm for a given task against existing approaches: comparisons suffer from minor differences in implementation, use of non-standard dataset splits, and inconsistent evaluation metrics. By providing a central leaderboard and submission interface, along with swift and robust backends based on map-reduce frameworks that speed up evaluation on the fly, EvalAI aims to make it easier for researchers to reproduce results from technical papers and perform reliable and accurate quantitative analyses.

A question we’re often asked is: Doesn’t Kaggle already do this? The central differences are:

  • Custom Evaluation Protocols and Phases: We have designed a versatile backend framework that supports user-defined evaluation metrics, multiple evaluation phases, and private as well as public leaderboards.

  • Faster Evaluation: The backend evaluation pipeline is engineered so that submissions can be evaluated in parallel across multiple cores on multiple machines via map-reduce frameworks, offering a significant performance boost over similar web-based AI-challenge platforms.

  • Portability: Since the platform is open source, users have the freedom to host challenges on their own private servers rather than having to depend on cloud services such as AWS, Azure, etc.

  • Easy Hosting: Hosting a challenge is streamlined: one can create a challenge on EvalAI through the intuitive UI (work in progress) or by uploading a zip configuration file.

  • Centralized Leaderboard: Whether challenge organizers host their challenge on EvalAI or on a forked version of it, they can send the results to the main EvalAI server. This helps build a centralized platform for keeping track of different challenges.
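To make the first point above concrete, a user-defined evaluation metric is conceptually just a function from predictions and ground-truth annotations to named leaderboard scores. The following TypeScript sketch is purely illustrative — the `Annotation` shape and `accuracy` metric are hypothetical, not EvalAI's actual backend API:

```typescript
// Hypothetical data shapes for a user-defined metric; EvalAI's real
// challenge configuration defines its own interfaces on the backend.
interface Annotation {
  imageId: string;
  label: string;
}

// A metric maps predictions and ground truth to named leaderboard scores.
type Metric = (predictions: Annotation[], groundTruth: Annotation[]) => { [name: string]: number };

const accuracy: Metric = (predictions, groundTruth) => {
  // Index the ground truth by image ID, then count matching predictions.
  const truth = new Map(groundTruth.map((a) => [a.imageId, a.label]));
  const correct = predictions.filter((p) => truth.get(p.imageId) === p.label).length;
  return { accuracy: correct / groundTruth.length };
};
```

Because a metric is just a function over submission data, a challenge organizer can plug in any scoring logic — per-phase variants included — without changes to the platform itself.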


Our ultimate goal is to build a centralized platform to host, participate in, and collaborate on AI challenges organized around the globe, and we hope to help benchmark progress in AI.

Performance comparison

Some background: the Visual Question Answering (VQA) Challenge 2016 was hosted on another platform, where evaluation took ~10 minutes on average. EvalAI hosted the VQA Challenge 2017, whose dataset is twice as large. Despite this, we've found that our parallelized backend takes only ~130 seconds to evaluate on the whole VQA 2.0 test set.
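As a rough back-of-the-envelope comparison — assuming the previous platform's evaluation time would have scaled linearly with dataset size, which is an assumption rather than a measurement:

```typescript
// Rough speedup estimate; linear scaling of the old platform is an assumption.
const oldSeconds = 10 * 60;              // ~10 minutes per evaluation on the previous platform
const scaledOldSeconds = oldSeconds * 2; // VQA 2017 dataset is twice as large
const evalaiSeconds = 130;               // EvalAI's parallelized backend on the full test set
const speedup = scaledOldSeconds / evalaiSeconds; // roughly 9x
console.log(`Estimated speedup: ${speedup.toFixed(1)}x`);
```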

Development setup

Use Docker Compose to run all the components of EvalAI-ngx together. The steps are:

  1. Get the source code onto your machine via git.

    git clone https://github.com/Cloud-CV/EvalAI-ngx.git evalai-ngx && cd evalai-ngx
  2. Build and run the Docker containers. This might take a while.

    docker-compose up
  3. That's it. Open http://localhost:8888 in your web browser. Three users are created by default, as listed below:

    SUPERUSER - username: admin, password: password
    HOST USER - username: host, password: password
    PARTICIPANT USER - username: participant, password: password

For deploying with Surge:

Surge automatically generates a deployment link whenever a pull request passes Travis CI.

For example, if the pull request number is 123 and it passes Travis CI, the deployment link can be found here:

Code scaffolding

Run ng generate component component-name to generate a new component. You can also use ng generate directive|pipe|service|class|guard|interface|enum|module.

Code Documentation

We are using Compodoc for documentation. The goal of this tool is to generate documentation for all the common APIs of the application: modules, components, injectables, routes, directives, pipes, and plain classes.

Compodoc supports these JSDoc tags.
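For instance, a class annotated with JSDoc tags that Compodoc can render might look like the following — the `ChallengePhase` class here is a hypothetical illustration, not actual EvalAI-ngx code:

```typescript
/**
 * Represents a single phase of a challenge.
 * (Hypothetical illustration of JSDoc tags that Compodoc renders.)
 */
class ChallengePhase {
  /**
   * @param name  Human-readable phase name.
   * @param isPublic  Whether the phase's leaderboard is publicly visible.
   */
  constructor(public name: string, public isPublic: boolean) {}

  /**
   * Builds a label suitable for display in the UI.
   * @returns The phase name, suffixed with its visibility.
   */
  displayLabel(): string {
    return `${this.name} (${this.isPublic ? "public" : "private"})`;
  }
}
```

Compodoc picks up these comments and tags when generating the project docs, so keeping them accurate on components, services, and pipes directly improves the generated reference.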


Please read our Contribution Guidelines. Also go through our detailed Code Structure Guide to make the most of the existing reusable features. Finally, follow the Pull Request Template when creating your pull request.

Building and Serving the documentation

Run the following command to build and serve the docs:

npm run doc:buildandserve

Open http://localhost:8080 in the browser to have a look at the generated docs.


Build

Run ng build to build the project. The build artifacts will be stored in the dist/ directory. Use the --prod flag for a production build.

Running unit tests

Run ng test to execute the unit tests via Karma.

Running end-to-end tests

Run ng e2e to execute the end-to-end tests via Protractor.

The Team

EvalAI-ngx is currently maintained by Shekhar Rajak, Mayank Lunayach, Shivani Prakash Gupta, Rishabh Jain and Deshraj Yadav.
