edX Analytics API Server

This repository includes the Django server for the API as well as the API package itself. The client is hosted at https://github.com/edx/edx-analytics-data-api-client.

License

The code in this repository is licensed under version 3 of the AGPL unless otherwise noted.

Please see LICENSE.txt for details.

Getting Started

  1. Install the requirements:

    $ make develop
    
  2. Set up the databases:

    $ ./manage.py migrate --noinput
    $ ./manage.py migrate --noinput --database=analytics
    

    The learner API endpoints require Elasticsearch with a mapping defined on this wiki page. The connection to Elasticsearch can be configured via the ELASTICSEARCH_LEARNERS_HOST and ELASTICSEARCH_LEARNERS_INDEX Django settings. For testing, you can install Elasticsearch locally:

    $ make test.install_elasticsearch
    

    To run the cluster for testing:

    $ make test.run_elasticsearch
    
  3. Create a user and authentication token. If the user does not already exist, it will be created.

    $ ./manage.py set_api_key <username> <token>
    
  4. Run the server:

    $ ./manage.py runserver
    

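Once the server is running, you can exercise the token from step 3. The sketch below uses only the standard library and assumes the API accepts DRF-style "Token" authorization; the endpoint path and token value are illustrative only (see the swagger docs at /docs/ for the real routes):

```python
# Sketch: building an authenticated request against the dev server.
# Assumptions: server from step 4 is on localhost:8000, DRF-style
# "Token" auth, and the endpoint path shown is only an example.
import urllib.request

def authed_request(url, token):
    """Build a request carrying the API token in the Authorization header."""
    return urllib.request.Request(
        url, headers={"Authorization": "Token " + token}
    )

req = authed_request(
    "http://localhost:8000/api/v0/courses/course-v1:edX+DemoX+Demo/activity/",
    "my-token",
)
print(req.get_header("Authorization"))  # Token my-token
```

Opening the same URL in a browser without the header should return an authentication error, which is a quick way to confirm the key is actually being checked.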
Loading Data

The fixtures directory contains demo data, and the generate_fake_enrollment_data management command can generate enrollment data. Run the command below to load/generate this data in the database.

$ make loaddata

Loading Video Data

The above command should work fine on its own, but you may see warnings about video ids:

WARNING:analyticsdataserver.clients:Course Blocks API failed to return
video ids (401). See README for instructions on how to authenticate the
API with your local LMS.

To generate video data, the API must be authenticated with your local LMS so that it can fetch the video ids for each course. Rather than adding a full OAuth client to the API for this one procedure, we piggyback on the Insights OAuth client by taking the OAuth token it generates and using it here.

  1. Start your local LMS server. (e.g. in devstack, run paver devstack --fast lms).

  2. If your local LMS server is running at an address other than the default of http://localhost:8000/ (you will likely not need to do this), add this setting to analyticsdataserver/settings/local.py with the correct URL:

    # Don't forget to add the trailing forward slash
    LMS_BASE_URL = 'http://example.com:8000/'
    
  3. Sign in to your local Insights server, making sure to use your local LMS for authentication. This generates a new OAuth access token if you do not already have an unexpired one.

    The user you sign in with must have staff access to the courses for which you want to generate video data.

  4. Visit your local LMS server's admin site (by default, this is at http://localhost:8000/admin).

  5. Sign in with a superuser account. Don't have one? Make one with this command in your devstack as the edxapp user:

    edxapp@precise64:~/edx-platform$ ./manage.py lms createsuperuser
    

    Enter a username and password that you will remember.

  6. On the admin site, find the "Oauth2" section and click the link "Access tokens". The breadcrumbs should show "Home > Oauth2 > Access tokens".

    Copy the string in the "Token" column for the first row in the table. Also, make sure the "User" of the first row is the same user that you signed in with in step 3.

  7. Paste the string as a new setting in analyticsdataserver/settings/local.py:

    COURSE_BLOCK_API_AUTH_TOKEN = '<paste access token here>'
    
  8. Run make loaddata again and ensure that you see the following log message in the output:

    INFO:analyticsdataserver.clients:Successfully authenticated with the
    Course Blocks API.
    
  9. Check that you now have video data in the API, either by querying the API in the swagger docs at /docs/#!/api/Videos_List_GET or by visiting the Insights engagement/videos/ page for a course.

Note: access tokens expire after one year, so you should only need to repeat the above steps once a year.
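For reference, the request the API server makes with this token looks roughly like the following. This is a sketch only: the real client lives in analyticsdataserver/clients.py, and the endpoint path, query parameters, and token value here are assumptions for illustration. The copied OAuth token is sent as a Bearer token:

```python
# Sketch of a Course Blocks API request using the copied OAuth token.
# The URL, parameters, and token are illustrative assumptions; the real
# client is in analyticsdataserver/clients.py.
import urllib.parse
import urllib.request

def course_blocks_request(lms_base_url, course_id, access_token):
    """Build a Bearer-authenticated request for a course's video blocks."""
    params = urllib.parse.urlencode({
        "course_id": course_id,
        "block_types_filter": "video",  # only video blocks are needed
    })
    return urllib.request.Request(
        lms_base_url + "api/courses/v1/blocks/?" + params,
        headers={"Authorization": "Bearer " + access_token},
    )

req = course_blocks_request(
    "http://localhost:8000/",  # trailing slash, as in LMS_BASE_URL
    "course-v1:edX+DemoX+Demo_Course",
    "example-access-token",
)
print(req.get_header("Authorization"))  # Bearer example-access-token
```

If the token is missing or expired, the LMS responds with a 401, which is what produces the "Course Blocks API failed to return video ids (401)" warning shown above.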

Running Tests

Run make validate to install the requirements, run the tests, and run lint.

How to Contribute

Contributions are very welcome, but for legal reasons, you must submit a signed individual contributor’s agreement before we can accept your contribution. See our CONTRIBUTING file for more information – it also contains guidelines for how to maintain high code quality, which will make your contribution more likely to be accepted.