We took a week's worth of commits from the GitHub Archive and determined where and when they were made. Countries are coloured by their commit count in each hour, relative to that country's maximum commits in any single hour.
Data processing was completed with Python and SQLite. The visualisation uses Datamaps with support from AngularJS.
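The colouring rule above (each hour divided by the country's own hourly maximum) can be sketched as a single SQL query. The table name, columns, and numbers below are invented for illustration and are not the project's actual schema.

```python
import sqlite3

# Hypothetical schema: one row per (country, hour) with a commit count.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE commits (country TEXT, hour INTEGER, n INTEGER)")
conn.executemany(
    "INSERT INTO commits VALUES (?, ?, ?)",
    [("GB", 9, 40), ("GB", 14, 80), ("US", 3, 10), ("US", 16, 50)],
)

# Colour intensity: each hour's count divided by that country's own
# hourly maximum, so every country peaks at 1.0 regardless of size.
rows = conn.execute("""
    SELECT c.country, c.hour,
           CAST(c.n AS REAL) / m.max_n AS intensity
    FROM commits c
    JOIN (SELECT country, MAX(n) AS max_n
          FROM commits GROUP BY country) m
      ON c.country = m.country
""").fetchall()

for country, hour, intensity in rows:
    print(country, hour, intensity)
```

Normalising per country rather than globally keeps small countries visible: a country with a handful of commits still reaches full intensity at its own busiest hour.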
This visualisation was created as part of the Third Annual GitHub Data Challenge because GitHub is awesome and so are public data sets. We encourage you to poke around the code and contribute!
Clone the repository and run the following commands:
npm install
npm start
You should then have the app served at http://localhost:8000/app/index.html
The quick-start instructions above use a pre-generated JSON file for their data. To regenerate this data you'll need to use the Python processing backend:
mkdir data
cd data
wget http://data.githubarchive.org/2014-07-{01..08}-{0..23}.json.gz
cd ..
python setup.py
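The processing pass over the downloaded archives might look roughly like the following. The event fields used here (`type`, `actor_attributes.location`, `created_at`) are assumptions based on the 2014 GitHub Archive timeline format, not a description of what setup.py actually does, so check them against a downloaded sample file.

```python
import glob
import gzip
import json
from collections import Counter

def count_push_events(event_lines):
    """Count PushEvents per (location, hour) from JSON-lines events.

    Field names are assumptions based on the 2014 GitHub Archive
    timeline format; verify against a sample file.
    """
    counts = Counter()
    for line in event_lines:
        event = json.loads(line)
        if event.get("type") != "PushEvent":
            continue
        location = (event.get("actor_attributes") or {}).get("location")
        hour = event.get("created_at", "")[11:13]  # "YYYY-MM-DDTHH:MM:SSZ"
        if location and hour:
            counts[(location, hour)] += 1
    return counts

# Run over every archive file downloaded into data/.
totals = Counter()
for path in glob.glob("data/*.json.gz"):
    with gzip.open(path, "rt", encoding="utf-8") as f:
        totals += count_push_events(f)
```

Locations in the archive are free-text user profiles ("London, UK", "the internet"), so a real pipeline also needs a geocoding step to map them onto countries.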
A version of the visualisation lives on the ChronoCommit project page at http://asgardenterprises.github.io/ChronoCommit/; this mirrors the content of the gh-pages branch.
To update or replace the current content:
- Gather all the client-side code. Currently, this is the contents of /app.
- Ensure all the client-side dependencies are present (e.g. bower install).
- Ensure an index.html file is available at the top level for GitHub Pages to digest.
- Push all the client-side content to the gh-pages branch.
- Wait up to 10 minutes for the changes to appear.
Pull requests work as normal, so updates from contributors without push privileges can still be integrated.