To run locally, you can either:
- Install and run from a virtual environment
- Run with docker compose (see below)
To install and run from a virtual environment, first create a Python 3.8+ virtualenv and activate it.
Install dependencies:
```
python3 -m pip install -r requirements/dev.txt
npm install
```
Alternatively, use the make task:
make install
Make a directory to store the project's data (`MEDIA_ROOT`, `DOC_BUILDS_ROOT`, etc.). We'll use `~/.djangoproject` for example purposes. Create a sub-directory named `conf` inside that directory:

```
mkdir -p ~/.djangoproject/conf
```

Create a `secrets.json` file in the `conf` directory, containing something like:

```json
{
  "secret_key": "xyz",
  "superfeedr_creds": ["any@email.com", "some_string"],
  "db_host": "localhost",
  "db_password": "secret",
  "trac_db_host": "localhost",
  "trac_db_password": "secret"
}
```
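For orientation, here's a rough sketch of how such a layout is typically consumed by a Django settings module. The variable names and the use of `DJANGOPROJECT_DATA_DIR` (exported in a later step) follow this README's examples; the project's actual settings code may differ:

```python
# Illustrative sketch only -- not the project's actual settings code.
# Shows how a settings module could locate the data directory and read
# the secrets.json file described above.
import json
import os
from pathlib import Path

# DJANGOPROJECT_DATA_DIR is the environment variable exported in a later step.
DATA_DIR = Path(
    os.environ.get("DJANGOPROJECT_DATA_DIR", "~/.djangoproject")
).expanduser()

with open(DATA_DIR / "conf" / "secrets.json") as handle:
    SECRETS = json.load(handle)

SECRET_KEY = SECRETS["secret_key"]  # consumed by Django's SECRET_KEY setting
```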
Add `export DJANGOPROJECT_DATA_DIR=~/.djangoproject` (without the backticks) to your `~/.bashrc` file (or `~/.zshrc` if you're using zsh, or `~/.bash_profile` if you're on macOS and using bash) and then run `source ~/.bashrc` (or `source ~/.zshrc`, or `source ~/.bash_profile`) to load the changes.
Create databases:

```
createuser -d djangoproject --superuser
createdb -O djangoproject djangoproject
createuser -d code.djangoproject --superuser
createdb -O code.djangoproject code.djangoproject
```
Setting up database access
If you are using the default postgres configuration, chances are you will have to set a password for the newly created users so that Django can use them:
```
psql
ALTER USER djangoproject WITH PASSWORD 'secret';
ALTER USER "code.djangoproject" WITH PASSWORD 'secret';
\d
```
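For orientation, here is a rough sketch of how these credentials typically end up in Django's `DATABASES` setting. This is illustrative only; the connection alias names are made up and the project's real settings are more involved:

```python
# Illustrative sketch only -- not the project's actual settings code.
# The passwords set via ALTER USER above must match the values stored
# in secrets.json, because the settings feed them into DATABASES.
SECRETS = {
    "db_host": "localhost",
    "db_password": "secret",
    "trac_db_host": "localhost",
    "trac_db_password": "secret",
}

DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": "djangoproject",
        "USER": "djangoproject",
        "HOST": SECRETS["db_host"],
        "PASSWORD": SECRETS["db_password"],
    },
    "trac": {  # alias name is illustrative
        "ENGINE": "django.db.backends.postgresql",
        "NAME": "code.djangoproject",
        "USER": "code.djangoproject",
        "HOST": SECRETS["trac_db_host"],
        "PASSWORD": SECRETS["trac_db_password"],
    },
}
```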
(Use the same passwords as the ones you've used in your `secrets.json` file.)

Create tables:
```
psql -d code.djangoproject < tracdb/trac.sql
python -m manage migrate
```
Create a superuser:
python -m manage createsuperuser
Populate the www and docs hostnames in the django.contrib.sites app:
python -m manage loaddata dev_sites
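To check what the fixture loaded, you can inspect the `django.contrib.sites` table from a Django shell (`python -m manage shell`); the exact domains depend on the fixture's contents:

```python
# Quick sanity check, run inside `python -m manage shell`.
# Lists the Site rows created by the dev_sites fixture.
from django.contrib.sites.models import Site

for site in Site.objects.all():
    print(site.pk, site.domain, site.name)
```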
For docs (next step requires `gettext`):

```
python -m manage loaddata doc_releases
python -m manage update_docs_and_index
```
For dashboard:
To load the latest dashboard categories and metrics:
python -m manage loaddata dashboard_production_metrics
Alternatively, to load a full set of sample data (takes a few minutes):
python -m manage loaddata dashboard_example_data
Finally, make sure the loaded metrics have at least one data point (this makes API calls to the URLs from the metrics objects loaded above and may take some time depending on the metrics chosen):
python -m manage update_metrics
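If you want to confirm that data points actually landed, you can count rows of the dashboard app's `Datum` model from a Django shell. This is just an optional check, and it assumes the model lives in `dashboard/models.py`, as the `dashboard.Datum` label used later in this README suggests:

```python
# Optional sanity check, run inside `python -m manage shell` after update_metrics.
# Assumes the Datum model (the dashboard's data points) lives in dashboard/models.py.
from dashboard.models import Datum

print(Datum.objects.count())  # should be greater than zero once metrics have data
```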
Compile the CSS (only the source SCSS files are stored in the repository):
make compile-scss
Finally, run the server:
make run
This runs the main site ("www") as well as the docs and dashboard sites in the same process. Open http://www.djangoproject.localhost:8000/, http://docs.djangoproject.localhost:8000/, or http://dashboard.djangoproject.localhost:8000/.
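Serving three hostnames from a single process works by dispatching each request to a different URLconf based on its `Host` header. As a minimal illustration of that idea in the style of the `django-hosts` package (a sketch only, not the project's actual configuration; the URLconf paths are placeholders):

```python
# hosts.py -- illustrative host-based routing in the style of django-hosts.
# The URLconf module paths below are placeholders, not the project's real ones.
from django_hosts import host, patterns

host_patterns = patterns(
    "",
    host(r"www", "djangoproject.urls.www", name="www"),
    host(r"docs", "docs.urls", name="docs"),
    host(r"dashboard", "dashboard.urls", name="dashboard"),
)
```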
We use GitHub Actions for continuous testing and GitHub pull request integration. If you're familiar with those systems, you should not have any problems writing tests.
Our test results can be found on the repository's GitHub Actions page.
For local development, don't hesitate to install tox to run the website's test suite. Then, in the root directory (next to the `manage.py` file), run:
tox
Behind the scenes, this will run the usual `python -m manage test` management command with a preset list of apps that we want to test, as well as flake8 for code quality checks. We collect test coverage data as part of that tox run; to show the result, simply run:
python -m coverage report
or, for an HTML-based report:
python -m coverage html
(Optional) In case you're using your own virtualenv, you can also run the tests manually using the `test` task of the `Makefile`. Don't forget to install the test requirements with the following command first, though:
python -m pip install -r requirements/tests.txt
Then run:
make test
or simply the usual test management command:
python -m manage test [list of app labels]
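If you're adding a test of your own, it's just a regular Django test case. A minimal, purely illustrative example (the URL and assertion are placeholders, not real project fixtures):

```python
# Purely illustrative example of the kind of test you might add.
from django.test import TestCase


class HomepageTests(TestCase):
    def test_homepage_returns_200(self):
        response = self.client.get("/")
        self.assertEqual(response.status_code, 200)
```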
The goal of the site is to target various levels of browsers, depending on their ability to handle the technologies the site uses, such as HTML5, CSS3, SVG, and webfonts.
We're following Mozilla's example when it comes to categorizing browser support.
- Desktop browsers, except as noted below, are A grade, meaning that everything needs to work.
- IE < 11 is not supported (based on Microsoft's support).
- Mobile browsers should be considered B grade. Mobile Safari, Firefox on Android, and the Android Browser should support the responsive styles as much as possible, but some degradation can't be prevented due to the limited screen size and other platform restrictions.
Static files such as CSS, JavaScript, or image files can be found in the `djangoproject/static` subdirectory.

Templates can be found in the `djangoproject/templates` subdirectory.
CSS is written in Scss and compiled via Libsass.
Run the following to compile the Scss files to CSS:
make compile-scss-debug
Alternatively, run the following command in a separate shell to continuously watch for changes to the Scss files and automatically compile them to CSS:
make watch-scss
Optionally, you can use a tool like Foreman to run all processes at once:
- the site (similar to www.djangoproject.com) on http://0.0.0.0:8000/, to be used with the modified /etc/hosts file (see above)
- the `make` task to automatically compile the SCSS files to CSS files
This is great during development. Assuming you're using Foreman, simply run:
foreman start
If you just want to run one of the processes defined above, use the `run` subcommand like so:
foreman run web
That'll just run the www server.
Check out the `Procfile` for all the process names.
This project uses Bower to manage JavaScript libraries.
At any time, you can run it to install a new library (e.g., `jquery-ui`):
npm run bower install jquery-ui --save
or check if there are newer versions of the libraries that we use:
npm run bower ls
If you need to update an existing library, the easiest way is to change the version requirement in `bower.json` and then to run `npm run bower install` again.
We commit the libraries to the repository, so if you add, update, or remove a library from `bower.json`, you will need to commit the changes in `djangoproject/static` too.
Running `python -m manage update_docs_and_index` to build all documents will also automatically index every document it builds in the search engine. If you've already built the documents and would like to rebuild the search index, run:
python -m manage update_index
This is also the right command to run when you work on the search feature itself. You can pass the `-d` option to try to drop the search index first before indexing all the documents.
The business logic for dashboard metrics is edited via the admin interface and contained in the models in the `dashboard` app (other than `Datum`, which contains the data itself). From time to time, those metrics should be extracted from a copy of the production database and saved to the `dashboard/fixtures/dashboard_production_metrics.json` file.
To update this file, run:
python -m manage dumpdata dashboard --exclude dashboard.Datum --indent=4 > dashboard_production_metrics.json
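If you want to double-check the dump before committing it, the fixture is ordinary Django serialized JSON (a list of objects, each tagged with a `model` label), so a small throwaway script like this one (illustrative, not part of the project) can confirm that no `dashboard.Datum` rows slipped in:

```python
# Illustrative throwaway check for the dumped fixture; not part of the project.
import json

with open("dashboard_production_metrics.json") as handle:
    objects = json.load(handle)

models = {obj["model"] for obj in objects}
print(f"{len(objects)} objects across models: {sorted(models)}")
assert "dashboard.datum" not in models, "Datum rows should have been excluded"
```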
We're using Transifex to help manage the translation process. The Transifex client app is required. To install it, run:
curl -o- https://raw.githubusercontent.com/transifex/cli/master/install.sh | bash
Before using the command-line Transifex client, create `~/.transifexrc` according to the instructions at https://docs.transifex.com/client/client-configuration. You'll need to be a member of the Django team in the Django organization at Transifex. For information on how to join, please see the Translations section of the documentation on contributing to and localizing Django.
Since this repo hosts three separate sites, our `.po` files are organized by website domain. At the moment, we have:

- `dashboard/locale/` contains the translation files for https://dashboard.djangoproject.com
- `docs/locale/` contains the translation files for https://docs.djangoproject.com (only for the strings in this repository; translation of the documentation itself is handled elsewhere)
- `locale/` contains the translation files for https://www.djangoproject.com (including strings from all apps other than `dashboard` and `docs`)
Important: To keep this working properly, note that any templates for the `dashboard` and `docs` apps must be placed in the `<app name>/templates/<app name>/` directory of the respective app, not in the `djangoproject/templates/` directory.
When there are changes to the messages in the code or templates, a member of the translations team will need to update Transifex as follows:
Regenerate the English (only) .po file:
python -m manage makemessages -l en
(Never update alternate language .po files using makemessages. We'll update the English file, upload it to Transifex, then later pull the .po files with translations down from Transifex.)
Push the updated source file to Transifex:
tx push -s
Commit and push the changes to GitHub:
```
git commit -m "Updated messages" locale/en/LC_MESSAGES/*
git push
```
Anytime translations on Transifex have been updated, someone should update our translation files as follows:
Review the translations in Transifex and add any new languages that have reached 100% translation to the space-delimited `LANGUAGES` list in `update-translations.sh`.

Pull the updated translation files:
./update-translations.sh
Use `git diff` to see if any translations have actually changed. If not, you can just revert the `.po` file changes and stop here.

Compile the messages:
python -m manage compilemessages
Run the test suite one more time:
python -m manage test
Commit and push the changes to GitHub:
```
git commit -m "Updated translations" locale/*/LC_MESSAGES/*
git push
```
Build the images:
docker-compose build
Spin up the containers:
docker-compose up
- View the site at http://localhost:8000/
Run the tests:
```
docker-compose exec web tox
docker-compose exec web python -m manage test
```
pre-commit is a framework for managing pre-commit hooks. These hooks help to identify simple issues before committing code for review. Checking for these issues before code review allows the reviewer to focus on the change itself, and it can also help to reduce the number of CI runs.
To use the tool, first install `pre-commit` and then the git hooks:
$ python3 -m pip install pre-commit
$ python3 -m pre_commit install
On the first commit, `pre-commit` will install the hooks; these are installed in their own environments and will take a short while to install on the first run. Subsequent checks will be significantly faster. If an error is found, an appropriate error message will be displayed. If the error was with `isort`, the tool will go ahead and fix it for you. Review the changes and re-stage them for commit if you are happy with them.