Contributing to gcloudc

Django Gcloud Connectors is actively developed and maintained, so if you're thinking of contributing to the codebase, here is how to get started.

Get started with development

  1. First off, head to our GitLab page and fork the repository so you have your own copy of it.
  2. Clone your fork locally to start setting up your development environment.
  3. Run all the tests to make sure your local version is working (see the instructions in README.md).
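The first two steps can be sketched as shell commands; the fork URL below is a placeholder (the exact host and path depend on where the repository lives), so substitute your own GitLab username:

```shell
# Clone your fork locally (hypothetical URL -- replace <your-username>
# with your GitLab username and adjust the path to match your fork).
git clone git@gitlab.com:<your-username>/gcloudc.git
cd gcloudc
```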

Pick an issue & send a merge request

If you've spotted a bug that you want to fix, it's a good idea to start by opening an issue. This allows us to verify that the issue is valid and to suggest ideas for fixing it, so none of your time is wasted.

For help with creating the merge request, check out the GitLab documentation.

Code style

Code style should follow PEP 8, with a relaxed maximum line length of 100 characters.

Need help?

Reach out to us on the djangae-users mailing list.

Merge request requirements

For a merge request to be merged, the following requirements should be met:

  • Tests covering new or changed code are added or updated
  • Relevant documentation is updated or added
  • A line item is added to CHANGELOG.md, unless the change is trivial

Running tests

When setting up for the first time, you'll need to:

  • Install the gcloud datastore emulator: gcloud components install cloud-datastore-emulator
    • If you don't have gcloud (the Google Cloud SDK) installed, installation instructions can be found here
    • Note that the Datastore emulator requires the Java JRE.
  • Create a Python 3 virtualenv: python3 -m venv .venv && source ./.venv/bin/activate
  • Install the prerequisites: pip3 install tox
  • Create test output directory: mkdir .reports
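The setup steps above can be run in one go (this assumes gcloud, the Google Cloud SDK, is already installed):

```shell
# Install the Datastore emulator (requires the Java JRE).
gcloud components install cloud-datastore-emulator

# Create and activate a Python 3 virtualenv.
python3 -m venv .venv && source ./.venv/bin/activate

# Install tox and create the test output directory.
pip3 install tox
mkdir .reports
```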

Then you can run:

$ tox

Under the hood, tox runs ./manage.py test. To pass arguments down to this command, simply separate them with a double hyphen, e.g.:

tox -e py37 -- --failfast

Running tests with real Datastore

It is sometimes useful to be able to run tests against a real instance of the Datastore instead of the emulator.

You need to create your own GCP project and then set the GCLOUDC_PROJECT_ID environment variable. Two GCP projects have been set up for Potato employees:

  • gclouc-test-optimistic for the Datastore running in OPTIMISTIC concurrency mode.
  • gclouc-test-pessimistic for the Datastore running in PESSIMISTIC concurrency mode.

Make sure you have configured the application default credentials by running:

gcloud auth application-default login

You can then run tox passing the --use-remote-datastore option like this:

$ tox -e py39-22 -- --use-remote-datastore
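Putting the pieces together, a full remote-Datastore test run might look like this (the project ID and tox environment are taken from the examples above):

```shell
# Authenticate, point the tests at a real Datastore project,
# then run the test suite against it instead of the emulator.
gcloud auth application-default login
export GCLOUDC_PROJECT_ID="gclouc-test-optimistic"
tox -e py39-22 -- --use-remote-datastore
```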

Index management

Updating during development

When running the tests locally, we use the Google Cloud Datastore emulator as the database. When a query that requires a composite index is run, the emulator detects it and, if it's not already present, adds it to the .index.yaml file, below the # AUTOGENERATED row.

You'll notice this because the file will appear to have changes in git. Unfortunately, all the automatically-added lines are discarded every time the emulator starts. If you notice that new indexes have been added, move them above the # AUTOGENERATED line and save the file. This ensures they won't be deleted automatically.
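For illustration, a .index.yaml laid out this way might look like the following (the index definitions themselves are hypothetical, not taken from the real file):

```yaml
indexes:

# Indexes above the AUTOGENERATED marker are kept permanently.
- kind: TestUser
  properties:
  - name: username
  - name: email

# AUTOGENERATED
# Indexes below this line were added by the emulator and will be
# discarded the next time it starts -- move them up to keep them.
- kind: TestUser
  properties:
  - name: email
  - name: last_login
    direction: desc
```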

Deploying indexes

If new indexes have been added since the last deployment and you want to run tests against a real instance of the Datastore, they need to be deployed first:

gcloud app deploy --project=<your-project> .index.yaml

Release process

Releases to PyPI are managed by GitLab CI. To create a new release, create the relevant tag and push it to the GitLab remote. But first you should do some version fiddling...

1. Update the version in setup.py to the new version by removing the 'a' suffix (most likely)
2. Commit this change
3. Run `git tag -a X.Y.Z -m "Some description"`
4. Run `git push origin master && git push --tags`
5. Open setup.py again, bump to the *next* release version, and use an 'a' suffix
6. Run `git commit -am "Bump to alpha version" && git push origin master`
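As a concrete sketch of the steps above, a release might look like this (1.2.3 and the follow-up 1.2.4a0 are made-up version numbers):

```shell
# 1-2. Edit setup.py: version="1.2.3a0" -> version="1.2.3", then commit.
git commit -am "Release 1.2.3"

# 3-4. Tag the release and push the branch and tags.
git tag -a 1.2.3 -m "Release 1.2.3"
git push origin master && git push --tags

# 5-6. Edit setup.py again: version="1.2.4a0", then commit and push.
git commit -am "Bump to alpha version" && git push origin master
```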

This will trigger a pipeline that publishes the package to test.pypi.org. If that is successful, you can then manually trigger the publish to prod pypi job on the same pipeline to deploy to the official PyPI registry.