
This article provides in-depth information about the STACK and how to go about testing and upgrading the STACK for your code services. It covers concepts related to test environments, services, the STACK, jobs, and the build/deployment process. By the end of this article you should have a good understanding of how to safely test a STACK upgrade. Before we jump in, let's talk about some of these important concepts.


What is a STACK?

From a high-level, an organization is a group of users and environments. A Datica environment is an encrypted network space where services will be deployed. A service is a specification and can be one of many types (code, database, cache, etc). The executing instance of a service specification is called a job. A job can be one of many types (web, worker, cron, console, etc). A single service specification can be used to execute many jobs of many job types. A single job is a containerized (Docker) group of processes; an isolated runtime environment defined by its service specification at the time of deployment.

For instance, a PostgreSQL service is a specification of how to configure and run PostgreSQL inside a Datica environment while the job is the running PostgreSQL server. Datica manages the runtime environments of most service types, the only exception being code services.

The STACK is how you configure a code service such that its jobs' runtime environment will use a specific version of Ubuntu. Since building and executing an application is highly dependent on the system libraries available to it, this needs to be configurable by users of CPaaS: it allows developers to take advantage of the latest security improvements and features offered by the OS on their own schedule. The following STACKs are available for code services and map to the corresponding versions of Ubuntu.

  • cedar-14 -> Ubuntu 14.04 (out of support)
  • datica-16 -> Ubuntu 16.04 (supported)
  • datica-18 -> Ubuntu 18.04 (supported)
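The mapping above can be sketched as a small shell helper. This is illustrative only; the STACK names and Ubuntu versions come from the list above, while the function itself is hypothetical:

```shell
# Map a STACK name to the Ubuntu release its jobs will run on.
stack_to_ubuntu() {
  case "$1" in
    cedar-14)  echo "14.04" ;;
    datica-16) echo "16.04" ;;
    datica-18) echo "18.04" ;;
    *)         echo "unknown STACK: $1" >&2; return 1 ;;
  esac
}

stack_to_ubuntu datica-16   # prints 16.04
```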

Let's walk through the steps of how to upgrade a STACK.


IMPORTANT: test STACK upgrades in a non-prod environment

It's very important to test every code service STACK upgrade in a non-production environment before performing the upgrade in your production environment. Since you are changing the OS version, many system libraries will change. You might find that libraries are missing, that their functionality has changed, or that newer library versions are incompatible with modules your application currently uses on the older OS. Changing those modules can in turn affect the code that uses them. If needed, you can install required libraries using buildpack hooks or upgrade application module versions using your application framework's module manager. You should also run all acceptance tests before upgrading a service in a production environment.
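As one example of the "install required libraries" route: with Heroku-style buildpacks (which Datica implements), a common pattern is the apt buildpack, which installs the Ubuntu packages listed in an Aptfile at build time. Whether the apt buildpack is enabled for your service is an assumption here, and the package names below are placeholders for whatever your build reports missing:

```text
# Aptfile — one Ubuntu package name per line (hypothetical examples)
libxml2-dev
libxslt1-dev
```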


Execute a STACK upgrade

For my example I'll be upgrading a Python/Django/Postgres App from cedar-14 to datica-16. I've already followed the tutorial to get the app built and deployed. Running the datica status command shows my environment has multiple jobs, one of which is an instance of the python-sample-app service specification.

datica status



We can check which STACK the service is configured to use for builds by listing the service's variables.

$ datica vars list python-sample-app | grep 'STACK'


Let's verify that the job deployed from this service specification does indeed have Ubuntu 14.04 as its runtime environment by using the datica console command. The console command deploys a new job based on the current service specification (including the Docker image; relevant later) and lets you run commands in that runtime environment. We'll use this to check the OS version and also print the STACK environment variable.

datica console python-sample-app "lsb_release -a ; printenv STACK"



Now let's upgrade the STACK for the service and see if our application will build and deploy cleanly.

datica vars set python-sample-app -v "STACK=datica-16"


WARNING: The output says we should redeploy for the variable to "take effect". What this actually means is that only new deployments will have the environment variable(s) we just defined. This isn't the "effect" we are looking for. In fact, if you were to redeploy now, or run the previous console command, you would see the STACK environment variable updated, but the runtime environment would still be Ubuntu 14.04.

What we want to do is rebuild the application using the STACK we just specified; if the build is successful, a deploy/redeploy will then have the effect we are looking for. In CPaaS a code build can only be triggered by a git push to the remote (triggering git's server-side post-receive hook). I don't have any new changes committed to push, so I'll create an empty commit to push to the datica remote.

git commit --allow-empty -m "Triggering datica-16 build"
git push datica master


The build has been triggered, and in the build output (if it gets far enough) there is a message about the stack changing to heroku-16 (Datica implements Heroku buildpacks). This is what we want. The application is being rebuilt using the system libraries of the new OS.

-----> Stack has changed from cedar-14 to heroku-16, clearing cache
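If you're scripting the upgrade, you can also spot this line programmatically by grepping the build output streamed back through the push (e.g. `git push datica master 2>&1 | grep ...`). Simulated below with the log line above; the actual output formatting may vary:

```shell
# Simulate scanning the streamed build output for the stack-change message.
build_log="-----> Stack has changed from cedar-14 to heroku-16, clearing cache"
echo "$build_log" | grep -o 'cedar-14 to heroku-16'   # prints: cedar-14 to heroku-16
```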


From here the build will either pass (great! continue to testing) or fail. If it fails, you'll need to start hunting down what incompatibilities exist between the application and the new OS version. In my case there were a couple of changes I needed to make: incorporating a renamed Python module, and updating the application settings to account for runtime environment changes. Once I made those changes, the build was successful.




Here is the quick and dirty.

# Update the STACK. Use the version you require.
datica vars set -v "STACK=datica-16"

# Create an empty commit
git commit --allow-empty -m "Trigger datica-16 build"

# Push to git remote to build and deploy
git push datica master



Q: My build failed, is my application down?

No. A deploy only follows if the build was successful.


Q: Can I update the STACK of my database, cache, etc?

No. The STACK is a Heroku buildpack concept and only applies to code services. Datica manages the execution environment for non-code services like databases, caches, and queues.


Q: How can I revert changes quickly?

You can use the datica releases command to list the service's previously successful builds, then use the datica rollback command to deploy one of them. This deploys the code at a specific commit using the same Docker image it was previously deployed with. Here is an example of rolling back to the commit which was running at the start of the article. You'll notice the STACK still says datica-16 because the service specification is separate from the individual release, and variables are defined at the service level.

$ datica rollback python-sample-app fb42d36c4c4989226943728e57ce4a08a907834c
Rolling back python-sample-app to fb42d36c4c4989226943728e57ce4a08a907834c
Rollback successful! Check the status with "datica status" and your logging dashboard for updates.

$ datica console python-sample-app "lsb_release -a ; printenv STACK"