Community-based news organization Daily Voice covers news, sports, and information at the local level. When rolling out a new version of its application, Daily Voice had two main concerns: handling the spikes in traffic that come with breaking news, and making application upgrades more efficient to deploy. Enter auto-scaled Docker containers on EC2 Container Service in Amazon Web Services (AWS).
Previously, discrepancies would often arise between the code deployed and tested in the QA environment and the code deployed in production. To close this gap and ensure that tests performed in QA remained valid, we created a deployment process that leverages Docker's immutable, container-based architecture. The result is predictable: the code base can be deployed, tested, and promoted directly to production without any change to the underlying environment on which the validation tests were performed. Using jobs configured in Jenkins, the production environment simply runs the same containers used in QA, relying on environment variables in the containers to connect to the proper back-end database.
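As a sketch of this pattern (the image name, hostnames, and variable names below are illustrative, not Daily Voice's actual configuration), the same container image can be registered in two ECS task definitions that differ only in their environment variables:

```json
{
  "family": "news-app-production",
  "containerDefinitions": [
    {
      "name": "news-app",
      "image": "registry.example.com/news-app:build-1234",
      "memory": 512,
      "essential": true,
      "portMappings": [{ "containerPort": 8080 }],
      "environment": [
        { "name": "APP_ENV", "value": "production" },
        { "name": "DB_HOST", "value": "prod-db.example.com" }
      ]
    }
  ]
}
```

A corresponding QA task definition would reference the identical image tag but set `DB_HOST` to the QA database. Because the image itself never changes between environments, the code that was validated in QA is byte-for-byte the code that runs in production.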
Once a reliable deployment procedure was in place, the need arose for a way to scale the environment to handle sporadic, heavy increases in traffic. Using the Auto Scaling feature of AWS, we were able to absorb the extra load quickly by launching new instances into EC2 Container Service in response to that traffic. Configuration of the newly created instances was handled automatically by Salt, ensuring minimal manual intervention.
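One way this bootstrapping can be wired up (the cluster name and Salt master address here are placeholders, and this is a sketch rather than the exact configuration described above) is through the Auto Scaling launch configuration's user data, which registers the instance with the ECS cluster and points the Salt minion at its master:

```shell
#!/bin/bash
# Register this instance with the ECS cluster so the scheduler
# can place containers on it as soon as it boots.
echo "ECS_CLUSTER=news-app-cluster" >> /etc/ecs/ecs.config

# Point the Salt minion at the master and (re)start it; Salt then
# applies the states that configure the instance with no manual
# intervention.
echo "master: salt.example.com" > /etc/salt/minion.d/master.conf
service salt-minion restart
```

From there, everything instance-specific lives in Salt states rather than in the AMI, so new capacity comes online identically configured every time.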
In a disposable environment like this, it can be difficult to retrieve the logs needed when the system fails. To facilitate debugging, a central logging system was implemented using Elasticsearch, Logstash, and Kibana (ELK). When Auto Scaling creates a new instance, Salt ensures that a Docker container on that instance forwards the logs from all containers to the centralized ELK server. Elasticsearch then indexes the logs to make them easier to search, and Kibana provides the front-end application for running queries.
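On the receiving side, a Logstash pipeline for this setup might look like the following sketch (the GELF transport, port, hostname, and index name are assumptions; the article does not specify which forwarding mechanism Daily Voice used):

```
input {
  # Receive logs shipped by the per-instance forwarding containers.
  # Docker's gelf logging driver is one common transport for this.
  gelf {
    port => 12201
  }
}

filter {
  # Add an environment tag so events can be filtered in Kibana.
  mutate {
    add_field => { "environment" => "production" }
  }
}

output {
  # Index into Elasticsearch; Kibana queries these daily indices.
  elasticsearch {
    hosts => ["elasticsearch.example.com:9200"]
    index => "containers-%{+YYYY.MM.dd}"
  }
}
```

Daily indices keep individual Elasticsearch indices small and make it easy to expire old logs, while the added field lets a single ELK stack serve multiple environments.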