Git users may put binary files under source control for different reasons. They may do it by mistake, but once pushed, a binary is stored in history forever, and the repository grows large as a result. Today we will talk about how to clean such a repository...
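As a minimal sketch of such a cleanup (the path `build/app.bin` is an assumed example, not a real file from any project), the built-in `git filter-branch` can strip one file from every commit; `git filter-repo` or the BFG are faster, but they are separate installs:

```shell
#!/bin/sh
# Purge one file (e.g. an accidentally pushed binary) from every commit.
# WARNING: this rewrites history; every commit hash changes, so after the
# force-push all collaborators must re-clone or hard-reset their copies.
purge_binary() {
    target=$1   # repo-relative path, e.g. build/app.bin
    FILTER_BRANCH_SQUELCH_WARNING=1 git filter-branch --force --index-filter \
        "git rm --cached --ignore-unmatch '$target'" \
        --prune-empty --tag-name-filter cat -- --all

    # Drop filter-branch's backup refs and expire old reflog entries so
    # that git gc can actually reclaim the disk space.
    rm -rf .git/refs/original/
    git reflog expire --expire=now --all
    git gc --prune=now --aggressive
}

# Example: purge_binary build/app.bin   # then: git push --force --all
```

After this the repository must be re-pushed with `--force`, which is why the cleanup is best done once, during the migration, rather than on a live repository.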
Implement build, packaging, and integration tests on the development branch before merging to master.
Requirements: build all components of the repository and run unit tests.
Create installation package(s) for the product.
Verify clean installation and upgrade scenarios.
Allow running multiple CI processes in parallel.
The solution included:
Jenkins slaves in Docker containers.
Creating a Docker image with the current build installed.
Running multiple environments (Docker) for integration tests.
Docker Registry. Ansible. Bash scripts.
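The "image with the current build installed" step can be sketched roughly as follows; the base image, package name, and registry below are illustrative assumptions, not the project's real names:

```shell
#!/bin/sh
# Emit a Dockerfile that installs the freshly built package into a base
# image. centos:7 and the product-*.rpm path are illustrative assumptions.
write_ci_dockerfile() {
    version=$1
    cat <<EOF
FROM centos:7
COPY dist/product-$version.rpm /tmp/
RUN yum install -y /tmp/product-$version.rpm && rm -f /tmp/product-$version.rpm
EOF
}

# Typical use from a Jenkins job (the registry name is also an assumption):
#   write_ci_dockerfile "$BUILD_VERSION" > Dockerfile.ci
#   docker build -f Dockerfile.ci -t "registry.local:5000/product:$BUILD_VERSION" .
#   docker push "registry.local:5000/product:$BUILD_VERSION"
```

Generating the Dockerfile per build keeps the image tag tied to the exact package version, so the integration-test environments always start from a known install.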
Migration from TFS to Jenkins 2 + Git.
Develop Dev/Stage/Prod environments for the Jenkins server: an Ansible-based project (clone/export/import of Jenkins environments).
Migration of sources from TFS to Git, including binaries cleanup.
Development of Bitbucket hooks.
Install, configure, and integrate Artifactory. Resolve problems with Jenkins Pipeline builds.
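One hook that pairs naturally with the binaries cleanup can be sketched as a server-side pre-receive script rejecting oversized blobs, so binaries never enter history again. The 5 MB limit is an arbitrary example, and Bitbucket Server can achieve the same with a plugin hook:

```shell
#!/bin/sh
# Sketch of a pre-receive hook: reject pushes that contain blobs over a
# size limit. Runs on the Git server, reading "old new ref" from stdin.

# Print the ids of blobs reachable from rev $1 that exceed $2 bytes.
oversized_blobs() {
    git rev-list --objects "$1" | cut -d' ' -f1 |
    while read -r obj; do
        [ "$(git cat-file -t "$obj" 2>/dev/null)" = blob ] || continue
        if [ "$(git cat-file -s "$obj")" -gt "$2" ]; then
            echo "$obj"
        fi
    done
}

LIMIT=$((5 * 1024 * 1024))   # arbitrary example limit
while read -r oldrev newrev refname; do
    bad=$(oversized_blobs "$newrev" "$LIMIT")
    if [ -n "$bad" ]; then
        echo "push to $refname rejected, oversized blobs: $bad" >&2
        exit 1
    fi
done
```

In a real hook one would restrict the walk to the newly pushed commits (`"$newrev" --not --all`) to avoid rescanning the whole history on every push.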
Help to improve product quality by implementing a CI solution.
Algosec is a growing security company that needs to improve its development workflow. The new CI solution should resolve many build/package/deploy problems and stabilize the product by running multiple tests on every change pushed to Git.
Organize the infrastructure servers: Git server (Bitbucket), Nexus, Jenkins server and slaves.
Adapt the existing build scripts for the different build systems: Makefile, Maven, Gradle, npm.
Created Jenkins build jobs that build the product components in parallel, which significantly improved build times.
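The parallelism itself lived in Jenkins jobs, but the underlying fan-out can be sketched in plain shell; the component names and the `make -C` layout below are assumptions about the source tree:

```shell
#!/bin/sh
# Build independent components concurrently, one log file per component,
# and fail if any single build fails.
build_components() {
    pids=""
    for c in "$@"; do
        make -C "$c" all > "build-$c.log" 2>&1 &
        pids="$pids $!:$c"
    done
    fail=0
    for entry in $pids; do
        pid=${entry%%:*}; name=${entry#*:}
        wait "$pid" || {
            echo "build of $name failed (see build-$name.log)" >&2
            fail=1
        }
    done
    return "$fail"
}

# Example: build_components core webapp cli
```

Collecting the PIDs and `wait`-ing on each one individually is what propagates a failure: a bare `wait` with no arguments would discard the children's exit codes.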
Bitbucket server (installation, configuration, hooks development).
Nexus (move all binary components there).
Jenkins (installation, plugins, slaves).
Build scripts (Makefile, Maven, Gradle, npm) - bash scripting.
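Moving binaries into Nexus is easy to script; a minimal sketch, assuming a Nexus 3 "raw" hosted repository, where a plain HTTP PUT is enough to publish a file (the base URL, repository name, and credentials variable are placeholders):

```shell
#!/bin/sh
# Compose the upload URL for a file in a raw hosted Nexus repository.
nexus_upload_url() {
    # $1 = repository name, $2 = path of the artifact inside it
    echo "${NEXUS_BASE:-http://nexus.local:8081}/repository/$1/$2"
}

# Example upload (credentials variable is an assumption):
#   curl --fail -u "ci:$NEXUS_PASSWORD" \
#        --upload-file dist/product-1.2.3.tar.gz \
#        "$(nexus_upload_url releases-raw product/1.2.3/product-1.2.3.tar.gz)"
```

For Maven artifacts the `mvn deploy` lifecycle against the Nexus repository URL is the more idiomatic route; the raw upload suits tarballs and installers.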
Help to build a demo for a Docker conference and create instructions for integrating HPE Operations Analytics (OPSA) with Docker.
Increase the number of clients by supporting log analytics for Docker-based applications. My role was to research how to work properly with Docker's logs and to find the best way to integrate them into OPSA. One of the problems was finding a real Docker-based application that generates a lot of logs; in the end I built a set of interrelated containers myself. Technology stack: Docker Engine, Docker Swarm (a little), rsyslog, systemd, Logstash, bash, RHEL 7, CentOS 7, Ubuntu, HPE applications: OPSA, Logger, SmartConnector.
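One workable integration path with that stack is Docker's built-in syslog logging driver pointed at the central rsyslog/Logstash endpoint; the endpoint address and tag template below are illustrative, not the demo's actual configuration:

```shell
#!/bin/sh
# Build the docker run options that ship a container's stdout/stderr to a
# remote syslog endpoint (rsyslog, or a Logstash syslog input).
syslog_log_opts() {
    # $1 = endpoint, e.g. tcp://logstash.local:514
    echo "--log-driver=syslog --log-opt syslog-address=$1 --log-opt tag={{.Name}}"
}

# Example (image name is an assumption):
#   docker run -d $(syslog_log_opts tcp://logstash.local:514) my-app:latest
```

The `tag={{.Name}}` template stamps each syslog line with the container name, which is what lets the analytics side tell the containers of a multi-service application apart.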