**Company Name**
The Judge Group
**Job Title**
DevOps Engineer
**Job Description**
**Role Summary**
Design, implement, and maintain automated build, deployment, and monitoring pipelines for high‑performance computing (HPC) and distributed data systems. Collaborate with software, data engineering, and security teams to ensure reliable, scalable, and secure delivery of applications across containerized and cluster environments.
**Expectations**
- Maintain end‑to‑end CI/CD workflows that support multi‑language codebases (C/C++, Python, Java).
- Ensure compliance with security and quality standards through automated static and dynamic analysis.
- Leverage HPC‑specific tools for performance profiling and optimization.
- Manage container orchestration, storage, and distributed compute resources.
**Key Responsibilities**
- Develop and maintain build scripts using Make, Maven, Gradle, or Ant.
- Configure and troubleshoot CI/CD pipelines in Jenkins, GitLab CI, or equivalent.
- Implement Git branching strategies and enforce workflow best practices.
- Automate build and test processes using Bash, Python, or Ruby.
- Build, push, and deploy Docker containers; manage Kubernetes clusters.
- Integrate code quality and security tools (SonarQube, Valgrind, GDB, etc.).
- Optimize application performance for CPU, GPU, and parallel execution (MPI, OpenMP, CUDA).
- Work with large data sets and optimize I/O using Lustre, GPFS, HDFS, or similar.
- Support distributed processing frameworks (Hadoop, Spark, Dask).
- Maintain documentation, runbooks, and incident post‑mortems.
- Coordinate with security teams to meet Secret clearance requirements.
**Required Skills**
- Build automation: Make, Maven, Gradle, Ant.
- CI/CD: Jenkins, GitLab CI.
- Version control: Git (branching, workflows).
- Scripting: Bash, Python, Ruby.
- Containerization: Docker, Kubernetes.
- Static/dynamic analysis: SonarQube, Valgrind, Gprof, Intel VTune.
- Programming: C/C++, Python.
- Parallel programming: MPI, OpenMP, CUDA.
- HPC storage: Lustre, GPFS, HDFS.
- Distributed frameworks: Hadoop, Spark, Dask.
- Security awareness: adherence to Secret clearance standards.
**Required Education & Certifications**
- Bachelor’s degree in Computer Science, Software Engineering, or related field.
- Minimum of 3+ years of professional DevOps or software build-management experience.
- Certifications in Docker, Kubernetes, Jenkins, or related DevOps tools are a plus.