**Company Name:** QinetiQ US
**Job Title:** DevOps Engineer
**Job Description:**
**Role Summary:**
Lead the migration and modernization of data workflows from on‑premises infrastructure to AWS, ensuring secure, scalable, and efficient data pipelines for data engineering and science teams.
**Expectations:**
- Own the end‑to‑end data lifecycle, from exploration to production.
- Deliver robust, automated cloud solutions that replace legacy on‑prem resources.
**Key Responsibilities:**
- Design, implement, and maintain AWS-based data pipelines using CloudFormation, EC2, S3, and RDS.
- Build CI/CD pipelines with Git, Jenkins, and Bash scripting.
- Automate infrastructure with Ansible, Terraform, or CDK for Terraform (CDKTF) where appropriate.
- Develop and deploy containerized services using Docker; manage container orchestration (e.g., Docker Compose or Kubernetes).
- Integrate data processing tools such as Apache Spark, NiFi, Elasticsearch, and Presto/Trino.
- Create and document RESTful APIs; implement services in Java, Scala, or Python.
- Monitor, troubleshoot, and optimize data workflows and performance.
- Ensure compliance with security and clearance requirements, maintaining an active TS/SCI clearance.
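As a flavor of the pipeline work described above, here is a minimal sketch (all names hypothetical, not part of the posting) of building the Hive-style date-partitioned S3 object keys that a pipeline writing Parquet for Spark or Presto/Trino would conventionally use:

```python
from datetime import date


def partitioned_key(prefix: str, dataset: str, run_date: date, filename: str) -> str:
    """Build a Hive-style date-partitioned S3 object key.

    The year=/month=/day= layout is the partitioning convention that
    Spark, Hive, and Presto/Trino discover automatically when reading.
    `prefix`, `dataset`, and `filename` are illustrative placeholders.
    """
    return (
        f"{prefix}/{dataset}/"
        f"year={run_date.year:04d}/month={run_date.month:02d}/day={run_date.day:02d}/"
        f"{filename}"
    )


print(partitioned_key("raw", "telemetry", date(2024, 1, 5), "part-0000.parquet"))
# → raw/telemetry/year=2024/month=01/day=05/part-0000.parquet
```

Zero-padding the month and day keeps partitions lexicographically sortable, which simplifies range scans and lifecycle rules on the bucket.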
**Required Skills:**
- Deep knowledge of AWS services: CloudFormation, EC2, S3, RDS.
- Infrastructure automation: Ansible, CloudFormation, Terraform (preferred), CDK (preferred).
- CI/CD expertise: Git, Jenkins, Bash scripting.
- Proficient in Python, Java, or Scala for data pipeline development.
- Experience with containers: Docker, Docker Compose, or Kubernetes basics.
- Data processing: Apache Spark, NiFi, Elasticsearch, Presto/Trino, Hive, MapReduce.
- Datastores: graph, NoSQL, and relational databases.
- Familiarity with REST API design (Java/Scala/Python).
- Security clearance: Active TS/SCI with polygraph.
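To illustrate the REST API skill listed above, a self-contained sketch of a tiny JSON health endpoint using only the Python standard library (the handler class and route are hypothetical examples, not a QinetiQ API):

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer
from threading import Thread
from urllib.request import urlopen


class HealthHandler(BaseHTTPRequestHandler):
    """Minimal REST-style endpoint: GET /health returns a JSON status."""

    def do_GET(self):
        if self.path == "/health":
            body = json.dumps({"status": "ok"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

    def log_message(self, *args):
        # Silence the default per-request stderr logging.
        pass


# Port 0 asks the OS for any free port; serve in a background thread.
server = HTTPServer(("127.0.0.1", 0), HealthHandler)
Thread(target=server.serve_forever, daemon=True).start()

with urlopen(f"http://127.0.0.1:{server.server_port}/health") as resp:
    payload = json.loads(resp.read())

print(payload)  # → {'status': 'ok'}
server.shutdown()
```

In practice a framework such as Flask or FastAPI would replace the raw handler, but the request/response shape (route, status code, content type, JSON body) is the same.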
**Required Education & Certifications:**
- No specific academic degree required; an active TS/SCI clearance with polygraph is mandatory.
- Proficiency in at least one major cloud provider (AWS) and related DevOps tooling.