Saxon Global


saxonglobal.com

9 Jobs

160 Employees

About the Company

Saxon Global is one of the fastest-growing Inc. 5000 companies in the U.S., and has provided IT consulting and staffing solutions for the past 20 years.

Listed Jobs

Company Name
Saxon Global
Job Title
Software Engineer
Job Description
Job Title: Senior Software Engineer – GCP / Python / Node.js / JavaScript

Role Summary: Contract developer focusing on backend (60%) and frontend (40%) components of a Customer Data Platform and Engagement suite. Responsibilities include building web applications, orchestrating email campaigns, and designing automated testing frameworks, all on Google Cloud Platform.

Expectations: Deliver high-quality, cloud-native code; collaborate across cross-functional teams; adhere to agile practices; maintain rigorous automated test coverage; demonstrate measurable impact on customer engagement metrics.

Key Responsibilities:
* Design, develop, and maintain backend services and APIs on GCP (App Engine, Cloud Functions, BigQuery, Cloud Pub/Sub).
* Implement frontend components using modern JavaScript frameworks (React/Vue) and Node.js server-side logic.
* Create and schedule email campaigns, integrating with third-party deliverability services.
* Build, maintain, and run automated testing pipelines (unit, integration, end-to-end) to ensure system reliability.
* Optimize the performance and scalability of cloud applications.
* Participate in code reviews, documentation, and continuous improvement initiatives.
* Work with data engineers to model and query large customer datasets.

Required Skills:
• Google Cloud Platform services (compute, storage, networking, IAM)
• Proficiency in Python and Node.js
• Advanced JavaScript (ES6+) and modern front-end frameworks
• RESTful API design and implementation
• SQL/BigQuery data querying
• Automated test frameworks (pytest, Jest, Cypress)
• CI/CD pipelines (Git, Cloud Build, Docker)
• Version control best practices
• Strong debugging and problem-solving skills
• Effective communication and collaboration in Agile teams

Required Education & Certifications:
• Bachelor's degree in Computer Science, Software Engineering, or a related field (or equivalent experience)
• Relevant GCP certifications preferred (e.g., Professional Cloud Developer, Cloud Architect)
• Experience in retail or customer-facing technology domains
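As an illustration of the automated-testing responsibility above, here is a minimal pytest-style sketch; the campaign-payload helper and its field names are hypothetical, not taken from any actual Saxon Global codebase:

```python
# Hypothetical sketch: a pytest-style unit test for an email-campaign
# scheduling helper of the kind this role describes. Function and field
# names are illustrative assumptions.
from datetime import datetime, timezone


def build_campaign_payload(segment: str, template_id: str, send_at: datetime) -> dict:
    """Assemble the message payload a Pub/Sub campaign topic might expect."""
    if send_at.tzinfo is None:
        raise ValueError("send_at must be timezone-aware")
    return {
        "segment": segment,
        "template_id": template_id,
        "send_at": send_at.isoformat(),
    }


def test_build_campaign_payload_serializes_timestamp():
    when = datetime(2025, 11, 24, 9, 0, tzinfo=timezone.utc)
    payload = build_campaign_payload("lapsed-buyers", "tmpl-42", when)
    assert payload["send_at"] == "2025-11-24T09:00:00+00:00"
```

Tests like this slot directly into the CI pipeline (e.g., Cloud Build running `pytest`) as the unit-level layer beneath integration and end-to-end suites.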
United States
Remote
24-11-2025
Company Name
Saxon Global
Job Title
Sr. Site Reliability Engineer
Job Description
**Job Title:** Sr. Site Reliability Engineer

**Role Summary:** Lead the migration and optimization of financial services platforms to Azure-native, containerized environments (AKS/Kubernetes). Design and enforce SRE practices – SLOs/SLIs, highly available architectures, automated CI/CD pipelines, and end-to-end observability – to deliver resilient, high-performance systems at global scale.

**Expectations:**
- Contract-to-hire, hybrid onsite/remote model.
- Deliver measurable improvements in uptime, performance, and reliability.
- Build standards, mentor developers, and foster cross-functional collaboration.

**Key Responsibilities:**
- Partner with Architecture and Development teams to design and validate highly available, scalable solutions.
- Define, implement, and monitor SLOs/SLIs; guide teams on breach consequences and remediation strategies.
- Troubleshoot incidents, conduct root-cause analysis, and plan corrective actions.
- Develop and maintain automated pipelines (Azure DevOps, Terraform, Jenkins) with integrated testing and security scans.
- Implement containerization (AKS/Kubernetes/Docker) and tune database schemas (SQL Server, Oracle, NoSQL) for performance.
- Create and enforce best-practice standards; lead code review and mentoring initiatives.

**Required Skills:**
- Programming/Scripting: C#, .NET, Java, Go, PowerShell, Bash.
- CI/CD & IaC: Azure DevOps (YAML/ARM), Terraform, Jenkins, Chef, Octopus Deploy.
- Containerization & Orchestration: AKS, Kubernetes (open source), Docker.
- Databases: SQL Server, Oracle, CosmosDB/NoSQL; query tuning, indexing.
- Security & Test Automation: SonarQube/Checkmarx, Selenium, SpecFlow, JMeter, Postman.
- DevOps Practices: SLO/SLI management, incident response, problem management.
- Agile/Scrum leadership and a continuous-improvement mindset.

**Required Education & Certifications:**
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Azure certifications preferred (e.g., Azure Solutions Architect, Azure DevOps Engineer, Azure Administrator).
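The SLO/SLI management described above rests on straightforward error-budget arithmetic; the sketch below assumes a 99.9% availability target over a 30-day window purely for illustration:

```python
# Sketch of the error-budget arithmetic behind SLO/SLI management.
# The 99.9% / 30-day figures are illustrative assumptions.
def error_budget_minutes(slo: float, window_days: int = 30) -> float:
    """Minutes of allowed downtime for a given availability SLO."""
    total_minutes = window_days * 24 * 60
    return total_minutes * (1.0 - slo)


def budget_remaining(slo: float, downtime_minutes: float, window_days: int = 30) -> float:
    """Fraction of the error budget still unspent (negative means the SLO is breached)."""
    budget = error_budget_minutes(slo, window_days)
    return (budget - downtime_minutes) / budget


# A 99.9% SLO over 30 days allows roughly 43.2 minutes of downtime.
```

The "breach consequences" an SRE enforces typically key off `budget_remaining`: as it approaches zero, feature releases slow or stop until reliability work restores headroom.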
Arlington, United States
Hybrid
25-11-2025
Company Name
Saxon Global
Job Title
Python Developer
Job Description
**Job Title** Senior Python Developer

**Role Summary**
Architect, develop, and maintain production-grade Python services and APIs using FastAPI, Django, or Flask. Design scalable, event-driven microservices with high availability, async I/O, caching, and robust observability. Lead code quality, security, and DevOps practices while collaborating across cross-functional teams.

**Expectations**
- Minimum 10 years of professional Python software engineering.
- Deep expertise in at least one modern Python web framework and async programming.
- Strong computer-science fundamentals (algorithms, data structures, concurrency).
- Proven ability to design distributed, event-driven systems.
- Excellent communication and leadership skills for cross-team initiatives.

**Key Responsibilities**
- Design, implement, and maintain high-reliability Python APIs and microservices.
- Model data schemas, write efficient SQL, and integrate with PostgreSQL/MySQL and Redis.
- Profile and optimize performance hot spots; apply back-pressure, circuit breakers, and idempotency.
- Enforce security best practices: authN/authZ, secrets management, secure coding, and dependency hygiene.
- Conduct code reviews; write automated tests (pytest); enforce static typing (mypy/pyright), linting, and CI/CD pipelines.
- Containerize services with Docker, deploy to Kubernetes or Azure Functions, and manage IaC via Terraform.
- Implement observability using OpenTelemetry; create dashboards and alerts.
- Lead release management and support post-deployment monitoring and incident response.

**Required Skills**
- Python 3.x, FastAPI/Django/Flask, async/await.
- Microservices architecture, event-driven patterns (pub/sub).
- Caching (Redis), SQL/PostgreSQL/MySQL, ORMs (SQLAlchemy/Django ORM).
- Profiling tools (cProfile, py-spy).
- Testing: unit, integration, and contract tests.
- CI/CD: Azure DevOps, GitHub Actions, pipeline automation.
- Containerization: Docker; orchestration: Kubernetes.
- IaC: Terraform.
- Observability: OpenTelemetry, Prometheus/Grafana.
- Static typing: mypy/pyright; linting: flake8/black.
- Security: OAuth2/OIDC, secrets management (Azure Key Vault, AWS Secrets Manager).
- Strong communication, documentation, and cross-team leadership.

**Required Education & Certifications**
- Bachelor's degree in Computer Science, Software Engineering, or equivalent professional experience.
- Optional: Azure Developer or Kubernetes Administrator certification; security certifications (e.g., CISSP, CISM) may be advantageous but are not required.
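To make the circuit-breaker responsibility concrete, here is a minimal pure-Python sketch (an assumed illustrative implementation, not a specific library's API):

```python
# Minimal circuit-breaker sketch: after `max_failures` consecutive errors
# the breaker opens and rejects calls until `reset_after` seconds elapse,
# then allows one trial ("half-open") call. Thresholds are illustrative.
import time


class CircuitOpenError(RuntimeError):
    """Raised when a call is rejected because the circuit is open."""


class CircuitBreaker:
    def __init__(self, max_failures: int = 3, reset_after: float = 30.0):
        self.max_failures = max_failures
        self.reset_after = reset_after
        self.failures = 0
        self.opened_at = None  # monotonic timestamp when opened, or None

    def call(self, fn, *args, **kwargs):
        if self.opened_at is not None:
            if time.monotonic() - self.opened_at < self.reset_after:
                raise CircuitOpenError("circuit open; rejecting call")
            # Cool-down elapsed: half-open, permit one trial call.
            self.opened_at = None
            self.failures = 0
        try:
            result = fn(*args, **kwargs)
        except Exception:
            self.failures += 1
            if self.failures >= self.max_failures:
                self.opened_at = time.monotonic()
            raise
        self.failures = 0
        return result
```

In a microservice this would typically wrap calls to a flaky downstream dependency, so sustained failures fail fast instead of tying up workers (back-pressure).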
Chicago, United States
On-site
02-12-2025
Company Name
Saxon Global
Job Title
DevOps Platform Engineer (Remote – Canada only)
Job Description
**Job Title** DevOps Platform Engineer (Remote – Canada)

**Role Summary**
Leverage Azure cloud services, Databricks, and Terraform to design, automate, and maintain data platform infrastructure. The role is predominantly DevOps-centric (80%) with a strong focus on data engineering practices (20%). The engineer ensures secure, efficient, and scalable data operations, automates deployment pipelines, and collaborates across data and operations teams.

**Expectations**
- Deliver reliable CI/CD pipelines for data and application workloads.
- Automate Terraform scripts and infrastructure-as-code processes.
- Maintain and optimize Azure Databricks clusters, pipelines, and data mesh components.
- Execute Unity Catalog migrations and related cataloging tasks.
- Collaborate with data and engineering teams to decouple applications and run parallel data workflows.
- Adhere to EST working hours for coordination with onshore teams.

**Key Responsibilities**
1. Design, implement, and maintain Azure-based data platforms using Databricks, pipelines, and cluster management.
2. Build and maintain Terraform infrastructure code for both Development and Production environments.
3. Develop and manage GitHub repositories, YAML pipelines, and CI/CD workflows for automated deployments.
4. Perform data mesh architecture design, ensuring decoupling, parallel execution, and minimal latency.
5. Migrate and configure Unity Catalog for enhanced data governance and security.
6. Monitor and troubleshoot platform performance, capacity, and cost optimization.
7. Coordinate with the Data Engineering team on the 20% data engineering share: pipeline design, data validation, and quality checks.
8. Stay current on Azure, Databricks, and Terraform best practices; apply continuous improvement initiatives.

**Required Skills**
- Azure cloud platform (Azure Databricks, Azure Data Factory, Azure AD, networking, storage).
- Terraform (infrastructure as code, module creation, state management).
- CI/CD (Azure DevOps Pipelines, GitHub Actions, YAML scripting).
- Source control operations (Git, GitHub, branching, pull-request management).
- Data engineering fundamentals: pipeline design, data cleansing, batch/stream processing.
- Data mesh architecture concepts, application decoupling, parallel execution.
- Azure cluster lifecycle management (startup, shutdown, auto-scale).
- Python/Scala for Databricks notebooks and scripting.
- Basic knowledge of Kafka (optional but preferred).
- Strong troubleshooting, performance tuning, and problem-solving skills.
- Effective communication and collaboration in a remote, EST-aligned environment.

**Required Education & Certifications**
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Azure Solutions Architect or Data Engineer certification (AZ-305 / DP-203) preferred.
- Databricks Certified Professional Developer or Engineer preferred.
- Terraform Associate certification optional but beneficial.
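As a sketch of the data-validation work in the role's 20% data-engineering share, the snippet below uses hypothetical column names; in a Databricks notebook this would more likely run over a Spark DataFrame, but plain Python rows keep the sketch self-contained:

```python
# Illustrative data-quality check: split incoming rows into valid and
# rejected sets based on required non-null fields. Column names
# ("customer_id", "event_ts") are assumptions, not a real schema.
def validate_rows(rows, required=("customer_id", "event_ts")):
    """Return (valid, rejected) lists; a row is valid only if every
    required field is present and non-null."""
    valid, rejected = [], []
    for row in rows:
        if all(row.get(field) is not None for field in required):
            valid.append(row)
        else:
            rejected.append(row)
    return valid, rejected
```

Routing rejects to a quarantine table rather than dropping them is the usual pattern, so quality issues stay auditable downstream.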
Canada
Remote
09-12-2025