Company Name: Glassbox
Job Title: Backend Data Developer
Job Description:
Role Summary: Own and evolve the core data infrastructure that powers a high‑scale distributed product. Design, implement, and maintain Java‑based backend services, ingestion, indexing, and storage workflows that integrate with modern data technologies such as OpenSearch, ClickHouse, Kafka, Cassandra, Spark, Iceberg, and Postgres. Resolve production incidents, optimize for performance, and mentor cross‑functional teams.
Expectations:
• Deliver production‑ready, highly scalable code.
• Proactively identify and eliminate bottlenecks.
• Ensure system durability, efficiency, and elasticity under heavy load.
• Participate in incident response and system hardening.
• Provide technical leadership and mentorship.
Key Responsibilities:
• Design and build core data infrastructure and backend services for high‑throughput distributed systems.
• Develop, test, and maintain Java code, focusing on clean architecture, modularity, and performance.
• Implement robust data ingestion, indexing, and storage pipelines across OpenSearch, ClickHouse, Kafka, and Cassandra.
• Create internal tooling and frameworks that simplify data access and processing for other teams.
• Resolve complex production issues in data pipelines, storage layers, and distributed services.
• Profile and optimize memory usage, latency, and throughput.
• Lead initiatives to improve scalability, resilience, and system hardening.
• Mentor and guide R&D teams on best practices and architectural decisions.
Required Skills:
• Strong Java development skills with deep understanding of JVM fundamentals.
• Proven experience building and scaling distributed backend systems at enterprise scale.
• Hands‑on expertise with OpenSearch, ClickHouse, Kafka, Cassandra, Spark, Iceberg, and Postgres (or equivalent).
• Ability to write efficient, production‑grade, maintainable code.
• Advanced troubleshooting, debugging, and performance profiling capabilities.
• Experience with real‑time processing and high‑throughput data pipelines.
• Familiarity with incident response, monitoring, and system hardening practices.
Required Education & Certifications:
• Bachelor’s (or higher) degree in Computer Science, Software Engineering, Data Engineering, or a related field.
• Relevant certifications (e.g., AWS Solutions Architect, Google Cloud Data Engineer, or similar) are a plus but not mandatory.