- Company Name
- evoteo
- Job Title
- Développeur Big Data Spark Java F/H
- Job Description
Job title: Big Data Spark Java Developer (F/H)
Role Summary:
Design, develop, and maintain high‑performance, secure data pipelines and services using Java and Apache Spark, integrating with Hadoop, Kafka, and relational databases. Deliver data solutions that support analytics and decision‑making while adhering to quality, security, and performance standards.
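As an illustration of the role, here is a minimal sketch of a Java/Spark batch job of the kind described above (assuming Spark 3.x; the input path, column names, and output location are hypothetical):

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

import static org.apache.spark.sql.functions.col;
import static org.apache.spark.sql.functions.sum;

public class DailySalesJob {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("daily-sales")
                .getOrCreate();

        // Ingest raw events (hypothetical Parquet input on HDFS).
        Dataset<Row> events = spark.read().parquet("hdfs:///data/raw/sales");

        // Aggregate revenue per day.
        Dataset<Row> daily = events
                .groupBy(col("sale_date"))
                .agg(sum(col("amount")).alias("total_amount"));

        // Persist the curated result for downstream analytics (hypothetical output path).
        daily.write().mode("overwrite").parquet("hdfs:///data/curated/daily_sales");

        spark.stop();
    }
}
```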
Expectations:
* Minimum of 6 years’ professional experience as a Java developer (excluding internships and apprenticeships).
* Proven expertise in Java, Spark, and big‑data technologies.
* Strong knowledge of microservices, REST APIs, CI/CD, and agile practices (Scrum/Kanban/SAFe).
* Ability to write clean, testable code, with automated unit/integration tests and secure coding practices.
* Demonstrated autonomy, analytical mindset, and capacity to transfer knowledge through mentoring or documentation.
Key Responsibilities:
* Develop and maintain Java/Spark applications to ingest, process, and store large volumes of data (see the streaming ingest sketch after this list).
* Design and optimize relational (PostgreSQL) and NoSQL storage solutions.
* Ensure reliable execution of data workflows; troubleshoot and resolve production issues.
* Apply best security practices for data pipelines and services.
* Collaborate with data analysts and scientists to deliver understandable, accessible data products.
* Perform continuous improvement: refactor code, enhance performance, and adopt new tools or frameworks.
* Lead code reviews, contribute to architecture decisions, and mentor junior developers.
* Stay current with emerging big‑data and cloud technologies, recommending relevant advancements.
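As referenced in the ingestion responsibility above, here is a minimal sketch of a streaming ingest path, assuming Spark Structured Streaming with the spark-sql-kafka connector on the classpath; the broker address, topic, and storage locations are hypothetical:

```java
import java.util.concurrent.TimeoutException;

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.streaming.StreamingQuery;
import org.apache.spark.sql.streaming.StreamingQueryException;

public class EventIngestJob {
    public static void main(String[] args) throws TimeoutException, StreamingQueryException {
        SparkSession spark = SparkSession.builder()
                .appName("event-ingest")
                .getOrCreate();

        // Read raw events from Kafka (hypothetical broker and topic).
        Dataset<Row> events = spark.readStream()
                .format("kafka")
                .option("kafka.bootstrap.servers", "broker:9092")
                .option("subscribe", "events")
                .load()
                .selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)", "timestamp");

        // Land micro-batches as Parquet for downstream processing; the checkpoint
        // location makes the query restartable after a failure (hypothetical paths).
        StreamingQuery query = events.writeStream()
                .format("parquet")
                .option("path", "hdfs:///data/raw/events")
                .option("checkpointLocation", "hdfs:///checkpoints/event-ingest")
                .start();

        query.awaitTermination();
    }
}
```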
Required Skills:
* Programming: Java (8/11/17+), Apache Spark (RDD/DataFrame/SQL), SQL, PL/pgSQL.
* Big‑Data Ecosystem: Hadoop, Spark, Kafka, Elasticsearch, Hive.
* DevOps & Deployment: Git, Maven, Jenkins, Docker, Kubernetes, Ansible, CI/CD pipelines.
* Scripting: Bash, Python (basic).
* Data Tools: Tableau Desktop, Power BI (for reporting).
* Frameworks: Spring Boot, Spring Data, Hibernate/JPA.
* Cloud Platforms (optional): AWS, GCP, Azure – basic familiarity.
* Methodologies: Agile (Scrum/Kanban), TDD/BDD, automated testing frameworks such as JUnit and TestNG (a short test sketch follows this list).
* Soft skills: analytical, autonomous, collaborative, strong communication.
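To illustrate the testing expectation above, here is a short unit-test sketch for a Spark transformation, assuming JUnit 5 and a local-mode SparkSession; the filtering rule under test is hypothetical:

```java
import java.util.Arrays;

import org.apache.spark.api.java.function.FilterFunction;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Encoders;
import org.apache.spark.sql.SparkSession;
import org.junit.jupiter.api.AfterAll;
import org.junit.jupiter.api.BeforeAll;
import org.junit.jupiter.api.Test;

import static org.junit.jupiter.api.Assertions.assertEquals;

class AmountFilterTest {
    private static SparkSession spark;

    @BeforeAll
    static void startSpark() {
        // Local-mode session so the test runs without a cluster.
        spark = SparkSession.builder()
                .master("local[2]")
                .appName("amount-filter-test")
                .getOrCreate();
    }

    @AfterAll
    static void stopSpark() {
        spark.stop();
    }

    @Test
    void keepsOnlyPositiveAmounts() {
        // Hypothetical rule under test: drop non-positive amounts.
        Dataset<Long> input = spark.createDataset(Arrays.asList(10L, -5L, 3L), Encoders.LONG());
        Dataset<Long> result = input.filter((FilterFunction<Long>) amount -> amount > 0);

        assertEquals(2, result.count());
    }
}
```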
Required Education & Certifications
* Minimum Bachelor’s degree (Bac+5) in Computer Science, Engineering, or equivalent (Master’s preferred).
* Certifications (desired): Java SE Programmer, Pivotal Certified Spark Developer, AWS Certified Solutions Architect (Associate), Kubernetes Administrator.
---