DGTL PERFORMANCE

www.dgtl-performance.com

3 Jobs

4 Employees

About the Company

Founded in 2018, DGTL Performance is a digital services company based in Toulouse, specializing in data and business intelligence. An expert in data, the company offers its clients three distinct and complementary types of service: intermediation, training, and consulting support. For more information, please visit our website.

Listed Jobs

Company Name
DGTL PERFORMANCE
Job Title
Data Engineer Talend
Job Description
**Job Title:** Data Engineer (Talend)

**Role Summary:**
Design, develop, and maintain data integration pipelines using Talend, with a focus on database modeling, performance optimization, and Azure cloud environments. Collaborate with cross-functional teams to ensure robust, scalable solutions, and contribute to functional and technical testing, documentation, and process improvement.

**Expectations:**
- Operate autonomously while maintaining strong teamwork and communication.
- Navigate high-pressure situations with resilience and proactive problem-solving.
- Deliver clear status reports to stakeholders and propose actionable improvements.
- Show initiative in testing and in handling non-standardized data inputs.

**Key Responsibilities:**
- Develop and evolve Talend jobs and related database models.
- Write, optimize, and maintain SQL (DDL/DML) scripts, preferably on PostgreSQL (a minimal tuning sketch follows this description).
- Conduct functional and technical testing, and support operational maintenance (MCO).
- Analyze database and Azure VM performance; implement tuning measures.
- Contribute to process enhancements and retro-document existing solutions.
- Use Git for version control and collaborate within CI/CD workflows.
- Apply basic Java coding where required by Talend components.

**Required Skills:**
- Proven, recent experience with Talend data integration.
- Strong SQL expertise, including query optimization (PostgreSQL preferred).
- Fundamental knowledge of relational and BI data modeling.
- Proficiency with Git (branching, merging, pull requests).
- Hands-on experience with a cloud platform, ideally Microsoft Azure.
- Basic Java programming skills (junior level).
- Soft skills: high autonomy, teamwork, effective communication, resilience, problem-solving, and a proactive improvement mindset.

**Required Education & Certifications:**
- Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field (or equivalent professional experience).
- No specific certifications required; relevant Talend or cloud certifications are a plus.
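The SQL-optimization requirement above lends itself to a small illustration. The sketch below is not part of the posting: it shows one common tuning loop on PostgreSQL, inspecting the actual execution plan with `EXPLAIN ANALYZE` and then adding a covering index. The connection string, the `sales` table, and its columns are hypothetical assumptions.

```python
# Minimal sketch of a PostgreSQL query-tuning pass, in the spirit of the
# posting's "SQL expertise, including query optimization" requirement.
# The DSN, table, and column names are illustrative assumptions.
import psycopg2

DSN = "host=localhost dbname=analytics user=etl"  # hypothetical connection

QUERY = "SELECT customer_id, SUM(amount) FROM sales GROUP BY customer_id"

with psycopg2.connect(DSN) as conn:
    with conn.cursor() as cur:
        # EXPLAIN ANALYZE executes the query and reports the actual plan,
        # revealing sequential scans that an index could eliminate.
        cur.execute("EXPLAIN ANALYZE " + QUERY)
        for (plan_line,) in cur.fetchall():
            print(plan_line)
        # A covering index is a common first remedy for a slow aggregate
        # (PostgreSQL 11+ supports INCLUDE columns).
        cur.execute(
            "CREATE INDEX IF NOT EXISTS idx_sales_customer "
            "ON sales (customer_id) INCLUDE (amount)"
        )
```

Re-running the `EXPLAIN ANALYZE` step after creating the index should typically show an index-only scan replacing the sequential scan for this aggregate.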
Nantes, France
Hybrid
10-11-2025
Company Name
DGTL PERFORMANCE
Job Title
Machine Learning Engineer
Job Description
**Job Title:** Machine Learning Engineer

**Role Summary:**
Design, develop, and deploy Machine Learning and Generative AI solutions on Google Cloud Platform. Collaborate with data leaders, product owners, and business teams to translate business requirements into scalable AI models and deploy them in production environments.

**Expectations:**
- Deliver end-to-end ML projects from concept to production.
- Advocate for and educate stakeholders on AI/ML capabilities and best practices.
- Ensure robust monitoring, documentation, and maintainability of deployed models.

**Key Responsibilities:**
1. Gather and analyze business needs for AI/ML solutions.
2. Evaluate and benchmark market solutions against requirements.
3. Prepare and transform data pipelines for algorithm training.
4. Select, test, and validate ML/AI algorithms.
5. Implement MLOps: monitoring, versioning, and continuous deployment.
6. Develop reusable code modules (preprocessing, training, post-processing, deployment).
7. Build user interfaces to expose models to end users.
8. Author technical and functional documentation.
9. Conduct workshops for Rimowa teams on ML and Generative AI concepts.

**Required Skills:**
- SQL query writing (advanced).
- Python (high proficiency).
- Terraform (good knowledge).
- Deploying and managing models on Vertex AI or equivalent GCP services (a minimal sketch follows this description).
- Strong communication in French and English (oral and written).
- Structured, organized, and proactive in project management.
- Experience with computer vision and/or Dataiku is a plus.

**Required Education & Certifications:**
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
- Certifications in cloud platforms (e.g., GCP, Vertex AI) or data engineering are advantageous.
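To make the Vertex AI requirement concrete, here is a minimal deployment sketch assuming the official `google-cloud-aiplatform` Python client. The project ID, region, artifact path, serving-container image, and model name are placeholder assumptions, not details from the posting.

```python
# Minimal sketch of deploying a trained model to a Vertex AI endpoint,
# per the posting's "deploy and manage models on Vertex AI" requirement.
# Project, region, bucket, and display names are illustrative assumptions.
from google.cloud import aiplatform

aiplatform.init(project="my-gcp-project", location="europe-west1")

# Upload a model artifact produced by the training pipeline; the serving
# container must match the framework the artifact was exported with
# (assumed here: a prebuilt scikit-learn prediction image).
model = aiplatform.Model.upload(
    display_name="demand-forecast",
    artifact_uri="gs://my-bucket/models/demand-forecast/",
    serving_container_image_uri=(
        "europe-docker.pkg.dev/vertex-ai/prediction/sklearn-cpu.1-3:latest"
    ),
)

# Deploying creates (or reuses) an endpoint and attaches serving replicas.
endpoint = model.deploy(machine_type="n1-standard-2", min_replica_count=1)

# Online prediction: each instance is one feature vector.
print(endpoint.predict(instances=[[12.0, 3.0, 250.0]]))
```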
Paris, France
Hybrid
19-11-2025
Company Name
DGTL PERFORMANCE
Job Title
DATA ENGINEER
Job Description
**Job Title:** Data Engineer

**Role Summary:**
Design, build, and maintain high-performance data pipelines and transformations using Kafka, DBT, Trino, and modern data lake technologies. Drive the migration from legacy Cloudera/Spark environments to the new architecture; ensure data quality, security, and compliance; and enable continuous integration and continuous delivery workflows for the data platform.

**Expectations:**
- Deliver robust, well-documented ingestion and transformation solutions.
- Mentor teammates on DBT, Kafka, and platform best practices.
- Maintain adherence to data modeling standards, security policies, and GDPR compliance.
- Actively participate in Agile ceremonies and contribute to continual platform improvement.

**Key Responsibilities:**
- Build and document pipelines with Kafka Connect, DBT, and Trino (a minimal ingestion sketch follows this description).
- Design optimized, standardized data models that align with platform guidelines.
- Lead the migration from Cloudera/Spark to Kafka, Trino, and Iceberg; optimize existing processes.
- Integrate code into CI/CD pipelines (Git, Jenkins/Argo, Airflow) and enforce deployment best practices.
- Manage data access controls, permissions, and GDPR-compliant audit trails.
- Implement observability: logging, monitoring, job performance metrics, and access audits.
- Review and mentor on coding standards, testing, and documentation.
- Participate in daily stand-ups, sprint planning, reviews, and retrospectives.

**Required Skills:**
- Hands-on experience with Kafka, DBT, Trino, and Python.
- Familiarity with Cloudera/Spark, Iceberg, and other data lake formats.
- Strong ETL/ELT, data modeling, and data quality fundamentals.
- Proficiency with CI/CD and version control (Git).
- Knowledge of data security, access controls, and GDPR compliance.
- Experience with observability tools (Prometheus, Grafana, ELK).
- Comfortable working in Agile environments and cross-functional teams.

**Required Education & Certifications:**
- Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field.
- Certifications in Kafka (e.g., Confluent Certified Developer), DBT, or Trino are highly desirable.
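As an illustration of the Kafka ingestion and data-quality responsibilities above, the sketch below consumes a hypothetical `orders` topic with the `confluent-kafka` Python client and rejects records missing mandatory fields. The broker address, topic name, consumer group, and schema are assumptions, not details from the posting.

```python
# Minimal sketch of a Kafka ingestion step with a basic data-quality gate,
# in the spirit of the posting's Kafka + data-quality responsibilities.
# Broker address, topic, and schema fields are illustrative assumptions.
import json
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",  # hypothetical broker
    "group.id": "orders-quality-check",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["orders"])

REQUIRED_FIELDS = {"order_id", "customer_id", "amount"}

try:
    while True:
        msg = consumer.poll(timeout=1.0)
        if msg is None:
            continue
        if msg.error():
            print("consumer error:", msg.error())
            continue
        record = json.loads(msg.value())
        # Reject records missing mandatory fields before they reach
        # downstream DBT models; a production pipeline would route these
        # to a dead-letter topic and an audit log instead of printing.
        missing = REQUIRED_FIELDS - record.keys()
        if missing:
            print(f"rejected offset {msg.offset()}: missing {missing}")
            continue
        print("accepted:", record["order_id"])
finally:
    consumer.close()
```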
Nantes, France
Hybrid
01-12-2025