Vallum Associates

Data Architect

Hybrid

London, United Kingdom

Freelance

17-09-2025


Skills

Python, Apache Airflow, Neo4j, Problem-solving, Programming, Databases, Analytical Skills, Data Science

Job Specifications

Responsibilities:

* Design, develop, and maintain scalable data pipelines using SPARQL and Python to extract, transform, and load data into graph databases (see the sketch after this list).

* Create and optimize complex SPARQL queries to retrieve and analyze data from graph databases.

* Develop graph-based applications and models to solve real-world problems and extract valuable insights from data.

* Collaborate with data scientists and analysts to understand their data requirements and translate them into effective data pipelines and models.

* Ensure data quality and integrity throughout the data pipeline process.

* Stay up-to-date with the latest advancements in graph databases, data modeling, and programming languages.
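
As an illustration of the SPARQL-and-Python pipeline work described above, here is a minimal, hedged sketch using rdflib. The sample triples, prefixes, and query are placeholders for illustration only, not part of this role's actual stack.

```python
# Minimal sketch: run a SPARQL SELECT over an in-memory RDF graph with rdflib.
# The example data and query are illustrative placeholders.
from rdflib import Graph

TURTLE = """
@prefix ex: <http://example.org/> .
ex:alice ex:worksOn ex:pipelineA .
ex:bob   ex:worksOn ex:pipelineB .
"""

QUERY = """
PREFIX ex: <http://example.org/>
SELECT ?person ?pipeline
WHERE { ?person ex:worksOn ?pipeline . }
"""

def main():
    g = Graph()
    g.parse(data=TURTLE, format="turtle")   # extract: load source triples
    for row in g.query(QUERY):              # transform/analyse: SPARQL SELECT
        print(row.person, row.pipeline)     # load/inspect: hand rows downstream

if __name__ == "__main__":
    main()
```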

Qualifications:

* Bachelor's degree in Computer Science, Data Science, or a related field.

* Proven experience with SPARQL and Python programming languages.

* Strong understanding of graph data models and databases (e.g., RDF, Neo4j, GraphDB).

* Experience with data modeling and schema design.

* Knowledge of data pipeline tools and frameworks (e.g., Apache Airflow, Luigi); a minimal Airflow sketch follows this list.

* Excellent problem-solving and analytical skills.

* Ability to work independently and as part of a team.

* Clinical knowledge.
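
For the pipeline tooling named above, here is a minimal sketch of an Apache Airflow DAG that wires an extract step ahead of a graph-load step. It assumes Airflow 2.4+; the dag_id, task callables, and schedule are illustrative placeholders only.

```python
# Minimal sketch of an Airflow DAG for a graph ETL pipeline (assumes Airflow 2.4+).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_triples():
    # Placeholder: pull source records (e.g. via a SPARQL query) here.
    print("extracting source data")

def load_graph():
    # Placeholder: write transformed triples/nodes into the graph store here.
    print("loading into graph database")

with DAG(
    dag_id="graph_etl_example",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_triples)
    load = PythonOperator(task_id="load", python_callable=load_graph)
    extract >> load  # run extract before load
```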

About the Company

Vallum Associates offers best-in-class talent acquisition on a contingency, retained, or project basis. Through our dedicated sector consultants, our specialised brands have the knowledge and connections to provide tailored hiring and project services across industries:

* Banking & Financial

* Energy, Utilities & Commodities

* Engineering & Renewable

* Insurance Services

Our specialised industry and sector-specific consultants are able to offer a personalised experience to fit your needs. Our unique associate consultativ...