Sedra Solutions

www.sedrasolutions.com

1 Job

6 Employees

About the Company

Founded in 2022, Sedra Solutions is revolutionizing forensic accounting for businesses and governments with a software platform driven by advanced algorithms, AI, and machine learning. Specializing in detecting and documenting financial fraud, Sedra delivers significant time and cost savings and aims to arm organizations with cutting-edge analytic tools for tackling fraud efficiently. Sedra's commitment to innovation keeps its software at the forefront of technology, continually evolving to meet the challenges of financial analysis. As a new force in forensic accounting technology, Sedra is a crucial ally for organizations seeking to protect their financial interests, heralding a more secure and transparent financial future.

Listed Jobs

Company Name
Sedra Solutions
Job Title
Senior Data Engineer
Job Description
**Job Title:** Senior Data Engineer

**Role Summary:** Own end-to-end data engineering within a GCP-first platform, designing scalable pipelines, data models, and quality systems that support analytics, product, and ML workloads.

**Expectations:** Deliver robust, cost-efficient data solutions, drive architecture decisions, enforce best practices, collaborate cross-functionally, and uphold security and compliance standards in a fast-moving startup.

**Key Responsibilities:**
- Design, build, and operate scalable data pipelines, primarily on Dataflow and other GCP services.
- Own ingestion, transformation, and modeling of data for analytics and product needs.
- Define clear data contracts with engineering and product teams.
- Deliver reliable, well-modeled datasets in BigQuery, Elasticsearch, and Neo4j.
- Implement data quality checks, monitoring, and alerting.
- Optimize pipelines for cost, performance, and reliability as usage grows.
- Establish best practices, patterns, and tooling for the data engineering organization.

**Required Skills:**
- 5+ years in data, backend, or platform engineering.
- Strong proficiency in SQL and in Python and/or Scala.
- Proven experience building production data pipelines.
- Solid knowledge of ETL patterns and data modeling.
- Experience with GCP services (Dataflow, BigQuery, Pub/Sub, etc.).
- Familiarity with workflow orchestration tools (Airflow, Dagster, etc.).
- Understanding of information security principles, ISMS policies, and cloud security best practices.
- Excellent communication and cross-functional collaboration.
- Comfortable in an ambiguous, fast-moving startup environment.

**Preferred Qualifications:**
- Apache Beam/Scio experience.
- Experience supporting analytics, experimentation, or ML workloads.
- Experience with data quality, observability, or lineage tools.
- Prior ownership of data systems in a small team.
- Domain knowledge in legal technology or forensic accounting (bonus).

**Required Education & Certifications:**
- Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent work experience).
- GCP Professional Data Engineer certification (preferred).
New York, United States
Hybrid
Senior
17-02-2026