The Trade Desk

www.thetradedesk.com

4 Jobs

4,169 Employees

About the Company

A media buying platform built for what matters.

Trusted journalism. Premium streaming TV. One-click commerce. All these amazing online experiences thrive on an open internet. And they’re all fueled by relevant advertising.

That’s why we created an independent media buying platform designed for the open internet. One that helps marketers reach more customers in more places, with more transparency and choice at every stage — from planning to pricing to performance.

At its best, the internet is an open marketplace of ideas, content, and commerce. And we intend to keep it that way. Because the open internet matters.

Listed Jobs

Company Name
The Trade Desk
Job Title
Summer 2026 Toronto Software Engineering Internship
Job Description
Job Title: Software Engineering Internship – Summer 2026

Role Summary: An entry-level software engineering internship focused on owning, designing, and shipping a complete feature or project within a global technology environment. Interns receive dedicated mentorship, collaborate with distributed teams, and work across cutting-edge domains such as distributed systems, large-scale data processing, machine learning, and user interface design.

Expectations:
- Own and deliver a full project lifecycle from requirements through production release.
- Demonstrate initiative, self-directed learning, and responsible decision-making.
- Work independently while actively seeking and applying mentorship feedback.
- Attend in-office sessions 3 days per week (Tuesday, Wednesday, Thursday).

Key Responsibilities:
- Design, develop, and maintain scalable software components using a modern object-oriented language.
- Implement efficient algorithms and data structures to address business challenges.
- Collaborate with engineering, product, and data science teams across multiple time zones.
- Participate in code reviews, unit testing, and continuous integration pipelines.
- Contribute to documentation and knowledge sharing within the team.
- Apply best practices for performance, reliability, and security in distributed environments.

Required Skills:
- Proficiency in at least one modern object-oriented programming language (e.g., Java, C++, Python, C#).
- Strong grasp of fundamental data structures, algorithms, and complexity analysis.
- Ability to learn new technologies quickly and independently.
- Effective communication skills in English, both written and verbal.
- Experience with version control (Git) and collaborative development workflows is a plus.

Required Education & Certifications:
- Current enrollment in a Bachelor's or Master's degree program in Computer Science, Software Engineering, or a related field.
- Expected graduation between Autumn 2026 and Summer 2027.
- No specific certifications required; however, knowledge of distributed systems, cloud platforms, or machine learning is advantageous.
Toronto, Canada
On site
09-12-2025
Company Name
The Trade Desk
Job Title
Senior Software Engineer - Geo Targeting
Job Description
**Job Title** Senior Software Engineer – Geo Targeting

**Role Summary** Lead end-to-end design, development, and delivery of large-scale distributed systems for geo-targeting solutions. Own product strategy, architecture, and API design while mentoring engineers across cross-functional, global squads. Drive scalable, high-performance services that handle massive data volumes and real-time decisioning for advertising campaigns.

**Expectations**
- Own feature development from concept through production and support.
- Collaborate effectively with distributed teams across multiple time zones.
- Communicate complex technical concepts to both technical and non-technical stakeholders.
- Demonstrate strong problem-solving skills and rapid learning of new technologies.

**Key Responsibilities**
- Design, implement, and maintain distributed services in cloud and on-prem data centers.
- Build and evolve APIs and web services (C#, .NET, Java, JavaScript/TypeScript).
- Process petabyte-scale data using Spark and Scala; integrate Python for data pipelines as needed.
- Ensure system scalability, reliability, and performance at global scale.
- Write clean, testable code; perform code reviews and enforce best practices.
- Mentor junior engineers and share knowledge across the team.
- Participate in agile ceremonies and contribute to continuous improvement.

**Required Skills**
- Extensive experience with distributed systems and large-scale data processing.
- Proficiency in C#, .NET, and Java (web and API development).
- Strong JavaScript/TypeScript coding skills; familiarity with modern frontend frameworks.
- Hands-on experience with Apache Spark and Scala; Python experience is a plus.
- Comfortable with Visual Studio, VS Code, or Rider, and with Git version control.
- Solid computer science fundamentals (algorithms, data structures, concurrency).
- Excellent communication and collaboration skills in a global environment.

**Required Education & Certifications**
- Bachelor's degree in Computer Science, Software Engineering, or a related technical field (or equivalent experience).
- Certifications in distributed systems, cloud platforms, or big-data technologies are advantageous but not mandatory.

An illustrative geo-targeting code sketch follows this listing.
Seattle, United States
On site
Senior
19-12-2025
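The geo-targeting listing above describes real-time decisioning for advertising campaigns without specifying an implementation. As a minimal, purely illustrative sketch (not The Trade Desk's actual system), the Python snippet below shows one common primitive in this space: a haversine radius check that decides whether a request's geolocation falls inside a campaign's targeted area. The `GeoTarget` class, coordinates, and radius are hypothetical.

```python
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in kilometers between two (lat, lon) points."""
    dlat = radians(lat2 - lat1)
    dlon = radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

@dataclass
class GeoTarget:
    """A hypothetical campaign targeting rule: a center point plus a radius."""
    name: str
    lat: float
    lon: float
    radius_km: float

def matches_geo_target(user_lat: float, user_lon: float, target: GeoTarget) -> bool:
    """Return True if the request's location falls inside the targeted radius."""
    return haversine_km(user_lat, user_lon, target.lat, target.lon) <= target.radius_km

if __name__ == "__main__":
    seattle_metro = GeoTarget(name="seattle-metro", lat=47.6062, lon=-122.3321, radius_km=40.0)
    # A request geolocated to Bellevue, WA should match; one from Portland, OR should not.
    print(matches_geo_target(47.6101, -122.2015, seattle_metro))  # True
    print(matches_geo_target(45.5152, -122.6784, seattle_metro))  # False
```

In production, such checks are typically backed by spatial indexes or precomputed geo hierarchies rather than per-request trigonometry; the sketch only illustrates the targeting decision itself.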
Company Name
The Trade Desk
Job Title
Senior Software Engineer - Data Transparency
Job Description
**Job Title:** Senior Software Engineer – Data Transparency

**Role Summary:** Full-stack owner of data-platform infrastructure, responsible for designing, building, and delivering scalable, high-quality data pipelines and foundational datasets that serve internal tools and external partners. Works across data engineering, analytics, solution architecture, and business intelligence to enable trustworthy, compliant data usage.

**Expectations:**
- Lead engineering efforts, mentor a diverse team, and drive systematic improvements at scale.
- Own the data lifecycle from acquisition to production exposure, ensuring quality, lineage, and compliance for all datasets.
- Foster a culture of continuous learning and innovation, partnering with Legal, Security, and Governance to uphold data stewardship standards.

**Key Responsibilities:**
- Architect and implement large-scale data pipelines (Spark, Airflow) for high-velocity, petabyte-scale workloads.
- Design core, reusable datasets that eliminate redundant downstream transformations.
- Establish and maintain metadata, data lineage, observability, and quality checks throughout the ecosystem.
- Build privacy-centric data exports and processing layers for external stakeholders.
- Collaborate with cross-functional teams to translate business requirements into reliable, maintainable data solutions.
- Monitor, troubleshoot, and optimize system performance and resource utilization.

**Required Skills:**
- Strong communication and stakeholder-management abilities.
- Expert in Spark for data processing and performance tuning.
- Experience constructing complex, production-grade pipelines in Airflow (a plus).
- Proficient in data architecture, quality, observability, and lineage frameworks.
- Knowledge of data compliance, privacy regulations, and security best practices.
- Proven track record of leading engineers across varied experience levels.

**Required Education & Certifications:**
- Bachelor's or Master's degree in Computer Science, Engineering, or a related technical discipline.
- No mandatory certifications, but familiarity with Big Data or Cloud data services is a plus.

An illustrative code sketch of a quality-checked, privacy-centric pipeline follows this listing.
San Jose, United States
On site
Senior
26-12-2025
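The data-transparency listing above asks for Spark pipelines with quality checks, observability, and privacy-centric exports. The PySpark sketch below is a minimal, hypothetical illustration of those ideas under assumed inputs: the S3 paths, column names, and SHA-256 pseudonymization rule are invented for the example and are not The Trade Desk's schema or policy.

```python
# A minimal sketch, assuming PySpark is available; paths and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("privacy-centric-export-sketch").getOrCreate()

# Load a raw event dataset (hypothetical path and columns).
events = spark.read.parquet("s3://example-bucket/raw/ad_events/")

# Basic quality checks: required columns must be non-null and event_time must parse.
required = ["event_id", "user_id", "event_time", "campaign_id"]
clean = events.dropna(subset=required).filter(F.to_timestamp("event_time").isNotNull())

# Simple observability metric: share of rows dropped by the checks.
total, kept = events.count(), clean.count()
print(f"quality filter kept {kept}/{total} rows")

# Privacy-centric export: pseudonymize the user identifier and keep only the
# columns an external partner needs.
export = (
    clean
    .withColumn("user_id_hash", F.sha2(F.col("user_id").cast("string"), 256))
    .select("event_id", "user_id_hash", "event_time", "campaign_id")
)
export.write.mode("overwrite").parquet("s3://example-bucket/exports/partner_feed/")
```

A real pipeline would emit the quality metric to a monitoring system and record lineage metadata rather than printing to stdout; the sketch only shows the shape of the transformations.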
Company Name
The Trade Desk
Job Title
Senior Data Scientist - Measurement
Job Description
Job Title: Senior Data Scientist – Measurement

Role Summary: Lead end-to-end data science initiatives that quantify the causal lift of advertising on the platform, building and deploying statistical and machine learning models to demonstrate incremental impact for internal and external clients.

Expectations: Own the full research-to-production cycle of causal measurement projects, collaborate with engineering, product, and stakeholders, communicate insights effectively, and scale solutions on distributed clusters.

Key Responsibilities:
- Design, develop, and validate causal inference and lift-measurement models to assess incremental ad impact.
- Build and maintain ETL pipelines to aggregate data from diverse sources for model training.
- Plan, execute, and analyze A/B or randomized experiments in production environments.
- Translate analytical findings into product requirements and deployment plans.
- Integrate models into production systems and monitor their performance.
- Present results and recommendations to cross-functional teams in clear, actionable formats.
- Stay abreast of emerging techniques in causal inference and share best practices.

Required Skills:
- Advanced proficiency in Python (pandas, scikit-learn, statsmodels, etc.).
- Deep knowledge of statistical modeling, machine learning, and causal inference.
- Proven experience designing and running production-level experiments.
- Ability to scale data workflows on distributed platforms (Spark, EMR, Databricks).
- Strong data engineering fundamentals: ETL, data pipelines, and big-data tools.
- Excellent communication and stakeholder-management skills.
- Rapid learning and problem-solving aptitude for new tools and technologies.

Required Education & Certifications:
- Bachelor's or Master's in Statistics, Computer Science, Applied Mathematics, or a related field (Ph.D. preferred).
- Minimum 4 years of data science/ML experience with end-to-end product delivery (or 2 years of industry experience for Ph.D. holders).
- Preferred: experience in programmatic advertising and causal lift measurement.

An illustrative lift-measurement code sketch follows this listing.
Toronto, Canada
On site
Senior
09-02-2026
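The measurement listing above centers on causal lift from randomized experiments. As a hedged illustration of the basic readout rather than the team's actual methodology, the Python sketch below simulates an exposed/control holdout and estimates absolute and relative lift with a two-sample proportions z-test from statsmodels; the conversion rates and sample sizes are made up.

```python
# A minimal sketch of a randomized lift (incrementality) readout, assuming a simple
# holdout design: users are randomly split into exposed and control groups and we
# compare conversion rates. All numbers below are simulated for illustration.
import numpy as np
from statsmodels.stats.proportion import proportions_ztest

rng = np.random.default_rng(seed=7)

# Simulated experiment: control converts at 2.0%, exposed at 2.4%.
n_control, n_exposed = 200_000, 200_000
conv_control = rng.binomial(n_control, 0.020)
conv_exposed = rng.binomial(n_exposed, 0.024)

rate_control = conv_control / n_control
rate_exposed = conv_exposed / n_exposed

# Absolute and relative lift attributable to exposure (valid under randomization).
abs_lift = rate_exposed - rate_control
rel_lift = abs_lift / rate_control

# Two-sample z-test for the difference in conversion rates.
z_stat, p_value = proportions_ztest(
    count=[conv_exposed, conv_control],
    nobs=[n_exposed, n_control],
)

print(f"control rate:  {rate_control:.4%}")
print(f"exposed rate:  {rate_exposed:.4%}")
print(f"absolute lift: {abs_lift:.4%}  relative lift: {rel_lift:.1%}  p={p_value:.3g}")
```

Production lift studies typically add refinements such as intention-to-treat analysis, placebo or ghost-bid control groups, and variance reduction, none of which are shown in this sketch.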