Mid-Level

Data Engineer, Revenue Technology


The New York Times

Compensation

$110,000 - $130,000/year

New York, NY
Hybrid
Posted March 24, 2026

Job Description

The mission of The New York Times is to seek the truth and help people understand the world. That means independent journalism is at the heart of all we do as a company. It’s why we have a world-renowned newsroom that sends journalists to report on the ground from nearly 160 countries. It’s why we focus deeply on how our readers will experience our journalism, from print to audio to a world-class digital and app destination. And it’s why our business strategy centers on making journalism so good that it’s worth paying for. 

About the Role:

The Data Engineer will be part of the Partner Revenue team, which develops and deploys data platforms for advanced analytics and data processing. The Data Engineer defines and builds the data pipelines that promote faster, better data-informed decision-making within the business.

This is a hybrid role requiring three days a week in the office. You will report to the Director of Partner Revenue Technology.

Responsibilities:

  • You will develop scalable data and ETL solutions
  • You will develop scalable data platform components
  • You will support existing platforms and implement data pipelines that ingest, transform, and curate data
  • You will support any future migrations to new platforms using cloud-based technologies
  • You will demonstrate support for and understanding of our value of journalistic independence and a commitment to our mission to seek the truth and help people understand the world

Basic Qualifications:

  • 2+ years of experience with SQL for querying and data modeling
  • 2+ years of experience with ETL frameworks or tools, preferably Pentaho Data Integration or similar
  • Demonstrated programming skills in Python, shell scripting, or a similar language

Preferred Qualifications:

  • Knowledge of data modeling and data warehouse concepts, with the technical ability to develop and debug complex code on AWS/GCP cloud infrastructure
  • Experience with ETL tools, shell scripting, SQL, Python, Java, Airflow, and dbt, along with an understanding of data flows, data warehouse architecture, ETL concepts, and the processing of structured and unstructured data
  • Proven knowledge of CI/CD practices and version control (Git)
  • Demonstrated ability to code in VS Code or a similar IDE, including the use of AI coding assistants such as Cursor and GitHub Copilot