
Senior Data Engineer

Too Good To Go

København, Hovedstaden, Denmark
Hybrid
Posted April 9, 2026

Job Description

At Too Good To Go, we have an ambitious goal: to inspire and empower everyone to fight food waste together.

40% of all food produced in the world is wasted. And that has a huge impact on the health of our planet, with 10% of greenhouse gas emissions coming from food waste. 

We’re more than an app: we are a certified B Corporation with a mission to empower everyone to take action against food waste. Alongside our marketplace app, we create educational tools, explore new business solutions such as our Retail Technologies offering, and influence legislation to help reduce food waste.

We’re growing fast: our community of 133 million registered users and 261,000 active partners across 21 countries has together already prevented 517+ million meals from going to waste, avoiding over 1,397,000 tonnes of CO2e!

The Role

Too Good To Go is looking for a Senior Data Engineer (mid-level candidates will also be considered) to further enhance our existing data platform. You will be part of our data platform team and based at our HQ in Copenhagen.

You get to bring your own opinions and experience when it comes to the technology stack. We’ve kept the legacy systems to a minimum, so there is room to adapt and optimize.

Tech Stack

Our data stack consists of Airflow, Python, dlt, AWS Redshift, Lambda, Kinesis, S3, Glue, dbt, and Metaflow, and we use Terraform to manage our infrastructure following GitOps principles. We have access to a number of AI tools to boost development productivity.

Responsibilities 

  • Design, build, and maintain a data platform and data pipelines from end-to-end, ensuring data accuracy, availability, and quality for the Tech, Product and Analytics teams
  • Collaborate closely with software engineers and analytics engineers to understand data requirements, develop data models, and optimize data pipelines for advanced analytics and machine learning use cases
  • Develop and maintain scalable, efficient, and reliable ETL processes, using best practices for data ingestion, storage, and processing
  • Work with stakeholders to identify and prioritize analytics requirements
  • Proactively monitor data pipelines, troubleshoot, and resolve data-related issues
  • Contribute to the continuous improvement of data engineering practices, including documentation, code reviews, and knowledge sharing

Desired Experience

  • Proven experience in a similar role, working both autonomously and collaboratively.
  • Experience with infrastructure-as-code, ideally with Terraform.
  • A background in building scalable and data-intensive applications on cloud infrastructure.
  • Fluency in Python and SQL, and an understanding of the internals of modern data warehouses.
  • Experience with CI/CD tools (e.g., GitHub Actions) and Docker (ideally on Kubernetes).
  • You have worked with Airflow or similar orchestration tools.
  • Experience applying DevOps principles to data systems (DataOps).
  • Exposure to ML platforms.
  • Experience with event stream processing.
  • Significant knowledge of AWS services (or other cloud providers), particularly those related to data storage, processing, and analytics (e.g., S3, Kinesis, Lambda, Redshift, OpenSearch, RDS, Glue).
  • Some experience with modern data architecture.

Our values

  • We win together: Food waste is a big beast to fight. We believe in a #oneteam.
  • We raise the bar: We always push for more. We work smart, smash barriers and elevate one another.
  • We keep it simple: Our ambitions are bold but our solutions are simple.
  • We build a legacy: We’re proud of the change we’re driving. 
  • We care: We always look out for each other. Caring is also about the way we do business. We do the right thing.

What We Have To Offer

  • An opportunity to work in a global social-impact company and certified B Corporation, where you can see a real and tangible impact in your role.
  • The chance to be an integral member of one of our defined product teams. We are eager for you to make an impact and contribute to the product scope and development; your insights are valuable, and we are here to listen.