Internship

Data Engineer Intern (Remote)

ezCater

Compensation

$38/hr

Boston, MA
Hybrid
Posted March 6, 2026

Job Description

ezCater is the leading food-for-work technology company in the US, connecting anyone who needs food for their workplace to over 100,000 restaurants nationwide. For workplaces, ezCater provides flexible and scalable solutions for everything from recurring employee meals to one-off meetings, all backed by 24/7 customer service with real humans. ezCater also enables companies to manage their food spend in a single, customizable platform. For restaurant partners, ezCater helps them grow their business by bringing them more orders and new high-value customers. We're backed by top investors including Insight, Iconiq, Lightspeed, GIC, SoftBank, and Quadrille.

The Data Engineering team at ezCater sits within our broader Data Technology organization and partners with stakeholders across the company to power data-informed decisions. You will join a group of data engineers, data product managers, principal and staff engineers, and business analysts who design and build high-quality data models and pipelines, backed by robust testing and monitoring, to deliver reliable data products for the business. As a Data Engineer Intern, you will be embedded on a real project, working with modern tools in our data stack to solve complex data problems, collaborate with partners across functions, and ship work that is used in production and creates measurable value for ezCater.

Internship dates: June 1, 2026 - August 14, 2026

What You'll Do:

  • Provide timely, well-documented responses to data questions from stakeholders across Engineering, Marketing, Business, Data Science, and Analytics, escalating when appropriate and keeping requesters informed.
  • Take on well-scoped, small data modeling requests in our warehouse: update or add dbt models and tests, write clear model documentation, and open PRs with thoughtful descriptions and checks.
  • Support the health of our data platform by pairing with the on-call engineer to monitor job runs and alerts, triage issues, and execute playbooks; escalate quickly when outside scope.
  • Proactively monitor nightly and hourly pipelines (e.g., Airflow DAGs, dbt Cloud runs, AWS DMS replication tasks) and remediate common failures (retries, backfills, simple config fixes) under guidance.
  • Help maintain data reliability by responding to Monte Carlo observability alerts, validating anomalies, and creating or refining data quality tests.
  • Partner with engineers to simplify and automate routine remediation steps; contribute small improvements that make our pipelines more robust and observable.
  • Document what you learn: improve runbooks, add troubleshooting steps to our internal knowledge base, and keep tickets updated with clear status and outcomes.

What You Have

  • Proficiency in SQL and comfort working with large datasets; basic scripting in Python is a plus.
  • Foundational understanding of data warehousing.