Senior Data Engineer

Agility Robotics
Compensation

$175,000 - $273,000/year

Hybrid - Any Office (Fremont, CA; Salem, OR; or Pittsburgh, PA)
Posted March 30, 2026

Job Description

Agility’s commercially deployed humanoids operate alongside teams in warehouses, manufacturing facilities, and distribution centers—tackling physically demanding and repetitive tasks while enabling workers to focus on higher-value work. With industry-leading safety standards and years of proven deployment data, we're pioneering a new era of automation that enhances human potential.

About The Team

Agility Robotics is building the future of work through humanoid robots that operate in human environments. The Data Platform team builds the data infrastructure that powers everything from fleet operations and hardware reliability to business analytics and machine learning. We enable engineers across robotics, perception, and product teams to derive insight from the vast quantities of telemetry and log data generated by our robots in the field.

About The Role

We are looking for a Senior Data Engineer to join our Data Platform team and help shape the foundation of data-driven operations at Agility. In this role, you’ll work closely with robot software and hardware teams (among others) to design, curate, and maintain high-quality datasets that enable analytics, debugging, and fleet-scale insights.

You’ll bridge the gap between raw robot data and actionable information, working on both on-robot data generation and cloud ingestion and processing pipelines. You’ll design transformations, author pipelines, and collaborate across teams to deliver reliable, queryable data products for hardware reliability, system health, workflow metrics, and root cause analysis.

What You’ll Do

  • Collaborate with robot software and hardware teams to define, collect, and curate data needed for analytics and debugging.
  • Develop and maintain ETL pipelines that transform raw robot logs and telemetry into structured datasets using Spark, Airflow (or equivalent orchestration tools), and AWS data services.
  • Contribute to on-robot data production workflows to ensure high-fidelity, well-structured data capture.
  • Design derived datasets and transformations across Avro, Parquet, and other sensor data formats to power fleet operations, reliability analysis, and business metrics.
  • Implement data quality checks, schema evolution, and metadata management practices using our internal Data Registry and cataloging systems.
  • Work closely with the ingestion and storage services that move robot data into the cloud (S3-based data lake).
  • Collaborate with internal consumers, including reliability, analytics, and ML teams, to design efficient data models for their workflows.
  • Occasionally contribute to shared libraries or APIs in Python, Java, or C++ to support data capture and consumption.

What We’re Looking For

Required:

  • 5+ years of experience as a Data Engineer or similar role building and maintaining production data pipelines.
  • Strong proficiency in Apache Spark or equivalent distributed data processing frameworks.
  • Experience