Google Cloud Data Engineer
NMI
Job Description
We are seeking a skilled Mid-Level Data Engineer to join our Data Platform team. At this level, you will be a hands-on executor and domain owner — responsible for building, maintaining, and improving the pipelines and data models that power analytics and business intelligence across the company. You will own specific areas of our BigQuery data warehouse end-to-end, delivering reliable data products within a framework set by senior and staff engineers.
This is not an architecture or strategy role — it is a role for someone who takes well-defined problems and executes them with high craft and reliability. You will work closely with data analysts, analytics engineers, and product teams, and are expected to grow toward greater technical ownership over time.
Key Duties
- Build and maintain production-grade ELT pipelines that ingest data from internal applications, third-party SaaS tools, and event streams into our BigQuery data warehouse.
- Own specific data domains end-to-end — from raw ingestion through to marts — ensuring your areas of the warehouse are accurate, tested, and well-documented.
- Write and maintain dbt models, tests, macros, and documentation within our established dbt project conventions and code review process.
- Develop and manage Airflow DAGs on Cloud Composer (or a comparable orchestration service) to orchestrate data workflows, following patterns and standards set by the team.
- Implement data quality checks and monitoring to catch anomalies before they reach downstream consumers.
- Optimize BigQuery queries and models for cost and performance within your domain, escalating architectural tradeoffs to senior engineers when appropriate.
- Collaborate with analysts and stakeholders to translate business data needs into well-scoped pipeline and modeling tasks.
- Participate in on-call rotations, respond to pipeline incidents, and write clear postmortems.
- Contribute to team documentation and runbooks so that your work is maintainable by others.
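To illustrate the kind of data quality check described in the duties above, here is a minimal sketch in Python. The function name, thresholds, and numbers are hypothetical illustrations, not part of any actual codebase referenced in this posting.

```python
# Hypothetical sketch of a simple volume-anomaly check that a pipeline
# might run before publishing a table to downstream consumers.

def row_count_anomaly(todays_rows: int, recent_counts: list[int],
                      tolerance: float = 0.5) -> bool:
    """Return True if today's row count deviates from the recent
    average by more than `tolerance` (a fraction, e.g. 0.5 = 50%)."""
    if not recent_counts:
        return False  # no history yet; nothing to compare against
    baseline = sum(recent_counts) / len(recent_counts)
    if baseline == 0:
        return todays_rows != 0
    return abs(todays_rows - baseline) / baseline > tolerance

# A sudden drop to 100 rows against a ~1000-row baseline is flagged,
# while normal fluctuation passes.
print(row_count_anomaly(100, [950, 1000, 1050]))  # → True (anomaly)
print(row_count_anomaly(980, [950, 1000, 1050]))  # → False (normal)
```

In practice a check like this would run as a pipeline task (or a dbt test) so that anomalies block publication rather than reaching dashboards.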
Skills and Experience
Required:
- 3–5 years of experience in data engineering or a closely related data infrastructure role.
- Proven experience implementing and operating scalable data pipelines and warehouse data models in production.
- Strong expertise in Google Cloud Platform (BigQuery, Cloud Storage, Cloud Composer, Pub/Sub, Dataflow).
- Hands-on experience with dbt (data build tool) — models, tests, macros, sources, and documentation — at production scale.
- Experience building and maintaining data pipelines with Apache Airflow or a comparable workflow orchestration tool.
- Strong proficiency in SQL, including advanced BigQuery SQL (window functions, partitioning, clustering, query optimization).
- Proficiency in Python for data engineering tasks, including API integrations, data processing scripts, and custom operators.
- Familiarity with data modeling concepts: star schema, dimensional modeling, slowly changing dimensions (SCD).
- Experience with version control (Git) and collaborative development workflows (pull requests, code review).
- Understanding of data quality, lineage, and observability best practices.
- Startup or growth-stage mindset — comfortable with ambiguity, rapid iteration, and evolving priorities.
- Excellent communication skills, with the ability to collaborate effectively across technical and non-technical teams.
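To illustrate the slowly changing dimension (SCD) concept listed in the modeling requirements above, here is a minimal Type 2 sketch in plain Python. The record shape and field names are illustrative assumptions, not this team's schema; in a real warehouse this logic would typically live in a dbt snapshot or a MERGE statement.

```python
from datetime import date

# Hypothetical SCD Type 2 update: when a tracked attribute changes,
# close out the current dimension row and append a new one, preserving
# full history instead of overwriting in place.

def scd2_upsert(history: list[dict], key: str, new_value: str,
                as_of: date) -> list[dict]:
    """Apply a Type 2 change for `key`: expire the open row (if its
    value differs) and append a new current row."""
    for row in history:
        if row["key"] == key and row["valid_to"] is None:
            if row["value"] == new_value:
                return history  # no change; nothing to do
            row["valid_to"] = as_of  # close the old version
    history.append({"key": key, "value": new_value,
                    "valid_from": as_of, "valid_to": None})
    return history

dim = [{"key": "cust_1", "value": "Gold",
        "valid_from": date(2024, 1, 1), "valid_to": None}]
dim = scd2_upsert(dim, "cust_1", "Platinum", date(2024, 6, 1))
# The dimension now holds two rows: the expired "Gold" row and the
# current "Platinum" row with an open valid_to.
```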
Preferred:
- Experience with Terraform or similar infrastructure-as-code tools for managing cloud data infrastructure.
- Familiarity with streaming technologies such as GCP Pub/Sub, Dataflow, or Apache Kafka.
- Knowledge of Looker, Tableau, or other BI tools and how data models power them.
- Google Cloud Professional Data Engineer certification.
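Partitioning and query-cost awareness appear in the BigQuery skills above; a toy Python model shows why a filter on the partition column cuts scan cost. All table sizes here are made-up numbers for illustration only.

```python
# Toy illustration of BigQuery partition pruning: a filter on the
# partition column restricts the scan to matching partitions instead
# of the whole table, and BigQuery bills by bytes scanned.

# Hypothetical table: bytes stored per daily partition (2 GB/day).
partitions = {f"2024-06-{d:02d}": 2_000_000_000 for d in range(1, 31)}

def bytes_scanned(partitions, date_filter=None):
    """Bytes a query would scan: every partition when there is no
    filter on the partition column, only the matching ones with it."""
    if date_filter is None:
        return sum(partitions.values())  # full-table scan
    return sum(size for day, size in partitions.items()
               if day in date_filter)

full = bytes_scanned(partitions)
pruned = bytes_scanned(partitions, {"2024-06-15"})
print(f"full scan: {full:,} bytes; pruned: {pruned:,} bytes")
```

The same reasoning motivates clustering: within each partition, clustered columns further limit the blocks BigQuery has to read.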
Why Join Us
- Work on a modern, best-in-class GCP and BigQuery data stack with a high-performing team.
- Influence data platform architecture decisions and grow into a senior or staff engineering role.
- Competitive compensation, equity, and benefits with a culture that values engineering craft and continuous learning.