Senior Analytics Engineer, Compensation Analytics
Klaviyo
Compensation
$92,000 - $138,000/year
Job Description
At Klaviyo, we value the unique backgrounds, experiences and perspectives each Klaviyo (we call ourselves Klaviyos) brings to our workplace each and every day. We believe everyone deserves a fair shot at success and appreciate the experiences each person brings beyond the traditional job requirements. If you’re a close but not exact match with the description, we hope you’ll still consider applying. Want to learn more about life at Klaviyo? Visit careers.klaviyo.com to see how we empower creators to own their own destiny.
About the team and role
Data is at the heart of every decision made at Klaviyo, and we’re looking for a Business Intelligence Data Engineer to join our Go To Market (GTM) team supporting Compensation Analytics. This data domain aims to improve the experience of all Klaviyos’ variable compensation plans. Secondarily, the role will support ancillary functions of the Professional Services organization. This role sits in Data Engineering as part of the GTM team, which operates within a hub-and-spoke model of analytics engineering at Klaviyo.
You’ll build and steward the source of truth for Compensation data so People leaders and analysts can answer compensation questions quickly and confidently, and turn those insights into a more incentivized, higher-performing organization. You will directly support the compensation analytics teams at Klaviyo, working cross-functionally with Systems, Payroll, Audits, Planning, and People Operations.
You will be an independent, self-sufficient, embedded partner to all of GTM leadership where variable compensation plans exist, capable of translating ambiguous requirements into stable data products. You’ll be supported by the broader Data Engineering organization’s standards, tooling, and review practices.
How you’ll make a difference
- Deliver a compensation‑data single source of truth (SSoT) that drives a better employee experience, is SOX compliant, and is fully auditable by both internal and third-party organizations. Maintain and stand up curated, documented marts that make it easy to monitor attainment health, quota setting, audits, and cycles, so that leadership can minimize time in front of compensation boards and focus on their organizations.
- Own the pipelines & models end‑to‑end. Build and maintain reliable integrations from core compensation systems (e.g., SPM/ICM/CRM), model them in dbt, and publish governed marts and reverse‑ETLs to operational destinations where they create value.
- Create attainment views with Compensation and People Analytics. Partner with analysts to build quota→BoB management→attainment→submission→booking lifecycle views of quarterly compensation, focusing on faster booking cadences and dynamic reconciliation processes.
- Raise the bar on data reliability and governance. Instrument monitoring and alerting, tests (freshness/volume/constraints), and documentation so the compensation data ecosystem is discoverable, auditable, and self-serve.
- Operate as a trusted partner to leadership. Work directly with Operations and Compensation leadership to scope problems, clarify trade‑offs, and communicate technical concepts in exec‑ready language.
- Transform workflows by putting AI at the center, building smarter systems and ways of working from the ground up.
What you’ll do (responsibilities)
- Integrations & ingestion: Own secure ingestion from HRIS/ATS/comp/performance systems into Snowflake; define SLAs/SLOs; implement monitoring & alerting for each feed.
- Modeling & marts: Design dimensional/entity models (dbt) for employees, positions, org structure, requisitions/offers, performance/promo history, compensation/equity, and movement; publish curated marts with strong contracts and lineage.
- Reverse ETL: Operationalize high‑value models to downstream tools and workflows using reverse‑ETL patterns to close the loop between insight and action.
- Quality & governance: Implement tests (unit/integration, schema/freshness), multi-layered validation frameworks that routinely validate data integrity, data policies (masking, purpose‑based access), and documentation that enable safe self‑service across the analytics community.
- Repository stewardship: Maintain the analytics codebase (dbt repo), perform code reviews, and ensure modular, reusable patterns the broader team can adopt.
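To make the quality-and-governance bullet above concrete: the role calls for layered checks such as freshness, volume, and row-level constraints. As a hedged illustration only (in practice these would typically live as dbt tests against Snowflake, not hand-rolled Python; the field names and thresholds below are hypothetical), the three check layers might look like:

```python
from datetime import datetime, timedelta, timezone

def check_freshness(rows, max_age_hours=24):
    """Freshness: fail if the newest record is older than the SLA window."""
    newest = max(r["loaded_at"] for r in rows)
    return datetime.now(timezone.utc) - newest <= timedelta(hours=max_age_hours)

def check_volume(rows, expected_min=1):
    """Volume: fail if a feed delivered suspiciously few records."""
    return len(rows) >= expected_min

def check_constraints(rows):
    """Constraints: payouts must be non-negative and every row needs an
    employee_id (a not-null, primary-key-style check)."""
    return all(r["employee_id"] is not None and r["payout_usd"] >= 0
               for r in rows)

# Hypothetical sample feed for illustration.
feed = [
    {"employee_id": "E1", "payout_usd": 1200.0,
     "loaded_at": datetime.now(timezone.utc) - timedelta(hours=2)},
    {"employee_id": "E2", "payout_usd": 0.0,
     "loaded_at": datetime.now(timezone.utc) - timedelta(hours=3)},
]

results = {
    "freshness": check_freshness(feed),
    "volume": check_volume(feed, expected_min=2),
    "constraints": check_constraints(feed),
}
print(results)
```

Each layer failing independently is what makes the ecosystem auditable: a stale feed, a short-loaded feed, and a bad row each surface as a distinct, attributable alert.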