Senior Data Platform Engineer, ISS - Data Platform Engineering
Pure Storage
Job Description
We’re in an unbelievably exciting area of tech and are fundamentally reshaping the data storage industry. Here, you lead with innovative thinking, grow along with us, and join the smartest team in the industry.
This type of work—work that changes the world—is what the tech industry was founded on. So, if you're ready to seize the endless opportunities and leave your mark, come join us.
THE ROLE
We’re building a new capability inside Platform Engineering, within our Infrastructure Shared Services (ISS) organisation: Data Platform Engineering. Our goal is to create a platform that removes friction, providing a consistent way to build, operate, and evolve data-driven services with clear ownership, guardrails, and a strong developer experience. Pure’s own platform thinking is anchored in the idea of a unified data plane - a single fabric that connects data across locations and delivers consistent, secure data services across protocols and applications.
In this role, you’ll build and operate a trusted, scalable, and intelligent data platform that empowers teams to make faster data-driven decisions while shaping a culture of ownership and innovation to meet the needs of an ever-changing business and technology landscape.
WHAT YOU'LL DO
- Design, build, and operate the core data platform services that enable teams to ingest, model, query, and publish data seamlessly.
- Architect a multi-engine environment (e.g., Trino/Starburst, Dremio, ClickHouse, Postgres/pgLake) with an Iceberg-based lakehouse, prioritizing interoperability, operability, and security.
- Drive platform evolution through experimentation, measuring performance, reliability, and cost trade-offs to standardize high-impact solutions.
- Establish durable corporate knowledge by developing strong defaults, comprehensive documentation, and intuitive self-service interfaces.
- Manage core lakehouse components, including table layout conventions, catalog integration, and critical lifecycle maintenance patterns such as compaction and optimization (see the maintenance sketch after this list).
- Implement robust ingestion patterns for both batch and streaming data, ensuring clear SLAs and highly observable failure modes.
- Enable multi-engine access by configuring connectors and catalog permissions while maintaining strict guardrails for a consistent user experience.
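For a concrete sense of the lifecycle maintenance mentioned above, here is a minimal Python (PySpark) sketch of routine Apache Iceberg upkeep. It is illustrative only: the catalog name (lake), the table (analytics.events), the catalog URI, and the file-size target are assumptions for the example, not details of Pure's platform.

    from pyspark.sql import SparkSession

    # Minimal maintenance sketch; assumes the Iceberg Spark runtime is on the
    # classpath and a REST catalog is reachable at the (hypothetical) URI below.
    spark = (
        SparkSession.builder
        .appName("iceberg-maintenance")
        .config("spark.sql.extensions",
                "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions")
        .config("spark.sql.catalog.lake", "org.apache.iceberg.spark.SparkCatalog")
        .config("spark.sql.catalog.lake.type", "rest")
        .config("spark.sql.catalog.lake.uri", "https://catalog.example.internal")
        .getOrCreate()
    )

    # Compact small files toward a ~128 MB target with Iceberg's
    # rewrite_data_files procedure.
    spark.sql("""
        CALL lake.system.rewrite_data_files(
            table => 'analytics.events',
            options => map('target-file-size-bytes', '134217728')
        )
    """)

    # Expire old snapshots to bound metadata growth and storage cost,
    # keeping a safety margin of recent snapshots.
    spark.sql("""
        CALL lake.system.expire_snapshots(
            table => 'analytics.events',
            older_than => TIMESTAMP '2024-01-01 00:00:00',
            retain_last => 10
        )
    """)

Run on a schedule, maintenance like this keeps the same tables fast and cheap for whichever engine queries them afterwards (Trino/Starburst, Dremio, ClickHouse), which is what keeps a multi-engine setup operable.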
WHAT YOU BRING
- 6+ years of experience in the design and development of data pipeline automation, specifically extracting data from API-based sources.
- Technical expertise in developing complex automation frameworks, data modeling, and ETL processes using SQL, Python, dbt, Apache Airflow, or similar tools (see the ingestion sketch after this list).
- End-to-end proficiency across the data stack, with deep knowledge of Python, SQL, and ETL methodologies.
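As an illustration of the API-extraction and orchestration experience described above, here is a minimal Airflow (recent 2.x, TaskFlow API) sketch in Python. The endpoint, schedule, retry policy, and target table are hypothetical placeholders; the point is the shape of the pipeline (extract with explicit timeouts and retries, load as a separate step), not a real Pure Storage DAG.

    from datetime import datetime, timedelta

    import requests
    from airflow.decorators import dag, task


    @dag(
        schedule="@hourly",               # hypothetical cadence
        start_date=datetime(2024, 1, 1),
        catchup=False,
        default_args={"retries": 3, "retry_delay": timedelta(minutes=5)},
        tags=["ingestion", "example"],
    )
    def api_ingestion_example():
        @task(execution_timeout=timedelta(minutes=30))  # bound runtime so failures surface quickly
        def extract(logical_date=None):
            # Pull one window of records from a hypothetical REST source.
            resp = requests.get(
                "https://api.example.internal/v1/events",
                params={"since": logical_date.isoformat()},
                timeout=60,
            )
            resp.raise_for_status()  # fail loudly; observable failure modes beat silent gaps
            return resp.json()

        @task
        def load(records: list):
            # Placeholder for the real sink (e.g., append to an Iceberg landing table).
            print(f"would load {len(records)} records into lake.raw.events")

        load(extract())


    api_ingestion_example()

Downstream modeling (for example, dbt runs against the landing table) would typically follow as separate tasks in the same DAG or a triggered one.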