Mid-Level

Data Engineer III

GHX (Global Healthcare Exchange)

Remote - USA
Posted April 20, 2026

Job Description

This role is not eligible for visa sponsorship.


ROLE OVERVIEW

The Data Engineer III (Enterprise BI) designs, builds, and supports data solutions that power enterprise reporting and analytics. You'll bridge the gap between data infrastructure and business insights, partnering with Product, Engineering, Analytics, and business stakeholders to deliver reliable pipelines, optimize data models, and translate complex data into actionable insights. This role requires expertise in BI tools (Tableau, Sigma Computing, SAP Business Objects), SQL, Python, and cloud data platforms (AWS/Snowflake) within an Agile delivery environment.

CORE RESPONSIBILITIES

  • Design and build ETL/ELT pipelines and dimensional data models using dbt, Airflow, Python, PySpark, and AWS services (S3, Glue, Lambda)
  • Create executive dashboards and perform complex SQL analysis to drive strategic decisions (Tableau, Sigma, SAP BO)
  • Optimize SQL queries, data structures, and warehouse resources for performance and cost efficiency at scale (Snowflake, Redshift)
  • Partner with stakeholders to translate business requirements into self-service analytics capabilities
  • Implement infrastructure-as-code (CloudFormation/CDK) and contribute to CI/CD automation
  • Troubleshoot production issues across data pipelines, queries, and APIs; perform root cause analysis
  • Provide technical mentorship, establish development standards, and drive data engineering best practices
  • Document solutions and communicate designs to cross-functional teams in Confluence/JIRA
  • Apply data governance, security, and monitoring/alerting best practices
  • Leverage AI-assisted development tools (GitHub Copilot, Claude, etc.) to increase productivity and accelerate delivery

REQUIRED QUALIFICATIONS

Education & Experience

  • Bachelor's degree in Computer Science, Data Science, Mathematics, Statistics, or related quantitative field
  • 6+ years of data engineering experience building BI applications and data platforms
  • 5+ years of ETL/ELT development in cloud data warehouses (AWS, Snowflake, Redshift, or similar)
  • 4+ years creating dashboards and visualizations in enterprise BI tools (Tableau, Sigma, SAP BO, Power BI, or Looker)
  • Proven track record delivering production data solutions in Agile environments (Scrum/Kanban)

Technical Skills

  • Expert-level SQL and Python proficiency
  • Proven experience designing dimensional data models (star/snowflake schema) optimized for analytics
  • Demonstrated SQL optimization and performance tuning in large-scale production environments
  • Strong business acumen with ability to translate technical solutions into business value
  • Excellent communication skills for presenting to executive and non-technical audiences
  • Deep analytical and troubleshooting skills with root cause analysis capabilities
  • Must be located in the United States (remote position)

PREFERRED QUALIFICATIONS

Cloud & Data Engineering

  • Advanced Snowflake experience (streams, tasks, dynamic tables, Snowpipe, time travel)
  • Hands-on experience with dbt for analytics engineering and data quality testing
  • Apache Airflow (or Prefect, Dagster) for workflow orchestration
  • Deep AWS experience (Glue, Lambda, Step Functions, SNS/SQS, API Gateway, EventBridge)
  • PySpark for distributed data processing and large-scale transformations
  • Streaming data platforms (Kafka, AWS Kinesis, Spark Streaming) for real-time analytics
  • Alteryx Cloud Designer (Trifacta)

Development & DevOps

  • Infrastructure-as-code using CloudFormation or CDK
  • Modern Angular (17+) for data-driven web applications; AngularJS modernization experience a plus
  • Version control (Git), CI/CD workflows (GitHub Actions, GitLab CI), and containerization (Docker)
  • Python data science libraries (pandas, numpy, scipy) and statistical analysis
  • AI-assisted development tools (Claude Code, GitHub Copilot, OpenAI Codex) and LLM integration