Overview
Mid-Level

Data Engineer

Confirmed live in the last 24 hours

Industrial Electric Manufacturing

Compensation

$130,000 - $170,000/year

Jacksonville, Florida, United States; US - Remote
Posted April 15, 2026

Job Description

At IEM, we’re not just building innovative electrical distribution systems; we’re shaping the future. IEM is dedicated to delivering world-class solutions for complex power needs. After 75 years, we continue to push the boundaries of what’s possible. Whether you’re an experienced professional or just starting out, you’ll have the opportunity to contribute, grow, and make a lasting impact on industries that power the world’s most dynamic markets.

Location: US Remote

Reports To: Director of Data and Analytics 

Salary Range: $130,000 - $170,000

 

Position Summary:

The Data Engineer is a core builder on IEM’s data and analytics team. This role develops and maintains the data pipelines and transformation models that power Tableau dashboards and business decisions across the organization. Working within an established modern data stack (Fivetran, Snowflake, dbt, Tableau), the Data Engineer turns raw source data into reliable, well-tested, and well-documented analytics-ready datasets. This is a hands-on individual contributor role with real ownership of production data models and growing influence on engineering standards.

 

Ideal Candidate Profile

You have 3–5 years of experience building data pipelines and transformation workflows in cloud environments. You’re proficient with SQL and Python, comfortable working in Snowflake, and have hands-on experience with dbt or similar transformation tools. You write clean, tested code and take pride in documentation. You’re curious about the business context behind the data and can translate stakeholder questions into well-modeled datasets. You work well on a small team, take feedback constructively, and are eager to grow your skills in analytics engineering, data modeling, and cloud data architecture. You’re excited about AI’s role in modern data work and comfortable using AI coding assistants and agents to accelerate your output.


Key Responsibilities:

  • Data Pipeline Development: Build and maintain ELT pipelines using Fivetran and custom integrations that ingest data from source systems including Procore, Salesforce, ERP platforms, and internal databases into Snowflake
  • dbt Transformation Models: Develop, test, and document dbt models that transform raw data into clean, reliable datasets for analytics and reporting across Finance, Production, Supply Chain, and Engineering
  • Data Modeling: Build dimensional models and staging layers following team conventions, ensuring data is structured for optimal Tableau dashboard performance and ad-hoc analysis
  • Data Quality: Write and maintain dbt tests, monitor data freshness, and investigate data quality issues when they arise, owning resolution through to root cause
  • Source System Integration: Work with APIs and data connectors to integrate new data sources, troubleshoot ingestion issues, and ensure reliable data flow into the warehouse
  • Documentation: Maintain clear documentation for data models, pipeline configurations, and business logic so the team can understand and extend your work
  • Collaboration: Partner with business stakeholders to understand data needs, clarify requirements, and deliver datasets that answer real operational questions
  • Performance: Monitor query performance and pipeline efficiency, identifying opportunities to optimize warehouse costs and model run times
  • Engineering Standards: Participate i
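To make the data-quality responsibility above concrete: dbt's built-in `not_null` and `unique` schema tests boil down to simple assertions over a dataset. Below is an illustrative sketch in plain Python (the `orders` table and its columns are hypothetical, not taken from the posting):

```python
# Illustrative sketch of what dbt's `not_null` and `unique` tests assert,
# written in plain Python. Table and column names are hypothetical.
from collections import Counter

def check_not_null(rows, column):
    """Return indices of rows where `column` is missing or None."""
    return [i for i, row in enumerate(rows) if row.get(column) is None]

def check_unique(rows, column):
    """Return values of `column` that appear more than once."""
    counts = Counter(row.get(column) for row in rows)
    return [value for value, n in counts.items() if n > 1]

# A toy staging dataset with one null and one duplicated key.
orders = [
    {"order_id": 1, "customer_id": "A"},
    {"order_id": 2, "customer_id": None},
    {"order_id": 2, "customer_id": "B"},
]

print(check_not_null(orders, "customer_id"))  # → [1]
print(check_unique(orders, "order_id"))       # → [2]
```

In a real dbt project these checks would be declared in a model's YAML file rather than hand-coded, and a failing test would block the pipeline until the data-quality issue is traced to its root cause.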
python, ai, data, analytics, product, design, sales