
Echelon Portfolio Company Careers

Come join our network of companies solving the world's hardest problems.

Data Integration Engineer, Provider Data

Helm Health

United States
Posted on Nov 4, 2025
Product
Remote
Full-time

About Helm Health

At Helm, we are transforming health insurance with "Dynamic Copay" – a new insurance plan that allows members to see simple upfront prices for all medical care before making decisions. Our team is building the infrastructure to power these plans for health insurance payors. With Helm, our clients offer simpler health plans to their members, helping them navigate to higher-value care.
Our team has specialized in Dynamic Copay solutions since 2020, and Helm is the only independent platform in the market. We have grown rapidly since our launch, working with clients from local health plans to the nation’s largest health insurers. The market is forming around us – now is the time to join!


The Role

We are seeking a Data Integration Engineer with deep experience in provider data to join our Product team. The role blends the technical rigor of a data engineer with the domain expertise of provider network management. You will be responsible for building and maintaining automated pipelines to transmit, ingest, and normalize provider data across multiple external partners and internal teams, ensuring accuracy, reliability, and scalability.

Responsibilities

Data Transmission & Integration
  • Build automated pipelines to transmit provider data securely to and from external partners.
  • Monitor incoming partner files, detect data deliveries, and manage ingestion workflows.
  • Parse and validate returned files for accuracy and completeness.
Provider Data Management
  • Normalize provider datasets for consistent use across internal systems and our database.
  • Manage complex and overlapping datasets, ensuring accurate deduplication and mapping (see the sketch after this list).
  • Collaborate with internal product and external provider network teams to ensure data supports operational and strategic needs.
Platform & Pipeline Engineering
  • Develop ETL/ELT processes using modern data frameworks and workflows.
  • Ensure robust data quality, error handling, and monitoring across pipelines.
  • Partner with engineering and product stakeholders to design scalable data flows that align with business goals.
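
To give a flavor of the ingestion work described above, here is a minimal, purely illustrative Python sketch that parses, normalizes, and deduplicates partner roster files. The directory layout, column names, and NPI-based merge key are assumptions made for the example, not Helm's actual schema or pipeline.

```python
# Illustrative only: the columns, directory layout, and NPI-based merge key
# below are assumptions for this sketch, not Helm's actual schema or pipeline.
import csv
from pathlib import Path

REQUIRED_FIELDS = {"npi", "first_name", "last_name", "specialty", "state"}


def load_roster(path: Path) -> list[dict]:
    """Parse a partner roster CSV and fail fast if expected columns are missing."""
    with path.open(newline="") as f:
        reader = csv.DictReader(f)
        missing = REQUIRED_FIELDS - set(reader.fieldnames or [])
        if missing:
            raise ValueError(f"{path.name} is missing columns: {sorted(missing)}")
        return list(reader)


def normalize(row: dict) -> dict:
    """Trim whitespace and standardize casing so records compare consistently."""
    return {
        "npi": row["npi"].strip(),
        "first_name": row["first_name"].strip().title(),
        "last_name": row["last_name"].strip().title(),
        "specialty": row["specialty"].strip().lower(),
        "state": row["state"].strip().upper(),
    }


def dedupe_by_npi(rows: list[dict]) -> dict[str, dict]:
    """Keep a single record per NPI; later records overwrite earlier ones."""
    merged: dict[str, dict] = {}
    for row in rows:
        merged[row["npi"]] = row
    return merged


if __name__ == "__main__":
    providers: dict[str, dict] = {}
    # Process partner files in name order so newer drops win on conflicts.
    for roster in sorted(Path("incoming").glob("*.csv")):
        providers.update(dedupe_by_npi([normalize(r) for r in load_roster(roster)]))
    print(f"{len(providers)} unique providers after deduplication")
```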


Requirements

  • Proven experience working directly with provider and claims data (networks, rosters, directories, claims, or related healthcare datasets).
  • Hands-on experience with Google Cloud Platform (GCP), including tools such as BigQuery, Cloud Storage, and Bigtable (see the sketch after this list).
  • Strong SQL skills and proficiency in Python or a similar programming language.
  • Experience designing and maintaining automated data pipelines for complex datasets.
  • Ability to work cross-functionally, bridging technical data needs with product and provider network priorities.
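
As a concrete (and again purely illustrative) example of the GCP tooling named above, the sketch below loads a validated provider file from Cloud Storage into a BigQuery staging table using the official Python client library. The project, bucket, dataset, and table names are placeholders, not Helm's actual infrastructure.

```python
# Illustrative only: bucket, dataset, and table names are placeholders.
from google.cloud import bigquery

client = bigquery.Client()

# Placeholder identifiers for the example.
source_uri = "gs://example-partner-drop/providers/roster_2025-11-04.csv"
table_id = "example-project.provider_data.roster_staging"

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,  # skip the header row
    autodetect=True,      # infer a schema for the staging table
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

load_job = client.load_table_from_uri(source_uri, table_id, job_config=job_config)
load_job.result()  # block until the load finishes, raising on failure

table = client.get_table(table_id)
print(f"Loaded {table.num_rows} rows into {table_id}")
```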

Characteristics

  • Self-directed
  • Mission-driven (healthcare)
  • Willing to ask questions and admit what you don’t know
  • Can explain technical concepts to non-technical people

Internal Tools/Technology

  • Slack / Google Workspace / Zoom
  • Cursor / Linear / GitHub / Notion / Whimsical
  • macOS / Linux

Benefits/Offerings

  • Competitive base salary + equity
  • Unlimited PTO (with a mandatory minimum of 12 days taken per year)
  • Computer + home office stipend
  • 401(k) + matching
  • Health and dental insurance
  • Autonomy and tons of room for career growth

Occasional Travel

We meet in person quarterly as a company; otherwise, this is a fully remote opportunity.
Req ID: R3