What are the responsibilities and job description for the Staff Data Platform Engineer position at Selby Jennings?
About the Company
A leading Insurtech organization made up of a few hundred experts across actuarial science, engineering, data science, and clinical practice is looking to bring on a staff‑level data platform engineer. Together, the team builds category‑defining, data‑driven SaaS products used by insurance, health IT, and life sciences clients to make high‑impact decisions at scale.
About the Role
As a Staff Data Engineer, you are a technical authority and multiplier. You lead through architecture, influence, and execution rather than people management. You will drive the evolution of a mission‑critical data platform that must be scalable, highly secure, auditable, and performant. This role is deeply hands‑on and centered on Databricks on AWS, with real ownership over platform direction.
Responsibilities
- Lead the design and long‑term evolution of enterprise data lakes, lakehouse architectures, and warehouses.
- Establish and enforce enterprise medallion architectures (Bronze / Silver / Gold).
- Make architecture trade‑offs balancing performance, cost, governance, and developer velocity.
- Own advanced Databricks capabilities (Delta Live Tables, Unity Catalog).
- Build and operate scalable, production‑grade systems on AWS.
- Implement observability, monitoring, and cost transparency.
- Drive Infrastructure as Code standards using Terraform and asset bundles.
- Design pipelines for large‑scale healthcare and insurance datasets.
- Implement standards‑based data models (FHIR, OMOP, HL7).
- Support data containing identified and de‑identified PHI/PII with strict access controls.
- Implement governance frameworks for lineage, auditability, quality, and stewardship.
- Enforce least‑privilege access and data security best practices.
- Ensure platform compliance with healthcare and insurance regulations.
- Build robust ETL/ELT pipelines using SQL, Python, Scala, and Databricks tooling.
- Design event‑driven ingestion and API‑based integrations.
- Develop internal APIs and dataset‑backed services for downstream systems.
- Integrate with external systems (financial, HR, CRM, vendor platforms).
- Act as a thought leader and domain expert.
- Partner across engineering, data science, analytics, product, and business teams.
- Communicate complex technical trade‑offs clearly to senior, non‑technical stakeholders.
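To make the medallion (Bronze / Silver / Gold) layering above concrete, here is a minimal sketch of the idea in plain Python. This is illustrative only: in this role the layers would be built as Delta Live Tables on Databricks, and the record shapes and field names below (claims rows with `member_id` and `icd10`) are assumptions, not the company's actual schema.

```python
# Illustrative medallion-style refinement in plain Python.
# Bronze lands raw data, Silver cleans and conforms it,
# Gold produces business-level aggregates.
# Field names are hypothetical; real pipelines would use
# Delta Live Tables on Databricks.

def bronze(raw_rows):
    """Bronze: land raw records as-is, tagging each with its source."""
    return [{**row, "_source": "claims_feed"} for row in raw_rows]

def silver(bronze_rows):
    """Silver: clean and conform -- drop rows missing a member ID,
    normalize diagnosis codes to upper case."""
    return [
        {**r, "icd10": r["icd10"].upper()}
        for r in bronze_rows
        if r.get("member_id")
    ]

def gold(silver_rows):
    """Gold: business-level aggregate -- claim counts per diagnosis code."""
    counts = {}
    for r in silver_rows:
        counts[r["icd10"]] = counts.get(r["icd10"], 0) + 1
    return counts

raw = [
    {"member_id": "M1", "icd10": "e11.9"},
    {"member_id": None, "icd10": "I10"},   # dropped in Silver
    {"member_id": "M2", "icd10": "E11.9"},
]
print(gold(silver(bronze(raw))))  # {'E11.9': 2}
```

Each layer only reads from the one before it, which is what makes the architecture auditable: every Gold number can be traced back through Silver cleaning rules to the raw Bronze records.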
Qualifications
- 15 years of experience designing and building enterprise data platforms (data lakes, lakehouses, warehouses, data products).
- Expert‑level experience with Databricks on AWS (Delta Lake, Delta Live Tables, Unity Catalog).
- Proven ownership of platform architecture decisions spanning scale, performance, reliability, and cost.
- Deep hands‑on experience with relational and non‑relational databases (SQL Server, PostgreSQL, DynamoDB, DocumentDB or similar).
- Extensive experience working with healthcare and insurance data, including PHI/PII.
- Strong working knowledge of healthcare data standards: FHIR, HL7, C‑CDA.
- Experience applying clinical coding systems / ontologies: ICD‑10, SNOMED, LOINC, RxNorm, NDC.
- Expert‑level data modeling for structured and unstructured data.
- Experience building batch, streaming, and event‑driven pipelines.
- Strong background in data governance, lineage, auditability, and security.
- Hands‑on experience with Infrastructure as Code (Terraform, GitHub).
- Proven collaboration with Data Science teams, enabling analytics and AI‑driven use cases.
- Experience in Insurtech, insurance, healthcare, or life sciences.
- AWS Solutions Architect or equivalent certification.
- Prior ownership of platforms serving multiple external clients.
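As a rough illustration of the FHIR standards work referenced above, the sketch below flattens a FHIR R4 Patient resource into a single analytics‑friendly row, the kind of conforming step a Silver layer might perform. The sample resource follows the public FHIR R4 Patient example; the target column names are assumptions for illustration.

```python
import json

# Abridged FHIR R4 Patient resource (based on the public FHIR example).
patient_json = """
{
  "resourceType": "Patient",
  "id": "example-001",
  "name": [{"family": "Chalmers", "given": ["Peter", "James"]}],
  "gender": "male",
  "birthDate": "1974-12-25"
}
"""

def flatten_patient(resource: dict) -> dict:
    """Flatten the nested FHIR structure into one flat record,
    e.g. for a Silver-layer patient table. Output column names
    are hypothetical."""
    name = (resource.get("name") or [{}])[0]
    return {
        "patient_id": resource.get("id"),
        "family_name": name.get("family"),
        "given_name": " ".join(name.get("given", [])),
        "gender": resource.get("gender"),
        "birth_date": resource.get("birthDate"),
    }

row = flatten_patient(json.loads(patient_json))
print(row)  # {'patient_id': 'example-001', 'family_name': 'Chalmers', ...}
```

Real FHIR resources are far richer (repeated names, extensions, coded values), so production pipelines typically lean on schema‑aware tooling rather than hand‑rolled flattening, but the shape of the problem is the same.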
Required Skills
- Expertise in Databricks and AWS.
- Strong data modeling skills.
- Experience with healthcare and insurance data.
- Proficiency in SQL, Python, and Scala.
- Knowledge of data governance and security best practices.
Preferred Skills
- Experience with Infrastructure as Code tools like Terraform.
- Familiarity with clinical coding systems.
- Experience in building ETL/ELT pipelines.
- Knowledge of data standards such as FHIR and HL7.
This position is open to remote work. Applicants must be willing to travel to the office in Brookfield, WI quarterly.
Salary: $130,000 - $230,000