What are the responsibilities and job description for the Data Engineer position at Experfy?
Data Engineer
Experfy
posted Jan 12, 2026
Boston, Massachusetts, 02110, United States
Contract
Hourly Rate $63/hr
1 Position
Overview
Client is seeking to hire an experienced SQL Data Engineer to work with our EHS IT team supporting the Department of Mental Health and Department of Public Health Hospitals. These resources are responsible for the design, development, maintenance, and support of our Business Applications, Data Warehouse, and additional processes and reports.
The Data Engineer is responsible for developing, maintaining, and optimizing data pipelines and integration processes that support analytics, reporting, and business operations. This role focuses on designing and implementing scalable data solutions using Snowflake and Informatica, ensuring that data is accurate, reliable, and accessible to key stakeholders across the organization.
Key Responsibilities
Design, build, and maintain ETL/ELT pipelines using Informatica to move and transform data from various source systems into Snowflake.
Ensure data handling and storage comply with HIPAA, HITECH, and organizational privacy/security standards.
Support the development of data models, schemas, and views in Snowflake to enable efficient data querying and analytics.
Support interoperability and data exchange initiatives using healthcare standards such as HL7, FHIR, and X12 EDI.
Implement data ingestion, transformation, and quality processes to ensure consistent, trusted data across environments.
Monitor and troubleshoot data pipelines to ensure high performance and reliability.
Collaborate with data analysts, data scientists, other data engineers, and business users to understand data requirements and deliver solutions.
Develop and maintain documentation for data flows, metadata, and transformation logic.
Assist with data security, access control, and governance within Snowflake and Informatica.
Participate in testing, deployment, and release management for new data workflows and enhancements.
Required Qualifications
3–5+ years of experience in SQL-based data engineering or ETL development.
Hands-on experience with:
o Snowflake Cloud Data Platform, including SQL development and data modeling.
o Informatica PowerCenter or Informatica Intelligent Cloud Services (IICS) for ETL/ELT design and implementation.
Strong experience with SQL, including complex stored procedures, and a solid understanding of data warehousing concepts.
3 to 5 years of experience with SQL and scripting languages
3 to 5 years of experience in development with strong knowledge of the procedures and techniques involved in designing, coding, testing, documenting, and maintaining database systems, processes, and programs written from detailed specifications.
3 to 5 years of experience with the Software Development Life Cycle (SDLC), specifically Scrum and Agile methodology and DevOps practices.
Experience integrating structured and semi-structured data (e.g., JSON, XML, CSV).
Familiarity with cloud platforms (AWS, Azure, or GCP) and storage services (e.g., S3, Blob Storage).
Understanding of data governance, data quality, and metadata management principles.
Preferred Knowledge, Skills & Abilities:
Experience with Python for scripting and automation.
Knowledge of Informatica Data Quality (IDQ) or Data Catalog tools.
Exposure to CI/CD pipelines, Git, and Agile/Scrum environments.
Soft Skills
Strong analytical and problem-solving skills.
Excellent communication and collaboration with cross-functional teams.
Detail-oriented, with a focus on data accuracy and process improvement.
Eager to learn new tools and technologies in a fast-paced data environment.
Education and Certification:
Bachelor’s degree in Computer Science, Information Systems, Data Engineering, Health Informatics, or a related field, or equivalent experience.
Snowflake or Informatica certifications are a plus.
Status:
Open
Expertise
Functional Competencies
CI/CD pipelines
Git
Agile/Scrum
Informatica Data Quality (IDQ)
cloud platforms
Technology & Tools
Software Development Life Cycle (SDLC)
Data Catalog
Python
JSON
XML
CSV
Work Arrangement
Job Type
Contract
Minimum Time Commitment
37 Hours/Week
Location
Ideal Candidate Location
Boston, Massachusetts, 02110, United States
Percentage of Travel Required
26-50%
Compensation
Hourly Rate
Maximum Hourly Rate
$63/hr