What are the responsibilities and job description for the Lead Integration Engineer (ETL Snowflake) position at Jobs via Dice?
JOB TITLE: Lead Integration Engineer (ETL Snowflake)
JOB LOCATION: Waltham, MA (1 day a week onsite)
WAGE RANGE*: $70-85/hr. W2
JOB NUMBER: NGGJP00017056
6 Month Contract to Start
JOB DESCRIPTION:
Onsite in Waltham once a week, and as requested for meetings.
The Lead Integration Engineer is a collaborative, hands-on role on the Digital Platform Data Engineering Team responsible for designing sound, scalable, and performant solutions that meet the requirements in a large data volume enterprise, and leading the team responsible for implementation. Solutions and design trade-offs are communicated to business and technical stakeholders for effective decision making.
The Lead Integration Engineer will work on a major IT initiative to enable *** Business Partners by improving how our employees serve our customers today and by creating the platform for tomorrow's growth.
Key Accountabilities
- Perform and collaborate on system design and data solution development, with a focus on data integration pipelines to/from various platforms and/or legacy systems.
- Provide day-to-day supervision and leadership of engineering team(s) responsible for implementation.
- Ensure integration solutions adhere to data privacy, security, and compliance standards
- Define and maintain integration architecture principles, ensuring scalability, security, and performance across platforms.
- Update integration-process data mappings and data catalogs.
- Participate in, and/or lead, design and pre-deployment reviews.
- Perform data profiling.
- Must be comfortable reading/understanding data models.
- Design and develop data synchronization processes.
- Participate in the design/development of integration solutions.
- Assist in determining cross-application data standards and data distribution standards, and promote data-driven design.
- Create database deliverables ensuring quality and traceability to requirements and adherence to all quality management plans and governance standards
- Work with the team to ensure that all components work together to meet objectives and performance goals as defined in the requirements. Identify and communicate any cross-area or cross-release issues that may affect other areas of the project
- Participate in go-live preparation activities such as QA, dress rehearsals, issue remediation, cutover and go-live aftercare
- Participate in data management maturity assessments to identify gaps and pain points for capabilities including data quality, governance, analytics, metadata management, master data management
- Seek opportunities for continuous improvement in processes, procedures and systems with regards to data
Supervisory/Interpersonal
- Good team leadership and influencing skills with colleagues, partners/vendors, and project sponsors.
- 7+ years of experience in database development (Snowflake, SQL Server, Oracle, etc.).
- 7+ years of experience designing, coding, and delivering ETL workflows using Matillion (preferred), Informatica, Ab Initio, or DataStage.
- Excellent oral and written communication skills.
- Proficiency with MS Office (Word, Excel, PowerPoint, Visio, etc.).
- Experience in data mapping and data modeling, and in working with teams to define integration pipelines.
- Experience in a dynamic work environment
- Proven ability to work well in a team environment and to build and maintain positive relationships with other staff, departments, and customers.
- Strong understanding of integration patterns (ETL, ELT, API-led, event-driven).
- Experience optimizing integration workflows for throughput, latency, and cost efficiency.
- Experience with CI/CD pipelines for integration deployments.
- Experience with cloud integration platforms
- Knowledge of data governance frameworks and metadata management.
- Proficiency in scripting languages.
- Experience with cloud database architecture is a plus
- Experience with Matillion is a plus
- Experience with real-time data loads is a plus
- Experience with Agile/Lean software development process and practices is a plus
- Experience with SaaS solutions such as Salesforce, ESRI, and Workforce is a plus.
- Ability to work with cross-functional teams, subject matter experts, and architects.
- While an hourly range is posted for this position, an eventual hourly rate is determined by a comprehensive salary analysis which considers multiple factors including but not limited to: job-related knowledge, skills and qualifications, education and experience as compared to others in the organization doing substantially similar work, if applicable, and market and business considerations. Benefits offered include medical, dental and vision benefits; dependent care flexible spending account; 401(k) plan; voluntary life/short term disability/whole life/term life/accident and critical illness coverage; employee assistance program; sick leave in accordance with regulation. Benefits may be subject to generally applicable eligibility, waiting period, contribution, and other requirements and conditions. Benefits offered are in accordance with applicable federal, state, and local laws and subject to change at TCM's discretion.