What are the responsibilities and job description for the Senior Data Engineer, Investments Technology position at Liberty Mutual Insurance Group?
Responsibilities:
- Design, build, and maintain scalable data pipelines and data solutions that support analytics and investment insights.
- Develop data provisioning workflows, transformations, and integrations using Snowflake's cloud-native capabilities in an AWS environment.
- Apply data warehousing best practices, including data modeling, performance optimization, and scalable architecture design.
- Build and enhance ELT pipelines using Snowflake and Python to ensure reliable and efficient data processing.
- Partner with data engineers, analysts, and business stakeholders to understand requirements and deliver high-quality data solutions.
- Support and enhance data quality, governance, and observability practices across the platform.
- Contribute to CI/CD pipelines and deployment processes, improving automation and reliability of data workflows.
- Work with enterprise orchestration tools (e.g., ActiveBatch, transitioning to Stonebranch) to manage and schedule data workflows.
- Participate in code reviews, knowledge sharing, and mentoring, helping to elevate engineering practices across the team.
- Stay current with evolving data engineering tools and practices, and recommend improvements where appropriate.

Qualifications:
- Bachelor's degree in a technical or business discipline, or equivalent experience.
- 5 years of experience in data engineering, with a focus on building and maintaining scalable data pipelines and data platforms.
- Strong proficiency in SQL and a solid understanding of data modeling and data warehousing concepts.
- Experience with Snowflake and AWS services (e.g., S3, Lambda) in a cloud-based data environment.
- Experience using Python for data transformation, integration, or automation.
- Familiarity with ELT/ETL pipeline development and modern data architecture patterns.
- Experience with CI/CD tools and practices (e.g., GitHub Actions, Bamboo, Jenkins, or similar).
- Experience working with or exposure to workflow orchestration tools.
- Experience in investments, asset management, or financial services.
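To give a concrete sense of the "ELT pipelines using Snowflake and Python" responsibility, here is a minimal sketch of the kind of in-pipeline Python transformation step this role might own. It is illustrative only: the function name `normalize_trade_records` and the field names are hypothetical, and the actual load into Snowflake (e.g., via a `COPY INTO` from S3) is out of scope here.

```python
def normalize_trade_records(records):
    """Clean raw trade rows before staging them for a warehouse load:
    trim whitespace, upper-case tickers, and drop rows missing a ticker.
    (Hypothetical example; field names are illustrative, not from the posting.)"""
    cleaned = []
    for row in records:
        ticker = (row.get("ticker") or "").strip().upper()
        if not ticker:
            continue  # skip unusable rows rather than loading bad data
        cleaned.append({
            "ticker": ticker,
            "quantity": int(row.get("quantity", 0)),
            "price": round(float(row.get("price", 0.0)), 4),
        })
    return cleaned

raw = [
    {"ticker": " aapl ", "quantity": "10", "price": "189.255"},
    {"ticker": "", "quantity": "5", "price": "50.0"},  # dropped: no ticker
]
print(normalize_trade_records(raw))
```

In practice a step like this would run inside an orchestrated workflow (ActiveBatch or Stonebranch, per the posting), with the cleaned output staged to S3 and loaded into Snowflake.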
Preferred qualifications:
- Familiarity with API-based integrations (e.g., REST services).
- Exposure to data governance, security, and access control practices.
- Experience with modern data tooling (e.g., data observability, cataloging, or transformation frameworks).
- Bachelor's or Master's degree in a technical or business discipline, or equivalent experience.
- Generally 5 years of professional experience.
- Highly proficient in data engineering languages and tools, with strong proficiency in general programming languages and frameworks; ability to develop on multiple platforms.
- Strong negotiation, facilitation, and consensus-building skills.
- Strong oral and written communication skills; presentation skills.
- Extensive knowledge of IT concepts, strategies, and methodologies.
- Versed in diverse technologies and new technical architecture principles and concepts.
- Extensive knowledge of layered systems architecture solutions and designs, and of shared data engineering concepts.
- Knowledge of business function(s) and IT industry business issues; extensive knowledge of business operations, strategies, and objectives.
- Extensive understanding of agile data engineering concepts and processes.
- Must be proactive, demonstrate initiative, and be a logical thinker.
- Extensive consultative skills, including the ability to understand and apply customer requirements, draw out unforeseen implications, make design recommendations, define design reasoning, and understand the potential impacts of design requirements.
- Extensive understanding of backlog tracking, burndown metrics, and incremental delivery.
- Strong collaboration, prioritization, and adaptability skills.