What are the responsibilities and job description for the Business Intelligence Data Engineer position at survitec-pilot?
Role overview:
The Data & Analytics team are responsible for the entire cycle from data infrastructure to insight and data strategy. Over the last year, we have developed a new data platform powered by Snowflake and successfully delivered several analytics use cases for the business. We are currently focused on leveraging our data platform to build and deliver new business use cases that drive value across the organisation.
As a Data Engineer, you’ll play a key role in enhancing and evolving our data engineering practices. Working closely with fellow Data Engineers, Analytics Engineers, and Data Analysts, you will design and build robust data pipelines that align with our existing data architecture. This position offers a balance of hands-on engineering and collaboration with senior technical leadership, along with opportunities for growth, learning, and mentorship within a rapidly growing data team.
Key responsibilities:
- Develop and maintain robust, scalable data pipelines, ensuring the integrity and availability of data for analysis and reporting.
- Design, build, and manage a modern data platform using cloud solutions such as Azure, Snowflake, dbt, and Matillion.
- Implement CI/CD for data pipelines and infrastructure using DevOps tools and methodologies.
- Implement data governance and security best practices to ensure data quality and compliance, and to manage metadata.
- Collaborate with cross-functional teams to understand data needs, develop data models, and build analytics tools for reporting, analytics, and AI/ML initiatives.
- Stay informed about emerging data technologies and contribute to evaluating tools and practices that enhance the team’s data capabilities.
- Monitor end-to-end operational processes and oversee the development and maintenance of data pipelines to ensure quality, reliability, security, and scalability.
Essential skills & experience:
- Minimum 3 years of experience as a data engineer, with proven expertise in designing and building robust, scalable data pipelines.
- Snowflake / dbt experience is essential.
- Strong understanding of and experience with modern data warehouse solutions such as Snowflake, and proficiency in cloud platforms such as Azure.
- Experience with data pipelines and ETL/ELT tools, e.g. dbt.
- Experience working with data models based on Kimball dimensional modelling or Data Vault methodologies.
- Proficiency in SQL and Python.
- Extensive experience with DevOps tooling (Git, Azure DevOps, CI/CD, PowerShell/Bash scripting).
- Proficient in integrating data quality checks and unit testing within ETL/ELT workflows to ensure reliable and accurate data delivery.
- Extensive experience extracting data from diverse source systems, including APIs, flat files, and on-premise databases.
- Good-to-have skills include data visualization, e.g. Power BI.