What are the responsibilities and job description for the Senior Data Engineer position at Jobs via Dice?
Dice is the leading career destination for tech experts at every stage of their careers. Our client, RightTalents, is seeking the following. Apply via Dice today!
Title: Senior Data Engineer
Client: Boehringer Ingelheim
Location: Ridgefield, CT (Onsite)
Duration: Full-time
Duties & Responsibilities
- Design, develop, and maintain scalable data pipelines and ETL processes to support data integration and analytics (a minimal pipeline sketch follows this list).
- Collaborate with data architects, modelers, and IT team members to help define and evolve the overall cloud-based data architecture strategy, including data warehousing, data lakes, streaming analytics, and data governance frameworks.
- Collaborate with data scientists, analysts, and other business stakeholders to understand data requirements and deliver solutions.
- Optimize and manage data storage solutions (e.g., S3, Snowflake, Redshift) ensuring data quality, integrity, security, and accessibility.
- Implement data quality and validation processes to ensure data accuracy and reliability.
- Develop and maintain documentation for data processes, architecture, and workflows.
- Monitor and troubleshoot data pipeline performance and resolve issues promptly.
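To give a concrete sense of the pipeline work described above, here is a minimal sketch of a daily ETL DAG, assuming Apache Airflow 2.4+; the extract_orders and load_orders callables are hypothetical stand-ins for real S3 and warehouse integrations, not part of this posting:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders(**context):
    """Pull the daily orders extract (placeholder for an S3/Glue/source-system read)."""
    # In a real pipeline this would read from S3, an API, or a source database.
    return [{"order_id": 1, "amount": 42.0}]


def load_orders(**context):
    """Validate and load rows into the warehouse (placeholder for Snowflake/Redshift)."""
    rows = context["ti"].xcom_pull(task_ids="extract_orders")
    # Basic data-quality gate before loading, echoing the validation duties above.
    assert all(row["amount"] >= 0 for row in rows), "negative order amount"


with DAG(
    dag_id="daily_orders_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
    load = PythonOperator(task_id="load_orders", python_callable=load_orders)
    extract >> load  # load runs only after a successful extract
```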
Qualifications
- Associate degree in Computer Science, MIS, or a related field with a minimum of 8 years of experience; or a bachelor's degree in Computer Science, MIS, or a related field with a minimum of 4 years of experience; or a master's degree in Computer Science, MIS, or a related field with a minimum of 2 years of experience; or a minimum of 4 years of relevant Business or IT experience.
- Cloud Expertise: Expert-level proficiency in at least one major cloud platform (AWS, Azure, or Google Cloud Platform) with extensive experience in their respective data services (e.g., AWS S3, Glue, Lambda, Redshift, Kinesis; Azure Data Lake, Data Factory, Synapse, Event Hubs; Google Cloud Platform BigQuery, Dataflow, Pub/Sub, Cloud Storage); experience with AWS data cloud platform preferred
- SQL Mastery: Advanced SQL writing and optimization skills.
- Data Warehousing: Deep understanding of data warehousing concepts, Kimball methodology, and various data modeling techniques (dimensional, star/snowflake schemas).
- Big Data Technologies: Experience with big data processing frameworks (e.g., Spark, Hadoop, Flink) is a plus.
- Database Systems: Experience with relational and NoSQL databases (e.g., PostgreSQL, MySQL, MongoDB, Cassandra).
- DevOps/CI/CD: Familiarity with DevOps principles and CI/CD pipelines for data solutions.
- Hands-on experience with AWS services such as AWS Glue, Lambda, Athena, Step Functions, and Lake Formation (an illustrative Athena query sketch follows this list)
- Proficiency in Python and SQL
- 4 years of progressive experience in data engineering, with a significant portion dedicated to cloud-based data platforms.
- ETL/ELT Tools: Hands-on experience with ETL/ELT tools and orchestrators (e.g., Apache Airflow, Azure Data Factory, AWS Glue, dbt).
- Data Governance: Understanding of data governance, data quality, and metadata management principles.
- AWS Experience: Ability to evaluate AWS cloud applications and make architecture recommendations; AWS Solutions Architect certification (Associate or Professional) is a plus
- Familiarity with Snowflake
- Knowledge of dbt (data build tool)
- Strong problem-solving skills, especially in data pipeline troubleshooting and optimization
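As a rough illustration of the hands-on AWS tooling listed above, the sketch below submits a SQL query to Athena with boto3 and polls for completion; the analytics_db database, orders table, and S3 output location are hypothetical placeholders rather than details from this posting:

```python
import time

import boto3

# Hypothetical names: replace with real values from the target environment.
DATABASE = "analytics_db"
OUTPUT_LOCATION = "s3://example-athena-results/"

athena = boto3.client("athena", region_name="us-east-1")


def run_athena_query(sql: str) -> list[dict]:
    """Submit a query to Athena, wait for it to finish, and return the raw rows."""
    execution = athena.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": DATABASE},
        ResultConfiguration={"OutputLocation": OUTPUT_LOCATION},
    )
    query_id = execution["QueryExecutionId"]

    # Poll until the query reaches a terminal state; production code would add
    # a timeout and exponential backoff.
    while True:
        state = athena.get_query_execution(QueryExecutionId=query_id)[
            "QueryExecution"
        ]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            break
        time.sleep(2)

    if state != "SUCCEEDED":
        raise RuntimeError(f"Athena query ended in state {state}")

    return athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]


if __name__ == "__main__":
    rows = run_athena_query(
        "SELECT order_date, SUM(amount) AS revenue FROM orders GROUP BY order_date LIMIT 10"
    )
    print(rows)
```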