What are the responsibilities and job description for the Senior Data Engineer :: W2 Only position at Jobs via Dice?
Dice is the leading career destination for tech experts at every stage of their careers. Our client, Trebecon LLC, is seeking the following. Apply via Dice today!
Job Title: Data Engineer
Location: Seattle, WA (Onsite)
Experience: 12 Years
Job Summary
We are seeking an experienced Data Engineer to design, build, and maintain scalable data pipelines and data platforms. The ideal candidate will work closely with data analysts, data scientists, and business stakeholders to ensure reliable, high-quality data availability for analytics and reporting.
Key Responsibilities
- Design, develop, and maintain ETL/ELT data pipelines
- Build and optimize data models for analytics and reporting
- Integrate data from multiple sources (databases, APIs, files, streaming sources)
- Ensure data quality, data validation, and data governance standards
- Optimize performance of data workflows and queries
- Collaborate with cross-functional teams to understand data requirements
- Implement monitoring, logging, and error-handling for data pipelines
- Support production issues and perform root cause analysis
Required Skills
- Strong experience with SQL and relational databases
- Hands-on experience with Python or Scala
- Experience with ETL tools (Informatica, Talend, dbt, Airflow, SSIS, etc.)
- Experience with Cloud platforms (AWS, Azure, or Google Cloud Platform)
- Knowledge of Data Warehousing concepts (Snowflake, Redshift, BigQuery, Synapse)
- Familiarity with CI/CD pipelines and version control (Git)
- Experience with Big Data technologies (Spark, Kafka, Hadoop)
- Knowledge of data governance and data quality frameworks
- Exposure to real-time/streaming data processing
- Domain experience in Healthcare, Finance, or Insurance