What are the responsibilities and job description for the Senior Data Engineer position at DCode Talent?
Senior Data Engineer – Airflow, dbt Core, Kubernetes/OpenShift
Location: Jersey City, NJ (on-site 3 days per week)
Job Summary
- Serve as a Senior Data Engineer, bringing 10 years of hands-on experience in enterprise data engineering.
- Design, build, and support large-scale, production data platforms and data pipelines in financial services environments.
- Develop, schedule, and optimize Apache Airflow DAGs for workflow orchestration in on-premises Kubernetes/OpenShift environments.
- Design, implement, and maintain dbt Core data models, including testing, macros, and performance tuning.
- Collaborate closely with infrastructure teams for Kubernetes/OpenShift operations and containerized deployment management.
- Optimize distributed, high-volume data processing workloads to support financial/accounting platforms and enterprise system migrations.
- Write efficient, production-quality code in Python for data engineering and process automation tasks.
- Utilize strong SQL skills for complex data transformations, analytics, and data warehousing tasks.
- Support CI/CD pipelines and Git-based workflows for continuous integration and deployment of data solutions.
- (Preferred) Experience with financial services or accounting platforms and enterprise system migrations from legacy to modern data stacks.
- (Preferred) Familiarity with Oracle data warehouses and cloud-based data platform deployments.
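The dbt Core and SQL bullets above describe model-building and transformation work. A minimal, self-contained sketch of that kind of transformation, using Python's built-in sqlite3 in place of a real warehouse; the table name, columns, and data are hypothetical, and in dbt Core the SELECT below would live in a .sql model file rather than a Python string:

```python
import sqlite3

# In-memory database standing in for a warehouse; schema and data are illustrative.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_transactions (
        txn_id    INTEGER PRIMARY KEY,
        account   TEXT,
        amount    REAL,
        posted_at TEXT
    );
    INSERT INTO raw_transactions VALUES
        (1, 'A-100', 250.00, '2024-01-05'),
        (2, 'A-100', -40.00, '2024-01-07'),
        (3, 'B-200', 900.00, '2024-01-06');
""")

# A dbt-style "model": a SELECT that derives a clean, aggregated table
# from raw source data; dbt would materialize this as a table or view.
model_sql = """
    SELECT account,
           COUNT(*)             AS txn_count,
           ROUND(SUM(amount),2) AS net_amount
    FROM raw_transactions
    GROUP BY account
    ORDER BY account
"""
rows = conn.execute(model_sql).fetchall()
print(rows)  # [('A-100', 2, 210.0), ('B-200', 1, 900.0)]
```

In a production setup, a model like this would be scheduled from an Airflow DAG and validated with dbt tests (e.g. not-null and uniqueness checks on the grouped keys).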