What are the responsibilities and job description for the Sr. Data Engineer - W2 Contract position at IT America Inc?
Role: Senior Data Engineer - Airflow, DBT Core, Kubernetes/OpenShift
Location: Jersey City, NJ, Hybrid 3 days a week in the office
Looking for Permanent / Visa-Independent Consultants
MUST HAVE SKILLS
• 10 years of professional experience in data engineering, designing and supporting enterprise-scale data platforms in production environments
• Hands-on experience with dbt and Apache Airflow deployed on Kubernetes, specifically within an on-prem OpenShift environment
• Close interaction with infrastructure, including Kubernetes operations, Airflow design and implementation, and hands-on dbt model development in an on-prem setup
• Deep, practical experience with dbt and Airflow in Kubernetes-based, on-prem environments
Job Summary
We are seeking a highly skilled Senior Data Engineer with 8-10 years of hands-on experience in enterprise data engineering, including deep expertise in Apache Airflow DAG development, dbt Core modeling and implementation, and cloud-native container platforms (Kubernetes / OpenShift). This role is critical to building, operating, and optimizing scalable data pipelines that support financial and accounting platforms, including enterprise system migrations and high-volume data processing workloads. The ideal candidate will have extensive experience in workflow orchestration, data modeling, performance tuning, and distributed workload management in containerized environments.
Required Skills & Qualifications:
• 10 years of professional experience in data engineering, analytics engineering, or platform engineering roles
• Proven experience designing and supporting enterprise-scale data platforms in production environments
• Expert-level Apache Airflow (DAG design, scheduling, performance tuning)
• Expert-level dbt Core (data modeling, testing, macros, implementation)
• Strong proficiency in Python for data engineering and automation
• Deep understanding of Kubernetes and/or OpenShift in production environments
• Extensive experience with distributed workload management and performance optimization
• Strong SQL skills for complex transformations and analytics
• Experience running data platforms on cloud environments
• Familiarity with containerized deployments, CI/CD pipelines, and Git-based workflows
• Experience supporting financial services or accounting platforms is a plus
• Exposure to enterprise system migrations (e.g., legacy platform to modern data stack) is a plus
• Experience with data warehouses (Oracle) is a plus