What are the responsibilities and job description for the Senior Databricks Engineer with Python Experience position at Jobs via Dice?
Dice is the leading career destination for tech experts at every stage of their careers. Our client, Alltech Consulting Services, Inc., is seeking the following. Apply via Dice today!
Job Title: Senior Databricks Engineer with Python Experience
Location: Wilmington, DE (5 days onsite; no hybrid)
Long-term contract role
Job Description:
We are looking for a Senior Data Engineer with strong experience in Databricks, PySpark, and modern data warehouse systems. The ideal candidate can design, build, and optimize scalable data pipelines and work closely with analytics, product, and engineering teams.
Key Responsibilities:
Design and build ETL/ELT pipelines using Databricks and PySpark (a representative sketch follows this list)
Develop and maintain data models and data warehouse structures (dimensional modeling, star/snowflake schemas)
Optimize data workflows for performance, scalability, and cost
Work with cloud platforms (Azure/AWS/Google Cloud Platform) for storage, compute, and orchestration
Ensure data quality, reliability, and security across pipelines
Collaborate with cross-functional teams (Data Science, BI, Product)
Write clean, reusable code and follow engineering best practices
Troubleshoot issues in production data pipelines
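To make the pipeline work above concrete, the following is a minimal PySpark ETL sketch of the kind this role describes; the source path, column names, and output location are illustrative assumptions, not details from the posting:

```python
# Minimal sketch of a Databricks-style PySpark ETL step. Paths, columns,
# and table layout are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl_sketch").getOrCreate()

# Extract: read raw order events (placeholder path).
raw = spark.read.json("/mnt/raw/orders/")

# Transform: basic cleansing plus a daily revenue aggregate.
orders = (
    raw.filter(F.col("status") == "COMPLETED")
       .withColumn("order_date", F.to_date("order_ts"))
       .dropDuplicates(["order_id"])
)
daily_revenue = (
    orders.groupBy("order_date")
          .agg(F.sum("amount").alias("total_revenue"),
               F.countDistinct("order_id").alias("order_count"))
)

# Load: write the curated result as a Delta table.
daily_revenue.write.format("delta").mode("overwrite").save("/mnt/curated/daily_revenue")
```

On a Databricks runtime the Delta format is available out of the box; outside Databricks the open-source delta-spark package would be needed.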
Required Skills:
Strong hands-on skills in Databricks, PySpark, and SQL
Experience with data warehouse concepts, ETL frameworks, and batch/streaming pipelines
Solid understanding of Delta Lake and Lakehouse architecture (see the upsert sketch after this list)
Experience with at least one cloud platform (Azure preferred)
Experience with workflow orchestration tools (Airflow, Azure Data Factory, Prefect, etc.)
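As a companion to the Delta Lake / Lakehouse item above, here is a minimal sketch of a Delta MERGE upsert, a common Lakehouse maintenance pattern; the table path, key column, and sample rows are hypothetical:

```python
# Minimal sketch of a Delta Lake upsert (MERGE); table path, key, and
# sample data are illustrative assumptions, not details from the posting.
from pyspark.sql import SparkSession
from delta.tables import DeltaTable  # shipped with Databricks runtimes / delta-spark

spark = SparkSession.builder.appName("delta_merge_sketch").getOrCreate()

# Incoming batch of customer updates (placeholder data).
updates = spark.createDataFrame(
    [(1, "alice@example.com"), (2, "bob@example.com")],
    ["customer_id", "email"],
)

# Merge the updates into an existing Delta table keyed by customer_id.
target = DeltaTable.forPath(spark, "/mnt/curated/customers")
(
    target.alias("t")
    .merge(updates.alias("s"), "t.customer_id = s.customer_id")
    .whenMatchedUpdateAll()      # update rows that already exist
    .whenNotMatchedInsertAll()   # insert rows that are new
    .execute()
)
```

Because this MERGE produces the same table whether a batch runs once or is re-run, it avoids duplicate rows and supports the reliability and cost goals listed under the responsibilities.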
Nice to Have:
Experience with CI/CD for data pipelines
Knowledge of data governance tools (Unity Catalog or similar)
Exposure to ML data preparation pipelines
Soft Skills:
Strong communication and documentation skills
Ability to work independently and mentor others
Problem-solver with a focus on delivering business value