What are the responsibilities and job description for the Databricks Engineer position at Interon IT Solutions?
Databricks Engineer - AWS Cloud
Location: Minnesota (Local candidates only)
Duration: Long-term
Job Summary
We are hiring a Databricks Engineer with solid AWS cloud experience to build scalable data pipelines, optimize Spark workloads, and support ongoing data platform modernization initiatives.
Key Responsibilities
- Develop and maintain ETL/ELT pipelines using Databricks, Spark, and Delta Lake (see the sketch after this list).
- Build and optimize distributed data processing jobs using PySpark/Scala.
- Implement data ingestion from structured and unstructured sources into S3/Delta Lake.
- Work closely with architects to deploy Databricks solutions in production.
- Support AWS integrations: S3, Glue Catalog, Lambda, SNS/SQS, Redshift, etc.
- Debug, optimize, and tune Spark jobs for performance.
- Participate in code reviews, testing, and CI/CD deployments.
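For illustration, a minimal PySpark sketch of the kind of pipeline described above: reading raw JSON from S3, applying a light transformation, and appending the result to a Delta Lake table. The bucket path, column names, and table name are hypothetical placeholders, not part of the posting.

```python
# Minimal illustrative PySpark ETL: ingest raw JSON events from S3 and
# append them to a partitioned Delta Lake table. All names are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("events_ingest").getOrCreate()

# Read semi-structured source data from S3 (hypothetical bucket/path).
raw = spark.read.json("s3://example-raw-bucket/events/")

# Light transformation: normalize the timestamp column and derive a
# partition column plus an ingestion timestamp.
cleaned = (
    raw.withColumnRenamed("ts", "event_ts")
       .withColumn("event_date", F.to_date("event_ts"))
       .withColumn("ingest_ts", F.current_timestamp())
)

# Append to a Delta table registered in the metastore / Glue Catalog.
(
    cleaned.write.format("delta")
           .mode("append")
           .partitionBy("event_date")
           .saveAsTable("analytics.events_bronze")
)
```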
Required Skills
- 7 years of data engineering experience.
- Strong Databricks and Spark programming expertise.
- Hands-on AWS cloud experience.
- Experience using Delta Lake, notebooks, workflows, and job clusters.
- Knowledge of Git, CI/CD, automation tools.