Azure Data Engineering Specialist (Medallion, Databricks)

Jobs via Dice
Greenfield, IN Full Time
POSTED ON 11/18/2025
AVAILABLE BEFORE 12/17/2025
Dice is the leading career destination for tech experts at every stage of their careers. Our client, Reliable Software Resources, is seeking the following. Apply via Dice today!

Azure Data Engineering Specialist (Medallion, Databricks)

Location :: Onsite, Greenfield, IN

Duration :: FTE

Job Description ::

We are seeking a highly skilled Data Engineering Specialist with the primary skills listed below to join our dynamic team, which is at the forefront of enabling enterprises in the healthcare sector. The ideal candidate should be passionate about data engineering on Azure Cloud, with a strong focus on DevOps practices, while building products for our customers. They will communicate and collaborate effectively with internal teams and the customer to build code, leveraging or authoring low-level design documents that align with standard coding principles and guidelines.

Skills/Experience

  • 4 years of experience in Azure Databricks with PySpark and 5 years of experience on the Azure Cloud platform
  • 3 years of experience in ADF (Azure Data Factory), ADLS Gen2, and Azure SQL
  • 2 years of experience in Databricks Workflows & Unity Catalog
  • 2 years of experience in Python programming & package builds
  • Prior experience in working on Agile/Scrum projects with exposure to tools like Jira/Azure DevOps
  • Provide constructive feedback during code reviews and be open to receiving feedback on your own code
  • Problem-solving and analytical thinking, with the ability to troubleshoot and resolve issues efficiently
  • Provide regular updates; be proactive and diligent in carrying out responsibilities
  • Communicate effectively with internal and customer stakeholders
  • Comfortable communicating verbally, by email, and via instant messages
  • Strong interpersonal skills to build and maintain productive relationships with team members

Secondary Skills

  • Good to have Microsoft Entra ID (Azure AD) and GitHub Actions skills
  • Good to have orchestration experience using Airflow, Dagster, or Azure Logic Apps
  • Good to have experience working on event-driven architectures using Kafka or Azure Event Hubs (see the streaming sketch after this list)
  • Good to have experience managing cloud storage solutions on Azure Data Lake Storage (ADLS)
  • Good to have exposure to Google Cloud Pub/Sub; experience with Google Cloud Storage is an advantage
  • Good to have experience developing and maintaining Change Data Capture (CDC) solutions, preferably using Debezium
  • Good to have hands-on experience with data migration projects, specifically those involving Azure Synapse and the Databricks Lakehouse
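
The event-driven bullet above maps naturally onto Spark Structured Streaming on Databricks. The sketch below is illustrative only: the broker address, topic name, and lake paths are hypothetical placeholders, and Azure Event Hubs can typically be consumed the same way through its Kafka-compatible endpoint.

```python
# Minimal Structured Streaming sketch for event-driven ingestion from Kafka into a bronze layer.
# Broker address, topic, and paths are placeholders, not values from this posting.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("kafka-ingest-sketch").getOrCreate()

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "orders")                      # placeholder topic
    .option("startingOffsets", "latest")
    .load()
)

# Kafka delivers key/value as binary; cast to strings before landing the raw events.
decoded = events.select(
    col("key").cast("string"),
    col("value").cast("string"),
    col("timestamp"),
)

query = (
    decoded.writeStream
    .format("delta")
    .option("checkpointLocation", "/mnt/bronze/_checkpoints/orders")  # placeholder path
    .outputMode("append")
    .start("/mnt/bronze/orders")                                      # placeholder path
)
```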

Job / Role Description

  • Data management experience handling analytics workloads, covering design, development, and maintenance of Lakehouse solutions that source data from platforms such as ERP systems, APIs, relational stores, NoSQL, and on-prem sources, using Databricks/PySpark as the distributed/Big Data management service and supporting batch and near-real-time ingestion, transformation, and processing
  • Strong experience implementing secure, hierarchical namespace-based Data Lake Storage for structured/semi-structured data, aligned to bronze-silver-gold layers on ADLS Gen2 (see the medallion sketch after this list). Hands-on experience with lifecycle policies, access control (RBAC/ACLs), and folder-level security. Understanding of best practices in file partitioning, retention management, and storage performance optimization
  • Comprehensive experience working across the Azure ecosystem, including networking, security, monitoring, and cost management relevant to data engineering workloads. Understanding of VNets, Private Endpoints, Key Vaults, Managed Identities, and Azure Monitor. Exposure to DevOps tools for deployment automation (e.g., Azure DevOps, ARM/Bicep/Terraform)
  • Skilled in governing and managing data access for Azure Data Lakehouses with Unity Catalog (see the Unity Catalog sketch after this list). Experience configuring data permissions, object lineage, and access policies with Unity Catalog. Understanding of integrating Unity Catalog with Azure AD, external metastores, and audit trails
  • Lead solution design discussions, mentor junior engineers, and ensure adherence to coding guidelines, design patterns, and peer review processes. Able to prepare design documents for development and guide the team technically. Experience preparing technical design documents, HLDs/LLDs, and architecture diagrams. Familiarity with code quality tools (e.g., SonarQube, pylint) and version control workflows (Git)
  • Ability to optimize Spark jobs and manage large-scale data processing using the RDD/DataFrame APIs. Demonstrated expertise in partitioning strategies, file format optimization (Parquet/Delta), and Spark SQL tuning (see the Spark/Delta tuning sketch after this list). Familiarity with Databricks runtime versions, cluster policies, libraries, and workspace management
  • Experience building efficient orchestration solutions using Azure Data Factory and Databricks Workflows. Ability to design modular, reusable workflows using tasks, triggers, and dependencies. Skilled in using dynamic expressions, parameterized pipelines, custom activities, and triggers. Familiarity with integration runtime configurations, pipeline performance tuning, and error handling strategies
  • Capable of developing T-SQL queries and stored procedures and managing metadata layers on Azure SQL. Experience writing modular, testable Python code used in data transformations, utility functions, and packaging reusable components. Familiarity with Python environments, dependency management (pip/Poetry/Conda), and packaging libraries. Ability to write unit tests using pytest/unittest and integrate them with CI/CD pipelines (see the pytest sketch after this list).
  • Demonstrates strong verbal and written communication, proactive stakeholder engagement, and a collaborative attitude in cross-functional teams. Ability to articulate technical concepts clearly to both technical and business audiences. Experience in working with product owners, QA, and business analysts to translate requirements into deliverables
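
As a reference point for the medallion bullet above, here is a minimal PySpark/Delta sketch of a bronze-to-silver hop on ADLS Gen2. The storage account, container paths, and column names are hypothetical, and it assumes the cluster already has credentials configured for the lake.

```python
# Sketch of a bronze -> silver hop in a medallion layout on ADLS Gen2 using Delta Lake.
# Storage account, container paths, and column names are illustrative placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, to_date, current_timestamp

spark = SparkSession.builder.getOrCreate()

BRONZE = "abfss://bronze@examplelake.dfs.core.windows.net/sales/orders"  # placeholder
SILVER = "abfss://silver@examplelake.dfs.core.windows.net/sales/orders"  # placeholder

# Bronze: raw ingested records, kept as delivered.
raw = spark.read.format("delta").load(BRONZE)

# Silver: cleansed, typed, deduplicated records, partitioned for downstream reads.
cleansed = (
    raw.dropDuplicates(["order_id"])
    .filter(col("order_id").isNotNull())
    .withColumn("order_date", to_date(col("order_ts")))
    .withColumn("_processed_at", current_timestamp())
)

(
    cleansed.write.format("delta")
    .mode("overwrite")
    .partitionBy("order_date")
    .save(SILVER)
)
```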
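
For the Unity Catalog bullet, the sketch below shows the general shape of grant-based access management from a Databricks notebook or job. The catalog, schema, and group names are hypothetical, and it assumes a Unity Catalog-enabled workspace where the `spark` session is provided by the runtime.

```python
# Sketch of Unity Catalog access management run from a Databricks notebook or job task.
# Catalog, schema, and group names are hypothetical; `spark` is provided by the Databricks runtime.
spark.sql("CREATE CATALOG IF NOT EXISTS analytics")
spark.sql("CREATE SCHEMA IF NOT EXISTS analytics.silver")

# Grant read access on the silver schema to an Azure AD-backed account group.
spark.sql("GRANT USE CATALOG ON CATALOG analytics TO `data-analysts`")
spark.sql("GRANT USE SCHEMA ON SCHEMA analytics.silver TO `data-analysts`")
spark.sql("GRANT SELECT ON SCHEMA analytics.silver TO `data-analysts`")

# Lineage and audit information can then be reviewed through Unity Catalog's
# system tables, e.g. spark.sql("SELECT * FROM system.access.audit LIMIT 10")
```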
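
For the Spark optimization bullet, here is a sketch of typical partitioning and Delta file-layout tuning steps. Paths and column names are placeholders; OPTIMIZE/ZORDER assumes a Databricks runtime (or a Delta Lake build that supports it).

```python
# Sketch of common Spark/Delta tuning steps: column-based repartitioning, partitioned
# Delta writes, and file compaction. Paths and column names are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

df = spark.read.format("delta").load("/mnt/silver/sales/orders")  # placeholder path

# Redistribute by the grouping key to reduce skew and small-file pressure downstream.
daily = (
    df.repartition("order_date")
    .groupBy("order_date", "region")
    .count()
)

(
    daily.write.format("delta")
    .mode("overwrite")
    .partitionBy("order_date")
    .save("/mnt/gold/sales/daily_order_counts")  # placeholder path
)

# Compact files and co-locate rows for selective queries on region (Databricks runtime).
spark.sql("OPTIMIZE delta.`/mnt/gold/sales/daily_order_counts` ZORDER BY (region)")
```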
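
For the Python testing bullet, here is a sketch of a pytest-style unit test around a small PySpark transformation that can run locally. The function, columns, and sample data are illustrative; in practice such tests would be wired into the CI/CD pipeline mentioned above.

```python
# Sketch of a locally runnable pytest for a small PySpark transformation.
# The transformation, column names, and sample data are illustrative only.
import pytest
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, trim, upper


def standardize_region(df):
    """Trim and upper-case the region column; drop rows that have no region."""
    return (
        df.filter(col("region").isNotNull())
        .withColumn("region", upper(trim(col("region"))))
    )


@pytest.fixture(scope="module")
def spark():
    session = (
        SparkSession.builder.master("local[1]").appName("unit-tests").getOrCreate()
    )
    yield session
    session.stop()


def test_standardize_region(spark):
    df = spark.createDataFrame([(" east ",), ("West",), (None,)], "region string")
    result = sorted(r.region for r in standardize_region(df).collect())
    assert result == ["EAST", "WEST"]
```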

Best Regards,

Abdul Samad

IT Talent Acquisition
