
Senior Data Engineer – Azure & Databricks Focus

Verus Mortgage Capital
Bloomington, MN | Full Time
POSTED ON 12/20/2025
AVAILABLE BEFORE 2/17/2026

Department: Information Technology - Data & Reporting

Location: Bloomington, MN

Compensation: $130,000 - $150,000 / year

Description

We’re seeking a Senior Data Engineer with deep experience in the Azure ecosystem, particularly Azure Databricks, Delta Lake, and SQL, to drive the build-out of our medallion data platform.

You’ll design and deliver high-performance, governed data pipelines that integrate data from diverse sources—SQL Server Managed Instance, other Azure-based systems, and third-party sources—using PySpark, SQL, and Databricks utilities. You will collaborate closely with data analytics teams and business stakeholders to ensure the datasets you produce enable self-service insights and analytics.

Experience working with AI/ML or a strong interest in creating a data platform to support machine learning is highly valued. Mortgage or financial-services experience is a plus, but not required.

Responsibilities and Duties:

  • Design, develop, and optimize data pipelines in Azure Databricks using PySpark and SQL, applying Delta Lake and Unity Catalog best practices.
  • Build modular, reusable libraries and utilities within Databricks to accelerate development and standardize workflows.
  • Implement Medallion architecture (Bronze, Silver, Gold layers) for scalable, governed data zones.
  • Integrate external data sources via REST APIs, SFTP file delivery, and SQL Server Managed Instance, implementing validation, logging, and schema enforcement.
  • Utilize parameter-driven jobs and manage compute using Spark clusters and Databricks serverless compute (a minimal illustration of a parameter-driven pipeline step appears after this list).
  • Collaborate with data analytics teams and business stakeholders to understand requirements and deliver analytics-ready datasets.
  • Monitor and troubleshoot Azure Data Factory (ADF) pipelines (jobs, triggers, activities, data flows) to identify and resolve job failures and data issues.
  • Automate deployments and manage code using Azure DevOps for CI/CD, version control, and environment management.
  • Contribute to documentation, architectural design, and continuous improvement of data engineering best practices.
  • Support the design and readiness of the data platform for AI and machine learning initiatives.
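For context on the pipeline work described above, here is a minimal, illustrative sketch of a parameter-driven Bronze-to-Silver step in a Databricks notebook using PySpark and Delta Lake. It assumes the Databricks-provided `spark` and `dbutils` objects; the catalog, table, and column names are placeholders, not details from this posting.

```python
# Minimal sketch of a parameter-driven Bronze -> Silver step (illustrative only).
# Table and column names below are placeholders, not part of this posting.
from pyspark.sql import functions as F

# Parameters supplied by the calling job via Databricks widgets.
dbutils.widgets.text("source_table", "raw.bronze.loans")
dbutils.widgets.text("target_table", "curated.silver.loans")
source_table = dbutils.widgets.get("source_table")
target_table = dbutils.widgets.get("target_table")

# Read the Bronze layer, apply simple validation/cleansing, and land the result
# in Silver as a governed Delta table. Delta enforces the table schema on write.
bronze_df = spark.table(source_table)

silver_df = (
    bronze_df
    .dropDuplicates(["loan_id"])                  # placeholder business key
    .filter(F.col("loan_amount").isNotNull())     # simple validation rule
    .withColumn("ingested_at", F.current_timestamp())
)

(silver_df.write
    .format("delta")
    .mode("overwrite")
    .option("overwriteSchema", "false")   # rely on Delta schema enforcement
    .saveAsTable(target_table))
```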

Education and Experience:

  • 5 years of hands-on data-engineering experience in Azure-centric environments.
  • Expertise with Azure Databricks, PySpark, Delta Lake, and Unity Catalog.
  • Strong SQL skills with experience in Azure SQL Database or SQL Server Managed Instance.
  • Proficiency in Azure Data Factory for troubleshooting and operational support.
  • Experience integrating external data using REST APIs and SFTP.
  • Working knowledge of Azure DevOps for CI/CD, version control, and parameterized deployments.
  • Ability to build and maintain reusable Databricks libraries, utility notebooks, and parameterized jobs.
  • Proven track record partnering with data analytics teams and business stakeholders.
  • Excellent communication, problem-solving, and collaboration skills.
  • Interest or experience in AI and machine learning data preparation.

Preferred Qualifications:

  • Experience implementing Medallion architecture and working within governed data environments.
  • Knowledge of data governance, RBAC, and secure access controls in Azure.
  • Familiarity with dimensional modeling, data warehousing concepts, and preparing datasets for BI tools (e.g., Power BI).
  • Understanding of Spark cluster management, serverless compute, and performance optimization.
  • Exposure to creating and managing Databricks utility widgets and leveraging Delta Lake features such as time travel and schema enforcement (illustrated briefly after this list).
  • Mortgage or financial-services industry experience (a plus but not required).
  • Hands-on experience preparing datasets for AI/ML models.
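As a brief illustration of the Delta Lake features mentioned above (time travel and schema enforcement), the sketch below uses a placeholder table name and assumes an active Databricks/PySpark session:

```python
# Illustrative only; "curated.silver.loans" is a placeholder table name.

# Time travel: query the table as it existed at an earlier version.
prev_df = spark.sql("SELECT * FROM curated.silver.loans VERSION AS OF 3")

# Schema enforcement: appending a DataFrame whose schema does not match the
# target Delta table is rejected instead of silently corrupting the table.
bad_df = spark.createDataFrame([(1, "unexpected")], ["loan_id", "not_a_real_column"])
try:
    bad_df.write.format("delta").mode("append").saveAsTable("curated.silver.loans")
except Exception as e:  # typically an AnalysisException
    print(f"Write rejected by schema enforcement: {e}")
```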

Key Competencies:

  • Azure Data Engineering Expertise: Skilled in Azure Databricks, PySpark, Delta Lake, Unity Catalog, and SQL-based environments.
  • Data Pipeline Development: Proven ability to design, optimize, and maintain scalable ETL/ELT pipelines using Databricks and Azure Data Factory.
  • Data Architecture & Governance: Knowledge of Medallion architecture, schema enforcement, RBAC, and secure access controls.
  • Integration Skills: Experience ingesting and validating data from REST APIs, SFTP, SQL Server Managed Instance, and other Azure sources.
  • DevOps & Automation: Strong proficiency with Azure DevOps for CI/CD, version control, and automated deployments.
  • Performance Optimization: Ability to tune Spark clusters, leverage serverless compute, and optimize processing at scale.

How This Role Demonstrates Our Values:
  • Integrity: Builds secure, governed, and trustworthy data solutions.
  • Collaboration: Partners effectively with stakeholders and teams.
  • Excellence: Delivers high-quality, optimized, and scalable pipelines.
  • Critical Curiosity: Learns, questions, and innovates with new technologies.

Benefits

  • Great compensation package
  • Attractive benefits plans and paid time off
  • 401(k) w/ company matching
  • Professional learning and development opportunities
  • Tuition Reimbursement
  • And much more!

Successful applicants will exemplify strong ethics, integrity, respect for others, accountability for decisions and actions, and good citizenship.

Maintaining a reliable, uninterrupted high-speed internet connection is a requirement for hybrid or remote positions.

All job duties and responsibilities must be performed within the guidelines of the Verus Residential Mortgage Employee Handbook and established company policies and procedures. It is the responsibility of each employee to maintain the confidentiality of the company and its clients and to follow applicable laws and regulations in the performance of duties.

Verus Mortgage Capital is an equal opportunity employer. All qualified applicants are welcome to apply and will receive consideration for employment without unlawful discrimination because of a person’s race, religious creed, color, national origin, citizenship status, ancestry, marital status, sex, age, or sexual orientation, or because of a person’s disability or medical condition.
