Senior Databricks Solution Engineer (EDP & CDP)

Acnovate Corporation
Philadelphia, PA (Contractor)
POSTED ON 4/28/2026
AVAILABLE BEFORE 5/28/2026

Senior Databricks Solution Engineer

Enterprise Data Platform (EDP) & Customer Data Platform (CDP)
Azure Databricks | Lakeflow | Delta Live Tables | ADF | Kafka | MongoDB


This role blends solution architecture, hands‑on data engineering, and enterprise enablement, with a strong focus on retail, customer, marketing, and digital data. You will define and enforce enterprise data standards, build reusable frameworks, and deliver secure, governed data pipelines that support analytics, personalization, marketing activation, and executive reporting.


Key Responsibilities


1. Enterprise Lakehouse Architecture & Standards

  • Define and enforce Client’s Lakehouse architecture standards using Azure Databricks, aligned to scalability, security, and cost efficiency.
  • Implement and operationalize the Medallion Architecture (Bronze / Silver / Gold) as the enterprise standard:
    • Bronze – raw, immutable, audit‑ready ingestion
    • Silver – cleansed, conformed, validated, and privacy‑compliant datasets
    • Gold – curated, analytics‑ready, semantic‑aligned data products (including standardized current and history tables where required, e.g., composite_hist / SCD Type 2; a merge sketch follows this list)
  • Establish reference architectures, design patterns, and guardrails that enable consistent adoption across Stores, Digital, Marketing, Supply Chain, and Corporate domains.
  • Standardize on Unity Catalog for all new and migrated workloads (minimize/retire Hive Metastore usage), including consistent catalog/schema conventions, data ownership, and access controls.
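
For illustration, a minimal sketch of the history-table pattern above (SCD Type 2 over Delta), assuming a conformed change feed in Silver; all table and column names here are placeholders, not Client standards:

```python
from delta.tables import DeltaTable
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Assumed inputs: a change feed containing only new/changed customer records,
# and a Gold history table with valid_from / valid_to / is_current columns.
updates = spark.table("silver.customer_changes")
hist = DeltaTable.forName(spark, "gold.customer_hist")

# Pass 1: close out current rows whose tracked attributes changed.
(hist.alias("h")
 .merge(updates.alias("u"),
        "h.customer_id = u.customer_id AND h.is_current = true")
 .whenMatchedUpdate(
     condition="h.attr_hash <> u.attr_hash",
     set={"is_current": "false", "valid_to": "u.effective_ts"})
 .execute())

# Pass 2: append the incoming versions as the new current rows.
(updates
 .withColumnRenamed("effective_ts", "valid_from")
 .withColumn("valid_to", F.lit(None).cast("timestamp"))
 .withColumn("is_current", F.lit(True))
 .write.format("delta").mode("append").saveAsTable("gold.customer_hist"))
```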

2. Declarative Pipelines & Lakeflow (Databricks‑native)

  • Build declarative data pipelines using Databricks Lakeflow and Delta Live Tables (DLT) as the preferred enterprise pattern.
  • Define data quality expectations, freshness SLAs, and validation rules directly within pipelines (see the sketch after this list).
  • Leverage DLT capabilities for:
    • automated dependency management
    • data quality enforcement
    • lineage and observability
    • operational simplicity at scale
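
A minimal DLT sketch of the expectation pattern above; the table names and thresholds are illustrative, not Client standards:

```python
import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Cleansed, validated orders (Silver).")
@dlt.expect_or_drop("valid_order_id", "order_id IS NOT NULL")  # hard rule: drop violations
@dlt.expect("fresh_enough", "event_ts > current_timestamp() - INTERVAL 7 DAYS")  # soft rule: track only
def orders_silver():
    # Incremental read from an assumed Bronze streaming table in the same pipeline.
    return (dlt.read_stream("orders_bronze")
            .withColumn("order_total", F.col("order_total").cast("decimal(18,2)")))
```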

3. Inbound Data Ingestion Standards (Kafka, Lakeflow Connectors, Auto Loader, Databricks Streaming)

  • Standardize inbound ingestion using:
    • Apache Kafka for event-driven and streaming ingestion (pub/sub, CDC fanout, operational events)
    • Databricks Auto Loader for scalable, incremental file ingestion from cloud object storage with schema inference/evolution
    • Lakeflow Connectors for managed ingestion from SaaS applications and databases (connector-based patterns with governed landing into Bronze)
    • Databricks Structured Streaming (and streaming tables where applicable) for continuous ingestion and low-latency processing into Delta
  • Design resilient ingestion frameworks that support high‑volume customer, marketing, and operational data with schema evolution and fault tolerance (an Auto Loader sketch follows this list).
  • Apply consistent ingestion controls across internal systems and external vendors.
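
A minimal Auto Loader sketch of the Bronze file-ingestion pattern above; the storage path, checkpoint locations, and target table are placeholders:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

(spark.readStream
 .format("cloudFiles")                                    # Auto Loader
 .option("cloudFiles.format", "json")
 .option("cloudFiles.schemaLocation", "/checkpoints/orders/_schema")
 .option("cloudFiles.schemaEvolutionMode", "addNewColumns")
 .load("abfss://landing@storageacct.dfs.core.windows.net/orders/")
 .writeStream
 .option("checkpointLocation", "/checkpoints/orders")
 .trigger(availableNow=True)                              # incremental batch
 .toTable("bronze.orders_raw"))
```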

4. CDP, Marketing & Digital Data Integrations

  • Design and deliver customer and marketing data integrations supporting Client’s CDP and activation ecosystem.
  • Build and manage pipelines integrating data from platforms including:
    • Acxiom / BRIDGE
    • LiveRamp
    • Blueshift
    • Ovative
    • Google Analytics
    • Google Ads
    • Meta Ads (Facebook / Instagram)
  • Enable enterprise use cases such as:
    • customer identity resolution (a hashed match-key sketch follows this list)
    • audience segmentation and activation
    • marketing attribution and performance analytics
    • personalization and lifecycle analysis
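
As one common identity-resolution building block (an assumption here, not a stated Client method), match keys are often produced by normalizing an identifier and hashing it before onboarding to partners such as LiveRamp:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Normalize, then irreversibly hash, the identifier; table and column names
# are assumptions.
match_keys = (spark.table("silver.customers")
              .select("customer_id",
                      F.sha2(F.lower(F.trim("email")), 256).alias("email_sha256")))
```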

5. Enterprise Data Modeling Methodology (Explicit Standard)

  • Define and enforce Client’s Enterprise Data Modeling methodology, including:
    • Canonical data models for shared enterprise entities (customer, store, product, transaction, vendor)
    • Dimensional modeling (Star / Snowflake schemas) for analytics and reporting, as sketched after this list
    • Semantic modeling aligned to downstream BI and analytics tools
  • Ensure conformed dimensions and consistent metric definitions across domains.
  • Partner with analytics and business teams to validate business meaning and usability.
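
A star-schema sketch covering one conformed dimension and one fact table; the names, keys, and grain are illustrative only:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

spark.sql("""
CREATE TABLE IF NOT EXISTS gold.dim_store (
  store_key BIGINT GENERATED ALWAYS AS IDENTITY,  -- surrogate key
  store_id  STRING NOT NULL,                      -- natural key
  region    STRING,
  open_date DATE
) USING DELTA
""")

spark.sql("""
CREATE TABLE IF NOT EXISTS gold.fact_transaction (
  transaction_id STRING NOT NULL,
  store_key      BIGINT,          -- conforms to gold.dim_store
  customer_key   BIGINT,          -- conforms to an assumed gold.dim_customer
  txn_date       DATE,
  net_sales      DECIMAL(18,2)
) USING DELTA
PARTITIONED BY (txn_date)
""")
```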

6. Semantic Enablement & Consumption (MSTR, Power BI, Unity Catalog)

  • Provision trusted, analytics‑ready datasets for:
    • MicroStrategy (MSTR)
    • Power BI
  • Leverage Unity Catalog as the system of record for:
    • data governance and access control
    • lineage and discovery
    • semantic consistency and certification
    • data classification via Unity Catalog tags (e.g., PII/sensitivity, domain, certification) to drive masking, policy enforcement, and controlled publishing (see the masking sketch after this list)
  • Promote enterprise metric definitions via governed semantic models (e.g., Unity Catalog metrics/semantic layer where adopted) to ensure consistent KPIs across MSTR, Power BI, and downstream consumers.
  • Ensure Gold‑layer datasets are optimized, documented, and certified for enterprise consumption.
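
A sketch of the tagging and column-masking patterns above, issued as Unity Catalog SQL through PySpark; the table, tag values, and group name are assumptions:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Classify the column so policy tooling and discovery can find it.
spark.sql("ALTER TABLE gold.dim_customer "
          "ALTER COLUMN email SET TAGS ('sensitivity' = 'pii')")

# Mask the column for everyone outside an assumed privileged group.
spark.sql("""
CREATE OR REPLACE FUNCTION gold.mask_email(email STRING)
RETURNS STRING
RETURN CASE WHEN is_account_group_member('pii_readers') THEN email
            ELSE '***REDACTED***' END
""")
spark.sql("ALTER TABLE gold.dim_customer "
          "ALTER COLUMN email SET MASK gold.mask_email")
```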

7. Orchestration & Integration (ADF, Airflow, Databricks Workflows)

  • Orchestrate end‑to‑end pipelines using Azure Data Factory (ADF) and/or Apache Airflow, integrated with Databricks Workflows.
  • Define dependency management, retry patterns, alerting, and operational ownership for production workloads (an Airflow sketch follows).
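
A minimal Airflow sketch of this orchestration pattern using the Databricks provider; the job id, connection id, schedule, and alert address are placeholders:

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.providers.databricks.operators.databricks import DatabricksRunNowOperator

with DAG(
    dag_id="edp_gold_refresh",
    start_date=datetime(2026, 1, 1),
    schedule="0 5 * * *",                     # daily, after upstream loads
    catchup=False,
    default_args={
        "retries": 2,                         # retry pattern
        "retry_delay": timedelta(minutes=10),
        "email_on_failure": True,             # alerting hook
        "email": ["data-platform-oncall@example.com"],
    },
) as dag:
    DatabricksRunNowOperator(
        task_id="run_gold_workflow",
        databricks_conn_id="databricks_default",
        job_id=12345,                         # assumed Databricks Workflows job
    )
```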

8. PII Protection & 3‑Layer Encryption (Client Standard)

Establish PII protection as a non‑negotiable enterprise standard, including mandatory 3‑layer encryption:

  1. In‑Transit Encryption
    • TLS‑based secure transport for all internal and external transfers.
  2. File‑Level Encryption at Rest
    • Encrypted files and objects for vendor, marketing, and partner exchanges.
  3. Record‑Level / Element‑Level Encryption & Hashing
    • Attribute‑level protection for PII used in CDP, marketing, and analytics workflows (sketched at the end of this section).
    • Enforce protections using Unity Catalog controls where applicable (e.g., masking policies and fine‑grained access controls) to ensure governed use across analytics and activation.

Ensure full auditability, regulatory compliance (GDPR, CCPA), and consistent enforcement across platforms and vendors.
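
A sketch of layer 3 (record/element-level protection) in PySpark, assuming a Key Vault-backed secret scope; note that production code would avoid interpolating the key into an expression as shown here:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# dbutils is provided by the Databricks runtime; the scope and key names are
# placeholders. AES keys must be 16, 24, or 32 bytes.
key = dbutils.secrets.get("pii-scope", "aes-key")

protected = (spark.table("silver.customers")
             # Irreversible hash for join/match keys.
             .withColumn("email_sha256", F.sha2(F.lower(F.trim("email")), 256))
             # Reversible element-level encryption (Spark built-in aes_encrypt).
             .withColumn("phone_enc", F.expr(f"aes_encrypt(phone, '{key}')"))
             .drop("email", "phone"))

protected.write.format("delta").mode("overwrite").saveAsTable("silver.customers_protected")
```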


9. Secure Vendor & Partner Data Exchange

  • Design and operate secure, high‑volume data exchanges with advertising, marketing, and data partners.
  • Validate keys, credentials, service accounts, and secure repositories (SFTP, cloud object storage).
  • Provide technical direction to vendors to ensure compliant, end‑to‑end delivery under tight timelines.

10. Performance, Reliability & Cost Optimization

  • Optimize Spark, DLT, and SQL workloads for performance, reliability, and cost efficiency (see the maintenance sketch after this list).
  • Contribute to production support, incident analysis, and continuous platform improvements.
  • Implement production operational standards using the enterprise toolchain (e.g., New Relic monitoring, PagerDuty incident response/on-call, ServiceNow ticketing), including alerting, runbooks, and SLAs.
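
A sketch of routine Delta maintenance behind these optimizations; the table and Z-ORDER column are illustrative:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Compact small files and cluster by a common filter column.
spark.sql("OPTIMIZE gold.fact_transaction ZORDER BY (customer_key)")

# Remove files no longer referenced (default 7-day retention applies).
spark.sql("VACUUM gold.fact_transaction")

# Keep future writes well-sized automatically.
spark.sql("""
ALTER TABLE gold.fact_transaction SET TBLPROPERTIES (
  'delta.autoOptimize.optimizeWrite' = 'true',
  'delta.autoOptimize.autoCompact'   = 'true'
)
""")
```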

11. Leadership & Enablement

  • Mentor engineers and lead architecture reviews across platform, analytics, and marketing teams.
  • Drive adoption of enterprise patterns through documentation, reviews, and enablement sessions.

Required Qualifications

  • 8+ years of experience in data engineering, solution architecture, or platform engineering.
  • Deep experience with Azure Databricks, Spark, Delta Lake, Lakeflow, Delta Live Tables, Python/PySpark, SQL.
  • Experience with Kafka, Auto Loader / Structured Streaming, ADF, and/or Airflow.
  • Strong experience in enterprise data modeling, governance, and BI enablement.
  • Proven delivery of secure, compliant, enterprise‑scale data platforms.

