Design and implement an Azure cloud-based data warehousing and governance architecture following the Lakehouse paradigm, integrating technical functionality while ensuring data accessibility, accuracy, and security.
Architect the Unity Catalog to provide centralized access control, auditing, lineage, and data discovery capabilities across Databricks workspaces.
Define and organize data assets (structured and unstructured) within the Unity Catalog.
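Organizing assets in Unity Catalog means registering them under its three-level namespace (catalog.schema.table). A minimal sketch of the DDL involved is below; the catalog and schema names are illustrative, and on Databricks each statement would be executed via `spark.sql()`:

```python
def namespace_ddl(catalog: str, schema: str) -> list:
    """Build the DDL that registers a catalog and schema in Unity Catalog.

    Returned statements are plain SQL strings; on Databricks each would
    be run with spark.sql(stmt)."""
    return [
        f"CREATE CATALOG IF NOT EXISTS {catalog}",
        f"CREATE SCHEMA IF NOT EXISTS {catalog}.{schema}",
    ]

# Hypothetical names for illustration only.
for stmt in namespace_ddl("lakehouse", "sales"):
    print(stmt)
```

Tables then register under the schema (e.g. `lakehouse.sales.orders`), which is what makes them discoverable and governable from a single place.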
Enable data analysts and ETL engineers to discover and classify data, notebooks, dashboards, and files across clouds and platforms.
Implement a single permission model for data and AI assets.
Define access policies at a granular level (rows, columns, features) to ensure secure and consistent access management across workspaces and platforms.
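Granular access in Unity Catalog combines table-level grants with row filters and column masks. The sketch below builds the SQL a grant plus a row filter would require; the principal, table, and filter-function names are hypothetical, and on Databricks each statement would run via `spark.sql()`:

```python
def secure_table(table: str, principal: str,
                 filter_fn: str, filter_col: str) -> list:
    """Grant read access to a principal and attach a row filter
    so each group sees only its permitted rows.

    filter_fn must be a registered SQL UDF returning BOOLEAN."""
    return [
        f"GRANT SELECT ON TABLE {table} TO `{principal}`",
        f"ALTER TABLE {table} SET ROW FILTER {filter_fn} ON ({filter_col})",
    ]

# Illustrative names only.
for stmt in secure_table("lakehouse.sales.orders", "analysts",
                         "lakehouse.sales.region_filter", "region"):
    print(stmt)
```

Because grants attach to the catalog object rather than a workspace, the same policy applies in every workspace bound to the metastore.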
Leverage Delta Sharing to enable easy data sharing across regions and platforms.
Ensure that data and AI assets can be securely shared with minimal replication, maintaining a unified experience for users.
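Sharing without replication works by creating a share, adding tables to it, and granting it to a recipient. A hedged sketch of the provider-side SQL is below; the share, table, and recipient names are illustrative, and on Databricks each statement would run via `spark.sql()`:

```python
def share_table(share: str, table: str, recipient: str) -> list:
    """Build the Delta Sharing SQL that exposes one table to an external
    recipient without copying the data."""
    return [
        f"CREATE SHARE IF NOT EXISTS {share}",
        f"ALTER SHARE {share} ADD TABLE {table}",
        f"CREATE RECIPIENT IF NOT EXISTS {recipient}",
        f"GRANT SELECT ON SHARE {share} TO RECIPIENT {recipient}",
    ]

# Hypothetical names for illustration only.
for stmt in share_table("sales_share", "lakehouse.sales.orders", "emea_partner"):
    print(stmt)
```

The recipient reads the shared table through the open Delta Sharing protocol, so the data stays in place and access can be revoked centrally.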
Monitoring and observability: use AI to automate monitoring, diagnose errors, and maintain data quality.
Set up alerts for personally identifiable information (PII) detection and operational intelligence.
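At its core, PII detection scans column values against known patterns and raises an alert on a match. The sketch below shows the pattern-matching step in plain Python; the patterns are illustrative (a real deployment would use a managed scanner or a broader rule set):

```python
import re

# Illustrative patterns only; production scanners cover many more PII types.
PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def detect_pii(value: str) -> list:
    """Return the names of all PII patterns found in a field value."""
    return [name for name, pat in PII_PATTERNS.items() if pat.search(value)]

print(detect_pii("reach me at jane@example.com"))  # ['email']
print(detect_pii("order shipped"))                 # []
```

Hits would then be tagged in the catalog and routed to an alerting channel so governance owners can mask or restrict the affected columns.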
Work closely with data scientists, analysts, and engineers to promote adoption of the Unity Catalog.
Provide training and documentation to ensure effective usage and compliance with governance policies.
Skills:
Design data warehouse and data lake solutions, along with data processing pipelines, using PySpark on Databricks.
Perform data modelling on Databricks (Delta tables) for transactional and analytical needs.
Design and develop pipelines to load data into the data lake.
Databricks Platform Proficiency, including its components like Databricks SQL, Delta Live Tables, Databricks Repos, and Task Orchestration.
Deep understanding of data governance principles, especially related to data cataloging, access control, lineage, and metadata management.
Strong SQL skills for querying and managing data
Ability to design and optimize data models for structured and unstructured data.
Understand how to manage compute resources, including clusters and workspaces.
Ability to adapt to changes and emerging trends in data engineering and governance.
Hands-on development and configuration of Unity Catalog.
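The pipeline skills above typically follow a medallion pattern: raw (bronze) records are cleansed and standardised before landing in a silver Delta table. The sketch below shows that cleansing step in plain Python; in a PySpark job the same logic would be DataFrame filters and column expressions, and the field names are illustrative:

```python
def to_silver(bronze_rows):
    """Cleanse bronze records for a silver table: drop rows missing the
    key, trim/normalise strings, and cast amounts to rounded floats."""
    silver = []
    for row in bronze_rows:
        if not row.get("order_id"):      # reject malformed records
            continue
        silver.append({
            "order_id": row["order_id"],
            "customer": row.get("customer", "").strip().lower(),
            "amount": round(float(row.get("amount", 0)), 2),
        })
    return silver

# Illustrative sample input.
raw = [
    {"order_id": "A1", "customer": "  Acme ", "amount": "19.99"},
    {"order_id": None, "customer": "bad row", "amount": "1"},
]
print(to_silver(raw))
```

On Databricks the cleansed output would be written with `df.write.format("delta")` (or a Delta Live Tables expectation would enforce the same rules declaratively).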
Full Time
IT Outsourcing & Consulting
$124k-153k (estimate)
05/25/2024
05/27/2024
speridian.com
ALBUQUERQUE, NM
1,000 - 3,000
2003
Private
ALI HASAN
$200M - $500M
IT Outsourcing & Consulting
Speridian Technologies is a provider of Information Technology business consulting, enterprise applications and business process services.
The job skills required for a Senior Data Platform Engineer include SQL, data engineering, data warehousing, Azure, and data processing; related skills and expertise are an advantage when applying and can affect the salary offered.
A typical career advancement route for a Senior Data Platform Engineer is promotion to a more senior position such as Database Engineer IV, which carries broader responsibilities and higher pay.