What are the responsibilities and job description for the Azure Data Architect position at Jobs via Dice?
Dice is the leading career destination for tech experts at every stage of their careers. Our client, TechVirtue LLC, is seeking the following. Apply via Dice today!
Job Title: Azure Data Architect
Location: Edison, New Jersey - Onsite 5 Days
Duration: 12 Months
Primary Skill: Azure Databricks
Domain: Must have domain experience in Property and Casualty (P&C) insurance.
Responsibilities:
- Develop modern data solutions and architecture for cloud-native data platforms.
- Build cost-effective infrastructure in Databricks and orchestrate workflows using Databricks/ADF.
- Lead data strategy sessions focused on scalability, performance, and flexibility.
- Collaborate with customers to implement solutions for data modernization.
- Create training plans and learning materials to upskill VM associates.
- Build a smart operations framework for DataOps and MLOps.
Requirements:
- Should have 14 years of experience, with the last 4 years spent implementing cloud-native, end-to-end data solutions in Databricks, from ingestion to consumption, supporting a variety of needs such as a modern data warehouse, BI, insights, and analytics.
- Should have experience architecting and implementing end-to-end modern data solutions using Azure and advanced data processing frameworks such as Databricks.
- Experience with Databricks, PySpark, and modern data platforms.
- Proficiency in cloud-native architecture and data governance.
- Strong experience in migrating from on-premises to cloud solutions (Spark, Hadoop to Databricks).
- Understanding of Agile/Scrum methodologies.
- Demonstrated knowledge of data warehouse concepts, with a strong understanding of cloud-native databases and columnar database architectures.
- Ability to work with Data Engineering, Data Management, and BI and Analytics teams in a complex IT development environment.
- Good appreciation of, and at least one implementation experience with, data engineering processing substrates such as ETL tools, Kafka, and ELT techniques.
- Knowledge of designing and implementing Data Mesh and Data Products is an added advantage.