What are the responsibilities and job description for the Azure Databricks Developer position at Avance Consulting?
Role:- Azure Databricks Developer
Location:- Plano, TX
Job Type:- Contract
Job Description
Required Qualifications:
- At least 5 years of experience in Information Technology
- Candidates authorized to work for any employer in the United States without employer-based visa sponsorship are welcome to apply. Infosys is unable to provide immigration sponsorship for this role at this time
- Proven hands-on experience in Azure and Databricks
- Proficiency in data engineering tools: Spark, Kafka, Airflow, SQL, Python
- Strong understanding of retail or CPG data domains
- Familiarity with vector search and semantic retrieval for retail use cases
- Expertise in data privacy and compliance in consumer-facing environments
- Good experience in end-to-end implementation of data warehouse and data marts
- Strong understanding of Data Warehousing concepts such as Data Vault and Dimensional Modeling
- Extensive experience with CI/CD platforms such as GitLab CI, GitHub Actions, Azure Pipelines, and Jenkins
- Experience designing and implementing complex solutions for distributed systems
- Experience in leading and mentoring teams
Preferred Qualifications:
- Experience in Teradata
- Proven, demonstrable hands-on experience in data migration solutions
- Experience in Relational Modeling, Dimensional Modeling, and modeling of unstructured data
- Familiarity with log and monitoring solutions
- Experience using development containers and standard best practices such as unit testing, code quality review, and linting
- Good understanding of Agile software development frameworks
- Strong communication and analytical skills
- Ability to work in a diverse, multi-stakeholder environment comprising Business and Technology teams
- Experience and desire to work in a global delivery environment
- Experience and detailed knowledge of Master Data Management, ETL, data quality, metadata management, data profiling, micro-batches, and streaming data loads