What are the responsibilities and job description for the DE Architect - Databricks position at Tredence Inc.?
Role description
Job Title: DE Architect
Location: Cincinnati, OH (Hybrid)
Industry/Field: Retail
Experience: 7-12 years
Company: Tredence

About Us:
Tredence differentiates itself from other analytics services firms by enabling last-mile adoption of insights. We drive true business impact by uniting our strengths in business analytics, data science, and software engineering. We help our clients deliver on the artificial intelligence imperative in a practical and demystified way. To know more about us, visit: www.tredence.com

Job Overview:
As a DE Architect at Tredence, you will design and implement scalable Databricks-based data architectures tailored for the retail industry. Success in this role means delivering robust lakehouse solutions that enable actionable insights and drive business impact. You will collaborate closely with cross-functional teams to ensure data quality, governance, and performance meet client expectations. This position plays a key role in advancing Tredence's mission to demystify AI and analytics through pragmatic engineering excellence.

What will your role look like?
- Design and oversee the architecture of Databricks environments, focusing on Delta Lake and Lakehouse frameworks for retail clients.
- Develop and implement data cataloging, data modeling, and data quality strategies to ensure reliable and accessible data assets.
- Lead end-to-end data engineering initiatives using PySpark, Python, and SQL to build scalable and maintainable data pipelines.
- Collaborate with data scientists, business analysts, and software engineers to integrate analytics solutions that deliver measurable business outcomes.
- Establish best practices for data governance, security, and compliance within Databricks platforms.
- Mentor and provide technical guidance to junior engineers and architects on advanced data engineering concepts.
- Continuously evaluate emerging technologies and tools to evolve our architecture and improve efficiency.
You will need:
- Demonstrate 7-12 years of hands-on experience in data engineering architecture within the retail or related industries.
- Exhibit strong expertise in Databricks architecture, including Delta Lake and Lakehouse implementations.
- Apply advanced knowledge in data cataloging, data modeling, and data quality management to complex data ecosystems.
- Utilize proficient programming skills in PySpark, Python, and SQL to design and optimize data pipelines.
- Hold a Bachelor's degree in Computer Science, Engineering, Information Technology, or a closely related field.
- Communicate effectively to translate business requirements into technical solutions and architectural frameworks.

Good to Have Skills:
- Possess certifications related to Databricks, such as Databricks Certified Data Engineer or Professional Data Engineer.
- Have experience working in hybrid cloud environments and integrating multi-cloud data architectures.
- Understand retail-specific data domains and analytics challenges, including inventory, supply chain, and customer data management.
- Demonstrate knowledge of DevOps practices and CI/CD pipelines relevant to data engineering workflows.
- Bring experience with additional big data technologies such as Apache Spark, Kafka, or cloud data services (AWS, Azure, GCP).

Why you will love this job:
- Grow your career by working on cutting-edge data architectures that solve real, impactful business problems.