What are the responsibilities and job description for the Solutions Architect position at LanceSoft, Inc.?
Overview
We are seeking an experienced Solution Architect – Data to lead the design, modernization, and optimization of enterprise-scale data systems within a cloud-based data platform. This role is responsible for defining data architecture, ensuring operational excellence, and establishing governance standards to maintain high-quality, reliable, and secure data assets.
The Solution Architect will play a key role in building scalable data platforms, supporting AI-driven solutions, and guiding cloud migration initiatives. This position requires strong technical depth, architectural leadership, and close collaboration with business stakeholders, data engineers, machine learning teams, and domain experts.
The ideal candidate will have a proven track record of delivering end-to-end ETL/ELT pipelines across Databricks, Azure, and AWS environments.
Key Responsibilities
- Design and implement scalable data lake and data platform architectures using Databricks and cloud-native services.
- Develop metadata-driven, parameterized ingestion frameworks and multi-layer data architectures (a minimal sketch follows this list).
- Optimize data pipelines, workloads, and system performance.
- Define and enforce data governance frameworks, ensuring data integrity, accuracy, and consistency.
- Architect and develop robust ETL/ELT pipelines and data integration solutions.
- Design and support AI-driven data systems, including retrieval-augmented generation (RAG) workflows and prompt engineering (a second sketch after this list illustrates a RAG flow).
- Lead cloud migration initiatives from legacy systems to modern cloud data platforms.
- Provide architectural guidance, best practices, and technical leadership across engineering teams.
- Create and maintain documentation, reusable modules, and standardized architecture patterns.
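For illustration, here is a minimal sketch of what a metadata-driven, parameterized ingestion framework might look like on Databricks. The control-table entries, source paths, and target table names are hypothetical placeholders, not part of any actual codebase for this role:

```python
# Minimal sketch of a metadata-driven ingestion loop on Databricks.
# Paths, formats, and table names below are illustrative assumptions.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Hypothetical control metadata: one entry per source, describing how to
# load it and where it lands in the bronze layer of a multi-layer design.
ingestion_config = [
    {"source_path": "/mnt/raw/orders/", "format": "json",
     "bronze_table": "bronze.orders", "load_mode": "append"},
    {"source_path": "/mnt/raw/customers/", "format": "csv",
     "bronze_table": "bronze.customers", "load_mode": "overwrite"},
]

def ingest(entry: dict) -> None:
    """Load one source into its bronze Delta table, driven entirely by metadata."""
    reader = spark.read.format(entry["format"])
    if entry["format"] == "csv":
        reader = reader.option("header", "true")
    df = reader.load(entry["source_path"])
    (df.write.format("delta")
       .mode(entry["load_mode"])
       .saveAsTable(entry["bronze_table"]))

for entry in ingestion_config:
    ingest(entry)
```

Adding a new source then means adding a metadata entry rather than writing new pipeline code, which is the point of the pattern.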
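Likewise, a minimal sketch of a RAG workflow against Azure OpenAI. The endpoint, API key, deployment names, and in-memory document store are placeholder assumptions for illustration only:

```python
# Minimal RAG sketch: embed documents, retrieve the best match by cosine
# similarity, and ground the chat completion in that context.
import numpy as np
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com",  # placeholder
    api_key="<your-key>",                                       # placeholder
    api_version="2024-02-01",
)

documents = [
    "Databricks jobs are scheduled through Workflows.",
    "Delta Lake provides ACID transactions on cloud object storage.",
]

def embed(texts):
    """Embed a batch of strings with a (hypothetical) embedding deployment."""
    resp = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return np.array([d.embedding for d in resp.data])

doc_vectors = embed(documents)

def answer(question: str) -> str:
    q = embed([question])[0]
    # Cosine similarity against every stored document; keep the top match.
    scores = doc_vectors @ q / (np.linalg.norm(doc_vectors, axis=1) * np.linalg.norm(q))
    context = documents[int(scores.argmax())]
    resp = client.chat.completions.create(
        model="gpt-4o",  # placeholder chat deployment name
        messages=[
            {"role": "system", "content": "Answer using only the provided context."},
            {"role": "user", "content": f"Context: {context}\n\nQuestion: {question}"},
        ],
    )
    return resp.choices[0].message.content

print(answer("What gives Delta Lake ACID guarantees?"))
```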
Required Skills & Experience
- Strong expertise with cloud platforms, primarily Azure or AWS.
- Hands-on experience with Databricks.
- Advanced proficiency in Python and SQL.
- Extensive experience building ETL/ELT pipelines and Azure Data Factory (ADF) workflows.
- Proven experience designing data lakes and implementing data governance frameworks (a minimal data-quality sketch follows this list).
- Hands-on experience with CI/CD pipelines, DevOps practices, and Git-based development.
- Ability to translate complex business requirements into scalable technical solutions.
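As an illustration of the governance work this role covers, a minimal rule-based data-quality check in PySpark; the table name and rules are hypothetical assumptions, not a specific framework:

```python
# Minimal sketch of rule-based data-quality enforcement in PySpark.
# The curated table and the governance rules are illustrative only.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()
df = spark.table("silver.orders")  # hypothetical curated table

# Governance rules expressed as named SQL predicates.
rules = {
    "order_id_not_null": "order_id IS NOT NULL",
    "amount_non_negative": "amount >= 0",
    "valid_status": "status IN ('OPEN', 'SHIPPED', 'CLOSED')",
}

# Count violations per rule; any nonzero count fails the pipeline run.
violations = {
    name: df.filter(~F.expr(predicate)).count()
    for name, predicate in rules.items()
}

failed = {name: n for name, n in violations.items() if n > 0}
if failed:
    raise ValueError(f"Data-quality checks failed: {failed}")
```

Failing fast on rule violations keeps bad records from propagating into downstream layers, which is the enforcement half of a governance framework.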
Technical Expertise
- Programming: Python, SQL, R
- Big Data: Hadoop, Spark, Kafka, Hive
- Cloud Platforms: Azure (ADF, Databricks, Azure OpenAI), AWS
- Data Warehousing: Redshift, SQL Server
- ETL/ELT Tools: SSIS
Education & Experience Requirements
- Bachelor’s degree in Computer Science, Information Technology, Information Systems, Engineering, or a related field.
- 6 years of hands-on experience in data engineering or .NET development.
Salary: $100 - $110