What are the responsibilities and job description for the Principal Data Solutions Architect-Lead with Security Clearance position at ClearanceJobs?
Seeking a strong Lead Principal Data Solutions Architect. The primary focus will be on natural language processing (NLP), applying data mining techniques, performing statistical analysis, and building high-quality prediction systems.
RESPONSIBILITIES:
- Lead system architecture decisions, ensure technical alignment across teams, and advocate for best practices in cloud and data engineering.
- Serve as the lead architect, senior technical leader, and trusted advisor for enterprise-scale data solutions, driving architectural strategy, guiding development teams through complex solution design and implementation, and ensuring alignment with strategic objectives and technical standards.
- Drive system architecture design, including data modeling, integration patterns, and performance optimization for large-scale data warehouses.
- Provide expert guidance to development teams on Agile analytics methodologies and best practices for iterative delivery.
- Act as a trusted advisor and advocate for the government project lead, translating business needs into actionable technical strategies.
- Oversee technical execution across multiple teams, ensuring quality, scalability, and security compliance.
- Evaluate emerging technologies and recommend solutions that enhance system capabilities and operational efficiency.
REQUIREMENTS/SKILLS:
- Bachelor of Science degree in Computer Science, Engineering, or a related field, plus at least 10 years of experience leading architectural design of enterprise-level data platforms, with a significant focus on Databricks Lakehouse architecture.
- Experience within the Federal Government, specifically DHS, is preferred.
- Must possess demonstrable experience with the Databricks Lakehouse Platform, including Delta Lake, Unity Catalog for data governance, Delta Sharing, and Databricks SQL for analytics and BI workloads.
- Must demonstrate deep expertise in Databricks Lakehouse architecture, medallion architecture (Bronze/Silver/Gold layers), the Unity Catalog governance framework, and enterprise-level integration patterns using Databricks Workflows and Auto Loader (a minimal ingestion sketch appears after this list).
- Knowledge of and proven professional ability to organize the technical execution of Agile analytics using Databricks Repos, Jobs, and collaborative notebooks.
- Expertise in Apache Spark on Databricks, including performance optimization, cluster management, Photon engine utilization, and Delta Lake optimization techniques (Z-ordering, liquid clustering, data skipping); a maintenance and governance sketch follows this list.
- Proficiency in Databricks Unity Catalog for centralized data governance, metadata management, data lineage tracking, and access control across multi-cloud environments.
- Experience with Databricks Delta Live Tables (DLT) for declarative ETL pipeline development and data quality management (a DLT sketch follows this list).
- Certification in one or more of the following: Databricks Certified Data Engineer Associate/Professional, Databricks Certified Solutions Architect, AWS, Apache Spark, or other cloud platform certifications.
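To ground the medallion and Auto Loader requirements above, here is a minimal Bronze-to-Silver ingestion sketch. It assumes a Databricks notebook where `spark` is predefined; the paths, table names, and the `event_id` column are hypothetical placeholders, not part of the posting.

```python
# Minimal medallion (Bronze/Silver) sketch for a Databricks notebook.
# Assumes the Databricks runtime provides `spark`; all paths and table
# names below are hypothetical placeholders.
from pyspark.sql import functions as F

LANDING_PATH = "/Volumes/main/raw/raw_events/"   # hypothetical landing zone
CHECKPOINTS = "/Volumes/main/checkpoints"        # hypothetical checkpoint root

# Bronze: incrementally ingest raw JSON files with Auto Loader (cloudFiles).
bronze_stream = (
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation", f"{CHECKPOINTS}/bronze_schema")
    .load(LANDING_PATH)
    .withColumn("ingested_at", F.current_timestamp())
)

(bronze_stream.writeStream
    .format("delta")
    .option("checkpointLocation", f"{CHECKPOINTS}/bronze")
    .outputMode("append")
    .toTable("main.bronze.events"))

# Silver: cleanse and de-duplicate Bronze into an analytics-ready table.
silver_stream = (
    spark.readStream.table("main.bronze.events")
    .filter(F.col("event_id").isNotNull())
    .dropDuplicates(["event_id"])
)

(silver_stream.writeStream
    .format("delta")
    .option("checkpointLocation", f"{CHECKPOINTS}/silver")
    .outputMode("append")
    .toTable("main.silver.events"))
```

A Gold layer would typically aggregate the Silver table into business-level marts consumed by Databricks SQL and BI tools.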
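The Delta Lake optimization and Unity Catalog governance items can be sketched the same way. The table, catalog, and group names below are hypothetical, and the statements assume a Unity Catalog-enabled Databricks workspace.

```python
# Hypothetical table and principal names; run from a Databricks notebook.
# OPTIMIZE/ZORDER compacts files and co-locates rows for data skipping;
# liquid clustering is the newer alternative on recent Databricks runtimes;
# GRANT statements are Unity Catalog access controls.

# Compact small files and Z-order by common filter columns.
spark.sql("""
    OPTIMIZE main.silver.events
    ZORDER BY (customer_id, event_date)
""")

# Alternatively, declare liquid clustering on a new table.
spark.sql("""
    CREATE TABLE IF NOT EXISTS main.silver.events_clustered (
        event_id STRING,
        customer_id BIGINT,
        event_date DATE,
        amount DOUBLE
    )
    CLUSTER BY (customer_id, event_date)
""")

# Remove data files no longer referenced by the table (default retention applies).
spark.sql("VACUUM main.silver.events")

# Unity Catalog: centralized, SQL-based access control for a hypothetical group.
spark.sql("GRANT USE CATALOG ON CATALOG main TO `data_analysts`")
spark.sql("GRANT USE SCHEMA ON SCHEMA main.silver TO `data_analysts`")
spark.sql("GRANT SELECT ON TABLE main.silver.events TO `data_analysts`")
```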
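For the Delta Live Tables requirement, a declarative pipeline might look like the sketch below. It only runs when deployed as a DLT pipeline (the `dlt` module is available only there); the dataset names, source path, and expectation rule are illustrative assumptions.

```python
# Declarative ETL with Delta Live Tables; the `dlt` module and `spark`
# session are provided by the DLT runtime.
import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Raw events landed by Auto Loader (Bronze).")
def events_bronze():
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .load("/Volumes/main/raw/raw_events/")   # hypothetical path
    )

@dlt.table(comment="Validated, de-duplicated events (Silver).")
@dlt.expect_or_drop("valid_event_id", "event_id IS NOT NULL")  # data-quality rule
def events_silver():
    return (
        dlt.read_stream("events_bronze")
        .dropDuplicates(["event_id"])
        .withColumn("processed_at", F.current_timestamp())
    )
```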
DESIRED REQUIREMENTS/SKILLS:
- Expertise in ETL tools.
- Advanced knowledge of cloud platforms (AWS preferred; Azure or GCP a plus).
- Proficiency in SQL, PL/SQL, and performance tuning for large datasets (a brief query-tuning sketch follows this list).
- Understanding of security frameworks and compliance standards in federal environments.
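As a brief illustration of the SQL performance-tuning item, the sketch below inspects a query plan and filters on a clustered date column so Delta can skip files; the table and column names are hypothetical, and this is one of several reasonable tuning approaches.

```python
# Hypothetical example of tuning a large-table query in Spark SQL on Databricks.
# Filtering on the clustering column (event_date) lets Delta skip files;
# EXPLAIN shows whether the filter is pushed down before tuning further.
plan = spark.sql("""
    EXPLAIN FORMATTED
    SELECT customer_id, SUM(amount) AS total_amount
    FROM main.silver.events
    WHERE event_date >= DATE '2024-01-01'   -- prunes files via data skipping
    GROUP BY customer_id
""")
plan.show(truncate=False)
```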