What are the responsibilities and job description for the Data Architect position at V2Soft?
Hybrid Position 4 days a week onsite at Dearborn, MI
Skills Required:
Coding, Programming, Google Cloud Platform, Software Development, Data Architecture, Data/Analytics, BigQuery, Application Development, Application Architect, Data Modeling, Application Design
Experience Required:
Senior Associate Exp: 6-10 yrs in IT; 4 yrs in concentration
Education Required:
Bachelor's Degree
Additional Information:
Hybrid position, 4 days a week onsite.
1. Build and maintain data pipelines on Google Cloud Platform (GCP) using Dataflow for batch and/or streaming processing workflows.
2. Develop and maintain robust data transformation layers using Dataform and/or dbt, following best practices in modeling, testing, documentation, and deployment patterns.
3. Design end-to-end enterprise data architectures for large-scale analytics and operational use cases, ensuring scalability, reliability, and governance.
4. Translate complex business requirements into conceptual, logical, and physical data models that align with organizational goals and technical constraints.
5. Apply deep BigQuery expertise, including schema design, partitioning and clustering strategies, and continuous cost and performance optimization.
6. Write complex SQL transformations and analytics queries across large-scale datasets with a high degree of accuracy and performance awareness.
7. Leverage programming skills (Python, Java, Scala, or equivalent) to support automation, pipeline logic, orchestration, and data utility development.
8. Use enterprise data modeling tools such as SAP PowerDesigner and/or ERwin to produce well-documented, standards-compliant data models.
9. Collaborate within CI/CD and Git-based workflows, including branching strategies, peer code reviews, automated testing, and managed deployments for data and analytics engineering.
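To illustrate the kind of BigQuery schema design work described in responsibility 5, here is a minimal, hypothetical sketch in Python that assembles a CREATE TABLE statement with partitioning and clustering clauses (standard BigQuery DDL syntax). The table and column names are invented for illustration and are not part of the job description.

```python
# Illustrative sketch only: builds a BigQuery DDL string that partitions a
# table by a DATE column and clusters it by chosen columns, which limits
# the data scanned per query and so reduces cost. Names are hypothetical.

def build_partitioned_table_ddl(table, columns, partition_col, cluster_cols):
    """Return a CREATE TABLE statement with PARTITION BY and CLUSTER BY."""
    col_defs = ",\n  ".join(f"{name} {dtype}" for name, dtype in columns)
    return (
        f"CREATE TABLE {table} (\n  {col_defs}\n)\n"
        f"PARTITION BY {partition_col}\n"
        f"CLUSTER BY {', '.join(cluster_cols)}"
    )

ddl = build_partitioned_table_ddl(
    "analytics.orders",                      # hypothetical dataset.table
    [("order_id", "STRING"),
     ("customer_id", "STRING"),
     ("order_date", "DATE"),
     ("amount", "NUMERIC")],
    partition_col="order_date",              # daily partitions by date
    cluster_cols=["customer_id"],            # co-locate rows per customer
)
print(ddl)
```

In practice, a candidate would submit such DDL through the BigQuery console, `bq` CLI, or a client library, and would pair partitioning with clustering keys chosen from the most common filter columns.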