What are the responsibilities and job description for the Data Architect position at Global Payments Inc.?
Description: Summary of This Role

What Part Will You Play?
- Data Architecture & Modeling: Design and implement robust data models, leveraging dimensional modeling principles to structure data for performance, usability, and scalability in reporting and analytics.
- ETL Development: Contribute to the development of efficient ETL processes, data pipelines, and workflows that transform, integrate, and load data from multiple sources into the data lake.
- Data Lake Solutions: Help design and optimize data lake solutions that support high query performance and scale with increasing data volume and complexity.
- Data Governance & Quality: Develop and enforce data quality standards, including validation, transformation, and cleansing rules to ensure data accuracy and consistency across the organization.
- Performance Tuning: Perform detailed performance tuning of ETL processes and query optimization to improve data processing speeds and reduce latency in data access.
Minimum Qualifications:
- BS in Computer Science, Information Technology, Business/Management Information Systems, or a related field
- Typically a minimum of 2 years of professional experience in coding, designing, developing, and analyzing data. Typically has basic knowledge and use of one or more languages/technologies, including but not limited to: two or more modern programming languages used in the enterprise, experience working with various APIs and external services, and experience with both relational and NoSQL databases
- Expertise in designing star schemas, snowflake schemas, and other dimensional models per Kimball methodology.
- Proven experience with ETL frameworks, data pipeline design, and data warehousing technologies such as Snowflake, BigLake, BigQuery, or similar platforms.
- SQL Proficiency: Deep understanding of SQL, query optimization, and performance tuning.
- Programming Knowledge: Strong proficiency in languages such as Python, Java, or Scala for data processing and pipeline development.
- Experience with Big Data: Familiarity with big data technologies (e.g., Spark, Kafka, DBT, Iceberg, Parquet) is a plus.
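The dimensional-modeling expertise listed above centers on star schemas: a central fact table of measures keyed to descriptive dimension tables. A minimal sketch of that pattern, using Python's built-in sqlite3 (table and column names here are illustrative assumptions, not from the posting):

```python
import sqlite3

# Illustrative Kimball-style star schema: one fact table, two dimensions.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension tables hold descriptive attributes used for slicing and filtering.
cur.execute(
    "CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, year INTEGER, month INTEGER)"
)
cur.execute(
    "CREATE TABLE dim_merchant (merchant_key INTEGER PRIMARY KEY, name TEXT, region TEXT)"
)

# The fact table holds numeric measures plus a foreign key to each dimension.
cur.execute("""
    CREATE TABLE fact_payment (
        date_key     INTEGER REFERENCES dim_date(date_key),
        merchant_key INTEGER REFERENCES dim_merchant(merchant_key),
        amount       REAL
    )
""")

cur.execute("INSERT INTO dim_date VALUES (20240101, 2024, 1)")
cur.executemany(
    "INSERT INTO dim_merchant VALUES (?, ?, ?)",
    [(1, "Acme", "NA"), (2, "Globex", "EU")],
)
cur.executemany(
    "INSERT INTO fact_payment VALUES (?, ?, ?)",
    [(20240101, 1, 100.0), (20240101, 1, 50.0), (20240101, 2, 75.0)],
)

# A typical analytic query: join the fact to a dimension and aggregate.
rows = cur.execute("""
    SELECT m.region, SUM(f.amount)
    FROM fact_payment f
    JOIN dim_merchant m ON f.merchant_key = m.merchant_key
    GROUP BY m.region
    ORDER BY m.region
""").fetchall()
print(rows)  # [('EU', 75.0), ('NA', 150.0)]
```

A snowflake schema, also named in the qualifications, differs only in that the dimensions themselves are further normalized (e.g. `dim_merchant` referencing a separate `dim_region` table).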
Preferred Qualifications:
- BS in Computer Science, Information Technology, Business/Management Information Systems, or a related field
- 4 years of professional experience in coding, designing, developing, and analyzing data
- Familiarity with OpenTelemetry
- Basic to advanced level of AI/ML knowledge