What are the responsibilities and job description for the Senior Data Management Architect position at Jobright.ai?
Jobright is an AI-powered career platform that helps job seekers discover the top opportunities in the US. We are NOT a staffing agency. Jobright does not hire directly for these positions. We connect you with verified openings from employers you can trust.
Job Summary:
NTT DATA is a trusted global innovator of business and technology services, seeking a Data Management Architect to join its team. The role involves creating data architecture, developing data pipelines, ensuring data quality and security, and providing technical leadership in reporting infrastructure.
Responsibilities:
• Create the overall architecture for data storage, transformation, and delivery to reporting tools, including data warehouses, data lakes, and reporting databases.
• Create the logical and physical data models that support reporting requirements, ensuring data is structured in a way that is efficient for analysis and reporting.
• Design and implement the processes (ETL or ELT) that extract, transform, and load data into the reporting environment, ensuring data quality and accuracy.
• Choose appropriate reporting platforms (e.g., Tableau, Power BI, Cognos) and configure them to work with the underlying data infrastructure.
• Implement data quality checks and security measures to protect sensitive information within the reporting environment.
• Mentor other team members, including data engineers and analysts, in the development and maintenance of the reporting infrastructure.
• Monitor and optimize the performance of the reporting system, identify bottlenecks, and implement solutions to improve efficiency and scalability.
• Collaborate with business users, data scientists, and other stakeholders to understand reporting needs and ensure the reporting infrastructure meets requirements.
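The ETL and data-quality duties described above can be illustrated with a minimal sketch. This is not code from the posting; the table and column names are hypothetical, and a production pipeline would use a warehouse platform rather than an in-memory SQLite database:

```python
import csv
import io
import sqlite3

# Hypothetical raw source; in practice this would come from an upstream system.
RAW_CSV = """order_id,region,amount
1,East,120.50
2,West,
3,East,87.25
"""

def extract(text):
    """Extract: read raw records from a CSV source."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: apply a simple data quality check (drop rows with a
    missing amount) and cast fields to their proper types."""
    clean = []
    for row in rows:
        if not row["amount"]:
            continue  # quality check: reject incomplete records
        clean.append((int(row["order_id"]), row["region"], float(row["amount"])))
    return clean

def load(rows):
    """Load: write transformed rows into a reporting database."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (order_id INTEGER, region TEXT, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
    return conn

conn = load(transform(extract(RAW_CSV)))
total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
print(total)  # 207.75 — the incomplete record was filtered out
```

The same extract/transform/load separation applies whether the targets are data warehouses, data lakes, or reporting databases; only the connectors and the scale of the quality checks change.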
Qualifications:
Required:
• 8 years of experience with a strong understanding of data warehousing and data modeling
• 8 years of expertise in designing and implementing data warehouses, data marts, and other data storage solutions
• Proficiency in SQL and other data querying languages
• 5 years of experience writing complex SQL queries to extract and transform data for reporting purposes
• 5 years of experience with reporting tools and platforms, including Tableau, Power BI, or Cognos
• 5 years of experience with data integration and ETL processes, including extracting, transforming, and loading data from various sources into the reporting environment
• 5 years of experience with data governance and security principles, including implementing data quality checks and security measures to protect sensitive information
• 5 years of demonstrated communication and collaboration skills, including the ability to communicate technical concepts to both technical and non-technical audiences
• Strong problem-solving and analytical skills, including the ability to troubleshoot issues and identify areas for improvement in the reporting infrastructure
• Bachelor’s degree in Computer Science or equivalent work experience
• Strong proficiency in SQL and experience with relational databases
• Proficient in programming languages such as Python or Java or Scala
• Familiarity with big data technologies like Hadoop, Spark, or Kafka
• Experience with SQL, Python, or AWS Glue for building data pipelines
• Deep understanding of API connectivity and data streaming architecture
• Experience with CI/CD/Deployment
• Experience developing data security best practices in the cloud (AWS, Azure, or GCP)
• Experience in Snowflake or Databricks
• Experience with reporting and analytical tools such as Tableau, Power BI, Qlik, or Talend
• Experience optimizing cloud resources for cost, performance, and scalability
Preferred:
• GraphQL
Company:
NTT DATA, Inc. is a trusted global innovator of business and technology services. Founded in 1988, the company is headquartered in Plano, Texas, USA, with more than 10,000 employees. The company is currently late-stage. NTT DATA North America has a track record of offering H-1B sponsorship.