What are the responsibilities and job description for the Data Quality Analyst - Entry Level position at Jobright.ai?
Job Summary:
Vertiv is a global leader in mission-critical infrastructure technologies for vital applications in data centers and communication networks. The Data Quality Analyst will ensure the reliability, accuracy, and consistency of enterprise data by building and maintaining automated data quality checks and monitoring pipelines.
Responsibilities:
• Develop and Maintain Data Quality Frameworks: Design and implement automated data validation checks, profiling rules, and anomaly detection processes across datasets from multiple systems and sources.
• Perform Data Quality Audits and Root Cause Analysis: Proactively identify and resolve data inconsistencies, duplication, data drift, and schema issues. Perform root-cause analysis to trace data quality issues back to source systems or transformation processes.
• Enhance Data Pipelines and ETL Processes: Work closely with engineering and data teams to ensure data ingestion and transformation processes maintain integrity, completeness, and timeliness standards. Support testing and validation for pipeline changes.
• Collaborate with Stakeholders: Partner with business, IT, and analytics teams to understand critical data quality requirements and to prioritize resolution of data defects that impact key reporting and operational systems.
• Document Data Definitions and Quality Rules: Maintain metadata, data dictionaries, and technical documentation on data lineage, validation logic, and quality scoring.
• Establish Monitoring and Alerting: Implement proactive data quality monitoring dashboards, metrics, and alerting systems to surface real-time issues with production data.
• Support Continuous Improvement and Governance Initiatives: Lead or participate in data stewardship, master data governance, and quality improvement initiatives under Global Business Services programs.
• Tools & Testing: Leverage SQL, Python, or other tools for custom rule building, testing, and diagnostics. Build reusable quality test suites and participate in test automation efforts.
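To give a concrete sense of the "custom rule building" work the responsibilities above describe, here is a minimal, illustrative Python sketch of automated data quality checks (completeness, uniqueness, and range validation). The rule names, thresholds, and sample data are assumptions for demonstration only, not part of Vertiv's actual toolchain.

```python
# Minimal sketch of automated data quality checks on tabular records:
# completeness, key uniqueness, and numeric range validation.
# All rule names and thresholds are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class CheckResult:
    rule: str
    passed: bool
    detail: str

def run_quality_checks(rows, key_field, required_fields, numeric_range=None):
    """Run basic quality checks on a list of dict records."""
    results = []

    # Completeness: every required field must be present and non-empty.
    missing = [
        (i, f) for i, row in enumerate(rows)
        for f in required_fields
        if row.get(f) in (None, "")
    ]
    results.append(CheckResult(
        "completeness", not missing, f"{len(missing)} missing values"))

    # Uniqueness: the key field must not contain duplicates.
    keys = [row.get(key_field) for row in rows]
    dupes = len(keys) - len(set(keys))
    results.append(CheckResult(
        "uniqueness", dupes == 0, f"{dupes} duplicate keys"))

    # Range: optional (field, low, high) bounds check.
    if numeric_range:
        field, lo, hi = numeric_range
        out_of_range = [row for row in rows
                        if not (lo <= row.get(field, lo) <= hi)]
        results.append(CheckResult(
            "range", not out_of_range,
            f"{len(out_of_range)} out-of-range values"))

    return results

sample = [
    {"id": 1, "amount": 100},
    {"id": 2, "amount": -5},   # fails the range check
    {"id": 2, "amount": 30},   # duplicate id
]
for r in run_quality_checks(sample, "id", ["id", "amount"], ("amount", 0, 1000)):
    print(r.rule, r.passed, r.detail)
```

In practice, checks like these would run inside a scheduled pipeline, with failures feeding the monitoring dashboards and alerting mentioned above rather than being printed to the console.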
Qualifications:
Required:
• Bachelor’s degree in Information Technology, Computer Science, Mathematics, Engineering, or related field
• 1-3 years of experience in a data quality, data engineering, or related role within a multinational environment
• High proficiency in SQL (MSSQL, Oracle, Impala, etc.) and related procedural extensions
• Familiarity with ETL tools and concepts
• Knowledge of data governance, master data management (MDM), and metadata best practices
• Detail-oriented mindset with a passion for ensuring data accuracy and completeness
• Strong communication and collaboration skills with the ability to work cross-functionally
• Strong documentation and reporting skills to communicate findings and improvement opportunities
Preferred:
• Working knowledge of scripting/programming languages (Python preferred) for building custom validation rules or quality automation
Company:
Vertiv designs, builds, and services critical infrastructure that enables vital applications for data centers and industrial facilities. Founded in 2016, the company is headquartered in Westerville, Ohio, USA, and employs more than 10,000 people. Vertiv is a public company.