What are the responsibilities and job description for the Sr Data Engineer position at Range Resources?
Position Summary
The Senior Data Engineer designs, builds, and maintains the scalable cloud-based infrastructure required to collect, process, and store large volumes of data used for analytics and decision‑making. They develop and manage cloud-based data systems such as data lakes, data warehouses, and ETL/ELT pipelines, ensuring data flows reliably from diverse sources into centralized platforms. They are responsible for cleaning, transforming, and validating data to ensure high levels of quality, consistency, and security. The Senior Data Engineer collaborates closely with data scientists, analysts, and business stakeholders to understand data requirements and deliver scalable, performance‑optimized solutions. They also monitor, troubleshoot, and continuously improve data infrastructure to support current and future business needs, and may coach, review, and delegate work to lower-level professionals as needed.
Responsibilities
- Cloud Migration Strategy: Design and execute end-to-end migration strategies to move on-premises SQL Server workloads to Azure, Fabric, Databricks or Snowflake.
- Pipeline Development: Build and maintain scalable ETL/ELT pipelines using Azure Data Factory (ADF), Databricks (PySpark), and Microsoft Fabric.
- Platform Integration: Manage cross-platform data flows, ensuring seamless integration between Snowflake/Databricks/Fabric warehouses and the on-premises SQL Server.
- Architecture Design: Implement Medallion Architecture (Bronze/Silver/Gold) and Lakehouse patterns to support both batch and real-time analytics.
- SQL Server: Write and tune complex queries, perform performance tuning (indexes, query plans), implement partitioning, and support large-volume ETL workloads.
- Regular and timely attendance
- Deal professionally and respectfully with coworkers, management and others
- Read, comprehend and follow applicable policies, procedures and directions
- Work with others as part of a team to ensure efficient operations and enhanced productivity
- Safeguard confidential information and disclose only to those in "need-to-know" positions
- Safeguard and enhance Range's assets and business interests
- Consistently perform all job duties at an acceptable level
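The Medallion Architecture responsibility above (Bronze/Silver/Gold layering) can be illustrated with a minimal, self-contained sketch. In practice this work would be done in PySpark on Databricks or Fabric; here plain Python stands in for the layering logic, and all record fields (`well_id`, `date`, `mcf`) and helper names are illustrative assumptions, not part of the posting.

```python
# Conceptual sketch of Medallion Architecture (Bronze -> Silver -> Gold).
# Plain Python stands in for PySpark; the field names and helpers are
# illustrative assumptions only.
from collections import defaultdict

# Bronze layer: raw records landed as-is, duplicates and bad values included.
bronze = [
    {"well_id": "W-101", "date": "2024-01-01", "mcf": "1250"},
    {"well_id": "W-101", "date": "2024-01-01", "mcf": "1250"},  # duplicate
    {"well_id": "W-102", "date": "2024-01-01", "mcf": "n/a"},   # malformed value
    {"well_id": "W-102", "date": "2024-01-02", "mcf": "980"},
]

def to_silver(rows):
    """Silver layer: deduplicate, validate, and cast types."""
    seen, silver = set(), []
    for row in rows:
        key = (row["well_id"], row["date"])
        if key in seen:
            continue  # drop exact-key duplicates
        try:
            mcf = float(row["mcf"])
        except ValueError:
            continue  # a real pipeline would quarantine malformed rows
        seen.add(key)
        silver.append({"well_id": row["well_id"], "date": row["date"], "mcf": mcf})
    return silver

def to_gold(rows):
    """Gold layer: business-level aggregate (total production per well)."""
    totals = defaultdict(float)
    for row in rows:
        totals[row["well_id"]] += row["mcf"]
    return dict(totals)

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'W-101': 1250.0, 'W-102': 980.0}
```

The same progressive-refinement pattern applies whether the layers live in Delta tables on Databricks, Fabric Lakehouse tables, or Snowflake schemas: raw ingestion stays untouched in Bronze, quality rules are applied once in Silver, and Gold serves analytics-ready aggregates.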