What are the responsibilities and job description for the Integration Developer position at Permian Resources?
Job Description
Permian Resources (NYSE: PR) is seeking an Integration Developer reporting to the Enterprise Data Senior Manager in Midland, Texas. This role sits at the intersection of data engineering and systems integration — responsible for building and maintaining the technical pipelines, APIs, and automated workflows that power our Databricks data platform and connect our enterprise systems.
The ideal candidate brings strong technical skills in data pipeline development and software engineering, a genuine passion for building solutions that streamline business processes, and the ability to work effectively in a fast-paced, collaborative oil and gas environment. We are open to mid-level candidates who demonstrate strong fundamentals and clear potential, as well as senior candidates ready to hit the ground running.
General Responsibilities
- Data Pipeline Development: Design, build, optimize, and maintain ETL/ELT pipelines that move and transform large volumes of data from source systems into our Databricks analytics environment.
- Systems Integration: Build and maintain APIs and integration layers (REST, SOAP) that enable seamless data flow between enterprise applications, working across a range of data formats including JSON, CSV, and XML.
- Platform Monitoring & Support: Monitor and support development and production environments within Databricks to ensure system availability, reliability, and data quality.
- Data Modeling: Develop data models and pipelines that support reporting, dashboards, and analytical use cases across multiple business teams.
- System Optimization: Analyze existing integrations and pipelines to identify performance bottlenecks; implement best practices in code quality, data modeling, and architecture.
- Cross-Functional Collaboration: Partner closely with business subject matter experts, data scientists, and IT members to deliver reliable, clean data that meets operational needs.
- Documentation & Standards: Create and maintain clear technical documentation; adhere to coding standards and governance practices that support long-term maintainability and scalability.
- Version Control: Use GitHub or similar version control platforms for collaborative development, code reviews, and repository management.
- Mentorship (Senior Candidates): Provide guidance to less experienced team members and contribute to a culture of knowledge-sharing across the team.
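As a flavor of the format-normalization work described above, the sketch below parses the same hypothetical well records from JSON, CSV, and XML into a uniform list of dicts ready to load into a staging table. The payloads, field names, and `normalize_*` helpers are illustrative assumptions, not Permian Resources systems or schemas.

```python
import csv
import io
import json
import xml.etree.ElementTree as ET

def normalize_json(text):
    """Parse a JSON array of records into a list of dicts."""
    return json.loads(text)

def normalize_csv(text):
    """Parse CSV text (header row first) into a list of dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def normalize_xml(text):
    """Parse XML <well> elements into a list of dicts."""
    root = ET.fromstring(text)
    return [{field.tag: field.text for field in well} for well in root.findall("well")]

# Hypothetical sample payloads in the three formats the posting mentions.
json_src = '[{"well_id": "W-001", "status": "active"}]'
csv_src = "well_id,status\nW-002,shut-in\n"
xml_src = "<wells><well><well_id>W-003</well_id><status>active</status></well></wells>"

records = normalize_json(json_src) + normalize_csv(csv_src) + normalize_xml(xml_src)
```

In practice the uniform `records` list would be written to a bronze/staging table in the analytics platform; the point is that one downstream schema absorbs all three source formats.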
Qualifications
- 5+ years of relevant experience in data engineering, software development, or systems integration; mid-level candidates with strong fundamentals and demonstrated potential are encouraged to apply.
- 3 years of experience working in the oil and gas industry is required.
- Bachelor's degree (BS/BA) in Computer Science, Engineering, Business Analytics, Statistics, or a related field; equivalent practical experience will be considered.
- Proficiency in Python and/or other modern languages (Java, C#, R); solid understanding of object-oriented design principles.
- Hands-on experience with Databricks or a comparable cloud-based data platform (e.g., Snowflake); strong working knowledge of SQL.
- Experience designing and maintaining REST and/or SOAP APIs; familiarity with authentication and security best practices.
- Experience with ETL/ELT frameworks and tools (e.g., dbt); comfortable working with JSON, CSV, and XML formats.
- Strong verbal and written communication skills; ability to translate technical concepts for non-technical stakeholders across all levels of the organization.
- Capable of working independently and integrating effectively within multi-disciplinary teams in a fast-paced environment.
- Experience working with or exploring AI/ML tools, large language models, or AI-assisted development workflows is a plus.
- Significant exposure to upstream oil and gas datasets, systems, and operational workflows is strongly preferred.