What are the responsibilities and job description for the SRE-Data/ETL Engineer position at Jobs via Dice?
Location: The Colony, TX
Description:
Job Title: SRE-Data/ETL Engineer
Location: Plano, TX
Duration: 12 Months
Position Summary:
The role sits within the SRE team and focuses on data services.
- Responsibilities include managing data pipelines, system integration, and data warehousing.
- Must have hands-on experience with ETL tools and SQL queries; Snowflake is a plus.
- Must have experience with a scripting language (Python or Shell).
- Familiarity with automation and proactive incident prevention in data engineering is preferred.
- Experience with observability tools is a nice-to-have.
Requirements and Qualifications:
- The Client Operations Center is looking for a passionate and highly motivated Site Reliability Engineer (SRE) - Data Services.
- The SRE - Data Services reports to the Manager of the SRE Department. In this role, you will apply software engineering principles to ensure the availability, performance, and stability of TFS's enterprise data systems, including Autosys, Informatica/ETL functions, and Airflow.
- Solid understanding of SRE concepts: SLIs, SLOs, error budgets, incident response.
- 5 years of experience in data engineering or IT operations with a focus on data transfer and orchestration.
- Experience with AWS cloud platform and data services.
- Solid understanding of data warehousing concepts and data modeling.
- Excellent problem-solving, communication, and documentation skills.
- Strong understanding of SRE principles (SLIs/SLOs, error budgets, observability, toil reduction).
- 5 years of overall experience in data engineering.
- Bachelor's degree in information technology or a related field.
- Autosys for job scheduling and monitoring.
- Proficiency in scripting languages (Python or Shell).
- Must have strong experience with SQL queries.
- ETL tools (e.g., Informatica, DataStage, Talend).
- Experience with observability tools (Datadog, Dynatrace, etc. preferred).
- ITIL Foundation framework.
- Informatica and AWS Data Certifications.
- Familiarity with DevOps practices and tools like Jenkins, Docker, and Kubernetes.
- Apache Airflow for workflow orchestration.
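As a concrete illustration of the SRE concepts the role calls for (SLIs, SLOs, error budgets) combined with the required Python scripting skill, here is a minimal hypothetical sketch of an error-budget calculation; the function names and SLO figures are illustrative, not part of the posting.

```python
# Hypothetical sketch of SLO error-budget math, one of the SRE concepts
# listed above. All names and numbers here are illustrative.

def error_budget_minutes(slo_target: float, window_days: int = 30) -> float:
    """Downtime allowance (in minutes) for an availability SLO over a window.

    E.g., a 99.9% SLO over 30 days leaves 0.1% of 43,200 minutes as budget.
    """
    total_minutes = window_days * 24 * 60
    return total_minutes * (1.0 - slo_target)


def budget_remaining(slo_target: float, downtime_minutes: float,
                     window_days: int = 30) -> float:
    """Fraction of the error budget still unspent (negative when blown)."""
    budget = error_budget_minutes(slo_target, window_days)
    return 1.0 - downtime_minutes / budget


# A 99.9% SLO over 30 days allows 43.2 minutes of downtime.
print(error_budget_minutes(0.999))        # 43.2
print(budget_remaining(0.999, 21.6))      # half the budget spent -> 0.5
```

In practice an SRE team would feed measured downtime (from an observability tool such as Datadog or Dynatrace) into a calculation like this to decide whether a pipeline change can ship or must wait for the budget to recover.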
Contact:
This job and many more are available through The Judge Group. Please apply with us today!