What are the responsibilities and job description for the SRE - Data / ETL Engineer position at B12 Consulting?
Job Details
- The role sits within the SRE team and focuses on data services.
- Responsibilities include managing data pipelines, system integration, and data warehousing.
- Must have hands-on experience with ETL tools and SQL queries; Snowflake is a plus.
- Must have experience with a scripting language (Python or Shell).
- Familiarity with automation and proactive incident prevention in data engineering is preferred.
- 100% Onsite role in Plano, TX.
- Experience with observability tools is nice to have.
- The client is looking for a passionate and highly motivated Site Reliability Engineer (SRE) - Data Services.
- The SRE Data Services reports to the Manager of the SRE Department.
- In this role, you will apply software engineering principles to ensure the availability, performance, and stability of TFS enterprise data systems, including Autosys, Informatica/ETL functions, and Airflow.
What you'll be doing:
- Apply SRE concepts in practice: SLIs, SLOs, error budgets, and incident response.
- 5 years of experience in data engineering or IT operations with a focus on data transfer and orchestration.
- Experience with the AWS cloud platform and its data services.
- Solid understanding of data warehousing concepts and data modeling.
- Excellent problem-solving, communication, and documentation skills.
- Strong understanding of SRE principles (SLIs/SLOs, error budgets, observability, toil reduction).
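Since the posting repeatedly emphasizes SLOs and error budgets, a minimal sketch of the calculation may help candidates gauge the concept. The 99.9% availability target and 30-day window below are hypothetical examples, not figures from the posting:

```python
# Minimal sketch of an SLO / error-budget calculation (hypothetical numbers).

def error_budget_minutes(slo_target: float, window_minutes: int) -> float:
    """Allowed downtime within the window for a given availability SLO."""
    return (1.0 - slo_target) * window_minutes

def budget_remaining(slo_target: float, window_minutes: int,
                     downtime_minutes: float) -> float:
    """Fraction of the error budget still unspent (negative if overspent)."""
    budget = error_budget_minutes(slo_target, window_minutes)
    return (budget - downtime_minutes) / budget

# Example: a 99.9% availability SLO over a 30-day window.
window = 30 * 24 * 60                           # 43,200 minutes
budget = error_budget_minutes(0.999, window)    # roughly 43.2 minutes of allowed downtime
remaining = budget_remaining(0.999, window, downtime_minutes=10.0)
```

Teams typically gate risky changes (deployments, migrations) on how much of this budget remains for the period.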
Qualifications / What you bring (must haves):
- 5 years of overall experience in data engineering.
- Bachelor's degree in information technology or a related field.
- Experience with Autosys for job scheduling and monitoring.
- Proficiency in scripting languages (Python or Shell).
- Strong experience with SQL queries and ETL tools (e.g., Informatica, DataStage, Talend).
- Experience with observability tools (Datadog, Dynatrace, etc. preferred).
- Added bonus if you have (preferred): ITIL Foundation framework; Informatica and AWS data certifications.
- Familiarity with DevOps practices and tools such as Jenkins, Docker, and Kubernetes.
- Experience with Apache Airflow for workflow orchestration.