What are the responsibilities and job description for the Data Engineer position at Shaw Industries?
Company Overview
Shaw Industries Group, Inc. is a leader in flooring and other surface solutions designed for residential housing, commercial spaces and outdoor environments. Leveraging strengths in design, innovation and operational excellence, the company takes a people-centered, customer-focused, and growth-minded approach to meet diverse market needs. It creates differentiated customer experiences through its expansive portfolio of brands: Anderson Tuftex, COREtec, Shaw Floors, Patcraft, Philadelphia Commercial, Shaw Contract, Shaw Sports Turf, Shawgrass, Southwest Greens, Watershed Geo and more. Headquartered in Dalton, Georgia, Shaw is a wholly owned subsidiary of Berkshire Hathaway, Inc. with more than $6 billion in annual sales and 18,000 associates worldwide.
Job Summary
We are looking for a data engineer to join our enterprise analytics team! This individual will partner with data scientists, analysts and product owners to support data science projects and initiatives across the enterprise – including in our manufacturing, planning/forecasting and customer service areas.
Applicants must be experienced data pipeline builders who enjoy building solutions from the ground up. We are looking for someone who is self-directed and comfortable supporting the data needs of multiple teams and projects.
Responsibilities:
- Partner with data scientists, data analysts, data stewards and product owners to understand data needs and develop solutions
- Design, build, launch and maintain efficient and reliable data pipelines to support data analysis and machine learning models
- Define and manage standards, guidelines and processes to ensure data quality
- Assemble large, complex datasets based on business requirements
- Identify and deploy various methods and techniques for optimal extraction, transformation and loading of data from a wide variety of data sources
- Help build and maintain best practices for our data engineering strategy – identifying and addressing gaps in existing processes and solutions
Minimum Qualifications:
- Bachelor’s degree in a quantitative field (e.g., Computer Science, Engineering) or a related field, or equivalent work experience, is required.
- 4–6 years of work experience in data management disciplines, including data integration, modeling, optimization and data quality, and/or other areas directly relevant to data engineering responsibilities and tasks.
- Expert SQL programming skills
- Expertise in at least one relevant object-oriented programming language (e.g., Python, Scala)
- Experience building and optimizing data pipelines, architectures and data sets
- Experience with big data tools (Spark, Kafka, etc.)
- Experience in data warehousing and data modeling
- Good knowledge of data management, data integration, and database development techniques with a good understanding of data architecture principles
- Strong analytical and problem-solving skills
- Ability to communicate across all levels of the organization and work with diverse project teams
- Experience using cloud-based infrastructure such as Azure and AWS
- Experience in DevOps best practices including CI/CD, process automation and optimization
Preferred Qualifications:
- Experience with designing and implementing real-time pipelines
- Experience with data quality and validation
- Experience working in an Agile environment
Job Type: Full-time
Pay: $99,000.00 - $123,000.00 per year
Benefits:
- 401(k)
- Dental insurance
- Health insurance
- Paid time off
Application Question(s):
- Please confirm that you have 4–6 years of professional, post-academic experience in data management disciplines, including data integration, data modeling, data optimization, and data quality.
- Do you currently, or will you potentially in the future, require the company, Shaw Industries Group, to sponsor an employment-based visa (e.g., H-1B, employment-based green card, etc.) to employ you?
Note: If you are currently on OPT, STEM OPT, EAD OPT, TN visa, F-1 visa, or a similar/another visa status, please answer 'Yes'.
- Are you in or around the following areas: GA, TN, or AL?
- Please elaborate on your experience related to Snowflake, Data Platform, Data Lakehouse, PySpark, Spark SQL, MLflow, Delta Lake, Unity Catalog, Medallion Architecture, and/or Kafka.
Work Location: Hybrid remote in Dalton, GA 30721