What are the responsibilities and job description for the Data Engineer with SAP BW position at CX Data Labs?
Title: Data Engineer with SAP BW
Location: McKinney, TX, Onsite
Type: Full Time
About CX Data Labs:
At CX Data Labs, we believe that systematically understanding and adapting to changing customer preferences enables companies to sustainably grow their business. We help our clients realize this vision through our advisory and implementation services in Digital, Data, and Analytics, bridging the gap between Strategy and Execution. Our secret sauce is our team of highly skilled and knowledgeable "Craftsmen," who are constantly expanding their expertise by participating in industry-shaping initiatives. With our systematic problem-solving approaches and deep technology, data, and analytics expertise, we're leading the charge in driving business success through the refinement of customer-centric processes.
Required Skills and Qualifications:
• Educational Background:
o Bachelor’s or Master’s degree in Computer Science, Information Technology, Data Science, or a related field.
o Certifications in Databricks, Azure, or related technologies are a plus.
• Technical Skills:
o Proficiency in SQL for complex queries, database design, and optimization.
o Strong experience with PySpark for data transformation and processing.
o Hands-on experience with Databricks for building and managing big data solutions.
o Familiarity with cloud platforms like AWS, Azure, or Google Cloud.
o Knowledge of data warehousing concepts and tools (e.g., Snowflake, Redshift).
o Experience with data versioning and orchestration tools like Git, Airflow, or Dagster.
o Solid understanding of Big Data ecosystems (Hadoop, Hive, etc.).
Preferred Qualifications:
• 8 years of relevant work experience in data engineering or an equivalent software engineering role.
• 3 years of experience implementing big data processing technologies such as AWS, Azure, or GCP, Apache Spark, and Python.
• Experience writing and optimizing SQL queries in a business environment with large-scale, complex datasets.