What are the responsibilities and job description for the Software Engineer (ETL) position at Radial Inc.?
Radial, Inc. seeks a Software Engineer (ETL) in King of Prussia, PA.
Job Duties:
- Perform ETL activities in Manhattan, loading data into a data warehouse or business intelligence tools for analysis, helping companies make data-driven decisions on inventory, marketing, and customer segmentation.
- Ensure robust monitoring is in place to track the performance of ETL pipelines, especially under the load of Black Friday/Cyber Monday traffic.
- Architect, implement, and optimize advanced machine learning and deep learning models for large-scale structured and unstructured data.
- Develop and maintain automated ML pipelines, integrating model training, validation, and deployment using tools such as Airflow.
- Collaborate with data engineers to preprocess massive datasets using distributed processing tools such as Spark or Hadoop.
- Apply e-commerce domain knowledge in areas such as marketing analytics, supply chain optimization, customer segmentation, fraud detection, and financial forecasting.
- Lead the full lifecycle of data science projects, from data acquisition and cleaning, through exploratory analysis and modeling, to delivery of actionable insights.
- Gather customer requirements and carry them through analysis, design, development, and implementation into the system.
- Apply experience with the Snowflake data warehouse, including a deep understanding of Snowflake architecture and processing.
- Automate, configure, and deploy instances on AWS.
- Write complex SQL scripts using statistical aggregate functions and analytical functions.
- Design and develop end-to-end data flows from source databases to Hive tables, including denormalization on the Hadoop platform.
- Collaborate with data teams to support ML/AI workloads using custom frameworks on EC2/Batch.
- Work with Snowflake utilities such as SnowSQL, Snowpipe, Tasks, Streams, Time Travel, the query optimizer, metadata management, data sharing, and stored procedures.
- Lead migration efforts from legacy data platforms (e.g., Oracle, Netezza, SQL Server) to Snowflake, including schema conversion and data validation.
Telecommuting permitted.
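To illustrate the "complex SQL scripts using statistical aggregate and analytical functions" duty, here is a minimal sketch using SQLite's window functions as a stand-in for Snowflake SQL (the table and data are hypothetical; Snowflake's analytical-function syntax is largely the same):

```python
# Minimal sketch: an analytical (window) function computing a running
# total per customer, as one might in a Snowflake reporting script.
# Uses SQLite (stdlib) purely for illustration; requires SQLite >= 3.25.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer TEXT, order_date TEXT, amount REAL);
    INSERT INTO orders VALUES
        ('alice', '2024-01-01', 120.0),
        ('alice', '2024-01-15',  80.0),
        ('bob',   '2024-01-03', 200.0);
""")

rows = conn.execute("""
    SELECT customer,
           order_date,
           amount,
           SUM(amount) OVER (
               PARTITION BY customer   -- restart the total per customer
               ORDER BY order_date     -- accumulate in date order
           ) AS running_total
    FROM orders
    ORDER BY customer, order_date
""").fetchall()

for row in rows:
    print(row)
```

The same `SUM(...) OVER (PARTITION BY ... ORDER BY ...)` pattern underpins common e-commerce analytics such as cumulative spend per customer or rolling sales by day.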
Minimum Requirements: Bachelor’s degree, or foreign equivalent, in Computer Science, Engineering (any), Computer Information Systems, or a closely related quantitative discipline, and six (6) years of experience in the job offered or six (6) years of progressively responsible experience in the field of software engineering or a related occupation.
Special Skill Requirements:
- Manhattan Active
- Manhattan E-commerce
- Machine Learning
- Data Science
- AWS
- Airflow
- Snowflake
- Hadoop, Hive, and Sqoop
- SQL/NoSQL databases, including Oracle, Netezza, Teradata, and HBase
- Python
- AWS Glue
- IBM InfoSphere DataStage (ETL)
- Autosys and Control-M
- UNIX
Apply through the careers website: https://radial.my.salesforce-sites.com/careers with reference to Job#: JO-2603-11234.