What are the responsibilities and job description for the Sr. Data Engineer with AB Initio and Python position at Accord Technologies Inc?
Title: Sr. Data Engineer with AB Initio and Python
Location: Charlotte NC
Duration: 12 months
Position type: W2 contract.
Position Details
We are seeking a highly skilled Data Engineer with extensive experience in AB Initio and Python to join our dynamic data team. The ideal candidate will be responsible for designing, developing, and maintaining scalable data pipelines and architectures to support our data analytics and business intelligence initiatives.
Key Responsibilities
- Design, develop, and optimize data pipelines and workflows using AB Initio to extract, transform, and load (ETL) data from diverse sources.
- Develop robust automation scripts and data processing solutions leveraging Python.
- Collaborate with data analysts and stakeholders to understand data requirements and translate them into efficient data architectures.
- Ensure data quality, integrity, and security across all data processes.
- Monitor and troubleshoot data workflows, implementing improvements for efficiency and performance.
- Maintain detailed documentation of data pipelines, processes, and architecture.
- Stay current with industry best practices and emerging technologies related to data engineering.
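To make the core responsibility concrete, the extract-transform-load work described above can be sketched in Python. This is a minimal illustration under assumed data, not code from any Accord Technologies system; the record fields (`id`, `amount`) and sample rows are hypothetical.

```python
def extract(records):
    """Simulate pulling raw rows from a source system."""
    return list(records)

def transform(rows):
    """Enforce basic data quality: drop rows missing an id, normalize amounts."""
    cleaned = []
    for row in rows:
        if row.get("id") is None:
            continue  # skip incomplete records
        cleaned.append({"id": row["id"], "amount": round(float(row["amount"]), 2)})
    return cleaned

def load(rows):
    """Index cleaned rows by id, as a warehouse table might."""
    return {row["id"]: row["amount"] for row in rows}

raw = [
    {"id": 1, "amount": "10.25"},
    {"id": None, "amount": "3.50"},  # incomplete record, filtered out
    {"id": 2, "amount": "7.1"},
]
table = load(transform(extract(raw)))
```

In production this pattern would typically run as AB Initio graphs or Airflow-orchestrated tasks rather than plain functions, but the extract/transform/load separation is the same.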
Required Qualifications
- Proven experience as a Data Engineer with strong expertise in AB Initio.
- Proficiency in Python programming, including libraries such as pandas and NumPy, and experience with Apache Airflow or similar workflow management tools.
- Good understanding of data warehousing concepts and experience with data modeling.
- Experience working with large-scale datasets and distributed computing frameworks.
- Familiarity with SQL and NoSQL databases.
- Strong problem-solving skills and attention to detail.
- Excellent communication and teamwork abilities.
- Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field.
Preferred Qualifications
- Experience with cloud platforms (AWS, Azure, or GCP).
- Knowledge of other ETL tools and scripting languages.
- Certifications related to data engineering or cloud technologies.