What are the responsibilities and job description for the Azure/ETL (Python, PySpark) Developer position at Net2Source (N2S)?
Role: Azure/ETL (Python, PySpark) Developer
Location: Dallas (hybrid role); in-person client interview if shortlisted
Term: Contract
What you'll do:
- Work closely with source data application teams and product owners to design, implement, and support analytics solutions that provide insights for better decision-making
- Implement data migration and data engineering solutions using Azure products and services (Azure Data Lake Storage, Azure Data Factory, Azure Functions, Event Hub, Azure Stream Analytics, Azure Databricks, etc.) as well as traditional data warehouse tools.
- Perform multiple aspects of the development lifecycle: design, cloud engineering (infrastructure, network, security, and administration), ingestion, preparation, data modeling, testing, CI/CD pipelines, performance tuning, deployments, consumption, BI, alerting, and production support.
- Provide technical leadership and collaborate within a team environment as well as work independently.
- Be part of a DevOps team that fully owns and supports its product
- Implement batch and streaming data pipelines using cloud technologies
- Senior-level experience with Databricks and other analytics tools for moving and transforming data from one platform to another.
- Senior-level experience with Azure and its services.
- Senior-level experience in leading others.
- Senior-level experience in recognizing problems and formulating solutions.
- Ability to work a shift that coincides with the CST time zone.
- Ability to work with a geographically dispersed team.
- Ability to work with Agile methodologies.
- Good written and oral communication skills.
All you'll need for success
Minimum Qualifications: Education & Prior Job Experience
- Bachelor's degree in Computer Science, Computer Engineering, Technology, Information Systems (CIS/MIS), Engineering or related technical discipline, or equivalent experience/training
- 3 years of software solution development using agile and DevOps, operating in a product model that includes designing, developing, and implementing large-scale applications or data engineering solutions
- 3 years of data analytics experience using SQL
- 2 years of cloud development and data lake experience (Microsoft Azure preferred), including Azure Event Hub, Azure Data Factory, Azure Databricks, Azure DevOps, Azure Blob Storage, Azure Data Lake, Power Apps, and Power BI
A combination of development, administration, and support experience in several of the following tools/platforms is required:
- Scripting: Python, PySpark, Unix, SQL
- Data Platforms: Teradata, SQL Server, ADLS, Azure Data Explorer (administration skills a plus)
- Azure Cloud Technologies: Azure Data Factory, Azure Databricks, Azure Blob Storage, Azure Data Lake, Power Apps, and Azure Functions
- CI/CD: GitHub, Azure DevOps, Terraform
- BI/Analytics Tool Stack: Cognos, Power BI
Preferred Qualifications: Education & Prior Job Experience
- 5 years of software solution development using agile and DevOps, operating in a product model that includes designing, developing, and implementing large-scale applications or data engineering solutions
- 3 years of data analytics experience using SQL
- 3 years of full-stack development experience, preferably in Azure
- 3 years of cloud development and data lake experience (Microsoft Azure preferred), including Azure Event Hub, Azure Data Factory, Azure Functions, Azure Data Explorer (ADX), Azure Stream Analytics (ASA), Azure Databricks, Azure DevOps, Azure Blob Storage, Azure Data Lake, Power Apps, and Power BI
- Airline Industry Experience
Skills, Licenses & Certifications
- Expertise with the Azure technology stack for data management: data ingestion, capture, processing, curation, and creating consumption layers
- Azure Development Track Certification (preferred)
- Spark Certification (preferred)
Salary: $50 - $52