Job Description
ROLE: SR. DATA ENGINEER / SOLUTION ARCHITECT
LOCATION: MEXICO
Lead data engineering initiatives leveraging cloud services to automate data ingestion pipelines (batch and streaming) along with automated testing, deployments, and monitoring
Architect and design solutions for data engineering initiatives
Serve as a subject matter expert for cloud data platforms and data services
Work with the DataOps team for code migration and continuous improvements
Serve as delivery lead for a team of onsite/offshore data engineers and data analysts
Provide data engineering and data analysis services to business functions to solve problems through the right data and insights
Help educate and train the junior data team to grow in their career
Apply a product mindset and product design thinking to build data as products
Partner with the business to define a value-generation roadmap and deliver MVPs of data products across data domains
Build product sense for data products while measuring user adoption, growth, retention, and data product quality
Play a key role in building out our enterprise cloud data platform through expert knowledge of cloud service offerings that fit into the Company's Integrated Data Fabric roadmap
Requirements:
Key Skills, Abilities, And Experience Required
8 years of data and technology experience, with at least 4 years in solution architecture and data engineering
4 years of experience with Big Data architectures, Business Intelligence, Data Engineering, Data Warehousing (Kimball/Inmon/RDBMS), Wide Tables, Data Modeling, and Report/Dashboard Building
Experience working with a modern data tech stack such as Azure Data Factory, Databricks Delta Lakehouse, Azure Synapse Analytics, Google Cloud Dataflow, Dataprep, Python-based integrations, Integration Runtimes, MuleSoft Integration Platform, API-led integrations, etc.
Extensive hands-on experience working with large data sets, designing and building robust data analytics solutions using Spark, the Delta Lake framework, Azure/GCP big data services, Databricks, and other tools and technologies
Strong analytical and problem-solving skills with a proven track record in gathering and documenting comprehensive business requirements
Extensive experience in data modeling and database design involving any combination of data warehousing and business intelligence systems and tools
Expert knowledge of SQL, Python, PySpark, Spark SQL, and Databricks development frameworks
Expert-level experience writing complex queries, data aggregations, and transformations in ANSI SQL, T-SQL, PL/SQL, or Hive SQL
Experience with API-led data integrations
Experience delivering analytics using Power BI and Tableau
Additional preferred qualifications
Job Type: Full Time
Industry: IT Outsourcing & Consulting
Salary: $84k-108k (estimate)
Posted: 03/20/2023
Closing Date: 05/31/2023
Website: in.dynpro.com
Headquarters: Raleigh, NC
Company Size: <25