
Data Platform Engineer

Jobs via Dice
Plano, TX Contractor
POSTED ON 11/25/2025 CLOSED ON 12/19/2025

What are the responsibilities and job description for the Data Platform Engineer position at Jobs via Dice?

Dice is the leading career destination for tech experts at every stage of their careers. Our client, Generis TEK Inc., is seeking the following. Apply via Dice today!

We have a contract Data Platform Engineer (hybrid) role with our client in Plano, TX. Please let us know if you or anyone you know would be interested in this position.

Position Details: Data Platform Engineer (Plano, TX, Hybrid)
Location: Plano, TX 75074 (Hybrid)
Project Duration: 3 months (Contract to Hire)

Pay rate: $56/hr. on W2

Job Description:

  • As a Data Platform Engineer, you will be responsible for the design, development, and maintenance of our high-scale, cloud-based data platform, treating data as a strategic product. You will lead the implementation of robust, optimized data pipelines using PySpark and the Databricks Unified Analytics Platform, leveraging its full ecosystem for data engineering, data science, and ML workflows. You will also establish best-in-class DevOps practices using CI/CD and GitHub Actions to ensure automated deployment and reliability. This role demands expertise in large-scale data processing and a commitment to modern, scalable data engineering and AWS cloud infrastructure practices.

Key Responsibilities:

  • Platform Development: Design, build, and maintain scalable, efficient, and reliable ETL/ELT data pipelines to support data ingestion, transformation, and integration across diverse sources.
  • Big Data Implementation: Serve as the subject matter expert for the Databricks environment, developing high-performance data transformation logic primarily using PySpark and Python. This includes utilizing Delta Live Tables (DLT) for declarative pipeline construction and ensuring governance through Unity Catalog.
  • Cloud Infrastructure Management: Configure, maintain, and secure the underlying AWS cloud infrastructure required to run the Databricks platform, including virtual private clouds (VPCs), network endpoints, storage (S3), and cross-account access mechanisms.
  • DevOps & Automation (CI/CD): Own and enforce Continuous Integration/Continuous Deployment (CI/CD) practices for the data platform. Specifically, design and implement automated deployment workflows using GitHub Actions and modern infrastructure-as-code concepts to deploy Databricks assets (Notebooks, Jobs, DLT Pipelines, and Repos).
  • Data Quality & Testing: Design and implement automated unit, integration, and performance testing frameworks to ensure data quality, reliability, and compliance with architectural standards.
  • Performance Optimization: Optimize data workflows and cluster configurations for performance, cost efficiency, and scalability across massive datasets.
  • Technical Leadership: Provide technical guidance on data principles, patterns, and best practices (e.g., Medallion Architecture, ACID compliance) to grow team capabilities and maturity. This includes leveraging Databricks SQL for high-performance analytics.
  • Documentation & Review: Draft and review architectural diagrams, design documents, and interface specifications to ensure clear communication of data solutions and technical requirements.
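To make the layered pipeline pattern above concrete: the Medallion Architecture organizes data into bronze (raw), silver (cleaned), and gold (business-level) layers. The sketch below is a hypothetical, dependency-free illustration in plain Python; in practice these stages would be PySpark DataFrame transformations managed by Delta Live Tables, and all function names and sample records here are invented for illustration.

```python
# Illustrative Medallion Architecture sketch (bronze -> silver -> gold).
# Plain Python stands in for PySpark/DLT so the example is self-contained.

def bronze_ingest(raw_records):
    """Bronze layer: land raw data as-is, tagging each record with its source."""
    return [dict(rec, _source="orders_api") for rec in raw_records]

def silver_clean(bronze_records):
    """Silver layer: validate and standardize; reject records missing an order id."""
    cleaned = []
    for rec in bronze_records:
        if rec.get("order_id") is None:
            continue  # data-quality rule: drop incomplete records
        cleaned.append({
            "order_id": rec["order_id"],
            "amount": float(rec.get("amount", 0)),
            "region": str(rec.get("region", "unknown")).lower(),
        })
    return cleaned

def gold_aggregate(silver_records):
    """Gold layer: business-ready aggregate -- total amount per region."""
    totals = {}
    for rec in silver_records:
        totals[rec["region"]] = totals.get(rec["region"], 0.0) + rec["amount"]
    return totals

raw = [
    {"order_id": 1, "amount": "20.00", "region": "TX"},
    {"order_id": None, "amount": "5.00", "region": "TX"},  # rejected in silver
    {"order_id": 2, "amount": "30.00", "region": "tx"},
]
print(gold_aggregate(silver_clean(bronze_ingest(raw))))  # {'tx': 50.0}
```

Each layer only depends on the one before it, which is what makes the declarative DLT style (each table defined as a function of upstream tables) a natural fit for this pattern.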

Required Qualifications:

  • Experience: 5 years of professional experience in Data Engineering, focusing on building scalable data platforms and production pipelines.
  • Big Data Expertise: Minimum 3 years of hands-on experience developing, deploying, and optimizing solutions within the Databricks ecosystem. Deep expertise required in:
  • Delta Lake (ACID transactions, time travel, optimization).
  • Unity Catalog (data governance, access control, metadata management).
  • Delta Live Tables (DLT) (declarative pipeline development).
  • Databricks Workspaces, Repos, and Jobs.
  • Databricks SQL for analytics and warehouse operations.
  • AWS Infrastructure & Security: Proven, hands-on experience (3 years) with core AWS services and infrastructure components, including:
  • Networking: Configuring and securing VPCs, VPC Endpoints, Subnets, and Route Tables for private connectivity.
  • Security & Access: Defining and managing IAM Roles and Policies for secure cross-account access and least privilege access to data.
  • Storage: Deep knowledge of Amazon S3 for data lake implementation and governance.
  • Programming: Expert proficiency (4 years) in Python for data manipulation, scripting, and pipeline development.
  • Spark & SQL: Deep understanding of distributed computing and extensive experience (3 years) with PySpark and advanced SQL for complex data transformation and querying.
  • DevOps & CI/CD: Proven experience (2 years) designing and implementing CI/CD pipelines, including proficiency with GitHub Actions or similar tools (e.g., GitLab CI, Jenkins) for automated testing and deployment.
  • Data Concepts: Full understanding of ETL/ELT, Data Warehousing, and Data Lake concepts.
  • Methodology: Strong grasp of Agile principles (Scrum).
  • Version Control: Proficiency with Git for version control.

Preferred Qualifications:

  • AWS Data Ecosystem Experience: Familiarity and experience with AWS cloud-native data services, such as AWS Glue, Amazon Athena, Amazon Redshift, Amazon RDS, and Amazon DynamoDB.
  • Knowledge of real-time or near-real-time streaming technologies (e.g., Kafka, Spark Structured Streaming).
  • Experience in developing feature engineering pipelines for machine learning (ML) consumption.
  • Background in performance tuning and capacity planning for large Spark clusters.


