What are the responsibilities and job description for the Data Platform Engineer position at USG, Inc.?
Job Title: Data Platform Engineer
Location: Oaks, PA (Hybrid)
Employment Type: Contract
Travel: Up to 25%
Job Description
Seeking a Data Platform Engineer to design and implement scalable data solutions across modern data platforms. This role will focus on building and supporting data pipelines, data lakes, analytics platforms, and streaming solutions within a rapidly evolving technical ecosystem.
The ideal candidate will have strong experience with cloud-based data platforms, distributed storage systems, and modern data integration technologies, along with the ability to collaborate across multiple data engineering stacks.
Key Responsibilities
- Design and implement data solutions supporting streaming, data lakes, analytics, and modern data architectures.
- Provide technical leadership by recommending appropriate technologies across application and infrastructure layers.
- Develop scalable solutions using Python, Java, and Scala.
- Collaborate across platforms such as Cloudera, Databricks, Snowflake, and AWS.
- Create technical documentation, including architecture diagrams, sequence diagrams, and system views.
- Deliver technical presentations explaining architecture and design decisions.
- Follow Agile development practices throughout the project lifecycle.
- Design, build, and deploy databases and data storage solutions for enterprise systems.
Basic Qualifications
- 4 years of experience supporting Software Engineering, Data Engineering, or Data Analytics projects.
- 2 years of experience leading teams delivering end-to-end data solutions.
- Strong experience with data engineering concepts and distributed data architectures.
- Ability to travel up to 25% if required.
Preferred Skills
- Hands-on experience with Snowflake, Databricks, AWS, Azure, Google Cloud Platform, or Hadoop ecosystems.
- Experience with distributed storage systems such as:
  - HDFS
  - S3
  - ADLS
  - GCS
  - Kudu
  - Cassandra
  - Elasticsearch / Solr
- Experience with data integration and streaming technologies:
  - Spark
  - Kafka
  - NiFi
  - StreamSets
  - AWS Database Migration Service
  - Azure Data Factory
  - Google Dataproc
- Strong understanding of event-driven architectures and data pipelines.
- Excellent communication and documentation skills.
- Bachelor’s or Master’s degree in Computer Science or a related field.