What are the responsibilities and job description for the full-time Data Architect @ Issaquah, WA (/EAD) position at Sophus IT Solutions?
Job Details
Data Architect
Location: Issaquah, WA (onsite)
Full-time position
Client: COSTCO
UST is searching for a Data Architect who will play a role in designing, developing, and implementing data pipelines and data integration solutions using Python and Google Cloud Platform services.
The opportunity:
Collaborate with cross-functional teams to understand data requirements and design data solutions that meet business needs.
Develop, construct, test, and maintain data acquisition pipelines for large volumes of structured and unstructured data, including both batch and real-time processing.
Develop and maintain data pipelines and ETL processes using Python (a minimal sketch follows this section).
Design, build, and optimize data models and data architecture for efficient data processing and storage.
Implement data integration and data transformation workflows to ensure data quality and consistency.
Monitor and troubleshoot data pipelines to ensure data availability and reliability.
Conduct performance tuning and optimization of data processing systems for improved efficiency and scalability.
This position description identifies the responsibilities and tasks typically associated with the performance of the position. Other relevant essential functions may be required.
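To make the Python ETL responsibilities above concrete, here is a minimal batch sketch: extract rows from a CSV landing zone, normalize types while dropping records that fail basic quality checks, and load the result. The file path, column names, and the SQLite stand-in for a warehouse are all hypothetical; a production pipeline would add logging, retries, and schema validation.

```python
# Minimal batch ETL sketch; paths, columns, and the SQLite "warehouse"
# are hypothetical stand-ins for illustration only.
import csv
import sqlite3  # stand-in for a real warehouse client

def extract(path):
    # Read raw rows from a CSV drop zone.
    with open(path, newline="") as f:
        yield from csv.DictReader(f)

def transform(rows):
    # Normalize types and drop records failing basic quality checks.
    for row in rows:
        try:
            yield (row["order_id"], float(row["amount"]), row["region"].strip().upper())
        except (KeyError, ValueError):
            continue  # in practice, route bad records to a dead-letter sink

def load(records, db_path="warehouse.db"):
    # Append cleaned records to the target table.
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS orders (order_id TEXT, amount REAL, region TEXT)")
    con.executemany("INSERT INTO orders VALUES (?, ?, ?)", records)
    con.commit()
    con.close()

if __name__ == "__main__":
    load(transform(extract("landing/orders.csv")))
```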
What you need:
Working experience as a Data Engineer.
Experience migrating large-scale applications from legacy systems to modern architectures.
Good programming skills in Python and experience with Spark for data processing and analytics.
Experience with Google Cloud Platform services such as GCS, Dataflow, Cloud Functions, Cloud Composer, Cloud Scheduler, Datastream (CDC), Pub/Sub, BigQuery, and Dataproc, together with Apache Beam for batch and stream data processing (Beam and Composer sketches follow this list).
Develop JSON messaging structures for integration with various applications.
Leverage DevOps and CI/CD practices (GitHub, Terraform) to ensure the reliability and scalability of data pipelines.
Experience with scripting languages such as Shell and Perl.
Design and build ingestion pipelines using REST APIs (see the ingestion sketch after this list).
Experience with data modeling, data integration, and ETL processes.
Strong knowledge of SQL and database systems.
Familiarity with managing cloud-native databases.
Understanding of security integration in CI/CD pipelines.
Understanding of data warehousing concepts and best practices.
Proficiency in working with large-scale data sets and distributed computing frameworks.
Strong problem-solving and analytical skills.
Excellent communication and teamwork abilities.
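For the GCP and Apache Beam requirement, one possible batch pipeline shape is sketched below: read CSV files from GCS, parse each row, and append to BigQuery, running on Dataflow. The project id, region, bucket, table, and column names are all hypothetical; a real pipeline would add schema management and dead-letter handling.

```python
# Minimal Apache Beam batch sketch (GCS -> BigQuery); every resource name
# below (project, bucket, table) is a hypothetical placeholder.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_csv_line(line):
    # Split a CSV row into a typed dict; column names are illustrative.
    order_id, amount = line.split(",")
    return {"order_id": order_id, "amount": float(amount)}

def run():
    options = PipelineOptions(
        runner="DataflowRunner",             # or "DirectRunner" for local tests
        project="my-gcp-project",            # hypothetical project id
        region="us-west1",
        temp_location="gs://my-bucket/tmp",  # hypothetical bucket
    )
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadFromGCS" >> beam.io.ReadFromText(
                "gs://my-bucket/orders/*.csv", skip_header_lines=1)
            | "Parse" >> beam.Map(parse_csv_line)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-gcp-project:analytics.orders",  # hypothetical table
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )

if __name__ == "__main__":
    run()
```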
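For the REST ingestion and JSON messaging items, a common shape is to pull records from a source API, wrap each in a small versioned JSON envelope, and publish to Pub/Sub for downstream processing. The endpoint URL, project id, topic name, and envelope fields below are assumptions for illustration; a real pipeline would paginate, retry, and batch publishes.

```python
# Minimal REST -> Pub/Sub ingestion sketch; URL, project, topic, and
# message envelope are hypothetical.
import json

import requests
from google.cloud import pubsub_v1

API_URL = "https://api.example.com/v1/orders"  # hypothetical source endpoint
PROJECT_ID = "my-gcp-project"                  # hypothetical project id
TOPIC_ID = "orders-raw"                        # hypothetical topic

def fetch_records():
    # Pull one page of records from the source REST API.
    resp = requests.get(API_URL, timeout=30)
    resp.raise_for_status()
    return resp.json()["items"]

def publish(records):
    publisher = pubsub_v1.PublisherClient()
    topic_path = publisher.topic_path(PROJECT_ID, TOPIC_ID)
    for record in records:
        # Versioned envelope so consumers can evolve with the schema.
        message = {"schema_version": 1, "payload": record}
        future = publisher.publish(topic_path, json.dumps(message).encode("utf-8"))
        future.result()  # block until the broker acknowledges

if __name__ == "__main__":
    publish(fetch_records())
```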
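And since Cloud Composer appears in the stack, a minimal Airflow 2.x DAG sketch for scheduling a daily load is below; the DAG id, schedule, and task body are placeholders standing in for the real ingestion or Beam jobs.

```python
# Minimal Airflow DAG sketch for Cloud Composer (Airflow 2.x assumed);
# the DAG id, schedule, and task are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def run_daily_load():
    # In a real DAG this would kick off the ingestion or Beam job.
    print("running daily load")

with DAG(
    dag_id="daily_orders_load",     # hypothetical DAG id
    schedule_interval="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    PythonOperator(task_id="daily_load", python_callable=run_daily_load)
```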