
Big Data Solutions Architect (Client: Stater Bros, Rate: $130/hr, Location: San Bernardino, CA, Duration: 6+ months)
nFolks Ltd, San Bernardino, CA
$122k-155k (estimate)
Full Time, 1 Month Ago

nFolks Ltd is hiring a Big Data Solutions Architect (Client: Stater Bros, Rate: $130/hr, Duration: 6+ months) near San Bernardino, CA

Company Description

nFolks

Job Description

Big Data Solutions Architect. Local candidate preferred, or ability to relocate.

Qualifications

Bachelor's degree

General Experience
- 10 years of experience building solution designs and architectures for enterprise Big Data solutions
- 3 years of experience in technology consulting preferred
- Experience in CPG/Retail domains preferred
- Working with all organizational levels to understand requirements and provide thought leadership related to Big Data solutions
- Ability to facilitate, guide, and influence decision makers and stakeholders toward the proper IT architecture
- Ability to create presentation materials and simplify complex ideas
- Ability to present technology architecture and solution overviews to executive audiences
- Drive innovation through hands-on proofs of concept and prototypes to help illustrate approaches to technology and business problems

Functional Experience
- Full Software Development Life Cycle (SDLC) of Big Data solutions
- Experience with data integration and streaming technologies for EDW and Hadoop
- Data modeling and database design
- Data warehousing and Business Intelligence systems and tools
- Open-source Hadoop stack
- Administration, configuration, monitoring, and performance tuning of Hadoop/distributed platforms
- Big Data and real-time analytics platforms
- ETL for Big Data
- Migration of legacy data warehouses to a data lake
- Develop guidelines, standards, and processes to ensure the highest data quality and integrity
- Understanding of CI/CD in relation to the Big Data platform
- Understanding of container technologies is a plus
- Knowledge of/experience with cloud computing infrastructure (e.g., Amazon Web Services EC2, Elastic MapReduce, Azure)

Technical Skills (a combination of the following)
- Hadoop (HDFS, MapReduce, Hive, HBase, Pig, Spark)
- Cloudera, Hortonworks, MapR
- NoSQL (Cassandra, MongoDB, HBase)
- Git, Nexus
- Enterprise scheduler
- Kafka, Flume, Storm
- Appliances (Teradata, Netezza)
- Languages (Java, Linux, Apache, Perl/Python/PHP)
- Data virtualization

Education
- Bachelor's degree in Computer Science or a related field preferred
- Master's degree in a related field preferred

Job Summary

JOB TYPE: Full Time
SALARY: $122k-155k (estimate)
POST DATE: 03/14/2024
EXPIRATION DATE: 05/13/2024

