Hi
We are looking for a Big Data Architect (IT Consultant Master: minimum 5 years' experience in Big Data and 18 years total), 100% ONSITE (webcam interview).
Title: Big Data Architect
Length: 9-21 Months
Work Address: Washington, DC (ONSITE)
Immediate interviews – (Webcam Interview)
Please Note this position is 100% ONSITE
· This role will provide expertise to support the development of a Big Data / Data Lake system architecture that supports enterprise data operations for the District of Columbia government, including the Internet of Things (IoT) / Smart City projects, the enterprise data warehouse, the open data portal, and data science applications. This is an exciting opportunity to work as part of a collaborative senior data team supporting DC's Chief Data Officer. The architecture includes Databricks, Microsoft Azure platform tools (including Data Lake and Synapse), Apache platform tools (including Hadoop, Hive, Impala, Spark, Sedona, and Airflow), and data pipeline/ETL development tools (including StreamSets, Apache NiFi, and Azure Data Factory). The platform will be designed for District-wide use and for integration with other OCTO Enterprise Data tools such as Esri, Tableau, MicroStrategy, API gateways, and Oracle databases and integration tools.
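The pipeline/ETL tools named above (StreamSets, Apache NiFi, Azure Data Factory) all automate the same extract-transform-load pattern that this role designs around. As a rough illustration only, with hypothetical sensor data and function names that are not part of the posting, a minimal pipeline in Python might look like:

```python
# Minimal sketch of the extract-transform-load (ETL) pattern that pipeline
# tools such as StreamSets, Apache NiFi, or Azure Data Factory automate at
# scale. All names and the sample data below are hypothetical.
import csv
import io
import json

def extract(source):
    """Read raw CSV records from a source (a file, API, or stream)."""
    return list(csv.DictReader(source))

def transform(records):
    """Normalize fields and drop invalid rows before loading."""
    cleaned = []
    for row in records:
        if not row.get("sensor_id"):
            continue  # skip rows missing a key, as a real pipeline might quarantine them
        cleaned.append({
            "sensor_id": row["sensor_id"].strip(),
            "reading": float(row["reading"]),
        })
    return cleaned

def load(records):
    """Serialize to JSON, standing in for a write to a data-lake sink."""
    return json.dumps(records)

# Hypothetical IoT sensor feed: one valid row, one malformed, one valid.
raw = io.StringIO("sensor_id,reading\n s-01 ,3.5\n,9.9\ns-02,4.0\n")
result = load(transform(extract(raw)))
print(result)
```

In production, the same three stages would be configured as pipeline processors or activities in one of the tools above rather than hand-written.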
JOB DESCRIPTION
Responsibilities:
1. Coordinates IT project management, engineering, maintenance, QA, and risk management.
2. Plans, coordinates, and monitors project activities.
3. Develops technical applications to support users.
4. Develops, implements, maintains and enforces documented standards and procedures for the design, development, installation, modification, and documentation of assigned systems.
5. Provides training for system products and procedures.
6. Performs application upgrades.
7. Performs monitoring, maintenance, and reporting on real-time databases, real-time network and serial data communications, and real-time graphics and logic applications.
8. Troubleshoots problems.
9. Ensures project life-cycle is in compliance with District standards and procedures.
Minimum Education/Certification Requirements:
Bachelor’s degree in Information Technology or related field or equivalent experience
Required/Desired Skills
Candidates must have ALL the “Required” skills to be considered for the position. “Desired” or “Highly Desired” skills are a plus but are not required.
Skill Matrix (please fill in the Years of Experience and Last Used columns for each skill):
· Experience with Business workflow processes
· Experience implementing Big Data storage and analytics platforms such as Databricks and Data Lakes: Required, 5 years
· Knowledge of Big Data and Data Architecture and implementation best practices: Required, 5 years
· Knowledge of architecture and implementation of networking, security, and storage on cloud platforms such as Microsoft Azure: Required, 5 years
· Experience with deployment of data tools and storage on cloud platforms such as Microsoft Azure: Required, 5 years
· Knowledge of data-centric systems for the analysis and visualization of data, such as Tableau, MicroStrategy, ArcGIS, Kibana, Oracle: Required, 10 years
· Experience querying structured and unstructured data sources, including SQL and NoSQL databases: Required, 5 years
· Experience modeling and ingesting data into and between various data systems through the use of data pipelines: Required, 5 years
· Experience implementing Apache data products such as Spark, Sedona, Airflow, Atlas, NiFi, Hive, Impala: Required, 5 years
· Experience with API / Web Services (REST/SOAP): Required, 3 years
· Experience with complex event processing and real-time streaming data: Required, 3 years
· Experience with deployment and management of data science tools and modules such as JupyterHub: Required, 3 years
· Experience with ETL, data processing, and analytics using languages such as Python, Java, or R: Required, 3 years
· Experience with Cloudera Data Platform: Highly desired, 3 years
· Planning, coordinating, and monitoring project activities: Required, 16 years
· Leading projects and ensuring they comply with established standards/procedures: Required, 16 years
· Bachelor’s degree in IT or related field, or equivalent experience: Required, 16 years
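Several of the required skills above center on querying structured SQL sources. Purely as an illustration, using Python's built-in sqlite3 module to stand in for the Oracle/Synapse/Hive engines named in the posting (the schema and data are hypothetical), a typical aggregation query looks like:

```python
# Small, self-contained illustration of querying a structured SQL source.
# sqlite3 stands in for the production databases named in the posting
# (Oracle, Synapse, Hive, Impala); the schema and rows are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (sensor_id TEXT, reading REAL)")
conn.executemany(
    "INSERT INTO readings VALUES (?, ?)",
    [("s-01", 3.5), ("s-01", 4.5), ("s-02", 4.0)],
)

# Aggregate per sensor: the kind of query a warehouse or data-lake SQL
# engine would run across much larger IoT datasets.
rows = conn.execute(
    "SELECT sensor_id, AVG(reading) FROM readings "
    "GROUP BY sensor_id ORDER BY sensor_id"
).fetchall()
print(rows)  # [('s-01', 4.0), ('s-02', 4.0)]
conn.close()
```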
Job Type: Contract
Pay: $70.00 - $80.00 per hour
Work Location: In person
Contractor · $128k-162k (estimate) · 02/28/2024 - 06/25/2024 · valsatechcorp.com · Windsor Mill, MD · 25-50