What are the responsibilities and job description for the Data Analytics Architect position at Jobs via Dice?
Dice is the leading career destination for tech experts at every stage of their careers. Our client, Stellar IT Solutions, is seeking the following. Apply via Dice today!
Our Fortune 500 client is looking for a Data Analytics Architect for their project based in Tallahassee, FL.
Job Title: Data Analytics Architect
Location: Tallahassee, FL
Duration: Long-term
Job Description:
Stellar IT Solutions is a global IT solutions provider headquartered in Rockville, MD, with operations in the US and India. Stellar IT Solutions draws on over 15 years of IT and consulting experience to deliver cost-effective solutions to many Fortune 500 companies.
- Minimum of Bachelor’s (4-year) degree in Computer Science, Analytics/Data Science, Information Systems, Business Administration, Public Health Informatics, or another related field.
- Current data and/or analytics certification such as Certified Data Management Professional (CDMP). Eighteen or more hours of participation in webinars or conferences over the last 3 years related to data and analytics may be substituted for the certification.
- Five or more years of experience collaborating with various lines of business and the analyst/data science community; must demonstrate an understanding of general business operations related to healthcare and administration, and the ability to translate use case requirements into data and analytics requirements.
- Five or more years of experience modeling, engineering, implementing, and supporting data warehouses (both normalized and dimensional) including two or more years of experience with Snowflake Data Warehouse.
- Three or more years of experience designing, engineering, implementing, and supporting cloud-based data lakes, including lake layers and bucket structures in AWS S3 and related Apache Foundation tools such as Parquet.
- Two or more years of experience modeling, engineering, implementing, and supporting cloud data lakehouse structures (Delta Lake, Hudi, and Iceberg) utilizing Databricks.
- Seven or more years of experience in data modeling (including Entity Relationship, Logical, Conceptual, and Physical models) for analytical purposes, and data profiling/reverse engineering for both schema-on-read and schema-on-write environments. The experience must include advanced proficiency with Erwin data modeler.
- Five or more years of experience with data pipeline/integration tools, including source-to-target mapping, coding, data quality and enrichment transforms, observability, orchestration, performance optimization, and testing, via various methods (e.g., ETL/ELT/batch, CDC, streaming) using Informatica tools.
- This experience must include two or more years of experience using Informatica IDMC suite of tools including Cloud Data Integration (CDI), Cloud Data Quality (CDQ), Cloud Data Profiling (CDP), and Cloud Data Ingestion and Replication (CDIR).
- Five or more years of experience with SQL programming, three or more years of experience with Python or a similar object-oriented high-level programming language. Experience with AWS Lambda functions is helpful.
- Five or more years of experience engineering relational (both row-store and columnar-store) and NoSQL (e.g., document, graph, vector, key-value) databases.
- Three or more years of experience working within the AWS cloud infrastructure.
- Two or more years of experience utilizing DevOps or DataOps processes.
- Five or more years of data and analytics testing/quality assurance and acceptance experience, including best practices, tools, and automation.
- Two or more years of experience working with metadata and a data catalog that supports the data lakehouse and data warehouse operations, as well as supporting machine learning.
- Two or more years of experience implementing and supporting Master Data Management (MDM) and Reference Data Management (RDM) in Informatica IDMC (Cloud) Customer and Reference 360 SaaS.
- Four or more years of experience supporting cloud-based Analytics & Business Intelligence (ABI) tools (Qlik and Tableau), including secure data warehouse connections, performance tuning, and user access controls.
- Two or more years of experience supporting cloud-based Data Science and Machine Learning (DSML) platforms (Dataiku), including statistical model life-cycle management, endpoints, and machine learning.
- Must possess excellent communication skills, including verbal, written, and diagramming skills.