8 Data Analytics Engineer - AWS Jobs in Chicago, IL

  • Neal Analytics | Chicago, IL | Full Time | $102k-129k (estimate) | 1 Month Ago
  • Google | Chicago, IL | Full Time | $106k-129k (estimate) | 6 Days Ago
  • SPECTRAFORCE | Chicago, IL | Full Time | $109k-138k (estimate) | 1 Week Ago
  • Aspen Dental | Chicago, IL | Full Time | $117k-144k (estimate) | 2 Days Ago
  • Aon | Chicago, IL | Full Time | $120k-147k (estimate) | 1 Month Ago
  • Aon | Chicago, IL | Full Time | $120k-147k (estimate) | 1 Month Ago
  • CapTech Consulting | Chicago, IL | Full Time | $120k-147k (estimate) | 1 Month Ago
  • CapTech Consulting | Chicago, IL | Full Time | $120k-147k (estimate) | 1 Month Ago
Data Analytics Engineer - AWS
Neal Analytics | Chicago, IL
$102k-129k (estimate)
Full Time | IT Outsourcing & Consulting | 1 Month Ago

Sorry! This job is no longer available. Please explore the similar jobs listed above.

Data Analytics Engineer - AWS

Neal Analytics (Fractal.ai) - Chicago, IL, United States

Fractal is one of the most prominent players in the Artificial Intelligence space. Fractal's mission is to power every human decision in the enterprise, bringing AI, engineering, and design to the world's most admired Fortune 500® companies.

Fractal has more than 3,000 employees across 16 global locations, including the United States, UK, Ukraine, India, Singapore, and Australia. Fractal has consistently been rated one of India's best companies to work for by The Great Place to Work® Institute; has been featured as a leader in the Customer Analytics Service Providers Wave™ 2021, Computer Vision Consultancies Wave™ 2020, and Specialized Insights Service Providers Wave™ 2020 by Forrester Research; and was recognized as an "Honorable Vendor" in the 2021 Magic Quadrant™ for Data & Analytics by Gartner.

Education: A bachelor’s degree in Computer Science or a related field, with technology experience

Experience: 4-8 years

Location: Chicago (open to remote)

Responsibilities:

As a Data Engineer, you will be responsible for implementing complex data pipelines and analytics solutions that support key decision-making business processes in our client’s domain. You will gain exposure to a project that leverages cutting-edge AWS technology, applying Big Data and Machine Learning to solve new and emerging problems for our clients. You will also have the added advantage of working closely with AWS Professional Services teams, executing directly within AWS services and technologies to solve complex and challenging business problems for enterprises.

Key responsibilities include:

  • Work closely with Product Owners and AWS Professional Services Architects to understand requirements, formulate solutions, and implement them.
  • Implement scalable data transformation pipelines as per design (see the sketch after this list).
  • Implement the data model and data architecture as per the laid-out design.
  • Evaluate new capabilities of AWS analytics services, develop prototypes, assist in drawing up POVs, and participate in design discussions.
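
For illustration only (not part of the original posting), the snippet below sketches the kind of scalable data transformation pipeline the role describes: a small PySpark batch job that reads semi-structured data from S3, aggregates it, and writes partitioned Parquet for downstream analytics. The bucket paths, column names, and application name are hypothetical.

    # Illustrative sketch only; paths and column names are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("orders-daily-aggregate").getOrCreate()

    # Read semi-structured source data from a (hypothetical) raw S3 bucket.
    orders = spark.read.json("s3://example-raw-bucket/orders/")

    # Transform: derive a date column and aggregate daily revenue per customer.
    daily_revenue = (
        orders
        .withColumn("order_date", F.to_date("order_timestamp"))
        .groupBy("customer_id", "order_date")
        .agg(F.sum("order_total").alias("daily_revenue"))
    )

    # Write partitioned Parquet to a (hypothetical) curated bucket for
    # downstream consumption, e.g. via Athena or Redshift Spectrum.
    (daily_revenue.write
        .mode("overwrite")
        .partitionBy("order_date")
        .parquet("s3://example-curated-bucket/daily_revenue/"))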

Requirements:

  • Minimum 3 years’ experience implementing transformation and loading of data from a wide variety of traditional and non-traditional sources (structured, unstructured, and semi-structured) using SQL, NoSQL, and data pipelines for real-time, streaming, batch, and on-demand workloads
  • At least 2 years implementing solutions using AWS services such as Lambda, Athena, Glue, S3, Redshift, Kinesis, and Apache Spark
  • Experience working with data warehouse, data lake, or lakehouse concepts on AWS
  • Experience implementing batch processing using AWS Glue/Lake Formation and Data Pipeline
  • Experience with EMR/MSK
  • Experience with, or exposure to, Amazon DynamoDB is a plus
  • Ability to develop object-oriented code using Python, in addition to PySpark, SQL, and one other language (Java or Scala preferred)
  • Experience with streaming technologies, both on-premises and cloud, such as consuming from and producing to Kafka and Kinesis
  • Experience building pipelines and orchestrating workflows in an enterprise environment using Apache Airflow/Control-M (see the sketch after this list)
  • Experience implementing any one of Redshift, Databricks, or Snowflake on AWS
  • A good understanding of dimensional data modelling is a plus
  • Ability to multi-task and prioritize deadlines as needed to deliver results
  • Ability to work independently or as part of a team
  • Excellent verbal and written communication skills with great attention to detail and accuracy
  • Experience working in an Agile/Scrum environment
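
For illustration only (not part of the original posting), the sketch below shows the kind of workflow orchestration named above: a minimal Apache Airflow DAG that triggers a pre-existing AWS Glue job and then a placeholder downstream task. The DAG ID, schedule, and Glue job name are hypothetical.

    # Illustrative sketch only; DAG ID, schedule, and Glue job name are hypothetical.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.empty import EmptyOperator
    from airflow.providers.amazon.aws.operators.glue import GlueJobOperator

    with DAG(
        dag_id="daily_orders_pipeline",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        # Trigger a Glue job assumed to already exist in the target AWS account.
        transform_orders = GlueJobOperator(
            task_id="transform_orders",
            job_name="orders-daily-aggregate",
            wait_for_completion=True,
        )

        # Placeholder for downstream work (e.g. refreshing a reporting table).
        publish = EmptyOperator(task_id="publish_downstream")

        transform_orders >> publish
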
Posted On: Saturday, November 19, 2022
Compensation: Open DOE

Job Summary

JOB TYPE: Full Time
INDUSTRY: IT Outsourcing & Consulting
SALARY: $102k-129k (estimate)
POST DATE: 03/04/2024
EXPIRATION DATE: 03/31/2024
WEBSITE: nealanalytics.com
HEADQUARTERS: Bellevue, WA
SIZE: 25 - 50
FOUNDED: 2011
CEO: Carl Albrecht
REVENUE: $5M - $10M

About Neal Analytics

Since 2011, Neal Analytics, a Microsoft Gold Partner, has supported companies with their business transformation initiatives. Neal offers end-to-end consulting services spanning business strategy, migration and modernization, data science, ML/AI, BI, application development, and RPA. A data and AI technology leader, Neal leverages Agile methodologies and flexible engagement models to deliver measurable customer value. Apply to work at Neal at https://go.nealanalytics.com/jobs
