R2R (Record to Report) Data Engineer -- Austin, TX (Onsite)

Smart Folks Inc.
Austin, TX Contractor
POSTED ON 4/14/2026
AVAILABLE BEFORE 5/10/2026

Hi,

Greetings from Smart Folks!

My name is Kumar. We have a job opportunity for you as an R2R (Record to Report) Data Engineer with one of our clients based in Austin, TX (Onsite). Please find the job description below. If you are available and interested, please send us a Word copy of your resume along with the details requested at the end of this message, or call me to discuss this position further.

Job Title: R2R (Record to Report) Data Engineer
Location: Austin, TX (Onsite)

Duration: 6–12 Months

Start Date: ASAP

 

Job Details:        

 

Summary

 

The R2R Data Engineer is a technical expert responsible for architecting and implementing enterprise-scale data infrastructure that powers critical financial analytics and reporting for A Finance. This role requires deep technical expertise in modern data engineering practices, including building high-performance ETL/ELT pipelines, designing scalable data models, and implementing robust data quality frameworks that ensure accuracy and consistency across financial systems.

 

This position demands hands-on experience with cloud-native data platforms, advanced SQL optimization, and programmatic data transformations. The engineer will work cross-functionally with business users, FDT, IS&T, data scientists, and other engineers to develop production-grade data services that support financial close processes, regulatory reporting, and strategic decision-making.

 

You will work in an enterprise data warehouse (Snowflake), Dataiku, and lakehouse environments (AWS S3) to design dimensional models, implement data governance policies, and optimize query performance for large-scale financial datasets.

 

Responsibilities

 

  • Design and implement scalable data architectures and dimensional models (star/snowflake schemas) that support financial reporting, analytics, and machine learning use cases 
  • Develop, test, deploy, monitor, document, and troubleshoot complex data pipelines using modern orchestration frameworks with proper error handling, logging, and alerting mechanisms 
  • Build and maintain RESTful APIs and microservices for data access and integration with downstream applications (e.g., BlackLine)
  • Implement data quality frameworks including automated validation, reconciliation logic, and anomaly detection to ensure financial data accuracy 
  • Optimize SQL queries and data models for performance in Snowflake, including leveraging clustering keys, materialized views, and query optimization techniques 
  • Design and implement secure data pipelines with end-to-end encryption, role-based access controls, and compliance with data privacy regulations
  • Collaborate with data scientists and ML engineers to build feature stores and data pipelines that support machine learning model training and inference
  • Establish and enforce data engineering best practices including code reviews, testing strategies (unit, integration, data quality tests), and documentation standards 
  • Evaluate and implement emerging technologies in the data engineering space (e.g., streaming platforms, data quality tools, metadata management solutions) 
  • Participate in on-call rotation to support production data pipelines and resolve critical incidents
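The reconciliation bullet above can be sketched in a few lines of Python. This is a minimal illustration only; the function name, account IDs, and tolerance are invented for the example and are not part of the role description:

```python
from decimal import Decimal

def reconcile(source_totals, target_totals, tolerance=Decimal("0.01")):
    """Compare per-account totals from a source system against the
    warehouse and return discrepancies outside the tolerance."""
    issues = []
    for account, src in source_totals.items():
        tgt = target_totals.get(account)
        if tgt is None:
            issues.append((account, "missing in target"))
        elif abs(src - tgt) > tolerance:
            issues.append((account, f"variance {src - tgt}"))
    # Accounts present only in the target are also flagged.
    for account in sorted(target_totals.keys() - source_totals.keys()):
        issues.append((account, "missing in source"))
    return issues

# Example: one account reconciles, one is off by 5.00.
source = {"1000": Decimal("120.00"), "2000": Decimal("50.00")}
target = {"1000": Decimal("120.00"), "2000": Decimal("55.00")}
print(reconcile(source, target))  # [('2000', 'variance -5.00')]
```

In a production pipeline, a check like this would run after each load and feed the alerting mechanisms mentioned above rather than printing to stdout.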

 

Key Qualifications

 

Required Technical Skills: 

  • 5 years of advanced Python programming experience including object-oriented design, asynchronous programming, and package development
  • Expert-level SQL skills including complex joins, window functions, CTEs, query optimization, and performance tuning in databases 
  • Hands-on experience designing and implementing data models in Snowflake including time-travel, zero-copy cloning, data sharing, and cost optimization strategies
  • Proven experience building production-grade ETL/ELT pipelines processing large volumes of data 
  • Strong experience with AWS services including S3, Lambda, EC2, IAM, Secrets Manager, and CloudWatch
  • Experience implementing data security controls including encryption at rest/in transit, data masking, tokenization, and row-level security 
  • Hands-on experience with CI/CD pipelines using GitHub
  • Strong Git version control skills including branching strategies, pull requests, and code review processes 
  • Proficiency in shell scripting (Bash) for automation and system administration tasks
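The SQL skills listed above (CTEs, window functions) can be illustrated with a small self-contained example using Python's bundled sqlite3 module (assuming a SQLite build of 3.25 or newer for window-function support); the table and column names are invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE journal (account TEXT, period TEXT, amount REAL);
INSERT INTO journal VALUES
  ('1000', '2025-01', 120.0),
  ('1000', '2025-02', 80.0),
  ('2000', '2025-01', 50.0),
  ('2000', '2025-02', 70.0);
""")

# CTE plus a window function: running balance per account across periods.
query = """
WITH ordered AS (
  SELECT account, period, amount FROM journal
)
SELECT account, period,
       SUM(amount) OVER (PARTITION BY account ORDER BY period) AS running_balance
FROM ordered
ORDER BY account, period;
"""
rows = conn.execute(query).fetchall()
print(rows)
```

The same pattern (partitioned running aggregates) carries over directly to Snowflake, where clustering keys and materialized views would then be used to tune it at scale.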

 

Preferred Technical Skills: 

  • Experience with streaming data platforms (Kafka, Kinesis, Pub/Sub) and real-time data processing frameworks (Spark Streaming, Flink) 
  • Knowledge of containerization (Docker) and orchestration platforms (Kubernetes, ECS) 
  • Experience with data catalog and metadata management tools (Alation, Collibra, DataHub) 
  • Experience with data quality frameworks (Great Expectations, Soda, Monte Carlo) 
  • Experience building and consuming RESTful APIs using frameworks like FastAPI or Flask
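As a hedged illustration of the RESTful-API bullet, here is a minimal read-only endpoint built and consumed with only the Python standard library; a real service would more likely use FastAPI or Flask as noted above, and the route and data below are invented:

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Toy in-memory dataset standing in for a warehouse-backed service.
BALANCES = {"1000": 200.0, "2000": 120.0}

class BalanceHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        account = self.path.strip("/")
        if account in BALANCES:
            status, payload = 200, {"account": account, "balance": BALANCES[account]}
        else:
            status, payload = 404, {"error": "unknown account"}
        body = json.dumps(payload).encode()
        self.send_response(status)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the example quiet

# Bind an ephemeral port, serve in the background, and consume the API.
server = HTTPServer(("127.0.0.1", 0), BalanceHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]
with urllib.request.urlopen(f"http://127.0.0.1:{port}/1000") as resp:
    data = json.load(resp)
server.shutdown()
print(data)  # {'account': '1000', 'balance': 200.0}
```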

 

Business & Soft Skills:

  • Understanding of financial processes including Record-to-Report (R2R), Order-to-Cash (O2C), Procure-to-Pay (P2P), or financial planning 
  • Experience working with ERP systems (SAP, Oracle Financials) and extracting data from these platforms
  • Strong problem-solving skills with ability to debug complex data issues and performance bottlenecks 
  • Excellent communication skills with ability to explain technical concepts to non-technical stakeholders
  • Experience working in Agile/Scrum environments with cross-functional teams

 

Education and Experience

 

  • Bachelor's degree in Computer Science, Computer Engineering, Data Engineering, Mathematics, Statistics, or other quantitative discipline required
  • 5 years of professional experience in data engineering roles with demonstrated expertise in building production data systems
  • Master's degree in a related field preferred

 

If you are interested in this position, please fill in the following details:

Full Name

 

Contact Number 

 

Email Id 

 

DOB (MMDD) 

 

Ex-Wipro (if yes, please provide EMP ID)

 

Current location 

 

Visa status 

 

SSN Last 4 Digits

 

Relocation 

 

Availability for new project  

 

Interview Availability

 

Highest degree / Completion Year 

 

LinkedIn Id

 

Rate

Shravan Kumar Kataboina

Team Lead – Talent Acquisition
E-mail:

 

A: McKinney, Texas (USA, India, Mexico, UK)

P: (+1) Ext - 104

F:

 

 

  

Smart Folks Inc. (A Certified MBE & WBE): Work with the folks who are smart.

Salary: $60


