What are the responsibilities and job description for the Senior SQL Developer position at Kanak Elite Services?
Hello there,
Wishing you a happy Monday.
My name is Yashmita, and I am a Technical Recruiter at Kanak IT Services LLC. I am reaching out to you regarding the following job opportunity. If you are interested, kindly reply to this email at yashmita@kanakits.com with your updated resume.
POSITION: APPLICATION DEVELOPMENT PRINCIPAL (GLIDER ASSESSMENT WILL BE REQUIRED)
DURATION: 2-YEAR CONTRACT
LOCATION: 100% REMOTE
EXPERIENCE: 13-15 YEARS REQUIRED
Must Have Skills:
- Experience working with both business and IT leaders
- Teradata
- Databricks
- Spark/Pyspark
- A candidate with 13-15 years of experience who takes initiative and has command of the tools; someone who works in a consultative manner rather than waiting for direction and orders.
Responsibilities:
- Collaborate with business and technical stakeholders to gather and understand requirements.
- Design scalable data solutions and document technical designs.
- Develop production-grade, high-performance ETL pipelines using Spark and PySpark (a minimal PySpark sketch follows the skills list below).
- Perform data modeling to support business requirements.
- Write optimized SQL queries using Teradata SQL, Hive SQL, and Spark SQL across platforms such as Teradata and Databricks Unity Catalog (see the Spark SQL example after the list).
- Implement CI/CD pipelines to deploy code artifacts to platforms like AWS and Databricks.
- Orchestrate Databricks jobs using Databricks Workflows (a Jobs API sketch appears after the list).
- Monitor production jobs, troubleshoot issues, and implement effective solutions.
- Actively participate in Agile ceremonies including sprint planning, grooming, daily stand-ups, demos, and retrospectives.
Required Skills & Experience:
- Strong hands-on experience with Spark, PySpark, shell scripting, Teradata, and Databricks.
- Proficiency in writing complex and efficient SQL queries and stored procedures.
- Solid experience with Databricks for data lake/data warehouse implementations.
- Familiarity with Agile methodologies and DevOps tools such as Git, Jenkins, and Artifactory.
- Experience with Unix/Linux shell scripting (KSH) and basic Unix server administration.
- Knowledge of job scheduling tools like CA7 Enterprise Scheduler.
- Hands-on experience with AWS services including S3, EC2, SNS, SQS, Lambda, ECS, Glue, IAM, and CloudWatch (a small boto3 sketch appears after the list).
- Expertise in Databricks components such as Delta Lake, Notebooks, Pipelines, cluster management, and cloud integration (Azure/AWS).
- Proficiency with collaboration tools like Jira and Confluence.
- Demonstrated creativity, foresight, and sound judgment in planning and delivering technical solutions.
- Spark
- PySpark
- Shell Scripting
- Teradata
- Databricks
- AWS SQS
- Foresight
- Sound Judgment
- SQL
- Stored Procedures
- Databricks for Data Lake/Data Warehouse Implementations
- Agile Methodologies
- Git
- Jenkins
- Artifactory
- Unix/Linux Shell Scripting
- Unix Server Administration
- CA7 Enterprise Scheduler
- AWS S3
- AWS EC2
- AWS SNS
- AWS Lambda
- AWS ECS
- AWS Glue
- AWS IAM
- AWS CloudWatch
- Databricks Delta Lake
- Databricks Notebooks
- Databricks Pipelines
- Databricks Cluster Management
- Databricks Cloud Integration (Azure/AWS)
- Jira
- Confluence
- Creativity
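To make the ETL bullet concrete, here is a minimal PySpark sketch of the kind of pipeline described. The table names (raw.orders, analytics.daily_revenue) and all column names are invented for illustration; they are not part of this posting.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: read raw orders from a (hypothetical) source table.
raw = spark.read.table("raw.orders")

# Transform: keep completed orders, derive revenue, aggregate per day.
daily = (
    raw.filter(F.col("status") == "COMPLETED")
       .withColumn("revenue", F.col("quantity") * F.col("unit_price"))
       .groupBy(F.to_date("order_ts").alias("order_date"))
       .agg(F.sum("revenue").alias("daily_revenue"),
            F.countDistinct("customer_id").alias("customers"))
)

# Load: write the aggregate back as a managed Delta table.
(daily.write.format("delta")
      .mode("overwrite")
      .saveAsTable("analytics.daily_revenue"))
```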
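For the SQL bullet, the sketch below runs an analytical Spark SQL query from PySpark; the sales.transactions schema is hypothetical. On Teradata, the outer rank filter could instead be expressed with a QUALIFY clause.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("top_customers").getOrCreate()

# Rank customers by total spend and keep the top 100 (hypothetical schema).
top_customers = spark.sql("""
    SELECT customer_id, total_spend, spend_rank
    FROM (
        SELECT customer_id,
               SUM(amount) AS total_spend,
               RANK() OVER (ORDER BY SUM(amount) DESC) AS spend_rank
        FROM sales.transactions
        WHERE txn_date >= DATE '2024-01-01'
        GROUP BY customer_id
    ) ranked
    WHERE spend_rank <= 100
""")
top_customers.show()
```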
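One way to set up the Databricks Workflows orchestration mentioned above is through the Jobs API 2.1, sketched here with plain requests. The workspace host, token, notebook path, cluster spec, and cron schedule are all placeholder assumptions.

```python
import os
import requests

# Workspace credentials are assumed to come from environment variables.
host = os.environ["DATABRICKS_HOST"]    # e.g. https://<workspace>.cloud.databricks.com
token = os.environ["DATABRICKS_TOKEN"]

# A single-task job running a notebook on a fresh job cluster, daily at 06:00 UTC.
job_spec = {
    "name": "daily_revenue_etl",
    "tasks": [
        {
            "task_key": "run_etl",
            "notebook_task": {"notebook_path": "/Repos/etl/daily_revenue"},
            "new_cluster": {
                "spark_version": "14.3.x-scala2.12",
                "node_type_id": "i3.xlarge",
                "num_workers": 2,
            },
        }
    ],
    "schedule": {
        "quartz_cron_expression": "0 0 6 * * ?",
        "timezone_id": "UTC",
    },
}

resp = requests.post(
    f"{host}/api/2.1/jobs/create",
    headers={"Authorization": f"Bearer {token}"},
    json=job_spec,
)
resp.raise_for_status()
print("Created job:", resp.json()["job_id"])
```

The same definition could equally be created through the Workflows UI or the Databricks SDK; the raw API payload is shown only because it makes the job structure explicit.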
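Several of the AWS services listed (SQS, S3) often appear together in pipelines like these; the boto3 sketch below drains a queue into an S3 archive. The queue URL and bucket name are hypothetical.

```python
import json
import boto3

sqs = boto3.client("sqs")
s3 = boto3.client("s3")

# Hypothetical resource names, for illustration only.
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/etl-events"
BUCKET = "example-etl-archive"

# Long-poll once for up to 10 messages, archive each body to S3, then delete it.
resp = sqs.receive_message(QueueUrl=QUEUE_URL, MaxNumberOfMessages=10,
                           WaitTimeSeconds=10)
for msg in resp.get("Messages", []):
    key = f"events/{msg['MessageId']}.json"
    s3.put_object(Bucket=BUCKET, Key=key,
                  Body=json.dumps(json.loads(msg["Body"])).encode())
    sqs.delete_message(QueueUrL=QUEUE_URL, ReceiptHandle=msg["ReceiptHandle"]) if False else \
        sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=msg["ReceiptHandle"])
```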