What are the responsibilities and job description for the Hadoop Developer position at Veridian Tech Solutions, Inc.?
Title: Hadoop Developer
Role: Full-Time
Location: Charlotte, NC (Onsite)
Job Summary
We are seeking an experienced Hadoop Developer to design, develop, and maintain scalable big data solutions using the Hadoop ecosystem. The ideal candidate should have strong expertise in distributed data processing, ETL pipelines, and large-scale data analytics using technologies such as Hadoop, Spark, Hive, Kafka, and SQL.
The candidate will work closely with data engineers, architects, and business teams to build high-performance data platforms and optimize data workflows.
Key Responsibilities
- Design and develop scalable big data applications using Hadoop ecosystem technologies
- Build and maintain ETL/data ingestion pipelines for structured and unstructured data
- Develop data processing solutions using Apache Spark, Hive, and MapReduce
- Work with HDFS for distributed storage and large-scale data management
- Integrate data from multiple sources using Kafka, Sqoop, Flume, or similar tools
- Optimize Hadoop/Spark jobs for performance and scalability
- Collaborate with cross-functional teams to understand business and data requirements
- Perform data validation, troubleshooting, and production support activities
- Ensure data security and governance, and follow coding best practices
- Monitor and improve system reliability and efficiency
Required Skills
- Strong experience with:
  - Hadoop Ecosystem (HDFS, YARN, MapReduce)
  - Apache Spark
  - Hive
  - Kafka
  - SQL
- Proficiency in Java, Python, or Scala
- Experience with ETL and big data processing
- Strong understanding of distributed systems and data architectures
- Experience working in Linux/Unix environments
- Knowledge of workflow scheduling tools such as Airflow or Oozie
- Familiarity with cloud platforms such as AWS, Azure, or GCP is a plus