What are the responsibilities and job description for the Hadoop DBA (Hadoop Administrator) position at Jobs via Dice?
Dice is the leading career destination for tech experts at every stage of their careers. Our client, Arbor Tek Systems, is seeking the following. Apply via Dice today!
Job Title: Hadoop DBA (Hadoop Administrator)
Location: Charlotte, NC (Onsite)
Experience: 10 Years
Employment Type: Contract
Job Summary:
We are looking for an experienced Hadoop DBA / Hadoop Administrator to manage, monitor, and support large-scale Hadoop clusters. The ideal candidate should have strong expertise in cluster management, performance tuning, and big data ecosystem tools.
Key Responsibilities:
- Install, configure, and maintain Hadoop clusters (HDFS, YARN)
- Monitor cluster health, performance, and availability
- Manage data storage, replication, and backup strategies
- Perform cluster tuning and optimization
- Troubleshoot and resolve issues related to Hadoop ecosystem components
- Implement security practices (Kerberos, Ranger, Knox)
- Manage and support tools like Hive, HBase, Spark
- Handle upgrades, patches, and capacity planning
- Work closely with data engineers and developers
Required Skills:
- Strong experience in Hadoop administration (HDFS, YARN, MapReduce)
- Experience with Hadoop distributions like Cloudera, Hortonworks, or MapR
- Hands-on experience with Hive, HBase, Spark
- Knowledge of Linux/Unix administration
- Experience with cluster monitoring tools (Ambari, Cloudera Manager)
- Understanding of data storage and distributed systems
- Scripting knowledge (Shell, Python)
- Experience with backup and disaster recovery strategies
- Experience with Kafka or streaming technologies
- Knowledge of cloud platforms (AWS / Azure / Google Cloud Platform)
- Familiarity with DevOps tools (Docker, Kubernetes)
- Experience with data governance and security frameworks
- Bachelor's degree in Computer Science, IT, or a related field
- Relevant certifications in Hadoop/Big Data are a plus