What are the responsibilities and job description for the Hadoop Administrator position at Stafford Technology?
Hello! We are seeking candidates for the Data Platform Engineering team to manage and scale Big Data Hadoop and relational database technologies for our client. The role covers administration and infrastructure planning for Hadoop NoSQL, Snowflake, and Kafka environments.
Responsibilities include:
- Administering Hadoop NoSQL environments and ensuring availability and performance of business applications
- Installing, managing, and tuning Hadoop/HBase clusters
- Developing operational best practices and optimizing the interaction between applications and the Hadoop cluster
- Participating in the on-call support rotation
- Reviewing software/hardware architecture for scalability and performance
Required qualifications:
- Experience with the Hadoop stack (HBase, YARN, Spark/MapReduce, Kafka)
- Knowledge of tuning and hardening data platforms and deploying Hadoop clusters
- Understanding of the Linux operating system (Red Hat preferred)
- Skills in scripting and automation tools (Ansible, Bash, Python, GitLab) and Java coding
- A minimum of 7 years of IT experience
Experience with Snowflake, AWS technologies, and working in a Scrum/Agile framework is desirable.
Job Type: Direct Hire (Includes Visa Sponsorship)
Pay: $125,000 per year
Location: Hybrid, Dublin, Ohio
Benefits:
- 401(k)
- Dental insurance
- Health insurance
Schedule:
- 8 hour shift
- Monday to Friday
Experience:
- Hadoop: 1 year (Preferred)
Work Location: Hybrid remote in Worthington, OH 43085