What are the responsibilities and job description for the Ab Initio Administration Lead :: MD, VA, DC Area (Hybrid) :: position at 1 Point System?
Job Details
Role : Ab Initio Administration Lead
Duration : 6-month contract with possibility of extension
Location : MUST BE local to MD, VA, DC area - Hybrid role
Alta IT Services is seeking a highly skilled Ab Initio Admin with a robust background in Ab Initio, as well as some experience with Cloudera and AWS, to join a dynamic IT team. The Lead Ab Initio ETL Administrator is responsible for leading all tasks involved in administering the ETL tool (Ab Initio) in the cloud. This position also requires hands-on work. The candidate will support the implementation of a Data Integration/Data Warehouse for data products on-prem and in AWS. The position does not have direct reports but is expected to assist in guiding and mentoring less experienced staff, and may lead a team of matrixed resources.
Qualification Requirements:
Advanced (expert preferred) level experience administering and engineering relational databases (e.g., MySQL, PostgreSQL, MongoDB, RDS, DB2), Big Data systems (e.g., Cloudera Data Platform Private Cloud and Public Cloud), and automation tools (e.g., Ansible, Terraform, Bitbucket), along with experience working with cloud solutions (specifically data products on AWS), is necessary.
Prior experience with AWS Cloud is required.
At least 10 years of experience with all tasks involved in administering the ETL tool (Ab Initio)
At least 10 years of experience with, and advanced knowledge of, Ab Initio Graphical Development Environment (GDE), Metadata Hub, and Operational Console
Experience creating Big Data (ETL) pipelines from on-premises systems to data factories, data lakes, and cloud storage such as EBS or S3.
Advanced knowledge of UNIX, Linux, shell scripting, and SQL.
Experience working with and troubleshooting issues related to Hive, ICFF, and HDFS.
Experience managing Metadata Hub (MDH) and Operational Console, and troubleshooting environmental issues that affect these components.
Experience with scripting and automation, such as designing and developing automated ETL processes and architecture, and unit testing ETL code.
Ability to troubleshoot issues with Kerberos, TLS/SSL, Models, and Experiments, as well as other workload issues that data scientists might encounter once the application is running.
Strong knowledge of Cloudera (Hadoop, HDFS, YARN, Hive, Spark) administration and management.
Hands-on experience with AWS services (EC2, S3, RDS, VPC, IAM, etc.).
Familiarity with scripting languages (Python, Shell, etc.).
Experience with CI/CD tools (Jenkins, GitLab CI, etc.).
Understanding of containerization technologies (Docker, Kubernetes) is a plus.
Strong, demonstrated knowledge of DB2.
Represents the team in all architectural and design discussions. Knowledgeable in the end-to-end process and able to act as an SME, providing credible feedback and input in all impacted areas. Required to track and monitor projects and tasks as the lead.