What are the responsibilities and job description for the Director Data & Analytics position at Perennial Resources International?
Job Description
Join the team responsible for building and expanding industry leading Data and Analytics Platforms.
In this role you will be responsible for conceptualizing, designing, and developing a highly reusable, flexible core data platform and analytics capability to meet critical product needs. The ideal candidate will be experienced in building and implementing data-intensive derived content in the Fixed Income and Equity domains.
The position demands an innovative, hands-on technology leader who will help conceptualize and develop ideas to support our next generation of products and data initiatives.
The responsibilities include:
• Develop projects from inception and build complex data pipelines using distributed cluster-computing technology
• Build cloud-native data infrastructure to stage, curate, link, and deliver data that provides actionable insights
• Identify, design, and implement optimized data consumption and delivery with a high level of automation for greater scalability, using message-, event-, and API-driven capabilities
• Design and assemble complex data sets that meet functional and non-functional business requirements
• Participate in development to build scalable products and platforms for commercialization
• Respond to and resolve production issues
Qualifications:
• Bachelor's degree in Computer Science, Information Systems, or Engineering is required, or, in lieu of a degree, demonstrated equivalent work experience; an MS in Computer Science is desirable
• A minimum of 6 to 8 years of experience in software development with Python, SQL, Core Java and/or C#, RESTful APIs, Apache Spark, and RDBMS and/or NoSQL systems
• Required technical skills: Python; SQL (PostgreSQL preferred); NoSQL (Elasticsearch and/or any document database preferred); messaging systems (SQS/Kafka); distributed data processing (Apache Spark); CI/CD pipelines (Azure DevOps preferred); RESTful APIs (FastAPI/Flask preferred)
• Desirable technical skills: data-pipeline job scheduling and monitoring tools (Apache Airflow preferred); microservices
• Experience working in a cloud environment (AWS preferred) for development and production
• Strong understanding of system architecture, object-oriented design, functional constructs, and distributed architectures
• Knowledge of market and fundamental data is an added advantage
• Experience with software development lifecycle (SDLC) methodologies (Agile preferred)
• Proficiency with the development environment, including IDEs, source control, unit-testing tools, and defect-management tools, along with automated build and deployment (CI/CD)
• Excellent communication skills are essential, with strong verbal and written proficiency