What are the responsibilities and job description for the Scala Data Engineer position at Capco?
CAPCO POLAND
Capco Poland is a global technology and management consultancy specializing in driving digital transformation across the financial services industry. We are passionate about helping our clients succeed in an ever-changing industry.
We also are:
- Experts in banking and payments, capital markets, wealth and asset management
- Focused on maintaining our nimble, agile, and entrepreneurial culture
- Committed to growing our business and hiring the best talent to help us get there
THINGS YOU WILL DO
- Assist in the development of a new Azure-based data platform
- Form part of a wider programme that will deliver new capabilities
SKILLS & EXPERIENCES YOU NEED TO GET THE JOB DONE
- Hands-on experience with Scala
- Working experience with Databricks (including Delta Lake), preferably on Azure
- Experience with any of the following Azure tools is an added advantage but not required: Azure Data Factory, Databricks, Cosmos DB, Azure Batch, ADLS/Blob Storage, Redis Cache, Azure Event Hub
- Experience building data pipelines
- CI/CD experience, including tools such as Maven, Gradle, Jenkins, TeamCity, and GitLab pipelines
- Investment Banking domain knowledge (Financing would be a distinct advantage)
- Bachelor's or Master's degree in Computer Science, Information Technology, Math, Physics, Engineering, or a related field, or the international equivalent
- Team player with a creative, communicative mindset
- Excellent English skills (both written and spoken)
Nice to have:
- Experience with the Java programming language, including Spring
- Experience with or knowledge of Spark
- Experience with or knowledge of Azure Data Factory and ADLS Gen2
- Experience in financial services
ONLINE RECRUITMENT PROCESS STEPS
- Screening call with the Recruiter
- Home assignment, if required
- Technical/Competencies interview with Capco Hiring Manager
- Client interview
- Feedback/Offer