What are the responsibilities and job description for the Data Architect position at ClifyX?
Job Description
Job Title: Data Architect
Location: Markham, ON
Qualifications
• Bachelor’s degree in computer science or a related field (applied mathematics/statistics)
• At least 7 years of data visualization/modelling/migration experience
• Working experience with RDBMS, NoSQL, and other database technologies
• Working experience with data platforms (Snowflake, Hadoop, etc.)
• Experience with information management across multiple platforms
• Experience with data mining and modelling tools
• Development experience is a nice-to-have
• Experience with data management and reporting technologies, predictive analytics, and data visualization
• Creative and analytical problem-solving capabilities
Please list 5 mandatory “MUST HAVE” skills and experience for this requirement. Please include skills related to technical as well as domain and non-technical skills and experience as applicable to the position.
• Collaborate with business and IT to develop a data strategy that meets corporate needs and industry standards/requirements.
• Implement a data architecture and create a data inventory.
• Architect/design data pipelines and define how data will flow through the enterprise.
• Maintain a repository of data architecture/applications.
• Help build data models for database structures, analytics and AI applications.
• Develop/enforce database development standards.
• Design methods for ingestion/storage/use of new data sources.
• Develop measures that ensure data accuracy/integrity/accessibility.
• 10 years’ experience with data modelling.
• Experience in the insurance domain.
• 7 to 10 years’ experience working on data warehousing or data integration projects.
• 5 years of experience creating data integration strategies.
• 5 years’ experience in the data analytics area.
• Hands-on experience in data and system analysis, including source-to-target entity/attribute mapping (see the brief sketch after this list).
• Good understanding of all aspects of Enterprise Data Management (traditional data warehousing and data lake concepts).
• Good exposure to Hadoop (Cloudera/EMR) and its associated ecosystem; ETL tools such as Informatica PowerCenter (PC) and BDM/DEI; and databases such as Oracle, PostgreSQL, and SQL Server.
• Exposure to AWS services from a solution architecture perspective.
• Experience building such solutions on the Hadoop platform.
• Ability to work with AVPs and VPs on a regular basis, in addition to the various delivery teams.
• Good written and oral communication skills.
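To make the source-to-target entity/attribute mapping item above concrete, here is a minimal, hypothetical sketch in Python; the table names, column names, and transformations are illustrative assumptions only and are not taken from this posting.

# Hypothetical sketch: names and transforms below are illustrative, not from the posting.
from datetime import date

# Source (table, column) -> target (table, column, transform) mapping.
SOURCE_TO_TARGET = {
    ("SRC_POLICY", "POL_NO"):   ("DW_POLICY", "policy_number", str.strip),
    ("SRC_POLICY", "EFF_DT"):   ("DW_POLICY", "effective_date", date.fromisoformat),
    ("SRC_POLICY", "PREM_AMT"): ("DW_POLICY", "premium_amount", float),
}

def map_record(source_table, record):
    """Apply the source-to-target attribute mapping to a single source record."""
    target = {}
    for (src_tbl, src_col), (tgt_tbl, tgt_col, transform) in SOURCE_TO_TARGET.items():
        if src_tbl == source_table and src_col in record:
            target.setdefault(tgt_tbl, {})[tgt_col] = transform(record[src_col])
    return target

# Example: one raw policy row mapped into the (hypothetical) warehouse structure.
print(map_record("SRC_POLICY", {"POL_NO": " P-1001 ", "EFF_DT": "2024-01-01", "PREM_AMT": "1250.00"}))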
Please list 5 “NICE TO HAVE” but not mandatory skills and experience for this requirement.
• Experience using the SAS Insurance Data Model for building LDMs (logical data models) and PDMs (physical data models)
• Experience with the Guidewire PolicyCenter, ClaimCenter, and BillingCenter data models
• Experience using Big Data platforms such as Hadoop