What are the responsibilities and job description for the Snowflake Data Lake Architect position at RapidIT, Inc?
Job Description
Please send matching resumes to suresh(at)rapiditinc(dot)com or contact me at six-seven-eight-seven-four-one-nine-four-zero-four
Position: Snowflake Data Lake Architect
100% Remote
The Data Lake Architect will join the Enterprise Data Team and be responsible for the systems architecture, design, development, and administration of a large, scaled-out, real-time, high-performing, multi-tenant Snowflake data lake infrastructure, allowing Performance Food Group to leverage data effectively for internal and external data ingestion, stream processing, advanced analytics, and applications.
Essential Responsibilities:
As the Data Lake Architect, you will:
- Architect, design, develop, and administer a big data infrastructure platform, based primarily on Snowflake, ensuring that the infrastructure is highly available and secure.
- Architect and build a security-compliant user management framework for the multi-tenant big data platform.
- Work closely with management and data science teams to define and refine the big data platform to achieve company product and business objectives.
- Collaborate with other technology teams and architects to define and develop cross-functional technology stack interactions.
- Research and experiment with emerging technologies and tools related to data lake workloads.
- Work with cross-functional teams to establish and reinforce disciplined software development processes and best practices.
- Administer and support the data lake platform on a day-to-day basis.
- Be proficient with design patterns for data lake data ingestion as well as data lake governance.
- Work closely with the product management and development teams to rapidly translate customer data and requirements into products and solutions.
- Set architectural vision and direction across a matrix of teams.
- Be proficient in SQL and ETL processes, ETL and database performance tuning, table partitioning, shell scripting, and driving prototypes and POCs.
- Provide backup support to the DBA team supporting the enterprise data warehouse on Snowflake.
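To illustrate the SQL/ETL proficiency the role calls for, here is a minimal, self-contained sketch of an extract-transform-load step. It uses Python's standard-library sqlite3 purely as a stand-in for a Snowflake warehouse (the table names and sample rows are invented for illustration, not taken from the posting):

```python
import sqlite3

# Toy ETL sketch: sqlite3 stands in for a Snowflake warehouse here.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Extract: a raw landing table, as rows might arrive from ingestion.
cur.execute("CREATE TABLE raw_orders (id INTEGER, amount TEXT, region TEXT)")
cur.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [(1, "10.50", "east"), (2, "7.25", "east"), (3, "3.00", "west")],
)

# Transform + Load: cast text amounts to numeric and aggregate by region --
# the kind of step a scheduled ETL job would run into a curated table.
cur.execute(
    """
    CREATE TABLE region_totals AS
    SELECT region, SUM(CAST(amount AS REAL)) AS total
    FROM raw_orders
    GROUP BY region
    """
)

totals = dict(cur.execute("SELECT region, total FROM region_totals"))
print(totals)  # {'east': 17.75, 'west': 3.0}
```

In a real Snowflake deployment the same pattern would typically run through a connector or a tool such as Snowpipe rather than sqlite3; the sketch only shows the shape of the extract/transform/load steps.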