What are the responsibilities and job description for the Data Warehouse Specialist position at Zenith Services Inc.?
Role Summary & Description
The candidate will play a crucial role in ensuring the quality and reliability of our software products, with a focus on developing Azure-based applications, data warehouse and ETL back-end systems. Proficiency in data warehousing and Snowflake is essential for this role. The candidate must have a minimum of 8 years of relevant experience in data warehousing and database application development, with hands-on experience in Snowflake and Azure. Responsibilities include analyzing, designing, and coding Snowflake objects and views, and creating database views/materialized views, indexes, and partitions. Experience in shell scripting and Autosys/Control-M/Cron scheduling is required, along with the ability to communicate with business users and to understand and analyze requirements and use cases. Experience working in Agile teams, proven problem-solving, communication, interpersonal, and analytical skills, and the ability to work both independently and within a team are also expected.
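As a rough sketch of the kind of Snowflake object development this role describes (the SALES table and its columns, and the view names, are hypothetical placeholders, not part of the posting):

    -- Plain view: recomputed on every query
    CREATE OR REPLACE VIEW V_DAILY_SALES AS
    SELECT ORDER_DT, REGION, SUM(AMOUNT) AS TOTAL_AMOUNT
    FROM SALES
    GROUP BY ORDER_DT, REGION;

    -- Materialized view: results maintained by Snowflake and served from storage
    CREATE OR REPLACE MATERIALIZED VIEW MV_REGION_SALES AS
    SELECT REGION, SUM(AMOUNT) AS TOTAL_AMOUNT
    FROM SALES
    GROUP BY REGION;

    -- Snowflake has no traditional indexes or partitions; a clustering key
    -- on the usual filter column plays a similar role
    ALTER TABLE SALES CLUSTER BY (ORDER_DT);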
Core/Must-have skills
Good knowledge of data warehouse concepts and the different schema types, such as Star and Snowflake. Strong experience in writing complex queries, performance tuning, query optimization, and debugging. A minimum of 4 years of working experience in Snowflake, with good knowledge of Snowpipe and DBT. Hands-on experience in bulk loading and unloading data in Snowflake tables using the COPY command, and in handling large, complex data sets such as JSON, ORC, PARQUET, and CSV files from sources like AWS S3. Experience creating TASK and Snowpipe objects for scheduling and automating Snowflake jobs, with good exposure to Snowflake cloud architecture, SnowSQL, and Snowpipe for continuous data loading. Good communication and problem-solving skills, and the ability to analyze quickly and come up with an efficient, industry-standard solution for a given problem.
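As a minimal sketch of the bulk loading, Snowpipe, and TASK work listed above (the stage, table, pipe, task, and warehouse names and the S3 URL are assumptions made for illustration):

    -- External stage over an S3 bucket (URL and file format are placeholders)
    CREATE OR REPLACE STAGE RAW_ORDERS_STAGE
      URL = 's3://example-bucket/orders/'
      FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);

    -- One-off bulk load of staged files with COPY
    COPY INTO RAW_ORDERS
      FROM @RAW_ORDERS_STAGE
      ON_ERROR = 'CONTINUE';

    -- Unload query results back to the stage
    COPY INTO @RAW_ORDERS_STAGE/export/
      FROM (SELECT * FROM RAW_ORDERS WHERE ORDER_DT >= '2024-01-01');

    -- Snowpipe: continuously load new files as they land in S3
    -- (cloud event notification setup is omitted here)
    CREATE OR REPLACE PIPE RAW_ORDERS_PIPE
      AUTO_INGEST = TRUE
    AS
      COPY INTO RAW_ORDERS FROM @RAW_ORDERS_STAGE;

    -- TASK: scheduled job, e.g. an hourly refresh of a downstream table
    CREATE OR REPLACE TASK REFRESH_DAILY_SALES
      WAREHOUSE = ETL_WH
      SCHEDULE = '60 MINUTE'
    AS
      INSERT INTO DAILY_SALES
      SELECT ORDER_DT, REGION, SUM(AMOUNT)
      FROM RAW_ORDERS
      GROUP BY ORDER_DT, REGION;

    ALTER TASK REFRESH_DAILY_SALES RESUME;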
Good-to-have skills
Good knowledge of Oracle Database and proficiency in SQL performance tuning. Good knowledge of Big Data and Spark (good knowledge of Databricks).
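By way of illustration of the SQL performance tuning skill, a typical Oracle workflow is to inspect the execution plan for a slow query and add a supporting index where the plan shows an avoidable full table scan; the ORDERS table, its columns, and the index name below are hypothetical:

    -- Capture and display the optimizer's plan for a slow query
    EXPLAIN PLAN FOR
      SELECT CUSTOMER_ID, SUM(AMOUNT)
      FROM ORDERS
      WHERE ORDER_DT >= DATE '2024-01-01'
      GROUP BY CUSTOMER_ID;

    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);

    -- If the plan shows a full table scan on ORDERS, an index on the
    -- filter column may reduce the cost
    CREATE INDEX IX_ORDERS_ORDER_DT ON ORDERS (ORDER_DT);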