What are the responsibilities and job description for the Snowflake Data Architect position at Tredence Inc.?
Role: Snowflake Data Architect
Experience: 10 years
Location: Pittsburgh
About Tredence:
Tredence focuses on last-mile delivery of powerful insights into profitable actions by uniting its strengths in business analytics, data science, and software engineering. The largest companies across industries are engaging with us and deploying their prediction and optimization solutions at scale. Headquartered in the San Francisco Bay Area, we serve clients in the US, Canada, Europe, and Southeast Asia.
We are seeking an experienced data scientist who, beyond the required mathematical and statistical expertise, also possesses the natural curiosity and creative mind to ask questions, connect the dots, and uncover hidden opportunities, with the ultimate goal of realizing the data’s full potential.
Primary Roles and Responsibilities:
Data Source Identification: Supports the understanding of the priority order of requirements and service level agreements. Helps identify the most suitable source for data that is fit for purpose. Performs initial data quality checks on extracted data.
Problem Formulation: Translates business problems within one's discipline into data-related or mathematical solutions. Identifies which methods (for example, analytics, big data analytics, automation) would provide a solution for the problem, using tools and technologies such as cloud technology (Snowflake, Azure) and big data technologies (Oracle, MySQL, Spark, Hive). Shares use cases and gives examples to demonstrate how the method would solve the business problem.
Data Strategy: Understands, articulates, interprets, and applies the principles of the defined strategy to unique, moderately complex business problems that may span one or more functions or domains.
Data Quality Management: Promotes and educates others on data quality awareness. Profiles, analyzes, and assesses data quality. Tests and validates data quality requirements. Continuously measures and monitors data quality. Delivers against data quality service level agreements. Manages operational Data Quality Management procedures. Manages data quality issues and leads data cleansing activities to remove data quality defects, improve data quality, and eliminate unused data.
Data Visualization: Generates appropriate graphical representations of data and model outcomes. Understands customer requirements to design appropriate data representations for multiple data sets. Works with User Experience designers and User Interface engineers as required to build front-end applications.
Applied Business Acumen: Provides recommendations to business stakeholders to solve complex business issues. Develops business cases for projects with a projected return on investment or cost savings. Translates business requirements into projects, activities, and tasks and aligns to overall business strategy.
Exploratory Data Analysis: Collects and tabulates data and evaluates results to determine accuracy, validity, and applicability. Supports the identification and application of statistical techniques based on requirements. Applies suitable techniques under direction from leadership.
Assists in the planning, design, and implementation of exploratory data analysis research projects. Understands existing statistical models and identifies and recommends statistical models based on hypotheses.
Monitors the performance of individual contributors and helps manage their professional development.
Provides thought leadership and coaching within the data engineering practice.
Manages overall risk, mitigating risks and issues with the client and leadership, and works with leadership on resource planning.
Responsibilities and Qualifications:
Must have a total of 8 years in IT and 3 years in data warehouse, ETL, BI, and reporting projects.
Must have experience with at least two end-to-end implementations of the Snowflake cloud data warehouse and three end-to-end on-premise data warehouse implementations, preferably on Oracle.
Must have experience working with MicroStrategy, Tableau, and other reporting tools.
Expertise in Snowflake: data modelling, ELT using Snowflake SQL, implementing complex stored procedures, and standard DWH and ETL concepts.
Expertise in advanced Snowflake concepts such as setting up resource monitors, RBAC controls, virtual warehouse sizing, query performance tuning, zero-copy cloning, and time travel, and an understanding of how to use these features (see the first SQL sketch after this list).
Expertise in deploying Snowflake features such as data sharing, events, and lakehouse patterns.
Hands-on experience with Snowflake utilities such as SnowSQL and Snowpipe, and with big data modelling techniques using Python.
Experience with data migration from an RDBMS to the Snowflake cloud data warehouse (see the staging and Snowpipe sketch after this list).
Deep understanding of relational as well as NoSQL data stores, methods, and approaches (star and snowflake schemas, dimensional modelling).
Experience with data security and data access controls and design
Experience with Azure data storage and management technologies.
Ability to build processes supporting data transformation, data structures, metadata, dependency, and workload management.
Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, and troubleshooting.
Ability to resolve an extensive range of complicated data-pipeline-related problems, both proactively and as issues surface.
Must have expertise in the Azure platform.
Snowflake cloud data warehouse architect certification (desirable).
Should be able to troubleshoot problems across infrastructure, platform, and application domains.
Must have experience with Agile development methodologies.
Strong communication skills; effective and persuasive in both written and oral communication.
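For illustration only, and not part of the formal requirements: a minimal Snowflake SQL sketch of several of the advanced features named above (resource monitors, RBAC grants, virtual warehouse sizing, zero-copy cloning, and time travel). All object names (monthly_cap, analytics_wh, reporting_role, sales_db.public.orders) are hypothetical.

    -- Resource monitor capping credit spend, attached to a sized virtual warehouse
    CREATE OR REPLACE RESOURCE MONITOR monthly_cap
      WITH CREDIT_QUOTA = 100
      TRIGGERS ON 90 PERCENT DO NOTIFY
               ON 100 PERCENT DO SUSPEND;

    CREATE WAREHOUSE IF NOT EXISTS analytics_wh
      WAREHOUSE_SIZE = 'MEDIUM'
      AUTO_SUSPEND = 300
      AUTO_RESUME = TRUE
      RESOURCE_MONITOR = monthly_cap;

    -- RBAC: grant a reporting role read-only access to one schema
    CREATE ROLE IF NOT EXISTS reporting_role;
    GRANT USAGE ON DATABASE sales_db TO ROLE reporting_role;
    GRANT USAGE ON SCHEMA sales_db.public TO ROLE reporting_role;
    GRANT SELECT ON ALL TABLES IN SCHEMA sales_db.public TO ROLE reporting_role;

    -- Zero-copy clone: metadata-only copy of a table for development or testing
    CREATE TABLE sales_db.public.orders_dev CLONE sales_db.public.orders;

    -- Time travel: query the table as it existed one hour ago
    SELECT COUNT(*) FROM sales_db.public.orders AT (OFFSET => -3600);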
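Likewise, a sketch of the staged bulk load plus Snowpipe pattern that the data migration and Snowpipe items refer to. It assumes source-RDBMS exports land as CSV files in Azure Blob Storage behind a pre-created storage integration named azure_int; all other object names are hypothetical.

    -- External stage over the exported files (assumes storage integration azure_int exists)
    CREATE STAGE IF NOT EXISTS sales_db.public.migration_stage
      URL = 'azure://myaccount.blob.core.windows.net/exports/'
      STORAGE_INTEGRATION = azure_int;

    CREATE FILE FORMAT IF NOT EXISTS sales_db.public.csv_fmt
      TYPE = 'CSV'
      FIELD_OPTIONALLY_ENCLOSED_BY = '"'
      SKIP_HEADER = 1;

    -- One-time bulk load of the historical data
    COPY INTO sales_db.public.orders
      FROM @sales_db.public.migration_stage/orders/
      FILE_FORMAT = (FORMAT_NAME = 'sales_db.public.csv_fmt')
      ON_ERROR = 'ABORT_STATEMENT';

    -- Snowpipe for incremental files arriving after cutover; it can be invoked via the
    -- Snowpipe REST API, or configured for auto-ingest with a notification integration
    CREATE PIPE IF NOT EXISTS sales_db.public.orders_pipe AS
      COPY INTO sales_db.public.orders
        FROM @sales_db.public.migration_stage/orders/
        FILE_FORMAT = (FORMAT_NAME = 'sales_db.public.csv_fmt');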
Tredence is an equal opportunity employer. We celebrate and support diversity and are committed to creating an inclusive environment for all employees.
Visit our website for more details: https://www.tredence.com/