What are the responsibilities and job description for the Data Warehouse Architect :: W2 Profiles Only :: Remote position at Aorton Inc?
Job Details
Role: Data Warehouse Architect
Location: Woodbridge, NJ or Boston, MA (4 days/week onsite)
Need someone at the architect level; not looking for developers
***Must have Snowflake, DBT, and Fivetran***
As a Data Warehouse Modernization Consultant, your primary responsibilities will include designing, deploying, and managing our data warehouse infrastructure using Snowflake, DBT, Fivetran, and other relevant technologies such as AWS Glue, Python, SageMaker, DB2, SQL Server, and Informatica. You will collaborate closely with cross-functional teams to ensure data integrity, optimize performance, and enhance scalability, all while contributing to data warehouse administration efforts.
Key Responsibilities:
- Design, implement, and maintain Snowflake data warehouse solutions to fulfill business requirements effectively.
- Deploy and optimize DBT models and transformations to streamline data processing and facilitate analytics.
- Manage Fivetran data pipelines to enable seamless data integration and synchronization.
- Utilize AWS Glue for data cataloging, executing ETL jobs, and performing data transformation operations (see the illustrative sketch after this list).
- Apply Python and SageMaker for advanced analytics, machine learning initiatives, and data science projects.
- Employ DB2, SQL Server, and Informatica as necessary for data migration, integration, and administration tasks.
- Collaborate with data engineers, analysts, and stakeholders to comprehend data needs and deliver actionable insights.
- Develop and enforce data governance and security protocols to uphold data quality and regulatory compliance.
- Monitor system performance, troubleshoot issues, and implement performance optimization strategies.
- Stay abreast of industry best practices and emerging technologies in data warehousing and cloud computing domains.
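Purely as an illustration of the AWS Glue responsibility above (and not part of the job description itself), here is a minimal Python sketch of starting and monitoring a Glue ETL job with boto3; the job name, region, and polling interval are hypothetical placeholders:

```python
import time

import boto3

# Hypothetical job name and region -- substitute values for your own environment.
GLUE_JOB_NAME = "nightly-warehouse-etl"

glue = boto3.client("glue", region_name="us-east-1")

# Kick off a run of an existing Glue ETL job.
run = glue.start_job_run(JobName=GLUE_JOB_NAME)
run_id = run["JobRunId"]

# Poll until the run reaches a terminal state, then report it.
while True:
    status = glue.get_job_run(JobName=GLUE_JOB_NAME, RunId=run_id)
    state = status["JobRun"]["JobRunState"]
    if state in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT", "ERROR"):
        print(f"Job run {run_id} finished with state {state}")
        break
    time.sleep(30)
```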
Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 5-6 years of proven experience with Snowflake, DBT, Fivetran, AWS Glue, Python, SageMaker, DB2, SQL Server, and Informatica in a data engineering or analytics role.
- Certifications in Snowflake, DBT, AWS Glue, Python, SageMaker, or related technologies.
- Knowledge of data governance frameworks and data privacy regulations (e.g., GDPR, CCPA).
- Strong SQL skills and experience with data modeling, ETL/ELT processes, and data pipeline management.
- Familiarity with cloud platforms such as AWS, Azure, or Google Cloud Platform.
- Experience with scripting languages (e.g., Python, Bash) for automation and orchestration tasks (see the sketch after this list).
- Excellent analytical, problem-solving, and communication skills.
- Ability to work independently and collaboratively in a fast-paced environment.
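As an illustrative sketch only of the scripting and SQL skills listed above, the following Python snippet connects to Snowflake and runs a simple row-count data-quality check; the connection settings, warehouse, and table name (STG_ORDERS) are placeholders, not details taken from this posting:

```python
import os

import snowflake.connector

# Placeholder connection details read from environment variables;
# none of these names come from the job description.
conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)

try:
    cur = conn.cursor()
    # Simple data-quality check: fail loudly if a staging table is empty.
    cur.execute("SELECT COUNT(*) FROM STG_ORDERS")
    (row_count,) = cur.fetchone()
    if row_count == 0:
        raise RuntimeError("STG_ORDERS is empty; an upstream sync may have failed")
    print(f"STG_ORDERS row count: {row_count}")
finally:
    conn.close()
```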