Job Posting for Data Engineer at Brilliant Infotech Inc.
Job Details
Role: Data Engineer
Type of Employment: Contract for 6 months with possibility of extension
Location: NJ / NY
Description:
Inbound data integrations, e.g., source data replication, ETL, and view creation for dashboards and data feeds
Outbound data integrations, e.g., flat file data feeds to S3/SFTP, Data Shares, etc.
This person will be responsible for working with Client data architects to design, build, and test data integrations; provide support; and ensure meticulous documentation of the solutions provided. The ideal candidate will have a strong background in enterprise data warehouse architecture, experience with the Snowflake Cloud Data Warehouse, and proficiency in DBT ETL processes.
ESSENTIAL DUTIES AND RESPONSIBILITIES:
Design and deliver end-to-end data integration solutions with Enterprise Cloud Data Warehouse.
Replicate data from a variety of internal and external data sources into the Snowflake Data Warehouse raw layer.
Collaborate with cross-functional teams to understand business requirements and translate them into effective data models.
Experience with AWS Services (S3, Event Bridge, Cloudwatch, etc.) is required.
Experience with Snowflake Cloud Data Warehouse Platform is required:
Experience using Snowflake capabilities such as Snowpipe, Tasks, Streams, Stored Procedures, and Data Replication.
Conform to CLIENT best practices for data auditing, user access and data security.
Support client project deliverables:
Support the implementation and integration of data solutions for clients.
Support data mapping activities to align business requirements with data warehouse structures.
Design and implement ETL processes:
Experience using DBT ETL is required.
Experience using Qlik Replicate and Compose ETL is optional (less preferred).
Experience using CI/CD pipelines and code repositories such as Bitbucket is preferred.
Develop and implement efficient ETL processes to support data integration from different engagement platforms.
Optimize data workflows for performance and scalability.
Demonstrate proficiency in SQL; Python scripting skills are a valuable addition:
Write and optimize SQL queries for data extraction, transformation, and loading.
Python skills for automation and scripting are desirable.
Knowledge of Life Sciences or Healthcare business processes is advantageous.
Familiarity with Patient Services and Specialty Pharmacy systems and business processes is a plus.
Exhibit strong documentation abilities:
Familiarity with SDLC processes and documentation.
Experience using work management and documentation platforms such as Jira and Confluence is a plus.
Document requirements and conduct peer reviews.
Ensure documentation is accessible and understandable for both technical and non-technical stakeholders.
MINIMUM KNOWLEDGE, SKILLS AND ABILITIES:
At least 3 years of experience with data warehouse architecture design and/or data engineering.
At least 2 years of experience working with the Snowflake cloud data warehouse, including AWS and DBT.
Knowledge of or experience with any ETL tool: Informatica, DataStage, Talend, or SSIS.
Strong problem-solving and analytical abilities.
Ability to design, execute and communicate solutions to different stakeholders.
Experience with AWS and Python scripting.
Bachelor's or advanced degree in Computer Science, Information Technology, or a related field.
For immediate consideration and interviews, please apply here or reply with a copy of your resume to