Works closely with Analytics and Marketing users to understand informational needs and business challenges, document those requirements, and translate them into solutions. Partners with work stream leads to ensure overall cost, delivery, and quality targets are met. We work in a team environment; this role portrait represents the core responsibilities of the role, but other responsibilities may surface based on what needs to be done. A fluctuating work schedule, including nights and weekends, can be expected depending on system availability and deliverable dates. Requires frequent computer use at a workstation, up to two hours at a time or for extended periods.
Position Responsibilities
5 to 7 years of experience in data engineering or related technical work, including business intelligence and analytics
Experience designing and building scalable and robust data pipelines to enable data-driven decisions for the business
Strong understanding of data warehousing concepts and approaches
Hands-on experience building complex business logic and ETL workflows using Informatica IICS and PowerCenter
Proficiency in SQL and PL/SQL; Snowflake experience preferred
Experience in at least one scripting language: Python or Unix shell scripting
Experience streaming or ingesting data in near real time using technologies such as Kafka
Hands-on experience with cloud-native technologies such as Glue, Lambda, Kinesis, Lake Formation, S3, and Redshift
Experience using Spark on EMR, RDS, EC2, Athena, API capabilities, CloudWatch, and CloudTrail is a plus
Experience with business intelligence tools such as Tableau, Cognos, and ThoughtSpot is a plus
Hands-on experience building data lakes
Experience in data cleansing, data validation, and data wrangling
Experience monitoring and improving process performance as required
Strong verbal and written communication skills
Fundamental UNIX/Windows scripting skills