What are the responsibilities and job description for the Data Engineer - ETL, SQL, Python, Hadoop position at ICONMA, LLC?
Description:
Data Engineer:
This role is responsible for leading the transformation of the enterprise ETL platforms, which requires engagement across multiple lines of business.
Key responsibilities include setting up automations, coordinating delivery, providing visibility into program health, and managing program risks.
This role facilitates sync points between business and technology leaders across the organization, as well as with Risk and Compliance partners.
Individuals in this role also ensure delivery meets the client's expectations in terms of target outcomes, timeline, and cost.
Develop and deliver data ETL solutions to accomplish technology and business goals.
Code solutions to ingest, curate, aggregate, integrate, clean, transform, and control data in operational and/or analytics data systems per the defined acceptance criteria (see the ETL sketch following this list).
Assemble large, complex data sets to meet functional reporting requirements.
Build processes supporting data transformation, data structures, metadata, data quality controls, dependency, and workload management.
Define and build reporting applications that enable better data-informed decision-making.
Contribute to existing test suites (integration, regression, performance), analyze test reports, identify any test issues/errors, and triage the underlying cause.
Document and communicate required information for deployment, maintenance, support, and business functionality.
Work closely with business partners to help translate functional requirements into technical approach, design, and decisions.
Create SQL schema objects, complex attributes and metrics, and conditional and level metrics, and define their use within reports (see the metrics sketch following this list).
Develop Excel reports, Tableau dashboards, and third-party application report integrations.
Apply SQL, Python, Hadoop/Hive, data warehouse concepts and architecture, dimensional modeling, and ETL solution design.
Tune and optimize query performance for large datasets (cubes, caching, and aggregate structures) within MicroStrategy, Tableau, and various RDBMS and Hadoop backend systems (see the tuning sketch following this list).
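The ingest/clean/transform/load responsibility above can be pictured with a minimal PySpark sketch. This is illustrative only, assuming a Hive-enabled Spark environment; the database, table, and column names (raw_db.transactions, curated_db.daily_totals, account_id, and so on) are hypothetical, not taken from the posting.

```python
# Minimal ETL sketch: ingest a raw source, clean and transform it, and load
# a curated Hive table. Assumes a Hive-enabled Spark environment; all table
# and column names below are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder
    .appName("daily-etl-sketch")
    .enableHiveSupport()
    .getOrCreate()
)

# Ingest: read the raw source table registered in the Hive metastore.
raw = spark.table("raw_db.transactions")

# Clean: drop rows missing required keys and normalize the amount column.
clean = (
    raw.dropna(subset=["account_id", "txn_date"])
       .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
)

# Transform/aggregate: daily totals per account.
daily = (
    clean.groupBy("account_id", "txn_date")
         .agg(F.sum("amount").alias("total_amount"),
              F.count("*").alias("txn_count"))
)

# Load: overwrite the curated target table.
daily.write.mode("overwrite").saveAsTable("curated_db.daily_totals")
```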
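The "conditional and level metrics" item can be sketched in Hive-compatible SQL run through Spark: a conditional metric is an aggregate restricted by a condition, and a level metric is an aggregate fixed at a coarser level and repeated on finer-grained rows. The names (curated_db.transactions_detail, status, region) are hypothetical.

```python
# Sketch of a conditional metric (SUM restricted by a CASE condition) and a
# level metric (a region-level total repeated on each account row via a
# window function). Table and column names are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.enableHiveSupport().getOrCreate()

report = spark.sql("""
    SELECT region,
           account_id,
           revenue,
           -- Level metric: region-level total repeated on each account row,
           -- so each account's share of its region can be derived in a report.
           SUM(revenue) OVER (PARTITION BY region) AS region_revenue
    FROM (
        SELECT region,
               account_id,
               -- Conditional metric: only completed transactions count.
               SUM(CASE WHEN status = 'COMPLETED' THEN amount ELSE 0 END)
                   AS revenue
        FROM curated_db.transactions_detail
        GROUP BY region, account_id
    ) per_account
""")
report.show()
```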
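For the tuning item, one common pattern (a sketch under stated assumptions, not the employer's prescribed method) is to pre-aggregate a large fact table and partition the result, so BI tools such as MicroStrategy or Tableau scan a small summary structure and prune partitions instead of reading full detail. Names are again hypothetical.

```python
# Tuning sketch: build a partitioned, pre-aggregated summary table so
# dashboard queries filtered to a date range prune partitions rather than
# scanning the full fact table. Hive-enabled Spark assumed; names hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.enableHiveSupport().getOrCreate()

detail = spark.table("curated_db.transactions_detail")

summary = (
    detail.groupBy("txn_date", "region", "product")
          .agg(F.sum("amount").alias("total_amount"),
               F.count("*").alias("txn_count"))
)

# Cache only if the summary feeds several outputs in this job; otherwise the
# extra memory pressure is not worth it.
summary.cache()

(summary.write
        .mode("overwrite")
        .partitionBy("txn_date")
        .saveAsTable("curated_db.transactions_summary"))
```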
Requirements:
Bachelor's degree or equivalent in Computer Science, Computer Information Systems, Management Information Systems, Engineering (any), or a related field; and
5 years of progressively responsible experience in the job offered or a related IT occupation, including:
Working closely with business partners to help translate functional requirements into technical approach, design, and decisions, with Banking & Markets business acumen;
Creating SQL schema objects, complex attributes and metrics, and conditional and level metrics, and defining their use within reports;
Developing Excel reports, Tableau dashboards, and third-party application report integrations;
Using SQL, Python, Hadoop/Hive, data warehouse concepts and architecture, dimensional modeling, and ETL solution design; and
Tuning and optimizing query performance for large datasets (cubes, caching, and aggregate structures) within MicroStrategy, Tableau, and various RDBMS and Hadoop backend systems.
Top MUST Have:
JIRA/Agile Delivery
qTest/Client-ALM
SQL, Python, Hive
As an equal opportunity employer, ICONMA prides itself on creating an employment environment that supports and encourages the abilities of all persons regardless of race, color, gender, age, sexual orientation, citizenship, or disability.