What are the responsibilities and job description for the Data Analytics Engineer Senior - IM Enterprise Data position at Energy Jobline?
Description
Summary:
A Data and Analytics Engineer Senior is responsible for the development, expansion, and maintenance of data pipelines across the data ecosystem. A Data and Analytics Engineer uses programming skills to develop, customize, and manage integration tools, databases, warehouses, and analytical systems.
The Data and Analytics Engineer Senior is responsible for implementing optimal solutions to integrate, store, process, and analyze very large data sets. This includes an understanding of methodology, specifications, programming, delivery, monitoring, and support standards.
The individual must have extensive knowledge of designing and developing data pipelines and delivering advanced analytics with open-source Big Data processing frameworks such as Hadoop. The individual must have proven competency in programming using distributed computing principles.
This role will support Data Management and Analytics objectives to deliver high-quality, contemporary, best-in-class solutions across CHRISTUS Health.
Responsibilities
- Meets expectations of the applicable OneCHRISTUS Competencies: Leader of Self, Leader of Others, or Leader of Leaders.
- Responsible for analyzing and understanding data sources, participating in requirement gathering, and providing insights and guidance on data technology and data modeling best practices.
- Analyze ideas as well as business and functional requirements to formulate a design strategy.
- Take the lead in drawing up a workable application design and coding parameters with essential functionalities.
- Collaborate with team members to identify and address issues by implementing viable technical solutions that are time- and cost-effective without compromising performance quality.
- Develop code following industry best practices and adhere to organizational development rules and standards.
- Participate in the evaluation of proposed system acquisitions or solution development and provide input to the decision-making process regarding compatibility, cost, resource requirements, operations, and maintenance.
- Integrate software components, subsystems, facilities, and services into the existing technical systems environment; assess the impact on other systems and work with cross-functional teams within Information Services to ensure positive project impact. Install, configure, and verify the operation of software components.
- Participates in the development of standards, design, and implementation of proactive processes to collect and report data and statistics on assigned systems.
- Participates in the research, design, development, and implementation of applications, databases, and interfaces using the technology platforms provided.
- Research, design, implement, and manage programs.
- Fix problems arising across the test cycles and continuously improve the quality of deliverables.
- Document each phase of development for future reference and maintenance operations.
- Software Development Life Cycle and process skills
- Algorithm and Data Structure skills
- Critical and analytical thinking skills
- The candidate must also have experience with large-scale data lake and data warehouse implementations and demonstrate proficiency in open-source technologies, for example, Python, Spark, Hive, HDFS, NiFi, etc.
- The candidate will additionally demonstrate substantial experience and deep knowledge of data mining techniques and of relational and non-relational databases.
- Experience with data integration and ETL techniques using Big Data processing frameworks such as Spark, MapReduce, and HDFS, with Python or R.
- Experience with Big Data querying tools, such as Hive and Impala.
- Experience with building stream-processing systems using solutions such as NiFi or Spark Streaming is preferred.
- Good understanding of Lambda Architecture
- Advanced SQL programming techniques for data integration and consumption using MPP and columnar databases.
- Solid understanding of the BI and analytics landscape, preferably in large-scale development environments.
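Several of the skills above center on the MapReduce programming model that frameworks like Hadoop and Spark distribute across a cluster. Purely as an illustration of the underlying idea (a single-process sketch in plain Python; the function names here are hypothetical, not from any framework), the map, shuffle, and reduce phases of a word count look like:

```python
from collections import defaultdict
from itertools import chain

# Illustrative single-process MapReduce: the map step emits (key, value)
# pairs, a shuffle groups them by key, and the reduce step folds each group.

def map_step(line):
    """Map: emit (word, 1) for every word in a line."""
    return [(word.lower(), 1) for word in line.split()]

def shuffle(pairs):
    """Shuffle: group all emitted values by key."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_step(key, values):
    """Reduce: fold one key's values into a single count."""
    return key, sum(values)

def word_count(lines):
    pairs = chain.from_iterable(map_step(line) for line in lines)
    return dict(reduce_step(k, v) for k, v in shuffle(pairs).items())

counts = word_count(["big data big pipelines", "data pipelines"])
print(counts)  # {'big': 2, 'data': 2, 'pipelines': 2}
```

In a real deployment the map and reduce steps run in parallel on many nodes and the shuffle moves data over the network; the cluster framework, not the application code, handles that distribution.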
Requirements
Bachelor’s degree in Computer Science, Engineering, Math, or a related field is required.
Master’s degree is preferred.
Minimum of eight (8) years of experience in MapReduce and Spark programming.
Minimum of eight (8) years of experience developing analytics solutions with large data sets within OLAP, MPP, and columnar architectures.
Minimum of eight (8) years of experience with the design, architecture, and development of enterprise-scale platforms built on open-source frameworks.
Three (3) years of Healthcare IT experience is preferred.
Five (5) years of experience working in a Microsoft SQL Server environment is preferred.
Certifications in Hadoop or Java are a plus.
Work Schedule
TBD
Work Type
Full Time
EEO is the law.
We endeavor to make this site accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at (phone number removed).