What are the responsibilities and job description for the ETL Developer position at SAIC Career Site?
Description
SAIC is seeking experienced, results-oriented, mission-driven ETL Developers with specialized focus on Data Engineering to perform data model design, data formatting, and ETL development optimized for efficient storage, access, and computation in support of National Security objectives.
Responsibilities include, but are not limited to:
- Active participant on Agile teams responsible for increasing innovation capacity while driving the velocity of data ingestion and data analysis development.
- Responsible for synchronizing efforts with other tasks in assembling data technologies that control the flow of data from source to value, with the goal of accelerating the derivation of value and insight.
- The ideal candidate will have a passion for unlocking the secrets held by a dataset, along with a solid understanding of and experience with developing, automating, and enhancing all parts of the data pipeline, including ingestion, processing, storage, and exposing data for consumption.
- The Data Engineer also implements data quality tests and focuses on improving inefficient tooling and adopting new transformative technologies, all while maintaining operational continuity.
Qualifications
Required:
- Active TS/SCI clearance with polygraph
- Bachelor's degree in Computer Science, Information Systems, or Engineering (additional years of experience may be substituted for the degree)
- 14 years of overall professional experience with a Bachelor's degree; 12 years with a Master's degree; or 9 years with a PhD
- 3 years of hands-on development experience using Python to ETL data
- 3 years of experience using and ingesting data into SQL and NoSQL database systems
- ETL experience, including formats such as XML, JSON, and YAML; normalizing data; and high-volume data ingestion
- Familiarity with the NEXIS platform
Desired:
- Experience with Apache NiFi
- Experience with Databricks
- Experience programming in Apache Spark, PySpark, or Java
- Familiarity with building containerized services (e.g., via Docker)
- Familiarity with data conditioning
- Experience developing and maintaining data processing flows
- Experience with Amazon Web Services (AWS)
- Experience with CI/CD pipelines
- Experience with Agile methodologies and the Kanban framework
- Experience with relational databases, including the use of MySQL and/or Oracle for designing database schemas
- Experience with Linux, REST services, and HTTP
Covid Policy: SAIC does not require COVID-19 vaccinations or boosters. Customer site vaccination requirements must be followed when work is performed at a customer site.