What are the responsibilities and job description for the Data Engineer position at SSi People?
Title: Senior Data Engineer
Duration: 6-month contract with possibility of extension
Locations: Kent, WA / Merritt Island, FL / Huntsville, AL (Hybrid; 3 days per week)
Responsibilities:
- Federate Data Ingestion and Curation: Lead the efforts to streamline data ingestion and curation processes, ensuring efficient and scalable data handling within the Palantir Foundry platform.
- Hydrate Ontology: Spearhead the development and maintenance of the ontology, facilitating a structured and intuitive understanding of the data landscape.
- Define Data Pipeline Standards: Establish common standards for data pipelines, focusing on data quality, RBAC, and data classification to maintain integrity and security.
- Develop Data Patterns and Standards: Create and implement patterns and standards for change data capture (CDC), incremental loads, and real-time data streaming, enhancing the platform's responsiveness and flexibility (see the incremental-load sketch after this list).
- Accelerate Delivery and Self-Service: Drive the acceleration of data pipeline delivery and promote data self-service consumption, enabling faster decision-making and innovation.
- DevOps Practices: Design and implement DevOps practices for data operations, ensuring continuous integration, continuous delivery, and automated testing are at the core of the data engineering workflow (see the CI test sketch after this list).
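The CDC and incremental-load responsibility is easiest to picture with a small example. Below is a minimal PySpark sketch of a watermark-based incremental load; it is illustrative only, and the S3 paths and the updated_at column are hypothetical placeholders rather than anything specified in the posting.

```python
# Minimal sketch of a watermark-based incremental load in PySpark.
# Paths and the `updated_at` column are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("incremental_load_sketch").getOrCreate()

# Determine the last processed watermark from the curated (target) dataset.
target = spark.read.parquet("s3://example-bucket/curated/orders")
last_watermark = target.agg(F.max("updated_at")).collect()[0][0]

# Read only the rows that changed in the source since the last load.
source = spark.read.parquet("s3://example-bucket/raw/orders")
changes = source.where(F.col("updated_at") > F.lit(last_watermark))

# Append the delta to the curated zone; a MERGE/upsert would replace this
# step on platforms that support it (e.g. Delta Lake on Databricks).
changes.write.mode("append").parquet("s3://example-bucket/curated/orders")
```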
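Likewise, the DevOps responsibility implies automated data tests running in CI. The following is a minimal pytest sketch against a local Spark session; dedupe_orders is a hypothetical stand-in for a real pipeline transformation, not a function named in the posting.

```python
# Minimal sketch of an automated data test suitable for CI (pytest + local Spark).
# `dedupe_orders` is a hypothetical stand-in for a real pipeline transformation.
import pytest
from pyspark.sql import SparkSession


@pytest.fixture(scope="module")
def spark():
    return (
        SparkSession.builder.master("local[1]")
        .appName("data_ci_tests")
        .getOrCreate()
    )


def dedupe_orders(df):
    # Transformation under test: keep one row per order_id.
    return df.dropDuplicates(["order_id"])


def test_dedupe_orders_removes_duplicates(spark):
    df = spark.createDataFrame(
        [(1, "new"), (1, "new"), (2, "shipped")], ["order_id", "status"]
    )
    assert dedupe_orders(df).count() == 2
```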
Technology Stack Experience:
- Programming Languages - Python, PySpark, Java, SQL
- Databases - Postgres, Oracle
- Cloud Services – AWS, Azure, GCP (storage, compute, databases, catalogs, streaming, replication, queueing & notification, logging & monitoring services) – experience with any one of the cloud platforms is acceptable, but AWS is preferred.
- Big Data Frameworks
- Metadata Catalogs – DataHub, Informatica, Collibra (experience in any one is good)
- Data Quality Platforms – Anomalo, Informatica, BigEye (experience in any one is good)
- Event Platforms – Kafka, MSK
- Data Platforms – Palantir Foundry, Databricks, Snowflake (experience in any one is good but Foundry experience will be a bonus)
Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
- 8-12 years of experience in data engineering and data management, with a strong focus on data platforms such as Databricks, Snowflake, or Palantir Foundry.
- Extensive knowledge of data ingestion, ETL processes, and data modeling.
- Strong problem-solving skills and the ability to work with complex data integration challenges.
- Experience with data reconciliation, data pipeline and data monitoring, and performance optimization.