What are the responsibilities and job description for the Data Engineer / Data Platform Engineer position at nFolks?
Job Description
Hi,
Data Platform Engineer
Duration: 6-12 months, with possible extensions
Location: Remote (PST time zone)
Responsibilities:
· Architect, create, and deploy necessary infrastructure within our hybrid-cloud environment to scale data applications and pipelines
· Own technical decision-making and engineering execution for the team as it relates to data
· Take on both engineering and DevOps responsibilities
· Understand current technical bottlenecks and champion solutions for long-term scalability
· Develop and integrate best practices to increase the quality and velocity of deployments
· Partner with engineering teams to evangelize and improve standardized Release Engineering processes
· Build and maintain data platform codebase and iterate over its lifecycle for scalability and adoption org-wide
· Work in an agile culture
Requirements:
· B.S./M.S. in Computer Science or an equivalent field, and 4 years of relevant experience in data platform engineering
· Expertise in designing, building, and maintaining Kubernetes clusters and containers as they relate to data pipelines and CI/CD processes
· Expertise in configuring and maintaining CI/CD pipelines (this role requires CI/CD processes for our AWS infrastructure-as-code using Terraform, dbt deployments, and our data platform)
· Experience with data orchestration tools such as Airflow
· Experience with public cloud providers such as Google Cloud, AWS, Azure (certifications are a plus)
· High fluency with Python and other common programming languages
· Experience with logging, monitoring and alerting systems and tools
· Excellent debugging skills in a Linux environment
· Experience with credential management in code
· Experience with version control systems (GitHub, Stash, etc.) and deployment tools
Preferred Platform Experience:
· Snowflake, BigQuery (or similar data warehouse)
· Apache Airflow (or similar orchestration tools)
· dbt and ELT paradigms
· Dataiku or SageMaker AI/ML platform
· Jenkins or similar CI/CD
· GitHub, GitLab, BitBucket, etc.
Candidate Filtering:
· Find candidates who have experience in cloud/hybrid deployments using Kubernetes and scalable cloud infrastructure.
· Experience with Airflow, AWS, dbt, data pipelines, and Snowflake (or a very similar stack) is a must.
· Experience with Dataiku/SageMaker is a huge plus
Sincerely,
HR Manager
nFolks Data Solutions LLC
Phone:  Email: