What are the responsibilities and job description for the DevOps Data Engineer position at Onebridge?
Onebridge is a consulting firm headquartered in Indianapolis, with clients dispersed throughout the United States and beyond. We have an exciting opportunity for a highly skilled DevOps Data Engineer to join an innovative and dynamic group of professionals as our next employee-owner.
This position is a long-term remote contract role operating on Eastern Time (EST). Candidates located in Indianapolis, IN, or regional to the Midwest will be given preference.
DevOps Data Engineer | About You
As a DevOps Data Engineer, you are responsible for developing and building a task scheduling process within Snowflake. You will identify production issues, implement integrations, and work with clients to develop end-to-end data-driven solutions. You possess impeccable data literacy skills, an impressive background in data engineering, and a proven ability to create and reverse engineer complex solutions. Critical thinking and advanced problem-solving skills are core behaviors among the team.
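At its core, building a task scheduling process in Snowflake means defining a DAG of dependent steps that run in order (child tasks declare which tasks they run AFTER). As a rough, hypothetical analogue in plain Python (not Snowflake's actual TASK syntax), the scheduling logic looks like this:

```python
from graphlib import TopologicalSorter  # stdlib topological ordering

# Hypothetical task DAG: each task maps to the set of tasks it depends on,
# mirroring how a Snowflake child task declares its predecessors.
dag = {
    "load_raw": set(),
    "clean": {"load_raw"},
    "aggregate": {"clean"},
    "publish": {"aggregate", "clean"},
}

def run_order(dag):
    """Return one valid execution order that respects all dependencies."""
    return list(TopologicalSorter(dag).static_order())

order = run_order(dag)
print(order)
```

In Snowflake itself, the equivalent structure is expressed with `CREATE TASK ... AFTER parent_task`, and the platform handles the ordering; the task names above are illustrative only.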
DevOps Data Engineer | Day-to-Day
- Participate in the workstream planning process including inception, technical design, data migration, and transformation to deliver end-to-end solutions.
- Work on the Snowflake Platform Scrum Team by developing and building task scheduling processes within Snowflake.
- Create and maintain process documentation, procedures, and policies.
- Leverage Airflow to coordinate, orchestrate, and develop data flows with Snowflake.
- Utilize best-in-class ETL and ELT practices to perform data conversions, imports, and exports within and between internal and external software systems.
- Integrate BI platforms with enterprise systems and applications.
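The ELT practice mentioned above (land raw data first, then transform it inside the warehouse with SQL) can be sketched with SQLite standing in for Snowflake; the table and column names are illustrative assumptions, not any client's schema:

```python
import sqlite3

# SQLite stands in for the warehouse here; in Snowflake the same pattern
# would use a raw staging schema plus SQL transformations on top of it.
conn = sqlite3.connect(":memory:")

# 1. Load: land raw records as-is (untyped strings, duplicates and all).
conn.execute("CREATE TABLE raw_orders (order_id TEXT, amount TEXT)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?)",
    [("1", "19.99"), ("2", "5.00"), ("2", "5.00")],  # note the duplicate row
)

# 2. Transform: clean, type, and deduplicate inside the warehouse with SQL.
conn.execute("""
    CREATE TABLE orders AS
    SELECT DISTINCT CAST(order_id AS INTEGER) AS order_id,
                    CAST(amount AS REAL)      AS amount
    FROM raw_orders
""")

rows = conn.execute(
    "SELECT order_id, amount FROM orders ORDER BY order_id"
).fetchall()
print(rows)
```

The design point of ELT (as opposed to ETL) is that step 2 runs on the warehouse's own compute, which is exactly where Snowflake excels.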
DevOps Data Engineer | Skills & Experience
- 8 years of progressive experience in Data Architecture and Data Engineering.
- Strong SQL skills, including query tuning and optimization.
- Expertise with Apache Airflow data engineering technology is required.
- Expert-level skills with containerization and orchestration tools such as Docker and/or Kubernetes are essential.
- Exposure to creating, executing, and troubleshooting complex, automated processes used to configure servers and/or deploy code.
- Outstanding comprehension of multiple cloud platforms and tools, such as Fivetran, Talend, Snowflake, AWS, Azure, and GCP.
- Scripting fluency in languages such as Ruby, PHP, Perl, and Python.
- Firsthand experience operating on a large Snowflake Migration project using SnowSQL.
100% Employee-Owned & a Best Place to Work in Indiana since 2015.
#DICE