What are the responsibilities and job description for the DevOps Engineer position at IO Datasphere?
Job Description
IO Datasphere, Inc. has been providing project management, software development and IT staff augmentation services to our clients in the Midwest and throughout the U.S. since 1996. We are an approved vendor on contract to provide software development and IT staff augmentation services for the States of Illinois, Michigan, Minnesota, Iowa, and Wisconsin. We also provide these services to businesses, as well as local and county governments, in the Midwest and nationwide.
Our client is looking for a DevOps Engineer to implement technologies that support the data strategy.
Tasks
- 30% - Implement technologies that support the data strategy
- Develop and implement the Data Lake/Lakehouse architecture while providing guidance on best practices.
- 30% - Define and implement data pipelines that enable analytics and operational dashboards
- Design and develop high-quality batch and real-time data pipelines at a senior level.
- 20% - Define and implement data engineering standards and best practices that improve quality and efficiency
- Define and promote data engineering standards, best practices, and design patterns.
- 10% - Define and implement data engineering workflows that improve quality and reduce the time to release new features
- Design and implement CI/CD workflows and pipelines.
- 10% - Ensure the team can support ongoing development and operations of new technology
- Mentor other members of the team in new technology and skills.
Required Soft Skills
- Setting best practices for data engineering
- Mentorship of team of data engineers
- Clear and proactive communication - verbal/written
- Taking initiative and working autonomously with some direction
Note: This position is 100% Remote. Wisconsin residency not required.
Contract: 6 Months
Skills Required:
- 2-3 years - Experience with Azure Data Factory
- 2-3 years - Experience creating ELT pipelines in Azure Synapse Analytics and Databricks
- 2-3 years - Experience with languages including Python, SQL
- Experience with Spark notebooks
- Experience with Lakehouse Architecture using medallion design (bronze, silver, gold)
- Experience with CI/CD pipelines in Azure
- Experience implementing an environment from the ground up
- Experience addressing data quality and profiling
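For candidates unfamiliar with the medallion (bronze/silver/gold) design named above, a minimal sketch of the idea follows. It uses plain Python dictionaries in place of lake tables, so the layer names and field names are illustrative assumptions, not the client's actual Synapse/Databricks implementation:

```python
# Toy illustration of medallion layering: bronze = raw as-ingested data,
# silver = cleaned/validated, gold = aggregated business-ready view.
# All record shapes here are hypothetical examples.

bronze = [  # raw records, possibly containing bad rows
    {"order_id": "1", "customer": "acme", "amount": "100.50"},
    {"order_id": "2", "customer": "acme", "amount": "not-a-number"},
    {"order_id": "3", "customer": "globex", "amount": "75.00"},
]

def to_silver(rows):
    """Clean and validate: cast amounts to float, drop unparsable rows."""
    silver = []
    for row in rows:
        try:
            silver.append({**row, "amount": float(row["amount"])})
        except ValueError:
            continue  # bad records are dropped (or quarantined) at silver
    return silver

def to_gold(rows):
    """Aggregate to a business-ready view: total spend per customer."""
    totals = {}
    for row in rows:
        totals[row["customer"]] = totals.get(row["customer"], 0.0) + row["amount"]
    return totals

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'acme': 100.5, 'globex': 75.0}
```

In a real Lakehouse each layer would be a persisted table (e.g. Delta), with the silver step also handling deduplication and schema enforcement; the shape of the flow is the same.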
***Rate depends on experience
*** Local candidates preferred, but not mandatory
*** Candidates authorized to work in the US are encouraged to apply. We are not sponsoring H1B candidates at this time.
*** Companies submitting candidates should only submit direct W2 employees for this position. This includes H1B visa candidates and therefore your company must be the visa sponsor.
Please submit your resume by using the "URL" below