What are the responsibilities and job description for the Data Engineer position at InterWorks?
If you’re a passionate problem-solver who loves diving into complex data sets, possesses broad programming savvy and excels at building reliable data pipelines, you’ll love being a data engineer at InterWorks. As part of our Data team, you’ll empower our customers to make well-informed, data-driven decisions. Together, we’ll combine expert application of new software and technologies with a unique perspective on bridging the traditional Business-IT divide. Technical stuff aside, you’ll be part of a close-knit and helpful team that delivers nothing but the best work to our customers and each other.
Please note that this role must be based in the InterWorks Oklahoma City office in Oklahoma City, Oklahoma. Remote work or telecommuting arrangements are not available for this position.
Salary range, commensurate with experience and qualifications (Data Engineer/Data Architect): $80,000-$140,000
What You’ll Do
- Tackle diverse projects that range in duration from a few days to a few months for clients ranging from local businesses to the Fortune 500
- Design, develop and implement large-scale, high-volume, high-performance data infrastructure and pipelines for data lakes and data warehouses
- Design cloud-native data pipelines, automation routines and database schemas that support predictive and prescriptive machine learning
- Communicate ideas clearly, both verbally and through concise documentation, to various business sponsors, business analysts and technical resources
- Build and implement ETL frameworks to improve code quality and reliability
- Build and enforce common design patterns to increase code maintainability
- Work with disparate data sources (relational databases, flat files, Excel, HDFS/Big Data systems, high-performance analytical databases, etc.) to unify client data
What You’ll Need
Must-Haves:
- Excellent SQL fluency
- Strong ETL proficiency using GUI-based tools or code-based patterns
- Understanding of data-modeling principles
- Passion for delivering compelling solutions that exceed client expectations
- Excellent verbal and written communication
- Strong problem-solving skills
- Business acumen
- A thirst to learn
- Adaptability and flexibility in changing situations
What We’d Like You to Have
- Experience with software engineering practices
- Experience with modern data-engineering practices and frameworks
- Experience with integration from semi-structured file and API sources
- Matillion, Fivetran, dbt or other ETL tools
- AWS / Microsoft Azure cloud exposure
- Snowflake / Databricks / Amazon Redshift / Google BigQuery / Azure Synapse