What are the responsibilities and job description for the Sr. Data Engineer position at HOLT CAT?
Job Summary: As a Senior Data Engineer on our Data Solutions team, you will have a chance to design intelligent, automated data systems, applying advanced data engineering knowledge in the data warehousing space to redefine best practices with a cloud-based approach to scalability and automation. In partnership with business intelligence analysts, you will work backwards from our business questions to build reliable, scalable data solutions that meet business needs. By scaling up our client’s data ecosystem around cloud-based CRM, ERP, and data platforms, we will improve speed, lower the total cost of ownership, and provide a unified view of entities.
This is an opportunity to revolutionize the industrial technology space by partnering with clients across the industrial, construction, heavy equipment, renewables, oil & gas, and manufacturing sectors. You will build new skills and strengthen your expertise through hands-on experience developing robust data pipelines and cloud-based data solutions for modern enterprise platforms such as Salesforce and Microsoft Dynamics 365.
The incumbent in this position is expected to model the following practices daily: 1) Demonstrate alignment with the company's mission and core business values; 2) Collaborate with key internal/external resources; 3) Participate in ongoing self-development.
Essential Functions:
- Develops, evaluates, and influences effective and consistent productivity and teamwork to ensure the delivery of Legendary Customer Service (LCS)
- Models, promotes, reinforces, and rewards the consistent use of HOLT’s Values Based Leadership (VBL) tools, models, and processes to ensure alignment with our Vision, Values, and Mission
- Design, develop, implement, test, document, and operate large-scale, high-volume, high-performance data structures for business intelligence analytics.
- Create and propose technical design documentation, which includes current and future ETL functionality, database objects affected, specifications, and flows and diagrams to detail the proposed implementation.
- Implement data structures using best practices in data modeling to provide on-line reporting and analysis using business intelligence tools and a logical abstraction layer against large, multi-dimensional datasets and multiple sources.
- Understand existing databases and warehouse structures in order to best determine how to consolidate and aggregate data in an efficient and scalable way.
- Design and code all aspects of data solutions using cloud-based tools to build out a data warehouse.
- Design ETL/ELT processes and data pipelines to bring data from various sources into a central data repository.
- Work closely with Integration Architects, Data Modelers, application teams, and vendors to develop optimal solutions.
- Analyze new/disparate data sources for integration with existing datasets to tell a comprehensive BI story.
- Improve business process agility and outcomes, drive innovation, and reduce time to market for our innovative IT solutions.
Knowledge, Skills, and Abilities:
- Capable of speaking articulately to a breadth of topics such as RDBMS, NoSQL, Azure data store technologies, ETL, data warehousing, data modeling, role-based access, etc.
- Background in supporting business intelligence teams by providing subject matter expertise and guidance in modern data engineering
- Advanced SQL and query performance tuning skills.
- Experience with at least one massively parallel processing (MPP) database architecture, such as Redshift, Azure Synapse, or Snowflake, for processing queries
- Experience building/operating highly available, distributed systems for the extraction, ingestion, and processing of large data sets
- Ability to work with ambiguous requirements and drive clarity by collaborating with business groups
- Drive and desire to learn and grow both technical and functional skill sets. High energy, stamina, enthusiasm, organization, and curiosity.
- Innovative thinker who is positive, proactive, and readily embraces change
- Detail-oriented individual with the ability to rapidly learn and take advantage of new concepts, tools, and technologies. Ability to quickly ramp up on new clients, their business needs and support engagements.
- “Get it done” mentality, with the ability to deliver on commitments regardless of the challenges thrown their way.
- Ability to manage workload and multiple priorities; excellent problem-solving and troubleshooting skills.
- Comfort working in a matrix environment and ability to foster motivation within the project team to meet tight deadlines.
Education and Experience:
- Bachelor’s degree in Information Technology or a related field preferred but not required
- 5 years of experience designing, developing, and deploying data solutions using BI technologies
- 2 years of coding experience with a modern programming or scripting language (Python, Scala, Java, C#, etc.)
- Experience developing/operating large-scale ETL/ELT processes on on-prem and cloud platforms; database technologies; data modeling
- Experience developing/operating highly available, distributed systems for the extraction, ingestion, and processing of large data sets from multiple systems
- Experience operating a very large data environment that may include a data lake, lakehouse, data warehouse, etc.
- Experience developing and implementing data models for data warehouse or related data processes for heavy equipment rental and dealership operations.
- Basic programming skills such as JavaScript and PowerShell scripting; experience with source control and build technologies (e.g., Azure DevOps, Git)
- Active Snowflake, Informatica, or Azure certifications
- Experience working in an Agile environment
Supervisory Responsibilities:
- None
Travel:
Up to 20%, with occasional overnight stays