What are the responsibilities and job description for the Data Engineer position at Delviom LLC?
Job Description
Data Engineer
Bethlehem, PA, Holmdel, NJ or New York, NY (Hybrid)
Contract to hire; the client will keep the contractor for a year or longer before converting to permanent.
They want the candidate to work out of the New York, NY, Holmdel, NJ, or Bethlehem, PA office on a hybrid basis. They will consider a fully remote candidate if they are very strong, but the candidate must live reasonably close to one of those three locations because they will be needed onsite 2-3 days per month.
Location: Ideally, we are looking for a candidate who can work hybrid at one of these locations: Bethlehem, PA; Holmdel, NJ; or New York, NY. We will consider remote candidates; however, they MUST be willing to work onsite a few times per month as needed. EST business hours are required, and the preference is for the individual to live on the East Coast so they can easily come to one of the offices each month.
Photo IDs will be required before any interviews are scheduled. This is due to multiple instances we have encountered where the candidate who interviews for the role is not the same person who shows up on assignment. Any candidate who cannot attend a video interview will be disqualified immediately.
Job Description:
The Client is seeking an experienced Data Engineer to be part of our Data and Analytics organization. You will play a key role in building and delivering best-in-class data and analytics solutions aimed at creating value and impact for the organization and our customers. As a member of the data engineering team, you will help develop and deliver Data Products with quality backed by best-in-class engineering. You will collaborate with analytics partners, business partners, and IT partners to enable the solutions.
You will:
• Architect, build, and maintain scalable data and analytics pipelines for machine learning models, reports, dashboards, and other analytics solutions.
• Design, develop, and implement low-latency, high-availability, and performant data applications, and recommend and implement innovative engineering solutions.
• Design, develop, test, and debug code in Java, Python, and other languages per Guardian standards to improve business processes.
• Design and develop code to create reliable and scalable data pipelines that can support various types of consumers, including products like a Customer Data Platform.
• Build and maintain integrations with different SaaS tools as required to activate customer-centric data and insights.
• Apply and provide guidance on software engineering techniques such as design patterns, code refactoring, framework design, code reusability, code versioning, performance optimization, and continuous integration and deployment (CI/CD) to make the data analytics team robust and efficient.
• Perform all job functions consistent with Guardian policies and procedures, including those which govern the handling of PHI and PII.
• Collaborate with cross-functional teams to identify the highest-value activities for the Advanced Data and Analytics group.
• Work closely with various IT and business teams to understand systems opportunities and constraints for maximally utilizing Guardian Enterprise Data Infrastructure.
• Develop relationships with business team members by being proactive, displaying an increasing understanding of the business processes and by recommending innovative solutions.
• Communicate project output in terms of customer value, business objectives, and product opportunity.
You have:
• 5 years of experience and a Bachelor's or Master's degree in Computer Science, Engineering, Applied Mathematics, or a related field.
• Extensive hands-on development experience in one or more programming languages such as Java or Python, and familiarity with SQL and Bash.
• Extensive experience in all stages of software development and expertise in applying software engineering best practices.
• Familiarity with building and deploying scalable data pipelines to develop and deploy data solutions using Python, SQL, PySpark, and Java.
• Familiarity with designing and developing backend RESTful web services (APIs) using a microservices architecture to generate JSON/XML responses using Java Spring or Python.
• Experience in developing APIs using Python or Java to operationalize Machine Learning models and data assets.
• Experience with any of the backend frameworks like Spring Integration, Spring MVC etc.
• Experience integrating databases with the backend layer using Spring DAO/JPA or equivalent Python frameworks.
• Familiarity with API gateways like Apigee to secure web service endpoints.
• Familiarity with concurrency and parallelism.
• Extensive experience in SQL querying and optimization.
• Familiarity with data pipelines and the ML development cycle.
• Experience creating and configuring continuous integration/continuous deployment (CI/CD) pipelines to build and deploy applications in various environments, and using DevOps best practices to migrate code to the production environment.
• Familiarity with creating and modifying tables to store data in OLAP databases like Hive, and with utilizing data warehouse platforms like AWS Redshift and Databricks.
• Ability to investigate and repair application defects in any component (front end, business logic, middleware, or database) to improve code quality and consistency, reduce delays, and identify bottlenecks or gaps in the implementation.
• Experience with unit testing frameworks like Mockito, PowerMock, and JUnit for a test-driven development approach for logic implemented in Java. Ability to write unit tests in Python using a unit testing library such as pytest.
Additional Qualifications (nice to have):
• Experience building a Customer 360 solution, either in a data warehouse or using products like a Customer Data Platform.
• Experience building data pipelines supporting customer-centric marketing use cases, and experience building integrations with marketing and engagement tools to activate customer insights.
• Familiarity with identity resolution concepts, and with either building custom identity resolution solutions or using an identity resolution product.
Release Comments: A hybrid resource is preferred, but remote candidates will be accepted; however, remote candidates MUST be available to work onsite as needed, which could be anywhere from 1-3 times per month.