What are the responsibilities and job description for the ETL Developer position at High5?
ETL Developer
with AWS, Java and Spark experience
Location: Philadelphia, PA
UST Global® is looking for a talented and creative ETL Developer who takes responsibility and ownership in providing software solutions and contributing to the overall success of the team.
Responsibilities:
- Hands-on architecture and development of ETL pipelines using our internal framework written in Java (see the sample pipeline sketched after this list)
- Hands-on architecture of real-time REST APIs or other solutions for streaming data from Graph using Spark
- Interpret data, analyze results using statistical techniques and provide ongoing reports
- Develop and implement databases, data collection systems, data analytics and other strategies that optimize statistical efficiency and quality
- Acquire data from primary or secondary data sources and maintain databases/data systems
- Identify, analyze, and interpret trends or patterns in complex data sets
- Filter and clean data by reviewing reports and performance indicators to locate and correct problems
- Work with management to prioritize business and information needs
- Locate and define new process improvement opportunities
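To give a sense of the pipeline work described above, here is a minimal sketch of an ETL job in Spark's Java API. It is illustrative only, not the internal framework the posting refers to; the S3 paths, dataset, and column names are all hypothetical.

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

import static org.apache.spark.sql.functions.col;

// Minimal extract-transform-load sketch in Spark's Java API.
// Paths, column names, and the "events" schema are hypothetical examples,
// not part of the internal framework mentioned in the posting.
public class SampleEtlJob {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("sample-etl-job")
                .getOrCreate();

        // Extract: read raw JSON events from S3 (placeholder path).
        Dataset<Row> raw = spark.read().json("s3a://example-bucket/raw/events/");

        // Transform: drop malformed rows and keep only the fields
        // downstream consumers need.
        Dataset<Row> cleaned = raw
                .filter(col("eventId").isNotNull())
                .select(col("eventId"), col("userId"), col("eventType"), col("eventTime"));

        // Load: write the cleaned data back out as partitioned Parquet.
        cleaned.write()
                .mode("overwrite")
                .partitionBy("eventType")
                .parquet("s3a://example-bucket/clean/events/");

        spark.stop();
    }
}
```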
Requirements:
· At least 8 years of experience architecting and implementing complex ETL pipelines, preferably with the Spark toolset.
· At least 4 years of experience with Java, particularly within the data space.
· Technical expertise regarding data models, database design development, data mining and segmentation techniques
· Good experience writing complex SQL and ETL processes (a brief Spark SQL sketch follows this list)
· Excellent coding and design skills, particularly in Java/Scala and Python.
· Experience working with large data volumes, including processing, transforming and transporting large-scale data
· Experience in AWS technologies such as EC2, Redshift, CloudFormation, EMR, S3, and AWS Analytics required.
· Big-data technologies available on AWS, such as Hive, Presto, and Hadoop, required.
· AWS certification is preferable: AWS Developer/Architect/DevOps/Big Data
· Excellent working knowledge of Apache Hadoop, Apache Spark, Kafka, Scala, Python, etc.
· Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy
· Good understanding and use of algorithms and data structures
· Good experience building reusable frameworks.
· Experience working in an Agile Team environment.
· Excellent communication skills, both verbal and written
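As context for the "complex SQL and ETL processes" requirement above, the sketch below shows the common pattern of registering a dataset as a temporary view and aggregating it with Spark SQL from Java. The orders table, its columns, and the paths are invented for illustration.

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

// Illustrative sketch of complex SQL inside an ETL process:
// register a dataset as a temporary view, then aggregate it with Spark SQL.
// The "orders" table and its columns are hypothetical.
public class OrdersAggregation {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("orders-aggregation")
                .getOrCreate();

        Dataset<Row> orders = spark.read().parquet("s3a://example-bucket/clean/orders/");
        orders.createOrReplaceTempView("orders");

        // Windowed aggregate: each customer's spend per day plus a running total.
        Dataset<Row> dailySpend = spark.sql(
                "SELECT customer_id, order_date, "
              + "       SUM(amount) AS daily_spend, "
              + "       SUM(SUM(amount)) OVER (PARTITION BY customer_id ORDER BY order_date) AS running_total "
              + "FROM orders "
              + "GROUP BY customer_id, order_date");

        dailySpend.write().mode("overwrite").parquet("s3a://example-bucket/marts/daily_spend/");
        spark.stop();
    }
}
```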