What are the responsibilities and job description for the Scala Spark Developer position at DATAECONOMY?
Job Description
Role: Scala Spark Developer
Experience: 5 – 7 Years
Job Location: New York, NY
Job Description:
- Good working experience as a Scala/Spark developer/programmer
- Candidate should have Hadoop architectural knowledge
- Must have implemented data processing using Spark SQL and Scala on the Spark framework (see the sketch after this list)
- Hands-on experience with the Cloudera/Hortonworks platforms
- Good Spark architectural knowledge
- Any programming background (Scala preferred)
- Candidate should know how to execute queries in Hive and understand its architecture
- Knowledge of debugging Spark clusters running in YARN mode
- Able to pick up any framework on an as-needed basis
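The following is a minimal, hypothetical sketch of the kind of work described above: a Spark SQL job written in Scala with Hive support enabled. The application name, the table name (sales), and its columns are illustrative assumptions, not details from this posting.

```scala
import org.apache.spark.sql.SparkSession

object SalesSummary {
  def main(args: Array[String]): Unit = {
    // enableHiveSupport() lets Spark SQL read tables registered in the Hive metastore.
    val spark = SparkSession.builder()
      .appName("SalesSummary")
      .enableHiveSupport()
      .getOrCreate()

    // Hypothetical Hive table and columns, used purely for illustration.
    val summary = spark.sql(
      """SELECT region, SUM(amount) AS total_amount
        |FROM sales
        |GROUP BY region""".stripMargin)

    summary.show()
    spark.stop()
  }
}
```

A job like this would typically be packaged as a JAR and launched with spark-submit using --master yarn on a Cloudera/Hortonworks cluster, which is where the YARN-mode debugging mentioned above comes into play.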
Required Skills:
- Experience in Scala programming on the Spark framework
- Strong expertise in Hadoop architecture and Spark clusters
- Experience with Spark SQL, Cloudera, and Hive
- Strong communication and interpersonal skills
- Should be a strong team member