What are the responsibilities and job description for the Java Solr/Spark Developer position at Protos IT?
Job Description
Title: Java Solr/Spark Developer
Location: Remote (candidates not on the East Coast are expected to work East Coast hours)
Duration: 6 months, with possible extension and possible conversion to FTE
Work schedule: Standard
Security clearance: Public Trust
Protos IT is hiring a Java Developer to join a team of database and application development professionals. The work involves the development, operations, and maintenance of enterprise systems for data collection from various channels, data enrichment, ETL, enterprise search, information retrieval, and analytics. This person will play a significant role in implementing, adopting, and evaluating technologies developed both in-house and by others.
The successful candidate will design, develop, operate, and maintain software applications for a government program, primarily using Java, Apache Solr, Spark, Flume, and SQL in both Windows and Linux environments. The developer will recommend and implement enhancements to the existing applications and operations. Software designs are documented and reviewed by the enterprise design review board. Automated unit tests are routinely developed and maintained as the software evolves. Where necessary, the user guide is created or updated to reflect changes in software functionality. Documentation, code, and unit tests are subject to peer review. The developer will be part of the configuration management team, critically reviewing deliverables produced by themselves and others. Occasional interaction with on-site customer representatives is part of the job.
Basic Qualifications
• Bachelor’s degree in computer science, engineering, or another science discipline
• 1 year of professional experience in Java/J2EE software development, including REST APIs/web services and data contracts using JSON/XML
• 1 year of DevOps experience with Apache Solr/Apache Spark or another enterprise search/Big Data platform
• Understand and leverage common architectural styles, design patterns, and modern industry practices
• Design, develop and deliver software components for production systems in agile sprint cycles.
• Understand and apply quality techniques and practices (automated unit testing, test-driven design/development, continuous integration/delivery)
• Basic administration and scripting experience in both Windows and Linux environments
• Maintain and operate deployed applications, including performance monitoring, tuning, and improvements
• Ability to obtain and maintain a Public Trust clearance
Preferred Qualifications
• Experience with large-scale data ingestion using Apache Spark and Flume/Kafka.
• Experience in relational database programming on Microsoft SQL Server, including T-SQL, stored procedures, and functions.
• Experience with AWS/Azure cloud services such as S3, IAM, EC2, CloudWatch, Azure Monitor, and Alerts.
• Experience with tools such as Azure DevOps, VS Code, Maven, Git, JIRA, JMeter, and Jenkins.
• Ability to write shell scripts and to perform automation, basic administration, and job scheduling in a Linux environment.
• Strong desire to learn and continually advance in Search, Big Data, Data Science, and Machine Learning technologies, and to apply them in software development.
• Knowledge of developing user interfaces and web pages.
Salary: $70 - $80