What are the responsibilities and job description for the DevOps Engineer position at Octo Career Center?
You…
As a DevOps Engineer, you will join the team deploying and delivering a cloud-based, multi-domain Common Data Fabric (CDF), which provides data-sharing services to the entire DoD Intelligence Community (IC). The CDF connects all IC data providers and consumers. It uses fully automated, policy-based access controls to create a machine-to-machine data brokerage service, enabling the transition away from legacy point-to-point solutions across the IC enterprise.
Us…
We were founded as a fresh alternative in the Government Consulting Community and are dedicated to the belief that results are a product of analytical thinking and agile design principles, and that solutions are built in collaboration with, not for, our customers. This mantra drives us to succeed and to act as true partners in advancing our clients' missions.
Program Mission…
The CDF program evolves the way DoD programs, services, and combat support agencies access data by providing data consumers (e.g., systems, app developers) with a one-stop shop for obtaining ISR data. The CDF significantly increases the DI2E's ability to meet the ISR needs of joint and combined task force commanders by providing enterprise data at scale. It serves as the scalable, modular, open architecture that enables interoperability for the collection, processing, exploitation, dissemination, and archiving of all forms and formats of intelligence data. Through the CDF, programs can easily share data and access new sources using their existing architecture. The CDF is a network- and end-user-agnostic capability that enables enterprise intelligence data sharing from sensor tasking to product dissemination.