GCP Data Architect (Hadoop)
(USA - New Jersey)
Contract
Remote until Covid
• A Data Architect (Hadoop) with strong expertise in Big Data and the Hadoop ecosystem
• The Hadoop Data Architect is responsible for managing the full life cycle of a Hadoop solution, including requirements analysis, platform selection, design of the technical architecture, application design and development, testing, and deployment of the proposed solution
• Demonstrated skill and success in implementing technology projects within a professional environment, with a particular focus on data engineering
• Experienced with major big data technologies such as Hadoop, MapReduce, Hive, HBase, MongoDB, and Cassandra; experience with Impala, Oozie, Mahout, Flume, ZooKeeper, and/or Sqoop is also frequently required
• The Data Architect should present the proposed architecture to the various stakeholders (customer, server, network, security, and other teams) and persuade them of its merits
• The Hadoop Data Architect should provide technical and process leadership for projects, defining and documenting information integrations between systems and aligning project goals with reference architecture
• The Hadoop Data Architect should lead end-to-end Hadoop implementations in large enterprise environments
• Provide technical leadership and governance of the big data team and of the implementation of the solution architecture across the Hadoop ecosystem (MapReduce, Pig, Hive, HCatalog, Tez, Spark, HBase, Accumulo, Storm, Kafka, Flume, Falcon, Atlas, Oozie, Ambari, Hue; security: Kerberos, Ranger, Knox, Oracle ASO, HDFS encryption; hosting platform: GCP)
• Configure and tune production and development Hadoop environments and their various interoperating components
• GCP experience including, but not limited to, MPP systems, database systems, ETL and ELT systems, and Dataflow compute
Good-to-have skills
• Strong command of Google Cloud Platform data components: BigQuery, Bigtable, Cloud SQL, Dataproc, Dataflow, Data Fusion, etc.
• Experience building data-intensive applications in GCP, and prior experience developing, building, and deploying on GCP