Senior Data Engineer

Englewood, NJ | Full Time
POSTED ON 5/23/2024

Job Details

Client: NBCUniversal

Role: Sr Data Engineer

Location: Englewood Cliffs, NJ onsite 3 days/week

Visa: USC

Contract: W2

Duration: 6-12 month contract to hire

Interview: 2 rounds

  • Round 1: Phone call with a Senior Manager
  • Round 2: In-person technical interview with the team; live coding in Python and PySpark

Must-haves

  • 5-8 years of experience as a Data Engineer
  • Python and PySpark experience at an object-oriented programming (OOP) level (a minimal sketch follows this list)
  • Experience with Kafka for batch processing and/or streaming pipelines
    • Comparable experience with Google Cloud Platform Dataproc, Google Cloud Platform Pub/Sub, RabbitMQ, Redis on AWS, or Spark Streaming is acceptable
  • Proficiency with AWS, including Lambda, S3, SQS, SNS, and Kinesis
  • AWS EMR or Databricks experience
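
As a hedged illustration of what OOP-level Python and PySpark work might look like (for example, in the live coding round), here is a minimal sketch of a reusable batch transformation wrapped in a class. The class name, column names, paths, and dedup logic are invented for the example and are not details from this posting.

from pyspark.sql import DataFrame, SparkSession
from pyspark.sql import functions as F
from pyspark.sql.window import Window


class DailyEventDeduplicator:
    """Hypothetical batch transform: keep the latest record per key within a daily partition."""

    def __init__(self, key_col: str = "event_id", ts_col: str = "event_ts"):
        self.key_col = key_col
        self.ts_col = ts_col

    def transform(self, df: DataFrame) -> DataFrame:
        # Rank records per key by timestamp, newest first, and keep only the newest.
        w = Window.partitionBy(self.key_col).orderBy(F.col(self.ts_col).desc())
        return (
            df.withColumn("_rn", F.row_number().over(w))
            .filter(F.col("_rn") == 1)
            .drop("_rn")
        )


if __name__ == "__main__":
    spark = SparkSession.builder.appName("example-batch-job").getOrCreate()
    # Paths are placeholders; a real job would follow the client's S3 layout and scheduling.
    events = spark.read.parquet("s3a://example-bucket/raw/events/date=2024-05-23/")
    deduped = DailyEventDeduplicator().transform(events)
    deduped.write.mode("overwrite").parquet("s3a://example-bucket/curated/events/date=2024-05-23/")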

Plus

  • Airflow DAG development and related AWS services
  • AWS Certification
  • Lead experience and ability to mentor junior developers

Day-to-Day

A large media client is seeking a Data Engineer with strong Python and AWS expertise to join its Data Engineering team. The Data Engineer will play a crucial role in designing, implementing, and maintaining scalable data pipelines and solutions on the AWS platform. The work involves handling large volumes of diverse data, ensuring its accuracy, and supporting both batch processing and real-time streaming. Day to day, the role centers on building and optimizing pipelines that process terabytes of data at varying increments, including monthly, weekly, daily, and hourly, spanning both scheduled batch jobs and real-time streaming pipelines. The Data Engineer will work closely with cross-functional teams (Cyber, Cloud Ops, AWS, Databricks, partners, and vendors) to investigate and resolve technical issues, and will use Terraform to define infrastructure as code for AWS services including, but not limited to, Lambda, SQS, SNS, S3, and Airflow.
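
As a rough illustration of the kind of streaming pipeline described above, the sketch below reads JSON events from a Kafka topic with PySpark Structured Streaming and lands them in S3 as date-partitioned Parquet. The broker address, topic name, schema, bucket, and checkpoint path are all hypothetical placeholders, not details from this posting.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

# Hypothetical event schema; a real payload would be defined by the client's data contracts.
event_schema = StructType([
    StructField("event_id", StringType()),
    StructField("event_type", StringType()),
    StructField("event_ts", TimestampType()),
])

spark = SparkSession.builder.appName("example-streaming-pipeline").getOrCreate()

# Read raw events from a Kafka topic (broker and topic are placeholders).
raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "example-events")
    .load()
)

# Kafka delivers the payload as bytes; parse the JSON value into typed columns.
events = (
    raw.select(F.from_json(F.col("value").cast("string"), event_schema).alias("e"))
    .select("e.*")
    .withColumn("event_date", F.to_date("event_ts"))
)

# Write to S3 as date-partitioned Parquet; bucket and checkpoint location are placeholders.
query = (
    events.writeStream.format("parquet")
    .option("path", "s3a://example-bucket/events/")
    .option("checkpointLocation", "s3a://example-bucket/checkpoints/events/")
    .partitionBy("event_date")
    .trigger(processingTime="1 minute")
    .start()
)
query.awaitTermination()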
