What are the responsibilities and job description for the Data Engineer position at Umanist Staffing?
Overview
The Data Engineer plays a crucial role in the organization by creating and maintaining the architecture for data ingestion, processing, and transformation. This role is essential for ensuring that data flows seamlessly from source systems to data repositories or data warehouses, enabling data-driven decision-making and analytics.
Key Responsibilities
- Design and implement scalable, reliable ETL processes that handle large volumes of data (see the sketch after this list).
- Develop and maintain data pipelines and workflows.
- Collaborate with data scientists and analysts to understand data requirements and provide the necessary infrastructure.
- Optimize data storage and retrieval for performance and cost-effectiveness.
- Manage and monitor data quality, integrity, and security.
- Build and maintain data infrastructure on cloud platforms such as AWS.
- Utilize data modeling techniques to ensure efficient storage and retrieval of data.
- Implement and optimize data processing algorithms and workflows using tools like Hadoop and Spark.
- Conduct performance tuning and troubleshooting of data processing systems.
- Document the data architecture, processes, and workflows.
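To give candidates a concrete sense of the pipeline work described above, here is a minimal ETL sketch in PySpark. It is illustrative only: the S3 paths, column names, and aggregation logic are hypothetical and not part of this job description.

```python
# Minimal illustrative ETL sketch (hypothetical paths and schema).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example-etl").getOrCreate()

# Extract: load raw events from a hypothetical landing zone.
raw = spark.read.csv(
    "s3://example-bucket/raw/events/", header=True, inferSchema=True
)

# Transform: drop malformed rows and aggregate events per user per day.
daily = (
    raw.dropna(subset=["user_id", "event_time"])
       .withColumn("event_date", F.to_date("event_time"))
       .groupBy("user_id", "event_date")
       .agg(F.count("*").alias("event_count"))
)

# Load: write partitioned Parquet to a hypothetical warehouse location.
daily.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://example-bucket/warehouse/daily_events/"
)

spark.stop()
```

In practice, a pipeline of this shape would typically run under an orchestrator on a schedule, with data-quality checks before the write step.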
Qualifications
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Proficiency in programming languages such as Python, Java, or Scala.
- Strong SQL skills for data querying and manipulation (a brief example follows this list).
- Experience with ETL tools and processes.
- Understanding of data modeling and database design principles.
- Experience working with big data technologies and frameworks.
- Hands-on experience with cloud platforms like AWS, Azure, or GCP.
- Familiarity with distributed computing tools such as Hadoop and Spark.
- Knowledge of data warehousing concepts and best practices.
- Ability to collaborate effectively with cross-functional teams and communicate technical concepts to non-technical stakeholders.
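As a rough indication of the SQL fluency expected, the short example below aggregates a made-up orders table using Python's built-in sqlite3 module; the table, schema, and data are invented for illustration.

```python
# Small, self-contained SQL querying example (hypothetical data).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, customer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, "acme", 120.0), (2, "acme", 80.0), (3, "globex", 45.5)],
)

# Aggregate revenue per customer, highest first.
rows = conn.execute(
    """
    SELECT customer, SUM(amount) AS revenue
    FROM orders
    GROUP BY customer
    ORDER BY revenue DESC
    """
).fetchall()

for customer, revenue in rows:
    print(customer, revenue)

conn.close()
```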
Salary: $50 - $65