What are the responsibilities and job description for the Data Engineer position at Altar'd State?
Objective
Who Are We?
Altar’d State is a rapidly growing women’s fashion brand with more than 100 boutiques throughout the country. We are a place of respite for the modern-day woman, offering a distinctive shopping experience with the latest fashion finds, the most sought-after clothing and accessories, and delightful home decor. At the heart of our brand is our mission to change the world. A portion of every purchase is donated to various philanthropic organizations on a local and global scale. We strive to uplift and inspire others to join our movement to stand out. for good. Our culture is Passionate, Committed, Entrepreneurial, Caring, Driven, Confident and Trusting.
Our Mission
“Stand Out. For Good.” At Altar’d State, those four words are more than just our motto; they are why we exist. From Mission Mondays, where 10% of our net proceeds go directly to local charitable organizations, to our long-standing partnership with Coprodeli USA, through which we are building 22 schools for impoverished communities in Peru, we are committed to giving back in meaningful ways to those in need.
The Role
As Altar’d State continues its mission to Stand Out For Good, we are looking for our next great Data Engineer. The Data Engineer is responsible for designing, implementing, and maintaining data pipelines and integration solutions in a rapidly growing retail environment. This individual will be a member of an agile Scrum team that carries software application integration and data warehousing projects through the full software development life cycle (SDLC). The ideal candidate has delivered integration and/or database solutions and is looking to move up to the next level of enterprise automation with intelligent data pipelines and a data warehouse running on cloud infrastructure.
Job Requirements
Major Responsibilities
- Build data pipelines to integrate applications using batch and near-real-time methods
- Create data models and document or catalog them
- Build ETL pipelines to populate data warehouse facts and dimensions (a minimal sketch follows this list)
- Construct pipelines that expose web services as APIs for integrations
- Write custom code for software integrations as needed
- Monitor and maintain integration pipelines
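To illustrate the kind of batch ETL work described above, here is a minimal sketch, not a description of Altar’d State’s actual stack: it pulls orders from a hypothetical REST endpoint, flattens them into fact-table rows, and loads them over a generic SQL connection, with sqlite3 standing in for a cloud warehouse driver. The endpoint, table, and column names are assumptions for illustration only.

```python
"""
Minimal batch ETL sketch: extract orders from a hypothetical REST API,
transform them into a fact-table shape, and load them into a warehouse.
sqlite3 stands in for the warehouse connection; in practice this would be
a cloud warehouse driver behind the same SQL.
"""
import sqlite3

import requests

API_URL = "https://example.com/api/orders"  # hypothetical source endpoint


def extract(since: str) -> list[dict]:
    # Pull orders created since the last successful run.
    resp = requests.get(API_URL, params={"created_after": since}, timeout=30)
    resp.raise_for_status()
    return resp.json()["orders"]  # assumed response shape


def transform(orders: list[dict]) -> list[tuple]:
    # Flatten nested JSON into rows matching the fact_orders table.
    return [
        (o["id"], o["customer"]["id"], o["store_id"], float(o["total"]), o["created_at"])
        for o in orders
    ]


def load(rows: list[tuple], conn: sqlite3.Connection) -> None:
    # Idempotent upsert so reruns of the same batch do not duplicate facts.
    conn.executemany(
        "INSERT OR REPLACE INTO fact_orders "
        "(order_id, customer_id, store_id, total, created_at) VALUES (?, ?, ?, ?, ?)",
        rows,
    )
    conn.commit()


if __name__ == "__main__":
    conn = sqlite3.connect("warehouse.db")
    conn.execute(
        "CREATE TABLE IF NOT EXISTS fact_orders "
        "(order_id TEXT PRIMARY KEY, customer_id TEXT, store_id TEXT, total REAL, created_at TEXT)"
    )
    load(transform(extract(since="2024-01-01T00:00:00Z")), conn)
```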
Required Experience
- 3-10 years of experience working with databases and data integration
- Experience reading and writing JSON, SOAP, XML, and CSV data formats
- Able to construct web service requests and process web service responses (see the sketch after this list)
- Experience with web service security protocols such as OAuth2, OpenID Connect, JWT, or SAML
- Able to construct and execute SQL queries
- Experience authoring Python scripts
- Demonstrated ability to test data pipelines and validate data
- Excellent organizational, analytical, written and interpersonal communication skills
- Ability to handle information of sensitive and confidential nature in a professional manner
- Ability to analyze information and evaluate results to choose the best solution to solve problems
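As a rough illustration of the web service and OAuth2 experience listed above, the sketch below obtains a bearer token via the client-credentials grant and reads a JSON response from a protected API. The token URL, client credentials, API endpoint, and response shape are placeholders, not any specific vendor's values.

```python
"""
Sketch of an OAuth2 client-credentials call: obtain a bearer token, then
request JSON from a protected API. All URLs and credentials are placeholders.
"""
import requests

TOKEN_URL = "https://auth.example.com/oauth2/token"  # placeholder
API_URL = "https://api.example.com/v1/products"      # placeholder


def get_token(client_id: str, client_secret: str) -> str:
    # Standard client-credentials grant using HTTP Basic auth for the client.
    resp = requests.post(
        TOKEN_URL,
        data={"grant_type": "client_credentials"},
        auth=(client_id, client_secret),
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["access_token"]


def fetch_products(token: str) -> list[dict]:
    # Send the bearer token and parse the JSON body.
    resp = requests.get(
        API_URL,
        headers={"Authorization": f"Bearer {token}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["items"]  # assumed response shape


if __name__ == "__main__":
    token = get_token("my-client-id", "my-client-secret")  # placeholder credentials
    for product in fetch_products(token):
        print(product["sku"], product["name"])
```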
Qualifications
Preferred Experience
- Experience with iPaaS and integration tools such as (but not limited to) SnapLogic, Dell Boomi, MuleSoft Anypoint, Zapier, Jitterbit, Oracle Data Integrator, and Oracle Integration Cloud
- Apache Spark or PySpark programming (see the sketch after this list)
- Microsoft Azure Cloud Storage
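For the Spark/PySpark item above, a tiny illustrative job is sketched below: it reads raw CSV extracts and rolls them up into a daily sales summary written as partitioned Parquet. File paths and column names are assumptions, and a local Spark installation is required to run it.

```python
"""
Tiny PySpark sketch: read raw CSV sales data and aggregate it into a
daily sales summary. Paths and column names are illustrative only.
"""
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_sales_rollup").getOrCreate()

# Read raw extracts; inferSchema keeps the example short (declare a schema in real jobs).
sales = spark.read.csv("raw/sales/*.csv", header=True, inferSchema=True)

daily = (
    sales
    .withColumn("sale_date", F.to_date("sold_at"))
    .groupBy("store_id", "sale_date")
    .agg(F.sum("amount").alias("total_sales"), F.count("*").alias("order_count"))
)

# Write partitioned Parquet, a common landing format for warehouse loads.
daily.write.mode("overwrite").partitionBy("sale_date").parquet("curated/daily_sales")
```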
Education
- Bachelor’s Degree in Computer Science, Engineering, or Information Technology