Experienced Data Engineer - Data Analytics and Pipeline Development Specialist for Innovative Data Solutions at blithequark
Introduction to blithequark
At blithequark, we are revolutionizing the way data is used to drive business decisions. As a leader in the retail industry, we recognize the importance of harnessing the power of data to stay ahead of the curve. Our IT department is responsible for the technical future of blithequark, and we are committed to providing a family-oriented, employee-driven environment where our team members can thrive and succeed. With operations in fourteen countries, we are proud to be ranked seventh on Forbes' "World's Best Employers" list.

Job Overview
We are seeking an experienced Data Engineer - Data Analytics to join our team. As a Data Engineer, you will be responsible for developing and operationalizing data pipelines to make data available for consumption (reports and advanced analysis). This includes data ingestion, data transformation, data validation/quality, data pipeline optimization, and coordination and collaboration with DevOps engineers during CI/CD. The ideal candidate will have a strong foundation in programming and SQL, as well as expertise in data storage, data modeling, cloud platforms, data warehousing, and data lakes.

Key Responsibilities
- Design, build, and deliver automated data pipelines from a variety of internal and external data sources (see the sketch after this list)
- Collaborate with product owners, engineering, and data platform teams to design, build, test, and automate data pipelines that are relied upon across the organization as the single source of truth
- Develop and maintain optimal data pipeline architecture
- Identify, design, and implement internal process improvements: automating manual processes and optimizing data delivery
- Implement big data and NoSQL solutions by developing scalable data processing platforms to drive high-value insights for the organization
- Support the development of data dictionaries and data taxonomy for product offerings
- Demonstrate a strong understanding of coding and programming concepts used to build data pipelines (e.g., data transformation, data quality, data integration)
- Build data models with a Data Modeler and develop data pipelines that store data in the defined models and formats
- Demonstrate a strong understanding of data integration methods and tools (Extract, Transform, Load (ETL) / Extract, Load, Transform (ELT)) and database architecture
- Identify ways to improve data reliability, efficiency, and quality of data management
- Lead ad-hoc data retrieval for business reports and dashboards
- Review the integrity of data from various sources
- Manage database design, including installing and upgrading software and maintaining documentation
- Monitor database activity and resource utilization
- Perform peer reviews of other Data Engineers' work
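To make the day-to-day work concrete, here is a minimal sketch of the kind of automated pipeline described above: batch ingestion of a flat file, a few transformations, a simple data-quality gate, and a write to the data lake. It uses PySpark (as you might run on Azure Databricks); the paths, column names, and the "orders" dataset are hypothetical placeholders, not part of this posting.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_pipeline").getOrCreate()

# Ingest: read a delimited flat file from the landing zone (path is a placeholder).
raw = spark.read.option("header", True).csv("/landing/orders/*.csv")

# Transform: cast types, derive a partition column, and drop duplicate keys.
clean = (
    raw.withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("amount", F.col("amount").cast("decimal(12,2)"))
       .withColumn("order_date", F.to_date("order_ts"))
       .dropDuplicates(["order_id"])
)

# Validate: a simple data-quality gate that aborts the load on missing keys.
bad_rows = clean.filter(F.col("order_id").isNull() | F.col("amount").isNull()).count()
if bad_rows > 0:
    raise ValueError(f"{bad_rows} rows failed validation; aborting load")

# Load: write to the curated zone of the data lake, partitioned for downstream consumers.
clean.write.mode("overwrite").partitionBy("order_date").parquet("/curated/orders")
```

In practice a job like this would be parameterized and scheduled (for example from Azure Data Factory), but the ingest-transform-validate-load shape is the core of the role.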
Essential Qualifications
To be successful in this role, you will need:
- 3+ years of experience designing and operationalizing data pipelines with large and complex datasets
- 3+ years of hands-on experience with Informatica PowerCenter
- 3+ years of experience in data modeling, ETL, and data warehousing
- 3+ years of hands-on experience with Informatica IICS
- 3+ years of experience working with cloud technologies such as ADLS, Azure Data Factory, Azure Databricks, Delta Live Tables, Spark, Azure Synapse, Cosmos DB, and other big data technologies
- Broad experience working with various data sources (SQL and Oracle databases, flat files (CSV, delimited), Web APIs, XML)
- Advanced SQL skills: a strong understanding of relational databases and business data, and the ability to write complex SQL queries against multiple data sources
- Strong understanding of database storage concepts (data lakes, relational databases, NoSQL, graph databases, data warehousing)
- Ability to work in a fast-paced agile development environment
- Scheduling flexibility to meet the needs of the business, including weekends, holidays, and 24/7 on-call duties on a rotational basis

Preferred Qualifications
While not required, the following qualifications are preferred:
- BA/BS in Computer Science, Engineering, or equivalent programming/services experience
- Azure certifications
- Experience implementing data integration strategies such as ETL and event/message-based integration (Kafka, Azure Event Hubs) (see the streaming sketch after this list)
- Experience with Git/Azure DevOps
- Experience delivering data solutions through agile software development methodologies
- Familiarity with the retail industry
- Excellent verbal and written communication skills
- Experience working with SAP integration tools, including BODS
- Experience with the UC4 job scheduler
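As an illustration of the event/message-based integration mentioned above, here is a minimal Structured Streaming sketch that reads change events from a Kafka-compatible endpoint (Azure Event Hubs exposes one) and lands them in the lake. The broker address, topic, schema, and paths are placeholders; authentication options and the Spark Kafka connector package, which the cluster would need, are omitted for brevity.

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("price_events_stream").getOrCreate()

# Hypothetical event payload; the real schema would come from the producing team.
schema = StructType([
    StructField("sku", StringType()),
    StructField("store_id", StringType()),
    StructField("price", DoubleType()),
])

# Read events from a Kafka-compatible endpoint (broker and topic are placeholders;
# SASL/auth settings for Azure Event Hubs are not shown).
events = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker:9092")
         .option("subscribe", "price-updates")
         .load()
         .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
         .select("e.*")
)

# Land the parsed events in the lake, with a checkpoint for progress tracking and recovery.
query = (
    events.writeStream.format("parquet")
          .option("path", "/curated/price_updates")
          .option("checkpointLocation", "/checkpoints/price_updates")
          .outputMode("append")
          .start()
)
query.awaitTermination()
```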
Career Growth Opportunities and Learning Benefits
At blithequark, we are committed to the growth and development of our team members. As a Data Engineer, you will work on complex and challenging projects alongside experienced professionals in the field. You will also have access to training and development programs, including mentorship, workshops, and conferences, to help you stay up to date with the latest technologies and trends in the industry.

Work Environment and Company Culture
Our work environment is dynamic and fast-paced, with a focus on innovation and collaboration. We believe in fostering a culture of openness, transparency, and respect, where everyone feels valued and empowered to contribute. We offer a range of benefits, including flexible working hours, remote work options, and a comprehensive compensation package.

Compensation, Perks, and Benefits
We offer a competitive salary of $26 per hour, as well as a range of benefits, including:
- Comprehensive health insurance
- Retirement savings plan
- Paid time off and holidays
- Professional development opportunities
- Flexible working hours and remote work options

Conclusion
If you are a motivated and experienced Data Engineer looking for a new challenge, we encourage you to apply for this exciting opportunity at blithequark. As a leader in the retail industry, we offer a dynamic and supportive work environment with opportunities for growth and development. Don't miss out on this chance to take your career to the next level. Apply today, and let's work together to shape the future of data analytics!