Job Description
Description:
Coolgradient is a fast-growing green-tech start-up with a clear vision to make our digital footprint more sustainable.
Why?
Every swipe, every like, every TikTok video we upload, every Zoom call we have, every Netflix video we watch, and everything we buy online is processed by, and depends on… data centers. These data centers, however, consume large amounts of energy to power the technology we use all day, every day.
How?
We have developed an AI-based analytics platform that captures the entire data center (DC) infrastructure—"from roof to room"—to bring the whole DC into a more optimal state. This platform saves energy, water, and scope 3 emissions while increasing reliability and sustainability in data centers across countries like Germany, France, the UK, Australia, and The Netherlands.
We are looking for a Senior Data / Analytics Engineer who can help us develop and scale our modern data warehouse, a.k.a. our 'data / analytics factory'.
About the Role
At Coolgradient, we're working to redefine the digital footprint by making data centers more energy efficient. As a Senior Data Engineer, you'll be instrumental in guiding our AI-based platform that optimizes data center infrastructure, ensuring reliable and sustainable operations. Your leadership will shape the architecture, standardize processes, and enhance data pipeline automation, enabling our team to deliver impactful data solutions that meet the evolving demands of the industry globally.
Perks:
- Working on making the world of data centers more sustainable.
- Enjoy a competitive salary, pension scheme, holiday allowance, and disability insurance.
- Our goal is to increase our impact and grow. We want you to grow with us and offer an Employee Stock Option Plan.
- A hybrid home-office-remote policy with flexible working hours: we value your regular presence so you can enjoy the team dynamics, but we support the flexibility that fits your daily rhythm and preferences.
- Lots of mobility options: we provide a public transportation subscription or a company bike, or we’ll reimburse your travel costs if you prefer to use your own means of transport.
- Lots of opportunities for professional growth and development.
- Join activities like meetups or (business) events.
- Work abroad with the team, where we combine good weather, a great environment, and good food.
- And above all: a fun and enthusiastic team that values a diverse and transparent culture.
Visa:
At our company, we highly appreciate and encourage diversity. We believe it is crucial to achieving success and being the responsible company we want to be. However, our company currently cannot sponsor any work visas.
Requirements:
What You’ll Do
- Architect Analytical Data Solutions: Design comprehensive data architecture patterns and automate data pipelines with a focus on scalability, performance, and integration of Data Science models.
- Design Data Models that scale: Design, develop, and optimize dimensional / 3NF / Data Vault data models to enhance the accuracy and utility of our data.
- Optimize our Data Warehouse: Design and optimize our Snowflake data warehouse to ensure efficient data storage and retrieval. Optimize queries and overall data pipeline performance to maintain system efficiency and responsiveness.
- Mentorship: Mentor and coach junior data engineers, fostering a culture of continuous learning and improvement.
- Collaborate Across Teams: Work closely with other departments to streamline data automation and ensure seamless data flow across the organization.
- Integrate MLOps: Guide the integration of MLOps with data pipelines, enhancing the platform's machine learning capabilities.
- Develop efficient ELT Pipeline patterns: Create templates for robust ELT pipelines using dbt to handle billions of records, ensuring high performance and scalability.
Job requirements
- Degree in Computer Science, Engineering or a related field.
- Ability to design data architectures, patterns, and automation workflows.
- At least 5 years of experience in data design and ETL/ELT pipeline building.
- Strong knowledge of dimensional data modelling.
- Proficiency in SQL, Python, and Jinja, using dbt.
- Advanced expertise in Snowflake and Azure cloud platforms.
- Ability to optimize Snowflake performance and to write SQL that scales.
- Problem-solving skills to standardize data processes across diverse data centers.
- Proactive attitude with a focus on continuous improvement.
- Ability to coach junior / medior team members effectively.
- Optional: experience with Data Vault modelling.
- Optional: ability to use Git, Visual Studio Code, and Jira for version control, development workflows, and task management.
- Optional: Experience or interest in graph technology (e.g., Neo4j).
Working Location
1062 HG, Kon. Wilhelminaplein 1, 2741 EA Waddinxveen, Netherlands