Job Description
Python
SQL
Data Warehouse
Kafka
dbt
Snowflake
Azure Cloud

Description:

Coolgradient is a fast-growing green-tech start-up with a clear vision to make our digital footprint more sustainable.


Why?

Every swipe, every like, every TikTok video we upload, every Zoom call we have, every Netflix video we watch, and everything we buy online is processed by… data centers. These data centers consume large amounts of energy because of the underlying technology we use all day, every day.


How?

We have developed an AI-based analytics platform that captures the entire data center (DC) infrastructure—"from roof to room"—to bring the whole DC into a more optimal state. This platform saves energy, water, and Scope 3 emissions while increasing reliability and sustainability in data centers across countries such as Germany, France, the UK, Australia, and the Netherlands.

We are looking for a Medior Data / Analytics Engineer who can help us develop and scale our modern data warehouse, a.k.a. our 'data / analytics factory'.


About the Role

At Coolgradient, we're driven by our mission to create a more sustainable digital footprint. As a Medior Data / Analytics Engineer, you'll directly support this goal by developing our AI-based analytics platform that optimizes data center infrastructure to save energy and improve reliability. By joining our team, you'll collaborate with other engineers to refine our data platform, contribute to process automation, and deliver reliable, governed data/information products that help transform data centers globally.


Perks:

  • Working on making the world of data centers more sustainable
  • Enjoy a competitive salary, pension scheme, holiday allowance, and disability insurance.
  • Our goal is to increase our impact and grow. We want you to grow with us, so we offer an Employee Stock Option Plan.
  • A hybrid home/office/remote policy with flexible working hours: we value your regular presence for the team dynamics, but we support the flexibility that fits your daily rhythm and preferences.
  • Lots of mobility options: a public transportation subscription, a company bike, or reimbursement of your travel costs if you prefer your own means of transport.
  • Lots of opportunities for professional growth and development
  • Join activities like meetups or (business) events.
  • Work abroad with the team, where we combine good weather, a great environment, and good food.
  • And above all: a fun and enthusiastic team that values a diverse and transparent culture.


Visa:

At our company, we highly appreciate and encourage diversity. We believe it is crucial to achieving success and being the responsible company we want to be. However, our company currently cannot sponsor any work visas.

Requirements:

  • Degree in Computer Science, Engineering or a related field.
  • Solid expertise in SQL and familiarity with Python.
  • Experience with Snowflake and dbt, including Jinja coding.
  • Strong knowledge of dimensional data modelling and query optimization, enabling you to build models that are efficient and scalable.
  • A proactive attitude toward continuous improvement.
  • Experience in handling data in complex environments.
  • Familiarity with the Azure cloud.
  • Optional: ability to use Git, Visual Studio Code, and Jira for version control, development workflows, and task management.
  • Optional: Experience or interest in graph technology (e.g., Neo4j).


What You’ll Do

  • Onboard customers: You will help onboard new data centers and map their data according to our platform's standards, ensuring seamless integration into our analytics system.
  • Process and Model Data: Utilize tools like Snowflake and dbt to process and model data effectively, turning raw data into insights that drive decisions.
  • Optimize ELT Pipelines: Develop and maintain efficient ELT pipelines capable of managing billions of records, focusing on scalability and performance.
  • Improve Data Models: Work on improving and extending our current dimensional data models to enhance data analysis and utility.
  • Enhance Performance: Optimize queries and the overall performance of data pipelines to ensure high efficiency and responsiveness.
  • Data Availability: Take responsibility for maintaining the data warehouse to ensure continuous data availability for data product consumers.
  • Automate Data Processes: Collaborate with senior engineers to automate and generalize data processing, increasing efficiency and reducing manual workload.
  • Integrate Data Science Models: Contribute to integrating our data science models and processes into our data pipelines, enhancing our platform's capabilities.
  • Explore New Technologies (optional): Bring interest in or experience with graph technology (e.g., Neo4j) to explore innovative data structuring and analysis techniques.

Benefits

  • Vacation Leave
  • Commuter Checks & Assistance
  • Diversity Program
  • Professional Development
  • Flexible Hours
  • Travel Concierge
  • Work from Home
Working Location

1062 HG, Kon. Wilhelminaplein 1, 2741 EA Waddinxveen, Netherlands

Posted on 17 July 2024