Are you the person who likes to prepare data for consumption, whether it's “classical” ETL or a data pipeline for (near) real-time analytics? Are you ready to extend your skills further in cloud data warehousing technologies, data automation, and data mesh concepts? Then don't hesitate to apply for a challenging future in modern data & analytics.

What You’ll Do

  • Collaborate within a team and alongside clients to design and develop modern data platform solutions for analytics
  • Engage with both business and technology stakeholders to understand data needs, capture requirements, and implement cloud-based data warehouses, data lakes, data lakehouses, and other data solutions
  • Design and build ELT/ETL processes by developing data pipelines using cloud-native services (AWS/GCP/Azure), open-source tools such as Airflow and Python, or ETL and data fabric tools (see the sketch after this list)
  • Create data and metadata models to support ad-hoc and pre-built reporting
  • Address ELT/ETL and data platform challenges, rectify BI reporting discrepancies, and uphold overall system stability
  • Design and develop scalable data automation frameworks to accelerate and automate the development cycles of data warehouses and data marts while ensuring data quality, data consistency, and compliance in the development process
  • Acquire hands-on experience with new data platforms and programming languages
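For a flavor of the pipeline work mentioned above, here is a minimal sketch of an orchestrated ELT job built with Airflow's TaskFlow API in Python. It is an illustration under assumptions, not a description of any specific client project: the DAG name, schedule, and the extract/transform/load bodies are hypothetical placeholders.

```python
# Minimal Airflow (2.x TaskFlow API) ELT sketch; all names are hypothetical.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def daily_sales_elt():
    @task
    def extract() -> list[dict]:
        # Hypothetical source: in a real pipeline this would read from an API,
        # a landing bucket, or an operational database.
        return [{"order_id": 1, "amount": 42.0}, {"order_id": 2, "amount": -1.0}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # Light cleansing only; in an ELT pattern the heavy transformations
        # run inside the warehouse after loading.
        return [r for r in rows if r["amount"] > 0]

    @task
    def load(rows: list[dict]) -> None:
        # Hypothetical sink: stands in for a write to a staging table in a
        # cloud warehouse such as Redshift, BigQuery, or Snowflake.
        print(f"loading {len(rows)} rows")

    load(transform(extract()))


daily_sales_elt()
```

In practice the load step would target one of the cloud warehouses named below, with the heavy transformations pushed down into the warehouse itself, which is what distinguishes ELT from classical ETL.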

What You’ll Bring

  1. Bachelor’s degree in IT, applied mathematics, statistics, or another relevant field, or an equivalent combination of education and practical experience
  2. 3+ years of experience in data engineering/data warehousing, preferably within a consulting environment
  3. Demonstrated expertise in ETL, data warehousing, data ingestion, data profiling, and data visualization, plus a good understanding of data model variations such as 3NF, Dimensional, and Data Vault
  4. Experience working with SQL
  5. 2+ years of hands-on experience with data management and integration tools, such as Informatica, Talend, SAP Datasphere, and Azure Data Factory
  6. Exposure to open-source and proprietary cloud data pipeline tools such as Airflow, Glue, and Dataflow
  7. Experience working with data warehouses such as Redshift, SQL DW, BigQuery, and Snowflake
  8. Working knowledge of agile development, including DevOps concepts
  9. Strong analytical problem-solving ability
  10. A passion for quickly learning new technologies and methodologies

Non-negotiable: strong communication skills in Dutch and English

What’s your score on this list? More than 6/10? Get in touch!

Working at Quest for Knowledge

Since 1999, Quest for Knowledge has been working every day on improving the return on data. We were the first company in Europe to partner with gurus like Ralph Kimball, deploying activities in the UK, the Benelux, and Sweden. Today, that attitude is unchanged: we constantly look for best practices and solutions in analytics. This has resulted in the mastery and implementation of modern data and analytics architectures, an approach that optimizes the return on data assets from the past as well as those of today and the future.

Our consultants practice what we preach: just as Quest for Knowledge focuses on closing the knowledge gap that comes with rapid technological development, we do exactly the same with our new hires. When you choose Quest for Knowledge as your employer, we will analyze your skills and determine a development roadmap with you, whether that covers technical skills, customer-facing skills, or any other aspect of your skill set you wish to develop. Working at Quest for Knowledge means working on your own growth and future as well as on growth for our customers.

If you feel up to the challenge, send your application to people@q4k.com. Tell us briefly why you are interested in this position, and include your CV or a link to your LinkedIn profile.
