Verve Group is hiring a

Data Warehouse Engineer

Berlin, Germany

Who We Are

Verve has created a more efficient and privacy-focused way to buy and monetize advertising. Verve is an ecosystem of demand and supply technologies fusing data, media, and technology to deliver results and growth to both advertisers and publishers, no matter the screen or location and no matter who, what, or where a customer is. With 30 offices across the globe and an eye on serving forward-thinking advertising customers, Verve’s solutions are trusted by more than 90 of the United States’ top 100 advertisers, 4,000 publishers globally, and the world’s top demand-side platforms. Learn more at www.verve.com.

Who You Are

As a Data Warehouse Engineer, you will work across the entire Verve ecosystem, collaborating with the data teams of different departments. You will work with operational, business, financial, and other KPI data. The work includes analyzing business requirements and data sources, designing the architecture and data models, developing and testing ETL processes, and implementing the data warehouse, while ensuring its performance, scalability, and reliability. Additionally, you will monitor, troubleshoot, maintain, and update the data warehouse, and ensure data quality, integrity, and security in collaboration with other engineers, analysts, and stakeholders.


This position is available with a hybrid working model in our Berlin, Hamburg, or Amsterdam offices (with office days on Tuesdays, Wednesdays, and Thursdays, and free lunches provided on those days)!


What You Will Do

  • Design, build, implement, and maintain the data warehouse and self-service business intelligence reporting solutions

  • Maintain the current data warehouse and shape its evolution by adopting new technologies under the guidance of a mentor

  • Help build the next-generation reporting data warehouse pipeline with robust, reliable data flow at an efficient cost (see the sketch after this list)

  • Provide technical expertise on Google Cloud Platform big data technologies, including BigQuery, Looker, Dataflow, and Cloud Composer

  • Coordinate with business departments, analysts, and systems engineering; lead automation, and implement and review code

  • Deliver high-quality results, meet timelines, and be agile

  • Make appropriate recommendations on the management of data extraction and analysis
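
To give a concrete sense of the pipeline work described above, here is a minimal, illustrative sketch of a Cloud Composer (Airflow) DAG that stages raw KPI exports from Cloud Storage into BigQuery and then derives a reporting table for self-service BI. All project, bucket, dataset, table, and task names are hypothetical placeholders, not Verve's actual setup.

# A minimal sketch of a Cloud Composer (Airflow) pipeline: stage KPI exports from
# Cloud Storage into BigQuery, then aggregate them into a reporting table.
# All resource names below are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_kpi_load",              # hypothetical pipeline name
    schedule="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    # Land the day's raw KPI exports from Cloud Storage into a BigQuery staging table.
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw_kpis",
        bucket="example-kpi-exports",      # placeholder bucket
        source_objects=["kpis/{{ ds }}/*.csv"],
        destination_project_dataset_table="example-project.staging.kpis_raw",
        source_format="CSV",
        autodetect=True,
        write_disposition="WRITE_TRUNCATE",
    )

    # Aggregate the staged rows into the table that self-service BI reporting reads from.
    build_reporting = BigQueryInsertJobOperator(
        task_id="build_reporting_table",
        configuration={
            "query": {
                "query": """
                    SELECT DATE('{{ ds }}') AS report_date,
                           campaign_id,
                           SUM(revenue) AS revenue
                    FROM `example-project.staging.kpis_raw`
                    GROUP BY campaign_id
                """,
                "useLegacySql": False,
                "destinationTable": {
                    "projectId": "example-project",
                    "datasetId": "reporting",
                    "tableId": "daily_campaign_kpis",
                },
                "writeDisposition": "WRITE_TRUNCATE",
            }
        },
    )

    load_raw >> build_reporting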


Apply for this job
