Data Engineer

About us

Founded in 2015, ZAGENO is eliminating bottlenecks to breakthrough science and research through curated connections linking scientists and their institutions’ procurement offices with leading life science vendors. World-renowned academic and government research facilities, global pharmaceutical companies, and biotechs large and small already use our unique online biotech marketplace, which currently comprises more than three million products.

What do we do?

  1. We help scientists choose the optimal laboratory kits and materials for each unique experiment setup.
  2. We make purchasing transactions more efficient for both buyers and sellers.
  3. For our vendor partners, we provide a valuable, expanded sales channel.
Our growing ZAGENO team of 60 in Cambridge, USA, and Berlin, Germany, includes experts in science, e-commerce, systems integration, and customer support, enabling smarter, faster processes and freeing up more time and resources for value-added science and better research results.

About the Role

We are opening a new tech hub in Wrocław, Poland, and we are looking for engineers to join our tech team!

We are looking for an experienced Data Engineer to help us scale and expand our platform to meet increasing demand, ingest larger volumes of data, deliver better pipelines, and stay at the top of the market.

Your responsibilities will include:
  • Develop, maintain, and support our Postgres, Redis, and BigQuery infrastructure to provide the data layer for cross-functional teams
  • Own our data pipelines: ETL jobs, on-demand pipelines, and dashboards fed from both internal and external sources
  • Help us architect and build a brilliant, intuitive, and scalable data model. Feed it with pipelines. Groom it with ETLs. Watch it grow.
  • Work with our experienced team of engineers to build, maintain, and improve the availability of historical, real-time, and third-party data
  • Join us in feature development, working with our scientists and stakeholders to grow our system
Our stack:
  • Python, some JavaScript, and Go
  • Django, BigQuery, Postgres, Redis, and Elasticsearch
  • Google Cloud Platform, Kubernetes, Jenkins, Tableau, Kibana

About You

  • 3+ years of experience with Postgres, including extensions
  • 3+ years of experience with ETL pipelines
  • 3+ years of experience with data modeling and data warehousing solutions
  • Professional experience with BigQuery or a similar data warehouse is required
  • Experience with big-data solutions (HBase, Couchbase, Hypertable, etc.) is a big plus
  • Experience coding in Python, Go, or a similar language is a big plus
  • Experience with Kubernetes is a plus
  • E-commerce experience is a plus
  • Experience with managed Google services is a plus
You will find the team a great fit if:
  • You have an agile mindset and are willing to contribute to the team's goals
  • You understand that trust (it goes both ways) is key to a healthy professional relationship with both your team and your lead
  • You are an honest, fair, and selfless player on a team that works the same way. We will help you whenever you need it, and we expect you to do the same for us. It's not about you; it's about all of us (all for one and one for all)
  • You bring your ideas to the table and stand your ground when you believe in them, but you also give others the chance to discuss theirs
  • You are excited about pushing for continuous improvement, scalability, and performance
  • Last, but not least, you are passionate about what you do and why you do it

Our Benefits

  • Your choice of hardware: Linux or macOS? (We stock mechanical keyboards, and at least one of us has a non-POSIX shell installed.)
  • Competitive salary
  • Flexible working hours
  • The opportunity to grow into your role and to develop a career within the company