Olá from beautiful Münster, Germany! I'm Jakob de Maeyer, a physicist by training and a beach ultimate player at heart who spends his days writing software as a freelance data and product engineer.

The problems I solve are usually heavy on data and algorithms, revolving around
  • acquiring data from a variety of places,
  • mixing, matching, and making sense of it,
  • transforming it into an easily digestible form.

I love simple solutions, Python, automating the boring stuff, agonizing over elaborate data processing puzzles, and did I mention Python?

Looking to start a project? Let's talk.


Featured work

Syntinels

SYNTINELS (formerly SALES2B) is a Münster-based start-up that provides businesses with B2B sales leads based on analysis of their existing customer base. Its backbone is an extensive database of metrics for almost all companies active in Germany, Austria, and Switzerland.

To facilitate further expansion of this database, I helped them set up a new data collection platform, wrote solutions to gather information from some tricky sources, and prototyped concepts for on-demand data acquisition.

Bundesministerium für Bildung und Forschung

Among all the software projects I have developed, Bright Sky is probably the one I am most proud of. Not because it is particularly intricate or ground-breaking (what it does is fairly straightforward), but because it allowed me to give back to the open source community.

Bright Sky is a simple and free-to-use JSON API to many of the weather observations published by Germany's meteorological service (DWD). It involves regularly polling the DWD open data server for updates, parsing and normalizing weather observations and forecasts, and finally making the results available through a public-facing API. Bright Sky currently handles more than a million requests daily. Its development was sponsored by the Bundesministerium für Bildung und Forschung as part of the Prototype Fund program.
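
For a taste of the consumer side, here is a minimal sketch of querying the API for hourly weather over Münster. The endpoint and parameter names follow Bright Sky's public documentation; treat the exact response fields as illustrative and double-check them there.

    # Query Bright Sky for hourly weather at a given location and date.
    import requests

    response = requests.get(
        "https://api.brightsky.dev/weather",
        params={"lat": 51.96, "lon": 7.63, "date": "2024-04-21"},
    )
    response.raise_for_status()
    for record in response.json()["weather"]:
        print(record["timestamp"], record["temperature"])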

Summary GmbH

As a company revolving around developing, importing, and branding products sourced in Southeast Asia, Summary deals with a metric shit-ton of logistics: from coordinating local suppliers, to constantly tracking the whereabouts of hundreds of containers, to distributing and selling from almost a dozen warehouses.

Together we have developed many solutions in three major areas:
  • simplified information management and interactive business intelligence through a new interface to their ERP,
  • automated tracking of fast-changing logistics information like shipment ETAs,
  • elaborate matching of in-stock and on-the-way goods against a myriad of customer orders and sales forecasts to guide logistics and purchasing decisions (a toy sketch of this follows below).
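
To make the last point concrete, here is a toy sketch of the allocation idea, assuming a greedy match of available stock against orders sorted by due date. All names and numbers are invented; the real system also weighs forecasts, transit times, and warehouse locations.

    # Greedily allocate in-stock or on-the-way units to orders by due date.
    from collections import defaultdict

    stock = {"SKU-1": 80, "SKU-2": 15}  # units in stock or on the way
    orders = [  # (order id, SKU, quantity), sorted by due date
        ("A-17", "SKU-1", 50),
        ("A-23", "SKU-1", 40),
        ("B-02", "SKU-2", 10),
    ]

    allocations = defaultdict(int)
    shortfalls = {}
    for order_id, sku, quantity in orders:
        granted = min(quantity, stock.get(sku, 0))
        stock[sku] = stock.get(sku, 0) - granted
        allocations[order_id] = granted
        if granted < quantity:
            shortfalls[order_id] = quantity - granted  # feeds purchasing

    print(dict(allocations))  # {'A-17': 50, 'A-23': 30, 'B-02': 10}
    print(shortfalls)         # {'A-23': 10}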

FoodTracks

FoodTracks, located at Münster's lovely municipal port, marked the only time in my life that I actually worked from the office of a company I was writing software for. They build web-based controlling solutions for bakeries, focused on recognizing sales problems early and containing food waste.

I was FoodTracks' lead backend developer during their early years and was involved in almost all of their products. At the core of these was a pipeline that monitors a wild bouquet of client-hosted ERP systems, loads their data, and securely transfers it to FoodTracks' own infrastructure. From there, the sales data is normalized into a common format, enriched with external information and custom data models, and finally served through REST interfaces, email notifications, or PDF reports.
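
As a minimal sketch of the normalization step, assume two hypothetical ERP exports that both need to end up in one common sales schema; every field name below is invented for illustration.

    # Map one raw row from either of two made-up ERP formats onto a
    # common sales record.
    from datetime import date

    def normalize_sale(raw: dict, erp: str) -> dict:
        if erp == "erp_a":  # German-labeled CSV export (hypothetical)
            return {
                "store_id": raw["FILIALE"],
                "product_id": raw["ARTNR"],
                "sold_at": date.fromisoformat(raw["DATUM"]),
                "units": int(raw["MENGE"]),
            }
        if erp == "erp_b":  # JSON export (hypothetical)
            return {
                "store_id": raw["shop"],
                "product_id": raw["sku"],
                "sold_at": date.fromisoformat(raw["day"]),
                "units": int(raw["qty"]),
            }
        raise ValueError(f"Unknown ERP: {erp}")

    row = {"FILIALE": "7", "ARTNR": "112", "DATUM": "2024-01-05", "MENGE": "31"}
    print(normalize_sale(row, "erp_a"))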

Scrapinghub

Scrapinghub (nowadays Zyte) is where I learned the ropes of software development, and I am eternally grateful to them. The environment and guidance they provided immensely helped me develop fundamental skills for my software career, like working autonomously (they're a fully remote 200-person company!), making sound architectural choices, and leading teams.

Among many other things, I
  • started a solution to collect the inventory of 20+ ticket vendors with single-seat resolution and multiple daily refreshes, and grew its team to four full-time developers,
  • managed and lead-developed an invite-only SEO scraping product with a customer-facing API,
  • was lead maintainer of the command line client for Scrapy Cloud, their flagship product.
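
For flavor, this is roughly what a minimal Scrapy spider looks like; the URL, selectors, and fields are invented for illustration and not taken from any of the actual projects.

    # A bare-bones Scrapy spider that yields one item per seat on a
    # (fictional) ticket vendor's event page.
    import scrapy

    class SeatsSpider(scrapy.Spider):
        name = "seats"
        start_urls = ["https://tickets.example.com/events/1234"]

        def parse(self, response):
            for seat in response.css("div.seat"):
                yield {
                    "event": response.css("h1::text").get(),
                    "seat": seat.attrib.get("data-seat-id"),
                    "available": "free" in (seat.attrib.get("class") or ""),
                }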

Max-Planck-Institut für Dynamik und Selbstorganisation

Before my career in software engineering, I studied the marvelous field of physics, and Göttingen's Max Planck Institute for Dynamics and Self-Organization is where I concluded it with my Master's thesis, which included building and evaluating a novel experiment that tracked 40,000 coalescing oil droplets on water.

In the process, I developed two software projects: one to obtain full-resolution images from consumer-level DSLRs at high frame rates not available in existing software, and another to simplify and modularize batch image processing, in my case running custom computer vision code to detect and track the oil droplets.
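
To give a rough idea of what droplet detection can look like, here is a hedged sketch using OpenCV's Hough circle transform. The thesis code used custom computer vision instead, and every file name and parameter below is illustrative.

    # Detect roughly circular droplets in one grayscale video frame.
    import cv2
    import numpy as np

    frame = cv2.imread("frame_0001.png", cv2.IMREAD_GRAYSCALE)
    assert frame is not None, "frame_0001.png not found"
    blurred = cv2.GaussianBlur(frame, (9, 9), 2)  # suppress sensor noise
    circles = cv2.HoughCircles(
        blurred,
        cv2.HOUGH_GRADIENT,
        dp=1,        # accumulator at full image resolution
        minDist=10,  # minimum distance between droplet centers, in px
        param1=100,  # Canny edge threshold
        param2=30,   # accumulator threshold; lower finds more circles
        minRadius=3,
        maxRadius=50,
    )
    if circles is not None:
        for x, y, r in np.round(circles[0]).astype(int):
            print(f"droplet at ({x}, {y}) with radius {r}px")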