I have worked with a lot of different technologies over the years. I used Java for quite some time, but fell in love with the simplicity and conciseness of Kotlin. I have also worked with Go, mostly for implementing CLIs and Google Cloud Functions; I really like it as well, though it requires a different paradigm when developing.

When I contribute to frontend applications, which I sometimes do, I use TypeScript to get the benefits of a type system on top of JavaScript. I have worked with Python for a long time, mostly for simple scripts and data analysis. I have worked with a lot of different databases, but mostly with PostgreSQL, and I have used all major public clouds for several years, with the most experience in GCP and AWS.

As always, if a specific technology is a better fit for the problem, I will choose it, even if that means learning something new. I balance the time spent learning a new technology against the time saved by using it.


Programming languages

Kotlin: experienced
I know the stdlib very well and can use coroutines and flows, but I'm always eager to improve my Kotlin skills.

Java: experienced
Java 17+, though I work mostly in Kotlin.

Go: skilled
mostly used Go for writing cross-platform, cross-architecture CLIs and for serverless functions.

Python: experienced
I mostly use Python for scripting and data analysis.

TypeScript: skilled
better than writing plain JS, of course; I use TypeScript for frontend apps and for certain environments that require Node.

HashiCorp HCL: experienced
I really like this IaC language, which I use for Terraform and Nomad.

Transport protocols

REST APIs: experienced
client and server apps in various languages and frameworks, with resource-oriented design, following the Zalando API guidelines.

gRPC: experienced
client and server, using protobuf and buf.build, in Java, Kotlin, Python, Golang, TypeScript.
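
As an illustration, a minimal unary service definition of the kind I work with (the service and message names here are hypothetical):

```protobuf
syntax = "proto3";

package example.v1;

// A minimal unary RPC; client and server stubs can be
// generated for Java, Kotlin, Python, Go, and TypeScript.
service Greeter {
  rpc SayHello(SayHelloRequest) returns (SayHelloResponse);
}

message SayHelloRequest {
  string name = 1;
}

message SayHelloResponse {
  string message = 1;
}
```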

GraphQL: skilled
mostly client side, some server experience on the JVM.

Build tools and package managers

Gradle: experienced
I have set up many multi-module Gradle projects, including artifact signing and publishing (private registries and Sonatype).

Maven: experienced
Multi-module Maven projects with dependency management and plugin management are familiar to me, including artifact signing and publishing.

npm: skilled
Using and publishing packages; nothing special here.

Go modules: skilled


Frameworks

Spring Boot: skilled
at my last job I was mostly using vanilla Kotlin.

Ktor: skilled
I briefly used the Ktor server, but it was missing gRPC support, so I switched to vanilla Kotlin; I use the Ktor client quite often.

Next.js: starter
I have done (simple) modifications to various Next.js apps.

React: starter
I have done a React course and have explored the usage of gRPC Unary RPCs with TanStack Query.


Specifications and standards

OpenAPI: experienced
Used code generators for various languages: Java, Kotlin (I fixed a bug in the Kotlin code generator for deserializing lists of a class), and Python.

OAuth 2.0: skilled / experienced
Set up Keycloak as the identity provider, with multiple realms and multi-tenant realms, using various grant types: authorization code, client credentials, and refresh token.

JSON Schema: experienced
Whenever I have to write JSON or YAML, I make sure to use a JSON Schema if one is available. Vice versa, I try to provide JSON Schemas whenever possible.
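
For instance, a minimal schema that a JSON or YAML document can be validated against (the property names here are made up for illustration):

```json
{
  "$schema": "https://json-schema.org/draft/2020-12/schema",
  "type": "object",
  "required": ["name"],
  "properties": {
    "name": { "type": "string" },
    "replicas": { "type": "integer", "minimum": 1 }
  }
}
```

Editors and CI can then flag typos and type errors before the document is ever deployed.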

Conventional Commits: experienced
I have set up and managed CI/CD pipelines using Semantic Release and Conventional Commits.
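
Per the Conventional Commits convention, the commit type determines the kind of release, for example:

```text
fix(parser): handle empty payloads      -> patch release
feat(api): add order endpoint           -> minor release
feat!: drop the v1 endpoints            -> major release
```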

Databases, queuing systems and other storage layers

I have worked with many different kinds of databases, but mostly with PostgreSQL. It is easy, fast, and reliable to set up, and covers all the basic needs of a new app. As always, I decide what fits the job best; if a different database technology fits better, I will go with that.

PostgreSQL: experienced
The de facto open-source RDBMS, which I have used for many JVM apps.

BigQuery: skilled
Both a consumer (directly and from Google Cloud Dataflow / Apache Beam) and a producer (batch-loading data from Google Cloud Storage) of data.

Snowflake: starter

Redis: experienced
Used Lua scripts to make concurrent modifications atomic, and used pub/sub to broadcast new keys.
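
A minimal sketch of that pattern: a compare-and-set written as a Lua script, which Redis executes atomically, so no other client can slip in between the read and the write (key and values here are placeholders):

```lua
-- Compare-and-set: update KEYS[1] only if it still holds ARGV[1].
-- Redis runs the whole script as one atomic operation.
if redis.call('GET', KEYS[1]) == ARGV[1] then
  redis.call('SET', KEYS[1], ARGV[2])
  return 1
end
return 0
```

Invoked with e.g. `EVAL <script> 1 mykey oldvalue newvalue`.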

SQLite: experienced

Cassandra: starter

Kafka: experienced
Used, tuned, and upgraded various Kafka clusters, both on-premises and inside Kubernetes clusters (through Strimzi). I've used Kafka in producing and consuming applications, running as either single or multiple replicas.


Serialization formats

Protocol Buffers: experienced
my preference for serialization (if the use case allows it), especially combined with gRPC and buf.build.

JSON: experienced
On the JVM, I'm familiar with Jackson and GSON.

Apache Avro: skilled
both JSON and binary based, familiar with `avsc` and `avdl` formats.
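
For reference, a small record schema in the JSON-based `avsc` format (record and field names are hypothetical):

```json
{
  "type": "record",
  "name": "User",
  "namespace": "com.example",
  "fields": [
    { "name": "id", "type": "long" },
    { "name": "email", "type": ["null", "string"], "default": null }
  ]
}
```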

Infrastructure and fundamentals

DNS: skilled
I know how to set up various DNS records for different purposes, e.g. a single domain with multiple DNS zones.

HTTP/2: skilled
end-to-end HTTP/2 traffic between client and apps running in Google Kubernetes Engine through an ingress and reverse proxy.

Load Balancing: skilled
mostly with reverse proxies and Google Cloud Load Balancing.


Public clouds

I have used all major public clouds for several years, but I have the most experience with GCP and AWS. I really like working with GCP, as it’s very developer-friendly and fast.

GCP: experienced
Cloud Functions, Cloud Run, Cloud SQL, Pub/Sub, GKE, Secrets Manager, Load Balancing, Cloud DNS, Cloud KMS, Cloud Storage, Stackdriver, Cloud Endpoints and more.

AWS: skilled
EC2, Redshift, Athena, Glue, S3, and more.

Azure: skilled
AKS, VMs, Azure DevOps, CosmosDB.


Automation and CI/CD

Automation is both my hobby (home automation) and part of my work. I like setting up CI/CD, and I have received feedback from my peers that I am good at it.

GitHub Actions: experienced
Set up complex CI/CD pipelines for GitOps, keeping the main branch always green and enabling collaboration between team members.
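
A trimmed-down sketch of such a workflow (the build step is hypothetical; real pipelines also lint, build images, and trigger deployments):

```yaml
name: ci
on:
  pull_request:
  push:
    branches: [main]
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # Hypothetical build/test step.
      - run: ./gradlew build
```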

GitLab CI: experienced
Similar to GitHub Actions; I have performed tasks with the same goals in GitLab CI.

Semantic Release: experienced
Introduced Semantic Release to various teams, enabling them to use Conventional Commits to determine the next semantic version of their application. The codebases included Java, Kotlin, Python, Terraform, Go, TypeScript, and many more.
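
The core idea can be shown in a toy Python sketch; this is not Semantic Release's actual implementation, just the bump rules it applies to commit messages:

```python
import re


def next_version(version: str, commits: list[str]) -> str:
    """Toy sketch: derive the next semver from Conventional Commit messages,
    roughly how Semantic Release decides what to release."""
    major, minor, patch = (int(p) for p in version.split("."))
    # A "!" after the type/scope, or a BREAKING CHANGE footer, forces a major bump.
    if any(re.match(r"^\w+(\(.+\))?!:", c) or "BREAKING CHANGE" in c for c in commits):
        return f"{major + 1}.0.0"
    if any(re.match(r"^feat(\(.+\))?:", c) for c in commits):
        return f"{major}.{minor + 1}.0"
    if any(re.match(r"^fix(\(.+\))?:", c) for c in commits):
        return f"{major}.{minor}.{patch + 1}"
    return version  # e.g. only docs/chore commits: no release
```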

Renovate Bot: experienced
Introduced Renovate Bot to various teams, giving them automatic dependency upgrades regardless of the language and package manager they were using. I have configured Renovate Bot centrally in organizations to facilitate reusability (DRY). I also give presentations on this topic; feel free to reach out if you're interested.
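
A repository onboarded to a central setup often needs no more than this, extending the shared preset (here Renovate's own recommended preset as an example):

```json
{
  "$schema": "https://docs.renovatebot.com/renovate-schema.json",
  "extends": ["config:recommended"]
}
```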


Tools and platforms

Docker / OCI format: experienced
Built multi-architecture images (Docker Hub, ghcr.io, private registries), both containers and Helm packages shipped in OCI format.

Kubernetes: experienced
Used GKE, k3s, EKS, AKS.

Helm: skilled
Created and used various Helm repositories.

Terraform: experienced
"Automate everything" is one of my mottos. I have created many Terraform modules to enable reuse across teams, and I have used Terraform to create large, complex infrastructures for microservices architectures.
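
Consuming such a shared module might look like this (the module source and variables are hypothetical):

```hcl
module "orders_service" {
  source = "git::https://example.com/terraform-modules/cloud-run-service.git?ref=v1.2.0"

  name       = "orders"
  project_id = var.project_id
  region     = "europe-west4"
}
```

Pinning the module to a tag (`ref=v1.2.0`) keeps consumers stable while the module evolves.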

Vault: starter
Used together with Kubernetes with the Vault Secrets Injector to facilitate secret management.

Prometheus: experienced
Experience with both pull- and push-based metric collection, and with PodMonitors and ServiceMonitors on Kubernetes.

Grafana: experienced
Created many dashboards; I also run my own Grafana instance at home to collect the data that my devices generate.

Keycloak: experienced
Multiple realms and single realms with multi-tenancy.

Strimzi: experienced
Kafka cluster management within Kubernetes.

Envoy Proxy: experienced
Created various Envoy Proxies, mostly as sidecar containers, to enable TLS offloading, JWT verification, header additions, CORS handling, etc.

NGINX: skilled
Used NGINX for various temporary reroutes, and also created an authorization server with NGINX and JavaScript that (ab)used Firebase Authentication to circumvent high cloud costs.

buf.build: experienced
I have used buf.build since the very beginning: at first only for Protobuf linting and breaking-change detection, and later also the Buf Schema Registry and Remote Packages for easily consuming generated Protobuf artifacts in various languages.
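
A typical `buf.yaml` enabling exactly those two features, in the v1 config layout:

```yaml
version: v1
lint:
  use:
    - DEFAULT
breaking:
  use:
    - FILE
```

`buf lint` and `buf breaking --against <ref>` then run in CI on every change.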

Make: skilled
Whenever I need some scripts for a repository, I prefer to create a Makefile.
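
A typical minimal Makefile I might add to a repository (the targets and the Go toolchain here are just examples):

```make
.PHONY: build test lint

build:
	go build ./...

test:
	go test ./...

lint:
	golangci-lint run
```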


Methodologies

GitOps: experienced
Automate all the things. I am a big fan of automating infrastructure, for predictability, portability, and maintainability.

DevOps: experienced
You build it, you maintain it, you take care of it. Your code, your responsibility.

Platform Engineering: experienced
I have been part of many teams that facilitated and built self-service tools for internal dev teams.

Iterative design: experienced
Start small, improve, repeat.

Miscellaneous and development tools

It does not really make sense to specify an experience level for the tools below: I use them all on a daily basis and am very comfortable with them. I left all the chat apps out of this list, but I have used the lot.

Linux / GNU

Visual Studio Code

IntelliJ IDEA

Tools that I used in the past, but have not used for quite some time.

Apache Beam

Apache Flink

All icons are licensed under the CC0 license and belong to their respective owners.