Technology stack
Cutting-edge technologies, continuously evolving to deliver functionality that once seemed impossible.
Cloud native
at scale
Our multi-environment cloud infrastructure, hosted on Azure, is built on microservices deployed as Docker containers. This lets us scale our software components in a flexible, resilient, and automated way through Kubernetes clusters.
We read and process data from several sources and rely on Kafka for an event-driven architecture that prevents data loss and supports both real-time processing and post-hoc historical analysis.
The input is processed by our distributed ETL pipelines, implemented in Rust and Python, which deliver the analysis results to our ML applications and frontend layer.
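As a minimal sketch of this event-driven flow (the topic names, broker address, and transformation step below are illustrative, not our actual pipeline), a Python ETL worker can consume events from Kafka, transform them, and publish the results downstream:

```python
import json

from kafka import KafkaConsumer, KafkaProducer  # kafka-python

# Consumer-group membership lets Kubernetes scale these workers horizontally:
# each replica picks up a share of the topic's partitions.
consumer = KafkaConsumer(
    "raw-events",                      # hypothetical input topic
    bootstrap_servers="kafka:9092",    # hypothetical broker address
    group_id="etl-workers",
    auto_offset_reset="earliest",      # allows replaying history for post-processing
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)

producer = KafkaProducer(
    bootstrap_servers="kafka:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

for message in consumer:
    event = message.value
    # Placeholder transformation; the real ETL logic lives in our Rust/Python jobs.
    enriched = {**event, "processed": True}
    producer.send("analysis-results", enriched)  # hypothetical output topic
```

Because each worker coordinates only through Kafka, adding throughput is a matter of increasing the replica count of the consumer group.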




Production-ready Artificial Intelligence
We build on MLOps pipelines that are optimized for scale, efficiency, and control. We rely on common Python frameworks (Pandas, Spark) for (big) data analysis and feature engineering, and the resulting features are persisted in our feature store.
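As an illustrative sketch of such a feature-engineering step (the column names, aggregations, and the Parquet file standing in for the feature store are all hypothetical), with Pandas it might look like this:

```python
import pandas as pd

# Hypothetical raw events pulled from upstream storage.
events = pd.DataFrame(
    {
        "device_id": ["a", "a", "b"],
        "timestamp": pd.to_datetime(["2024-01-01", "2024-01-02", "2024-01-01"]),
        "value": [1.2, 3.4, 0.7],
    }
)

# Example features: per-device aggregate statistics over the event stream.
features = (
    events.sort_values("timestamp")
    .groupby("device_id")["value"]
    .agg(value_mean="mean", value_max="max", n_events="count")
    .reset_index()
)

# Stand-in for the feature-store sink; the actual store is not shown here.
features.to_parquet("feature_store/device_features.parquet", index=False)
```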
For model building and training we rely on state-of-the-art libraries such as PyTorch, scikit-learn, and ONNX. Every experiment and model deployment is tracked with MLflow, which enables fine-grained model selection and a standardized offline validation procedure.
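A minimal sketch of experiment tracking with MLflow (the model, parameters, dataset, and metric below are placeholders, not our production setup):

```python
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Toy dataset standing in for features read from the feature store.
X, y = make_classification(n_samples=500, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

with mlflow.start_run(run_name="baseline-logreg"):  # hypothetical run name
    params = {"C": 1.0, "max_iter": 200}
    model = LogisticRegression(**params).fit(X_train, y_train)

    # Log parameters and offline-validation metrics for later model selection.
    mlflow.log_params(params)
    mlflow.log_metric("accuracy", accuracy_score(y_test, model.predict(X_test)))

    # Store the trained model as a run artifact.
    mlflow.sklearn.log_model(model, "model")
```

Every run logged this way can then be compared side by side before a model is promoted to deployment.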
Finally, a complete observability platform powered by Azure Monitor and Grafana monitors our deployed APIs, providing all the insights needed for online validation and A/B testing.





Solid software built
to last
We deliver our ML applications with the latest web technologies. Our frontend core is built with React, the type safety of TypeScript, and server-side rendering with Next.js. We like to style our components with Tailwind CSS.
Our backends are polyglot: Rust, Python, and Node. The same codebase runs in the cloud or on premises, delivering strong performance even on a Raspberry Pi.
We have a fully automated CI/CD pipeline and follow DevOps best practices, with code reviews, linting, and automated testing. As a small team, we use a flexible Kanban-based methodology. We work on GitHub and love anything that makes our lives easier as developers, e.g., GitHub Copilot.




