KS-005616
Service description
Algo Factory: DevOps, storage & logs concept
The Algo Factory is a cloud infrastructure and framework based on Azure Kubernetes Service, AzureML, MLFlow, Postgres and Python packages that enables our typical Algo Trading workflows through deployments and artifacts. The framework is actively used, and we would like to expand our DevOps capabilities as well as our storage and logs concept.
Because the framework is relatively new, our applications do not yet have 24/7 support. During strategy workshops we concluded that a solution, most likely based on agreements with 3rd-party providers, makes sense; this, however, still needs to be conceptualised and operationalised.
Additionally, while we store logs for all our applications in many forms (e.g. Azure ML scheduled runs, application logs, Kubernetes logs, etc.), we have identified at least one gap (Kubernetes services) where logs may need to be retained for longer than the default 30 days.
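As an illustration only, the following minimal sketch shows how pod logs could be copied from Azure Kubernetes Service into a Blob Storage container for long-term retention, assuming the kubernetes and azure-storage-blob Python packages; the connection-string variable, the "k8s-logs" container and the "algo-factory" namespace are hypothetical placeholders, not part of the existing setup:

import os
from datetime import datetime, timezone

from azure.storage.blob import BlobServiceClient
from kubernetes import client, config

config.load_incluster_config()  # use config.load_kube_config() when running locally
core = client.CoreV1Api()

blob_service = BlobServiceClient.from_connection_string(
    os.environ["AZURE_STORAGE_CONNECTION_STRING"]
)
container = blob_service.get_container_client("k8s-logs")

snapshot = datetime.now(timezone.utc).strftime("%Y/%m/%d/%H%M")
for pod in core.list_namespaced_pod(namespace="algo-factory").items:
    log_text = core.read_namespaced_pod_log(
        name=pod.metadata.name, namespace="algo-factory"
    )
    # One blob per pod and snapshot; a lifecycle policy on the container can then
    # enforce the required retention period in a cost-effective way.
    container.upload_blob(
        name=f"{snapshot}/{pod.metadata.name}.log",
        data=log_text,
        overwrite=True,
    )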
Also, we currently have many trading alerts and models running in the Algo Factory whose predictions are not stored in a file or database. The idea is that all our alerts should now make use of the provisioned Postgres DBs to store predictions and enable more analytics on top.
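As an illustration only, a minimal sketch of how an alert could persist its predictions in the provisioned Postgres DB, assuming psycopg2; the table layout, the POSTGRES_DSN variable and the example values are hypothetical placeholders, not the existing schema:

import os
from datetime import datetime, timezone

import psycopg2

DDL = """
CREATE TABLE IF NOT EXISTS alert_predictions (
    id          BIGSERIAL PRIMARY KEY,
    alert_name  TEXT             NOT NULL,
    instrument  TEXT             NOT NULL,
    prediction  DOUBLE PRECISION NOT NULL,
    created_at  TIMESTAMPTZ      NOT NULL
);
"""

def store_prediction(alert_name: str, instrument: str, prediction: float) -> None:
    # Connection details come from the (hypothetical) POSTGRES_DSN variable,
    # e.g. "postgresql://user:password@host:5432/algofactory".
    with psycopg2.connect(os.environ["POSTGRES_DSN"]) as conn:
        with conn.cursor() as cur:
            cur.execute(DDL)  # idempotent: only creates the table if it is missing
            cur.execute(
                "INSERT INTO alert_predictions "
                "(alert_name, instrument, prediction, created_at) "
                "VALUES (%s, %s, %s, %s)",
                (alert_name, instrument, prediction, datetime.now(timezone.utc)),
            )

store_prediction("intraday-momentum-alert", "EURUSD", 0.73)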
Requirements
In essence, the project involves the following concrete activities and deliverables:
- Conceptualize and operationalize 24/7 support for Algo Trading domain applications, with different escalation levels, by aligning with the relevant stakeholders and 3rd-party providers.
- Conceptualize and operationalize a logging concept and the harmonization of applications, by setting up the relevant Azure resources and automations (e.g. storage account or Log Analytics), guaranteeing 5 years of data logs for all Algo Trading applications, including Azure Kubernetes Service, in a cost-effective way.
- Conceptualize and operationalize the recording of our trading signals and alerts in the existing infrastructure (e.g. Postgres database, storage account and/or Snowflake); post-event analytics on these records is a cornerstone for tuning our signals and training models on them.
- Create infrastructure as code from the Algo Factory resource groups, facilitating migration to a different Azure tenant (see the sketch after this list).
- Maintain and expand the existing infrastructure, accounting for desired business objectives (e.g. creating Databricks instances).
- Evaluate and operationalize the usage of existing big data sets (e.g. tick data) so that business analysts and/or data scientists can use them for their analyses.
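As an illustration for the infrastructure-as-code item above, a minimal sketch that exports an ARM template from an existing resource group as a starting point for version-controlled infrastructure, assuming the azure-identity and azure-mgmt-resource packages (the exact SDK surface may differ by package version); the resource group name, subscription variable and output file are placeholders:

import json
import os

from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient
from azure.mgmt.resource.resources.models import ExportTemplateRequest

resource_client = ResourceManagementClient(
    credential=DefaultAzureCredential(),
    subscription_id=os.environ["AZURE_SUBSCRIPTION_ID"],
)

# Export every resource of the group into an ARM template; the result can be
# committed to the repository and parameterised for the target tenant.
poller = resource_client.resource_groups.begin_export_template(
    resource_group_name="rg-algo-factory",
    parameters=ExportTemplateRequest(
        resources=["*"], options="IncludeParameterDefaultValue"
    ),
)
template = poller.result().template

with open("algo-factory.template.json", "w") as fh:
    json.dump(template, fh, indent=2)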
About the client
Start: 05.01.2026
End: 03.07.2026
Offsite hours: 960
Utilisation: 100%
Only nearshore candidates are being sought!

