Practical ML

Open-source MLOps infrastructure, available in the cloud or on-premises. Machine Learning APIs based on state-of-the-art models.

# Celery workflow task from the Skipper API service (Katana ML).
# The Celery app, queue-name helper, producer params and logger are defined
# elsewhere in the service; only the task itself is shown here.
import json

from skipper_lib.events.event_producer import EventProducer  # Skipper event producer


@app.task(name='api.workflow')             # Celery async API task
def workflow(payload):                     # Workflow API function
    p_json = json.loads(payload)           # Data processing
    task = p_json['task']
    queue = helper.call(task)              # Workflow queue name
    event_p = EventProducer(params)        # Skipper event producer
    resp = event_p.call(queue, payload)    # Event publishing
    resp_json = json.loads(resp)           # Event response processing
    celery_log.info(task)                  # Celery queue logging
    return resp_json                       # Returning event result
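As a usage sketch, the snippet below shows one way an API layer could submit work to this task with Celery's send_task. The broker and backend URLs and the submit_workflow helper are assumptions for illustration; only the task name 'api.workflow' comes from the code above.

# Hypothetical dispatcher: broker/backend URLs are placeholders, only the
# task name 'api.workflow' is taken from the workflow snippet above.
from celery import Celery

celery_app = Celery('api',
                    broker='amqp://guest:guest@localhost:5672//',
                    backend='rpc://')


def submit_workflow(payload_json):
    # Queue the payload for the 'api.workflow' task and wait for its result
    async_result = celery_app.send_task('api.workflow', args=[payload_json])
    return async_result.get(timeout=60)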
Running on Open Source
Features
Simple generic API

I can use a simple and reliable API. The REST API endpoint supports both async and sync calls; see the client sketch below.
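A minimal client sketch for such a call, assuming an HTTP endpoint; the URL path and payload fields here are illustrative assumptions, not taken from this page.

# Hypothetical sync call; endpoint path and payload fields are assumptions.
import requests

payload = {'task': 'sentiment', 'data': 'Skipper makes MLOps simple'}
resp = requests.post('http://localhost:8000/api/v1/skipper/workflow', json=payload)
print(resp.json())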

Pluggable ML services

I can run any ML workload. The Skipper MLOps infrastructure runs ML services as Docker containers.

Reliable ML services architecture

I can run the system even when some of its services are down. Event-based communication ensures this; see the sketch below.
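A sketch of why this works, assuming a RabbitMQ broker (this page does not name the broker) and a hypothetical 'sentiment' queue: events published to a durable queue wait until the consuming ML service comes back up.

# Illustrative only: the broker and queue name are assumptions, not from this page.
import json
import pika

connection = pika.BlockingConnection(pika.ConnectionParameters(host='localhost'))
channel = connection.channel()

# A durable queue keeps events while the consuming ML service is down;
# the service works through the backlog once it restarts.
channel.queue_declare(queue='sentiment', durable=True)
channel.basic_publish(
    exchange='',
    routing_key='sentiment',
    body=json.dumps({'task': 'sentiment', 'data': 'hello'}),
    properties=pika.BasicProperties(delivery_mode=2),  # persistent message
)
connection.close()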

System traceability

I can trace what happens in the system. All events are logged through a dedicated container.

Run your MLOps solution on Skipper

Skipper is a simple and flexible ML workflow engine.

https://github.dev/katanaml/katana-skipper

Designed for
ML services

Many ML projects are implemented as Jupyter notebooks, and such solutions are hard to scale and maintain once deployed to production. We recommend investing in MLOps early: develop your ML solution with Skipper so it is ready for production deployment.

Data processing

Data is processed in a separate container, which makes it easier to apply future changes.

Prediction service scaling

The prediction service is configured to run in a separate Kubernetes Pod for better scalability.

import tensorflow as tf
import numpy as np
from skipper_lib import events


def run_training(self, data):
    # Training method of the Skipper training service; prepare(), build_model()
    # and the loss/metrics/epochs/validation/path settings come from the service.
    dataset = prepare(data)
    model = build_model()
    optimizer = tf.keras.optimizers.SGD()
    model.compile(optimizer=optimizer, loss=loss, metrics=metrics)
    history = model.fit(dataset, epochs=epochs)
    model.evaluate(dataset_val)
    model.save(path)
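To make the training sketch self-contained, here is one possible shape for the helpers it calls; prepare() and build_model() are not shown on this page, so these bodies, data layouts and hyperparameters are assumptions for illustration.

# Hypothetical helpers for the training sketch above; the data layout and
# model architecture are placeholders, not taken from this page.
import numpy as np
import tensorflow as tf


def prepare(data, batch_size=32):
    # Turn a raw payload with 'features' and 'labels' into a batched tf.data pipeline
    features = np.asarray(data['features'], dtype='float32')
    labels = np.asarray(data['labels'], dtype='float32')
    return tf.data.Dataset.from_tensor_slices((features, labels)).batch(batch_size)


def build_model(input_dim=10):
    # A small fully connected regression model as a stand-in
    return tf.keras.Sequential([
        tf.keras.layers.Dense(64, activation='relu', input_shape=(input_dim,)),
        tf.keras.layers.Dense(1),
    ])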