Create a containerized machine learning model

After data scientists have created a machine learning model, it has to be deployed into production. A common way to do that across different infrastructures is to package the model in a container and expose it via a REST API. This article demonstrates how to roll out a TensorFlow machine learning model, with a REST API provided by Connexion, in a container with Podman.

Preparation


First, install Podman with the following command:

sudo dnf -y install podman

Next, create a new folder for the container and switch to that directory.

mkdir deployment_container && cd deployment_container

REST API for the TensorFlow model


The next step is to create the REST API for the machine learning model. This GitHub repository contains a pretrained model, as well as the setup already configured to get the REST API working.

Clone this in the deployment_container directory with the command:

git clone https://github.com/svenboesiger/titanic_tf_ml_model.git

prediction.py & ml_model/


The prediction.py file allows for a TensorFlow prediction, while the weights for the 20x20x20 neural network are located in the ml_model/ folder.
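
To illustrate how Connexion wires an operationId such as prediction.post to Python code, here is a minimal, hypothetical sketch of such a handler. The actual prediction.py in the repository may load the weights and preprocess the input differently; the passenger handling and the model-loading call below are assumptions, not the repository's code.

import pandas as pd
import tensorflow as tf

# Assumption (illustrative only): the 20x20x20 network in ml_model/
# can be loaded with tf.keras.
model = tf.keras.models.load_model('ml_model/')

def post(passenger):
    # Connexion passes the validated JSON request body as the 'passenger' dict.
    features = pd.DataFrame([passenger])
    probability = float(model.predict(features.to_numpy())[0][0])
    # Return the prediction together with the 201 status declared in swagger.yaml.
    return {'survival_probability': probability}, 201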

swagger.yaml


The file swagger.yaml defines the API for the Connexion library using the Swagger specification. This file contains all of the information necessary to configure your server to provide input parameter validation, output response data validation, and URL endpoint definition.

As a bonus, Connexion also provides you with a simple but useful single-page web application that demonstrates calling the API with JavaScript and updating the DOM with the result.

swagger: "2.0" info: description: This is the swagger file that goes with our server code version: "1.0.0" title: Tensorflow Podman Article consumes: - "application/json" produces: - "application/json" basePath: "/" paths: /survival_probability: post: operationId: "prediction.post" tags: - "Prediction" summary: "The prediction data structure provided by the server application" description: "Retrieve the chance of surviving the titanic disaster" parameters: - in: body name: passenger required: true schema: $ref: '#/definitions/PredictionPost' responses: '201': description: 'Survival probability of an individual Titanic passenger' definitions: PredictionPost: type: object

server.py & requirements.txt


The file server.py defines an entry point to start the Connexion server.

import connexion

app = connexion.App(__name__, specification_dir='./')
app.add_api('swagger.yaml')

if __name__ == '__main__':
    app.run(debug=True)

requirements.txt defines the Python requirements we need to run the program.

connexion
tensorflow
pandas

Containerize!


For Podman to be able to build an image, create a new file called “Dockerfile” in the deployment_container directory created in the preparation step above:

FROM fedora:28

# File Author / Maintainer
MAINTAINER Sven Boesiger <[email protected]>

# Update the sources
RUN dnf -y update --refresh

# Install additional dependencies
RUN dnf -y install libstdc++
RUN dnf -y autoremove

# Copy the application folder inside the container
ADD /titanic_tf_ml_model /titanic_tf_ml_model

# Get pip to download and install requirements:
RUN pip3 install -r /titanic_tf_ml_model/requirements.txt

# Expose ports
EXPOSE 5000

# Set the default directory where CMD will execute
WORKDIR /titanic_tf_ml_model

# Set the default command to execute
# when creating a new container
CMD python3 server.py

Next, build the container image with the command:

podman build -t ml_deployment .

Run the container


With the container image built and ready to go, you can run it locally with the command:

podman run -p 5000:5000 ml_deployment

Navigate to http://0.0.0.0:5000/ui in your web browser to access the Swagger/Connexion UI and to test-drive the model.

Of course, you can now also access the model from your own application via the REST API.
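
For instance, a client request from Python might look like the sketch below. The passenger fields shown are placeholders, since the exact input schema depends on the PredictionPost definition in swagger.yaml.

import requests

# Hypothetical passenger record; the real field names come from PredictionPost.
passenger = {'age': 29, 'sex': 'female', 'pclass': 1}

response = requests.post('http://0.0.0.0:5000/survival_probability', json=passenger)
print(response.status_code, response.json())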
