Create a containerized machine learning model
<div style="margin: 5px 5% 10px 5%;"><img src="http://www.sickgaming.net/blog/wp-content/uploads/2018/11/create-a-containerized-machine-learning-model.png" width="682" height="1024" title="" alt="" /></div><div><p>After data scientists have created a machine learning model, it has to be deployed into production. To run it on different infrastructures, using containers and exposing the model via a REST API is a common way to deploy a machine learning model. This article demonstrates how to roll out a <a href="https://www.tensorflow.org">TensorFlow</a> machine learning model, with a REST API delivered by <a href="https://connexion.readthedocs.io/en/latest/">Connexion</a> in a container with <a href="https://fedoramagazine.org/running-containers-with-podman/">Podman</a>.</p>
<p><span id="more-22888"></span></p>
<h2>Preparation</h2>
<p>First, install Podman with the following command:</p>
<pre>sudo dnf -y install podman</pre>
<p>Next, create a new folder for the container and switch to that directory.</p>
<pre>mkdir deployment_container &amp;&amp; cd deployment_container</pre>
<h2>REST API for the TensorFlow model</h2>
<p>The next step is to create the REST API for the machine learning model. This <a href="https://github.com/svenboesiger/titanic_tf_ml_model">GitHub repository</a> contains a pretrained model, as well as a setup already configured to get the REST API working.</p>
<p>Clone this in the deployment_container directory with the command:</p>
<pre>git clone https://github.com/svenboesiger/titanic_tf_ml_model.git</pre>
<h4>prediction.py &amp; ml_model/</h4>
<p>The <a href="https://github.com/svenboesiger/titanic_tf_ml_model/blob/master/prediction.py">prediction.py</a> file implements the TensorFlow prediction, while the weights for the 20x20x20 neural network are located in the folder <a href="https://github.com/svenboesiger/titanic_tf_ml_model/tree/master/ml_model/titanic"><em>ml_model/</em></a>.</p>
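<p>To illustrate how such a handler plugs into Connexion, here is a minimal sketch of what a file like <em>prediction.py</em> could look like. This is an assumption-laden illustration, not the repository's actual code; the model format, paths, and feature handling may differ:</p>
<pre># Hypothetical sketch of a Connexion handler like prediction.py.
# The repository's real code, model format, and feature names may differ.
import pandas as pd
import tensorflow as tf

# Load the pretrained network once at import time (path assumed)
model = tf.keras.models.load_model('ml_model/titanic')

def post(passenger):
    """Handles POST /survival_probability (operationId 'prediction.post'
    in swagger.yaml); Connexion passes the validated JSON body in."""
    # Convert the JSON body into a single-row DataFrame
    features = pd.DataFrame([passenger])
    # Run the model and extract a scalar probability
    probability = float(model.predict(features.values)[0][0])
    # Connexion serializes the return value as JSON, 201 per swagger.yaml
    return {'survival_probability': probability}, 201</pre>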
<h4>swagger.yaml</h4>
<p>The file swagger.yaml defines the API for the Connexion library using the <a href="https://github.com/OAI/OpenAPI-Specification/blob/master/versions/2.0.md">Swagger specification</a>. This file contains all of the information necessary to configure your server to provide input parameter validation, output response data validation, and URL endpoint definitions.</p>
<p>As a bonus, Connexion also provides you with a simple but useful single-page web application that demonstrates how to use the API with JavaScript and update the DOM with it.</p>
<pre>swagger: "2.0" info: description: This is the swagger file that goes with our server code version: "1.0.0" title: Tensorflow Podman Article consumes: - "application/json" produces: - "application/json" basePath: "/" paths: /survival_probability: post: operationId: "prediction.post" tags: - "Prediction" summary: "The prediction data structure provided by the server application" description: "Retrieve the chance of surviving the titanic disaster" parameters: - in: body name: passenger required: true schema: $ref: '#/definitions/PredictionPost' responses: '201': description: 'Survival probability of an individual Titanic passenger' definitions: PredictionPost: type: object</pre>
<h4>server.py &amp; requirements.txt</h4>
<p><a href="https://github.com/svenboesiger/titanic_tf_ml_model/blob/master/server.py"><em>server.py</em></a> defines an entry point to start the Connexion server.</p>
<pre>import connexion

app = connexion.App(__name__, specification_dir='./')
app.add_api('swagger.yaml')

if __name__ == '__main__':
    app.run(debug=True)</pre>
<p><a href="https://github.com/svenboesiger/titanic_tf_ml_model/blob/master/requirements.txt"><em>requirements.txt</em></a> defines the Python requirements we need to run the program.</p>
<pre>connexion
tensorflow
pandas</pre>
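<p>Optionally, you can test the server outside of a container first by installing the requirements and starting it directly (assuming Python 3 and pip are available on your host):</p>
<pre>pip3 install -r requirements.txt
python3 server.py</pre>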
<h2>Containerize!</h2>
<p>For Podman to be able to build an image, create a new file called “Dockerfile” in the <strong>deployment_container</strong> directory created in the preparation step above:</p>
<pre>FROM fedora:28

# File Author / Maintainer
MAINTAINER Sven Boesiger &lt;[email protected]&gt;

# Update the sources
RUN dnf -y update --refresh

# Install additional dependencies
RUN dnf -y install libstdc++
RUN dnf -y autoremove

# Copy the application folder inside the container
ADD /titanic_tf_ml_model /titanic_tf_ml_model

# Get pip to download and install requirements:
RUN pip3 install -r /titanic_tf_ml_model/requirements.txt

# Expose ports
EXPOSE 5000

# Set the default directory where CMD will execute
WORKDIR /titanic_tf_ml_model

# Set the default command to execute
# when creating a new container
CMD python3 server.py</pre>
<p>Next, build the container image with the command:</p>
<pre>podman build -t ml_deployment .</pre>
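<p>If the build succeeds, the new image appears in your local image store, which you can verify with:</p>
<pre>podman images</pre>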
<h2>Run the container</h2>
<p>With the container image built and ready to go, you can run it locally with the command:</p>
<table class="wysiwyg-macro">
<tbody>
<tr>
<td class="wysiwyg-macro-body">
<pre>podman run -p 5000:5000 ml_deployment</pre>
</td>
</tr>
</tbody>
</table>
<p>Navigate to <a href="http://0.0.0.0:5000/ui">http://0.0.0.0:5000/ui</a> in your web browser to access the Swagger/Connexion UI and to test-drive the model:</p>
<p><img class="alignnone size-large wp-image-23037" src="http://www.sickgaming.net/blog/wp-content/uploads/2018/11/create-a-containerized-machine-learning-model.png" alt="" width="616" height="925" /></p>
<p>Of course, you can now also access the model from your own application via the REST API.</p>
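<p>For example, a request from the command line could look like the following. The passenger fields shown here are illustrative assumptions; the exact input schema is whatever the PredictionPost definition in swagger.yaml specifies:</p>
<pre># Hypothetical request; field names are assumptions, not the repo's actual schema
curl -X POST http://0.0.0.0:5000/survival_probability \
     -H "Content-Type: application/json" \
     -d '{"Age": 22, "Sex": 1, "Pclass": 3}'</pre>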