Creating an ML model, saving it, and serving it with a Flask API


PYTHON CODE FOR CREATING AND SAVING MODEL:
# Part 1

from sklearn.neighbors import KNeighborsClassifier

X = [[0], [1], [2], [3]] # Features
y = [0, 0, 1, 1] # Labels / Classes 

neigh = KNeighborsClassifier(n_neighbors=3)
neigh.fit(X, y) # Returns "KNeighborsClassifier(...)"

print(neigh.predict([[1.1]])) # Prints "[0]"

print(neigh.predict_proba([[0.9]])) # Prints "[[0.66666667 0.33333333]]": 0.67 probability that the input belongs to class "0".

# Part 2
# Saving the model.

from joblib import dump

dump(neigh, "model.joblib") # It will create the file "model.joblib" in the present working directory.

# Ref 1: https://scikit-learn.org/stable/modules/generated/sklearn.neighbors.KNeighborsClassifier.html
# Ref 2: https://stackoverflow.com/questions/33497314/sklearn-dumping-model-using-joblib-dumps-multiple-files-which-one-is-the-corre#
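As a sanity check, the dumped file can be loaded back and should reproduce the original model's predictions. A minimal round-trip sketch (self-contained, re-training the same toy model from Part 1):

```python
from joblib import dump, load
from sklearn.neighbors import KNeighborsClassifier

# Re-create the toy model from Part 1
X = [[0], [1], [2], [3]]
y = [0, 0, 1, 1]
neigh = KNeighborsClassifier(n_neighbors=3).fit(X, y)

# Round-trip through joblib
dump(neigh, "model.joblib")
restored = load("model.joblib")

# The restored model behaves identically to the original
print(restored.predict([[1.1]]))        # [0]
print(restored.predict_proba([[0.9]]))  # [[0.66666667 0.33333333]]
```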

EXECUTION LOGS:
(base) C:\Users\ashish\Desktop\demo>dir

 Directory of C:\Users\ashish\Desktop\demo

03/21/2020  10:37 AM               678 creating_and_saving_ml_model.py
               1 File(s)            678 bytes
               2 Dir(s)  39,185,403,904 bytes free

(base) C:\Users\ashish\Desktop\demo>python creating_and_saving_ml_model.py
[0]
[[0.66666667 0.33333333]]

(base) C:\Users\ashish\Desktop\demo>dir

03/21/2020  10:37 AM               678 creating_and_saving_ml_model.py
03/21/2020  10:38 AM               784 model.joblib

SERVER PYTHON CODE:

from joblib import load
from flask import Flask, jsonify, request
from flask_cors import CORS, cross_origin

model_path = r'model.joblib'

app = Flask(__name__)
cors = CORS(app)
app.config['CORS_HEADERS'] = 'Content-Type'

@app.route("/classify", methods=['POST'])
@cross_origin()
def pleaseClassify():
 print("Content-Type: " + request.headers['Content-Type'])
 if request.headers['Content-Type'] == 'text/plain':
  return "Text Message: " + request.data.decode('utf-8') # "request.data" is bytes in Python 3, so decode before concatenating

 elif request.headers['Content-Type'] == 'application/json':
  print("inputArray: {}".format(request.json['inputArray']))

 elif request.headers['Content-Type'] == 'application/octet-stream':
  f = open('./binary', 'wb')
  f.write(request.data)
  f.close()
  print("Binary message written!")
  return "Binary message written!" # Return here: "request.json" below is None for this content type

 elif request.headers['Content-Type'] == 'application/x-www-form-urlencoded; charset=UTF-8':
  print(request.form)
  return "Form data received." # Return here for the same reason

 else:
  return "415 Unsupported Media Type ;)"
 
 predicted_label = clf.predict(request.json['inputArray'])
 
 prediction_proba = clf.predict_proba(request.json['inputArray'])
 
 predictions = { "label": predicted_label, "proba": prediction_proba } 

 # return Response(str(predictions), status=200, mimetype='application/json') # Throws a JSON parsing error in the browser: numpy arrays are not valid JSON
 return jsonify(str(predictions))

if __name__ == "__main__":
 model_path = model_path.replace('\\', '/')
 clf = load(model_path)
 app.run(host = "0.0.0.0", port = 65535)
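The commented-out Response line fails because numpy arrays are not valid JSON; wrapping with str() works around it, but the client then receives a quoted Python repr rather than parseable JSON. One option (an assumption, not what the server above does) is to convert the arrays with .tolist() before serializing. A standalone sketch, with stand-in values for the model outputs:

```python
import json
import numpy as np

# Stand-ins for what clf.predict / clf.predict_proba return (numpy arrays)
predicted_label = np.array([0])
prediction_proba = np.array([[0.66666667, 0.33333333]])

# tolist() turns numpy arrays into plain Python lists, which are JSON-serializable
payload = {"label": predicted_label.tolist(), "proba": prediction_proba.tolist()}

print(json.dumps(payload))  # valid JSON that a browser or requests client can parse
```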

CLIENT PYTHON CODE:

import requests
headers = {'content-type': 'application/json'}

URL = "http://127.0.0.1:65535/classify"
r = requests.post(url = URL, json = { "inputArray": [[0.95]] }, headers = headers) # "json=" serializes the dict as the request body
print("Response text: " + r.text)


CODE FILES LOCATION:
(base) C:\Users\ashish\Desktop\demo>dir

03/21/2020  11:03 AM               240 client.py
03/21/2020  10:39 AM               680 creating_and_saving_ml_model.py
03/21/2020  10:38 AM               784 model.joblib
03/21/2020  11:02 AM             1,474 server.py
      
SERVER LOGS:

(base) C:\Users\ashish\Desktop\demo>python server.py
 * Serving Flask app "server" (lazy loading)
 * Environment: production
   WARNING: This is a development server. Do not use it in a production deployment.
   Use a production WSGI server instead.
 * Debug mode: off
 * Running on http://0.0.0.0:65535/ (Press CTRL+C to quit)
Content-Type: application/json
inputArray: [[0.95]]
127.0.0.1 - - [21/Mar/2020 11:00:42] "POST /classify HTTP/1.1" 200 -


CLIENT LOGS:

(base) C:\Users\ashish\Desktop\demo>python client.py
Response text: "{'label': array([0]), 'proba': array([[0.6667, 0.3333]])}"
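Note that the response text above is a JSON string wrapping a Python repr, so r.json() on the client yields a str, not a dict. A sketch demonstrating this, with the response body reproduced inline as a literal:

```python
import json

# The server does jsonify(str(predictions)), so the body is a JSON *string*
# whose content is a Python repr, e.g.:
body = "\"{'label': array([0]), 'proba': array([[0.6667, 0.3333]])}\""

decoded = json.loads(body)
print(type(decoded))  # <class 'str'> -- still a string, not a dict
print(decoded)        # {'label': array([0]), 'proba': array([[0.6667, 0.3333]])}
```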


CODE:
Link to Code
