Neural Network Workshop – Lab 8: What Do I Do with the Model?

Save the Model to Disk

Save the Model

  1. Create a JSON representation of your model and store it in a variable called model_json.
  2. Write the JSON representation to a file called model.json in the path specified in the variable outpath.

Reference Material

https://docs.python.org/3.6/library/os.path.html

https://www.pythonforbeginners.com/files/with-statement-in-python

https://www.pythonforbeginners.com/files/reading-and-writing-files-in-python

Hint 1

Serialize the model to JSON like this:

model_json = model.to_json()

Hint 2

Write the model to disk:

with open(os.path.join(outpath, "model.json"), "w") as json_file:
    json_file.write(model_json)

Full Solution
# serialize model to JSON
model_json = model.to_json()
with open(os.path.join(outpath, "model.json"), "w") as json_file:
    json_file.write(model_json)
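
To confirm the write succeeded, you can read the file back and compare it to `model_json`. A minimal sketch using only the standard library — the stub string below stands in for the real output of `model.to_json()`, and the temp directory stands in for the lab's `outpath`:

```python
import json
import os
import tempfile

# Stand-in for model.to_json(); the real string comes from your trained model.
model_json = json.dumps({"class_name": "Sequential", "config": []})

outpath = tempfile.mkdtemp()  # stand-in for the lab's outpath variable

# Write the JSON representation to model.json (same pattern as the solution).
with open(os.path.join(outpath, "model.json"), "w") as json_file:
    json_file.write(model_json)

# Read it back and verify the round trip.
with open(os.path.join(outpath, "model.json")) as json_file:
    restored = json_file.read()

print(restored == model_json)
```

In the lab itself you would pass the restored string to keras.models.model_from_json to rebuild the architecture.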

Pickle the Scaler

  1. Import the Pickle library.
  2. Using the dump method of pickle, write the StandardScaler to disk in a file named standardscalar.pickle in outpath.

Reference Material

https://docs.python.org/3.6/library/os.path.html

https://www.pythonforbeginners.com/files/with-statement-in-python

https://docs.python.org/2/library/pickle.html

Hint 1

Imports:

import pickle

Hint 2

Pickle the StandardScaler:

with open(os.path.join(outpath, "standardscalar.pickle"), 'wb') as handle:
    pickle.dump(sc, handle, protocol=pickle.HIGHEST_PROTOCOL)

Full Solution
# serialize the scaler to pickle
import pickle
with open(os.path.join(outpath, "standardscalar.pickle"), 'wb') as handle:
    pickle.dump(sc, handle, protocol=pickle.HIGHEST_PROTOCOL)
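
Pickling preserves the scaler's fitted state (its mean and scale), which is exactly what the prediction step needs later. A self-contained round-trip sketch — a minimal stand-in class is used here in place of scikit-learn's StandardScaler so it runs without sklearn, and a temp directory stands in for `outpath`:

```python
import os
import pickle
import tempfile

class StandInScaler:
    """Minimal stand-in for a fitted StandardScaler (not the sklearn class)."""
    def __init__(self, mean, scale):
        self.mean_ = mean
        self.scale_ = scale

sc = StandInScaler(mean=[4.5, 100.0], scale=[2.0, 30.0])
outpath = tempfile.mkdtemp()  # stand-in for the lab's outpath variable

# Same pattern as the solution: dump with the highest protocol.
with open(os.path.join(outpath, "standardscalar.pickle"), 'wb') as handle:
    pickle.dump(sc, handle, protocol=pickle.HIGHEST_PROTOCOL)

# Load it back; the fitted attributes survive the round trip.
with open(os.path.join(outpath, "standardscalar.pickle"), 'rb') as handle:
    restored = pickle.load(handle)

print(restored.mean_, restored.scale_)
```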

Save the Weights

  1. Use the save_weights method of the Sequential model to serialize the weights to disk.

Reference Material

https://keras.io/models/about-keras-models/

https://docs.python.org/3.6/library/os.path.html

Full Solution

Serialize the weights to HDF5 in the folder referenced by outpath:

# serialize weights to HDF5
model.save_weights(os.path.join(outpath, "model.h5"))


Predict for New Input

Send inputs to your model to perform a prediction

  1. Use the predict method of the Sequential model to generate a new prediction. Don’t forget to scale your inputs with the same StandardScaler used prior to training the model. Use the following parameters:
  • Route To Market_Other = 0
  • Route To Market_Reseller = 1
  • Route To Market Telecoverage = 0
  • Route To Market Telesales = 0
  • Sales Stage Change Count = 9
  • Total Days Identified Through Closing = 119
  • Total Days Identified Through Qualified = 97
  • Revenue From Client Past Two Years = 0
  • Ratio Days Identified To Total Days = .81
  • Ratio Days Validated To Total Days = .644
  • Ratio Days Qualified To Total Days = .245
  • Deal Size Category = 6
  2. Print either win or loss based on the result of the prediction.
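
The feature values above must be fed to the model in the same column order used during training. The row can be sketched as follows — the order shown matches the solution's array, but verify it against your own training dataframe:

```python
# Build the input row in training-column order (order assumed from the
# lab's solution; confirm it matches your training data).
features = [
    0,      # Route To Market_Other
    1,      # Route To Market_Reseller
    0,      # Route To Market Telecoverage
    0,      # Route To Market Telesales
    9,      # Sales Stage Change Count
    119,    # Total Days Identified Through Closing
    97,     # Total Days Identified Through Qualified
    0,      # Revenue From Client Past Two Years
    0.81,   # Ratio Days Identified To Total Days
    0.644,  # Ratio Days Validated To Total Days
    0.245,  # Ratio Days Qualified To Total Days
    6,      # Deal Size Category
]

print(len(features))
# In the lab, wrap this row and scale it before predicting:
#   model.predict(sc.transform(np.array([features])))
```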

Reference Material

https://keras.io/models/sequential/

http://scikit-learn.org/stable/modules/generated/sklearn.preprocessing.StandardScaler.html

Hint 1

Scale your input like this:

sc.transform(np.array([[???]]))

Hint 2

Make the prediction like this:

new_prediction = model.predict(???)

Hint 3

Output the result:

if new_prediction > 0.5:
    print("Won")
else:
    print("Loss")

Full Solution
# Make a prediction
new_prediction = model.predict(sc.transform(np.array([[0,1,0,0,9,119,97,0,.81,.644,.245,6]])))
if new_prediction > 0.5:
    print("Won")
else:
    print("Loss")


Lab Complete!

 

Extra Credit – Output the model

Visualize the model to a PNG file

# Visualize with keras
# to png
from keras.utils import plot_model
plot_model(model, to_file=os.path.join(outpath, 'model.png'))

Visualize the model to the console

# to console
from IPython.display import SVG
from keras.utils.vis_utils import model_to_dot

SVG(model_to_dot(model, show_shapes = True, show_layer_names = True).create(prog='dot', format='svg'))
