How to bundle a TensorFlow.js model in an npm package

dannadori
3 min read · Oct 5, 2021

Note:
This article is also available in Japanese:
https://zenn.dev/wok/articles/0021_bundle-tensorflowjs-model

Introduction

I’ve been creating npm packages that run various machine learning models in a WebWorker. However, after npm install, users still had to copy the tensorflowjs model files into place separately, which made installation a bit tedious.

I wondered whether I could bundle the model into the package with webpack, and it turned out I could. This article shows how.

Note: When bundling models created by others, please be careful about the license. Also, once a model is bundled, it becomes difficult to replace it with a new model later on.

Premise

I assume webpack 5 is used; I haven’t tried this with webpack 4.

Creating npm package

First, as preparation, place the tensorflowjs model inside the npm package. In this example, it goes under a folder called resources.

$ ls resources/bisenetv2-celebamask/
group1-shard1of3.bin group1-shard2of3.bin group1-shard3of3.bin model.json

The next step is to set the rules for bundling these model files in webpack.config.js. With webpack 5, url-loader and file-loader are no longer needed, and the configuration is simpler.

Use asset/source for the JSON file and asset/inline for the binary files.

rules: [
    { test: /\.ts$/, loader: "ts-loader" },
    { test: /resources\/.*\.bin/, type: "asset/inline" },
    { test: /resources\/.*\.json/, type: "asset/source" },
],
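For context, these rules live inside module.rules of the package’s webpack.config.js. A minimal sketch of the surrounding config (the entry, output, and library settings here are assumptions for illustration, not taken from the article):

```javascript
// Minimal webpack 5 config sketch; entry/output/library settings are assumptions.
module.exports = {
    mode: "production",
    entry: "./src/index.ts",
    resolve: { extensions: [".ts", ".js"] },
    module: {
        rules: [
            { test: /\.ts$/, loader: "ts-loader" },
            // Inline binaries as base64 data URLs, keep JSON as source text.
            { test: /resources\/.*\.bin/, type: "asset/inline" },
            { test: /resources\/.*\.json/, type: "asset/source" },
        ],
    },
    output: {
        filename: "index.js",
        libraryTarget: "umd",
    },
};
```

Note that for ts-loader to type-check the .bin and .json imports, the package typically also needs ambient module declarations (e.g. declare module "*.bin"; in a .d.ts file) — this is an assumption about the TypeScript setup, not something shown in the article.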

The code for the npm package looks like this:

import * as tf from '@tensorflow/tfjs';

import modelJson from "../resources/bisenetv2-celebamask/model.json" // <------ (1)
import modelWeight1 from "../resources/bisenetv2-celebamask/group1-shard1of3.bin" // <------ (2)
import modelWeight2 from "../resources/bisenetv2-celebamask/group1-shard2of3.bin" // <------ (2)
import modelWeight3 from "../resources/bisenetv2-celebamask/group1-shard3of3.bin" // <------ (2)

export class BiseNetV2 {
    model: tf.GraphModel | null = null
    canvas = document.createElement("canvas")

    init = async () => {
        const modelJson2 = new File([modelJson], "model.json", { type: "application/json" }) // <------ (3)
        const b1 = Buffer.from(modelWeight1.split(',')[1], 'base64') // <------ (4)
        const modelWeights1 = new File([b1], "group1-shard1of3.bin") // <------ (4)
        const b2 = Buffer.from(modelWeight2.split(',')[1], 'base64') // <------ (4)
        const modelWeights2 = new File([b2], "group1-shard2of3.bin") // <------ (4)
        const b3 = Buffer.from(modelWeight3.split(',')[1], 'base64') // <------ (4)
        const modelWeights3 = new File([b3], "group1-shard3of3.bin") // <------ (4)
        this.model = await tf.loadGraphModel(tf.io.browserFiles([modelJson2, modelWeights1, modelWeights2, modelWeights3])) // <------ (5)
    }

    predict = async (targetCanvas: HTMLCanvasElement, processWidth: number, processHeight: number): Promise<number[][]> => {
        <snip...>
    }
}

(1) imports the JSON model definition.
(2) imports a binary weight file.
(3) wraps the JSON text in a File object.
(4) decodes the base64 data of a binary file and wraps it in a File object.
(5) loads the tensorflowjs model from those files.
In this example, a GraphModel is loaded, but a LayersModel can be loaded in the same way (with tf.loadLayersModel).
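The decoding in step (4) works because asset/inline turns each binary into a base64 data URL string. A minimal sketch of that step, using a tiny made-up payload in place of real model weights:

```typescript
// asset/inline emits something like this for a binary file
// (the payload "AAEC" is a made-up 3-byte example, not real weights):
const dataUrl = "data:application/octet-stream;base64,AAEC";

// Step (4): drop the "data:...;base64," prefix, then decode the payload.
const base64Payload = dataUrl.split(",")[1];
const bytes = Buffer.from(base64Payload, "base64");

console.log(Array.from(bytes)); // [ 0, 1, 2 ]
```

One caveat: webpack 5 no longer auto-polyfills Node’s Buffer for browser targets, so a consuming bundle may need a Buffer polyfill (or the decoding can be done with atob instead).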

Creating Application using this npm package

If you just want to use the package normally, you can import it and instantiate the class with new. However, packages that bundle tensorflowjs models tend to be large, so it is best to load the package only when you are actually going to use it.

If you are using create-react-app, you can split the code so that the package is loaded only when it is used. If you use a dynamic import() as follows, create-react-app will split the module into a separate chunk.

const mod = await import('bisenetv2-js')
const lib = new mod.BiseNetV2()
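If several places in the app can trigger the load, it can help to memoize the dynamic import so the heavy chunk is fetched only once. A small generic sketch (the lazy helper and the stand-in loader below are hypothetical, not part of the package):

```typescript
// Memoizing lazy loader (sketch): loader() runs at most once, however many
// callers request the module.
function lazy<T>(loader: () => Promise<T>): () => Promise<T> {
    let cached: Promise<T> | null = null;
    return () => {
        if (!cached) cached = loader();
        return cached;
    };
}

// With the real package this would be:
//   const load = lazy(() => import('bisenetv2-js'));
// Here a stand-in loader counts how often it actually runs.
async function main() {
    let calls = 0;
    const load = lazy(async () => { calls += 1; return { name: "BiseNetV2" }; });
    await Promise.all([load(), load(), load()]);
    console.log(calls); // 1
}
main();
```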

Repository

The code described here is available in the following repository.

https://github.com/w-okada/bundle-tensorflowjs

Follow the README and try running the commands.

The amount of code is small, so you should be able to understand it in about 10 minutes while running it.

I am very thirsty!!

Finally

We have shown how to bundle tensorflowjs models. However, as mentioned at the beginning, bundling has both advantages and disadvantages. It may make installing packages easier, but it also has the following drawbacks.

  • Replacing models becomes more difficult.
  • The size of the package increases.
  • Models whose license forbids modification probably cannot be bundled.

Please weigh these trade-offs when deciding whether to bundle.
