AI in container (Made with Fooocus)

Run pycoral inside docker

How to accelerate code in a containerised environment

Filippo Valle
3 min read · Jan 17, 2024


Problem

I wanted to run inference on Fashion MNIST using the new Google Coral accelerator.

In the realm of edge computing, the Coral Edge TPU stands out as a potent hardware accelerator for machine learning tasks. In this article, we delve into a Python script that harnesses the Coral Edge TPU to make predictions on a test dataset and visualises the model’s performance with a confusion matrix. Let’s break down the code and understand how the Coral Edge TPU is integrated into the workflow.

Device

The Google Coral is a USB-attached Edge TPU accelerator.

Google coral (Image by Coral website)

Dockerfile

The container uses Debian Buster as the base image, chosen for compatibility with all the necessary dependencies, and installs the required packages: the Coral Edge TPU runtime and compiler, Python packages from a requirements.txt file, and a few key utilities. It copies two scripts, convert.sh and run.sh, to the /home/ directory; run.sh is set as the default command when the container starts. The resulting image is ready for Coral Edge TPU development and application execution.

FROM debian:buster

RUN apt-get update && apt-get install --yes curl gpg
RUN echo "deb https://packages.cloud.google.com/apt coral-edgetpu-stable main" | tee /etc/apt/sources.list.d/coral-edgetpu.list
RUN curl https://packages.cloud.google.com/apt/doc/apt-key.gpg | apt-key add -
RUN apt-get update
RUN apt-get install --yes libedgetpu1-std
RUN apt-get install --yes python3-pycoral
RUN apt-get install --yes edgetpu-compiler
RUN apt-get install --yes usbutils
RUN apt-get install --yes python3-pip
RUN python3 -m pip install --no-cache-dir -U pip

COPY requirements.txt /home/data/requirements.txt
RUN python3 -m pip install --no-cache-dir -r /home/data/requirements.txt
COPY convert.sh /home/.
COPY run.sh /home/.

ENTRYPOINT ["sh"]
CMD ["/home/run.sh"]
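
Since the Coral is a USB device, the container needs access to the host’s USB bus at run time. A minimal build-and-run sketch (the image tag and the host data directory are assumptions, not part of the original repo):

```shell
# Build the image (tag name is hypothetical)
docker build -t coral-mnist .

# Pass the USB bus through so the container can see the Coral accelerator;
# --privileged is the simplest (if coarse-grained) way to grant device access.
docker run --rm --privileged \
    -v /dev/bus/usb:/dev/bus/usb \
    -v "$(pwd)/data":/home/data \
    coral-mnist
```

Inside the container, `lsusb` (from the usbutils package installed above) can be used to verify that the device is visible.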

Training

The training was done in a fairly standard fashion using Keras.

The model was then converted to TFLite using the TensorFlow tooling, with full-integer quantisation so it can run on the Edge TPU.

import tensorflow as tf

saved_keras_model = 'model.h5'
model.save(saved_keras_model)

def representative_dataset():
    # A small sample of real inputs lets the converter calibrate quantisation ranges
    for data in tf.data.Dataset.from_tensor_slices(X_test.numpy().reshape(-1, 28, 28, 1)).batch(1).take(100):
        yield [tf.dtypes.cast(data, tf.float32)]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.target_spec.supported_types = [tf.int8]  # needed for full-integer quantisation
converter.representative_dataset = representative_dataset
converter.inference_input_type = tf.uint8
converter.inference_output_type = tf.uint8
tflite_model = converter.convert()

with open('model.tflite', 'wb') as f:
    f.write(tflite_model)
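
The quantised model.tflite still has to be compiled for the Edge TPU before pycoral can map it onto the device; this is presumably what the convert.sh script referenced in the Dockerfile does. A minimal sketch (the exact contents of convert.sh are an assumption):

```shell
# Hypothetical convert.sh: compile the quantised model for the Edge TPU.
# Produces model_edgetpu.tflite next to the input file and prints a summary
# of which ops were mapped to the TPU.
edgetpu_compiler -s model.tflite
```

Ops that cannot be mapped to the Edge TPU fall back to the CPU, so the compiler summary is worth checking.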

Run inference

The inference is performed using the tools available in the pycoral Python package.


import os
import numpy as np
from pycoral.adapters import common
from pycoral.adapters import classify
from pycoral.utils.edgetpu import make_interpreter

os.chdir("/home/data/")
os.environ["DYLD_LIBRARY_PATH"] = "/usr/local/lib"

X_test = np.loadtxt("X_test.txt")
Y_test = np.loadtxt("Y_test.txt")

interpreter = make_interpreter('model_edgetpu.tflite', device=":0")
interpreter.allocate_tensors()
width, height = common.input_size(interpreter)

def pred(X_data):
    # Copy one image into the input tensor, run the TPU, return the top class id
    common.set_input(interpreter, X_data.reshape((width, height, 1)))
    interpreter.invoke()
    return classify.get_classes(interpreter, top_k=1)[0].id

y_pred = [pred(x_test) for x_test in X_test.reshape(-1, 28, 28, 1)]
y_real = Y_test
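
With y_real and y_pred in hand, the confusion matrix mentioned in the introduction can be built with plain NumPy. A minimal sketch (the helper name and the assumption of ten Fashion MNIST classes are mine):

```python
import numpy as np

def confusion_matrix(y_true, y_pred, n_classes=10):
    """Rows are true labels, columns are predicted labels."""
    cm = np.zeros((n_classes, n_classes), dtype=int)
    for t, p in zip(y_true, y_pred):
        cm[int(t), int(p)] += 1
    return cm

# Toy example with three classes: one misclassification (true 1 -> predicted 2)
cm = confusion_matrix([0, 1, 1, 2], [0, 1, 2, 2], n_classes=3)
accuracy = np.trace(cm) / cm.sum()
```

The diagonal counts correct predictions, so `np.trace(cm) / cm.sum()` gives the overall accuracy; the matrix itself can then be plotted, e.g. as a heatmap.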

Repo

All the code to run this model is available at

https://github.com/fvalle1/coral-docker-mnist/tree/main


Filippo Valle

Interested in physics, ML applications, community detection and coding. I have a Ph.D. in Complex Systems for Life Sciences.