Course Catalog: TensorFlow Lite for Embedded Linux Training
        Course Outline:

          TensorFlow Lite for Embedded Linux Training

        Introduction

        TensorFlow Lite's game-changing role in embedded systems and IoT
        Overview of TensorFlow Lite Features and Operations

        Addressing limited device resources
        Default and expanded operations
        Setting up TensorFlow Lite

        Installing the TensorFlow Lite interpreter
        Installing other TensorFlow packages
        Working from the command line vs Python API
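
        A minimal setup check in Python, assuming the standalone tflite-runtime package from PyPI is used on the target board (the exact wheel depends on the board's architecture and Python version):

            # Shell step on the device (or in a cross-build environment):
            #   python3 -m pip install tflite-runtime
            # Python step: confirm the interpreter is importable.
            from tflite_runtime.interpreter import Interpreter

            print(Interpreter)  # prints the Interpreter class if the install succeeded
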
        Choosing a Model to Run on a Device

        Overview of pre-trained models: image classification, object detection, smart reply, pose estimation, segmentation
        Choosing a model from TensorFlow Hub or other source
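
        One practical way to vet a candidate model is to load it and inspect its input/output signature before wiring it into an application. A sketch, assuming a .tflite file has already been downloaded; mobilenet_v2.tflite is a placeholder name:

            from tflite_runtime.interpreter import Interpreter

            interpreter = Interpreter(model_path="mobilenet_v2.tflite")
            interpreter.allocate_tensors()

            # Check that the input shape, dtype, and output size fit the application.
            print(interpreter.get_input_details()[0]["shape"])   # e.g. [1 224 224 3]
            print(interpreter.get_input_details()[0]["dtype"])   # e.g. uint8 or float32
            print(interpreter.get_output_details()[0]["shape"])  # e.g. [1 1001] class scores
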
        Customizing a Pre-trained Model

        How transfer learning works
        Retraining an image classification model
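
        A minimal Keras transfer-learning sketch: a pre-trained backbone is frozen and only a new classification head is trained. MobileNetV2, the 5-class head, and train_ds are illustrative placeholders rather than course-mandated choices:

            import tensorflow as tf

            # Feature-extraction backbone pre-trained on ImageNet, with weights frozen.
            base = tf.keras.applications.MobileNetV2(
                input_shape=(224, 224, 3), include_top=False, weights="imagenet")
            base.trainable = False

            # New classification head for the target classes (5 is a placeholder).
            model = tf.keras.Sequential([
                base,
                tf.keras.layers.GlobalAveragePooling2D(),
                tf.keras.layers.Dense(5, activation="softmax"),
            ])

            model.compile(optimizer="adam",
                          loss="sparse_categorical_crossentropy",
                          metrics=["accuracy"])

            # train_ds is assumed to be a tf.data.Dataset of (image, label) batches,
            # e.g. from tf.keras.utils.image_dataset_from_directory(...).
            # model.fit(train_ds, epochs=5)
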
        Converting a Model

        Understanding the TensorFlow Lite format (size, speed, optimizations, etc.)
        Converting a model to the TensorFlow Lite format
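
        A conversion sketch using the tf.lite.TFLiteConverter API; saved_model_dir is a placeholder path to the trained model from the previous module:

            import tensorflow as tf

            # Convert a SavedModel into the FlatBuffer .tflite format.
            # from_keras_model(model) works the same way for an in-memory Keras model.
            converter = tf.lite.TFLiteConverter.from_saved_model("saved_model_dir")
            tflite_model = converter.convert()

            with open("model.tflite", "wb") as f:
                f.write(tflite_model)
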
        Running a Prediction Model

        Understanding how the model, interpreter, input data work together
        Calling the interpreter from a device
        Running data through the model to obtain predictions
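
        An inference sketch that ties the three pieces together: the converted model file, the interpreter, and the input data. The zero-filled array stands in for a real preprocessed camera frame or sensor reading:

            import numpy as np
            from tflite_runtime.interpreter import Interpreter

            interpreter = Interpreter(model_path="model.tflite")
            interpreter.allocate_tensors()

            input_details = interpreter.get_input_details()
            output_details = interpreter.get_output_details()

            # Feed one input tensor of the expected shape and dtype.
            input_data = np.zeros(input_details[0]["shape"],
                                  dtype=input_details[0]["dtype"])
            interpreter.set_tensor(input_details[0]["index"], input_data)

            interpreter.invoke()
            predictions = interpreter.get_tensor(output_details[0]["index"])
            print(predictions)
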
        Accelerating Model Operations

        Understanding on-board acceleration, GPUs, etc.
        Configuring delegates to accelerate operations
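
        A sketch of attaching a delegate through the Python API; the shared-library name below is a placeholder, since the actual delegate .so depends on which accelerator the board exposes (GPU, NNAPI, Edge TPU, etc.):

            from tflite_runtime.interpreter import Interpreter, load_delegate

            try:
                # Hand the hardware delegate to the interpreter at construction time.
                delegate = load_delegate("libtensorflowlite_gpu_delegate.so")  # placeholder name
                interpreter = Interpreter(model_path="model.tflite",
                                          experimental_delegates=[delegate])
            except (ValueError, OSError):
                # Fall back to plain CPU execution if the delegate cannot be loaded.
                interpreter = Interpreter(model_path="model.tflite")

            interpreter.allocate_tensors()
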
        Adding Model Operations

        Using TensorFlow Select to add operations to a model
        Building a custom version of the interpreter
        Using custom operators to write or port new operations
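
        A sketch of enabling TensorFlow Select (Flex) ops at conversion time; running the resulting model then requires an interpreter built with the Flex delegate, which is why a custom interpreter build is covered alongside it. saved_model_dir is again a placeholder:

            import tensorflow as tf

            converter = tf.lite.TFLiteConverter.from_saved_model("saved_model_dir")
            converter.target_spec.supported_ops = [
                tf.lite.OpsSet.TFLITE_BUILTINS,  # prefer the built-in TFLite kernels
                tf.lite.OpsSet.SELECT_TF_OPS,    # fall back to TensorFlow ops when needed
            ]
            tflite_model = converter.convert()
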
        Optimizing the Model

        Understanding the balance of performance, model size, and accuracy
        Using the Model Optimization Toolkit to optimize the size and performance of a model
        Post-training quantization
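
        A post-training quantization sketch using the optimization hooks built into the converter; the dynamic-range quantization shown here is the simplest variant, and saved_model_dir is a placeholder:

            import tensorflow as tf

            # Weights are stored as 8-bit integers, shrinking the model file and
            # usually speeding up CPU inference at a small cost in accuracy.
            converter = tf.lite.TFLiteConverter.from_saved_model("saved_model_dir")
            converter.optimizations = [tf.lite.Optimize.DEFAULT]
            quantized_model = converter.convert()

            with open("model_quant.tflite", "wb") as f:
                f.write(quantized_model)
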
        Troubleshooting

        Summary and Conclusion