Walk through the steps to author, optimize, and deploy a custom TensorFlow Lite model to mobile using best practices and the latest developer tooling. This includes using the model authoring APIs, applying and debugging model optimization techniques such as quantization, benchmarking on a real device, and deploying to Android.
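The conversion and quantization steps mentioned above can be sketched roughly as follows with the TensorFlow Lite converter; the toy Keras model, its shapes, and the output file name are illustrative, not from the session:

```python
# Sketch: convert a Keras model to TensorFlow Lite with post-training
# dynamic-range quantization. The model below is a stand-in; substitute
# your own trained model before converting.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation="relu", input_shape=(4,)),
    tf.keras.layers.Dense(1),
])

# Create a converter from the in-memory Keras model.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
# Optimize.DEFAULT enables dynamic-range quantization by default.
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()

# The result is a serialized FlatBuffer ready to bundle into an app.
with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```

The resulting `.tflite` file is what you would benchmark on-device and ship with your Android app.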
Resources:
Converting your model - goo.gle/3M5yux0
Analyze your converted model - goo.gle/3L76HLc
TensorFlow Model Optimization - goo.gle/39T3iCX
Quantization Debugger - goo.gle/3N4uE7h
Performance best practices - goo.gle/3Lgcxdv
TensorFlow Lite website - goo.gle/37BhVdk
TensorFlow Forum - goo.gle/3L0RxY0
TensorFlow website - goo.gle/3KejoUZ
Follow on Twitter - goo.gle/3sq7a4C
Speakers: Arun Venkatesan, Yu-Cheng Ling, Adam Koch
Watch more:
All Google I/O 2022 Sessions → goo.gle/IO22_A...
ML/AI at I/O 2022 playlist → goo.gle/IO22_M...
All Google I/O 2022 technical sessions → goo.gle/IO22_S...
Subscribe to TensorFlow → goo.gle/Tensor...
#GoogleIO
Deploy a custom Machine Learning model to mobile