NVIDIA TensorRT Inference Server
URI: https://gptkb.org/entity/NVIDIA_TensorRT_Inference_Server
GPTKB entity
Statements (33)
gptkbp:instanceOf
    gptkb:software
    machine learning inference server
gptkbp:developer
    gptkb:NVIDIA
gptkbp:feature
    gptkb:HTTP/REST_API
    gptkb:gRPC_API
    model versioning
    dynamic batching
    GPU and CPU support
    metrics and logging
    model repository management
    multi-framework support
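
The HTTP/REST and gRPC APIs listed above are how clients reach the server. As a minimal sketch of the HTTP side: the snippet below assumes a local server on the conventional default HTTP port 8000 exposing v1-style paths /api/health/live, /api/health/ready and /api/status; these defaults are assumptions for illustration, not data from this page.

    # Minimal sketch of probing the server over its HTTP/REST API.
    # Assumptions (not taken from this page): localhost, default HTTP
    # port 8000, and v1-style endpoints /api/health/live,
    # /api/health/ready and /api/status.
    import requests

    BASE = "http://localhost:8000"  # assumed default HTTP port

    def is_live() -> bool:
        """Liveness probe: the server process is up."""
        return requests.get(f"{BASE}/api/health/live").status_code == 200

    def is_ready() -> bool:
        """Readiness probe: models are loaded and servable."""
        return requests.get(f"{BASE}/api/health/ready").status_code == 200

    def server_status() -> str:
        """Fetch the status report covering loaded models and versions."""
        resp = requests.get(f"{BASE}/api/status")
        resp.raise_for_status()
        return resp.text

    if __name__ == "__main__":
        print("live:", is_live())
        print("ready:", is_ready())
        print(server_status())
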
gptkbp:firstReleased
    2018
https://www.w3.org/2000/01/rdf-schema#label
    NVIDIA TensorRT Inference Server
gptkbp:latestReleaseVersion
    gptkb:Triton_Inference_Server
gptkbp:license
    gptkb:Apache_License_2.0
gptkbp:operatingSystem
    gptkb:Linux
gptkbp:platform
    gptkb:TensorFlow
    gptkb:OpenVINO
    gptkb:TensorRT
    gptkb:MXNet
    gptkb:PyTorch
    gptkb:ONNX
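
The platform list reflects the server's multi-framework model repository: each model gets its own directory with numbered version subdirectories holding a framework-specific artifact. The sketch below lays out such a skeleton; the model names and per-backend file names (model.plan, model.graphdef, model.onnx, model.pt) are illustrative assumptions following common repository conventions, not statements from this page.

    # Hypothetical sketch: create a skeleton multi-framework model repository.
    # Model names and per-backend file names are assumptions for illustration.
    from pathlib import Path

    REPO = Path("model_repository")  # assumed repository root given to the server

    # model name -> (version, framework-specific model file)
    MODELS = {
        "resnet50_trt": ("1", "model.plan"),      # TensorRT engine
        "inception_tf": ("1", "model.graphdef"),  # TensorFlow GraphDef
        "yolo_onnx":    ("1", "model.onnx"),      # ONNX model
        "bert_pt":      ("1", "model.pt"),        # PyTorch TorchScript
    }

    for name, (version, model_file) in MODELS.items():
        version_dir = REPO / name / version        # numbered dirs give model versioning
        version_dir.mkdir(parents=True, exist_ok=True)
        (REPO / name / "config.pbtxt").touch()     # per-model configuration file
        (version_dir / model_file).touch()         # placeholder for the real artifact
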
gptkbp:programmingLanguage
    gptkb:Python
    gptkb:C++
gptkbp:renamed
    gptkb:NVIDIA_Triton_Inference_Server
gptkbp:successor
    gptkb:NVIDIA_Triton_Inference_Server
gptkbp:uses
    AI model deployment
    deep learning inference
    cloud inferencing
    edge inferencing
gptkbp:website
    https://developer.nvidia.com/nvidia-tensorrt-inference-server
gptkbp:bfsParent
    gptkb:NVIDIA_Triton_Inference_Server
gptkbp:bfsLayer
    6