
Dockerized AI on Large Models With NLP and Transformers

19 May 2021 · CPOL · 5 min read
In this article, we run an inference model for Natural Language Processing (NLP) using models persisted on a Docker volume. We continue to tackle large models, this time for NLP tasks with PyTorch and Transformers.
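To illustrate the idea, here is a minimal sketch of NLP inference with a model cache kept on a mounted Docker volume, so the weights are downloaded once and reused across container runs. The mount point /models, the volume name, and the sentiment-analysis model are illustrative assumptions, not details taken from the article.

```python
import os

# Point the Hugging Face cache at the mounted volume before importing transformers,
# so downloaded weights land on the persistent volume instead of the container layer.
os.environ["TRANSFORMERS_CACHE"] = "/models"

from transformers import pipeline

# First run downloads the weights into /models; later runs reuse the cached copy.
classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

print(classifier("Dockerized NLP inference works nicely with a persistent model cache."))
```

A container built around a script like this could be started with something like `docker run -v mlmodels:/models my-nlp-image` (names are hypothetical), attaching the named volume that holds the cached model.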

This article is part of the series "Containerized AI and Machine Learning".

License

This article, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)


Written By
Architect
Poland
Jarek has two decades of professional experience in software architecture and development, machine learning, business and system analysis, logistics, and business process optimization.
He is passionate about creating software solutions with complex logic, especially with the application of AI.
