Nvidia has released an early version of Chat with RTX, a demo app that lets users run a personal AI chatbot locally on their PC. The app can ingest YouTube video transcripts and personal documents, then answer questions and generate summaries based on the user's own data. It requires an RTX 30- or 40-series GPU with at least 8GB of VRAM.

While still rough around the edges, Chat with RTX has real potential as a data-research tool, particularly for journalists and anyone doing document analysis. It can search through video transcripts and summarize entire videos, though the current version has some bugs, and it can scan PDFs to help users check facts quickly and efficiently.

For now, the app still feels like a developer demo. It installs a web server and a Python instance on the user's PC and uses the Tensor cores on an RTX GPU to speed up queries. Installation takes around 30 minutes, and the app itself is quite large. There are known issues and limitations, such as inaccurate source attribution and an inability to remember context across follow-up questions.

Despite these limitations, Chat with RTX showcases the potential of AI chatbots running locally on PCs, offering an alternative to subscription-based services for analyzing personal files.
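The pattern behind tools like this (retrieve the most relevant chunks of a local file, then answer against them) can be illustrated with a minimal sketch. This is emphatically not Nvidia's implementation: it uses only the Python standard library, and the word-overlap scoring below is a crude stand-in for the embedding-based retrieval a real system would run on the GPU.

```python
import re
from collections import Counter

def chunk_text(text, size=10):
    """Split a document into fixed-size word chunks (a stand-in for the
    smarter chunking a real retrieval pipeline would use)."""
    words = text.split()
    return [" ".join(words[i:i + size]) for i in range(0, len(words), size)]

def score(query, chunk):
    """Crude relevance score: count query words that appear in the chunk.
    A real local chatbot would compare vector embeddings instead."""
    q = Counter(re.findall(r"\w+", query.lower()))
    c = Counter(re.findall(r"\w+", chunk.lower()))
    return sum(min(q[w], c[w]) for w in q)

def retrieve(query, document, top_k=1):
    """Return the top_k chunks of the document most relevant to the query."""
    chunks = chunk_text(document)
    return sorted(chunks, key=lambda ch: score(query, ch), reverse=True)[:top_k]

if __name__ == "__main__":
    doc = ("The app requires an RTX 30- or 40-series GPU with "
           "8GB of VRAM. Installation takes about 30 minutes. "
           "It can summarize YouTube transcripts and scan PDFs.")
    # Prints the chunk most relevant to the question.
    print(retrieve("Does it summarize YouTube transcripts?", doc))
```

The same retrieve-then-generate loop is what lets a local app answer questions about a folder of PDFs without sending anything to a cloud service; the retrieval step just narrows the prompt down to the passages worth showing the model.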
