The Cost of AI and Solving it Locally

A presentation at THAT Conference, WI 2023 in Wisconsin Dells, WI, USA by Matt Williams

A new tool comes out daily offering magical answers to every question through AI. But at what cost? Samsung learned the hard way that anything you share with ChatGPT may be surfaced for the next person to find. Midjourney and other tools make it obvious that if you are not paying, you don't own the results. Now companies are barring their employees from using these services, with dismissal as the penalty. So what are people to do? One option is to run the tools locally. But do you know how? In this session we will look at how Large Language Models work at a high level and the dangers they pose. Then we will review a number of the great tools out there for running models locally. We will end the session with a look at Ollama, a new LLM runner that is changing the game by applying Docker technologies to this new world.
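To give a feel for how simple local inference can be, here is a minimal sketch of querying a locally running Ollama server from Python. It assumes a default install serving on http://localhost:11434 and that a model such as llama2 has already been downloaded with "ollama pull llama2"; the model name and prompt are illustrative, not from the talk.

    # Minimal sketch: send one prompt to a local Ollama server and print the reply.
    # Assumes Ollama is running on its default port and "llama2" has been pulled.
    import json
    import urllib.request

    def generate(prompt: str, model: str = "llama2") -> str:
        """Call Ollama's /api/generate endpoint and return the full response text."""
        payload = json.dumps(
            {"model": model, "prompt": prompt, "stream": False}  # stream=False -> single JSON reply
        ).encode()
        req = urllib.request.Request(
            "http://localhost:11434/api/generate",
            data=payload,
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.loads(resp.read())["response"]

    print(generate("Why run a large language model locally?"))

Because everything runs on your own machine, the prompt and the response never leave it, which is exactly the privacy argument the session makes.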

Resources

The following resources were mentioned during the presentation or provide useful additional information.