Large language models can have billions, or even trillions, of parameters. But how big do they need to be to achieve acceptable performance? To find out, I experimented with several of Google’s Gemma 3 models, all small enough to run locally on a single GPU. Specifically, I used the 1 […]
Making LLMs Useful with Function Calls and Embeddings
Large language model AIs like Google’s Gemini and OpenAI’s GPT can be interesting to play around with, even in a simple chatbot like ChatGPT. But those chatbots largely waste their potential. Their understanding of natural language is impressive, but their “knowledge” is limited to what they were trained on. But […]
No, AI is not going to destroy the world in 10 years.
The Fallacy of the Chinese Room
If you spend much time reading about artificial intelligence and the philosophy of mind, you will hear about John Searle’s Chinese Room thought experiment. Published in 1980, it argues that however well a computer imitates a mind, it cannot actually think. The argument goes something […]