1st generation programming languages – MACHINE LANGUAGE In the beginning there was machine language – 1s and 0s only, sometimes entered by manually connecting cables into the right spots and setting the correct switches the right way (e.g., ENIAC). This is what is meant by a 1st generation programming language. 2nd gen languages – ASSEMBLY… Continue reading 5th generation programming language?
a tale of complexity
Let me tell you a tale – a tale of magic and power. OK, maybe there isn’t any real magic to this other than the magic described by Arthur C. Clarke in this quote: “Any sufficiently advanced technology is indistinguishable from magic.” My reference to power is in the sense of what you as a third… Continue reading a tale of complexity
chatGPT code hallucinations
By this point, many people understand the concept of an AI “hallucination”. This is the term that has come to describe incorrect information stated as fact in the output of a chatGPT prompt. For example, if you ask chatGPT “Who is Brian Toone?”, you get the response below, which has some correct information, but… Continue reading chatGPT code hallucinations
Llama, Llama! (Red Pajama)
This was a fun book we used to read to our kids … but in other exciting news, Meta approved my research request for access to the LLaMA pre-trained large language model. 4/18 5:27pm. I started using the llama download script to download all the data, but the download.sh script doesn’t work by default on… Continue reading Llama, Llama! (Red Pajama)
Win11 officeGPT
Through my efforts to install the Stanford Alpaca system on my M1 mac studio that I dubbed homeGPT, I discovered that some of the required packages are configured to only use CUDA … in other words I need an NVIDIA graphics card. One of my research computers from school is a Windows 11 system with… Continue reading Win11 officeGPT
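The excerpt above hinges on a runtime requirement: some packages only run on CUDA, which means an NVIDIA GPU is mandatory. A minimal sketch of probing for the available compute backend (assuming PyTorch is the framework in question, as in the Alpaca setup; the function name is illustrative, not from the original posts):

```python
def best_torch_device() -> str:
    """Report the best available PyTorch backend.

    Returns 'cuda' on NVIDIA machines, 'mps' on Apple Silicon builds
    with Metal support, and 'cpu' otherwise (including when torch is
    not installed at all).
    """
    try:
        import torch
    except ImportError:
        # No PyTorch in this environment; CUDA-only packages won't run.
        return "cpu"
    if torch.cuda.is_available():
        return "cuda"
    mps = getattr(torch.backends, "mps", None)
    if mps is not None and mps.is_available():
        return "mps"
    return "cpu"
```

Running a probe like this up front makes the failure mode obvious: on the M1 Mac Studio it would report "mps" or "cpu", never "cuda", which is exactly why the CUDA-only packages forced a move to the Windows 11 machine with an NVIDIA card.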
homeGPT, cont’d
Stanford Alpaca takes the LLaMA pre-trained model released by Meta AI and “fine-tunes” the model to respond better to natural language questions and prompts (exactly what chatGPT does). Note, the screenshot above is from LLaMA, not from Alpaca. This shows you the value that Alpaca adds, as the output from the example LLaMA is… Continue reading homeGPT, cont’d
ChatGPT programming
I’m still waiting to hear back from Meta (Facebook) about my application for the LLaMA dataset. I can install all the tools, but the pre-trained model, which took months to train, is behind an application form. It was leaked online via 4chan, and someone then submitted a GitHub pull request to update the AI research… Continue reading ChatGPT programming
HomeGPT, part 1 – PyTorch
3/21 6:48AM. The genie truly is out of the bottle. I am in the process of installing LLaMA on my home Mac Studio. Here’s the play-by-play. LLaMA is based on PyTorch, so in this part, I install and activate the latest version of PyTorch optimized for my M1 Ultra. Some notes … pyenv vs virtualenv… Continue reading HomeGPT, part 1 – PyTorch
Improper access control
That’s a lot of duplicates! I’ve written a script to remove duplicates. I’ve run that script multiple times, and every few months, when I go to create a new lifetime overlay of all my rides, I notice the duplicates have reappeared. Well, it’s been a couple of months and they are back! My first thought was… Continue reading Improper access control
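The original post doesn’t show the deduplication script itself, but a minimal sketch of one common approach is below, assuming the rides live as individual files (e.g., GPX exports) and that duplicates are byte-for-byte identical copies; the folder layout and function name are assumptions for illustration:

```python
import hashlib
from pathlib import Path


def remove_duplicate_files(folder: str) -> list[str]:
    """Delete files whose contents hash identically to an earlier file.

    Scans the folder in sorted order, keeps the first copy of each
    distinct content hash, unlinks the rest, and returns the paths
    that were removed.
    """
    seen: dict[str, Path] = {}
    removed: list[str] = []
    for path in sorted(Path(folder).iterdir()):
        if not path.is_file():
            continue
        digest = hashlib.sha256(path.read_bytes()).hexdigest()
        if digest in seen:
            path.unlink()          # duplicate content: remove this copy
            removed.append(str(path))
        else:
            seen[digest] = path    # first time we've seen this content
    return removed
```

A content-hash approach like this only catches exact duplicates; if an upstream service re-exports the same ride with slightly different metadata, the hashes differ and the copies survive, which is one plausible reason duplicates keep reappearing.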
iMac woes
Early in the summer I found my late 2014 iMac had restarted itself once or twice on its own with “Unexpected Shutdown” messages. One time, it failed to reboot and took me to the Recovery screen with Disk Utility as an option. I ended up “checking” the drives and it reported no errors and so… Continue reading iMac woes