Last week, we went over some basics of Artificial Intelligence (AI) using Ollama, Llama3, and some custom code. Artificial intelligence (AI) encompasses a broad range of technologies designed to enable machines to perform tasks that typically require human intelligence. These tasks include understanding spoken or written language, recognizing visual patterns, making decisions, and providing recommendations.

Machine learning (ML) is a specialized subset of AI that focuses on developing systems that improve their performance over time without being explicitly programmed. Instead, ML algorithms analyze and learn from large datasets to identify patterns and make decisions based on these insights. This learning process allows ML models to make increasingly accurate predictions or decisions as they are exposed to more data.
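The "learning from data rather than being explicitly programmed" idea can be shown in a few lines. This is a minimal sketch, not from last week's code: the rule y = 3x + 1 and the toy data are assumptions chosen purely to illustrate a model recovering parameters from examples.

```python
import numpy as np

# Instead of hard-coding the rule y = 3x + 1, we recover it from (x, y) pairs.
# The rule and the data are illustrative assumptions, not from the post.
x = np.arange(10.0)          # example inputs
y = 3.0 * x + 1.0            # outputs produced by the hidden rule

# Fit slope and intercept by least squares -- the "learning" step.
A = np.column_stack([x, np.ones_like(x)])
slope, intercept = np.linalg.lstsq(A, y, rcond=None)[0]

print(round(slope, 3), round(intercept, 3))  # → 3.0 1.0
```

With noise-free data the fit recovers the rule exactly; with real, noisy data the estimates simply get better as more examples are added, which is the point of the ML definition above.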
A few months ago, I added Liner to the resource page of my website. It allows you to easily train an ML model so that you can do image, text, audio, or video classification, object detection, image segmentation, or pose classification. I created “Is this Joe or Not Joe?” using that tool. TensorFlow.js runs client-side with a model trained on a half dozen example photos that are Joe and a half dozen example photos that are not Joe. You can supply a photo and get a prediction of whether or not Joe is in the image, and you can always retrain the existing model with more examples. That is an example of machine learning.
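The "Joe or not Joe" demo runs on TensorFlow.js, but the core idea — train a binary image classifier on a dozen labelled examples, then predict on a new photo — can be sketched in plain Python. The random arrays below are stand-ins for real photos, and the nearest-centroid rule is a deliberately simple substitute for the neural network the tool actually trains:

```python
import numpy as np

rng = np.random.default_rng(42)

# Stand-ins for a half dozen "Joe" photos and a half dozen "not Joe" photos,
# each flattened to a feature vector. Real photos would be pixel arrays.
joe     = rng.normal(loc=1.0, size=(6, 64))
not_joe = rng.normal(loc=-1.0, size=(6, 64))

# "Training": store the mean feature vector (centroid) of each class.
centroids = {"Joe": joe.mean(axis=0), "Not Joe": not_joe.mean(axis=0)}

def predict(photo):
    """Classify a new photo by its nearest class centroid."""
    return min(centroids, key=lambda label: np.linalg.norm(photo - centroids[label]))

print(predict(np.ones(64)))   # a vector near the "Joe" cluster → "Joe"
```

Retraining with more examples just means recomputing the centroids over the larger set — the same loop the browser demo runs when you add photos.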
So, you can think of ML as a subset of AI and Deep Learning (DL) as a subset of ML.
Have any questions, comments, etc.? Please feel free to drop a comment below.
📣 Exciting news, everyone! 🌟 Make sure to head over to this week's blog post "What's new in R 4.4.0?" by Russ Hyde, and dive into the world of the latest R release 📊🔬💻
Discover some of the amazing new features that this version has to offer! 🔍 🔭 🚀
A very nice picture shared by Ronald van Loon on X. You can debate whether the categories are complete and correct, but it illustrates that the field of AI is much more than just transformers/LLMs. #AI #Machinelearning #neuralnetworks #deeplearning #LLM #Transformers
Lots of people who work in #AI have, in their head, an idea about what sort of interaction with an #LLM might give them pause. The thing that might make them start to suspect that something interesting is happening.
Here's mine:
User: Tell me a cat joke.
LLM: Why did the cat join a band? He wanted to be a purr-cussionist.
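If you want to try that exchange yourself, the first post's Ollama + Llama 3 setup can serve it over Ollama's local HTTP API. A hedged sketch: it only builds and prints the request, since actually sending it assumes an Ollama server running on its default port with the llama3 model pulled.

```python
import json

# Build a request for Ollama's /api/generate endpoint.
payload = {
    "model": "llama3",
    "prompt": "Tell me a cat joke.",
    "stream": False,   # ask for one complete response instead of a token stream
}
body = json.dumps(payload)
print(body)

# To actually send it (requires a local Ollama server on the default port):
# import urllib.request
# req = urllib.request.Request(
#     "http://localhost:11434/api/generate",
#     data=body.encode(),
#     headers={"Content-Type": "application/json"},
# )
# print(json.loads(urllib.request.urlopen(req).read())["response"])
```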
The skforecast Python 🐍 library provides ML applications for time series forecasting using different regression models from the scikit-learn library. Here is a tutorial by Joaquín Amat Rodrigo and Javier Escobar Ortiz on time series forecasting with skforecast using XGBoost, LightGBM, Scikit-learn, and CatBoost models 🚀.
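The recursive strategy skforecast wraps around those regressors can be sketched with plain NumPy: build lagged features, fit a regressor on them, then feed each prediction back in as the next lag. Least squares stands in here for XGBoost/LightGBM/CatBoost, and the toy series is an assumption for illustration:

```python
import numpy as np

def fit_ar(series, lags):
    """Fit a linear autoregression y_t = sum(coef_i * y_{t-i}) by least squares."""
    X = np.column_stack(
        [series[lags - i:len(series) - i] for i in range(1, lags + 1)]
    )
    coef, *_ = np.linalg.lstsq(X, series[lags:], rcond=None)
    return coef

def forecast(series, coef, steps):
    """Recursive forecasting: each prediction becomes a lag for the next step."""
    history = list(series)
    preds = []
    for _ in range(steps):
        recent = history[::-1][:len(coef)]      # most recent values first
        nxt = float(np.dot(coef, recent))
        preds.append(nxt)
        history.append(nxt)
    return preds

series = np.arange(1.0, 11.0)            # toy series 1..10; real data goes here
coef = fit_ar(series, lags=2)
print(forecast(series, coef, steps=3))   # continues the trend: ≈ [11, 12, 13]
```

skforecast automates exactly this loop (plus backtesting and feature handling) for any scikit-learn-compatible regressor.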
:blobcat_think: I think I've figured out what's been bothering me about this: the text here implies data organises itself.
AI is both the dataset and the organising analysis and management structure that implements decisions/responses based on that dataset.
Where 'Cloud' is an empty marketing term and 'other people's computers' accurately states the real condition, this text here presents only a partial representation of what comprises an AI.
Assuming this is deliberate to highlight the mass theft of data, the use of "other people's" from the original phrase doesn't directly state that no permission was given for that use. Saying "Just stolen data" would make that point crystal clear.
Sorry. This is pure pedantry from me, but it really has been niggling at me since I saw this a week ago. Apparently I'll get no peace if I don't let this out!
[1/2] Surprising findings in brain research 🧠: As a team from #CharitéBerlin shows in #Science, thoughts in the human neocortex flow in one direction ⬆️, as opposed to the loops seen in mice 🔄. That makes processing information extra efficient. These discoveries could further the development of artificial neural networks.
Meta today released Llama 3, the next generation of the Llama model. Llama 3 is a state-of-the-art open-source large language model. Here are some of the key features of the model: 🧵👇🏼
A major release of Ollama: version 0.1.32 is out. The new version includes:
✅ Improved GPU utilization and memory management, increasing performance and reducing error rates
✅ Increased performance on Mac by scheduling large models between GPU and CPU
✅ Native AI support in Supabase edge functions