(2/3) What is the main conclusion from this amazing technology? Hollywood as we know it is going to change. Whether it takes months or years, the current process of making movies, and of producing similar forms of entertainment, will change.
Last year's Writers Guild of America strike was the swan song of this dying industry. The early adopters of GenAI technology in this industry are the ones who will have jobs in the long term.
Today's AI systems excel at narrow tasks, such as playing chess or generating text that sounds like something written by a human. But they lack the sort of common sense that would allow them to operate seamlessly in a messy world, do more sophisticated reasoning, and be more helpful to humans.
This short tutorial by Sam Witteveen provides a brief intro to LangGraph 🦜, LangChain's new 🐍 library. The tutorial focuses on the foundations of LangGraph: StateGraph, nodes and edges, the agent executor, and the agent supervisor.
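To make the core abstraction concrete, here is a minimal sketch of a StateGraph with a single node and edge. This is not the tutorial's code: the State schema and the increment node are toy stand-ins, assuming a recent langgraph release.

```python
# Minimal LangGraph sketch: one node, one edge (toy example, not from the tutorial).
from typing import TypedDict
from langgraph.graph import StateGraph, END

class State(TypedDict):
    count: int

def increment(state: State) -> dict:
    # A node takes the current state and returns a state update.
    return {"count": state["count"] + 1}

graph = StateGraph(State)
graph.add_node("increment", increment)
graph.set_entry_point("increment")  # execution starts here
graph.add_edge("increment", END)    # a plain edge ends the run

app = graph.compile()
print(app.invoke({"count": 0}))     # {'count': 1}
```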
A new crash course on getting started with #CUDA from #Python by Jeremy Howard 🚀. CUDA is NVIDIA's programming model for parallel computing on GPUs. It is used by #PyTorch, #TensorFlow, and other #deeplearning and LLM frameworks to speed up calculations. The course covers the following topics (a minimal kernel sketch follows the list):
✅ Setting up CUDA
✅ CUDA foundations
✅ Working with kernels
✅ CUDA with PyTorch
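For a taste of what writing a kernel from Python looks like, here is a hedged sketch using Numba's CUDA support. This is not course code, and Numba is just one common route (the course also covers kernels through PyTorch, as listed above). Assumes numba, numpy, and a CUDA-capable GPU.

```python
# Hedged sketch (not course code): an element-wise add kernel via Numba.
import numpy as np
from numba import cuda

@cuda.jit
def add_kernel(x, y, out):
    i = cuda.grid(1)       # global thread index
    if i < x.size:         # guard threads past the end of the array
        out[i] = x[i] + y[i]

n = 1_000_000
x = np.ones(n, dtype=np.float32)
y = 2 * np.ones(n, dtype=np.float32)
out = np.empty_like(x)

threads_per_block = 256
blocks = (n + threads_per_block - 1) // threads_per_block
add_kernel[blocks, threads_per_block](x, y, out)  # Numba handles the GPU transfers
print(out[:3])  # [3. 3. 3.]
```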
A new LangChain course by Krish Naik and freeCodeCamp. The course focuses on the functionality of LangChain, with a practical example of setting up a chatbot using Streamlit with the following LLMs (a minimal chatbot sketch follows the list):
✅ OpenAI's GPT-3.5 and GPT-4
✅ Llama2
✅ Google Gemini Pro
✅ Working with Hugging Face models
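As a rough illustration of the pattern the course builds on, here is a minimal Streamlit chatbot backed by one of the listed models through LangChain. This is not the course's code; the model name is an example, and it assumes the langchain-openai and streamlit packages plus an OPENAI_API_KEY in the environment.

```python
# Hedged sketch (not course code): a bare-bones Streamlit + LangChain chatbot.
import streamlit as st
from langchain_openai import ChatOpenAI

st.title("LangChain chatbot demo")

# Example model; the course also wires up Llama2, Gemini Pro, and Hugging Face models.
llm = ChatOpenAI(model="gpt-3.5-turbo")

prompt = st.text_input("Ask me anything:")
if prompt:
    response = llm.invoke(prompt)  # returns an AIMessage
    st.write(response.content)
```

Save it as app.py and run it with `streamlit run app.py`.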
What is the best way, platform, or tool for an early-stage researcher who wants a personal site to link repositories and show a CV, but also to blog about papers and ideas informally?
AI metaphors lurk even in seemingly innocuous statements like "chatbots don't understand what isn't programmed into their datasets."
Words like "understand" and "programmed" don't reflect what's going on under the hood with large language models.
OK, we can't all say "an overfit embedding yields misleading centroids," but we all do need to recognize metaphor's power and perils. This paper is a good start: https://arxiv.org/abs/2401.08711
The LangGraph series provides a short introduction to the LangGraph 🦜🔗 library, covering the following topics (a small routing sketch follows the list):
✅ The library core functionality
✅ Agent executor
✅ Dynamically returning a tool output directly
✅ Managing agent steps
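To illustrate the dynamic-routing idea behind items like returning a tool output directly, here is a hedged sketch of LangGraph conditional edges. The work node and the should_continue router are illustrative names, not code from the series.

```python
# Hedged sketch: conditional edges route execution at runtime (toy example).
from typing import TypedDict
from langgraph.graph import StateGraph, END

class State(TypedDict):
    steps: int

def work(state: State) -> dict:
    return {"steps": state["steps"] + 1}

def should_continue(state: State) -> str:
    # Loop back into the node until three steps have run, then stop.
    return "work" if state["steps"] < 3 else END

graph = StateGraph(State)
graph.add_node("work", work)
graph.set_entry_point("work")
graph.add_conditional_edges("work", should_continue)

app = graph.compile()
print(app.invoke({"steps": 0}))  # {'steps': 3}
```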
(1/2) DeciDiffusion 2.0 is a new diffusion-based text-to-image generation model by Deci AI. According to Deci AI, the model is 2.6 times faster than Stable Diffusion v1.5, reaching comparable quality in 40% fewer iterations. A hedged loading sketch follows the spec.
Model spec:
➡️ A 732 million-parameter model.
➡️ Reduced latency thanks to an optimized architecture and scheduler.
➡️ Designed to run optimally on affordable hardware, such as Qualcomm’s Cloud AI 100.
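For context, loading such a model typically looks like the sketch below, using Hugging Face diffusers. The model id and the custom_pipeline argument are assumptions based on common model-card patterns, not confirmed usage; check Deci AI's model card for the exact snippet.

```python
# Hedged sketch: loading a text-to-image pipeline with diffusers.
# The model id and custom_pipeline below are assumptions; see the model card.
import torch
from diffusers import DiffusionPipeline

pipe = DiffusionPipeline.from_pretrained(
    "Deci/DeciDiffusion-v2-0",                  # assumed Hub id
    custom_pipeline="Deci/DeciDiffusion-v2-0",  # assumed custom pipeline
    torch_dtype=torch.float16,
).to("cuda")

image = pipe("an astronaut riding a horse on the moon").images[0]
image.save("astronaut.png")
```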
The Machine Learning with Graphs course by Prof. Jure Leskovec from Stanford University (CS224W) focuses on methods for analyzing massive graphs and complex networks and for extracting insights using machine learning models and data mining techniques. 🧵🧶👇🏼
(2/3) The course includes 47 lectures and covers topics such as the following (a toy GNN sketch follows the list):
✅ ML applications for graphs
✅ Graph neural networks (GNN)
✅ Knowledge graph completion
✅ Recommendation with GNN
✅ Geometric deep learning
✅ Link prediction and causality
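As a flavor of the models the course covers, here is a hedged sketch of a two-layer graph convolutional network (GCN) for node classification using PyTorch Geometric. This is not course material; the toy graph and dimensions are made up for illustration.

```python
# Hedged sketch (not course code): a two-layer GCN with PyTorch Geometric.
import torch
import torch.nn.functional as F
from torch_geometric.nn import GCNConv

class GCN(torch.nn.Module):
    def __init__(self, in_dim: int, hidden_dim: int, num_classes: int):
        super().__init__()
        self.conv1 = GCNConv(in_dim, hidden_dim)
        self.conv2 = GCNConv(hidden_dim, num_classes)

    def forward(self, x, edge_index):
        x = F.relu(self.conv1(x, edge_index))  # aggregate neighbor features
        return self.conv2(x, edge_index)       # per-node class logits

# Toy graph: 3 nodes with 8 features each, 2 undirected edges (as directed pairs).
x = torch.randn(3, 8)
edge_index = torch.tensor([[0, 1, 1, 2],
                           [1, 0, 2, 1]])

model = GCN(in_dim=8, hidden_dim=16, num_classes=2)
print(model(x, edge_index).shape)  # torch.Size([3, 2])
```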
(4/4) Along with the book, course lecture slides and Python notebooks are available on the book website and GitHub, respectively.
🔗 https://github.com/udlbook/udlbook