#FakeIDs #NeuralNetworks: "An underground website called OnlyFake is claiming to use “neural networks” to generate realistic-looking photos of fake IDs for just $15, radically disrupting the marketplace for fake identities and cybersecurity more generally. This technology, which 404 Media has verified produces fake IDs nearly instantly, could streamline everything from bank fraud to laundering stolen funds.
In our own tests, OnlyFake created a highly convincing California driver's license, complete with whatever arbitrary name, biographical information, address, expiration date, and signature we wanted. The photo even gives the appearance that the ID card is lying on a fluffy carpet, as if someone has placed it on the floor and snapped a picture, which many sites require for verification purposes. 404 Media then used another fake ID generated by this site to successfully step through the identity verification process on OKX. OKX is a cryptocurrency exchange that has recently appeared in multiple court records because of its use by criminals.
Rather than painstakingly crafting a fake ID by hand—a highly skilled criminal profession that can take years to master—or waiting for a purchased one to arrive in the mail with the risk of interception, OnlyFake lets essentially anyone generate, in minutes, fake IDs that may seem real enough to bypass various online verification systems. Or at least fool some people."
"So-called “neural networks” are extremely expensive, poorly understood, unfixably unreliable, deceptive, data hungry, and inherently limited in capabilities.
Quite interesting but confusing, as I come from #backpropagation DL.
If I got it right, the authors focus on showing how and why biological neural networks would benefit from being Energy-Based Models doing Predictive Coding, rather than Feedforward Networks trained with backpropagation.
It took me a while to find the part where they explain how to optimize a ConvNet in PyTorch as an EB model, but they do: there is an algorithm and formulae. I'm still curious about how long and how stable training is, and whether all of that generalizes to typical computer vision architectures (ResNets, MobileNets, ViTs, ...).
Code is also #opensource at https://github.com/YuhangSong/Prospective-Configuration
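While I'm at it, here's my attempt at the general recipe in PyTorch, just to fix ideas: a generic predictive-coding / energy-based training loop with a squared-error energy. A minimal sketch of the algorithm family as I understand it, not the paper's exact prospective-configuration rule (autograd is used only as a convenience for the local derivatives):

```python
import torch

# Minimal predictive-coding sketch: each layer predicts the next layer's
# activity, and the energy is the summed squared prediction error.
layers = torch.nn.ModuleList([torch.nn.Linear(784, 256),
                              torch.nn.Linear(256, 10)])
opt = torch.optim.SGD(layers.parameters(), lr=1e-3)

def energy(acts):
    # E = sum_l || x_{l+1} - f_l(x_l) ||^2
    return sum(((layers[l](acts[l]) - acts[l + 1]) ** 2).sum()
               for l in range(len(layers)))

def train_step(x, y_onehot, n_relax=20, lr_x=0.1):
    # Initialise activities with a forward pass; clamp input and target.
    acts = [x]
    for f in layers:
        acts.append(f(acts[-1]).detach())
    acts[-1] = y_onehot
    hidden = [a.requires_grad_(True) for a in acts[1:-1]]

    # Inference: relax the hidden activities to lower the energy.
    for _ in range(n_relax):
        E = energy([x] + hidden + [y_onehot])
        grads = torch.autograd.grad(E, hidden)
        with torch.no_grad():
            for h, g in zip(hidden, grads):
                h -= lr_x * g

    # Learning: one weight step at the relaxed state. No global backward
    # pass through the network; each layer's update is local to its errors.
    opt.zero_grad()
    energy([x] + [h.detach() for h in hidden] + [y_onehot]).backward()
    opt.step()
```

The two nested optimisations are the point: activities settle first (inference), and only afterwards do the weights move (learning).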
I'd like to sit at my laptop for a few hours and try to see and understand it better, but I think in the next few days I'll move on to Modern #HopfieldNetworks. These too are EB, and there's an energy function that is optimised by the #transformer's dot-product attention.
I think I've got what attention does in Transformers, so I'm quite curious to see in what sense it's equivalent to consolidating/retrieving patterns in a Dense Associative Memory. In general, I think we're treating memory wrong in our deep neural networks. I see most of them as sensory processing, a shortcut to "reasoning" without short- or long-term memory surrogates, but I can see how some current features may serve similar purposes...
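For future me: the retrieval step usually cited for that equivalence (Ramsauer et al., "Hopfield Networks is All You Need") is literally one softmax attention step with the stored patterns playing both keys and values. A tiny sketch, with beta standing in for the 1/sqrt(d) scaling:

```python
import torch

def hopfield_retrieve(xi, X, beta=1.0, n_steps=1):
    # xi: (batch, d) state/query; X: (num_patterns, d) stored patterns.
    # One update is exactly softmax attention with K = V = X.
    for _ in range(n_steps):
        xi = torch.softmax(beta * xi @ X.T, dim=-1) @ X
    return xi

X = torch.randn(16, 64)                    # 16 stored patterns
noisy = X[3] + 0.3 * torch.randn(64)       # corrupted copy of pattern 3
out = hopfield_retrieve(noisy[None], X, beta=8.0)
print(torch.cosine_similarity(out, X[3][None]))  # ~1.0: pattern retrieved
```

So "retrieving a pattern from memory" and "attending over a sequence" are the same computation; what changes is what you load into X.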
The Machine Learning with Graphs course by Prof. Jure Leskovec from Stanford University (CS224W) focuses on different methods for analyzing massive graphs and complex networks and extracting insights using machine learning models and data mining techniques. 🧵🧶👇🏼
(2/3) The course includes 47 lectures, and it covers topics such as:
✅ ML applications for graphs
✅ Graph neural networks (GNN) (see the sketch after this list)
✅ Knowledge graph completion
✅ Recommendation with GNN
✅ Geometric deep learning
✅ Link prediction and causality
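For a feel of what the GNN lectures build toward, here is a minimal message-passing layer in plain PyTorch. My own sketch, not course material (CS224W itself works with PyTorch Geometric): each node averages its neighbours' features, concatenates them with its own, and applies a shared linear map.

```python
import torch

class MeanAggGNNLayer(torch.nn.Module):
    # One round of message passing with mean aggregation. A dense
    # adjacency matrix keeps the sketch dependency-free.
    def __init__(self, d_in, d_out):
        super().__init__()
        self.lin = torch.nn.Linear(2 * d_in, d_out)

    def forward(self, H, A):
        # H: (num_nodes, d_in) node features; A: (num_nodes, num_nodes).
        deg = A.sum(dim=1, keepdim=True).clamp(min=1)
        neigh = (A @ H) / deg                        # mean over neighbours
        return torch.relu(self.lin(torch.cat([H, neigh], dim=-1)))

# Two stacked layers give each node a 2-hop receptive field.
A = torch.tensor([[0., 1, 0],
                  [1, 0, 1],
                  [0, 1, 0]])                        # path graph 0-1-2
H = torch.randn(3, 8)
H = MeanAggGNNLayer(8, 16)(H, A)
H = MeanAggGNNLayer(16, 16)(H, A)
```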
JOSS publishes articles about open-source research software. It is a free, open-source, community-driven, and developer-friendly online journal. JOSS reviews involve downloading and installing the software, and inspecting the repository and the submitted paper for key elements.
Please reach out if you are interested in reviewing this paper or know someone who could.
Henry Markram, of spike timing dependent plasticity (STDP) fame and infamous for the Human Brain Project (HBP), just got a US patent for "Constructing and operating an artificial recurrent neural network": https://patents.google.com/patent/US20230019839A1/en
How is that not something thousands of undergrads are doing with PyTorch every week?
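For context, "constructing and operating" a recurrent network is a homework-sized snippet of stock PyTorch:

```python
import torch

# Construct a recurrent artificial neural network...
rnn = torch.nn.RNN(input_size=32, hidden_size=64, batch_first=True)

# ...and operate it.
x = torch.randn(8, 100, 32)   # batch of 8 sequences, 100 time steps
output, h_n = rnn(x)
print(output.shape)           # torch.Size([8, 100, 64])
```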
The goal, says the patent text, is «methods and processes for constructing and operating a recurrent artificial neural network that acts as a "neurosynaptic computer"». The specific "neurosynaptic computer" might seem patentable; the overreach of patenting the construction and operation of an RNN as such is, instead, ludicrous.
It seems likely that the legal office at Markram's research institution overreached and got away with it. Good luck enforcing this patent, though: Markram did not invent RNNs.
Artificial Intelligence in the Age of Neural Networks and Brain Computing, Second Edition demonstrates that the present disruptive implications and applications of AI are a development of the unique attributes of neural networks, mainly machine learning, distributed architectures, massively parallel processing, black-box inference, intrinsic nonlinearity, and smart autonomous search engines.
#AI #ML #NeuralNetworks #Automata: "The way we understand what artificial intelligence is and how we design it has serious implications for society. Marta Peirano reviews the origins and evolution of AI, and addresses its problems and dangers in this article, taken from the catalogue of the exhibition AI: Artificial Intelligence." https://lab.cccb.org/en/the-double-life-of-artificial-intelligence/
Now that #NeuralNetworks have had repeated big successes over the last 15 years, we are starting to look for better ways to implement them. Some that were new to me:
#Groq notes that NNs are bandwidth-bound on the path from memory to the GPU. They built an LPU specifically designed for #LLMs: https://groq.com/
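The bandwidth point is easy to sanity-check with back-of-envelope numbers (illustrative figures of my own, not Groq's): at batch size 1, generating one token means streaming every weight through the chip once, so decoding speed is capped by memory bandwidth, not FLOPs.

```python
# Back-of-envelope decode ceiling for a hypothetical setup.
params = 70e9                  # 70B-parameter model
bytes_per_param = 2            # fp16 weights
bandwidth = 2e12               # ~2 TB/s, roughly HBM-class memory
tokens_per_s = bandwidth / (params * bytes_per_param)
print(f"{tokens_per_s:.1f} tokens/s")   # ~14.3 tokens/s, whatever the FLOPs
```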
A wild one: exchange the silicon for moving parts and good old Newtonian physics. A dramatic drop in power use, and it maps to most NN architectures (h/t @FMarquardtGroup).
Are you interested in cortico-basal ganglia networks and would like to model them, but only have a basic proficiency in Python or computational modeling in general?
Well then, I’m happy to announce the release of CBGTPy, a software package for running biologically realistic simulations of cortico-basal ganglia-thalamic (CBGT) networks across a range of tasks. It's the latest tool out of our Exploratory Intelligence group at CMU, the University of Pittsburgh, and the University of the Balearic Islands (Spain).