blackbelt352@lemmy.world to Technology@lemmy.world • OpenAI, Google, Anthropic admit they can't scale up their chatbots any further (English)
2 months ago

It's a lot. Like, a lot a lot. GPUs have around 150 billion transistors, but each transistor makes only a single connection, laid out in what is essentially a 2D plane on silicon.
Each neuron makes dozens of connections, and there are on the order of 100 billion neurons in a blobby lump of fat and neurons occupying 3D space. Combine that with the fact that everything the brain does emerges from patterns of multiple neurons firing together, and you get an absurdly high number representing the potential power of human brains.
At this point, I'm not sure there are enough GPUs in the world to mimic what a human brain can do.
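As a rough back-of-envelope sketch of the comparison above (the neuron count and "dozens of connections" figures are taken from the comment; the midpoint of 50 connections per neuron is an assumption for illustration, and real synapse counts per neuron are often cited much higher):

```python
# Back-of-envelope comparison of GPU transistor connections vs. brain synapses.
gpu_transistors = 150e9        # ~150 billion transistors, each ~1 connection
neurons = 86e9                 # "almost 100 billion" neurons (commonly cited ~86B)
synapses_per_neuron = 50       # "dozens" of connections per neuron (assumed value)

brain_connections = neurons * synapses_per_neuron

print(f"GPU connections:   {gpu_transistors:.1e}")
print(f"Brain connections: {brain_connections:.1e}")
print(f"Ratio: {brain_connections / gpu_transistors:.0f}x")
```

Even with this deliberately conservative per-neuron figure, the raw connection count comes out more than an order of magnitude beyond a single large GPU, before accounting for the firing-pattern dynamics the comment mentions.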
Technology Connections did a really good video a few years ago explaining how e-readers and eInk work and what draws people to them. It's really informative: it goes over a bit of the tech's history, roughly how it works, and whether it may or may not work for you.
https://youtu.be/dhRgw0HfrYU