Squiggly Boi — fossilesque@mander.xyz to Science Memes@mander.xyz, English · 2 months ago · 68 comments
Tamo240@programming.dev · 2 months ago: It's an abstraction for neural networks. Different individual networks might vary in the number of layers (columns), nodes (circles), or weighted connections (lines), but the concept is consistent across all.
NotANumber@lemmy.dbzer0.com · 2 months ago (edited): Kinda but also no. That's specifically a dense neural network or MLP. It gets a lot more complicated than that in some cases.
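The diagram the commenters describe — columns of circles joined by lines — maps directly onto code. Here is a minimal sketch of a dense network (MLP) forward pass in NumPy; the layer sizes, function names, and ReLU activation are illustrative choices, not anything specified in the thread:

```python
import numpy as np

def relu(x):
    # Common activation applied at each hidden node (circle)
    return np.maximum(0.0, x)

def mlp_forward(x, weights, biases):
    """Forward pass through a dense network.

    Each weight matrix is one set of 'lines' connecting two
    adjacent 'columns' of nodes in the diagram.
    """
    for W, b in zip(weights[:-1], biases[:-1]):
        x = relu(W @ x + b)              # hidden layers
    return weights[-1] @ x + biases[-1]  # linear output layer

# Example: 3 inputs -> 4 hidden nodes -> 2 outputs
rng = np.random.default_rng(0)
weights = [rng.standard_normal((4, 3)), rng.standard_normal((2, 4))]
biases = [np.zeros(4), np.zeros(2)]
y = mlp_forward(rng.standard_normal(3), weights, biases)
print(y.shape)
```

Varying the length of `weights` changes the number of layers, and the matrix shapes change the number of nodes per layer — which is why the same picture stands in for many different concrete networks, while architectures like CNNs or transformers (the second commenter's point) no longer fit this picture.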