Offline version of Chat GPT
lidd1ejimmy@lemmy.ml to Memes@lemmy.ml · 4 months ago · 5 comments
neidu2@feddit.nl · 4 months ago
Technically possible with a small enough model to work from. It's going to be pretty shit, but "working". Now, if we were to go further down in scale, I'm curious how/if a 700 MB CD version would work, or how many 1.44 MB floppies you would need for the actual program and the smallest viable model.
lidd1ejimmy@lemmy.ml (OP) · 4 months ago
Yes, I guess it would be a funny experiment for just a local model.
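The floppy count asked about above is just a ceiling division once you pick a model size. A minimal sketch, assuming a hypothetical ~250 MB quantized model plus a ~5 MB runtime (both sizes are assumptions for illustration, not measured figures); note a "1.44 MB" floppy actually holds 1440 KiB = 1,474,560 bytes:

```python
import math

# A high-density 3.5" floppy: 80 tracks x 2 sides x 18 sectors x 512 bytes
FLOPPY_BYTES = 1_474_560

def floppies_needed(size_bytes: int) -> int:
    """Number of 1.44 MB floppies to hold size_bytes, ignoring filesystem overhead."""
    return math.ceil(size_bytes / FLOPPY_BYTES)

# Hypothetical example: ~250 MB quantized model + ~5 MB inference program
model_bytes = 250 * 1024 * 1024
runtime_bytes = 5 * 1024 * 1024
print(floppies_needed(model_bytes + runtime_bytes))  # → 182
```

Real-world overhead (FAT12 filesystem structures, a split/spanning tool) would push the count a bit higher, so treat the result as a lower bound.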