eu8@lemmy.world to Selfhosted@lemmy.world • Guide: Self-hosting open source GPT chat with no GPU using GPT4All • 1 year ago

Take my answer with a grain of salt, but I'm pretty sure that if you have a GPU you can run the same models, and they should run more efficiently for you. The only difference is that you can also run some of the larger models.