Meta LLaMA 7B on Win 11, RTX 3070
I’ve been playing with Stable Diffusion locally for a while and meant to write a post about getting torch-directml set up to work on WSL with CUDA, but haven’t gotten around to it. This is basically the same process, with even easier instructions to follow. Here’s a Reddit post documenting the process: [D] Tutorial: Run LLaMA …