OK, for a while I've been thinking I'd like access to a GPU to do some local #ml jobs. Probably don't need to train a bunch of models from scratch, but I can imagine wanting to do some fine-tuning.
Right now I use Whisper on CPU and image generators like Stable Diffusion via pre-existing endpoints. I can imagine in the future my heaviest use will be image generators like SDXL and text-to-speech, and I might also want the GPU for high-quality speech-to-text workflows.
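For context, my current CPU-only Whisper flow is roughly the sketch below (assuming the openai-whisper package; the model size and file name are just placeholders). The `device` argument is basically the whole reason I want a local card:

```python
# Rough sketch of a Whisper transcription job (openai-whisper package).
# "medium" and the file name are placeholders; device="cuda" is what a
# local GPU would unlock. ffmpeg must be installed for audio decoding.
import torch
import whisper

device = "cuda" if torch.cuda.is_available() else "cpu"

# Load the model onto whichever device is available.
model = whisper.load_model("medium", device=device)

# Transcribe a local file and print the recognized text.
result = model.transcribe("meeting.mp3")
print(result["text"])
```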
I've got [this small media server](https://pcpartpicker.com/user/pineapple_incident_030489/saved/#view=hVxLyc) with a free PCIe slot. Anyone know if I can just throw something like [this](https://pcpartpicker.com/product/4DkH99/msi-ventus-2x-black-oc-geforce-rtx-4060-ti-16-gb-video-card-rtx-4060-ti-ventus-2x-black-16g-oc) in there and call it a day?
@weebull Power supply seems like one of the cheaper things to upgrade (though maybe not one of the physically easiest ones, considering that it plugs into everything 😛)
@pganssle indeed. I was thankful that by going with the same brand (Corsair) I was able to just unplug everything at the power supply end and then plug in the new one.
@pganssle probably not with that power supply. The card says 165W on paper, but it'll spike well above that under load.
I've done something similar with an RX 7600 and a 450W supply. It was fine for 3D, but SDXL caused the overcurrent protection to trip. I've since upgraded to a 750W unit and have seen short spikes of 300W+ just for the GPU.
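If anyone wants to watch for those spikes themselves on an NVIDIA card like that 4060 Ti, a rough sketch is to poll NVML via the nvidia-ml-py (pynvml) bindings, as below. Keep in mind the readings are sampled and averaged over short windows, so the millisecond transients that trip a PSU can be even higher than what this prints:

```python
# Rough sketch: poll GPU board power via NVML (pip install nvidia-ml-py).
# Sampled readings are averaged over short windows, so true millisecond
# transients can exceed what this prints.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

peak_w = 0.0
try:
    while True:
        # nvmlDeviceGetPowerUsage returns milliwatts.
        watts = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0
        peak_w = max(peak_w, watts)
        print(f"now: {watts:6.1f} W   peak: {peak_w:6.1f} W", end="\r")
        time.sleep(0.1)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```

From the shell, `nvidia-smi --query-gpu=power.draw --format=csv -l 1` gives much the same picture. (My RX 7600 numbers came from AMD's tooling instead, so the above is NVIDIA-specific.)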