It looks like !buildapc community isn’t super active so I apologize for posting here. Mods, let me know if I should post there instead.
I built my first PC when I was, I think, 10-11 years old. Built my next PC after that and then sort of moved toward pre-made HP/Dell/etc. My last PC's mobo just gave out and I'm looking to replace the whole thing. I've read over the last few years that prefabs from HP/Dell/etc. have gone to shit and don't really work like they used to. Since I'm looking to expand comfortably, I've been thinking of giving building my own a go again.
I remember, when I was a young lad, there were two big pain points when putting a rig together: motherboard alignment with the case (I shorted two mobos by letting them touch the bare metal of the grounded case; not sure how that happened but it did) and CPU pin alignment so you don't bend any pins when inserting the CPU into the socket.
Since it’s been several decades since my last build, what are some things I should be aware of? Things I should avoid?
For example, I only recently learned what M.2 SSDs are. My desktop has (had) SATA 3.5" drives, only one of which is an SSD.
I'll admit I am a bit overwhelmed by some of my choices. I've spent some time on pcpartpicker and feel very overwhelmed by some of the options. Most of my time is spent in code development (primarily containers and node). I am planning on installing Linux (Ubuntu, most likely) and I am hoping to tinker with some AI models, something I haven't been able to do with my now-broken desktop due to its age. For ML/AI, I know I'll need some sort of GPU, knowing only that NVIDIA cards require closed-source drivers. While I fully support FOSS, I'm not an OSS purist and fully accept that using closed-source drivers on Linux may be unavoidable. Happy to take recommendations on GPUs!
Since I also host a myriad of self hosted apps on my desktop, I know I’ll need to beef up my RAM (I usually go the max or at least plan for the max).
My main requirements:
- Intel i7 processor (I’ve tried i5s and they can’t keep up with what I code; I know i9s are the latest hotness but don’t think the price is worth it; I’ve also tried AMD processors before and had terrible luck. I’m willing to try them again but I’d need a GOOD recommendation)
- At least 3 SATA ports so that I can carry my drives over
- At least one M.2 port (I cannibalized a laptop I recycled recently and grabbed the 1TB M.2 card)
- On-board Ethernet/NIC (on-board wifi/bluetooth not required, but won’t complain if they have them)
- Support at least 32 GB of RAM
- GPU that can support some sort of ML/AI with DisplayPort (preferred)
Nice to haves:
- MoBo with front USB 3 ports but will accept USB 2 (C vs A doesn’t matter)
- On-board sound (I typically use headphones or bluetooth headset so I don’t need anything fancy. I mostly listen to music when I code and occasionally do video calls.)
I threw together this list: https://pcpartpicker.com/list/n6wVRK
It didn’t matter to me if it was in stock; just wanted a place to start. Advice is very much appreciated!
EDIT: WOW!! I am shocked and humbled by the great advice I’ve gotten here. And you’ve given me a boost in confidence in doing this myself. Thank you all and I’ll keep replying as I can.
There is no need for a separate sound card now; it's built into the motherboard.
Compared to those pain points, building a modern PC should be a breeze. CPUs go in Zero Insertion Force sockets, so as long as you remember to lift the little lever you won't bend any pins. People don't even wear static discharge wrist straps anymore (although it couldn't hurt) or worry about shorting things out. And power connectors only fit one way, unlike the old AT power connector.
Speaking of a breeze, your only pain point might be making sure you have enough air circulation for cooling all that gear.
I remember working on a PC back in my Geek Squad days that had a lever.
For air circulation, what should I be on the lookout for? Making sure I have clearances, of course, but should I buy more fans than I need?
Cases usually have fans preinstalled that should be fine. Just pay attention to the direction: have them all blow air front to back. There is usually an arrow indicating which way a fan moves air.
Run a benchmark after building the PC and check temperatures.
Don't just look at temperatures though, look at clock speed too. 95°C+ is normal for modern high-end CPUs (AMD 7000-series chips actively try to run at that temp under full load). What you want to make sure of is that it's not throttling.
If this is a server and you don’t want your thermal paste to be toast in a year then I’d suggest lowering the maximum temperature in the bios if it lets you.
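If you want a quick way to eyeball the throttling question on Linux, something like this works as a rough indicator (a minimal sketch; the sysfs paths are the common ones but vary by platform, and not every thermal zone is the CPU). Run it while a benchmark is going and watch whether the clocks sag as the temperature climbs:

```python
# Rough throttle check: sample CPU temperature and current clocks under load.
# Assumes the usual sysfs layout; adjust the globs for your board if needed.
import glob, time

def read_int(path):
    with open(path) as f:
        return int(f.read().strip())

for _ in range(10):
    # thermal_zone*/temp is in millidegrees C; scaling_cur_freq is in kHz
    temps = [read_int(p) / 1000 for p in glob.glob("/sys/class/thermal/thermal_zone*/temp")]
    freqs = [read_int(p) / 1000 for p in glob.glob("/sys/devices/system/cpu/cpu*/cpufreq/scaling_cur_freq")]
    if temps and freqs:
        print(f"hottest zone: {max(temps):5.1f} C   slowest core: {min(freqs):6.0f} MHz")
    time.sleep(2)
```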
That, and you need to decide how much positive or negative pressure you want in there as well. You could always do some calculations: treat your case as an open control volume where mass can transfer across the boundaries. Then the sum of air going into and out of the case must equal the rate of change of air in the case. Assuming the volume of air in your case is constant, that term is zero.

So you can look at the rated volume flow rate for each fan (CFM, i.e. cubic feet per minute) and see if the summation is positive or negative. A positive value would mean "positive pressure" and a negative value "negative pressure".

The only problem is if the fans are not running at max RPM and/or the rated CFM value, which is the case if you have your fans plugged into the motherboard (regardless of whether you're using PWM or 3-pin). In that case you would have to calculate the volume flow rate of each individual fan as a function of RPM. That may not be a linear function and would probably require taking some data and coming up with a regression, which would be way harder to do.
tldr: add up the CFM going into the case, subtract the CFM leaving the case. If the value is positive you have “positive pressure”
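If you want to put numbers on that tldr, here's a minimal sketch of the sum (the fan names and CFM figures are made up for illustration, and as noted above the rated CFM only really holds at full RPM):

```python
# Net case airflow from rated fan CFM values.
# Fan names and CFM numbers are placeholders -- use your actual fans' specs.
intake_fans = {"front_bottom": 52.0, "front_top": 52.0}   # CFM, blowing in
exhaust_fans = {"rear": 47.0, "top": 47.0}                 # CFM, blowing out

net_cfm = sum(intake_fans.values()) - sum(exhaust_fans.values())

if net_cfm > 0:
    print(f"Net {net_cfm:+.1f} CFM -> positive pressure")
elif net_cfm < 0:
    print(f"Net {net_cfm:+.1f} CFM -> negative pressure")
else:
    print("Balanced (neutral pressure)")
```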
Get Nvidia GPU for AI, period.
Read the manual for the motherboard you want and make sure that the M.2 slot supports NVMe rather than SATA. (Also, learn to tell NVMe drives from SATA ones.) M.2 slots that are SATA usually share a SATA lane with the SATA connectors, and if you populate the M.2 slot you might lose a connector.
Another thing to read about is whether populating a given M.2 slot reduces the speed of one of the PCIe slots. Same reason (shared lanes), but with PCIe instead of SATA. These things should be spelled out next to the M.2 connectors in the manual.
NVMe drives in Linux have /dev/nvme* designations not /dev/sd*.
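If you're curious which naming scheme your drives ended up with once Linux is installed, a minimal sketch like this (it just reads /sys/block, nothing fancy) will tell you:

```python
# List block devices and whether the kernel sees them as NVMe or SATA/SAS/USB.
import os

for dev in sorted(os.listdir("/sys/block")):
    if dev.startswith("nvme"):
        print(f"/dev/{dev}: NVMe drive")
    elif dev.startswith("sd"):
        print(f"/dev/{dev}: SATA/SAS/USB drive")
```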
Lots of good advice here. I just want to restate: do yourself a favor and migrate your HDDs over to some kind of solid state drive. Whether that means "classic" SSDs with a SATA port or M.2s is your prerogative, but in either case you'll start wondering how you could ever stand that spinning noise, the vibrations and the slow, slow data transfer.
Some thoughts:
Ubuntu, most likely
I'd encourage you to take a look at Linux Mint; it alleviates some of the Ubuntu fuckiness. And if you want to join the "I use arch btw" crowd, maybe check out EndeavourOS (which is built on Arch but makes the barrier to entry a little easier) if you're feeling more brave than just Ubuntu variants.
i9s are the latest hotness but don’t think the price is worth it
Take a look at last generation to soften the blow to your wallet. E.g., instead of looking at a 14900k, look at 13 or even 12 series. In fact, this is a useful strategy all around if you’re being price conscious: go one gen older.
GPU that can support some sort of ML/AI with DisplayPort
Probably going to want to go with a discrete card, rather than just integrated. Other major consideration is going to be nvidia vs AMD, for which you’ll need to decide if CUDA should be part of your calculus or not. I’ll defer to any data science engineers that might wander through this post.
The rest of your reqs pretty much come as regular stock options when building a PC these days. Though another nicety in my latest builds is multi-gig NICs (though 2.5 Gb was my ceiling, since you'll also need the network gear to utilize it). Going multi-gig is nice for pushing around a fuckton of data between machines on my LAN (including a NAS).
Very last thing that I've found helpful in my last 3 builds spanning 15 years: I use Newegg for its reviews of items, specifically so I can search for the term "linux" in any given product's reviews. Oftentimes I can glean quick insight on how friendly (or not) hardware has been for others' Linux builds.
And I lied, I just remembered about another linux hardware resource: https://linux-hardware.org/?view=search
You can see other people that have built with given hardware. Just remember to do a scan too once your build is up to pay it forward.
Good luck, and remember to have fun!
Used EndeavourOS for a few years too but switched to Fedora Workstation recently. EndeavourOS is still great, but I like Fedora more now since it's just easier. A lot of stuff I did manually before, like switching ext4 for BTRFS, enabling compression and switching to PipeWire, is done by default (also LUKS for full disk encryption, which I was too lazy to set up before), and I can update my system and install most software through GNOME Software without having to use the CLI. It's also very easy to get OpenCL and HIP working; it's just one package each you need to install. The only downside for me is that it's not as easy to install stuff from COPR as it is from the AUR, because you first have to enable the repo for each package you want to install from COPR. I think COPR is more secure tho, especially for someone like me who never looked at the PKGBUILD when installing from the AUR.
I have to agree here. I use PopOS mostly, but most Ubuntu derivatives nowadays beat the living crap out of Ubuntu: PopOS, Zorin, Mint, etc. Like many others, Ubuntu was my gateway to Linux, but I grew out of it in less than a year. Started spinning Mandriva (damn I'm old), then Debian itself, and I've tried Ubuntu a few times over the years, mostly in VMs now, since I hold no hope that it'll ever go back to what it was.
And if you want to join the “I use arch btw” crowd…
I may be a Linux nerd and pedantic, but not that pedantic. 😅 I've looked into Linux Mint and I'm not opposed to a distro switch. I've been very happy with Ubuntu over the years. My first distro was Slackware, then Fedora. Settled on Ubuntu and haven't turned back.
if CUDA should be part of your calculus or not.
Probably not, if my cursory google search is correct. But happy to be convinced otherwise.
Though another nicety for my latest builds, is multi-gig nics (though 2.5Gb was my ceiling, since you’ll also need the network gear to utilize it)
I've had the benefit of laying my own Cat 5e in my house. Given the distances, Cat 6 was going to cost twice as much with a negligible increase in bandwidth. That said, I'm restricted by the narrowest straw, which is wifi (when streaming media to my phone) and my ISP (which taps out at around 300 Mb/s). My current PC has a 1 Gb/s card and I've only occasionally had issues.
I use newegg for its reviews of items, specifically so I can search for the term “linux” in any given product’s reviews.
Oh that’s a good tip!
deleted by creator
There’s nothing pedantic about using Arch. There’s a reason it and its derivatives are so popular.
Way to show off how not-pedantic you are!
deleted by creator
Pedantic is an insulting word used to describe someone who annoys others by correcting small errors, caring too much about minor details, or emphasizing their own expertise especially in some narrow or boring subject matter.
??
deleted by creator
deleted by creator
For AI/ML workloads, VRAM is king.
As you are starting out, something older with lots of VRAM will serve you better than something faster with less VRAM at the same price.
The 4060 Ti is a good baseline to compare against, as it has a 16 GB variant.
"Minimum" VRAM for ML is around 10 GB; the more the better. Less VRAM can be usable, but with sacrifices in speed and quality.
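To put rough numbers on why VRAM matters, here's a back-of-envelope sketch (the 20% overhead factor is a hand-wavy assumption; quantization formats and context/KV-cache sizes change the real number a lot):

```python
# Back-of-envelope VRAM estimate for running an LLM locally.
# Rule of thumb only: weights = parameters * bytes-per-parameter,
# plus some overhead for activations/KV cache (flat ~20% here, an assumption).
def vram_gb(params_billion: float, bits_per_param: int, overhead: float = 0.2) -> float:
    weights_gb = params_billion * bits_per_param / 8  # 1B params at 1 byte ~ 1 GB
    return weights_gb * (1 + overhead)

for params in (7, 13):
    for bits in (16, 8, 4):
        print(f"{params}B model @ {bits}-bit ~ {vram_gb(params, bits):.1f} GB VRAM")
```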
If you like that stuff, in a couple of months you could sell the GPU you buy and swap it for a 4090 Super.
For AMD, support is confusing, as there is no official ROCm support for mid-range GPUs on Linux, but someone said that it works.
There is a new project, ZLUDA, that enables running CUDA workloads on ROCm.
https://www.xda-developers.com/nvidia-cuda-amd-zluda/
I don't have enough info to recommend AMD cards.
I got ROCm to work on a 7800XT after fixing some Python errors. It was quite unstable though.
GPUs these days use a whole lot of power. Ensure your power supply is specced appropriately.
And make sure it’s an actually good PSU too.
I know in gaming, and possibly in other loads, Nvidia 40-series and especially 30-series cards love transient spikes which can easily exceed 2x the nominal power consumption. Make sure your PSU can handle those spikes, both in terms of how brief they are and the current they draw.
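As a crude illustration of why "technically enough wattage" can still trip over-current protection, here's a back-of-envelope sketch (all the wattages and the spike factor are placeholders, not specs for any particular part; newer ATX 3.x units are rated with transients in mind):

```python
# Crude PSU sizing sketch -- placeholder numbers, check real TDP/TGP figures.
cpu_w = 125              # sustained CPU package power (hypothetical)
gpu_w = 220              # GPU board power (hypothetical)
rest_w = 75              # drives, fans, RAM, mobo (rough guess)
transient_factor = 2.0   # short GPU spikes can roughly double draw (see above)

sustained = cpu_w + gpu_w + rest_w
worst_spike = cpu_w + gpu_w * transient_factor + rest_w

print(f"sustained draw:  ~{sustained} W")
print(f"spike-ish draw:  ~{worst_spike:.0f} W")
print(f"suggested PSU:   ~{sustained * 1.5:.0f} W for comfortable headroom")
```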
That’s what finally did in my 10 year old Corsair. I was technically within specs on wattage with my new 4070 but certain loads would cause it to trip the over current protection anyway.
My 3090 is a light flickering machine. Kind of annoying tbh.
Honestly any parts you buy today probably won’t be much good in 30 years.
Honestly any parts you buy today probably won’t be much good in 30 years.
At least not when AGP comes to town.
AMD has been the gold standard for general-user PCs for the last 5+ years. Intel simply cannot compete at the same energy expenditure/performance. At the same or close price/performance, Intel either has to burn a small thermonuclear power plant's worth of energy to deliver comparable performance, or is simply worse compared to similar Ryzens.
Ryzens are like aliens compared to what AMD used to be before them
So I’d go with them
As for the GPU, if you want to use Linux forget Nvidia
Well, let’s see:
- You no longer have to set jumpers to "master" or "slave" on your hard drives, both because we don't put two drives on the same ribbon cable anymore and because the terminology is considered kinda offensive.
- Speaking of jumpers, there's a distinct lack of them on motherboards these days compared to the ones you're familiar with: everything's got to be configured in firmware instead.
- There's a thing called "plug 'n play" now, so you don't have to worry about IRQ conflicts etc.
- Make sure your power supply is "ATX", not just "AT". The computer has a soft on/off switch controlled through the motherboard now – the hard switch on the PSU itself can just normally stay on.
- Cooling is a much bigger deal than it was last time you built a PC. CPUs require not just heat sinks now, but fans too! You're even going to want some extra fans to cool the inside of the case instead of relying on the PSU fan to do it.
- A lot more functionality is integrated onto motherboards these days, so you don't need nearly as big a case or as many expansion slots as you used to. In fact, you could probably get by without any ISA slots at all!
plug 'n pray
While I love this list, it is more applicable to the turn of the century than to a decade ago. I was half expecting to see "RAM no longer has to be installed in pairs" on the list.
ETA: Talking about EDO memory not dual channel
I think you may have misread OPs post. They haven’t built a PC since shirtly after they were 10-11, which was almost 30 years ago. So developments since the turn of the century are in fact relevant here, heh.
They haven’t built a PC since shirtly after they were 10-11…
Well, they’d need a much bigger shirt now than when they were 10 or 11.
I’ll see myself out.
LOL
Wait, RAM doesn't need to be installed in pairs? I am an old, apparently.
- I'd definitely go with an M.2 SSD; you can get 1 TB for 50€ and 2 TB for 100€ now, and they're much faster, more reliable and take up way less space.
For ML/AI stuff, you might be just fine using an AMD GPU. AMD GPUs are a lot easier to use on Linux and are also a good bit cheaper. I use Fedora with an AMD GPU and I just installed the packages for OpenCL and HIP and now I can run LLMs on my PC using my GPU. I’ve also used Stable Diffusion with that GPU on Linux before. If there’s something specific you want to do regarding that, I’d look up first if you need an Nvidia GPU for that but from my experience AMD GPUs work just fine.
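If you go the PyTorch route, a quick sanity check that the driver stack is actually being picked up looks like this (a minimal sketch; assumes you installed the ROCm build for AMD or the CUDA build for Nvidia, and the ROCm builds reuse the torch.cuda API so the same call covers both):

```python
# Check whether PyTorch can see a GPU before trying bigger models.
import torch

if torch.cuda.is_available():
    print("GPU visible:", torch.cuda.get_device_name(0))
else:
    print("No GPU visible to PyTorch -- running on CPU")
```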
I’d take a look at AMD CPUs again. Last time I checked they were even cheaper (including mobo price) than Intel even though they’re also more efficient (faster and less power draw). Prices might have changed tho. You should probably use a Ryzen 5, a Ryzen 7 will only make sense if you use all cores because game performance is pretty much the same. A Ryzen 3 is more of a budget option tho, I wouldn’t use that. If it’s in your budget, you should also use the newest generation that uses the AM5 socket because you’ll be able to upgrade your CPU without needing a new mobo. I think it also only supports DDR5 RAM, which is more expensive than DDR4. If you use a Ryzen generation that uses the AM4 socket, it’s gonna be cheaper but if you want to upgrade you’ll need a new mobo with AM5 and new DDR5 RAM in addition to the new CPU.
As for Linux distros, my recommendations are Linux Mint if you want something very easy, EndeavourOS if you want something Arch-based or Fedora if you want something that’s not quite as easy as Mint but more up-to-date. I personally use Fedora but I used EndeavourOS before. I detailed why I switched to Fedora in a reply here somewhere.
The responsiveness difference between a hard drive and an SSD is night and day. NVMe is even faster, but not noticeably so unless you move a hell of a lot of data around. Motherboards with at least one M.2 NVMe slot are common, so installing the OS on one is an option. Hard drives give you more storage per price, but unless space is a significant factor I suggest using SSDs (also quieter than a spinning disk!). More info on storage formats in this video
Recent generations of motherboards use DDR5 RAM, which was very expensive on release. I think the price has come down, but I am not up to date on this generation. You may be able to save money building a DDR4 system, but you'll be stuck on a less-supported platform.
AMD had like ~10 years of bad/power-hungry processors while Intel stagnated, re-releasing 4-core processors over and over. AMD made a big comeback with their Ryzen series becoming the best bang for buck, then even overtaking Intel. I think it's pretty even now.
If you don't intend to game or do certain compute workloads, then you can avoid buying a GPU. Integrated graphics have come quite far (still low-end compared to a dedicated GPU). Crypto mining, Covid and now AI have made the GPU market expensive and boring. Nvidia has more higher-end cards, mid-range is way more expensive for both, and the low end sucks ass. On Linux, AMD GPU drivers come with the OS, but for Nvidia you have to get their proprietary drivers (Linux gaming has come a long way).
DDR5 has gone down dramatically compared to launch. You can get 64 GB with a very fast bus for under 200 dollars now; at launch, 32 GB would easily set you back 250+. AMD has made a killing with Ryzen. Never mind the new naming convention that Intel came up with to make it even more complicated to choose the right CPU for your use case, ridiculous.

As for Nvidia GPU drivers, at the end of the day they just work, regardless of their proprietary-driver philosophy (which, again, I agree sucks). But if the OP is going to be doing AI development, machine learning and all that cool stuff, he'd be better served by getting a few CUDA TPUs. You can get those anywhere from 25 dollars to less than 100, and they come in all types (USB, PCIe, M.2). https://coral.ai/products/#prototyping-products

I have one USB Coral running the AI for my Frigate Docker container with 16 cameras, and my CPU never reaches more than 12% while the TPU itself barely touches 20% utilization. Put two of those bad boys together and the CPU would probably not even move from idle 🤣
deleted by creator
Same reply. And you can add as many TPUs as you want to push it to whatever level you want. At 59 bucks a piece, they'll blow any 4070 out of the water for the same or less cost. But to the OP: you don't have to believe any of us. You're in that field, I'm sure you can find the info on whether these would fit your needs or not.
deleted by creator
Let's get this out of the way: not a single consumer-grade board has more than 16 lanes on one PCIe slot. With the exception of 2 or 3 very expensive new boards out there, you'll be hard pressed to find a board with 3 slots giving you a total max of 28 lanes (16+8+4). So, regardless of TPU or GPU, that's going to be your limit.

GPUs are designed as general-purpose processors that have to support millions of different applications and software. So while a GPU can run multiple functions at once, in order to do so it must access registers or shared memory to read and store the intermediate calculation results. And since the GPU performs tons of parallel calculations on its thousands of ALUs, it also expends large amounts of energy accessing memory, which in turn increases the footprint of the GPU.

TPUs are application-specific integrated circuits (ASICs) designed specifically to handle the computational demands of machine learning and accelerate AI calculations and algorithms. They are created as a domain-specific architecture: instead of a general-purpose processor like a GPU or CPU, they were designed as a matrix processor specialized for neural network workloads. Since the TPU is a matrix processor instead of a general-purpose processor, it sidesteps the memory access problem that slows down GPUs and CPUs and forces them to use more processing power.

Get your facts straight and read more before you try to send others on wild goose chases. As I said, the OP already works in this field; it shouldn't be hard for him to find the information and make an educated decision.
deleted by creator
OK man, don't pop a vein over this. I'm a hobbyist, with some experience, but a hobbyist nonetheless. I'm speaking from personal experience, nothing else. You may well be right (and thanks for the links, they're really good for me to learn even more).
I guess, at the end of the day, the OP will need to make an informed decision on what will work for him while adhering to his budget.
I’m glad to be here, because I can help people (at least some times) and learn at the same time.
I just hope the OP ends up with something that'll fit his needs and budget. I will be adding a K80 to my rig soon, only because I can let go of 50 bucks and want to test it until it burns.
I wish you all a very nice weekend, and keep tweaking, it's too much fun.
Hold on a second, how come every time I look for TPUs I get a bunch of not-for-sale Nvidia and Google cards, but this just exists out there and I never heard of it?
I only found out about those about 6 months ago, and it was by chance while going over the UnRaid forum for Frigate, so I decided to do some research. It took me almost 4 months to finally get my paws on one. They were seriously scarce back then, but have been available for a couple of months now. I only got mine at the end of November. They seem to be in an availability trend similar to Raspberry Pis.
I was really hoping to cannibalize the 32 GBs of DDR3 RAM but I couldn’t find a MoBo that supports it anymore. Then I saw DDR5 is the latest!
I don’t really do any gaming. If I wasn’t going to tinker with AI, I’d just need a card for dual DisplayPort output. I can support HDMI but…I prefer DP
The 4070 TI will give you quite a few years out of it for sure. You could also completely forego the GPU and get a couple of CUDAs for a fraction of the cost. Just use the integrated graphics and you’re golden.
deleted by creator
Dude, you KNOW I'm talking about TPUs. The name escaped my mind at the moment. Sorry if my English is not up to your royal standards. Are you really so bored that you have to make a party out of that? Ran out of credits on pornhub or something?
Are CUDAs something that I can select within pcpartpicker? Or is this like a cloud thing?
I misspoke, and I apologize. I could not recall the term TPU, so I just went with the name of the protocol (CUDA). Nvidia has various TPU devices that use CUDA protocol (like the K80 for example). TPUs (Tensor Processing Units) are coprocessors designed to run some GPU intensive tasks without the expense of an actual GPU unit. They are not a one to one replacement, as they perform calculations in completely different ways.
I believe you would be well served by researching a bit and then making an informed decision on what to get (TPU, GPU or both).
deleted by creator
Seems like you got your answers already, but the pcmasterrace community also exists.
They sadly don't have 3.5" [floppy] drives anymore, and both the ISA and PCI buses are nowhere to be found 😔
I used pcpartpicker for my latest build; it's a good help when assembling and can help you avoid incompatible parts.
Technically 3.5" SSDs are still out there, but they’re massive (16-64 TB) and target enterprise use (with a price to match).
And 3.5" is still the standard for platter HDDs, which are still the more economical option if you need large amounts of storage.
Now if you meant no more 3.5" floppy disk drives, then yes, those are definitely gone. ;)
I meant to write 3.5" floppy drives, and yes, the 3.5" and 2.5" form factors are still going strong, even if NVMe will probably reduce the use of 2.5".
Linus Tech Tips recently made a huge PC build guide video that you might benefit from watching.
https://m.youtube.com/watch?v=BL4DCEp7blY&pp=ygUbbGludXMgdGVjaCB0aXBzIGJ1aWxkIGd1aWRl
Here is an alternative Piped link(s):
https://m.piped.video/watch?v=BL4DCEp7blY&pp=ygUbbGludXMgdGVjaCB0aXBzIGJ1aWxkIGd1aWRl
Piped is a privacy-respecting open-source alternative frontend to YouTube.
I’m open-source; check me out at GitHub.