Need some advice on networking for our 'server room'. I'm not a network guy, more of a sysadmin/ops.
We have about a $30-40k budget for network gear. Looking to see if it's worth buying a dedicated core switch or combining access & core together for our servers and workstations.
There are currently six 48-port 1G switches in the server room for the office environment (4x Aruba 2540s and 2x Aruba 2930s). These serve workstations, phones, printers, and IoT devices. Each has 10G SFP+ uplinks, which currently land on a Netgear switch that constantly locks up whenever the network sees a decent amount of traffic. We also run our servers off that same Netgear.
We have 8 servers (a mix of HCI and Veeam backup hosts). Between them there are about 40 SFP+ ports total, so 20 per switch if split across a pair.
I was looking into possibly connecting the servers to a pair of 24-port 10G Aruba 6300 switches and hanging the 1G access switches off some of the SFP56 breakouts (rough sketch of what I mean below). Or would it make sense to do the 6400 instead: buy one blade for the core and connect the access switches to it, then buy another blade for the servers and link that to the core blade? I've never worked with a modular switch before, so I'm not sure if that would be recommended.
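From what I've read, the usual way to run the 6300 pair is as a VSX pair, with each access switch's SFP+ uplinks dual-homed into a multi-chassis LAG so losing one core switch doesn't take down the uplink. A rough AOS-CX sketch of what I mean (all port numbers, IPs, and MACs here are placeholders, not a real config):

```
! VSX pair on the two 6300s -- switch 1 shown; switch 2 mirrors it
! with "role secondary" and the keepalive addresses swapped.
! Port numbers, IPs, and MACs below are placeholders.
interface lag 256
    no shutdown
    description VSX inter-switch link
    no routing
    vlan trunk native 1
    vlan trunk allowed all
    lacp mode active
interface 1/1/25
    no shutdown
    lag 256
interface 1/1/26
    no shutdown
    lag 256
vsx
    system-mac 02:01:00:00:01:00
    inter-switch-link lag 256
    role primary
    keepalive peer 10.0.0.2 source 10.0.0.1

! One dual-homed access-switch uplink: the same multi-chassis LAG
! is defined on both VSX members, with one physical port on each.
interface lag 10 multi-chassis
    no shutdown
    description uplink to one of the 1G access switches
    no routing
    vlan trunk native 1
    vlan trunk allowed all
    lacp mode active
interface 1/1/21
    no shutdown
    lag 10
```

The nice part, as I understand it, is that the 2540s/2930s on the other end just see a regular LACP trunk, so nothing special is needed on the access side.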
Otherwise, would it make sense to look at the 8320 32-port QSFP+ as the core and a 6300 24-port SFP+ for the servers?
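If I went the 8320 route, my understanding is you'd split some of the QSFP+ ports into 4x10G with breakout cables for the access-switch uplinks rather than buying a separate switch for them. Something like the below, though I'm going from memory on the syntax, so double-check it:

```
! Splitting one 40G QSFP+ port on the 8320 into 4x10G
! (needs a breakout cable; port numbers are placeholders).
interface 1/1/1
    split
! The switch asks for confirmation, then the sub-interfaces
! show up as 1/1/1:1 through 1/1/1:4.
interface 1/1/1:1
    no shutdown
    no routing
    vlan trunk native 1
    vlan trunk allowed all
```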
I'm also open to other vendors; we just currently use Aruba at our branch sites.