Thursday, November 1, 2018

Need advice (or paid consultant/integrator) for networking an animation studio

Hello all,

My apologies if this is the wrong place to post this.

I'm apparently now the 'de facto' IT guy of a small animation studio in the LA area. We are moving to a new space, and I've been put in charge of supervising the network upgrade that accompanies the move. I've never been super knowledgeable about networking, and I'm frankly out of my depth - but also eager to learn. I don't yet have a fixed budget, but if I had to guess, it would be <= $10K (USD) for physical hardware (switches/router/cables - the structured wiring, racks, and such are not part of this). I am seeking either advice or a paid consultant/integrator to assist me with this process.

Current Situation

  • The studio has grown very rapidly over the course of two years, and our networking equipment has not kept up.
    • Internet comes into a Linksys WRT1900AC that we use as our main router (and also as a WAP)
    • Our entire studio is currently configured under one network - no subnetting, VLANs, or other segmentation
    • One port of the Linksys runs to a TP-Link 48-port Gigabit switch that serves our main rendering machines
    • One port of that switch is daisy-chained to a second, 24-port TP-Link switch. Most of our workstations connect directly to it, with a few more hanging off small desktop switches.
    • Another port of the 'top' TP-Link switch is daisy-chained to a Quanta LB6M 10G SFP switch for a rack of 12 servers for data processing. This is currently the only 10G equipment on our network.
    • There are 2 WAPs in use (not sure of the models). I think one of them is a pure WAP, while the other is configured as a separate network
  • Our storage is currently divided into four physical systems based on our usage. None of them currently support 10G - but we plan to upgrade after the move:
    • 'Hot server' for our active projects. This is about 35TB. It has a live redundant backup, and 3x 1G links aggregated for bandwidth.
    • 'Cold vault' for completed projects and longer-term storage. This is about 75TB.
    • 'Datastore' for datasets from clients and projects. This is about 70TB.
    • 'Cache' for baked animations, shared video scratch, etc.
  • In total we have about 36 wired machines on our network, but we probably have an additional ~12 on wireless at any given time.
  • We do not have or need any kind of VoIP
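One detail worth flagging about the hot server's 3x 1G aggregated links: link aggregation (LACP and similar schemes) typically balances traffic per flow, not per packet. A hash of each flow's endpoints picks one member link, so any single client transfer still tops out at ~1 Gb/s even though the aggregate is 3 Gb/s. A minimal model of that behavior (the hash function and addresses below are illustrative, not what any particular switch actually uses):

```python
# Illustrative model of LACP-style per-flow link selection.
# A hash of the (source, destination) pair picks ONE member link,
# so a single client-to-server flow never exceeds one link's speed.
import hashlib

LINKS = 3  # the hot server's three aggregated 1 Gb/s links

def pick_link(src_ip: str, dst_ip: str) -> int:
    """Deterministically map a flow to one member link (0..LINKS-1)."""
    digest = hashlib.sha256(f"{src_ip}->{dst_ip}".encode()).digest()
    return digest[0] % LINKS

server = "10.0.0.10"  # placeholder addresses
for n in range(100, 112):
    client = f"10.0.0.{n}"
    print(client, "uses link", pick_link(client, server))
```

Many concurrent clients will spread across all three links, but each individual editor sees at most one link's bandwidth - which is consistent with the per-user access-speed complaints below.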

Current Problems

  • We have team members who frequently edit video at resolutions much greater than 4K (our largest project to date was 17,000 x 6000 pixels per frame), and they've complained about access speed
  • We have some machine-learning and data analysis projects where our current network has proven to be a bottleneck
  • Since multiple machines need to read and write to the Cache drive frequently, the network speed can dramatically affect our render times
  • When moving files onto our servers, we've noticed issues when moving many small files compared to fewer larger files. That sounds to me like it's probably a software issue or file indexing limitation rather than a networking problem, but I figured I'd post it here anyway.
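To put rough numbers on the first and last complaints (everything below is back-of-envelope assumption, not measurement): an uncompressed frame at our largest resolution is enormous relative to a 1 Gb/s link, and many-small-file transfers pay a fixed per-file cost (open/close, metadata, protocol round trips) that raw bandwidth can't fix:

```python
# Back-of-envelope check on why 1 Gb/s links bottleneck this workload.
# All figures are rough assumptions for illustration only.

# Uncompressed frame at the largest project's resolution, 8-bit RGB:
width, height, bytes_per_pixel = 17_000, 6_000, 3
frame_bytes = width * height * bytes_per_pixel          # 306,000,000 bytes/frame
link_bytes_per_sec = 1e9 / 8                            # 1 Gb/s = 125 MB/s
seconds_per_frame = frame_bytes / link_bytes_per_sec
print(f"One frame: {frame_bytes/1e6:.0f} MB, ~{seconds_per_frame:.2f} s over 1 GbE")
print(f"Over 10 GbE: ~{seconds_per_frame/10:.2f} s per frame")

# Why many small files are slower than one big file of the same total size:
per_file_overhead_s = 0.005   # assumed 5 ms fixed cost per file
total_bytes = 1e9             # 1 GB of data either way

def transfer_time(n_files: int) -> float:
    return total_bytes / link_bytes_per_sec + n_files * per_file_overhead_s

print(f"1 GB as 1 file:    ~{transfer_time(1):.1f} s")
print(f"1 GB as 10k files: ~{transfer_time(10_000):.1f} s")
```

If the model is even roughly right, it supports the suspicion in the last bullet: the small-file slowdown is dominated by per-file overhead rather than link speed, so the 10G upgrade alone won't fully cure it.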

New Network Goals

  1. Boss has explicitly asked for the wired network to be 10Gb/s wherever it makes sense
    1. His workstation and at least three others need 10G
    2. Storage needs 10G
    3. Most of our rendering only needs Gigabit
  2. Reduce daisy-chaining where possible, or at least make the links higher speed and bandwidth
  3. Reduce bottlenecks on storage IO as much as possible
  4. Support a second LAN with internet access that is fully isolated from our main one (a developer wants a sandbox with internet)
  5. Support probable expansion as we add more hires and connect more machines to our render setup
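For goal 4, the usual approach on managed switches is a separate VLAN for the sandbox, with an ACL (or firewall rule) that blocks traffic to the main LAN but permits everything else - i.e. the internet. A rough Cisco IOS-style sketch of the idea; the VLAN IDs, subnets, and port number here are placeholders, not a tested config:

```
! Hypothetical example - VLAN IDs, subnets, and ports are placeholders.
vlan 10
 name MAIN-LAN
vlan 20
 name SANDBOX

! Put the developer's sandbox port in VLAN 20
interface GigabitEthernet1/0/24
 switchport mode access
 switchport access vlan 20

! Block sandbox -> main LAN; allow everything else (internet)
ip access-list extended SANDBOX-ISOLATE
 deny ip 192.168.20.0 0.0.0.255 192.168.10.0 0.0.0.255
 permit ip any any

interface Vlan20
 ip address 192.168.20.1 255.255.255.0
 ip access-group SANDBOX-ISOLATE in
```

The same isolation can equally be done with a rule on a standalone router/firewall instead of a switch ACL.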

New Network Progress

  • The building is being wired with Cat6A. Everything will be coming back to our server room. Each of the 12 primary workstations will have a dedicated cable into our patch bay (with some extras for later expansion or furniture rearrangement).
  • More rack space; currently 4 racks are available, which is ample room for our gear and leaves some room to scale horizontally.
    • 1 entire rack (42U) is currently provisioned to be just for networking, patching, and storage.

My Plan

TLDR of everything above:

Currently gigabit off a prosumer router/firewall/WAP combo. Unplanned network, no VLANs or subnets. Daisy-chained switches. 150+ TB of storage. ~36 wired, ~12 wireless machines. No VoIP. Having IO problems with our storage, especially for video editing and ML tasks.

  • New standalone router
    • Do we need a firewall?
  • WAPs will be connected through building patches
  • Top-of-Rack style setup, with a 10G aggregation switch in the networking rack.
  • 2 of the racks will be Gigabit equipment - so the ToR for each should be Gigabit with 10G uplinks.
  • One of the racks will consolidate our 10G SFP+ equipment. I can reuse the LB6M for now as the ToR.
  • Storage should connect into a 10G switch that connects to the aggregation. Our storage hardware isn't currently 10G capable, but it will be in the future.
  • The patch out to the workstations should connect into a 10GBASE-T switch that connects to the aggregation.
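If VLANs and subnets get introduced as part of this plan, it helps to carve out the address space up front rather than ad hoc. A quick sketch using Python's standard `ipaddress` module - the 10.10.0.0/16 base and the segment names are placeholders I picked for illustration:

```python
# Sketch of a subnet plan for the segmented network.
# The 10.10.0.0/16 base and segment roles are illustrative placeholders.
import ipaddress

base = ipaddress.ip_network("10.10.0.0/16")
# One /24 (254 usable hosts) per segment leaves plenty of headroom
subnets = base.subnets(new_prefix=24)

plan = {
    "workstations": next(subnets),
    "render-farm":  next(subnets),
    "storage":      next(subnets),
    "wifi":         next(subnets),
    "sandbox":      next(subnets),   # the isolated developer LAN
}
for role, net in plan.items():
    print(f"{role:12s} {net}  ({net.num_addresses - 2} usable hosts)")
```

Writing the plan down this way also makes it easy to hand to a consultant or a future hire.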

Where I Need Help

  • Does this plan make sense at all? Or is there a different/better way to do this?
  • I am still unsure about router choice and configuration for the new network. There are a lot of options, and it isn't immediately clear to me why one might be a good choice compared to another.
  • What would be the advantage of adding a firewall device to our system?
  • Does it make sense to 'buy into a system' for this upgrade, like Ubiquiti or Cisco equipment?
  • What options do I have for a 10GBASE-T switch? I'm having trouble finding one that will support 12+ copper connections.
  • My boss has asked me to buy Cisco where possible. However, I'm not sure that is even remotely possible on this budget, and I'm not trained with Cisco equipment - so even if the gear itself was affordable, I doubt I could properly install it.

Again, I am willing to hire a consultant/integrator for this project, but I need to be able to justify it to my boss. We don't have much budget for ongoing maintenance, so anything installed should be something that we can maintain ourselves, or with minimal outside assistance.

Apologies for the wall of text. Thanks in advance for any help!

EDIT: Jeez what happened to the formatting? Tried to go back and make it more legible. Wish there was a preview.


