Monday, November 28, 2016

Proper Wi-Fi Design and Deployment: Required Even In the Home

As more and more wireless appliances and streaming media applications permeate the home, the reliability of the Wi-Fi network carrying this traffic becomes critical.  This is especially challenging in a multi-vendor environment. While every vendor technically follows the IEEE 802 networking standards (which are supposed to provide for interoperability), there is the “standard” and then there is the reality.  

Most importantly, the more intense networking demands of a home network REQUIRE a properly designed and integrated system, which is quite antithetical to the conventional consumer “plug and play” approach. 

Architecturally, a home LAN can be broken into the following functions.  Each of these functions is independent of the others, and thus can generally and safely be satisfied by equipment from different vendors:

  1. Modem:  Converting the Internet connection from the street (coax cable, fiber, DSL, satellite, etc.) into Ethernet
  2. Router:  Defining the local area network (LAN) and providing routing and NAT functionality between the WAN and the LAN
  3. Switch(es):  Providing interconnectivity between wired network devices on the LAN, including infrastructure devices (e.g. routers, APs, other switches) and wired client devices.
  4. Access Points:  Providing multiple wireless client devices access to the wired LAN

Several companies sell physically integrated equipment that provides multiple functions (e.g. a cable modem Wi-Fi router which provides all four functions in one box), but these should still be treated as separate functions.  Within a particular function, it is generally a bad idea to mix and match vendors.  This is especially true for the “access point” function: there is quite a lot happening “under the hood”, and while all of the vendors are following the IEEE 802.11 specs, they are all doing it somewhat differently, which makes co-existence problematic and, at the very least, difficult to do well.   Thus, if you are installing a managed Wi-Fi system, you want to remove or disable any third-party APs installed by the customer or other service providers to the extent possible.   Co-located Wi-Fi systems will interfere with each other and reduce Wi-Fi capacity, even if they are not in “active use”.

For the modem, the general recommendation is to have the ISP put their appliance in “bridge mode”, which shuts off all but the modem function.  In practice, some ISPs handle this better than others.   At a minimum, the built-in access point in the cable modem should be disabled.  The built-in AP is invariably consumer grade and has very few knobs to turn in terms of performance tuning, making it inappropriate for the performance challenges of larger home Wi-Fi networks, especially those that require multiple APs for both coverage and capacity. 

The built-in router of a cable modem can be used reasonably safely if VLANs are not required.  If VLANs are required, you are definitely best off putting the cable modem into bridge mode and using a wired-only SMB/enterprise router (thus avoiding a double-NAT scenario, which lowers performance for some appliances like gaming consoles).  EnGenius currently does not make this type of product, though it is on our long-term roadmap.  SonicWALL SOHO is a good example: the standalone version (i.e. without the content filtering or AV licenses) is fairly inexpensive and quite capable, though it is certainly difficult to program.  There are several other vendor products on the market in this category.  The key features to look for are VLAN support, DHCP, and dynamic DNS.  Firewalls are also useful in many home and SMB applications.

For the switches and APs, I would recommend that you look into EnGenius’s Neutron product line.  The APs are all centrally managed, which makes it easier for you as a service provider to provision and maintain, and can be managed either from the local EWS switch or from a hosted cloud-based server called ezMaster.

Within the AP for the home, there are three key factors to keep in mind:
  1. Placement:   APs should be placed as close as possible to the client devices, with as little physical structure (e.g. walls) in between as possible.  In a multi-AP deployment, APs should be placed as far apart from each other as possible, with as much physical structure in between as possible.  This is true three-dimensionally, so stacking APs in hallways from floor to floor should absolutely be avoided.
  2. Transmit Power:   Most smartphones and tablets have very weak transmitters, on the order of 100x less powerful than access points.  If the APs are set to their maximum power settings, they can create a false sense of coverage.  The client devices at the far end of the coverage area will hear the APs (since the AP is “shouting”) but the AP can have a lot of difficulty hearing the client devices (since the clients are “whispering”).   Coverage should generally be more balanced, and thus turning down the transmit power of the APs will help to ensure more balanced bi-directional communication.  Additionally, 5 GHz does not travel as far as 2.4 GHz and suffers more attenuation when passing through walls and objects, thus the 2.4 GHz power generally needs to be 6 dB lower than the 5 GHz power to have roughly the same area of coverage.  I usually start at 14 dBm for 2.4 GHz and 20 dBm for 5 GHz, tuning from there based on the environment.  I also generally do not recommend “auto power”, as this changes the coverage area over time and can create either gaps in coverage or co-channel interference with neighboring APs.
  3. Channel:  Neighboring APs will overlap and will interfere with each other, unless the neighboring APs are put on static / non-overlapping channels.  Auto-channeling is a very hard optimization problem, and I have yet to see any vendor do it well, despite marketing claims.  APs should always be put on static / non-overlapping channels with the channel scheme staggered as much as possible.
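The power and channel guidance above can be sketched as a simple static plan.  This is a minimal, illustrative sketch in Python — the channel lists, the `plan()` helper, and the round-robin staggering are my own assumptions (US regulatory domain, 20 MHz channels), not any vendor's tooling:

```python
# Static channel/power plan sketch.  Starting power levels are the ones
# suggested above (14 dBm on 2.4 GHz, 20 dBm on 5 GHz).
CHANNELS_24 = [1, 6, 11]        # the only non-overlapping 2.4 GHz channels (US)
CHANNELS_5 = [36, 40, 44, 48]   # UNII-1 5 GHz channels, all non-overlapping

def dbm_to_mw(dbm):
    """Convert dBm to milliwatts (0 dBm = 1 mW; +3 dB roughly doubles power)."""
    return 10 ** (dbm / 10)

def plan(num_aps, power_24=14, power_5=20):
    """Assign static channels round-robin so neighboring APs are staggered."""
    return [
        {"ap": i,
         "ch_24": CHANNELS_24[i % len(CHANNELS_24)],
         "ch_5": CHANNELS_5[i % len(CHANNELS_5)],
         "tx_24_dbm": power_24,
         "tx_5_dbm": power_5}
        for i in range(num_aps)
    ]

# The 6 dB offset between bands is a 4x power ratio:
ratio = dbm_to_mw(20) / dbm_to_mw(14)   # ~4x
```

The dB arithmetic also shows why the offset matters: 6 dB is roughly a 4x power difference, which approximately compensates for the extra attenuation of 5 GHz so both bands cover a similar area.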

Ironically, several new startups (e.g. Eero) now offer home mesh products.  These are marketed as a “plug and play” way for consumers to get larger areas of coverage: place more APs, skip the Ethernet cabling, and use the Wi-Fi itself as backhaul.  These APs also inevitably rely upon auto-channel and auto-power in an attempt to keep things simple for the uneducated user.  These products, however, follow the old mantra of providing “coverage”, when in reality a home network needs to provide adequate “capacity” with room for growth.  The fundamental problem is that mesh reduces throughput by 50% per hop (the radio spends ½ its time servicing clients and ½ its time backhauling to a connected AP).  Such APs fundamentally cannot meet the bandwidth and channel-utilization demands of a high-performance network, and should generally be avoided in networks where high performance and large client capacity are driving requirements.
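The 50%-per-hop penalty compounds quickly, which a little arithmetic makes plain.  This is a deliberately simplified model (it ignores airtime contention, rate adaptation, and multi-radio designs); the function name is my own:

```python
def mesh_throughput(base_mbps, hops):
    """Usable throughput after N wireless backhaul hops, halving per hop
    (the radio splits its airtime between serving clients and backhaul)."""
    return base_mbps / (2 ** hops)

# A nominal 300 Mbps link degrades fast as APs sit farther from the wired root:
# 0 hops -> 300, 1 hop -> 150, 2 hops -> 75, 3 hops -> 37.5 Mbps
for hops in range(4):
    print(hops, mesh_throughput(300, hops))
```

Two hops from the wired root, a nominal 300 Mbps link is already down to 75 Mbps before any real-world interference is accounted for — which is why wired backhaul between APs is so strongly preferred.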

The core problem in home networking is that ISP modem/routers are consumer products, and not very capable for more complex or more demanding networks.  I’ve run into the issue myself where you cannot practically put the modem into bridge mode because the ISP service people cannot handle it over time: even if you get them to configure bridge mode, a firmware upgrade or service issue generally breaks it and the ISP returns the modem to its default configuration. Additionally, if the customer is using the ISP’s VoIP product, you are stuck with their modem.    

Unfortunately, you are usually stuck with whatever cable modem hardware the ISP provides, and every ISP is going to pick something different.  Most ISPs are fairly large behemoths who don’t care about interoperability with any other vendor’s equipment, and have no real incentive to provide compatibility for services that they undoubtedly view as (current or future) competition.   That is a business reality that is not going to change anytime soon. 

The best you can do, therefore, is to work around it.   I’ve done installations where I’ve literally sealed cable modems in steel boxes to block the Wi-Fi signal from it.  More commonly, I’ve also done installations where you just set the cable modem to some random SSID, fix the channel on 2.4 GHz and 5 GHz, and don’t use it for connecting any client devices.  You set up your real Wi-Fi network with the proper equipment and avoid conflicts on the fixed 2.4 GHz and 5 GHz channel of the cable modem, and basically treat the modem as a 3rd party rogue AP. 

The entire Wi-Fi industry has spent nearly 20 years telling consumers how “easy” it is to install Wi-Fi, while simultaneously making the protocol more complicated (and thus more sensitive and fragile) in order to cheat RF physics and squeeze out more throughput.   Performance relies upon establishing and maintaining control over equipment choice, locations, channels, and transmit power settings.  The situation in home network environments is fairly untenable, and the introduction of more Wi-Fi appliances and infrastructure products in the consumer space only continues to make it worse. 

The reality is that a network needs to be designed, controlled, and properly maintained if you want high performance out of it, and that means a careful mix and match (and control) of vendor products for the modem, router, switch, and APs with proper configuration.  

There is no magic bullet.   

Tuesday, November 15, 2016

The Misconceptions of Managed IT Services

Small businesses running on shoestring budgets often consider IT services as overhead.  While most rely upon Internet access, Wi-Fi, and shared access to data, they don’t view these services as part of their core business. Having any or all of these fail for an extended period of time could easily drive a small company out of business, yet many shy away from options such as managed services that might add stability and often reduce cost.

Instead, most still follow the antiquated "break / fix" model of IT management, where an IT consultant who was originally called in to set up a system gets called back in to fix things whenever there's a problem.  In most cases, the hesitation to engage for the long haul has to do with misconceptions about the managed services model.  A managed service provider keeps a continuous watch over a network infrastructure to ensure it is always operating smoothly and securely in return for a monthly fee.

Clarifying these misconceptions can help customers overcome misplaced concerns, and create new “win-win” scenarios for consultants and clients alike.

Misconception #1: Managed Services are More Expensive 

In truth, managed services tend to be less expensive over time. Clearly, there’s value in a predictable and affordable monthly monitoring charge, but that alone won’t often convince clients to take the plunge. More compelling is the idea that managed service providers can catch most problems while they are still small, when they can generally be resolved more quickly and easily.  This produces major savings, and any client that’s suffered a debilitating outage can easily grasp that value.

Contrast this approach with the case where an IT consultant is brought in only after a crisis occurs, or after a small problem festers over time and ultimately grows out of control.  The consultant can rack up many, many hours fixing a big problem and present a correspondingly large bill for services.  Such emergencies are unpredictable, and can lead to an accusatory and adversarial relationship between client and consultant.

Misconception #2:  “If it Ain't Broke…” 

Most of us have learned at some point that it costs less to keep up with our every-3000-mile oil changes than to wait for the “check engine” light to come on. The same is true for the network infrastructure.

Network equipment vendors constantly release bug fixes and security updates that should be installed and tested. And the cybersecurity landscape never stops changing.  Small businesses may mistakenly feel they’re too small to be targeted, only to find themselves turned into an automated “bot” unwittingly doing the bidding of sophisticated cybercriminals, or to have their hard drives encrypted and held for ransom.    

Again, this is an easily demonstrated value-add of subscribing to managed services and having a seasoned IT professional proactively managing the network.

Misconception #3:  I Don't Have to Worry about Downtime, Because All My Data is "in the Cloud" 

The proliferation of cloud services has made it much easier for small businesses to take advantage of state-of-the-art networking technology without having to invest heavily in owning servers and services.  However, relying on cloud services presents its own set of challenges:

  1. A company likely needs more than one cloud service. Maybe they use email for sending documents to employees outside the office, and Dropbox for sending documents to third parties.  These are disparate services that, like their onsite hardware counterparts, may not work together seamlessly or securely.
  2. The cloud model doesn’t always scale cost-effectively for small businesses because, in reality, there is no one cloud.  For each service, data is stored on a server accessible over the public Internet rather than on your private network.  Subscribing to multiple cloud services from multiple vendors may not deliver the same economies of scale as using one managed service provider, with a single point of contact when something goes wrong.
  3. Nobody ever really tests the backups until something goes awry, and suddenly data recovery is needed to keep the business running.  That is not the time you want to discover that something was misconfigured, or that the service doesn’t work the way you thought it would.  

Small businesses need to understand that each disparate provider adds a layer of risk.  Most cloud services, even popular ones, are small businesses themselves.  What happens to your data if their service gets hacked or the company suddenly goes out of business?  

Misconception #4:  My IT Guy Can Be Here in Ten Minutes 

Here, the misconception is that you can either have a trusted local IT guy or contract with a managed service provider, but not both at the same time.  In reality, more and more IT consultants are transitioning to become managed service providers, enjoying higher growth and recurring revenue by monitoring and troubleshooting customers remotely.

This trend is being enabled in part by multiple evolving technologies, including the cloud.  One platform from start-up Uplevel Systems is designed from the ground up for IT consultants serving small businesses. The Uplevel model brings together the key elements of small business IT – access, Wi-Fi, security, storage, and management—on one remotely managed platform.  Customers enjoy better-than-consumer-grade features, functionality, and security at a predictable and affordable cost, with the added benefit of their local consultant’s watchful eye.

Using one integrated infrastructure follows industry best practices, simplifying configuration, maintenance, and troubleshooting, and reducing security risks.  Data is still shared and saved to the cloud, but now it’s the trusted IT professional evaluating data and making decisions on the company’s behalf.

Managed services optimized for small businesses offer a win-win by aligning the business goals of IT consultants with the unique needs of the clients they serve.  No more “bundling up” a bunch of IT issues until they become large enough to warrant the cost of a visit by the “IT guy.”   For the IT consultant, managed services mean more predictable revenue streams, faster growth, and the opportunity to engage customers at a more strategic level. 

Done right, managed services help both parties avert the major problems that can shut down a small business and create emergencies that ultimately benefit no one. The hurdles to adoption – the misconceptions stated above – can be easily overcome with basic ROI models and a “try it, you’ll like it” approach.

Wi-Fi, my frequent focus, is a great place to start.