Test hurdles for modern-day networks

Article by: George Acris

In this article, we outline the test requirements that are emerging as data volumes continue to rise.

Our society is becoming increasingly obsessed with the transfer, manipulation and sharing of large quantities of data. As a consequence, exacting pressures are being placed upon existing communication infrastructure. Two questions now need to be asked. Firstly, can the network resources being made available keep pace with our seemingly unquenchable thirst for more and more bandwidth? Secondly, what are the test implications going to be?
Numerous factors are contributing to the elevated data volumes being witnessed, calling for a ramp-up of capacity and increased test activity. Alongside these, a number of industry trends are emerging that will mandate more sophisticated test procedures.

IoT deployment
The Internet of Things (IoT) is only just starting to emerge, but activity around it is certain to ramp up very rapidly. It seems certain that IoT will have a major effect on how machine-to-machine (M2M) communication is executed, with a wide range of sectors benefitting from the technology. IoT will make possible more effective factory automation, building access, smart metering, surveillance, home appliance and countless other systems.

Big data
Here the desire to analyse and process more and more data means that greater quantities of it need to be handled. This data can then be used for all sorts of tasks, such as predictive modelling or the optimising of system performance. Compiling these extensive data sets obviously calls for robust networks that can carry high volumes of data.

Exponential growth in mobile data
Though mobile communication was originally voice-centric, this can no longer be said to be the case. Projections recently made by ABI Research indicate that mobile data consumption will have reached a staggering 2,289 MBytes per month by 2019. The latest edition of the Ericsson Mobility Report predicts that by 2020 around 70% of the global population will be using smartphones, and that current data usage in Europe and North America will have increased six-fold. The accessing of videos, playing of online games, sharing of photos, use of over-the-top messaging apps, plus the growing popularity of location-based services are all adding to the data load that mobile networks and their supporting backbone networks need to carry.

Prevalence of cloud computing applications
Continued movement away from localised, PC-hosted software packages towards cloud-based services will give users access to a broader array of applications, as well as offering much bigger reserves for data storage. Forecasts from IDC suggest that investment in cloud IT infrastructure during 2015 will reach $33.4 billion (approximately a third of the overall worldwide IT infrastructure spend). By 2019 this figure will have reached $54.6 billion and will represent almost half of the total outlay on IT infrastructure.

New networking hardware
In order to improve efficiency, network operators are progressively making greater use of reconfigurable optical add-drop multiplexer (ROADM) devices. As ROADMs dispense with the need for opto-electric conversion, network architectures into which they are incorporated exhibit much higher throughputs.

Security issues
It is not just the volume of data that needs to be dealt with; there are also growing concerns about maintaining security over networks, with cyberattacks and the like becoming increasingly commonplace. The frequency with which these take place is only likely to rise as remote working becomes more popular (telecommuting has already increased by 80% in the last decade) and cloud services gain further traction. Furthermore, the bring-your-own-device (BYOD) phenomenon has serious repercussions in terms of system security. Infected portable electronic equipment could, via interfacing with a company's data networks, pass on viruses or malware. The problem is accentuated by companies rarely having a well-formulated BYOD policy or adequate authentication practices.
In response to these different dynamics, networks must be continuously scrutinised and upgraded when appropriate. This is an ongoing process across the entire operating life of the network. First of all, effort must be made to ensure that the physical installation is fully compliant with the stipulated requirements. This may involve the physical testing of LANs to certify their conformance to the relevant standards (such as CAT6 or CAT7), which can be done using a number of very specific tools, such as the Fluke Versiv system for certifying either copper or fibre networks, or the EXFO Maxtester 940 FibreCertifier for fibre networks. Alternatively, it could mean verifying the characteristics of a fibre in a WAN, which may include dispersion testing and thorough optical time domain reflectometer (OTDR) measurements using products such as the Viavi (formerly JDSU) MTS2000 platform, the EXFO FTB-1 platform, or the Anritsu MT9000 platform.
With OTDRs, all of the major manufacturers produce a wide range of modules designed to meet different measurement requirements. You will need to select a unit that covers the wavelengths you will be operating at and a dynamic range that will allow you to measure the full length of your fibre. Depending on the wavelength, optical fibre can have an attenuation factor of between 3 dB/km and 0.2 dB/km. The higher losses occur at multimode wavelengths, whilst for single-mode fibre the figure is around 0.2 dB/km to 0.4 dB/km. Therefore, if you are looking at a fibre that is 100 km long, you will need a minimum dynamic range of 30 dB to 40 dB.
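As a rough illustration of that sizing exercise, the short Python sketch below estimates the dynamic range an OTDR would need for a given span. The attenuation figures, splice allowance and noise margin are illustrative assumptions rather than values taken from any particular instrument's datasheet.

# Rough estimate of the minimum OTDR dynamic range needed for a fibre span.
# Attenuation figures, splice loss and margin below are illustrative assumptions.

ATTENUATION_DB_PER_KM = {
    850: 3.0,     # multimode
    1300: 1.0,    # multimode
    1310: 0.35,   # single mode
    1550: 0.20,   # single mode
}

def min_dynamic_range(length_km, wavelength_nm, splice_loss_db=0.0, margin_db=6.0):
    """Return an approximate minimum OTDR dynamic range in dB.

    margin_db adds headroom so the far-end event still sits clearly above
    the noise floor (a common rule of thumb, assumed here).
    """
    fibre_loss = ATTENUATION_DB_PER_KM[wavelength_nm] * length_km
    return fibre_loss + splice_loss_db + margin_db

if __name__ == "__main__":
    # 100 km single-mode span with 1 dB of total splice/connector loss.
    for wavelength in (1310, 1550):
        dr = min_dynamic_range(100, wavelength, splice_loss_db=1.0)
        print(f"{wavelength} nm: at least {dr:.0f} dB of dynamic range")
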
Once the physical environment of the network has been verified, the data performance can then be tested. This could be a simple RFC 2544 test of the network (which covers fundamental parameters like throughput, latency and frame loss), however more advanced ITU-T Y.1564 test procedures (which address both quality of service and network performance elements) are now applied with growing regularity. Suitable solutions are offered by the major equipment brands (Anritsu, EXFO, Viavi and VeEX) and the product you select will often depend on other factors, such as any additional functionality you may require, like Fibre Channel capability.
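To make the RFC 2544 throughput step more concrete, the sketch below shows the kind of binary search on offered load that such a test typically performs for each standard frame size. The run_trial() function is a hypothetical stand-in for whatever traffic generator is actually driving the device under test.

# Sketch of an RFC 2544-style throughput search: for each standard frame size,
# find the highest offered load the device under test forwards without loss.

RFC2544_FRAME_SIZES = [64, 128, 256, 512, 1024, 1280, 1518]  # bytes

def run_trial(frame_size, load_percent):
    """Placeholder: offer traffic at load_percent of line rate for the trial
    duration and return the observed frame-loss ratio (0.0 means no loss)."""
    raise NotImplementedError("hook this up to your traffic generator")

def throughput_search(frame_size, resolution=0.5):
    """Binary-search the zero-loss throughput (as a % of line rate)."""
    low, high = 0.0, 100.0
    best = 0.0
    while high - low > resolution:
        load = (low + high) / 2
        if run_trial(frame_size, load) == 0.0:
            best, low = load, load   # no loss: try a higher load
        else:
            high = load              # loss seen: back off
    return best

# results = {size: throughput_search(size) for size in RFC2544_FRAME_SIZES}
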
There are a multitude of different aspects that need ongoing testing and monitoring as networks continue to be expanded and upgraded. It is critical, therefore, that the operators, contractors and enterprises which find themselves caught up in this maelstrom can source the test tools that will allow them to react to constantly changing demands.
With the increasing use of ROADMs to improve the performance of networks, more specialist test equipment is required in order to measure the performance of these networks. State-of-the-art optical spectrum analysers (OSAs) and power meters are designed to look at these systems. Suitable ROADM-capable OSAs include the EXFO FTB-5240SP and the Viavi OSA500R.
The widespread deployment of LTE networks has also meant that synchronisation of network infrastructure is more critical now than ever before. Even small errors can result in dropped calls and failed handovers, which will frustrate subscribers. Timing technologies such as Synchronous Ethernet (SyncE) and IEEE 1588v2 Precision Time Protocol (PTP) are therefore being widely adopted. To complement this, more in-depth testing of network stability is being carried out too. This can be done using products from leading timing equipment manufacturers such as Calnex, with its Paragon family of synchronisation testers, or by using the additional functionality available in the standard network testers previously mentioned.
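For readers unfamiliar with how PTP arrives at its timing figures, the sketch below shows the standard calculation of clock offset and mean path delay from the four timestamps exchanged in a delay request-response. The timestamp values are invented for the example, and the result relies on the usual PTP assumption that the path is symmetric.

# Illustrative IEEE 1588v2 (PTP) offset and mean path delay calculation
# from the four delay request-response timestamps (all in nanoseconds).

def ptp_offset_and_delay(t1, t2, t3, t4):
    """t1: Sync sent by master, t2: Sync received by slave,
    t3: Delay_Req sent by slave, t4: Delay_Req received by master."""
    offset = ((t2 - t1) - (t4 - t3)) / 2           # slave clock error vs master
    mean_path_delay = ((t2 - t1) + (t4 - t3)) / 2  # one-way delay estimate
    return offset, mean_path_delay

# Example: a slave clock running 150 ns ahead over a 1000 ns symmetric path.
offset, delay = ptp_offset_and_delay(t1=0, t2=1_150, t3=2_000, t4=2_850)
print(offset, delay)  # 150.0 ns, 1000.0 ns
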
After it has been established that the network is performing optimally, considerable thought must be given to the security aspects. The concerns about breaches and exposure to viruses mentioned previously dictate thorough testing of firewalls and the implemented security software via systems designed to emulate denial-of-service (DoS) attacks and the like. This testing is more specialised and requires the use of high-end network simulation tools from companies such as Ixia and Spirent.
All of these different dimensions need to be monitored and tested on a regular basis as the networks involved grow and are modified over time. As additional devices are connected, the profile of the network will change, affecting how well it responds to the demands being placed upon it.
To address the array of different test requirements outlined here, companies need to be able to obtain suitable instrumentation in a cost-effective and timely manner. Their supplier should have a broad array of products to choose from, so that the company can select the equipment that best matches its requirements at the time and is neither forced to make compromises nor left unable to upgrade should that become necessary. In addition, the company should, with help from their test equipment partner, be able to decide which sourcing option is most appropriate from a financial point of view (whether rental, purchase, new, used, or even divide-by programmes) and retain total flexibility.

About the author
George Acris is the Director of Marketing at Microlease.
