In today’s era of high technology and fast-paced digital marketing, the customer is all-important. The person who visits your website and decides to engage with your brand, product, or service will make or break your business, and they will judge by the look of your website whether you have something worth their time and money. The challenge is that you have a very narrow window in which to seal the deal: research tells us that 55% of site visitors spend less than 15 seconds on a page. That is all the more reason to make sure your infrastructure is running optimally at all times. A downed website is pretty much the end of any customer engagement; visitors will simply go somewhere else.
Web and server monitoring is essentially about ensuring that your infrastructure can host your applications smoothly and without service interruptions. In an era where customer expectations are higher than ever, this kind of monitoring has become standard practice: smart businesses that value their customers will do whatever it takes to be alerted to downtime issues before they impact the end user. What many businesses may not yet realize, however, is the importance of “monitoring frequency” — the pace at which your website performance provider pings your environment to verify that everything is working properly. Many vendors are still stuck offering 5-minute intervals when market demand has shifted to 1-minute checks.
Let’s look at it this way. Say you have a traffic spike lasting only a couple of seconds. That spike occupies less than 1% of a five-minute polling cycle, so even though it could significantly impact business operations and communications, the anomaly is unlikely to be caught if monitoring runs only every 5 minutes. This is why it is important to establish a reliable baseline for polling your environment.
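To make that arithmetic concrete, here is a minimal sketch (illustrative numbers only, not tied to any particular monitoring product) of the chance that a single instantaneous poll lands inside a brief event, under the simplifying assumption that the event starts at a uniformly random point in the polling cycle:

```python
def detection_probability(event_seconds: float, interval_seconds: float) -> float:
    """Probability that one instantaneous poll per cycle lands inside an
    event of the given duration, assuming the event's start time is
    uniformly random relative to the polling schedule (a simplification)."""
    return min(1.0, event_seconds / interval_seconds)

# A 2-second spike against a 5-minute (300 s) polling cycle:
print(detection_probability(2, 300))   # under 1% -- almost certainly missed
# The same spike against 1-minute (60 s) polling:
print(detection_probability(2, 60))    # five times likelier to be caught
```

The same reasoning applies to worst-case detection delay: with 5-minute polling, an outage can persist for nearly five minutes before the first failed check, versus one minute at the higher frequency.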
To make the case for continuous monitoring, consider the industries where IT and business leaders wouldn’t think twice about 1-minute monitoring frequency. Can you think of a few industries where such a service would be most in demand? Are you, in fact, in one of these verticals?
The following article explores 7 industries where continuous monitoring is a must-have. As you read about these specific industries, keep in mind that very few businesses today, if any, would not benefit from continuous monitoring. Take it from one industry insider: “Continuous monitoring is the single best protection an organization can have to safeguard network health, while taking advantage of the efficiencies and agilities that the new extended IT landscape offers.”
Banks and Financial Institutions

Due to the rise of mobile commerce and online banking, more and more people are completing financial transactions online. In fact, an American Bankers Association survey in 2011 revealed that, for the first time, older Americans preferred doing their banking online: 62% preferred online banking over branches and ATMs (28%). Banks, more than ever, have to be in complete harmony with the needs of the “digital customer.” Everyone expects to log in at any time of day or night to check balances, transfer funds, or deposit checks (yes, virtually, through a smartphone) in a way that is simple, seamless, and secure. Banks must therefore continually monitor and understand how customers interact with both their traditional websites and their mobile applications in order to provide the best possible user experience. There’s little question that banks require a sophisticated infrastructure, including continuous website and mobile monitoring, to ensure that performance issues are detected long before they impact customers.
E-Commerce

E-commerce continues to grow at a staggering rate, with 2015 web sales reaching $341.7 billion, a 14.6% increase over 2014’s $298.3 billion. But the rapid scaling of the industry has also brought tremendous new levels of competition, not to mention rising demand for a richer, friendlier, and faster experience. Common benchmarks show that if your site doesn’t complete a full page load in 3 seconds or less, customers will abandon it and go to your competitors. Let’s face it: a brick-and-mortar business wouldn’t do well with poorly stocked shelves and lousy customer service. Likewise, your e-commerce site simply won’t survive with high page load times or other erratic, buggy site features. A robust continuous monitoring strategy is beyond question for anyone operating an e-commerce website in 2016.
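As a rough illustration of the 3-second rule, here is a minimal response-time check sketched with only Python’s standard library (real synthetic monitoring also measures DNS lookup, TLS handshake, and in-browser render time, which a plain HTTP fetch does not capture; the budget constant is an assumption taken from the benchmark above):

```python
import time
import urllib.request

LOAD_TIME_BUDGET = 3.0  # seconds, per the abandonment benchmark above

def page_load_time(url: str, timeout: float = 10.0) -> float:
    """Seconds to fetch the full response body for a URL."""
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=timeout) as response:
        response.read()
    return time.monotonic() - start

def within_budget(url: str) -> bool:
    """True if the page comes back inside the load-time budget."""
    return page_load_time(url) <= LOAD_TIME_BUDGET
```

A monitoring loop would run a check like `within_budget` from several geographic locations on a 1-minute schedule and alert when it fails.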
Government

In today’s world of increasing cyber-crime, government agencies of all types must lead in deploying solutions that protect national security interests against any and all threats. The U.S. government set the pace for continuous monitoring with this 80-page standards report in 2011, and a section of its website dedicated to continuous monitoring states: “In today’s environment of widespread cyber-intrusions, advanced persistent threats, and insider threats, it is essential for agencies to have real-time accurate knowledge of their enterprise IT security posture so that responses to external and internal threats can be made swiftly.” While continuous monitoring started with the U.S. government, organizations throughout the private sector need the same real-time review of their systems and infrastructure to ensure company assets are protected against cyber-crime.
Transportation

New technologies have made our lives far more efficient and productive, and everyone today is contemplating a near future in which retail drones deliver packages to our front lawns and autonomous vehicles take the boredom out of the daily commute. But there is a dark side to all this change. In 2015, car-hacking researchers Charlie Miller and Chris Valasek demonstrated a security vulnerability in the “Uconnect” internet-connected computer system on certain models of connected cars. The weakness would allow hackers to plant code and take over major systems such as steering, brakes, and transmission, all from a laptop across the country. Adding to the nightmare scenario, one security expert claimed to have hacked into the flight systems of a number of commercial aircraft he was traveling on. For government transportation leaders and officials, protecting the vast public network of roads, rails, and airspace is front and center in an era of increasing terrorism. By the same token, private makers of cars, planes, and drones must become advocates for the most stringent real-time continuous monitoring to ensure their products meet the highest standards of security humanly possible.
Web Hosting Providers
Anyone who has operated a website knows how important it is to choose the right web host. Web hosting providers, like any business, must do everything they can to please their customers. With over a billion websites on the Internet demanding an enormous amount of bandwidth, competition in this market is fierce. At a minimum, customers should insist that providers offer a robust security policy along with a 99.99% uptime guarantee. Companies that do not offer continuous traffic monitoring, daily malware scans, and denial-of-service mitigation simply are not living in the real world. Other questions to ask and insist on: What is the provider’s plan in case of a security incident? What are their response times? Is a backup of the data available, and what is the schedule for maintenance and software updates? Any credible web hosting provider today must be top-of-the-line when it comes to continuous monitoring best practices.
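To put the 99.99% figure in perspective, here is a quick sketch of how much downtime such a guarantee actually permits — a back-of-the-envelope calculation, not any provider’s SLA terms:

```python
def downtime_budget_minutes(uptime_percent: float, days: float = 365.0) -> float:
    """Maximum downtime, in minutes, allowed over the period
    by a given uptime guarantee."""
    total_minutes = days * 24 * 60
    return total_minutes * (1.0 - uptime_percent / 100.0)

print(round(downtime_budget_minutes(99.99), 1))  # 52.6 -- under an hour per year
print(round(downtime_budget_minutes(99.9), 1))   # 525.6 -- nearly nine hours per year
```

The difference between “three nines” and “four nines” is roughly eight hours of outage per year, which is exactly why the fine print of an uptime guarantee is worth reading.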
Cyber-Security

The last few years have been a major wake-up call for the cyber-security industry. With the growing prominence of the Internet of Things, the potential for malicious attacks now extends well beyond cyberspace. A virtual attack once might temporarily disable a website or compromise an email or bank account; because of the interconnectedness of devices and people, the threat now reaches our cars, our homes, and even our lives. The scale and sophistication of cyber-crime has grown to the point that nothing feels safe anymore. Fortunately, a growing number of cyber-security services on the market address these concerns, with major names like IBM, Intel, and Symantec providing security solutions. All of the players in this space must be at the front of the line in providing real-time continuous monitoring solutions that cover sensor detection on all devices, active and passive vulnerability assessment, review of log data from all on-premises devices, attack path analysis, compliance reporting, and more.
Social Media

According to recent metrics, approximately a third of the world’s population of 7.4 billion are active social media users. The biggest social media and network sites in the world, such as Facebook, Twitter, and LinkedIn, have not only adopted continuous monitoring but, because of their massive usage requirements, have been major contributors to the development of the industry. Failure is simply not an option. The last thing a company like Twitter wants is to display a message that something is “technically wrong.” When that happens, it makes news headlines and the brand takes a major hit. Continuous monitoring on the backend of any social media and social network site is simply not negotiable.