CEOWORLD magazine
Saturday, October 22, 2016

Why Financial Services Companies’ Databases Need Continuous Monitoring and Proper Data Stewardship

While perimeter security, mobile devices, and the cloud tend to grab the headlines, in reality it's the database repositories and the private financial information stored in databases that are the actual target of most breaches. Comprehensive database security is commonly an overlooked area within financial services organizations, yet it is one of the most critical.

Databases pose a unique security challenge for banks and financial institutions of all sizes. The database infrastructure at financial services companies is usually quite extensive, with many databases remaining unknown, unmonitored, or simply left completely unmanaged and, worse, unsecured.

It's a common mistake for financial services organizations to have only limited visibility into their database infrastructure, which provides an open avenue for cyber attackers. Once inside the database infrastructure, an attacker can operate strategically and remain undetected for many months, stealing records, compromising credentials, and installing malware.

“Banks are under an onslaught of attacks from bad actors, so the fact that 12% of banking CEOs reported that they don’t know if they’ve been compromised is troublesome. Cyber is a business bottom-line issue: a true CEO issue,” said Charlie Jacco, Financial Services Cyber Leader at KPMG. “While CEOs may be more privy to information regarding the exact number of cyber technology deployment and hack attempts, all employees should know and be in lock-step on their bank’s greatest vulnerabilities and concerns as it pertains to how that bank views cyber security. The data shows, on a leadership level, a strong difference in opinions.”

In fact, according to KPMG’s 2016 Banking Outlook Survey published earlier this year, approximately 47 percent of banking executive vice presidents and managing directors, as well as 72 percent of senior vice presidents, reported they do not have insight into whether their institution’s security has been compromised by a cyberattack over the past two years. These numbers are alarming and point to a critical need for securing and monitoring databases. Any attack that reaches the core networks can put the financial institution databases and private information at extreme risk.

With breaches increasing at an alarming rate, it’s important for financial organizations to follow thorough data stewardship practices and continuously monitor all of their databases – from their initial deployment, throughout their lifecycle, and into their retirement when the database is decommissioned. Monitoring needs to be detailed down to the “table level” to completely understand the database security profile, data ownership, purpose of the data, and any changes to the data stores. Without an in-depth understanding of every database and detailed knowledge of the private data residing in databases throughout the network, it is impossible to keep data secure and prevent a serious breach. IT security personnel need to put the proper tools, policies and procedures in place.


The process starts with a comprehensive assessment of the database infrastructure. It is recommended to use non-intrusive monitoring tools to identify every database on the network and every application or user that is accessing them. Further, the database’s business purpose needs to be documented, the nature and sensitivity of the data stored in the databases determined, and proper retention policies established. It is also important to know what will be done with each database when its retention time has expired. “Zombie” databases that should have been decommissioned long ago are an open opportunity for attack because the database may not be properly patched, credentials may not have been updated, and no one is actively monitoring the database activity.
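As a rough illustration of the discovery step, the sketch below sweeps a list of hosts for well-known database listener ports. This is a deliberately crude, active stand-in for the non-intrusive (passive) monitoring tools recommended above, and the host list and port-to-engine mapping are assumptions made purely for the example.

```python
import socket

# Default listener ports for popular database engines (illustrative, not exhaustive).
DB_PORTS = {
    1433: "Microsoft SQL Server",
    1521: "Oracle",
    3306: "MySQL/MariaDB",
    5432: "PostgreSQL",
    27017: "MongoDB",
}

def discover_databases(hosts, timeout=0.5):
    """Return (host, port, engine) tuples for every database port that answered."""
    found = []
    for host in hosts:
        for port, engine in DB_PORTS.items():
            with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
                s.settimeout(timeout)
                # connect_ex returns 0 when the port accepted a TCP connection
                if s.connect_ex((host, port)) == 0:
                    found.append((host, port, engine))
    return found
```

Every endpoint such a sweep turns up should then be matched against documentation of its business purpose, data sensitivity, and retention policy.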

Once policies are established and the verification of all databases is complete, financial organizations should then continuously monitor these databases throughout their lifecycle to ensure policies and procedures are updated and effectively enforced. Key to stopping serious data breaches is paying specific attention to who is using or accessing a database, how it’s being used, and identifying key changes in use patterns. Identification of an unknown user or uncommon usage pattern may be a sign that there’s a malicious attacker on the network.
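The usage-pattern check described above can be sketched as a simple comparison against a baseline. The event format here, (user, table) pairs drawn from database audit logs, and the spike threshold are assumptions for illustration only; production monitoring would draw on far richer behavioral features.

```python
from collections import Counter

def flag_anomalies(baseline_events, current_events, spike_ratio=5.0):
    """Flag users never seen in the baseline, or whose query volume jumped sharply.

    Each event is an assumed (user, table) pair taken from audit logs.
    """
    base_counts = Counter(user for user, _ in baseline_events)
    curr_counts = Counter(user for user, _ in current_events)

    alerts = []
    for user, count in curr_counts.items():
        if user not in base_counts:
            alerts.append(f"unknown user: {user}")
        elif count > base_counts[user] * spike_ratio:
            alerts.append(f"usage spike for {user}: {base_counts[user]} -> {count}")
    return alerts
```

An "unknown user" alert is exactly the kind of signal that may indicate a malicious attacker already operating on the network.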

“Zombie” databases are particularly vulnerable to insider threats, advanced persistent threats (APTs), and compromised credentials. Attackers can use them as an open door to get access to other databases and potentially private financial information across the network.

In a similar fashion, “rogue databases” can present a large and very high-risk attack surface as well. These one-off databases may have been commissioned during the development phase of a new application and connected to the network without the IT team being aware of their existence. While developers may think they are doing something innocuous, without IT going through the proper lifecycle steps, the data won’t be properly protected. Private data on these rogue databases resides outside the scope of the security team, leaving the organization highly vulnerable. Without intelligent monitoring to identify when a new database is active on the network and to check the database against current data asset inventory, it’s not possible to properly secure its data.
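Checking newly discovered databases against the current data asset inventory reduces, at its core, to a set difference. The endpoint addresses below are hypothetical; a real inventory record would also carry owner, business purpose, and retention metadata.

```python
def find_rogue_databases(discovered, inventory):
    """Compare live database endpoints against the approved asset inventory.

    Both arguments are iterables of (host, port) tuples; returns the endpoints
    that are running on the network but were never registered with IT.
    """
    return sorted(set(discovered) - set(inventory))

# Hypothetical data: one endpoint spun up by a dev team and never registered.
inventory = {("10.0.1.5", 5432), ("10.0.1.6", 3306)}
discovered = {("10.0.1.5", 5432), ("10.0.1.6", 3306), ("10.0.2.99", 5432)}
print(find_rogue_databases(discovered, inventory))  # -> [('10.0.2.99', 5432)]
```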


With so much attention focused on securing the perimeter, mobile devices, and the cloud, financial services IT teams risk ignoring the security of their organizations' "crown jewels": all of the databases residing on their network. To prevent a serious data breach, every database needs to be identified, inventoried, continuously monitored, and retired if not in use. Protecting sensitive information requires IT teams to know who is accessing each database and what it is used for, and to ensure the data is protected for the lifetime of the database. Without a comprehensive database monitoring model in place, financial institutions run the risk of a serious information breach, and of ending up as front-page news.

Written by:

Steven D. Hunt – President and Chief Operating Officer, DB Networks, Inc. 

Steven D. Hunt is responsible for leading the development and operation of the company. Prior to DB Networks, Steve was SVP, Engineering and Operations at Coradiant, and later served on its Technical Advisory Board before the company was acquired by BMC Software. He was an early employee and VP of Engineering at Netsift (acquired by Cisco Systems in 2005), and an early employee at Copper Mountain (IPO in 1999), where he held several positions, starting as VP of Engineering and later GM & SVP of Engineering. Steve spent many years at Bell Labs, including heading its Internetworking Department. He earned a B.S. in Electrical Engineering from Drexel and an M.S. in Electrical Engineering from Stanford University.
