New Age ‘Intelligent’ Record Keeping Systems
In its 2011 record keeping survey, Plan Sponsor found that more than 60% of current record keeping systems are legacy systems. Even systems built on mainframe technologies are considered outdated by the current generation of businesses. Technology writer Stewart Alsop predicted back in 1991 that the last mainframe would be unplugged in 1996. Not only did the prediction go wrong, but mainframes are still powering on well into the 21st century. Several banking and finance corporations continue to run systems based on mainframe technologies. One classic question still haunts technology pundits and business experts: are these companies lost in the labyrinth of their legacy systems, or have they grown complacent with them, given the huge costs and business-process upheaval associated with technology shifts?
Most mainframe purchasers during the late 1980s and early 1990s were big companies in data-intensive industries such as finance and insurance. Pension and retirement industry companies likewise invested in systems built on mainframe technologies, and they continue to bleed money year on year in the name of upkeep and service costs. Mainframe systems serve multiple users from a central repository with excellent virtualization and process-automation capabilities, but at the cost of extensive human attention and heavy space requirements. Mainframe-based legacy systems are generally found in organizations with really deep pockets, running mission-critical functions that require massive amounts of information to be processed reliably.
Earlier record keeping systems were built to perform specific sets of accounting transactions, with an ability to track participant balances and transactions. Since the pension and retirement industry was at a nascent stage, companies looked to technology for a competitive advantage. These legacy systems were designed and built for the convenience of programmers rather than business users, which is evident from the transaction constructs of the core. Even participant interactions were simple and minimal, and were made under the constant guidance and supervision of administrators. These legacy systems, driven by transaction files, were efficient at simple and narrow functions such as managing debits and credits, thereby gaining the confidence of the banking and finance industries. They cannot serve current industry needs, since they neither understand the business processes nor adapt to changing industry or market demands. Blame it on the monolithic architectures they are built upon.
Legacy record keeping systems are regarded as Pandora's boxes by current pension and retirement industry experts. Some of the most common issues in organizations running legacy record keeping systems are:
- 70-80% of the annual IT costs being consumed by maintenance alone
- Inflexible, closed or monolithic architectures, affecting the integration with other systems thereby hindering development and innovation
- Huge maintenance and enhancement expenses on the applications
- Higher time-to-market, due to the complex configuration involved in incorporating changes driven by evolving business or regulatory requirements
- Dwindling hardware and software support for legacy systems
- Hardware and software support issues for business-critical operations
- Scarce availability in the market of skills for the outdated technologies these systems were built on
- SLA related issues due to absence of real-time architecture
As a ‘Legacy Modernization’ technique, some organizations follow a mixed method of re-engineering and re-hosting their current legacy record keeping systems. Re-hosting moves a legacy system onto more advanced hardware platforms; it involves recompilation, migration of the system inventory and so on, with no major modifications to business logic or data. Some organizations go to the next level with ‘re-engineering’, which involves either reverse-engineering or forward-engineering methods. Through re-engineering these organizations aim to adopt newer architectural paradigms such as SOA, which offer more agility, flexibility, and reduced TCO. But a re-engineering strategy always carries the trade-off of longer elapsed time and high-risk migration plans.
Of late, a popular method of pseudo re-engineering is ‘web-enabling’ the legacy system. As mentioned earlier, because the architecture of a legacy record keeping system is inflexible or closed, web-enabling it is a daunting task.
Underneath the wrappers
As server-based operating systems overtook outmoded legacy systems in efficiency, legacy vendors tried every available option to withstand the competition by camouflaging their products’ obsolescence.
One very popular legacy system built on mainframe technologies, in a strategy to retain its existing customers, introduced a web application add-on to automate its manual, labour-intensive payroll and census data collection processes. Through a web-based interface, it attempted to automate and integrate the processes and workflows with the assistance of a rules-based validation process.
Merely providing a web-wrapper bolt-on does not qualify a system to embark on an SOA strategy. Legacy systems need complex integration techniques to incorporate their mainframe frameworks into a distributed SOA. Though there are several ways to integrate the mainframe with distributed systems so that it can participate in an SOA, the fundamental fact is that the mainframe is a mammoth creature to integrate with, and integration is a herculean task because legacy applications were built on proprietary systems and protocols. To initiate mainframe integration, one has to master the underlying data structures, which are basically COBOL programs; these in turn are mapped to the mainframe data, which is stored outside the distributed systems.
The copybook (a file structure), a reusable file description that the COBOL compiler copies into the code at compile time, is the source of metadata describing file structures on the mainframe. Mainframe databases may also have a data dictionary as a source of further metadata, but the copybook is still required for COBOL and can be used for integration purposes. Mainframe integration software products can take COBOL copybooks as input and create a mapping for transforming mainframe data into a structure that distributed systems can use, such as an ASCII XML schema; mainframe data is encoded in the EBCDIC character set, so this too must be translated.
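The copybook-driven translation described above can be sketched in a few lines of Python. The field names and widths below are an illustrative, hypothetical layout, not taken from any real copybook; Python's `cp037` codec is used as a stand-in for the common US EBCDIC code page.

```python
# Hypothetical copybook layout (illustrative only):
#   01 PARTICIPANT-REC.
#     05 PART-ID    PIC X(6).
#     05 PART-NAME  PIC X(10).
LAYOUT = [("part_id", 6), ("part_name", 10)]

def decode_record(raw: bytes) -> dict:
    """Translate one fixed-width EBCDIC record into ASCII fields.

    Mainframe data is EBCDIC-encoded; 'cp037' is the common US
    EBCDIC code page, so each field is sliced by the copybook
    widths and decoded to a regular Python string.
    """
    fields, offset = {}, 0
    for name, width in LAYOUT:
        chunk = raw[offset:offset + width]
        fields[name] = chunk.decode("cp037").strip()
        offset += width
    return fields

# Simulate a mainframe record by encoding ASCII text to EBCDIC.
record = "P00042JANE DOE  ".encode("cp037")
print(decode_record(record))
```

In a real integration product the layout would be parsed from the copybook itself rather than hard-coded, and the decoded fields would feed an XML schema or similar distributed-system structure.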
Many legacy mainframe systems do not directly update master files as transactions occur. They write the transactions to a flat file (sequential records) and use batch programs to read the flat files and update the master files. This is typically done at night when the online systems are down. The batch processing technique predates On-line Transaction Processing (OLTP) systems, such as CICS, and in many cases is an artefact of these legacy beginnings. Online processing uses an OLTP system such as CICS to process multiple users’ requests as transactions and updates files or databases as the transactions occur. The OLTP system manages concurrent access to resources with support for resource locking and transactions.
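The nightly batch pattern above can be illustrated with a toy Python sketch. The CSV transaction layout, field names, and debit/credit codes here are assumptions for illustration, not a real mainframe record format.

```python
import csv
import io

def run_batch(master: dict, transactions: str) -> dict:
    """Apply a flat file of transactions to master balances.

    Mimics the nightly batch: nothing in `master` changes when a
    transaction occurs, only when this program reads the flat file.
    Each record is: participant-id, CR (credit) or DR (debit), amount.
    """
    updated = dict(master)
    for rec in csv.reader(io.StringIO(transactions)):
        part_id, kind, amount = rec[0], rec[1], float(rec[2])
        delta = amount if kind == "CR" else -amount
        updated[part_id] = updated.get(part_id, 0.0) + delta
    return updated

master = {"P001": 100.0, "P002": 250.0}
tx_file = "P001,CR,50\nP002,DR,25\nP003,CR,10\n"
print(run_batch(master, tx_file))
```

The key point the sketch makes is the delay: balances reflect reality only after the batch window, which is exactly why such systems cannot report in real time or be driven by business events.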
At a high level, you can integrate with the mainframe through its processes or through the underlying data. Integrating at the process level might involve techniques such as screen scraping, peer-to-peer communication, messaging, CICS adapters, API adapters or web services between the distributed system and the mainframe. Data integration techniques, such as file transfer, database connectivity, database adapters and file adapters, operate at the data level without direct interaction with the mainframe process, for example using FTP or a database connection via ODBC/JDBC.
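A minimal sketch of the data-level style follows, with Python's built-in `sqlite3` standing in for an ODBC/JDBC connection to a relational copy of mainframe data. The table and column names are illustrative assumptions.

```python
import sqlite3

# sqlite3 here stands in for an ODBC/JDBC connection to the
# mainframe-side database; table/column names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE participant (part_id TEXT, balance REAL)")
conn.executemany(
    "INSERT INTO participant VALUES (?, ?)",
    [("P001", 150.0), ("P002", 225.0)],
)

# The distributed system queries the data directly, never invoking
# any mainframe process -- integration happens purely at the data level.
rows = conn.execute(
    "SELECT part_id, balance FROM participant ORDER BY part_id"
).fetchall()
print(rows)
```

The same pattern applies to the file-transfer variant: pull a flat file over FTP, then parse it locally, again without touching the mainframe's online transaction processing.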
With all these integration techniques available, one has to remember that development on legacy technologies often faces a conflict between transactional priority and development priority: mainframe business transactions are prioritized over software development. Whatever the vendors of these legacy systems might claim, the fact remains that the data cannot be analyzed in real time; it goes through the same old delayed, batch-oriented reporting processes offered by the mainframe. These systems cannot be driven by business events.
Bolt-on workarounds, aka ‘The Labyrinth’, like web wrappers only increase costs; they add little value to the business. These web solutions cannot understand the end-to-end process, since the legacy systems they sit on are driven by core technical transaction constructs, not by business events. They are merely middleware, built to redefine the interactions for specific transactions, with limited communication scope with upstream and downstream systems. They may also introduce duplication of data and rules, making issues harder to diagnose in some cases. If not integrated well with the core system, the data displayed in the web solution may differ from the data residing in the core database. These web solutions have limited control over business interactions and external interface dependencies, and thus do not represent the process engine as a whole. A wrapper therefore ultimately cannot decrease the TCO or enhance the business’s ability to adopt new service offerings.
These bolt-on solutions may claim to be comprehensive and interactive, to automate business processes, eliminate redundancy, improve efficiency, review performance and so on, but they are just wrappers around the existing legacy system. Fixing the root cause, i.e., replacing the legacy record keeping system, is the right solution, not wrapping it in attractive UIs, i.e., merely web-enabling it. The landscape of the record keeping business has changed dramatically because of higher participant awareness and involvement; participants cannot be fooled with feel-good web wrappers. The ‘old wine in a new bottle’ trick does not work for legacy systems dressed up in user-friendly web solutions.
A high-level comparison of the options available in the market today for modernizing legacy record keeping systems is tabulated below:
From the table it is clear that organizations can leverage their investments in existing legacy record keeping systems through re-hosting and re-engineering techniques with minimum effort, thereby extending the life span of those systems; however, this is only a tactical approach. In one of its surveys, Gartner predicted that 25-30% of employees with legacy skills would retire by 2012. With more and more talent moving away from legacy technologies, the sustainability of legacy applications is at a crossroads. However reliable, scalable and dependable these legacy systems might be, issues related to obsolete technologies and platforms, scarcity of skills, high operational costs and complex integration grow every day. As a strategic move, considering the millions of stakeholders, organizations on legacy record keeping systems should move on to the ‘New Age’: intelligent record keeping systems.
Intelligent New-Age ‘Game Changing’ Record Keeping Systems
Legacy systems have been the cornerstone of data centre computing for decades, with their scalability, stability, and capability to manage colossal workloads without “server sprawl”. These advantages no longer outweigh the huge associated hardware and labour costs, the complex and aging architectures, and the pricey software licensing and maintenance expenses. It is time to move on to intelligent record keeping systems built on the latest technologies, which are more stable, scalable and reliable, and which can educate and empower all the key stakeholders with more choices and with greater ease.
With newer technologies offering the same levels of performance, reliability, security and scalability at lower infrastructure and transaction costs, organizations with an eye to the future should switch to new record keeping systems built on cutting-edge technologies rather than web-enabling their legacy systems. These ‘Game Changing’ record keeping systems deliver competitive advantage through better innovations and product offerings.
Many companies still sticking to age-old legacy systems are apprehensive about replacing their legacy record keeping systems with the new age, intelligent record keeping systems. They are worried about replicating the existing business logic in new record keeping systems. Not all new age record keeping systems can reuse the existing business logic; however, a new breed of record keeping systems with such capabilities is beginning to enter the market. These systems can take over from legacy systems with minimal business process re-engineering, customization and business logic modification. They offer better business processes than home-grown legacy systems and enable shorter time-to-market cycles for all regulatory and business changes, as they drastically reduce the source-code maintenance and modification required compared to re-engineering or web-wrapper approaches. They also enable new functionality enhancements to be implemented in shorter time frames.
Some of the benefits offered by these new age record keeping systems are:
- Enhanced business process efficiency
- Faster business process integration with shorter time-to-market
- Better business decision making through intelligent dashboards and analytics
- Multi-channel access and enhanced user experience
- Enhanced product and service offerings
These new-age record keeping systems understand the complex interdependencies of legacy record keeping systems and communicate intelligently with them. They excel at precise and complete extraction of business logic from the legacy record keeping systems, without compromising business objectives or business continuity.