Back to the future >> preparing for an avalanche

When a bank implements major solutions, you need to watch like a hawk. The smallest glitch can set off an avalanche. When we were asked to validate the performance of an integrated financing solution for a leading commercial bank, we assumed it would be like any other project. It wasn't. The challenge thrown at us was to ensure the system stays future-proof for three years! From our experience, we knew that scripting and simulating a large user load was the easier part. Banks run on data and documentation, and this product caters to the agri-commodity business of the bank. We foresaw an avalanche of data thundering down!

The product enables financing for farmers against the commodity they have produced: the bank offers loans against commodities stored in warehouses. With a focus on commodity finance, the solution covers the full span of commercial operations, from sourcing of the account through operations, monitoring and control, recovery management, audit and, finally, closure through repayment. Almost every process in the system follows a two-stage pattern, with separate initiation and approval steps before it is complete.
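Most banking products implement this two-stage pattern as a maker-checker flow. A minimal sketch of how such a workflow might be modelled follows; the state names and the `approve` helper are illustrative assumptions, not taken from the product itself:

```python
from enum import Enum, auto


class Stage(Enum):
    """Illustrative maker-checker stages for a commodity-finance process."""
    INITIATED = auto()   # maker submits the request (e.g. a loan booking)
    APPROVED = auto()    # checker approves; the process is complete
    REJECTED = auto()    # checker rejects; the request goes back to the maker


def approve(current: Stage) -> Stage:
    """Advance a process from initiation to approval."""
    if current is not Stage.INITIATED:
        raise ValueError("only an initiated process can be approved")
    return Stage.APPROVED
```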

Based on this understanding, and after initial discussions with the bank, a detailed operational profile was derived. More than 40 scenarios were identified for the test, with a concurrency of 600 users.

The plan was to run the load test in three different combinations, so that the peak concurrency defined for each module is reached in at least one of the combinations, as sketched below.
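As a rough illustration, each combination can be expressed as a per-module concurrency profile that adds up to the 600-user peak. The module names come from the solution described above, but the per-module figures here are assumptions for illustration, not the bank's actual profile:

```python
# Illustrative only: per-module virtual-user counts for the three load-test
# combinations. Each combination drives a different module to its peak while
# total concurrency stays at 600 users.
LOAD_COMBINATIONS = {
    "combination_1": {"sourcing": 250, "operations": 200, "monitoring": 100, "recovery": 50},
    "combination_2": {"sourcing": 100, "operations": 300, "monitoring": 150, "recovery": 50},
    "combination_3": {"sourcing": 100, "operations": 150, "monitoring": 100, "recovery": 250},
}

for name, profile in LOAD_COMBINATIONS.items():
    assert sum(profile.values()) == 600, f"{name} must total 600 concurrent users"
```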

Since the key requirement was to simulate three years of usage of the system, test data creation was the single biggest success factor: the huge volume of data had to be created before the actual test could start. The system was heavily loaded with data – 2,000 users, 5,000 borrowers, 300 warehouses (100 government, 200 private/godown), 44,000 loans, 50,000 liquidations, 10 image uploads for every borrower and warehouse created, and so on…
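For reference, those targets can be written down as a small set of constants; the figures are the ones listed above, and the grouping is only for illustration:

```python
# Three-year data volumes the system had to hold before the load test started.
DATA_TARGETS = {
    "users": 2_000,
    "borrowers": 5_000,
    "warehouses": 300,          # 100 government, 200 private/godown
    "loans": 44_000,
    "liquidations": 50_000,
    "images_per_entity": 10,    # uploads per borrower and per warehouse created
}
```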

Scripts were developed to populate the system with the data needed to replicate three years of usage. The first step was to create the 2,000 users; creating each user also meant creating the role and branch data it depends on. After the users, we created the warehouses and the borrowers required for the test. The next major activity was loan bookings and liquidations: 40,000 loan bookings and 50,000 liquidation records were created by running JMeter scripts.
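The actual seeding was done with JMeter scripts. As a hedged sketch of the same idea in Python, the snippet below posts loan bookings in a loop against an assumed REST endpoint; the base URL, the `/loans` path, the payload fields and the `seed_loans` helper are all hypothetical:

```python
import requests

BASE_URL = "https://bank.example.com/api"   # hypothetical endpoint


def seed_loans(count: int, session: requests.Session) -> None:
    """Book `count` loans, one POST per loan, mirroring what the JMeter
    data-population scripts did at much larger scale."""
    for i in range(count):
        payload = {
            "borrowerId": f"BRW{i % 5000:05d}",   # spread loans across 5,000 borrowers
            "warehouseId": f"WH{i % 300:03d}",    # and 300 warehouses
            "commodity": "wheat",                 # illustrative commodity
            "amount": 100_000,
        }
        resp = session.post(f"{BASE_URL}/loans", json=payload, timeout=30)
        resp.raise_for_status()


if __name__ == "__main__":
    with requests.Session() as s:
        seed_loans(count=10, session=s)   # small trial run; the real scripts created 40,000+
```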

The interesting part was the set of functional issues that surfaced during the data creation, even though the product was supposed to have been tested thoroughly for functionality. The customer couldn't have been happier that these were caught. Once they were fixed, we were prepared for the next set of performance-related issues. Steadily, one step at a time, we ensured the avalanche would not occur for the next three years.
