    Financial Models Bog Down in Corporate Bureaucracy

    13 January 2013

    Although financial firms consider their models an important part of their success, development and deployment of new models is often bogged down in bureaucracy, according to survey results released today by MathWorks, the developer of MATLAB and Simulink.

    The survey showed that 88 percent of financial institutions believe they would lose their competitive edge, 79 percent believe their profits would decrease, and 54 percent believe risk would increase if they were operating poor models, for example flawed or outdated ones.

    They were also concerned about slow model development, even though the recent Knight Capital loss of $440 million to a software glitch shows the dangers of inadequate testing. The MathWorks survey reported that 82 percent thought slow model development would leave firms lacking the agility to respond to market changes, and 74 percent believed it would result in ineffective risk management.

    It takes months to integrate models into the business, according to the report.

    “When asked what the ideal integration time would be, respondents sometimes appeared despondent,” says the report. “Many were resigned to the fact that models simply cannot be integrated without iterations of edits and approvals. Interestingly, academics were looking for quite short turnarounds, with 40 percent looking to reduce the integration time to mere hours. This potentially signifies a lack of appreciation of the bureaucracy and testing experienced within the large institutions.”

    The report notes that sell-side respondents “were emphatic that it’s the bureaucracy within the firm that is slowing this process down.” Proprietary desks reported that they can go live with their models in a few hours. MathWorks did not address ways to reduce the bureaucracy.

    Testing in finance can be fairly ad hoc, said Steve Wilcockson, industry manager for financial services at MathWorks, and it varies by location and industry segment.

    “In a larger bank they do have global test infrastructures, but they can be unwieldy. It takes a long time to deploy your algo because the test process is time-consuming. A trading shop wants to get ideas out there very quickly because that is your competitive advantage. Fortunately there are tools that work nicely with MATLAB whereby you can automatically test algos in an infrastructure.” The company has extensive experience testing algos in automobiles and airplanes, where the quality of the algo is critical.
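    The kind of automated algo testing Wilcockson describes can be sketched in a few lines. The moving-average crossover strategy and the fixture prices below are illustrative assumptions for the sketch, not anything drawn from the survey or from MathWorks tooling:

```python
# Illustrative sketch: automated regression checks for a simple trading algo.
# The crossover strategy and the fixture prices are hypothetical examples.

def moving_average(prices, window):
    """Trailing simple moving average; None until the window fills."""
    out = []
    for i in range(len(prices)):
        if i + 1 < window:
            out.append(None)
        else:
            out.append(sum(prices[i + 1 - window:i + 1]) / window)
    return out

def crossover_signal(prices, fast=2, slow=3):
    """+1 when the fast MA is above the slow MA, -1 when below, else 0."""
    fast_ma = moving_average(prices, fast)
    slow_ma = moving_average(prices, slow)
    signals = []
    for f, s in zip(fast_ma, slow_ma):
        if f is None or s is None:
            signals.append(0)
        elif f > s:
            signals.append(1)
        elif f < s:
            signals.append(-1)
        else:
            signals.append(0)
    return signals

# Checks like these would run automatically on every change to the algo.
assert crossover_signal([1, 2, 3, 4, 5])[-1] == 1    # uptrend: fast MA leads
assert crossover_signal([5, 4, 3, 2, 1])[-1] == -1   # downtrend: fast MA lags
assert crossover_signal([7, 7, 7, 7])[-1] == 0       # flat market: no signal
print("all algo regression checks passed")
```

    Running such a suite on every code change is the point: the time-consuming manual test process becomes a fast, repeatable gate before deployment.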

    MathWorks suggests some ways to speed the process of model integration, including taking models directly into production rather than recoding them, and rationalizing the development and implementation process. Firms reported problems finding enough quantitative analysts and developers to carry out model development.

    “Our respondents were concerned with building algorithms and numerical analytics that essentially define ideas or intellectual property that relate to applying data models to the real world,” said Wilcockson. “Whether building risk management or trading strategies, it is often the algo that drives the success or failure of a trading strategy.”

    That is especially important in automated high-frequency trading (HFT), where a flawed model can lead to immediate losses, MathWorks notes. An interesting note in the report: 59 percent of financial institutions plan to increase their use of automated trading. Meanwhile, many on the buy side (37 percent) think automated trading has had its day and are looking for alternatives, including bespoke human-driven trading models.

    Data is a concern at financial firms, but it is quality rather than quantity they are worried about. In trading, 49 percent of firms reported dealing with gigabytes and 28 percent with megabytes, although 19 percent on the sell side said they were dealing with datasets in the terabytes.

    “It’s not just the size of the data but what you do with it,” explained Wilcockson. “We encounter more challenges with simulated data than with real data. Organizations have to construct their own data for valuation and risk modeling.”
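    Constructing your own data, as Wilcockson describes, often means simulating it. A common approach, used here purely as an illustration (the drift, volatility, and horizon parameters are hypothetical assumptions, not from the survey), is to generate price paths with geometric Brownian motion:

```python
# Illustrative sketch of "constructing your own data": simulated daily price
# paths via geometric Brownian motion, for valuation or risk experiments.
# All parameters (s0, mu, sigma, days) are hypothetical.
import math
import random

def simulate_gbm_path(s0=100.0, mu=0.05, sigma=0.2, days=252, seed=42):
    """One year of daily prices under geometric Brownian motion."""
    rng = random.Random(seed)       # seeded for reproducible test data
    dt = 1.0 / days
    prices = [s0]
    for _ in range(days):
        z = rng.gauss(0.0, 1.0)     # one standard normal shock per day
        prices.append(prices[-1] * math.exp(
            (mu - 0.5 * sigma ** 2) * dt + sigma * math.sqrt(dt) * z))
    return prices

path = simulate_gbm_path()
print(f"simulated {len(path)} daily prices, final = {path[-1]:.2f}")
```

    Because the generator is seeded, the same synthetic dataset can be regenerated for every valuation or risk run, which is exactly the control that real market data does not offer.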
