
Giving The Fat Finger To Fat Finger Syndrome


A recent item in the online edition of London’s Daily Mail reported that a stock trader in Japan had executed an erroneous trade that sent stock markets across Asia tumbling and led to his brokerage losing some 27 billion yen (about $227 million U.S.).

The trouble came when Mizuho Securities, the brokerage arm of Japan's second-largest banking group, tried to sell 610,000 shares of a recruitment agency, J-Com, at one yen (less than one cent) each. The trader actually meant to sell one share at 610,000 yen (just over $5,000 U.S.), but the Tokyo Stock Exchange processed the trade, even though the price was clearly wrong.

This happened, according to the Daily Mail, “despite major banks and financial institutions pouring millions of pounds into tightening up their computer systems to prevent trades being made in error.” This kind of error has recently been dubbed “fat finger syndrome,” apparently because it arises from data mistyped by bumbling human operators.

But let’s think about this. Clearly this is not a case of some harried data clerk whose engorged, greasy finger slipped off the 1 key and landed on the 2. Obviously, the inputter put numbers in the wrong fields entirely (reversing price and number of shares), which might lead many of us to call this a case of “fat head syndrome.”

It’s very interesting, however, that although we lay the blame on a flawed human being, those involved in this incident seem to think that the solution lies in having better computer systems, as opposed to more alert human operators.

At one time, it was common in computing circles to point out proudly that “computers don’t make mistakes,” and that is true, assuming the human beings who program them don’t make any mistakes. But there are human mistakes aplenty in programming and software development. If you don’t believe it, ask yourself why you have to download software patches continually from the largest–and presumably most diligent–software companies, or why some software refuses to work with other applications without some prodding or reprogramming from (gasp!) humans.

And there is the crux of the matter. Mistakes, despite the understandable assumption they are anomalies, actually are the norm in the world of computing. The reasons for this are economic as much as anything else. Most software companies–at least off the record–will admit that they could provide products that are closer to being bug-free, but they would add that their development costs would be so high as to render the products unaffordable for many buyers.

So, what can we say about the inappropriately dubbed “fat finger syndrome” incident and others like it? Certainly, a human error occurred, but I can’t help thinking that there should have been some safeguards–both electronic and human–in place to prevent such a bonehead maneuver.

One would think, for example, that such a precipitous drop in price would have set off loud alarms in the stock exchange’s systems, but apparently this was not so. On the other hand, even a cursory glance at the trade by someone familiar with the market should have been enough to alert that individual to examine the transaction.

The trouble is that the financial industry demands instantaneous transactions and results, so we don’t leave time for human scrutiny, or for built-in safeguards that might call for it. Just as we are willing to put up with a certain amount of error in our software applications, we seem willing to tolerate errors in transactions–that is, unless they cost $227 million.

Here’s my solution: Have human programmers take the time to install more safeguards in computer systems that automatically kick out transactions with impossibly large or small numbers. Then have those transactions examined by an experienced human being who can do the proper checking. Yes, the transaction will take longer to execute, but most of us would probably agree that it’s worth a few minutes’ delay to save a huge wad of cash, not to mention someone’s job. Instead of better systems and less human involvement–which often is touted as a solution to error–we actually need more such involvement.
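To make that concrete, here is a minimal sketch, in Python, of the sort of sanity check I have in mind. The names, threshold, and reference price are my own illustrative assumptions, not anything Mizuho or the Tokyo Stock Exchange actually runs; a real system would pull the reference price from live market data.

```python
# A minimal sketch of the kind of pre-trade safeguard described above.
# All names and thresholds are hypothetical illustrations.

from dataclasses import dataclass


@dataclass
class Order:
    symbol: str
    quantity: int   # number of shares
    price: float    # limit price per share, in yen


def needs_human_review(order: Order, reference_price: float,
                       max_deviation: float = 0.20) -> bool:
    """Flag orders whose price strays too far from the last market price.

    max_deviation is the tolerated fractional move (20% here, an arbitrary
    illustrative threshold). An order priced at 1 yen against a reference of
    610,000 yen deviates by nearly 100%, so it gets kicked out for a human
    to examine before it ever reaches the market.
    """
    deviation = abs(order.price - reference_price) / reference_price
    return deviation > max_deviation


# The Mizuho order: 610,000 shares at 1 yen, against a market price
# near 610,000 yen. This would be held for review rather than executed.
order = Order(symbol="J-Com", quantity=610_000, price=1.0)
print(needs_human_review(order, reference_price=610_000.0))  # True
```

Note that a check this simple would also catch the reversed-fields blunder at the heart of this story: swap price and quantity and the price lands impossibly far from the market, which is exactly what trips the alarm.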

Humans make mistakes and humans tolerate mistakes. Thus, the tools we use–no matter how well conceived–can produce flawed results, and we will need to monitor processes and make corrections. Humans are both the problem and the solution.

“Star Trek” fans will remember the intrepid Captain James T. Kirk making liberal use of his incredibly sophisticated shipboard computer in order to make command decisions. Yet, when push came to shove, Kirk often relied on his human ingenuity–much to the annoyed chagrin of the logical Mr. Spock–in making a final choice.

There always will be fat fingers and fat heads. So it must be as long as we humans are in charge, and thankfully–at least for the moment–that remains the case.

