Sundeep Kumar, Senior Vice President, Institutional and Commercial Risk Technology, Citigroup, and Brian Cassin, Managing Director, S&P Capital IQ, discussed changing data needs and trends in commercial banking.
BRIAN CASSIN: This space is so dynamic these days and has changed quite a bit. How do you think the market will want data for different applications in the near future? Clients are looking at the same content but are considering how it can be used across an enterprise and in different applications. So how do you determine what is the right data for the right speed?
SUNDEEP KUMAR: There are different types of data. There is real-time data, as well as data pertaining to the monthly, quarterly and historical reports that regularly come in. There are also ad hoc requests for reference-data reports. So the same applications might be using different data at different frequencies. Some applications simply display data, such as data on MarketWatch, and some applications republish the information they take in, where businesses are trying to use it in a market-making process.
So companies are trying to figure out whether there is a single right frequency at which they want to get data, or whether their applications require data at a number of different frequencies. Do you deliver everything at the highest frequency, or do you split it into different classes of service? Do you do it on a subscription basis or ad hoc, whenever your clientele requires it? Those are some of the most difficult decisions that people have to make.
It’s a conundrum for experts like yourself. Which way do you think the market is leaning? Do you think they’re leaning toward taking it at the lowest latency and trying to do it themselves, or are they relying more on the suppliers these days?
People have been creating their own data warehouses, which gives them the advantage of addressing and using data any way they want. A large amount of data is now being handled through big data technologies. That requires you to get data in as quickly as possible and then disseminate it through a middleware publishing layer that you build yourself. That seems to be the trend: focusing on big data, taking all of that data in and then producing your own data cubes from it.
“People have been creating their own data warehouses, which gives them the advantage of addressing and using data any way they want.”
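To make the data-cube idea concrete, here is a minimal sketch of rolling raw ticks up into one-minute open/high/low/close cells, the kind of aggregate an in-house warehouse might publish downstream. The tick layout and sample values are illustrative assumptions, not any particular firm’s schema.

```python
# A minimal sketch: aggregate raw ticks into one-minute OHLC "cube" cells
# keyed by (symbol, minute bucket). Sample data is invented for illustration.
ticks = [
    # (symbol, epoch_seconds, price)
    ("IBM", 1_700_000_005, 189.50),
    ("IBM", 1_700_000_017, 189.62),
    ("IBM", 1_700_000_041, 189.48),
    ("IBM", 1_700_000_066, 189.71),
]

cube = {}  # (symbol, minute_bucket) -> [open, high, low, close]
for symbol, ts, price in ticks:
    key = (symbol, ts // 60)
    if key not in cube:
        cube[key] = [price, price, price, price]  # first tick seeds the bar
    else:
        bar = cube[key]
        bar[1] = max(bar[1], price)  # high
        bar[2] = min(bar[2], price)  # low
        bar[3] = price               # close is the latest tick

for (symbol, minute), (o, h, low, c) in sorted(cube.items()):
    print(symbol, minute, f"O={o} H={h} L={low} C={c}")
```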
That leads me to another question that is along the same lines but more about content. We think a lot about which content and technologies are changing the way businesses see opportunities or imagine applications. Can you give us your take on that?
A few months ago, there was a case in which the market suddenly gapped. Someone mentioned on Twitter that something had happened at the White House, and the whole market went down as a result. Experts speculate that this happened because trading houses are using the Twitter feed to drive their trading decisions directly. That’s very interesting. Early on, we were only disseminating market prices, and people still look at that. But we are also looking at many other things, including Twitter data, Facebook data and other browsing data.
There are many applications coming up that utilize all of these data elements. People are building expert systems into them, with complex evaluation layers that give them a richer picture. Of course, this comes with problems. Currently, very few of the people who use a Twitter feed in their trading applications have a system that sifts the noise out from the valuable data. However, people are beginning to develop different types of systems to filter the news and solve those problems.
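As one illustration of what such a noise filter might look like, here is a minimal sketch that scores incoming tweets against keyword lists and a whitelist of trusted sources before anything reaches a trading signal. The sources, keywords, weights, and threshold are all hypothetical values chosen for the example.

```python
# A minimal sketch, not any firm's production filter: score tweets before
# they reach a trading signal. All lists and weights here are assumptions.
TRUSTED_SOURCES = {"AP", "Reuters", "Bloomberg"}
EVENT_TERMS = {"explosion": 3, "default": 2, "downgrade": 2, "halt": 2}
NOISE_TERMS = {"giveaway", "click", "follow"}

def score_tweet(source, text):
    words = text.lower().split()  # naive tokenization, fine for a sketch
    if any(w in NOISE_TERMS for w in words):
        return 0  # drop obvious spam outright
    score = sum(EVENT_TERMS.get(w, 0) for w in words)
    if source in TRUSTED_SOURCES:
        score *= 2  # weight verified news accounts more heavily
    return score

def is_actionable(source, text, threshold=4):
    return score_tweet(source, text) >= threshold

print(is_actionable("AP", "Explosion reported, trading halt expected"))  # True
```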
What are your thoughts about the current risk management systems out there, especially when it comes to data management and maintenance, data processes, architecture, workflow, etc.?
We can no longer create an application that will look and feel exactly the same for everyone; that is the trend we are seeing. So we have been working on systems that are extremely configurable and flexible, which allows us to implement local content but, at the same time, provide a framework in which a minimal amount of work is required to customize it. The market is moving away from hard-coded systems toward more data-driven systems, where you define the system through metadata. A large number of our systems use BPM orchestration engines, where XML-based configurations are used to drive the code.
In essence, the trend has been moving toward a more descriptive style of programming, and you see that in all modern languages and frameworks, including C#, WPF and even Flex, where you define things declaratively and the coding is done behind the scenes by a designer.
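As a rough illustration of this metadata-driven style, here is a minimal sketch in which a processing pipeline is declared in XML and generic code interprets it at runtime. The step names and XML schema are invented for the example and do not reflect any particular orchestration engine’s format.

```python
# A minimal sketch of XML-driven configuration: the pipeline is declared
# in data, and generic code dispatches to handlers at runtime.
# The schema and step names are hypothetical, for illustration only.
import xml.etree.ElementTree as ET

CONFIG = """
<pipeline name="eod-risk-feed">
    <step type="normalize" field="price"/>
    <step type="scale" field="price" factor="0.01"/>
</pipeline>
"""

def normalize(record, field):
    record[field] = float(record[field])  # coerce raw strings to numbers
    return record

def scale(record, field, factor):
    record[field] *= float(factor)        # e.g. convert cents to dollars
    return record

HANDLERS = {"normalize": normalize, "scale": scale}

def run_pipeline(xml_config, record):
    root = ET.fromstring(xml_config)
    for step in root.findall("step"):
        attrs = dict(step.attrib)
        handler = HANDLERS[attrs.pop("type")]
        record = handler(record, **attrs)  # remaining attributes are kwargs
    return record

print(run_pipeline(CONFIG, {"symbol": "IBM", "price": "18950"}))
# {'symbol': 'IBM', 'price': 189.5}
```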
“The days of massive infrastructures are over; companies don’t have the budgets to support them. Whether it’s through us as the distributor, through an exchange’s market data, or through third-party data, they’re looking for the ability to receive simple APIs.”
It sounds like you’re tapping into a newer type of professional to help you build those systems internally.
Earlier in my career, coding required people to understand the entire lifecycle of the application and define how the high-frequency trading system and its functionality worked. But the trend now seems to be moving more toward creating pieces individually and then fitting them together; people don’t need to understand the complete end-to-end picture anymore. A very large number of people are developing very small components, and then there are those who configure them and create very rich applications with them.
We find the same issue in our industry when it comes to exchange-traded data. At the WPIC conference last week, people were calling for better standardization of content because everyone is on a different protocol. But we’re finding that you can have certain people write specific pieces of code and then have other people pull it all together. It’s interesting that we’re seeing it across multiple, different segments. Is there anything else you want to speak about?
Another trend we’re seeing now is that clients are looking for data at varying speeds. They’re especially looking for something you touched on with your solution: is it flexible? The days of massive infrastructures are over; companies don’t have the budgets to support them. Whether it’s through us as the distributor, through an exchange’s market data, or through third-party data, they’re looking for the ability to receive simple APIs, whether in C++ or C#, that they can code into their applications. And, in those applications, they mostly want to see normalized data, although some people want raw data, and we connect them that way. But for the most part, the trend is toward normalized data delivered as fast as we can. People want us to handle most of the hard parts and deliver a simple, lightweight API that they can ingest into their systems and use in multiple different facets downstream, whether it’s ultra-low-latency trading or black-box systematic trading. We’ve done that for a very long time.
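To give a feel for what such a lightweight, normalized API might look like from the consuming side, here is a minimal sketch. The Tick schema and the subscribe call are hypothetical stand-ins for a vendor SDK, not any real product’s interface.

```python
# A minimal sketch of a lightweight, normalized feed API as seen by the
# consumer. "Tick" and "subscribe" are hypothetical stand-ins; real vendor
# SDKs differ, but the shape (subscribe, then consume normalized ticks
# with standard fields) is what is being illustrated.
from dataclasses import dataclass
from typing import Iterator

@dataclass
class Tick:
    symbol: str   # standardized symbology, already normalized upstream
    bid: float
    ask: float
    ts_ns: int    # exchange timestamp in nanoseconds

def subscribe(symbols: list[str]) -> Iterator[Tick]:
    # Stand-in for a vendor SDK call; here we just replay canned data.
    canned = [Tick("IBM", 189.48, 189.52, 1), Tick("MSFT", 410.10, 410.14, 2)]
    for tick in canned:
        if tick.symbol in symbols:
            yield tick

# Downstream risk, OMS, and back-office code all consume the same shape.
for tick in subscribe(["IBM", "MSFT"]):
    mid = (tick.bid + tick.ask) / 2
    print(f"{tick.symbol} mid={mid:.2f}")
```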
Now we’re starting to talk to a lot of firms about helping them power their risk solutions in the middle office. They don’t need ultra-low latency in a risk solution, but they do need to see price movements that they can use in their calculations when they’re shocking their portfolios and doing other analyses. A lot of those firms also need one API to power the back office and reconcile their books at the end of the day. And, of course, everyone has a home-grown OMS, EMS and website.
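As a toy version of the middle-office calculation described above, here is a minimal sketch that applies a uniform price shock to a small book and measures the profit-and-loss impact. The positions, prices, and shock size are invented for the example.

```python
# A minimal sketch of the middle-office use case: apply a uniform price
# shock to a book and measure the P&L impact. All numbers are illustrative.
positions = {"IBM": 10_000, "MSFT": -5_000}    # shares, signed
prices = {"IBM": 189.50, "MSFT": 410.12}       # latest normalized prices

def shock_pnl(positions, prices, shock=-0.05):
    """P&L if every price moves by `shock` (e.g. -5%)."""
    return sum(qty * prices[sym] * shock for sym, qty in positions.items())

print(f"P&L under a -5% shock: {shock_pnl(positions, prices):,.2f}")
```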
So the overall trend is oriented toward implementing a normalized, simple API with standardized symbology across it that companies can tie into multiple applications to consume data at the speed that they need it. That’s what I’m seeing.
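One way to picture the standardized-symbology piece is a mapping layer that resolves each vendor’s local identifiers to a single canonical symbol, so every downstream application keys its data the same way. The vendors and identifiers below are hypothetical examples.

```python
# A minimal sketch of standardized symbology: map each vendor's local
# identifiers onto one canonical symbol. The mappings are hypothetical.
SYMBOLOGY = {
    ("vendor_a", "IBM.N"):  "IBM",
    ("vendor_b", "IBM US"): "IBM",
    ("vendor_a", "MSFT.O"): "MSFT",
}

def canonical_symbol(vendor: str, local_id: str) -> str:
    try:
        return SYMBOLOGY[(vendor, local_id)]
    except KeyError:
        raise KeyError(f"no canonical mapping for {vendor}:{local_id}")

print(canonical_symbol("vendor_b", "IBM US"))  # -> IBM
```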
Sundeep is a seasoned technologist and a strategic, delivery-focused development manager with extensive experience in financial risk management, pricing, analytics, bond valuations, credit products risk management (CDOs, credit default swaps, synthetic CDOs, etc.), credit underwriting, commercial bank operations (credit origination, credit analysis and approval, early warning, and remedial), Basel II and Basel III reporting, exotics, and OTC products. He is also experienced in designing, developing, and implementing trading systems for financial products and services spanning equities, equity derivatives, futures, fixed income, algorithmic and basket trading, and FX. Sundeep has in-depth business knowledge of market protocols, trading rules, credit processes, regulatory requirements, Monte Carlo simulation, Mathcad, and the pricing and analytics of financial products. He has an M.B.A. in finance and international business from NYU Stern School of Business and is a C.F.A. charterholder.
Sundeep has expertise in designing low-latency, high-frequency, and algorithmic trading systems, having developed techniques for multithreaded programming that minimize locking, context switching, and overhead within the systems, along with optimized algorithms. He has a passion for learning new technologies and techniques and great resolve in making sure the job is done efficiently and on time.
Sundeep has strong experience handling multiple complex global projects and large teams (300+ resources) spread across North America, EMEA/CEEMEA, APAC, and LATAM. He is responsible for financial governance, resource management, operational risk, and an ongoing strategic partnership with the senior leadership team, providing relevant information so that the business and leadership team can make decisions aligned with the objectives of the broader organization. He is also an experienced lead in computer forensics and troubleshooting.
Brian Cassin is a managing director in the product and content organization of S&P Capital IQ. He is also global head of Real-Time Solutions, responsible for growing parent company McGraw-Hill Financial’s market share and revenue from real-time offerings, data, and delivery. Brian is also responsible for the strategic direction of integration of these state-of-the-art capabilities into the larger S&P Capital IQ product suite.
Prior to his present role, Brian led the S&P Capital IQ Wealth Management business, where he was tasked with driving the strategic direction of new content and product initiatives while maintaining and enhancing existing product lines. During his tenure, Brian’s team developed two key initiatives, to be formally launched in 2013, significantly expanding the company’s reach into the wealth management client workflow.
Brian joined S&P Capital IQ in 2006 as a vice president in client development. In this role, his primary focus was leading the team responsible for building, selling, and supporting market awareness tools—most notably, the Real-Time desktop application, which Brian continues to oversee in his current role.
Before 2006, Brian worked at Thomson Financial, serving in multiple leadership roles including sales, support, and client education. Most of Brian’s tenure there was also focused on real-time businesses and delivery for a core clientele of wealth managers, institutional traders, and investment managers. He played a major role in the integration of these capabilities into the Thomson ONE platform. Additionally, he oversaw many large initiatives, including the training and rollout of Merrill Lynch’s Wealth Management Workstation.
Prior to that, Brian served as an economic consultant at Haver Analytics.
Brian graduated from Fordham University with a B.A. in economics and holds an M.B.A. in finance and information systems from Fordham University’s Graduate School of Business.