Minimizing Cost-of-Ownership and Maximizing Returns

Good, Cheap, Fast – Pick Any Two

The build-versus-buy software debate is an enduring part of technology development. It usually comes down to three variables: customization, cost, and time. As the old adage goes, you can pick any two of these at the expense of the third.

Beacon Platform, offering a set of quant tech infrastructure and applications powered by next-generation data science technologies, is the financial industry's answer to this trade-off. Compared with in-house development of an analytics solution, Beacon's powerful mix of functionality and customization can reduce direct lifecycle costs by a factor of 2.5 over five years.

Financial institutions have often opted to build their own applications and tools, considering the advantages of proprietary algorithms and customization to be worth far more than the increased costs or extended development times. However, the increasing maturity of cloud computing, web development frameworks, and shared source code is moving the needle significantly toward buying.

Quant Tech platforms with robust data science foundations have emerged as a viable alternative to in-house FinTech development. Companies can now buy and deploy far more functionality, faster and at a lower cost, focusing their resources on areas of true competitive advantage. Making these decisions means comparing and evaluating objectives, available technologies, and expected results. Or, in simpler terms: what do you want, what do you need, and what do you get?

What do you want?

In an uncertain market environment, financial firms are looking to operate leaner and react faster, so that they can navigate regulatory constraints, optimize assets, and improve returns.

Staying competitive in financial services, where technology is concerned, often comes down to speed and scale. Who can spot the trend change, unrealized opportunity, or developing risk early enough, and respond at scale? These are often the core arguments for in-house development. But breaking the question down further provides greater clarity on what to build and what to buy.

Responding to changes requires a high degree of flexibility and agility. Developing proprietary models and analytics is a core component of this, but it is difficult to achieve if the development team spends too much of its time on core infrastructure and basic functionality.

Next is the ability to rapidly scale models, computational capacity, and data capabilities up and down with market needs. In-house solutions to this problem are typically under-resourced for worst-case scenarios or over-resourced most of the time.

Finally, as we have seen all too often in the past, there is the need to future-proof investments against the inevitable and unrelenting progress of technology. In financial parlance, building everything in-house means assuming all of the risk, instead of distributing and hedging that risk among multiple partners and stakeholders.

What do you need?

Beacon’s offering is an enterprise-grade data science and cloud computing foundation, combined with quantitative analytics and modeling tools, for rapid development, testing, and deployment of applications.

If you are not going to build everything in-house, what are the essential components? Start with the parts that you absolutely must build yourself: the proprietary models and custom analytics that differentiate your business from the competition. For these you need an advanced development environment and toolkits that enable rapid experimentation and deployment.

Identifying new opportunities usually involves a mix of quantitative analysis and data visualization, which humans find faster and easier to process than raw numbers. So you need base functionality that includes graphical and interactive time series analysis, what-if scenarios, and strategy testing, among other tools. Redeveloping and maintaining these commodity tools in-house creates a large and unnecessary overhead for any company.
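
As a concrete, if simplified, illustration of this kind of base functionality, the sketch below uses pandas to build a rolling-volatility view and test a toy moving-average strategy. The data is synthetic and the strategy is purely illustrative; it stands in for the richer, interactive versions of these tools that a platform provides out of the box.

```python
# Minimal sketch of the kind of commodity time series tooling described
# above: a rolling-volatility view plus a toy moving-average strategy
# test. All data here is synthetic and the strategy is illustrative only.
import numpy as np
import pandas as pd

rng = np.random.default_rng(seed=7)
dates = pd.date_range("2022-01-03", periods=500, freq="B")
returns = pd.Series(rng.normal(0.0003, 0.01, len(dates)), index=dates)
prices = 100 * (1 + returns).cumprod()

# Building block for interactive analysis: 21-day annualized volatility.
rolling_vol = returns.rolling(21).std() * np.sqrt(252)

# What-if / strategy test: hold the asset only while yesterday's 20-day
# moving average sits above the 100-day moving average.
fast = prices.rolling(20).mean().shift(1)
slow = prices.rolling(100).mean().shift(1)
strategy_returns = returns.where(fast > slow, 0.0)

print(f"Buy-and-hold return: {(1 + returns).prod() - 1:+.2%}")
print(f"MA-crossover return: {(1 + strategy_returns).prod() - 1:+.2%}")
print(f"Latest 21-day volatility: {rolling_vol.iloc[-1]:.2%}")
```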

Handling big data and rapidly scaling massive, computationally intensive models is a problem that has been resoundingly solved by cloud computing services. Native cloud applications and infrastructure management solutions provide secure, elastic compute and storage capabilities on demand, without the large capital costs and lengthy deployment times associated with on-premise servers. Tasks can launch hundreds or thousands of cores as needed and release them when finished, delivering results from complex calculations in minutes instead of hours or days.
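
Beacon's actual job-submission interface is not shown here, but the launch-compute-release pattern itself is easy to sketch. The example below fans a stand-in pricing function out over local worker processes using Python's concurrent.futures; a cloud batch service applies the same pattern across hundreds or thousands of cores.

```python
# Generic sketch of the launch/compute/release pattern behind elastic
# compute. Locally this fans out over processes; a cloud batch service
# generalizes the same idea to thousands of cores. revalue() is a
# stand-in for a real, computationally intensive pricing model.
from concurrent.futures import ProcessPoolExecutor
import math

def revalue(scenario_id: int) -> float:
    """Stand-in for an expensive model run on one scenario."""
    total = 0.0
    for i in range(1, 200_000):
        total += math.sin(scenario_id + i) / i
    return total

if __name__ == "__main__":
    scenarios = range(1_000)
    # Workers are acquired for the duration of the job and released
    # when the context manager exits -- you pay only for what you use.
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(revalue, scenarios, chunksize=50))
    print(f"Revalued {len(results)} scenarios")
```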

Finally, whether you build or buy, you need to protect what you have from technological obsolescence. Open architectures, modular design, and access to the source code of what you bought are essential to ensuring that what you buy and what you build can continue to grow and evolve. Appropriate levels of abstraction separate your core work from complexities such as how to draw the user interface or where to run the task. As the underlying technology changes, your work remains the same and only the abstraction layer needs to adapt.
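
A minimal sketch of that abstraction idea, with hypothetical names: the core analytics below depend only on a small ComputeBackend interface, so changing where the work runs means writing a new backend, not rewriting the analytics.

```python
# Sketch of an abstraction layer between core work and execution
# details. ComputeBackend and LocalBackend are hypothetical names
# invented for this illustration.
from abc import ABC, abstractmethod
from typing import Callable, Iterable

class ComputeBackend(ABC):
    """The only surface the core analytics code depends on."""

    @abstractmethod
    def map(self, fn: Callable, items: Iterable) -> list:
        ...

class LocalBackend(ComputeBackend):
    """Runs tasks in-process; a cloud backend would subclass the same API."""

    def map(self, fn: Callable, items: Iterable) -> list:
        return [fn(item) for item in items]

def run_analysis(backend: ComputeBackend, notionals: list) -> float:
    """Core work: written once, independent of where it executes."""
    accrued = backend.map(lambda n: n * 1.02, notionals)
    return sum(accrued)

print(run_analysis(LocalBackend(), [100.0, 250.0, 400.0]))
```

Swapping LocalBackend for a cloud-based implementation leaves run_analysis untouched, which is exactly the future-proofing that an open, modular architecture is meant to buy.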


What do you get?

Increased digitization enables better scale and leverage of the firm’s data and analytical infrastructure, improving lending, trading, and investment decisions.

OK, so what are the benefits of not building everything yourself? Perhaps the most important benefit of using modern quantitative tools is reducing the reliance on slow, opaque, and fragile spreadsheets. Moving large data sets and complex calculations out of spreadsheets and into a development environment brings many benefits, not the least of which are mature version control and the ability to quickly back out changes.
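
As a toy illustration of that move (the formula here is generic, not a Beacon model), compare a typical spreadsheet cell with its equivalent as a plain, testable function:

```python
# A calculation that often lives in a spreadsheet cell, rewritten as a
# plain function. As code it can sit in version control, be reviewed,
# unit-tested, and rolled back in a single commit.
def weighted_average_cost(quantities: list, prices: list) -> float:
    """Equivalent of =SUMPRODUCT(qty, px) / SUM(qty) in a spreadsheet."""
    if len(quantities) != len(prices):
        raise ValueError("quantities and prices must align")
    total_qty = sum(quantities)
    if total_qty == 0:
        raise ValueError("total quantity is zero")
    return sum(q * p for q, p in zip(quantities, prices)) / total_qty

# A unit test makes the behavior explicit and regression-proof.
assert weighted_average_cost([10, 30], [100.0, 102.0]) == 101.5
```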


Second, if you use the same platform throughout your quantitative teams, you reduce or eliminate silos of data and algorithms. Teams find it easier to share and reuse code, improving internal consistency and collaboration. Your business becomes more modular, enhancing agility. And you get better visibility and governance of models and algorithms, reducing the black-box effect common to many legacy implementations.

Third is the advantage of a modern and open technology foundation. Integrated development environments that boost productivity, application frameworks that abstract away the technical details of web browsers, and open architectures that readily interface with internal and external systems and data all make it faster and easier to add value and capture a competitive advantage. In other words, they increase the return on investment of your research and development activities by substantially reducing the time and money spent on infrastructure.

Finally, what all of this enables is the next stage in the digital transformation of the financial sector. Fully digitized, modular processes make it faster and easier to respond and adapt to market changes. By building what you do best on top of a broad Quant Tech platform of infrastructure, front-office applications, and development tools, you can finally optimize all three dimensions of customization, cost, and time, minimizing total cost of technology ownership and maximizing return on investment.

For more information on how Beacon Platform is creating the future of financial markets on the cloud, visit us at beacon.io.