With inflation climbing and affecting most (if not all) industries, budgets around the world are tighter than ever. However, companies that fail to plan for innovation may be planning to fail. Cyber threats continue to advance, and new technologies present exciting and lucrative opportunities that can’t be ignored. 

Traditionally, most companies have approached IT and tech budgets the same way: addressing immediate needs and making predictions based on past experience. This can lead to large, unpredictable expenses down the road, along with missed opportunities and lost ground to more innovative competitors. Instead, it’s important to take a risk-based approach to technology and cybersecurity that adapts to your needs and an ever-evolving threat landscape. 

How to Budget Correctly 

Every tech strategy should incorporate an optimal blend of technology investment and process. Tools and automation are just two of the factors to consider; you will also need to account for security training, risk assessments, auditing, policy development, mobile device security, disaster recovery, and patch management and maintenance, among other things. 

The list may seem overwhelming, but the sheer volume of cyberattacks has only emphasized how critically important security has become. Healthcare is a good example. In the first half of 2022, the healthcare sector suffered more than 337 breaches, exposing 19 million records in just six months. The financial sector reported $590 million in losses due to data breaches in the first six months of 2021, up from $416 million during all of 2020.  

Fortunately, there are consulting firms and vendors that can reduce complexity and assist companies with meeting their cybersecurity cost demands in an efficient way. 

It’s equally important to plan for technological advancement and its impact on your market. Going back to the healthcare example, we know that new medical tech is responsible for 40-50% of annual cost increases in hospitals. Often, these technologies are in their infancy and change so rapidly that it’s hard to determine whether they will eventually achieve economies of scale and reduce costs. 

Before adopting a new innovation, it’s important to conduct a thorough return on investment (ROI) and cost analysis. Again, it’s a good idea to work with a specialist consultant who understands the cost of implementation and management, along with other factors that your accounting department may not be aware of. 

Technology can consume time and money at an alarming rate within any organization, considering programming, bug fixes, project scoping and development, training, and support. ROI is put at risk with every change request and adaptation. 

You could justify the return based on income levels and capture projects through the budget process (much like hospitals justify returns when hiring a new specialist neurosurgeon), but it’s important to evaluate the way new technology affects overall costs and resources, especially in large organizations like hospital systems. 

Calculating Your ROI

The simplest formula for calculating return on investment is ROI = net gain/cost. Defining costs when it comes to IT and cybersecurity is much harder than the formula suggests. Costs may include staff time, programmer time, equipment, and disruption to business operations, among others. It’s not unusual for costs to be intangible and hard to assign, like unplanned outages or ongoing support. Multiple variables attach to each activity, making it difficult to tie a firm number to the final figure. 
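As a minimal sketch of the arithmetic (every figure and cost category below is hypothetical, chosen only to illustrate the formula), the calculation becomes straightforward once tangible and estimated intangible costs are tallied:

```python
def roi(net_gain: float, cost: float) -> float:
    """ROI = net gain / cost, expressed as a ratio."""
    if cost <= 0:
        raise ValueError("cost must be positive")
    return net_gain / cost

# Hypothetical project figures (illustrative only)
tangible_costs = 120_000   # licenses, hardware, consultant fees
staff_time = 35_000        # internal hours at loaded rates
est_disruption = 15_000    # estimated outage / support overhead
total_cost = tangible_costs + staff_time + est_disruption

gross_benefit = 230_000    # projected savings plus new revenue
net_gain = gross_benefit - total_cost

print(f"ROI: {roi(net_gain, total_cost):.0%}")  # prints "ROI: 35%"
```

The hard part, as noted above, is not the division but deciding what belongs in `total_cost` and `gross_benefit` in the first place.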

The benefits of the project are equally hard to determine. Many investments have clear and direct benefits, including increased output or financial savings. However, the benefits of tech investment may be improved customer experiences or staff time freed up for other projects. Medical equipment may open up new revenue streams, while fintech innovations represent a distinct competitive advantage (or future-proofing) for the organization. Telemedicine and digitization, for example, enable more comprehensive patient assessments; fewer trips to the doctor’s office and faster diagnoses ultimately drive down healthcare costs, even though the initial investment is steep. 

The same could be said for cybersecurity. Many organizations view cybersecurity as a cost center, a necessary evil required for compliance. This mindset doesn’t reflect innovative thinking about cybersecurity and doesn’t promote a culture of security by design and by default. It’s important to view and treat cybersecurity as an investment in growth and resilience. 

Sufficient early investment could avoid far costlier consequences in the long term, especially for high-risk industries. As one study found, healthcare accounted for only 17% of total cyber claims but 28% of breach costs. An average breach in a hospital costs $2.1 million, and the US healthcare industry loses more than $6 billion every year. Ransomware attacks, for example, cost more than the ransom itself: they also damage public relations and operations, as companies can remain offline for weeks following an attack. 

It’s worth noting that hospitals are frequent targets of cyberattacks because many medical devices run on older technologies, leaving them vulnerable to exploitation. Ironically, by prioritizing operations and efficiency over cybersecurity in their budgets, these hospitals put operations and efficiency at risk. 

How Much Should You Spend on Technology and Security? 

The common recommendation is to spend between 7% and 10% of your IT budget on cybersecurity, but the right figure depends on a number of factors. You could spend 20% of your annual IT budget on cybersecurity without achieving the level of assurance you require, or spend 5% and achieve acceptable risk levels. 

According to an IDG Communications survey, the factors that determine the priority of security spending include best practices, compliance mandates, and responding to security incidents or mandates from the board of directors. 

To determine how much to spend on cybersecurity, companies should identify the point where additional expenditure yields only marginal returns in risk reduction. This is the point at which your organization can demonstrate defensible due diligence. 
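To make that stopping point concrete, here is a toy model. It is purely illustrative: the assumption that residual expected loss decays exponentially with spend, and every number in it, are inventions for the sketch, not a formal method. The idea is simply to keep adding spending tranches only while each tranche avoids more expected loss than it costs:

```python
import math

# Illustrative assumptions (not real figures or a formal model)
EXPECTED_LOSS = 2_100_000  # baseline expected breach loss, e.g. one hospital breach
DECAY = 1 / 150_000        # assumed risk-reduction effectiveness per dollar

def residual_loss(spend: float) -> float:
    """Expected loss remaining after a given security spend (toy model)."""
    return EXPECTED_LOSS * math.exp(-DECAY * spend)

step = 25_000  # size of each budget tranche
spend = 0.0
# Keep spending while the next tranche avoids more loss than it costs.
while residual_loss(spend) - residual_loss(spend + step) > step:
    spend += step

print(f"Defensible spend level under these assumptions: ~${spend:,.0f}")
```

Under these made-up parameters the loop stops around $400,000; the point is the shape of the reasoning (marginal benefit versus marginal cost), not the numbers.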

Determining how much you should spend on technology is equally variable. Apple invests 5% of its annual revenue in research and development, while Amazon invests 28% in R&D. According to PwC’s Comparison of Innovation Spending, most businesses spend 3-4% of their total revenue on innovation, but that figure depends on innovation maturity, industry, and region.

Instead of picking an abstract figure, align your innovation investment with your revenue targets and introduce measurable metrics that correlate your investment to ROI. Start by calculating your growth gap, the additional revenue needed beyond what is forecasted with your existing products. 
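As a quick sketch of that calculation (all figures, including the assumed revenue returned per innovation dollar, are hypothetical):

```python
# Hypothetical planning figures (illustrative only)
revenue_target = 12_000_000     # total revenue goal for next year
forecast_existing = 10_500_000  # forecast from existing products

# Growth gap: revenue needed beyond what existing products deliver
growth_gap = revenue_target - forecast_existing

# If past projects suggest each innovation dollar returns, say, $5 of
# new revenue (an assumed ratio), you can back into a budget:
assumed_return_per_dollar = 5.0
innovation_budget = growth_gap / assumed_return_per_dollar

print(f"Growth gap: ${growth_gap:,}")
print(f"Implied innovation budget: ${innovation_budget:,.0f}")
```

The assumed return ratio is the measurable metric to track over time: if realized revenue per innovation dollar drifts from the assumption, the budget should be revisited.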


Factoring in the cost of technology and security is difficult – and that assumes an organization has the luxury of deciding what to spend on its own. Many businesses face regulatory requirements, customer expectations, and other demands that dictate their IT and innovation spending. 

With so many unknowns to consider, it’s worth working with a trusted partner that understands the hidden costs (and benefits) of technology and security spending.