The Technology Paradox: How Technology Can Slow Business Down

Overview

Technology quite literally operates at the speed of light. It's flexible, powerful, and even cheap. But many businesses are finding that the more they leverage technology, the more rigid, impotent, and expensive it becomes to operate and maintain. When technology slows business down, it is through no inherent fault or shortcoming of its own. I call this the Technology Paradox, and contrary to popular belief, it is not usually due to poor implementation, lack of technical skill, or poor quality. To understand the origin of the Technology Paradox, consider the following example.

Palmetto Leaves and Sweetgrass

As you drive around Charleston, SC and the surrounding areas, you'll see Gullah men and women of all ages weaving sweetgrass baskets along the side of the road. Unlike traditional woven baskets, sweetgrass baskets are made by bundling thin, spaghetti-like blades of sweetgrass together with blades from the saw palmetto plant. These bundles are tied together at the ends and coiled like a snake to create an endless variety of beautiful, strong baskets.

The flexible nature of sweetgrass allows the weavers to make all sorts of other items, such as potholders, drink coasters, and even boxes. Bundling the sweetgrass gives these items strength and durability, but it also takes away the flexibility. A sweetgrass basket without handles cannot be easily turned into a sweetgrass basket with handles. A small sweetgrass basket cannot be easily transformed into a large one.

Technology is like sweetgrass. It's highly flexible and can be used to make just about anything. But once you make something with it, it becomes almost impossible to turn it into anything else. Many businesses think that if they take some technology and bundle it, share it, or just use it with other technology, that they'll end up with an entire technology infrastructure that's both powerful and flexible. Sadly, this is just not true. Countless businesses have ended up with giant, ugly collections of technology that are rigid, interlocked, and almost uselessly complex.

7 Bad Reasons Businesses Use Technology And How To Avoid Them

There are countless reasons businesses utilize technology. Seven of them are bad and lead to escalating costs, diminished returns of both time and money, and decreased effectiveness. Unfortunately, these seven reasons are among the most common. As you read through these reasons, take note of the ones that sound familiar.

Reason #1 — Vague or Unclear Goals

"What technology do we need?" is a refrain I've heard over and over again in organizations. The hidden assumption in the question is that technology is needed at all. My answer is always the same, "What technology do you need for what?" The answer is usually vague — "improving performance," or "lowering costs," or "becoming more competitive." No technology, no matter how sophisticated, can help you achieve ambiguous outcomes. Businesses who try to use technology to achieve vague goals are actually worse off than businesses who don't use technology to achieve similarly vague goals. Technology can mask a multitude of sins, and a business can proceed with vague or undefined goals for a long time before realizing its mistake.

Research In Motion (RIM), the maker of the once-dominant Blackberry smartphone, is a perfect example of a company that put technology at the forefront to make up for an utter lack of sound business goals. In 1999, RIM released the original Blackberry smartphone. The goal of the device was plain and simple: allow wireless, instant access to corporate e-mail, anywhere and at any time. While this goal was clear, coherent, and valuable, RIM apparently had no yardstick by which to measure when the goal had been met. RIM achieved this goal in short order and proceeded to focus on growth, expanding globally and striking contracts with telecom providers the world over. But for the next ten years, RIM didn't seem to pursue any new goals. Instead, it focused on improving its technology, releasing over 70 different smartphone models that were fundamentally the same.

RIM's market cap peaked in 2008, just one year after the release of Apple's iPhone, then began to decline. The iPhone quickly stole RIM's market share and took hold as the preferred smartphone, but this had nothing to do with the iPhone's technology. RIM had been perfecting its smartphone technology for over ten years, while Apple had only just begun. Major corporations all over the world were using Blackberries, and RIM had built strong relationships with many of them. What propelled Apple to the forefront at the speed of light was the achievement of a very specific goal that culminated in the creation of the iPhone: to create a single, portable device that would allow people to quickly and easily listen to music, take telephone calls, and access the entire Internet, anywhere and at any time. Just as RIM had set and achieved a very specific and valuable goal almost a decade prior, so Apple set and achieved its own specific and valuable goal and began growing wildly.

The story of Apple and RIM proves that the success or failure of any company — including a technology company — has almost nothing to do with technology. RIM's Blackberry devices could do all of the things the iPhone could do, but the only goal they were designed to achieve was wireless access to corporate e-mail. Consequently, Blackberries did that one thing well, and everything else poorly.

After more than a decade of goal-free living, RIM did try to salvage some of its market share by finally setting for the Blackberry what was essentially the same goal Apple had set for the iPhone. But even this approach was doomed from the start. You'll see why in the next section.

The Lesson: In order for technology to be valuable to any organization, it must be in support of specific, measurable, and valuable goals. Use technology if and only if technology can meet those goals in a way that delivers a substantial return on money and time.

Reason #2 — Keeping up with the Joneses

The problem with following industry trends or trying to imitate the success of others is that neither option will put you at the head of the pack. You'll either be on par with everyone else, or you'll be a "me too" competitor. Despite this fact, businesses often acquire technology simply because a competitor or colleague did, whether or not doing so brought tremendous success.

When RIM began to develop devices that bore an uncanny resemblance to the iPhone, it was simply playing catch-up. Those goals were not in any way derived from RIM's core values. Rather, they were a desperate attempt to salvage RIM's market share, which was going to Apple at the speed of light. RIM, having had the rug pulled out from under it, was bewildered. After all, from a technological standpoint, it should have been the one leading the smartphone revolution. RIM had the infrastructure, the technology, the experience, the established relationships, and a selection of smartphones so extensive that even the most finicky luddite would be pleased. And yet, not only did RIM not lead the smartphone revolution, it was utterly crushed by it. RIM started out well enough with a sound business goal, but the focus on goals was quickly replaced by an unhealthy focus on technology that drove the company for the next decade of its existence.

Many businesses are stumbling along in the dark in much the same way, looking for the light of technology to guide them to success and prosperity. But as RIM learned the hard way, that's not the way it works. The worst thing that can happen to a business that looks to technology as an oracle of guidance and wisdom is that it will actually latch onto a particular technology and run with it, as RIM did. Technology drives the business's activities from then on, and eventually the business winds up in a ditch.

Another, slightly more recent example of "keeping up with the Joneses" is "machine learning." Ten years ago, machine learning was called "Big Data," a catch-all buzzword for collecting and analyzing massive amounts of data — some of which may actually be relevant to business. The concept has gone by other monikers, such as "Business Intelligence" or "Business Analytics," but the idea is the same: by analyzing various metrics, businesses can make more intelligent decisions around product development, market penetration, operational efficiency, and so on. Businesses are spending billions on Big Data, hoping for a big payday. There are absolutely valuable use cases for Big Data, but there is also a lot of stupidity. I was interviewed for a CNBC.com article on Big Data, in which I pointed out that Orbitz, the online travel company, notoriously and to the chagrin of many privacy advocates, used Big Data to determine that customers who use Apple Mac computers spend on average $20 more a night than those who use Windows computers. But it was already well known that Mac users tend to spend more than Windows users in general. Traditional market research would have revealed the same thing much more quickly and at a much lower cost.

The Lesson: Chasing the latest industry trend in hopes of riding its coattails to success is a recipe for disaster, especially if that trend involves a substantial investment in technology. Don't copy another company's goals just because they proved successful for that company. Your goals must always be based on your own values and passions.

Reason #3 — Other People's Ideas (OPI)

In business, using other people's money is often a wise idea. But using Other People's Ideas (OPI) is usually not. Industry standards and "best practices" both fall under this category. I like to think of them as opposite ends of a spectrum. Industry standards are the "lowest common denominator" of any industry. For example, not commingling funds is "industry standard" in financial services. "Best practices," on the other hand, are what you would do if you had money to burn and didn't particularly care whether they were really appropriate. They are essentially one-size-fits-all propositions, which is why I put the term "best practices" in quotes. One-size-fits-all works for hats and socks, but not business. What is best for one organization may be detrimental to another.

IT is, unfortunately, the poster child for "best practices" run amok. Security, specifically information security, is the sacred cow of IT. Many insane and costly activities have taken place under the guise of "security best practices," which various technology vendors, technology standards bodies, and even private individuals have established. It doesn't matter where they come from. What matters is that most IT people are conditioned to follow them, often without regard for the negative impact on the business. Ironically, information security, which is intended to protect business and critical information systems and infrastructure, often impedes business operations by slowing them down.

Security and speed are always diametrically opposed. Take password requirements, for example. Most IT organizations and many websites follow the "best practice" of requiring passwords to contain a minimum number of characters and be "complex" — containing a combination of uppercase and lowercase letters, numbers, and symbols. This complexity, while supposedly more secure, leads to more people forgetting their passwords. When this happens to an employee trying to log into his computer, he may try multiple passwords. After a few tries, his account is "locked out" and he has to get in touch with IT, who must reset his password to something equally complex, which he must write down or remember, and then use to log in. Once he logs in, he has to change his password again, to something still equally complex, which he also must remember. Not only does this process consume the employee's time, it also consumes IT's time, which would be better used to achieve business goals rather than chasing the elusive and vague non-goal of "security."
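To see how much friction those rules create, here is a minimal sketch of the kind of complexity-and-lockout policy described above. The specific thresholds (eight characters, three failed attempts) and the function names are illustrative assumptions for this sketch, not any particular organization's actual standard:

```python
import re

# Illustrative policy values -- assumptions for this sketch,
# not any specific vendor's or organization's standard.
MIN_LENGTH = 8
MAX_FAILED_ATTEMPTS = 3

def is_complex_enough(password: str) -> bool:
    """The 'best practice' rules: minimum length plus upper/lower/digit/symbol."""
    return (
        len(password) >= MIN_LENGTH
        and re.search(r"[A-Z]", password) is not None
        and re.search(r"[a-z]", password) is not None
        and re.search(r"\d", password) is not None
        and re.search(r"[^A-Za-z0-9]", password) is not None
    )

failed_attempts = 0

def try_login(supplied: str, actual: str) -> str:
    """Simulate the lockout spiral: a few wrong guesses and IT gets a call."""
    global failed_attempts
    if supplied == actual:
        failed_attempts = 0
        return "logged in"
    failed_attempts += 1
    if failed_attempts >= MAX_FAILED_ATTEMPTS:
        return "locked out: call IT, reset, re-memorize, repeat"
    return "try again"
```

Every rule added to is_complex_enough makes a forgotten password more likely, and every forgotten password drives traffic into the lockout branch, which is the hidden time cost that never shows up in the security policy.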

"Security" is never a legitimate business goal. In fact, it's not a goal at all because it's vague, not measurable, and has no definable value. Activities that are couched in "security" are often intended to reduce very specific risks — namely, theft, misuse, or destruction of information stored in IT's systems. IT security measures are only one way to reduce this particular type of risk, but they are never ends by themselves. If you have a problem with employees stealing files, the solution is not to make it more difficult for everyone else to legitimately access those files. The solution is to fire and stop hiring crooked employees! Don't ever accept "security" as an excuse for impeding business, and don't try to use technology to make up for shortcomings in other areas.

The Lesson: Don't let the ideas, goals, and opinions of others guide your business' destiny. Following industry standards or "best practices" is not a legitimate business goal.

Reason #4 — Personal and Non-business Goals

Businesses, or rather individuals working in businesses, sometimes mandate inappropriate or excessive use of technology in order to advance a personal agenda. While not always nefarious, such use of technology is almost always damaging to the business's best interests. Personal agendas fall into three common groups:

1. Reducing Boredom and Increasing Fun

George Washington and James Madison warned against standing armies during peacetime. In a way, IT organizations can be like standing armies. IT people love solving problems, completing projects, and finding new ways to improve things. But if there is a lull, many IT folks can quickly become bored. What do you do when you get bored? You look for fun, of course. You may go golfing, boating, or fishing. Some IT folks, though, want to go upgrading, installing, and tweaking. They may upgrade some software to the latest version just to check out the new features, or they may swap out old networking equipment for new just because they like having the latest and greatest. While this is often harmless, there are two potential problems. First, this sort of fun costs money and almost never delivers a commensurate return-on-investment. Second, new technology often brings with it new problems, the majority of which can't be predicted. Since technology in most organizations is tightly interdependent, one "little" problem from a seemingly innocuous upgrade can cause a domino effect that negatively impacts critical business processes.

2. Career Advancement or Protection

IT is certainly not alone when it comes to basing technology decisions on personal goals. People both inside and outside of IT will order technology as a way of enhancing or protecting their own careers. For example, a seasoned executive may say yes to new technology so as not to appear "out of touch," even if that technology is conspicuously inappropriate. Often, such decisions to use technology are couched in a supposed benefit to the business. For example, an IT engineer may propose upgrading some critical systems under the guise of making them faster, when in fact his intent is to simply add another item to his resume. Another IT engineer who knows that the upgrade will do very little to improve performance may go along with the plan so he doesn't jeopardize his career by not being a "team player."

3. Quid Pro Quo

Quid pro quo is a Latin phrase meaning "this for that." Around 2011, RIM announced that it would give a free Blackberry Playbook tablet to any business that upgraded its Blackberry Enterprise Servers by the end of the year. At the time, RIM was in the battle of its life against Apple and Google for its share of the smartphone market, so it needed businesses to continue using its Blackberry devices. By offering a free gift, RIM convinced many businesses to expend resources upgrading their Blackberry Enterprise Servers, something they might not have done otherwise.

Quid pro quo can also take the form of protecting personal relationships. One company I worked with implemented what I'll call Vendor A's telephone call monitoring and recording solution to allow managers to evaluate employee performance when speaking on the phone with clients. Once the system was up and running, one of the top executives abruptly put the kibosh on the entire thing and declared that the company would use a different solution from a different vendor, Vendor B, with no further explanation. The company ended up using neither solution. I later learned that the executive had a relationship with someone at Vendor B. Talk about a scorched earth policy!

The Lesson: Don't assume that every push for new technology from within IT is due to a personal agenda, but be on guard. When you get wind of what you suspect is a personal agenda, always ask, "What is the specific, measurable business goal?" And don't accept vague reasons like "improvement" or "industry standard."

Reason #5 — Technology Is The Most Obvious Solution

Technology tends to make people forget the ways things used to be done. Most of us have become so accustomed to having GPS that we don't even think of using a map when taking a road trip. GPS is thought of as the "only way" to get from point A to point B.

The same sort of thing happens in business. Today there are countless methodologies for powering through one's "to-do" list. You're probably familiar with a few. Software programs have been built around many of them, and they range from dead simple to mind-numbingly complex. But nothing quite beats putting an item on your calendar and just doing it on the scheduled day.

Sometimes, the most obvious solution is obvious because it's constantly in front of us, not because it's the best. Technology is ubiquitous and constantly renewing itself, so we notice it more. The "old ways" of doing things are easy to overlook, though they may be better all around. To be clear, I'm no luddite. I've spent my entire career working with technology, but I'm still frequently frustrated by how much harder it can make even the simplest tasks. For example, if I want to place a take-out order at a restaurant, I may spend five minutes on the restaurant's app or website going through the ordering process when I could place my order over the phone in two minutes. But I still place most of my orders online because my habit is to use technology. It's familiar. It's comfortable. It's always there. But does it give me the best return-on-time? Not always.

The Lesson: When considering using a particular technology to achieve business goals, ask, "What did people do before this technology?" Later on, I cover the implications of this question and teach you how to decide, with 100% confidence, whether technology is appropriate in any given situation.

Reason #6 — Technology Is Wrongly Assumed to be the Better Investment

Businesses often think that a high-tech solution will provide greater bang for the buck than a low-tech or no-tech one. But much of the return-on-investment (ROI) and return-on-time (ROT) that technology promises is bogus because it's inflated. To understand why, you have to understand a concept I call hyper-leveraging. Hyper-leveraging is the common IT practice of squeezing every bit of usefulness out of one's existing technology in whatever way possible. The philosophy behind hyper-leveraging is, "We have it, so we use it." It's important to note that cost-savings is not necessarily a factor in the decision to hyper-leverage. Ease of management, perceived time-savings, increased control or security, or conservation of technology resources can also be deciding factors.
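Before the example, here is a back-of-the-envelope sketch of what comparing ROI and ROT can look like. All of the figures, and the ROT formula itself (defined here by simple analogy to ROI), are made-up assumptions for illustration:

```python
def roi(gain: float, cost: float) -> float:
    """Return-on-investment: net gain per dollar spent."""
    return (gain - cost) / cost

def rot(hours_saved: float, hours_invested: float) -> float:
    """Return-on-time, by analogy: net hours gained per hour invested."""
    return (hours_saved - hours_invested) / hours_invested

# Hypothetical numbers for a high-tech solution vs. a low-tech alternative.
high_tech = {"gain": 120_000, "cost": 100_000, "hours_saved": 400, "hours_invested": 600}
low_tech  = {"gain":  40_000, "cost":  10_000, "hours_saved": 200, "hours_invested":  50}

for name, opt in (("high-tech", high_tech), ("low-tech", low_tech)):
    print(f"{name}: ROI = {roi(opt['gain'], opt['cost']):+.0%}, "
          f"ROT = {rot(opt['hours_saved'], opt['hours_invested']):+.0%}")
# high-tech: ROI = +20%, ROT = -33%   (wins on paper money, loses time)
# low-tech: ROI = +300%, ROT = +300%  (wins on both counts)
```

The point of running both numbers is that an option can look attractive on ROI while quietly losing on ROT, which is exactly the kind of inflated promise this section warns about.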

A Real Example of Hyper-leveraging

About six months prior to the start of tax season, a payroll company purchased a new application to track tax notices and penalties for its clients. Let's call the application "Ticks" (not its real name). Ticks would have to be used by everyone in the Tax department, and due to the nature of government bureaucracy, it would have to be updated very frequently with new tax forms and the like. When the IT department began planning the deployment of Ticks, it had to decide whether to hyper-leverage its existing technology. It could either place Ticks on individual computers, or it could hyper-leverage by placing the application on a centralized set of servers that everyone was already using to access other applications.

Both options had advantages and disadvantages. Putting Ticks on individual computers would take longer initially and be more of a challenge for IT to manage, but application updates would be automatic, and if anything went wrong, the problem would be isolated to those individual computers.

Putting Ticks on centralized servers, on the other hand, would be quicker up front, but IT would have to update the application manually and frequently. Furthermore, IT couldn't update Ticks quickly if there were urgent updates — as there often are during tax season. The centralized servers were used by everyone in the organization — including Payroll Operations and Sales — so if anything went terribly wrong with a Ticks update, it could potentially impact existing clients as well as prospective ones. In the end, IT decided to hyper-leverage its technology by putting Ticks on centralized servers. Not surprisingly, whenever there was a problem with Ticks, the entire Tax department suffered a drop in productivity, and IT had to spend time and resources working with the vendor to resolve the problem. Had IT not hyper-leveraged, the problems with Ticks would have had less impact and been resolved more quickly.

Newer Doesn't Mean Faster

As the old saying goes, time is money. But the ROT of technology is rarely considered because newer technology is just assumed to be faster than old. Unfortunately, nothing could be further from the truth.

I remember years ago when Voice over IP (VoIP) phone systems were all the rage. Unlike traditional business phone systems, VoIP systems didn't require special wiring to connect the phones to the system. Instead, they simply used the existing computer network infrastructure. This not only made these new systems cheaper, it made them faster to install as well. Anyone could do it, and many did. But it also created a potential problem: if the computer network went down, the phones went down, too. Before, if IT did work that took the network down for a few seconds, most people's work was disrupted only briefly and with little impact. But after a VoIP system was in place, a network outage of the same duration would drop every telephone call in progress, likely irritating a lot of customers.

But that was only the tip of the iceberg. With the advent of VoIP systems came a new level of integration with even more technologies, like email. One company I worked with uses a VoIP system that allows users to receive and listen to their voicemail from their email inbox. When a voicemail comes through, the user gets a message in their email and simply double-clicks it. The system rings their desk telephone, and when the user answers, it plays the message. It's very convenient and saves users the trouble of manually dialing into their voicemail to check for new messages. All was well until the IT department tried to upgrade the phone system. The airtight integration of the phone system with email and each user's computer meant that IT also had to upgrade special software on every single user's computer! If it didn't, users would lose their voicemail-via-email functionality, so everything had to be upgraded at the same time. On top of that, upgrading the phone system caused all the users to lose their voicemails and their voicemail settings. Everyone in the organization had to reconfigure their voicemail greetings, and they had to do so quickly to keep clients from hearing an unprofessional, robot-voice greeting when leaving a message. As you might imagine, many users spent a good portion of their day calling IT for assistance.

The negative implications of such convergence of technologies cannot be overstated. No longer can the typical IT organization spend its days meeting and supporting business goals. It has to take valuable time out to deal with the implications of hyper-leveraged technology. The IT organizations in most businesses are no longer able to meet business goals quickly and effectively because they have hyper-leveraged themselves into paralysis.

Technology Problems Are Rarely About Technology

Now is a good time to point out that problems like those in the above examples are not simply a matter of choosing the wrong technology or implementing it incorrectly. It is axiomatic that hyper-leveraging any technology creates inflexibility somewhere, and working through that inflexibility takes time — usually much more time than expected. It is that lost time that businesses pay for in failed or delayed goals and lost productivity. Furthermore, hyper-leveraging means that the failure of one piece of the technology infrastructure can have a catastrophic impact on another, even if the two aren't otherwise related. When the impact is severe enough, businesses will gladly pay a pretty penny just to get back to an acceptable level of performance, reducing or eliminating any remaining ROI.

The Lesson: Always understand exactly how technology will be used to achieve your business goals prior to pulling the trigger. Calculate both the ROI and the ROT of using technology, and proceed only if it's better than low-tech or no-tech alternatives. Never assume that newer will be faster or better.

Reason #7 — Cost-Cutting

Ironically, when businesses cut technology budgets and try to lower technology costs, they often end up using technology more but less effectively. The focus of cost-cutting is always getting more out without putting more in. Not surprisingly, when businesses pressure IT to cut costs and "do more with less," IT ramps up its practice of hyper-leveraging, which does nothing but create more problems.

Cost-Cutting Means Customer-Cutting

A very large healthcare company I worked with provides hospitals with remote access to its proprietary healthcare applications. While the applications are proprietary, the remote access component is built on software from Citrix. As part of the company's agreement with Citrix, it must have a special license for each doctor or nurse connected to its system at any given time. These licenses aren't cheap. Furthermore, they're all tracked by a special licensing server to ensure that the company doesn't "cheat" and use more licenses than it actually purchased.

Now this is where it gets interesting. The company sets aside for each hospital its own set of servers to run the applications, but not its own set of licenses. All of the licenses are pooled together and shared among all the hospital customers. The reason? You guessed it — to keep costs down. One day, the company decided to upgrade the licensing server. The decision to upgrade was not haphazard — it was a prerequisite to bringing new customers on board using new technology the company was contractually obligated to provide, so there was a hard cost associated with delaying the upgrade. Furthermore, the upgrade was supposed to be seamless and transparent to customers. But it wasn't. During the upgrade, none of the customers could connect to access their applications. In a hospital, doctors and nurses being unable to view patient medical records can be a life-or-death problem. The company resolved the problem relatively quickly, and no lives were lost, but the hospitals' confidence in the company was not so quick to recover. Cutting costs had its own costs.
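A minimal model makes the coupling visible. The class and hospital names below are hypothetical stand-ins for illustration, not the company's actual Citrix setup:

```python
class LicenseServer:
    """One pool of licenses shared by every customer that depends on it."""

    def __init__(self, customers: list[str]):
        self.customers = customers
        self.online = True

    def upgrade(self) -> list[str]:
        """Taking the server down for an upgrade disconnects all dependents."""
        self.online = False              # upgrade window begins
        affected = list(self.customers)  # every dependent customer, all at once
        self.online = True               # upgrade window ends
        return affected

hospitals = ["Hospital A", "Hospital B", "Hospital C"]

# Shared pool: one upgrade window must suit every hospital simultaneously.
shared = LicenseServer(customers=hospitals)
print(shared.upgrade())  # ['Hospital A', 'Hospital B', 'Hospital C']

# Dedicated licenses: each upgrade touches exactly one customer, so
# windows can be scheduled one hospital at a time.
dedicated = [LicenseServer(customers=[h]) for h in hospitals]
for server in dedicated:
    print(server.upgrade())  # ['Hospital A'], then ['Hospital B'], ...
```

The shared pool is cheaper per license, but the blast radius of a single upgrade is the entire customer base. Dedicating licenses costs more up front and buys back the ability to schedule change one customer at a time, which is precisely the trade-off discussed below.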

Change Management: A Good But Irrelevant Idea

Scenarios like the one above are one of the reasons change management methodologies were invented. The purpose of change management is to ensure that a change to one system doesn't negatively impact another. This is usually done by scheduling a window of time during which some or all of the affected systems will be taken down and made unavailable to the business. When you consider this in the context of hyper-leveraging, it becomes clear that change management solves a problem that, in many instances, shouldn't exist in the first place. Recalling the last example, should the license server upgrade have been scheduled during a more opportune time? Of course. But that's beside the point. Hospitals are never closed. There is really no good time for a hospital to lose access to patient information, although most hospitals have procedures in place to deal with such a situation, as long as they know it's coming and have time to prepare. But how does one coordinate such an event among dozens of hospitals scattered all across the country in different time zones? The license server was shared among hospital customers to reduce costs, but this created inflexibility. The upgrade couldn't be scheduled for one customer at a time; it was an all-or-nothing deal. No wonder the company decided not to bother with change management! It would have been easier to herd cats.

The better decision would have been to spend the money to dedicate a set of licenses to each hospital customer. Not only would this have allowed upgrades to take place as needed and at mutually convenient times, it would have prevented the possibility of a catastrophic impact on all customers at once. So why wasn't this seemingly obvious decision made beforehand? There's really no way the people who decided to share the licenses could have foreseen such a disaster. But had they considered the underlying business goals — and cost-cutting is never a valid business goal — they would likely have made a better decision.

The Lesson: Cost-cutting is never a goal. Furthermore, cost-cutting has its own costs. Remember, there is a reason that doing things the right way usually costs more up-front than doing them the wrong way. Doing things the wrong way always costs more over the long term.