The Dangers of Legacy Thinking

Every successful company and organization inevitably must confront a powerful question:

Is what got us to where we are helping us move forward or holding us back? Your company or organization may be thriving, but is this record of success sustainable and can you keep going?

Maybe you’re noticing chinks in your armor or a drop-off in your sales. You’re thinking and acting as usual, but something is misfiring.

This is what I refer to as “legacy thinking.” If left unchecked, legacy thinking can pose enormous obstacles to your continued success—or worse.

Legacy Technology—Dangerous but Also Diverting      

Legacy thinking has a better-known cousin—legacy technology. The issue of legacy technology is old news—in more ways than one.

As you probably know, legacy technology refers to old forms of technology that are simply no longer optimal. This includes everything from software and operating systems to almost any technology that was once groundbreaking but is now well past its prime.

The issues reach beyond outdated technology itself. Trying to get by with legacy technology can be very expensive, from the cost of operating the systems themselves to paying people to make certain nothing goes wrong, an inevitability. For example, Delta Air Lines’ entire U.S. fleet was temporarily grounded because of computer problems, the carrier’s second shutdown in six months and one that also took down its website and mobile apps.

A more serious example occurred last year when the British bank Tesco shut down online banking after 40,000 accounts were compromised.

Those major headaches are not the whole story. Legacy technology is a problem in and of itself, but it can also create a dangerous comfort that leads to legacy thinking.

Legacy Thinking Defined

Like legacy technology, legacy thinking refers to thinking, strategies and other actions that are outdated and no longer serve you to the extent they once did. This can be problematic if legacy thinking accounted for much of the success you’ve been able to achieve.

Many organizations can point to business principles, strategies and other ways of thinking that underscored success. One example is agility—the ability to respond quickly to changing events and market conditions. Reacting as quickly as possible helped many organizations climb to the top of their industries. Being agile, both internally and externally, seemed like a bulletproof way to approach things.

However, we are now in a period of transformational change. Whether in products, services or the marketplace, change is not slowing down, which means legacy technology is becoming outdated faster as well.

The same is occurring with legacy thinking. As the rate of change increases, even the most agile of organizations will be hard-pressed to keep up—let alone leap ahead with new ideas and innovations—and agility will likely prove to be less effective.

Take that reasoning and apply it to other forms of thinking and strategies that may have served you well in the past. Are they moving you forward or holding you back? If they’re more a hindrance, that’s legacy thinking. 

Legacy Thinking—Changing Your Thinking Changes Your Results

The first thing to understand about legacy thinking is that it isn’t necessarily all bad. Overcoming legacy thinking doesn’t mandate erasing every strategy, idea or leadership concept you ever used in the past. Instead, identify those ideas and strategies that continue to serve you well while pinpointing others that may have worn out their value.

Agility in and of itself is not something to be completely discarded. There will always be fires and other immediate issues that warrant an agile response. However, it’s no longer the silver bullet it once was.

Consider other forms of legacy thinking. For instance, maybe you or some others in your organization are hesitant to embrace new technology critical to your future growth and success. I saw this firsthand when I worked with a major retail organization. Many key figures on the leadership team didn’t embrace the company’s commitment to technology and other elements of the future. Mobile apps, internet shopping and other innovations made the company’s future seem bleak.

To remedy the situation, management made lateral moves with some individuals so their attitude wouldn’t hinder the company’s vision, while others were tasked with identifying strategies, ideas and tools that would serve the company’s progress well. The result was twofold—not only did the company effectively separate elements of harmful legacy thinking from their workflow, but those once-hesitant executives saw firsthand how powerful those tools and ideas could be. They were walked into the future—and they liked what they saw.

The next time you’re considering the dangers of legacy technology, include the pitfalls of legacy thinking. Just as old software shut down an entire airline, legacy thinking can cripple your organization. Don’t forget that there’s always the opportunity for an upgrade in the way you think and act.

Learning to Master the Art of Your Career

It doesn’t matter what you do for a living — whether you work in medicine or retail, law or construction, software engineering or writing — there’s an art and science to every career. Each profession has its scientific aspects, those more mechanical facets, rules, and methods you must know to succeed. Yet no matter how dry, straightforward, or technical, these professions also have creative qualities that foster critical thinking.

This dichotomy is the reason no two professionals within the same industry are identical. These people may work within their careers for the same amount of time, possibly went to similar schools, or perhaps have the same position at the same company. However, they differentiate themselves in the ways they apply creativity and critical thinking to their jobs.

This idea impacts our personal lives as well. Consider medical professionals with the same specialty. If all dentists were the same by virtue of having identical skill sets and nothing more, you would have no preference for whom you go to for a root canal. But this isn’t the case; you prefer your dentist over one you have never been to due to their individual touch.

A real-world example occurred with one of my brothers, as some years back he struggled with pain in his legs. He visited three different orthopedic surgeons, all with identical skill sets and backgrounds. The doctors examined my brother. One suggested invasive surgery and the second proposed a more exploratory surgery. Both of these were unfavorable options. It wasn’t until we saw the third orthopedic surgeon that creative critical thinking took place. The doctor took one look at him and asked if he always wore his leather belt around his hips in the same place. When my brother answered in the affirmative, the doctor recommended he switch belts, replacing his leather one with a softer, more elastic material. With this change, his ailments were cured within a week.

All three doctors had the same impressive credentials and experience in the science behind their specialties; however, the third doctor utilized creative critical thinking to problem-solve.

Whether you’re training or in any level of schooling for a career, the “science” of that field is where the education lies. You’re receiving a hard, factual, standardized education, based on data and a proven methodology. Likewise, whether it’s accounting or food service, you’re also being schooled in the best practices of your industry.

Even in the creative fields, you still learn both the science and the art of your craft in order to find professional success. Writers must learn grammatical and syntactical convention, but they also have to learn how to write something people genuinely want to read. Musicians need to learn scales, notation, and instrumental technique, but they also need to learn how to touch the hearts and souls of listeners to achieve musical greatness.

So where does the “art” come into these fields?

Artistic aspects of a career are picked up through years of experience and through another, more flexible, less standardized type of “education,” one of induction. The first method, becoming more creative through personal and professional experience, is somewhat obvious: the longer you do something, the better you become at problem solving and thinking “outside the box.”

The second method, the nonstandard education of developing intuitive insights coupled with creativity, involves gleaning the best-kept secrets and most well-honed, time-honored methods, the knowledge and wisdom of your profession, from other professionals. These should be people who’ve already distinguished themselves through their own creativity. You might seek these people out, like a musician choosing to take lessons from one of his favorite players, or an entrepreneur asking the advice of someone who’s already established herself as a success in business. You might also stumble into these people during the course of your life, like having a captivating, inspirational professor or being trained by a capable manager who knows the secrets to making your job fun and interesting.

You can learn the science of your job from books, manuals, and classroom lessons and know that you will be good at what you do — but you need to learn the art from the artists of your field to become exceptional. This knowledge and wisdom transfer is key not only to success, but to a rewarding career as well. Not only does it provide professionals an essential balance of skills, it’s what keeps industries thriving and innovative. It’s what pushes us to compete with others by bettering ourselves and, in doing so, to push our very professions forward.

Pick up a copy of my latest bestselling book, The Anticipatory Organization, to help shape your future and accelerate your success.

The Risks of Sticking with Legacy Technology

Legacy technology is like that old pair of jeans you wore as a teenager. “They are comfortable” was always your answer to any inquiry.

Move that anecdote onto a larger stage and you have a fairly accurate picture of why many organizations hold on to legacy technology—tools that are long outdated: comfort.

In a world of exponential change, legacy technology is trouble. Continuing to use outdated technology of all sorts is costly in ways that go well beyond the financial.

Legacy Technology Defined

One dictionary definition describes legacy technology as “an old method, technology, computer system or application program” that is “of, relating to, or being a previous or outdated computer system.”

This particular definition frames legacy technology in a negative light, and there’s no getting around the fact that legacy technology is pervasive.

In more recent news, several organizations have experienced setbacks from legacy technology:

  • Last year, 503 data breach incidents compromised 15.1 million patient records.
  • In November 2016, British bank Tesco shut down online banking after 40,000 accounts were compromised, half of them by hackers for fraudulent purposes. Andrew Tschonev, technical specialist at security firm Darktrace, stated: “With attackers targeting everyone and anyone, today’s businesses cannot safely assume that it won’t happen to them.”
  • In July 2016, Southwest Airlines canceled 2,300 flights when a router failed, delaying hundreds of thousands of passengers. A similar failure grounded 451 Delta Air Lines flights a few weeks later.
  • In November 2015, Orly Airport in Paris was forced to ground planes for several hours when the airport’s weather data management system crashed. The system was running Windows 3.1.

Bad PR? Yes, but Much More Than That

Reputations are important, and high-profile incidents like these don’t create great headlines. But the reasons to move on from legacy technology stretch further:

Data breaches. As Tesco discovered, legacy technology is open to cybercrime. Vendor support is often nonexistent, which rules out valuable upgrades. Compounding the risk, improvements in modern security measures are rarely available for old systems.

Expensive functionality. Revamping outdated technology can be an expensive proposition, but running it raises operating costs as well. Old hardware lacks modern power-saving technology, and maintaining aging systems is expensive.

Compliance penalties. Depending on your industry, legacy technology may not be in compliance. In the medical industry, outdated software will fail to meet compliance standards, such as the Health Insurance Portability and Accountability Act (HIPAA), resulting in severe financial penalties.

Customer loss. No matter the industry, offering outdated solutions and ideas derived from equally outdated technology will prompt customers to look elsewhere for better answers.

Unreliability. Many organizations hold on to legacy systems in the belief that the systems still work. Even if they do today, consider what happens when something finally goes wrong, as the examples above demonstrate.

Perception issues. Leaders need to be aware of the message they’re sending to their employees. Consider how a younger employee who’s comfortable with technology might react to coping with the limitations of legacy technology. Aside from lost productivity, they may consider a new employer more willing to invest in current infrastructures.

“No” Can Be More Costly Than “Yes”

Replacing legacy technology is not entirely devoid of downsides, the most obvious being cost. Other deterrents include the risk that a replacement project fails, along with the time and cost of system testing and end-user retraining.

But the question remains: Are you and your organization comfortable with the old, or are you identifying the Hard Trends that are shaping the future and embracing the new? Are you anticipating the need to invest and upgrade before tragedy occurs? Every organization in the examples above surely wishes it could go back and pre-solve the problems of its outdated systems.

Before making any decisions, assess both Hard Trends and Soft Trends that affect your organization and industry. Consider the positive and negative impacts that replacing legacy systems may carry both internally and externally. Be certain that every element for the new system serves a well-defined business goal, now and in the future.

As I emphasize in my Anticipatory Organization Learning System, saying yes can be expensive, but saying no could be catastrophic.

Shaping the Future of A.I.

One of the biggest news subjects in the past few years has been artificial intelligence. We have read about how Google’s DeepMind beat the world’s best player at Go, which is thought of as the most complex game humans have created; witnessed how IBM’s Watson beat human champions on the quiz show Jeopardy!; and taken part in a wide-ranging discussion of how A.I. applications will replace most of today’s human jobs in the years ahead.

Way back in 1983, I identified A.I. as one of 20 exponential technologies that would increasingly drive economic growth for decades to come. Early rule-based A.I. applications were used by financial institutions for loan applications, but once the exponential growth of processing power reached an A.I. tipping point, and we all started using the Internet and social media, A.I. had enough power and data (the fuel of A.I.) to enable smartphones, chatbots, autonomous vehicles and far more.  

As I advise the leadership of many leading companies, governments and institutions around the world, I have found we all have different definitions of and understandings about A.I., machine learning and other related topics. If we don’t have common definitions for and understanding of what we are talking about, it’s likely we will create an increasing number of problems going forward. With that in mind, I will try to add some clarity to this complex subject.

Artificial intelligence applies to computing systems designed to perform tasks usually reserved for human intelligence using logic, if-then rules, decision trees and machine learning to recognize patterns from vast amounts of data, provide insights, predict outcomes and make complex decisions. A.I. can be applied to pattern recognition, object classification, language translation, data translation, logistical modeling and predictive modeling, to name a few. It’s important to understand that all A.I. relies on vast amounts of quality data and advanced analytics technology. The quality of the data used will determine the reliability of the A.I. output.  
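To make the “if-then rules” side of this concrete, here is a minimal sketch of the kind of early rule-based A.I. mentioned above in the context of loan applications. The rules, thresholds, and field names are purely illustrative assumptions, not taken from any real lender or system:

```python
# A toy rule-based "A.I.": explicit if-then rules screen a loan
# application. All thresholds and outcomes here are hypothetical.

def screen_application(income, debt, credit_score):
    """Return a decision using hand-written if-then rules."""
    if credit_score < 600:
        return "decline"            # rule 1: weak credit history
    if debt / income > 0.4:
        return "refer to officer"   # rule 2: high debt-to-income ratio
    if income >= 50_000 and credit_score >= 700:
        return "approve"            # rule 3: strong applicant
    return "refer to officer"       # default: needs human judgment

print(screen_application(income=80_000, debt=10_000, credit_score=720))
```

The limitation of this style is exactly what the paragraph above implies: every rule must be written by hand, and the system never improves on its own, which is where machine learning comes in.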

Machine learning is a subset of A.I. that utilizes advanced statistical techniques to enable computing systems to improve at tasks with experience over time. Chatbots like Amazon’s Alexa, Apple’s Siri, or any of the others from companies like Google and Microsoft all get better every year thanks to all of the use we give them and the machine learning that takes place in the background.
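The idea of “improving at tasks with experience” can be sketched in a few lines of plain Python, with no real framework involved. This toy model fits a single weight by gradient descent; each pass over the data (the “experience”) nudges the weight and shrinks the prediction error:

```python
# A minimal machine-learning sketch: a one-parameter model (y ≈ w * x)
# fit by gradient descent. The data and learning rate are illustrative.

data = [(1, 2), (2, 4), (3, 6), (4, 8)]   # inputs x with targets y = 2x

w = 0.0                                   # start with an untrained weight

def mean_squared_error(w):
    return sum((w * x - y) ** 2 for x, y in data) / len(data)

errors = []
for epoch in range(50):                   # "experience" = repeated passes
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    w -= 0.01 * grad                      # step against the gradient
    errors.append(mean_squared_error(w))

print(round(w, 2))                        # approaches the true slope 2.0
```

The chatbots named above work at vastly larger scale, but the underlying loop is the same: more data and more passes yield a steadily better model.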

Deep learning is a subset of machine learning that uses advanced algorithms to enable an A.I. system to train itself to perform tasks by exposing multi-layered neural networks to vast amounts of data, then using what has been learned to recognize new patterns contained in the data. Learning can be Human Supervised Learning, Unsupervised Learning and/or Reinforcement Learning like Google used with DeepMind to learn how to beat humans at the complex game Go. Reinforcement learning will drive some of the biggest breakthroughs.
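The “multi-layered” structure behind deep learning can be sketched as stacked layers of weighted sums followed by nonlinear activations. In this illustration the weights are arbitrary placeholders; a real network would learn them from vast amounts of data, as described above, and training is omitted to keep the sketch short:

```python
import math

# One forward pass through a tiny two-layer network. Weights and biases
# are hypothetical; deep learning is about learning them automatically.

def sigmoid(z):
    return 1 / (1 + math.exp(-z))

def layer(inputs, weights, biases):
    """One layer: a weighted sum per neuron, then a nonlinear activation."""
    return [sigmoid(sum(w * x for w, x in zip(ws, inputs)) + b)
            for ws, b in zip(weights, biases)]

x = [0.5, 0.8]                                             # input features
hidden = layer(x, [[0.2, -0.4], [0.7, 0.1]], [0.0, -0.2])  # layer 1
output = layer(hidden, [[1.5, -1.1]], [0.3])               # layer 2

print(output)  # a single score between 0 and 1
```

Stacking many such layers, and adjusting their weights from data, is what lets a system like DeepMind recognize patterns no human programmed in explicitly.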

Autonomous computing uses advanced A.I. tools such as deep learning to enable systems to be self-governing and capable of acting according to situational data without human command. A.I. autonomy includes perception, high-speed analytics, machine-to-machine communications and movement.  For example, autonomous vehicles use all of these in real time to successfully pilot a vehicle without a human driver.  

Augmented thinking: Over the next five years and beyond, A.I. will become increasingly embedded at the chip level into objects, processes, products and services, and humans will augment their personal problem-solving and decision-making abilities with the insights A.I. provides to get to a better answer faster.   

A.I. advances represent a Hard Trend that will happen and continue to unfold in the years ahead. The benefits of A.I. are too big to ignore and include:

  1. Increasing speed
  2. Increasing accuracy
  3. 24/7 functionality
  4. High economic benefit
  5. Ability to be applied to a large and growing number of tasks
  6. Ability to make invisible patterns and opportunities visible

Technology is not good or evil; it is how we as humans apply it that makes the difference. Since we can’t stop the increasing power of A.I., I want us to direct its future, putting it to the best possible use for humans. Yes, A.I., like all technology, will take the place of many current jobs. But A.I. will also create many jobs if we are willing to learn new things. There is an old saying: “You can’t teach an old dog new tricks.” With that said, it’s a good thing we aren’t dogs!

Start off the new year by anticipating disruption and change: read my latest book, The Anticipatory Organization. Click here to claim your copy!