Category: Professional Services


West Corporation

Posted on April 30, 2013 by West Corporation 


Sustaining BPO Mojo: Challenges With Business Process Optimization in Your Care Organization

Do you have responsibility for business process optimization (BPO) in your organization? Are you partnered closely with your internal and external technology providers and focused on the same BPO objectives? Does your enterprise have BPO mojo?

Total quality management (TQM) zealots were frustrated in the mid-1990s as the new Six Sigma process rose in popularity. In some people's opinion it wasn't all that new; it was just different. The return on investment (ROI) shared by the manufacturing segments that incorporated Six Sigma principles into their daily practices had enterprises, in all verticals, rushing to certify their employees as Six Sigma Black Belts and challenging them to incorporate process measurement into delivery and support organizations.

Various adaptations of these early methodologies faded and morphed through the last decade into what is now broadly categorized as “enterprise process performance improvement,” the simple concept of optimizing processes to ensure value and quality delivery of services to customers. It isn’t really new; it is just different. In addition, while enterprise leaders all fundamentally believe in business process improvement, few organizations are able to sustain the resource commitment to bring enough energy and focus into the efforts to find those big pockets of cost reduction or innovation that move the needle on company profitability. So, many enterprises have lost their BPO mojo.

To identify meaningful or measurable improvement, processes need to be defined, documented and given a clear start and finish. The simpler and more straight-lined the process, the better. But many processes are messy, especially in service-oriented enterprises. Some are routine, others are on-demand. Some are contained within a single business unit, while others span multiple organizations with varying service level objectives. Core processes generally get priority, but non-core processes rarely get the attention they need. When enterprises also have objectives to be nimble and to customize services to fit the needs of their customers' businesses, the quality manager of the care organization can go crazy trying to keep processes "within control limits." He or she may have a hard time finding that BPO mojo, month over month.

There are plenty of service-based organizations, consultants and software development companies that will help enterprises with business process improvement. They'll gladly interview clients to identify their most important satisfaction factors. They'll try to draw innovative process ideas out of employees through facilitated brainstorming sessions. They'll look internally at existing processes and try to measure the costs of the activities to help prioritize areas for automation. Identifying and executing on business process initiatives is definitely a combination of art and science. It's difficult to say whose approach is better; they are just different. Sometimes you need multiple perspectives to find the common theme between sources.

As a technology and service provider in the customer care space, one of the things I hear most when talking with clients about their quality and satisfaction goals is how high (or low) their confidence is that they are getting every possible bit of value out of the technology and tools they have invested in. We must constantly show proof of the ROI of our services against the client's objective, and sometimes subjective, measures. Since BPO helps organizations gain higher customer satisfaction, product quality, delivery speed and faster time to market, we find our best success comes when our programs closely align with the BPO initiatives of our customers.

The closer we are able to collaborate with the resources who are accountable for improving processes and are empowered to align all aspects of an organization with the wants and needs of clients, the more effective our programs are. We have also found that the more an enterprise has to respond to changing consumer, competitive, market and regulatory demands, the more we can help our clients create competitive advantage. Synchronization between process and technology is where enterprises find their BPO mojo.

Does your enterprise have BPO mojo?

West Corporation

Posted on April 24, 2013 by West Corporation 


Becoming More Agile

The software industry includes many different methods for developing and publishing code/software, including waterfall, prototyping, spiral and so on. Software development is like Baskin-Robbins ice cream — there are 31 flavors.

OK, maybe not exactly 31, but there is a wide diversity of software development methodologies. Rather than go into each one and touch on the pros and cons, I just want to briefly talk about agile development at a high level. It may be necessary to choose your software development life cycle (SDLC) flavor based on the size of the project and the complexity of the requirements. If you decide to practice some form of agile development, which is a strong possibility, then there are some things you should be aware of.

The agile software methodology is an incremental and iterative approach to developing projects and, much like software development life cycles in general, it comes in many variations and practices. It focuses on individuals and interactions, treating those elements as more important than processes and tools. In some cases, an agile approach requires developers to have extensive knowledge and past experience developing systems, as well as to be co-located with the actual end-user as much as possible.

The customer or end-user needs to be just as knowledgeable as the development team, providing constant feedback, due to the ever-evolving requirements. Requirements in an agile approach are not initially well-defined, and for the most part, are incomplete at the outset of the project. Requirements rapidly change through end-user feedback loops and evolve over time as the project progresses.

There are many advantages to agile programming, the biggest of which is reduced project risk: the product evolves from minimal initial expectations, constant end-user and stakeholder feedback grows it into what is actually requested, and at any point in the project an actual functioning prototype exists.

Traditional waterfall methods do not provide that advantage; rather, the end-user’s involvement is mainly prevalent in the requirements phase and product-acceptance phase. Basically, the end-user provides a set of requirements, and the rest is up to the interpretation and understanding of the project team.

It is silly to assume or even believe that requirements are never going to change during a project. That's a fact of life for any project in any industry. If the outcome isn't what the end-user requested, then one can end up in a costly situation. Agile methodology mitigates that risk by slicing the project life cycle into smaller, more manageable milestones, with the end-user reviewing the product at the end of each milestone and providing feedback. Rework at this point is not as costly, since the product has yet to fully evolve. Lower-risk release cycles promote better design and communication.

Another benefit is that the end-user actually has a working product in their hands during the initial stages of the project as opposed to the traditional delivery near the end of the project. This gives not only the end-user but also the project team a sense of good progress.

One of the best practices in agile software development is to focus on the task, not on individual status. Focusing on smaller tasks, instead of the overall status of the project, means more focused, smaller and more frequent releases, which ultimately equates to a feeling of accomplishment and progress. Smaller tasks are easier to track, and progress is measurable. Win-win all the way around.

In summary, the agile methodology is powerful. It offers a different style in managing and executing projects. Better communication, more sound expectations, greater flexibility and improved task-monitoring are only some of the benefits to being agile. There are plenty more SDLC methodologies out there. But, they aren’t all sunshine and rainbows, either. There are also consequences to adopting the agile methodology that fuel many great debates about which methodology is best.

As I stated earlier, there is no one-size-fits-all approach, and, often, the choice is driven by the need. Up to this point, West has been consistent in our approach and application of the waterfall methodology. But over the years, I have been involved in many projects that have progressed iteratively that have ended successfully. As we place more emphasis on better speed to market, there may be good reason to tweak our methodology. We don’t have to go all-in on agile, but we can always become more agile, in general.

West Corporation

Posted on April 18, 2013 by West Corporation 


Sometimes Humans Are Still Better Than Technology

Do you remember the Speak & Spell, and do you know when it was invented?

The Speak & Spell was first introduced in 1978 at the summer Consumer Electronics Show. This early speech technology was only the tip of the iceberg compared to where we are now with advanced speech and voice recognition.

Fast-forward 35 years: Apple launched a new iPhone with Siri speech-recognition software making the decades-old technology good enough for the average consumer. Rather than recognizing a simple word, Siri can interpret a stream of words and provide intelligent feedback.

Even with all of the speech and voice recognition software in place today, can we really afford to do away with the traditional transcription services? Sure, recognition software costs less than transcription; however, what is truly the “cost” to consumers?

Technology is important, but not more important than the quality that affects organizations and their consumers. Transcribers can intuitively correct simple errors of confusion, whereas the most advanced speech technology cannot. Speaking clearly and distinctly is essential for voice recognition software to work effectively. If a caller is in a noisy place or has an accent, then it may throw off the accuracy of the software. Conference settings and free-form feedback are other sources of inaccurate recognition for the software. Transcription agents can filter out extraneous speech like, "umm" and "aah."

An undisputed advantage of voice recognition is its speed. However, transcription can also be done in real time, without recordings sitting on backup tapes or servers overnight and causing the usual delays.

Transcription has been seen by some as a "routine" task. With so many technological advances in voice automation, it was thought that transcription would become obsolete. However, there are many aspects of transcription jobs that are not always routine. These instances require human judgment, error correction, formatting and clarification of the unclear. That judgment, experience and plain common sense can be an invaluable contribution.

Just because a computer can do something doesn’t necessarily mean that it should.

West Corporation

Posted on April 10, 2013 by West Corporation 


What’s Too High? A Six Sigma Approach to Caller Behavior Analysis

Is a 95 percent transfer rate too high? Is 10 minutes in an IVR to authenticate too long? Anyone in the IVR business would respond with a resounding, "Yes!" But what about a 60 percent transfer rate? Or 20 seconds to authenticate? The idea is that it's easy to use business experience to judge the obvious extremes. It is not so easy when the numbers are in the gray area. The outliers can hide just close enough to "normal" to go undetected by the human eye, yet they can be far enough away to cause a financial impact.

Luckily, there is a solution: Six Sigma. This well-known technique statistically defines exactly what is normal and identifies outliers falling outside of the normal range.

The idea behind Six Sigma is to track a particular metric (e.g., the number of calls made by a customer) over time to generate a distribution, or histogram, of its acceptable values (those close to the average) and unacceptable values (those beyond a certain distance from the average). Sigma, or σ, is the symbol statisticians use to denote this distance, also known as the standard deviation. Six Sigma says that more than two-thirds of the values of a particular metric will fall within one standard deviation of the average, while nearly all values will fall within three standard deviations on either side of the average (a total spread of six). Anything outside of that is a serious outlier.

West uses this approach to establish normal boundaries for repeat calls made by a single customer in a particular month. This study was important because our clients' customers make upward of 600,000 calls per day, and understanding repeat callers is the key to decreasing such calls. The resulting distribution was not exactly a textbook "bell curve," due to the nature of the data; however, it did turn out that roughly 10 percent of callers were outside of normal.

Zeroing in on particular extreme behaviors, we identified certain questionable customer practices. For example, one customer who made 200 calls per month (while the average customer made three calls during the same time frame) turned out to be a small-business owner who used his familiarity with the IVR and agent negotiation to get his clients (other customers) discounts of a cool $30 per service, i.e., per call.
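A minimal sketch of this kind of outlier screen, using Python's standard library and made-up call counts (the 200-call figure echoes the example above; the data and the 3σ threshold are illustrative assumptions, not West's actual model):

```python
import statistics

def find_outliers(values, n_sigma=3):
    """Flag values more than n_sigma standard deviations from the mean."""
    mean = statistics.mean(values)
    sigma = statistics.stdev(values)
    return [v for v in values if abs(v - mean) > n_sigma * sigma]

# Hypothetical monthly call counts for 30 customers: most make a handful
# of calls; one extreme repeat caller makes 200.
calls = [3] * 10 + [2] * 8 + [4] * 6 + [1] * 3 + [5] * 2 + [200]
print(find_outliers(calls))  # only the 200-call customer is flagged
```

Note that with very small samples a single extreme value inflates σ enough to hide itself, so a screen like this needs a reasonably large population behind it.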

West Corporation

Posted on March 19, 2013 by West Corporation 


Data Analysis: The Needle Has a Thread

In our world of big business intelligence, where the size and complexity of our data keeps multiplying before our eyes, it can become an overwhelming task to find the answers we’re looking for. But do we even know what questions to ask or what problems we are trying to solve?

As an example, I received a request for my analytics group a few weeks ago. The requester was in a panic: his client was complaining that "something was wrong" and too many calls were getting transferred to the call center. He asked for a full performance analysis of his IVR application, top to bottom, and needed it in a week. A full behavioral analysis of the entire application, with thousands of data points and millions of monthly calls, could take several weeks if done thoroughly. I grabbed him for a chat, had him take a deep breath and asked him the proverbial question: "What is your true objective? What problem are you trying to solve?" He had to pause and think about that for a moment.

“Well,” he said, “I want the client to be pleased with our performance. If we are letting too many calls through to the contact center, then we need to fix that.” That’s a pretty broad objective, so I broke it down for him.

Question: Has anything changed recently that would cause more transfers?
Answer: No, there haven’t been any changes in the past quarter.

Question: Through your reports, have you seen or observed increased transfers?
Answer: Yes. Transfers have increased in the past week but for no apparent reason.

Question: So, if nothing has changed in the program, but something has changed in the transfers, did the client change something?

The account manager immediately went back to his desk, called his client and asked the ultimate question. Within an hour he called back with the answer. Unbeknownst to us, a new caller type had been added to the client-side database, which feeds in real time to our application. Because our application was not programmed to recognize it, callers of this type were transferred to the contact center by default.
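The failure mode described above is the classic "unknown key falls through to the default branch" pattern. A hypothetical sketch (the caller-type names and flow names are invented for illustration, not taken from the actual application):

```python
# Hypothetical routing table mapping caller types to IVR entry points.
SELF_SERVICE_FLOWS = {
    "residential": "main_menu",
    "business": "account_menu",
}

def route_caller(caller_type):
    """Return the IVR entry point for a caller type. Any type the
    application was not programmed to recognize falls through to a
    live-agent transfer by default."""
    return SELF_SERVICE_FLOWS.get(caller_type, "transfer_to_agent")

print(route_caller("business"))  # account_menu
print(route_caller("premium"))   # newly added, unmapped type -> transfer_to_agent
```

When the client adds a "premium" caller type on their side, every such caller silently lands in the default branch, which is exactly the transfer spike the reports showed.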

So, the moral of the story is that sometimes we feel as if we need to find the needle in the haystack by sifting through every straw when the solution to the problem may be as simple as tracking the thread back to the first stitch.

West Corporation

Posted on March 18, 2013 by West Corporation 


Supporting IT Infrastructure Can Be Like Painting the Golden Gate Bridge

The Golden Gate Bridge is one of the most photographed objects in the world. Built over 75 years ago for a mere $35 million, this majestic suspension bridge required the tallest towers, the longest, thickest cables and the largest underwater foundation piers ever built. The view from Golden Gate National Recreation Area on a clear day — overlooking the Bridge and San Francisco Bay from the north side — is breathtaking. Its trademark color is a brilliant vermillion (“international orange”) that resists rust and suits the natural beauty of the area.

On my first visit to the area in the 70s during a family vacation, an interesting fact stuck in my mind. The bridge is constantly being repainted — a maintenance initiative that is never finished. It turns out that exposure to salt-laden fog wreaks havoc on the paint job and also limits the hours when painting can be done.

Fast-forward a couple of decades, and one of my responsibilities was maintaining a very large IVR platform. The Golden Gate Bridge came to mind as I worked to justify the constant need to refresh system hardware. Keeping the platform up to date was a big job, requiring lots of planning, coordinating, purchasing, configuring, staging and installing — followed by more configuring, testing, and occasionally troubleshooting, fixing and retesting. Sometimes, we refreshed because of age; other times, we were upgrading because of tremendous growth. Still other times, we replaced systems because new software versions required more power, more disk, and/or more storage. Over time, we developed a great deal of expertise in this maintenance activity — just as the Golden Gate painters no doubt did.

Avoiding the burden of installing, managing and refreshing IT infrastructure is one of the significant advantages of using a hosted service provider. Other advantages include avoiding the effort associated with software upgrades, bug fixes, and security scans and audits. And, of course, there is application design, development, integration, testing, and support — all of which require a considerable amount of skill and experience.

A cloud solution negates the need to hire, train, manage, retain and/or replace the subject matter experts who make it all happen. You get to focus on your core business, and we get to focus on ours. Let us do the painting; you get to use the bridge and enjoy the view.

West Corporation

Posted on March 7, 2013 by West Corporation 


‘Functioning as Designed’ Is Not the Same as ‘Designed for Functioning’

Have you ever started using a system and thought to yourself, “This was designed by a developer”?

Whether you have or not, the phrase usually means that the system was designed by someone who does not understand the users' working process. So how does this happen, leaving users unable to adequately use the system the developers produced?

In theory, if the users provide their requirements and these requirements are what the developer designs, then there shouldn’t be such a gap between what was intended versus what was understood.

I think it starts at the beginning, in the requirements phase. Users cannot provide complete requirements. They may be able to express a laundry list of desired functions, and that has some value. However, the devil is in the details. There is far more to the big picture than that.

Most users would love perfection, but they never expect it; they usually tolerate something in between. Interview your users and have a conversation with them. This discovery phase is critical; it allows you to explore the role of the user in the bigger picture. So, who conducts the interview? Usually the interviewer is a business systems analyst (BSA).

The user can describe the usual processes, where they start, what conflicts or decisions they have to make throughout a process, where things fall through the cracks, and the order of tasks (often overly complicated) — half of which they perform outside of a system. The interviewer's ability to remain neutral and objective encourages users to express what they love and what they hate, as well as describe best- and worst-case working scenarios. This starts to create a boundary around the general tolerance of the users.

Along with an interviewer, it is best to include someone else in the interview to take notes. Have you ever participated in a lively conversation and tried to write down everything while staying engaged with the other person? Unless you are recording the audio of the conversation, it is nearly impossible. Spend the time and money, and have someone else play the role of scribe. Who plays the role of scribe well? Like the interviewer, a BSA.

Sometimes more is just more, so don't bother interviewing a cast of thousands. Take a fair sample and interview a few different users: those with different tenures, jobs or skill sets. The interviewer and scribe should review the recorded details of all interviews and compile them into a summary for a complete profile of users and use cases. From the summary, the BSAs can derive or extract the requirements and formally present them as the business requirements document.

You can never ask too many questions. And, keep in mind: If it goes without saying, then it goes without coding.

West Corporation

Posted on January 14, 2013 by West Corporation 


Big Data: Size Does Matter

Data shapes the world. It tells us all sorts of interesting things. It helps us plan, it helps us make decisions and it tells us what’s right and what’s wrong with a business. But are businesses really making the most use of their data?

There are many types of data out there. Transactional (or dynamic) data is generated or modified by the systems that are used for transactional or operational purposes, such as a cash register or an ATM. Closer to home, an IVR application or a Web page could be considered transactional systems. Customers interact with them and execute various tasks.

Examples of transactional data include what a caller said, what options they pressed, how much they owe and how they paid. Each of these events could be tied to a date, time and even a part of the world they are calling from.

Another flavor of data is analytical data. Analytical data provides the business intelligence that allows organizations to make key decisions. This type of data is often stored in enterprise data warehouses and data marts and is optimized for decision support. An example of analytical data is identifying how many people own red two-door cars in any major U.S. cities starting with the letter “M.” A more practical example is identifying the ratio of people who prefer Android to iOS and then organizing them by demographics such as sex, location or age.
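The Android-versus-iOS example above is essentially a group-and-count over transactional records. A minimal sketch with Python's standard library (the survey rows and field names are made up for illustration):

```python
from collections import Counter

# Hypothetical survey rows: (preferred_platform, age_group).
responses = [
    ("Android", "18-24"), ("iOS", "18-24"), ("Android", "18-24"),
    ("iOS", "25-34"), ("iOS", "25-34"), ("Android", "35-44"),
]

# Tally platform preference within each demographic bucket.
by_group = Counter((age, platform) for platform, age in responses)
for (age, platform), count in sorted(by_group.items()):
    print(f"{age}: {platform} = {count}")
```

In practice this kind of rollup runs as a query against a data warehouse rather than in application code, but the shape of the computation is the same.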

The point is that data is everywhere, and if you can acquire the right data, it can give you the confidence to run your business and to help others run theirs. This is how the buzz term "big data" got started. Let's get some big data, and all our problems will be solved, right? Wrong. It is easy to get sucked into the hype of big data, or even get overwhelmed by it.

Three Data Characteristics

According to Wikipedia, “Big data is a term applied to data sets whose size is beyond the ability of commonly used software tools to capture, manage and process the data within a tolerable elapsed time.” Most of the data out there is unstructured, epic in volume and grows at an exponential rate – all qualities that make it quite challenging and costly to manage. But if you can organize this data, then you have a leg up on the competition.

According to the big data report by the McKinsey Global Institute, “If U.S. health care were to use big data creatively and effectively to drive efficiency and quality, [then] the sector could create more than $300 billion in value every year. Two-thirds of that would be in the form of reducing U.S. health care expenditure by about 8 percent.”

The following are typical big data characteristics:

  • Volume: Massive quantities of data that require extremely intense analysis and lots of hardware.
  • Variety: Data is not organized, is not simple and is not just text. It could come in the form of audio, video and even imagery.
  • Velocity: The data comes quickly and requires fast processing.

Because of these qualities, it is challenging to store the data and to develop the savvy to handle it. The data comes from various sources and can be unstructured and difficult to query in traditional relational databases and even data warehouses. There is also the security aspect of maintaining big data: with all that data being captured, it is even more important to make sure it is secure. Again, with the right plan, it is possible to reap the benefits of big data. But how do you know if your company is ready for it?

Fortunately, there are experts out there who have already gone through this exercise and provide guidelines for evaluating the feasibility of adopting big data (Forrester). And with West's expertise and partnerships, it is safe to say that we are a big data expert.

 

