Sunday, January 22, 2017

Five Trends of Knowledge Management

Many issues affect knowledge management. The five most important are big data, cybersecurity, mobility, social analytics, and customer engagement.

The availability of big data has opened many options for understanding everything from customer preferences to medical outcomes.

Amidst all that data, concerns about security have grown, so cybersecurity is taking on new importance. Mobility has become pervasive and affects nearly every element in KM, while social analytics is providing insights at a personal level that were never possible before.

Finally, although those four factors feed into many KM objectives, enhancing customer engagement has taken a place at the top of the priority list for virtually every company and is likely to remain there for some time.

Big data

The most dramatic trend impacting knowledge management is harvesting and analyzing big data. An esoteric phenomenon just a few years ago with a new set of technologies and terminology, big data is now wrapped into the strategic plans of many organizations, and not just the big ones.

A few applications can help with this challenge.

One of them is Hadoop, which can help integrate complex sets of data to support business decisions and marketing efforts.
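Hadoop's core idea is the map/reduce model: a map step emits key-value pairs and a reduce step aggregates them across a cluster. Below is a minimal sketch of that model in plain Python; the purchase records are purely illustrative, and a real Hadoop Streaming job would read from stdin and run distributed rather than in one process.

```python
# A sketch of the map/reduce model Hadoop popularized, run locally.
# In a real Hadoop job, map_records and reduce_totals would execute
# in parallel across a cluster; the data here is illustrative.
from collections import defaultdict

def map_records(records):
    """Map step: emit a (key, value) pair for each purchase record."""
    for record in records:
        yield record["region"], record["amount"]

def reduce_totals(pairs):
    """Reduce step: aggregate values by key to support business decisions."""
    totals = defaultdict(float)
    for key, value in pairs:
        totals[key] += value
    return dict(totals)

purchases = [
    {"region": "west", "amount": 20.0},
    {"region": "east", "amount": 15.0},
    {"region": "west", "amount": 5.0},
]
print(reduce_totals(map_records(purchases)))  # {'west': 25.0, 'east': 15.0}
```

The same two-step shape scales from this toy example to the customer-segmentation workloads described above.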

Actian Analytics Platform is a big data analytics solution that is accessible and affordable for small businesses, but also scalable to large ones. It can be used to target the right customers. It can also be used to generate an economic case for potential buyers.

For example, Yahoo uses Actian to segment millions of users across 10,000 variables, looking for clues that will help predict customer behavior. Amazon uses Actian to provide the core technology components for its cloud-based data warehouse.

The technology can pull together diverse data in near real time as it flows through the data pipeline, supporting marketing, customer engagement, risk assessment, and many other applications. At both ends of the spectrum, from startups to large-scale users, big data is the central force in converting large amounts of data into decision-supporting information.

Cybersecurity

With so much information at large, unauthorized access to it has the potential to be destructive. Knowledge management is focused on information; what makes KM so important is that people can obtain information and analyze it better than ever before. In the past, it was hard to find out who was buying products and how they felt about them. Now an enormous amount of information is available, which has clear benefits, but that information can also be stolen and used for financial gain.

The cybersecurity market is expected to grow from $95.6 billion in 2014 to $155.7 billion by 2019, a compound annual growth rate of 10.3%. This figure includes network, endpoint, application, content, and wireless security, as well as many other types of technology. Innovative products are emerging in response to increased threats.

The volume of data, including an entirely new collection from the Internet of Things, the challenges of mobile devices, greater use of the cloud for data storage, and the broad impact of consumer concern are all sparking this growth.

Cybercrime comes in many forms, from stealing credit card numbers out of a merchant’s database to identity theft of consumers. A common strategy is for a cyberthief to obtain some publicly available information about an individual and use it to open an account or figure out a password that provides them access to an account. Users need to be vigilant about changing their passwords and making them strong. Technological safeguards can be put into place, but security depends a great deal on the human effort.
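The "strong password" advice above can be sketched as a simple scoring check of the kind many sites run at signup. The rules below are illustrative, not a security standard, and a real deployment would add checks such as breached-password lists:

```python
# A minimal password-strength score: length plus character variety.
# Illustrative rules only; real systems use richer policies.
import re

def password_strength(password):
    """Return a 0-4 score based on length and character variety."""
    score = 0
    if len(password) >= 12:
        score += 1  # reasonable length
    if re.search(r"[a-z]", password) and re.search(r"[A-Z]", password):
        score += 1  # mixed case
    if re.search(r"\d", password):
        score += 1  # contains a digit
    if re.search(r"[^A-Za-z0-9]", password):
        score += 1  # contains a symbol
    return score

print(password_strength("p@ssw0rd"))            # 2 (short, no upper case)
print(password_strength("Str0ng-Passphrase!"))  # 4
```

Even a crude nudge like this reinforces the point that security depends a great deal on human effort, not technology alone.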

Mobile devices add another element of risk. They are much easier to lose or to steal, and often contain sensitive information such as bank passwords. Technological advances such as the ability to remotely disable a phone will continue to emerge to protect users from the impact of cybertheft. However, user carelessness with physical security, such as leaving a laptop in an unlocked car, remains a threat.

Companies can mitigate the impact on their customers by limiting the responsibility of users in the event of fraud or identity theft. Industries are growing up around providing insurance for such scenarios, either to the merchant or the customer.

Mobility

Although mobility brings hazards, it has brought even more advantages, and it will continue to drive the pervasiveness of knowledge management. Increasingly, knowledge management solutions, including content management, process management, and analytics, are available in mobile versions. No longer miniaturized versions of the desktop browser, mobile apps are delivering usable KM applications.

Mobility is also forging new paths. For example, Apple Pay allows use of the smartphone as a wallet.

One mistake merchants make in designing mobile apps is to try to duplicate a physical purchase experience on a mobile device. Merchants should not necessarily automate an existing process, but instead should look at the experience holistically. Mobile experiences have to be simpler and as good as, if not better than, the non-mobile experience in order to gain loyalty from the customer.

Barriers remain in the use of mobile devices for enterprise applications, but the barriers also represent opportunities. In a study of U.S. and U.K. information technology decision makers conducted by Vanson Bourne, respondents reported that although more than 400 enterprise applications were typically deployed in each organization, only 22% of them could be easily accessed on mobile devices.

One reason for that is the diversity of enterprise applications. Some are custom, some are SaaS and some are off-the-shelf, and the technology for accessing each one is different. Therefore, development of mobile apps for such applications is needed, but organizations are hampered by the high cost. More efficient development techniques would be a big benefit.

The proliferation of mobile devices has also expanded a number of supporting sectors besides mobile application management (MAM), including mobile content management (MCM) and mobile device management (MDM). Each of them has a touchpoint with knowledge management and should be viewed as part of an overall KM strategy.

Social analytics

Social analytics is a booming market, expected to triple over the next five years to nearly $9 billion, a growth rate of nearly 25% per year. Initially based on simple counts of the number of times a brand was mentioned in social media, analytics has evolved to the point where sophisticated algorithms support the use of social data for targeted marketing and for initiating customer service.

Social analytics has moved from hindsight to insight and now to foresight, with predictive capabilities. SAS social media solutions include integration and storage of social data, general text analytics, analysis of comments for sentiment, and a social conversation module that can work directly or integrate with third-party engagement solutions.
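At its simplest, sentiment analysis of comments can be sketched as a lexicon lookup; commercial tools like those mentioned above use far more sophisticated models. The word lists here are purely illustrative:

```python
# A toy lexicon-based sentiment scorer: count positive words minus
# negative words. Illustrative only; real tools use richer models.
POSITIVE = {"love", "great", "excellent", "happy"}
NEGATIVE = {"hate", "terrible", "broken", "slow"}

def sentiment(comment):
    """Return a score: positive word count minus negative word count."""
    words = comment.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

print(sentiment("I love this brand, support is great"))  # 2
print(sentiment("the app is slow and broken"))           # -2
```

Scores like these, aggregated over thousands of posts, are what feed the brand-level sentiment dashboards described in this section.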

Real-time analysis allows marketing or brand campaigns to be synchronized with the topic threads that are emerging. Decision trees allow "what-if" scenarios, such as the impact of increasing the frequency of an ad or of combining customer segments. These analyses allow users to determine the relationships among various factors and to present visualizations of those relationships for better marketing decisions.

The value of social media analytics is also increased by combining it with data such as purchasing information from the data warehouse, to compare customers’ stated intentions with actual behavior. There is tremendous growth in analyzing social media information along with data from the Internet of Things which measures physical activity to build a profile not just of transactions but of tone and behavior along the customer journey.

Social media analytics should not be isolated. The information should be tightly connected to upstream data so different departments can use it to drive the customer experience.

Customer engagement

The driving force behind all of the above - collecting and managing big data, keeping information secure, enabling mobility, and analyzing social media inputs - is customer engagement. The ultimate goal is to engage the customer, whether for marketing, customer support, participation in loyalty programs, or some other outcome.

The key to customer engagement is an omni-channel approach. Whether the interaction is initiated by the customer or the organization, customers want options in the delivery channels.

Customer engagement is not a static business area. The feedback obtained through social analytics and traditional business intelligence can be merged to explain both what customers are doing and why. That information can guide the delivery of marketing materials and help provide better customer service.

Galaxy Consulting has 17 years of experience in knowledge management. We have led knowledge management initiatives. Contact us for a free consultation.

Saturday, December 31, 2016

Search in the Land of Information Silos

Information access and retrieval within most organizations is a work in progress. There might be a general search system for marketing information, and probably one or more database search systems.

The larger the organization, the greater the number of information retrieval systems. Each laptop and mobile device has a search system. Mobile phone apps sport their own search systems. The lawyers in an organization may have different search systems for specific types of legal matters. The enterprise resource planning (ERP) users have a search system. When it comes to enterprise search, there are many silos.

A “silo” is a content collection available to certain users. Given the reality of silos, providing access to “all” information may be impractical; “all” may not mean all, or even most, available information. Big data is easy to talk about but difficult to make accessible. The same challenge exists for images, audio recordings, and engineering drawings whose details are hidden in a proprietary system’s database.

Search that is variously called universal, unified, or federated search is a solution to the challenge of information silos. The term meta-search is often used to describe an integrating function that passes the user’s query across discrete content indexes and returns a single results list to the user. Endeca, Inxight Software, Northern Light, Sagemaker, and Vivisimo are search applications that can be used for universal, unified, or federated search in an organization.
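The meta-search pattern can be sketched in a few lines: fan one query out across discrete silo indexes and merge the hits into a single ranked results list. The in-memory "indexes" and crude term-overlap scoring below are stand-ins for real search engines:

```python
# A sketch of federated (meta-) search: one query, many silos, one list.
# Each silo index is a plain list of documents; real systems would call
# out to separate engines and normalize their relevance scores.
def search_silo(index, query):
    """Return (score, doc) hits from one silo; score = term overlap."""
    terms = set(query.lower().split())
    hits = []
    for doc in index:
        score = len(terms & set(doc.lower().split()))
        if score:
            hits.append((score, doc))
    return hits

def federated_search(silos, query):
    """Fan the query out across silos, merge, and rank the results."""
    merged = []
    for index in silos.values():
        merged.extend(search_silo(index, query))
    merged.sort(key=lambda hit: hit[0], reverse=True)
    return [doc for _, doc in merged]

silos = {
    "marketing": ["pricing sheet for new product", "brand guidelines"],
    "legal":     ["contract template", "new product licensing terms"],
}
print(federated_search(silos, "new product pricing"))
```

The hard parts in practice are exactly what the surrounding text describes: connectors for each silo's format, security trimming, and normalizing relevance across engines.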

The initial query might not unlock the information stored in the system’s index. Facets, topics, and suggestions make it easy for the user to click through links without having to craft additional queries.

Behind the curtains, federated search results require some maintenance. A user does not want to know the file format in which the information he or she needs is stored; the user wants answers. Early federating systems like WAIS relied on standards for content representation. Today, however, there are many “standards,” and content processing systems must be able to handle content in the hundreds of formats found in organizations.

It is important to deliver a system that makes an organization’s disparate types of digital content available.

There are barriers to unified, federated or integrated search.

Some digital content cannot be included in a general purpose search system for security, business or legal reasons. Technical content such as chemical structure information at a pharmaceutical company requires special purpose systems. The same need applies to product manufacturing data, legal information and engineering drawings.

Most search applications exclude video streams from the index. If video is indexed, the system processes the text included in the digital file or the indexing provided by the video owner.

The cost of creating connectors to connect with certain content types could be too high, or license fees could be required to gain access to the file formats.

The computational burden required to process certain types of content might exceed the organization’s ability to fund the content processing. Big data, for example, requires a computing capability able to handle the Twitter stream, RSS feeds and telemetry data from tracking devices. Cost could be prohibitive for processing all content types.

The most important challenge is the need for confidentiality. The legal department does not want unauthorized access to information related to a legal matter out of its control.

Some government contracts require that information related to certain types of government work be kept separate. Common sense dictates that plans for a new product and its pricing remain protected. If someone needs access to that information, a different search system may be used to ensure confidentiality.

Even in the absence of business or legal requirements, some professionals do not want to share content. That may be a management problem. When a manager locks up information in a no-access silo, the indexing script simply skips the flagged server.

To summarize, silos of information are a challenge for organizations to process and use effectively. In the enterprise, integration should take place across silos of content.

Galaxy Consulting has 17 years of experience in integrating information silos using universal, unified, or federated search. We have experience with these search applications. Contact us for a free, no-obligation consultation!

Wednesday, November 30, 2016

Three Values of Big Data

Big Data is everywhere. But to harness its potential, organizations should understand the challenges that come with collecting and analyzing Big Data. 

The three values that are important in managing big data are volume, velocity, and variety. These three factors serve as guidance for Big Data management, highlighting what businesses should look for in solutions.

But even as organizations have started to get a handle on these three V’s, two other V’s, veracity and value, are just as important, if not more so.

Volume is the ability to ingest, process, and store very large data sets. The definition of "very large" varies by business and depends on the particular circumstances of the business problem, as well as the volumes that business has handled before.

Volume can also be defined as the number of rows, or the number of events happening in the real world that are captured in some way, a row at a time. Accordingly, the more rows you have, the bigger the data set is going to be.

Bigger Volumes, Higher Velocities

In today’s digital age, having huge volumes of data is hardly rare. The proliferation of mobile devices ensures that companies can gather more data on consumers than ever before, and the rise of the Internet of Things will only increase this plethora of data. Moreover, businesses will have even more information on customers as they begin to use one-on-one messaging channels to interact directly with them.

The sheer volume of data available to us is greater than ever before. In fact, in many ways, nearly every human action can be quantified and logged in a bank of data that’s growing at an incredibly fast rate. All of this data can be turned into actionable insights that drive business decisions and can help transform every customer interaction, create operational efficiency, and more.

This increase in data volume is paired with a simultaneous increase in speed: both the volume of data and the rate at which it arrives are growing. These increases have forced IT staff to spend more time figuring out how to process and analyze that data.

Velocity is the key V of the three V’s. For example, a customer will visit a company’s site or use its mobile application but only for a short amount of time. The business may have just seconds to gather customer information and deliver a relevant response based on that information, usually just one message or offer.

This quick turnaround requires you to process all of that real-time behavioral data as fast as possible. If you only learn the day after that your customer was on your web site, you can no longer reach them in the moment. One aspect of a successful customer journey is being able to send the right message at the right time to the right customer. Timeliness and relevancy are the foundation of delivering personalized customer experiences in real time.
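This kind of real-time processing is often built on a sliding window over the event stream: keep only the most recent behavior per visitor so a response can be chosen while the visitor is still on the site. A minimal sketch, with illustrative timestamps and a 30-second window:

```python
# A sliding-window sketch for a real-time behavioral stream: retain only
# events from the last WINDOW_SECONDS. Timestamps are illustrative.
from collections import deque

WINDOW_SECONDS = 30

def update_window(window, event, now):
    """Append the new event, then drop anything older than the window."""
    window.append(event)
    while window and now - window[0]["t"] > WINDOW_SECONDS:
        window.popleft()
    return window

window = deque()
for event in [{"t": 0, "page": "home"},
              {"t": 25, "page": "pricing"},
              {"t": 45, "page": "checkout"}]:
    update_window(window, event, now=event["t"])

print([e["page"] for e in window])  # ['pricing', 'checkout']
```

By the third event, the visit to the home page has aged out of the window, so any offer would be chosen from the pricing and checkout activity still in view.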

A Variety of Formats

Data sets are in a variety of formats, and the number of data types continues to grow. Radio-frequency identification (the use of electromagnetic fields to gather information from tags attached to objects), smart metering (devices that monitor information on energy consumption for billing purposes), and the ubiquity of mobile devices with geo-location capabilities are only a few examples of diverse sources of consumer information.

All of these technologies have their own methods of capturing and publishing data, which adds to the complexity of the information environment.

But overcoming these data complexities could be well worth it. Having a large variety of data is crucial for creating a holistic customer view. Access to data such as a customer’s purchasing history, personal preferences based on social media postings, exercising habits, caloric intake, and time spent in the car can help companies understand that customer on a deeper level, and thus build experiences that are tailored to that customer.

But this diversity of data sources can be a blessing and a curse. A blessing because businesses have an increasingly large range of channels from which to pull customer information, but a curse because it can be difficult to filter through that information to find the most valuable content.

Variety may be a little overstated in discussions of Big Data. Audio and video are examples of data formats that can be particularly difficult to analyze. Usually, companies try to come up with an intermediate representation of that data and then apply old or new algorithms to that representation to extract signals, whatever the definition of signal is for the business problem they are trying to solve.

Volume, velocity, and variety are undoubtedly important to managing customer information. Companies should keep in mind other important aspects of big data if they want to make the most of it.

Data tools such as Apache Hadoop and Apache Spark have enabled new methods of data processing that were previously out of reach for most organizations. While the growing volume of data, the time needed to process it, and the sheer number of input sources pose challenges for businesses, all three can largely be addressed through technology.

New V's Emerge

Investment in Big Data has begun to stabilize and enter a maturity phase over the past year. It will take time for infrastructure and architectures to mature, and best practices should be developed and refined against these architectures.

Organizations should consider how to use Big Data to bring about specific outcomes, in other words, organizations should examine the challenges of Big Data from a business perspective as opposed to a technical one. A framework that incorporates the business-oriented characteristics of veracity and value can help enterprises harness Big Data to achieve specific goals.

Not all data is the same, but organizations may not be paying enough attention to changes within individual data sets. Contextualizing the structure of the data stream is essential. This includes determining whether it is regular and dependable or subject to change from record to record, or even with each individual transaction. Organizations need to determine how the nature and context of data content in all its forms, text, audio, or video, can be interpreted in a way that makes it useful for analytics.

This is where the veracity, or trustworthiness, of data comes in. Determining trustworthiness is particularly important for third-party data, which should pass through a set of edits and validation rules.

Veracity entails verifying that data is suitable for its intended purpose, and usable within a given analytic model. Organizations should use several measurements to determine the trustworthiness and usefulness of a given data set. Establishing the degree of confidence in data is crucial so that analytic outputs based on that data can be a stimulus for business change.

Important metrics for evaluating and cleaning up data records are:
  • completeness measurements, or the percentage of instances of recorded data versus all available data within a business ecosystem or market (or the percentage of missing fields within a data record);
  • uniqueness measurements, or the percentage of alternate or duplicate data records;
  • accessibility measurements, or the number of business processes and personnel that can benefit from access to specific data, or that can actually access that data;
  • relevancy measurements, or the number of business processes that utilize or could benefit from specific data;
  • scarcity measurements, or the probability that other organizations, including competitors and partners, have access to the same data (the scarcer the data, the greater its impact).
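Two of these metrics, completeness and uniqueness, can be sketched directly over a toy set of customer records; the field list and data below are illustrative:

```python
# A sketch of two data-quality metrics from the list above:
# completeness (percent of non-missing fields per record) and
# uniqueness (percent of duplicate records). Data is illustrative.
FIELDS = ["name", "email", "city"]

records = [
    {"name": "Ann", "email": "ann@example.com", "city": "Boston"},
    {"name": "Bob", "email": None, "city": "Austin"},
    {"name": "Ann", "email": "ann@example.com", "city": "Boston"},  # duplicate
]

def completeness(record):
    """Percentage of expected fields that hold a value."""
    filled = sum(1 for f in FIELDS if record.get(f))
    return 100.0 * filled / len(FIELDS)

def duplicate_rate(records):
    """Percentage of records that duplicate an earlier record."""
    seen, dupes = set(), 0
    for r in records:
        key = tuple(r.get(f) for f in FIELDS)
        if key in seen:
            dupes += 1
        seen.add(key)
    return 100.0 * dupes / len(records)

print([round(completeness(r)) for r in records])  # [100, 67, 100]
print(round(duplicate_rate(records)))             # 33
```

Scores like these give an analytics team a concrete, repeatable way to decide how much confidence to place in a data set before building on it.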
Value is Paramount

While veracity can’t be overlooked, value is the most important factor. The first three V’s are really about architecture, infrastructure, and representation of data, things that are important to IT organizations but far less interesting to business stakeholders.

Business stakeholders really don’t care about the first three; they only care about the value they can extract from the data. Executives often expect the analytical teams at their organizations to hide the first three V’s (volume, velocity, and variety) and deliver only the last V - the value that is fundamental to the success of the business.

The concept of value is essential for organizations to succeed in monetizing their data assets. Value is a property that helps identify the purpose, scenario, or business outcomes that analytic solutions seek to address. It helps to confirm what questions are to be answered and what actions will be taken as a result, and defines what benefits are anticipated from collecting and analyzing the data.

Value is a motivating force when it comes to developing new and innovative ideas that can be tested by exploring data in different ways.

The ability to pull valuable information from Big Data and use that information to build a holistic view of the customer is absolutely critical. It’s no longer just an option to develop one-to-one relationships with customers; it’s a requirement. And to build that relationship, companies have to leverage all the customer information they can to personalize every interaction with them.

By using such information to lead customers on a personal journey, companies can help ensure that customers will stay with them long term, and even become brand advocates. Value is derived from making the data actionable. Organizations can have all the information about a customer, but it’s what they can do with it that drives value for the business.

The Three V’s model of volume, velocity, and variety is useful for organizations that are just beginning to take control of their data, and certainly should not be forgotten by enterprises that have advanced further in their management of customer information.

The first three V’s are equally important. In the digital age, companies have accumulated more data than ever before, are pulling data from a variety of sources, and are seeing that data flow at an increasing rate. A combination of these three factors can help organizations create relevant, personal, one-on-one customer interactions.

Deriving value is the ultimate business goal for any enterprise. The standard Three V’s model does not satisfactorily identify any data properties from a business usage perspective. Even though Big Data, and data in general, provides organizations with a lot of capabilities, the challenge for businesses is to make sure that they adapt how they think about the business processes, how they report on them, and how they define key performance indicators.

Organizations should try to get to the value. They need to turn data into value by figuring out how to use it to optimize business processes. In the end, the Three V’s model for Big Data is a useful starting point. But then it becomes about the ultimate goal, the one organizations must not lose sight of: driving value.

Galaxy Consulting has 17 years of experience in big data management. We are at the forefront of driving value from big data.

Friday, November 11, 2016

Voice Search

Roughly 56% of teenagers and 41% of adults use voice search on their mobile phones every day, according to Northstar Research. 

For example, modern consumers in Boston are much more likely to ask Google Now, Siri, Cortana, or Amazon’s Alexa to find the nearest coffee shop than they are to type "coffee shops near Boylston Street in Boston" into a search bar on Google's homepage.

This truly creates a challenge for search engine providers and for the providers of those personal assistants. But as consumers increasingly turn to voice search on their mobile phones, those Boston coffee shops will now have to rethink their search engine optimization (SEO) strategy if they hope to show up in voice search results.

Search is a science, and the rules are different for text and voice search.

It will only become more vital that business data on a company web site, such as store locations, hours, and contact information, is accurate and up to date. Businesses will also need to make sure that they are portrayed accurately on local review sites like Yelp.
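One common way to make that business data machine-readable is schema.org markup embedded in the page as JSON-LD, which search engines and voice assistants can parse reliably. A sketch of such markup, generated here with Python; the shop details are fictional:

```python
# Emitting schema.org LocalBusiness data as JSON-LD. Embedded in a page
# (inside a <script type="application/ld+json"> tag), this lets engines
# read hours and location reliably. All shop details are fictional.
import json

business = {
    "@context": "https://schema.org",
    "@type": "CafeOrCoffeeShop",
    "name": "Boylston Street Coffee",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "500 Boylston Street",
        "addressLocality": "Boston",
        "addressRegion": "MA",
    },
    "openingHours": "Mo-Su 07:00-19:00",
    "telephone": "+1-617-555-0100",
}

snippet = json.dumps(business, indent=2)
print(snippet)
```

A coffee shop marked up this way gives a voice assistant exactly the structured facts, name, address, and hours, that a "coffee shops near Boylston Street" query needs.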

Companies also need to consider how and where consumers are conducting their voice-based searches. When using computer-based search, it's assumed that a user is at a computer, so there is more screen space and more time to search. When using a mobile device, it's assumed a user is out, time is short, and the user needs access to quick bits of information on a small screen.

Therefore web sites need to be designed so that they dynamically adjust to fit any screen the consumer is using.

Google and Bing use mobile-friendliness as a ranking factor in their SEO algorithms. Both Microsoft and Google offer tools to help companies determine whether their sites are mobile-friendly. The tools look at factors such as loading speed, the width of page content, the readability of text on the page, the spacing of links and other elements on the page, and the use of plug-ins.

When it comes to voice search, web content that delivers the answers consumers want, in the quickest way possible, will ultimately win. The information should be concise and to the point, with more of an emphasis on usefulness than visual appeal.

Web content should be presented in more of a natural, conversational style and structured more like FAQs, answering the questions consumers might pose in voice search queries without requiring them to click on additional links or take other actions. Voice searches might be initiated in the car while someone is driving.

Companies also need to consider how consumers ask for information through voice search. More often than not, voice search queries are phrased using the same types of who, what, when, where, how, and why questions that are part of natural conversations. During these conversational, natural language search queries, consumers do not typically use the same keywords or metadata that are the hallmarks of text-based searches. Using basic keywords to set SEO parameters alone is no longer enough.

Use long-tail keywords. Rather than relying on a single word or phrase, long-tail keywords involve multiple keyword phrases that are very specific to whatever the company is selling.

Companies need to teach their systems and the search engines a very specialized lexicon that corresponds to their product and service names.

SEO rules keep changing and so SEO strategy needs to change accordingly.

Galaxy Consulting has many years of experience with search. Contact us for a free consultation.

Sunday, October 2, 2016

Content Localization

Content is the fuel of your organization. Content spills into all areas of an organization and fuels absolutely everything with which customers interact, whether it's a commerce web site, marketing or sales materials, or any other channels.

Preferences for content vary by continent. Companies looking to expand operations overseas will inevitably face cultural and language barriers. They will have to think about how content is resonating with global audiences and they will face the challenge of determining which content to adjust to meet the needs of local markets.

To help ensure a quality customer experience, the top global brands employ a localization strategy to adapt their online content for regional specificity. Content should clearly resonate with companies' international audiences.

Content localization is the process of modifying content to make it usable for a new locale. Often, this includes translating the content from the source language into the language used in the locale. It is an integral part of adapting a product for a particular market. For example, if a company currently sells products in the U.S. and is expanding into France, it would need to translate its content into French.

Content translation and localization are not the same. Translation is the process of changing an original language of content into a different language. Content localization is a more specialized process of adapting content for regional or local consumption. It goes beyond translation to modify the source language and other site elements to appeal to the customer’s cultural preferences in their own target language. Localization is about refining your message and curating your brand to meet the cultural, functional, and language expectations of your global markets.

Why localize content? There are a few reasons:
  • catering to local customers - showing local customers that you care about them brings more value to your products;
  • reducing risk - avoiding liability by not using words that might be offensive in a different country;
  • enhancing marketing - a great campaign or advertisement is useless if your target audience doesn’t speak English well enough to understand your marketing message;
  • increasing sales - localized content sells more often because consumers can fully understand what they are buying.
Content localization includes other processes in addition to translation. For example, U.S. and French currencies are different, so you might need to change references from dollars to euros. It is also possible to localize information without translating it, such as by changing examples, currency, geographically specific information, and perhaps color schemes.
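Currency and date changes like these can be sketched as locale-aware formatting. A real project would use an i18n library backed by CLDR locale data (French, for instance, uses a decimal comma and different digit grouping); the lookup table below is deliberately simplified for illustration:

```python
# A simplified sketch of locale-aware formatting: the same price and
# date rendered for a U.S. and a French audience. Real projects should
# use an i18n library with CLDR data rather than this toy table.
from datetime import date

LOCALES = {
    "en_US": {"currency": "${amount:,.2f}", "date": "{m}/{d}/{y}"},
    "fr_FR": {"currency": "{amount:.2f} €", "date": "{d}/{m}/{y}"},
}

def localize_price(amount, locale):
    """Format a price using the locale's currency pattern."""
    return LOCALES[locale]["currency"].format(amount=amount)

def localize_date(d, locale):
    """Format a date using the locale's day/month ordering."""
    return LOCALES[locale]["date"].format(d=d.day, m=d.month, y=d.year)

d = date(2016, 10, 2)
print(localize_price(1234.5, "en_US"), localize_date(d, "en_US"))  # $1,234.50 10/2/2016
print(localize_price(1234.5, "fr_FR"), localize_date(d, "fr_FR"))  # 1234.50 € 2/10/2016
```

Even this toy example shows why localization is more than translation: the words may be unchanged, yet the currency symbol, its position, and the date ordering all differ by market.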

These are standard best practices that help to minimize the cost of localization:

Content Development:
  • source text should use clear, simple, concise language;
  • do not use jargon, idiom, metaphors, or other creative language;
  • use simple words;
  • use simple language structure;
  • use consistent terminology;
  • limit text in graphics, and put text into an editable layer;
  • use legends rather than embedded text for graphics.
Formatting:
  • use predictable structure;
  • use styles or structured authoring;
  • do not use embedded formatting.
Delivery:

Include a language flag on content so the system can deliver content to readers in the language they want.
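A minimal sketch of that idea in Python, assuming content versions are stored in a dictionary keyed by a language flag and the reader's preferences arrive as an ordered list (as in an HTTP `Accept-Language` header):

```python
def select_version(versions, preferred_langs, default="en"):
    """Return the first content version matching the reader's language
    preferences, falling back to a default language."""
    for lang in preferred_langs:
        if lang in versions:
            return versions[lang]
    return versions[default]
```

For example, `select_version({"en": "Hello", "fr": "Bonjour"}, ["de", "fr"])` returns `"Bonjour"`: German is unavailable, so the reader's second choice is delivered.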

Localization:
  • use professional translators;
  • use translation memory systems;
  • work with localization professionals to identify potential problems;
  • look for opportunities to change source content instead of requiring changes in each target language.
When creating graphics, make sure that they are culturally neutral and acceptable for global audience. These are the best practices for graphics:
  • use color carefully because colors meaning varies across cultures;
  • images of people are not appropriate in some cultures and in other cultures, images of women are not acceptable;
  • images of hands or feet are not appropriate in some cultures;
  • when designing icons make sure that they are universally understood.
Some examples of cultural content include:
  • colors, shapes, sizes, styles;
  • images, icons, graphics;
  • societal codes; i.e. humor, etiquette, rituals, myths, symbols;
  • societal values, power, relationships, beliefs.
Some examples of functional content include:
  • date and time formats, telephone numbers, contact information;
  • weights, measurements, geographical references;
  • language and linguistic content; product descriptions, reviews.
Content translation and localization also differ on a tactical level. While simple translation may be appropriate for some content types in certain markets, localization is most often required for adapting highly emotive, creative marketing content so that it clearly resonates across locales.

There are several content types, from marketing collateral to legal and technical information and user-generated forum content. For reasons of efficiency and cost, is is the best practice to map these content types to the most appropriate translation or localization methods.

Not all company's content needs to be translated. Companies that located in one location should make it a priority to determine which content should be tailored for various regions. For example, blogs and tweets are not as significant as marketing and training materials, one help sheets, business forms, support email, FAQs.

It is generally easier to select the best fit when you consider your audience and the content’s source and intent. Other parameters include volume, update frequency, content lifecycle, and budgetary considerations. Depending on your language service provider’s (LSP) capabilities, there are several methods from which to choose. When making these decisions, it’s best to consult an experienced LSP that offers a wide range of services.

Following these best practices will help to reduce the overall cost of localization. Translators very often use memory tools and following these best practices would help these tools to recognize the same patterns.

Companies can try to discover which locales and languages provide the greatest return on investment. Identify languages that make up 90% of global online economic opportunities.

Be sure to follow industry best practices to reduce complexity, speed time-to-market, control costs and ensure quality localized content for all of your global markets. Companies will need to adopt stronger content strategies, keep closer tabs on the ways in which customers are consuming content, and take into consideration consumer preferences based on geographical region and digital trends.

Achieving global excellence means improving local experiences and getting better insights into those markets.

Wednesday, July 27, 2016

Navigating Big Data

Big Data is an ever-evolving term which is used to describe the vast amount of unstructured data. Published reports have indicated that 90% of the world’s data was created during the past two years alone.

Whether it’s coming from social media sites such as Twitter, Instagram, or Facebook, or from countless other Web sites, mobile devices, laptops, or desktops, data is being generated at an astonishing rate. Making use of Big Data has gone from a desire to a necessity. The business demands require its use.

Big Data can serve organizations in many ways. Ironically, though, with such a wealth of information at a company's disposal, the possibilities border on the limitless, and that can be a problem. Data is not going to automatically bend to a company's will. On the contrary, it has the potential to stir up organizations from within if not used correctly. If a company doesn't set some ground rules and figure out how to choose the appropriate data to work with, as well as how to make it align with the organization's goals, it's unlikely to get anything worthy out of it.

There are three layers of Big Data analytics, two of which lead to insights. The first of these, and the most basic, is descriptive analytics, which simply summarize the state of a situation. They can be presented in the form of dashboards, and they tell a person what's going on, but they don't predict what will happen as a result. Predictive analytics forecast what will likely happen, prescriptive analytics guide users to action. Predictive and prescriptive analytics provide insights.

Presenting the analytics on a clean, readable user interface is vital but sometimes is ignored. Users get frustrated when they see content that they can't decipher. A canned dashboard does not work for users. They need to know what action they have to take. Users demand a sophisticated alert engine that will tell them very contextually what actions to take.

Using such analytics, ZestFinance was able to glean this insight: those who failed to properly use uppercase and lowercase letters while filling out loan applications were more likely to default on them later on. Knowing this helped them identify a way to improve on traditional underwriting methods, pushing them to incorporate updated models that took this correlation into consideration. As a result, the company was able to reduce the loan default rate by 40% and increase market share by 25%.

Unfortunately, insights have a shelf life. They must be interpretable, relevant, and novel. Once an insight has been incorporated into a strategy, it's no longer an insight, and the benefits it generates will cease to make a noticeable difference over time.

Getting the Right Data

To get the right data leading to truly beneficial insights, a company must employ a sophisticated plan for its collection. Having a business case around the usage of data is the first important step. A company should figure out what goals it would like to meet, how and why data is crucial to reaching them, and how this effort can help increase revenue and decrease costs.

Data relevance is the key and what is important to a company is determined by the problems it is trying to solve. There is useful data and not useful data. It is important to distinguish them and weed out not useful data. Collecting more than what is useful and needed is impractical.

Often data is accumulating before a set of goals has been outlined by stakeholders. It is being collected irrespective of any specific problem, question, or purpose. Data warehouses and processing tools such as Hadoop, NoSQL, InfoGrid, Impala, and Storm make it especially easy for companies to quickly attain large amounts of data. Companies are also at liberty to add on third-party data sources to enrich the profiles they already have, from companies such as Dun & Bradstreet. Unfortunately, most of the data, inevitably, is irrelevant. The key is to find data that pertains to the problem.

Big Data is nothing if not available, and it takes minimal effort to collect it. But unfortunately, it will not be of use to anyone if it’s not molded to meet the particular demands of those using it. Some people are under the impression that they are going to get a lot of information simply from having data. But businesses don’t really need Big Data - information and insight are what they need. While a vast amount of data matter might be floating around in the physical and digital universes, the information it contains may be considerably less substantial.

While it might seem advisable to collect as much information as possible, some of that information just might not be relevant. Relevant insights, on the other hand, allow companies to act on information and create beneficial changes.

It is a good idea to set parameters for data collection by identifying the right sources early on. It could be a combination of internal and external data sources. Determine some metrics that you monitor on an ongoing basis. Having the key performance indicators (KPIs) in place will help companies identify the right data sources, the types of data sources that can help solve their problems.

Technology plays a key role in harnessing Big Data. Companies should figure out what kinds of technology make sense for them. Choice of technology should be based on company's requirements.

Data collection is an ongoing process that can be adjusted over time. As the business needs change, newer data sources are integrated, and newer business groups or lines of businesses are brought in as stakeholders, the dynamics and qualities of data collection will change. So this needs to be treated not as a one-time initiative, but as an ongoing program in which you continually enrich and enhance your data quality.

Companies should continually monitor the success of their data usage and implementation to ensure they're getting what they need out of it. There should be a constant feedback stream so that a company knows where it stands in relation to certain key metrics it has outlined.

Risks

Companies must always be aware of the risks involved in using data. Companies shouldn't use prescriptive analytics when there is significant room for error. It takes good judgment, of course, to determine when the payoffs outweigh the potential risks. Unfortunately, it's not always possible to get a prescriptive read on a situation. There are certain limitations. For one thing, collecting hard data from the future is impossible.

People and Processes

Big Data adoption often becomes a change management issue and companies often steer clear of it. When a company implements something that's more data-driven, there's a lot of resistance to it.

Like most initiatives that propose technology as a central asset, Big Data adoption can create conflicts among the various departments of an organization. People struggle to accept data, but people also aren’t willing to give it up. To avoid such clashes, companies should make it clear from the outset which department owns the data. Putting the owner in charge of the data, having this person or department outline the business rules and how they should be applied to customers would be helpful to overcome this issue.

These are two good tips to follow: Give credit where credit is due and don't dehumanize the job. Don’t attribute the success to the data, but to the person who does something with the data. Remember that change can't just come from the top down. Big Data adoption requires more than executive support. It needs buy-in from everyone.

Saturday, July 23, 2016

Successful Change Management in Content and Knowledge Management

It is getting more unlikely to find paper documents in filing cabinets or electronic documents in shared network drives. Filing cabinets and shared network drives have been replaced by content management systems, knowledge base application, and collaboration tools in majority of organizations.

At a certain point, it's inevitable that organizations have to make adjustments to keep up with the times. users must constantly adapt to the tools of an evolving world. After all, if customers are using advanced technology, it makes sense that companies should be interacting with them using tools that are up to date as well.

If technology adoption is to have an effect on an organization, users' commitment becomes a required element. But getting that kind of cooperation is not always a simple task. Users might not immediately take to the new processes without some resistance.

Though it's counter-intuitive that anyone would resist technology designed to make their job easier, the resistance is unavoidable element of content and knowledge management initiatives. Organizations should anticipate a number of challenges and do their best to ease their users's resistance through the transition and change management.

Drawing from our 16 years of experience successfully managing user adoption and change management in content and knowledge management initiatives, these are our guidelines for organizations to overcome challenges in these areas.

1. Communicate the Goals

There may be myriad practical reasons for why the change in how your organization manages its content needs to be put in place. Before proposing any major change, establish clear reasons for why the change is being proposed, and how it is going to enhance users' experience.

For users to understand how technology is going to help them, they need to understand what their future will look like with this technology in place. What it amounts to is if you can't articulate the benefits of making the change, you have no business of making this change.

It is very important to create a consistent narrative that instills confidence in users as well as the language you use to deliver this narrative to users. Avoid using the term "change management". The reason is that employees hear "change management" as "Whatever you have done until now is wrong, and now we are going to put you on the right track." That is not a good message.

You may want to use the term "cause management" which attributes any need for adjustment within a company to a cause. Under this approach, organizations would make an effort to craft a story that communicates the idea that this is the outcome that will best benefit the company.

Highlighting what is not going to be changing can be a source of encouragement for users. This way you are introducing a consistency while asking users to evolve.

2. Fear of Change is not Necessarily Fear of Technology

Technology itself is not usually the reason that employees are resistant to change. People are becoming less resistant to using technology. Problems begin to surface when employees are not given enough notice about what technology they are expected to use.

Even before it's been decided which technology organizations have settled on, organizations should give their employees an outline of the problems they are trying to fix. This would give them the opportunity to provide input and make suggestions about what types of processes they would like to see streamlined and how they envision their ideal work environment. Though organizations might not always have the budget for what the employees have in mind, they will at least be involving them and making them feel as though they are part of the equation from the outset.

Also important is that workers are given the time to develop the kinds of skills necessary to make full use of technology. It takes employees some conditioning to see how new technology and procedures can be of aid to them. If you can be proactive about teaching people these new skills and how to use the technology in small segments, this definitely can accelerate the change.

3. New Technology Can Bruise the Ego

All employees are proud of their work. They like to feel as though they possess an innate talent, and that there's a reason they're doing what they've chosen to dedicate so much of their time to. Regardless of age or experience level, there are certain natural emotions that might come into play when companies are proposing changes. If employees are led to believe that so much of what they spent a great deal of time mastering can be transferred to anyone with an ease, they might resent it on an emotional level that they might not even share.

Thus, it would be a good idea to communicate how the technology is going to help them work together and be more connected.

4. Technology is not Only for Managers

It goes without saying that technology should never force people to do more work than they are already doing. If you force people to use a system that is making their jobs worse, they' are going to do everything they can to avoid it.

Employees should never feel as though technology is being deployed solely for the benefit of the managers. Granted, content management system provides managers with more visibility into work processes, but the central message managers should be sending is that the technology is there to help employees do their jobs better.

It is helpful to illustrate that higher management is using the technology as well, for the sake of driving home the idea that the technology is being universally adopted by the organization.

5. Deploy Gradually

When it comes to deploying the systems that employees are going to be using regularly over an extended period of time, it is a good idea to steer clear of an abrupt implementation in favor of a more gradual one.

Use Pilot Periods. During these periods, a small subset of the company is selected to test the technology and share its experiences with the others. Keeping employees updated via email, meetings, or through other internal communication channels can be helpful, as it also lets people know what to expect. Likewise, getting user testimonials and videos in which those who have piloted the product attest to its benefits could prove useful.

However, it's important to be all-inclusive when deciding who is going to be participating in such trial periods. While it might be tempting to recruit the most enthusiastic and vocal representatives of a company to test the materials, it might be a better idea to go for a mix to act as testers. Use a subset of users that will represent those who are ultimately going to be expected to use the new technology. Of course, asking volunteers to step forward is advised, but testers should also be drawn from a segment of those who are not as keen on trying it.

Including people who are not technology experts is a good idea, because it helps drive home the point that anyone can use the solution effectively. It also reinforces the idea that there will be support and training opportunities available.

If the right group of people is selected for the pilot program, they can generate excitement about the system and show how the program has helped them do their jobs.

One small factor to keep in mind about the pilot period, however, is the capacity of the system. Since the entire program will eventually be inhabited by more users, the experience that the small subset reports might differ from the one that is waiting further down the line. For example, a system that works fine when you have ten users on it may not work as quickly when there are 200,000 users connected to it. You need to be able to account for things like that.

6. Maintain the Change

Change management is not as simple as preparing employees for the transition that is about to be introduced. It has just as much to do with ensuring that employees don't revert to outdated and inefficient methods as it does with ensuring that people begin to use it. Managing resistance is a process, not a series of events.

Because it's a process, managers should be very careful to communicate the fact that the improvements might not come all at once, but rather in small increments. Incentives can also act as fruitful aids in encouraging adoption. For this very reason, gamification applications have been gaining popularity because they allow employees to compete against one another and display to the rest of the company how well they have done by showing off their achievements.

It is important to build employees confidence and a positive environment. Set specific event days to encourage the use of the new technology. Typically held once a month, these are known as blitz days. The idea is to set aside a time period during which everybody is forced to use the technology in a fun environment. At the end of the day, the users share those results. The goal is to say that if this can be done on one particular day, why can't it be done every day? Over time, the benefits of these events could be substantial.

Change is ongoing. As time goes on, the window for change for technology is becoming much narrower than it used to be, with updates occurring far more frequently. For some people, it might seem that just as they are getting used to one change, another one is on the way. Organizations need to create an infrastructure that better supports that.

Characteristics for Driving Change
  • Be outwardly focused - avoid being locked into one area of the company. Look for ways to make an impact across organizations.
  • Be Persuasive - be clever and persuasive enough to gain the support of users.
  • Be Persistent - do not give up. Constantly work through the channels of the organization to ensure that new systems and processes are factored into organizations' way of work.

Sunday, June 26, 2016

Better Business Operations with Better Data

Businesses today understand that data is an important enterprise asset, relied on by employees to deliver on their customers' needs, among other uses of data such as making business decisions and many others.

Yet too few organizations realize that addressing data quality is necessary to improve customer satisfaction. A recent Forrester survey shows that fewer than 20% of companies see data management as a factor in improving customer relationships. This is very troubling number.

Not paying attention to data quality can have a big impact both on companies and the customers they serve. Following are just two examples:

Garbage in/garbage out erodes customer satisfaction. Customer service agents need to have the right data about their customers, their purchases, and prior service history presented to them at the right point in the service cycle to deliver answers. When their tool sets pull data from low-quality data sources, decision quality suffers, leading to significant rework and customers frustration.

Lack of trust in data has a negative impact on employees productivity. Employees begin to question the validity of underlying data when data inconsistencies and quality issues are left unchecked. This means employees will often ask a customer to validate product, service, and customer data during an interaction which makes the interaction less personal, increases call times, and instills in the customer a lack of trust in the company.

The bottom line: high-quality customer data is required to support every point in the customer journey and ultimately deliver the best possible customer experience to increase loyalty and revenue. So how can organizations most effectively manage their data quality?

While content management systems (CMS) can play a role in this process, they can't solve the data-quality issue by themselves. A common challenge in organizations in their content management initiatives is the inability to obtain a complete trusted view of the content. To get started on the data-quality journey, consider this five-step process:

1. Don't view poor data quality as a disease. Instead, it is often a symptom of broken processes. Using data-quality solutions to fix data without addressing changes in a CMS will yield limited results. CMS users will find a work-around and create other data-quality issues. Balance new data-quality services with user experience testing to stem any business processes that are causing data-quality issues.

2. Be specific about bad data's impact on business effectiveness. Business stakeholders have enough of data-quality frustrations. Often, they will describe poor data as "missing," "inaccurate," or "duplicate" data. Step beyond these adjectives to find out why these data-quality issues affect business processes and engagement with customers. These stories provide the foundation for business cases, highlight what data to focus on, and show how to prioritize data-quality efforts.

3. Scope the data-quality problem. Many data-quality programs begin with a broad profiling of data conditions. Get ahead of bottom-up approaches that are disconnected from CMS processes. Assess data conditions in the context of business processes to determine the size of the issue in terms of bad data and its impact at each decision point or step in a business process. This links data closely to business-process efficiency and effectiveness, often measured through key performance indicators in operations and at executive levels.

4. Pick the business process to support. For every business process supported by CMS, different data and customer views can be created and used. Use the scoping analysis to educate CMS stakeholders on business processes most affected and the dependencies between processes on commonly used data. Include business executives in the discussion as a way to get commitment and a decision on where to start.

5. Define recognizable success by improving data quality. Data-quality efforts are a key component of data governance that should be treated as a sustainable program, not a technology project. The goal is always to achieve better business outcomes. Identify qualitative and quantitative factors that demonstrate business success and operational success. Take a snapshot of today's CMS and data-quality conditions and continuously monitor and assess them over time. This will validate efforts as effective and create a platform to expand data-quality programs and maintain ongoing support from business stakeholders and executives.

Galaxy Consulting has over 16 years experience helping organizations to make the best use of their data and improve it. Please contact us today for a free consultation!

Sunday, June 12, 2016

Hadoop Adoption

True to its iconic logo, Hadoop is still very much the elephant in the room. Many organizations heard of it, yet relatively few can say they have a firm grasp on what the technology can do for their business, and even fewer have actually implemented it successfully at their organization.

Forrester Research predicted that Hadoop will become a cornerstone of the business technology agenda at most organizations.

Scalability, affordability, and flexibility make Hadoop uniquely suited to change the big data scene. An open-source software framework, Hadoop allows for the processing of big data sets across clusters on commodity hardware either on-premises or in the cloud.

At roughly one-thirtieth the cost of traditional data storage and processing, Hadoop makes it realistic and cost effective to analyze all data instead of just a data sample. Its open-source architecture enables data scientists and developers to build on top of it to form customized connectors or integrations.

Typically, data analysis requires some level of data preparation, such as data cleansing and eliminating errors, outside of traditional data warehouses. Once the data is prepared, it is transferred to a high-performance analytics tool, such as a Teradata data warehouse. With data stored in Hadoop, however, users can see "instant ROI" by moving the data workloads off of Teradata and running analytics right where the data resides.

Other use of Hadoop is for live archiving. Instead of backing up data and storing it in a data recovery system, such as Iron Mountain, users can store everything in Hadoop and easily pull it up whenever necessary.

The greatest power of Hadoop lies in its ability to house and process data that couldn't be analyzed in the past due to its volume and unstructured form. Hadoop can parse emails and other unstructured feedback to reveal similar insight.

The sheer volume of data that businesses can store on Hadoop changes the level of analytics and insight that users can expect. Because it allows users to analyze all data and not just a segment or sample, the results can better anticipate customer engagement. Hadoop is surpassing model analytics that can describe certain patterns and is now delivering full data set analytics that can predict future behavior.

There are few challenges.

Hadoop's ability to process massive amounts of data, for example, is both a blessing and a curse. Because it's designed to handle large data loads relatively quickly, the system runs in batch mode, meaning it processes massive amounts of data at once, rather than looking at smaller segments in real time. As a result, the system often forces users to choose between quantity and quality. At this point in Hadoop's life cycle, the focus is more on enormous data size than high-performance analytics.

Because of the large size of the data sets fed into Hadoop, the number-crunching doesn't take place in real time. This is problematic because as the time between when you input the data and the time at which you have to make a decision based on that data grows, the effectiveness of that data decreases.

The biggest problem of all is that Hadoop's seeming boundlessness instills a proclivity for data exploration in those who use it. Relying on Hadoop to deliver all the answers without asking the right questions is inefficient.

As companies begin to recognize Hadoop's potential, demand is increasing, and vendors are actively developing solutions that promise to painlessly transfer data onto Hadoop, improve its processing performance, and operationalize data to make it more business-ready.

Big data integration vendor Talend, for example, offers solutions that help organizations transition their data onto Hadoop in high volume. The company works with more than 800 connectors that link up to other data systems and warehouses to "pull data out, put it into Hadoop, and transform it into a shape that you can run analytics on.

While solutions such as those offered by Talend make the Hadoop migration more manageable for companies, vendors such as MapR tackle the batch-processing lag. MapR developed a solution that enhances the Hadoop data platform to make it behave like enterprise storage. It enables Hadoop to be accessed as easily as network-attached storage is accessed through the network file system; this means faster data management and system administration without having to move any data.

Veteran data solution vendors such as Oracle are innovating as well, developing platforms that make Hadoop easier to use and to incorporate into existing data infrastructures. Its latest updates revolved around allowing users to store and analyze structured and unstructured data together and giving users a set of tools to visualize data and find data patterns or problems.

RapidMiner's approach to Hadoop has been to simplify it, eliminate the need for end users to code, and do for Hadoop analytics what Wordpress did for Web site building. Once usable insights are collected, RapidMiner can connect the data platform to a marketing automation system or other digital experience management system to deploy campaigns or make changes based on data predictions.

Moving forward, analysts predict that leveraging Hadoop's potential will become a more attainable goal for companies. Because it's open-source, the possibilities are vast. Hadoop's ability to connect openly to other systems and solutions will increase adoption in the coming months and years.

Sunday, May 29, 2016

Oracle Knowledge Management

As customer expectations rise, delivering personalized experiences that improve customer loyalty, increase customer acquisition and optimize efficiency is increasingly more challenging to achieve. It is very important to engage customers in their preferred channel and to minimize the overall effort of that engagement.

A key to minimizing the customer effort is to deploy a knowledge management platform that crosses all channels, presents accurate content from multiple sources, maintains, relevance, and captures feedback for continuous improvement. Oracle Knowledge is a complete knowledge management solution which provides personalized cross-channel service and support.

Knowledge Platform

Oracle Knowledge Platform is an integrated set of knowledge management capabilities including advanced natural language processing, search, flexible authoring and publishing, rich analytics, customizable self-service, and agent-facing knowledge applications. Oracle Knowledge is built on a highly scalable J2EE architecture and on Oracle technologies including WebLogic, Oracle Data Integrator, and Oracle Business Intelligence.

Semantic Search

The Oracle Knowledge semantic search capabilities are built on a fundamental understanding of language. Core language dictionaries, covering everyday terminology, are available in 20 languages. In addition, multilingual industry dictionaries are available for major industries including high tech, telecommunications, insurance, finance, and automotive.

This core understanding of a user's language is key to finding precise answers across multiple sources, including the knowledge base, web sites, file systems, and other internal knowledge repositories. The most recent release of Oracle Knowledge builds on this foundation with widely expanded language and geography coverage; significantly increased performance and a reduced footprint, with faster search response times, faster content processing, and a smaller semantic index; and learning-based search ranking that reduces incident handling time. These improvements deliver increased productivity and lower operating costs.

Authoring, Publishing, and Workflow

Oracle Knowledge is designed to help companies develop a knowledge base as an integral part of a user's job. Contact center agents and customers create content as a by-product of solving support issues, using a powerful, web-based, WYSIWYG rich text editor. Product experts and contact center agents can collaborate with other users and customers to refine or expand the knowledge base.

Advanced editing capabilities such as global find-and-replace and replacement tokens improve article accuracy while lowering operations and knowledge administration costs. Oracle Knowledge also includes valuable tools to manage the life-cycle of articles: customers can create their own article templates and metadata, the software tracks all revisions of an article and provides a detailed history, and articles can be routed for approval through a workflow. Letting users attach files to forum posts allows them to provide additional information to explain their issues. Together, these capabilities improve self-service rates while expanding the knowledge base and the user community.
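The article life-cycle described above, with revision tracking and approval routing, can be sketched as a simple state machine. The class and method names below are illustrative assumptions, not the Oracle Knowledge API:

```python
# Hypothetical sketch of an article life-cycle: revision history plus a
# draft -> in_review -> published approval workflow. Illustrative only.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Article:
    title: str
    body: str
    state: str = "draft"
    revisions: List[str] = field(default_factory=list)  # detailed history

    def edit(self, new_body: str) -> None:
        self.revisions.append(self.body)  # keep every prior revision
        self.body = new_body
        self.state = "draft"              # edits reopen the workflow

    def submit(self) -> None:
        self.state = "in_review"          # route for approval

    def approve(self) -> None:
        if self.state != "in_review":
            raise ValueError("only articles in review can be approved")
        self.state = "published"

article = Article("Reset password", "Draft text")
article.edit("Step-by-step reset instructions")
article.submit()
article.approve()
```

The same pattern extends naturally to rejection, locale variants, or per-template metadata.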

The user interface of the authoring system is available in 24 languages, but content can be created in nearly any language. Oracle Knowledge lets customers manage an article's relationships across different locales and languages, while giving authors the ability to develop locale-specific content and metadata to fine-tune the customer experience.

Analytics

Analytics dashboards are tailored to functional roles across the service organization, giving stakeholders relevant insights at a glance to reduce operational costs, increase employee productivity, and strengthen customer relationships. With a configurable wizard for creating custom KPIs with thresholds and triggers, organizations can author content more efficiently, increase answer relevancy, and improve overall insight into knowledge activity.
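The idea of a KPI with a threshold and a trigger can be sketched minimally as follows. This is a generic illustration of the pattern under assumed names, not how Oracle Knowledge's KPI wizard is implemented:

```python
# Minimal sketch of a KPI whose triggers fire when a threshold is crossed.
# Names and structure are illustrative assumptions only.
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class KPI:
    name: str
    threshold: float
    triggers: List[Callable[[float], None]] = field(default_factory=list)

    def record(self, value: float) -> None:
        # Fire every registered trigger once the threshold is exceeded.
        if value > self.threshold:
            for trigger in self.triggers:
                trigger(value)

alerts = []
stale_articles = KPI(
    name="stale_articles",
    threshold=50,
    triggers=[lambda v: alerts.append(f"review backlog: {v}")],
)
stale_articles.record(42)  # below threshold: no trigger fires
stale_articles.record(73)  # above threshold: trigger fires
```

In a real dashboard the trigger would notify an author or open a task rather than append to a list.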

InfoCenter: Self-Service Knowledge

InfoCenter provides a knowledge portal for customers and employees with integrated browse and search functionality, delivered through a customizable user interface and knowledge widgets. InfoCenter surfaces the platform's industry-based libraries, knowledge federation, and natural language processing to deliver the best possible intent-based answer to customers. It transforms the self-service experience by providing contextual, relevant answers to customers' questions.

iConnect: Agent Knowledge

iConnect provides a robust, scalable answer-delivery framework for the agent-facing service delivery model. Its context-driven user interface simplifies and enhances the user experience and is tuned for increased performance. iConnect is available as an out-of-the-box integration with Oracle Service and Oracle Service Cloud, and open APIs allow integration with most industry-standard CRM applications.

AnswerFlow: Guided Knowledge

AnswerFlow provides consistent service resolution for agents and customers through the prescriptive delivery of knowledge. It combines decision trees with external data, increasing the strategic value of the Knowledge Platform across self-service and assisted-service customer interaction channels. AnswerFlow enables organizations to create and deploy automated interactive processes that guide users toward appropriate answers or solutions in cases where:
  • answers are conditional, and can vary based on factors such as account status, location, or specific product or model;
  • diagnosis is complex, and identifying the best response among many possible answers involves asking detailed questions and eliminating alternatives.
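A guided flow of this kind boils down to walking a decision tree whose internal nodes ask a question and whose leaves hold answers. The sketch below is a generic illustration under assumed names, not AnswerFlow's actual data model:

```python
# Minimal sketch of a guided, conditional answer flow. Internal nodes are
# dicts that ask a question and branch on the response; leaves are answer
# strings. Illustrative only -- not AnswerFlow's real structure.
def resolve(node, responses):
    """Walk the tree, answering each question from the responses mapping."""
    while isinstance(node, dict):
        answer = responses[node["question"]]
        node = node["branches"][answer]
    return node  # reached a leaf: the final answer text

tree = {
    "question": "account_status",
    "branches": {
        "active": {
            "question": "product",
            "branches": {
                "router": "See the router troubleshooting guide.",
                "modem": "Power-cycle the modem, then retest.",
            },
        },
        "suspended": "Contact billing to reactivate your account.",
    },
}

result = resolve(tree, {"account_status": "active", "product": "modem"})
```

Because answers are conditional on factors such as account status or product model, each path through the tree eliminates alternatives until one resolution remains.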

Galaxy Consulting has over 16 years of experience with many knowledge base applications. We can help you deploy Oracle Knowledge Management. Contact us today for a free consultation!