Wednesday, May 31, 2017

Specialized Strategies for Enterprise Search

Enterprise search is the practice of making content from multiple enterprise sources, such as databases and intranets, searchable to a defined audience. "Enterprise search" is also used to describe the software used to search for information within an enterprise.

Enterprise search systems index data and documents from a variety of sources such as file systems, intranets, content management systems, e-mail, and databases. Many enterprise search systems integrate structured and unstructured data in their collections. Enterprise search systems also use access controls to enforce a security policy on their users.

Enterprise search as a standalone application for locating documents throughout the enterprise is still going strong, but many search engines are now embedded in applications that people use as their primary work environment. Search solutions are also used for more complex tasks, such as locating relevant information in databases or diverse document repositories.

New search tools are emerging for searching geospatial data and processing it with other types of information to provide an enriched and informative blend.

SAVO

One of the new enterprise search applications is SAVO.

SAVO focuses on driving greater sales productivity, and one way it does this is by providing the content that sales reps need, in context. Content is stored in such a way that users can access it either locally or remotely. The SAVO platform provides search capabilities in a content repository, which contains approved information designed to support salespeople as they do their jobs. The search function is configured with a profile for each salesperson to push out the information that will be most useful to each individual.

By using profiles, the overall selling methodology and model can be reinforced. The information pushed out might include a list of documents for a client meeting, a video clip, information by product line or a summary of facts about a competitor. To support the sales reps’ conversations, SAVO provides context-relevant information.

Although content for different phases of the sales cycle is predefined, if unexpected questions arise, sales reps need to have the right information at their fingertips.

By setting up rules for tagging and reviewing information as it is added to the repository, SAVO is able to keep the right information in front of the sales rep. All the information they need is in the backend. The strategy is not to present the rep with a blank search box. Instead, at the frontend, the rep is asked a series of questions, such as what product they are dealing with, what type of document they want, and whether it’s a customer-facing presentation or a brochure. In two or three clicks, they have reduced a potential list of 60 documents to about five.
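
This guided, facet-based narrowing can be sketched in a few lines. The documents and facet names below are invented for illustration; they are not SAVO's actual data model, just a minimal example of how each "click" applies one more filter.

```python
# Hypothetical content repository; records and facet names are invented.
documents = [
    {"title": "Model X Brochure", "product": "Model X", "doc_type": "brochure"},
    {"title": "Model X Pitch Deck", "product": "Model X", "doc_type": "presentation"},
    {"title": "Model Y Spec Sheet", "product": "Model Y", "doc_type": "datasheet"},
]

def narrow(docs, **facets):
    """Keep only documents matching every selected facet value."""
    return [d for d in docs if all(d.get(k) == v for k, v in facets.items())]

# Two "clicks": first product, then document type.
step1 = narrow(documents, product="Model X")
step2 = narrow(step1, doc_type="brochure")
print([d["title"] for d in step2])  # ['Model X Brochure']
```

Each question the rep answers simply adds one more facet constraint, which is why a handful of clicks can cut a list of 60 documents down to about five.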

Also searchable on the SAVO platform is previously tacit knowledge that has been captured in forums or in comments. Referred to as “tribal knowledge” by SAVO, the content is generated by employees who post questions and answers on a subject matter expert moderated forum within SAVO. That content can be sought on a proactive basis by the sales reps when they have a specific question. Between the profiled information and the responses to ad hoc queries, the sales reps are able to respond with accurate and timely information.

The search function is enabled on mobile devices as well as the desktop. Because sales is both a global and regional activity, SAVO supports multiple languages, including ideographic languages such as Japanese and Chinese, and the search function is available for all the foreign language versions of the product.

The search capability provided by SAVO is useful not only for salespeople in the field but also in onboarding. It helps new employees who are learning the ropes, because they can access instructional material, reference material and conversations posted by more experienced sales reps.

dtSearch

dtSearch is effective because it is a mature product with a very stable API.

Search applications with complex data models pose special challenges. One such example was a search application Contegra built for Carrier Corporation. The number of parts is large, and many parts have accumulated revisions over the years. Serial numbers, model numbers and other identifiers must be incorporated. Also, the system had to be able to identify what product the customer originally received, because that would affect the replacement part needed. In addition to searching the database, the application must be able to locate technical documentation, brochures and other files in a variety of formats.

dtSearch is able to search across SQL databases and PDF documents, and then find parts referenced within documents. Recognizing part numbers sounds like a simple task, but when you have 50,000 of them with complex relationships and want the search to be fast, using the right search tool is very important.
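
To see why part-number recognition is less simple than it sounds, consider a generic sketch. This is not dtSearch's actual engine; the pattern and sample text below are invented, and real part-numbering schemes (with revision suffixes, legacy formats and so on) quickly multiply the cases a recognizer must handle.

```python
import re

# Invented part-number convention: 2-3 letters, 4-6 digits,
# optional revision suffix, e.g. HVA-12345-R2.
PART_NO = re.compile(r"\b[A-Z]{2,3}-\d{4,6}(?:-R\d+)?\b")

text = "Replace compressor HVA-12345-R2 with HVA-12346; see manual CR-9901."
print(PART_NO.findall(text))  # ['HVA-12345-R2', 'HVA-12346', 'CR-9901']
```

At 50,000 parts, a recognizer like this is only the first step; the harder work is linking each extracted number to the correct revision and product relationships in the database.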

The application also accommodates the search needs of different users. Engineers see the world as a set of systems, such as engine and drive systems, which have specific code numbers. There are hierarchical taxonomies, model numbers and serial numbers. Retrieving the exact information for each installation is critical when the action being taken is a particular type of repair or service. Each user group may have a different perspective on the data, requiring it to be searched and viewed differently.

Particularly in specialized applications, the upfront work is critical. You would want to be sure that when the users do a keyword search, the information is presented in a meaningful context, so they can narrow it down to the exact part and the nature of the document, whether it is a technical publication, leaflet or catalog. A good content model enables both the field service staff and external customers to access information in a self-service mode, which cuts costs for the company.

Geospatial Search

Geospatial information has become increasingly important for many different applications and analyses, ranging from marketing to agriculture. Yet the management of geospatial information has lagged behind that of other kinds of information. Although maps and imagery can be stored in a repository like any other digital files and searched according to indexed metadata, the ability to perform more complex searches on the data and process it once retrieved has been limited.

Voyager Search is a search solution designed specifically to manage spatial information. It combines modern search technologies with a unique understanding of geospatial data.

It includes an indexing solution that can extract information from nearly any type of geospatial data, regardless of format. The product later added support for non-spatial documents such as PDF and various Microsoft Office formats. It also enriches data by linking documents to a map.

Users are able to define a geographical area on a map and then search for relevant information about it. By dragging a box on a map, the user can find all the river data or stream flow data in that area, for example, or related reports. Few general-purpose search engines offer this ability.
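
At its simplest, a "drag a box on a map" query is a bounding-box filter. The sketch below is a minimal illustration with invented records and coordinates; a production system like Voyager would use a spatial index rather than scanning every record.

```python
# Invented sample records with latitude/longitude coordinates.
records = [
    {"name": "Stream gauge A", "lat": 38.55, "lon": -121.47},
    {"name": "Stream gauge B", "lat": 40.10, "lon": -122.20},
    {"name": "River report C", "lat": 38.60, "lon": -121.50},
]

def in_bbox(rec, min_lat, max_lat, min_lon, max_lon):
    """True if the record falls inside the dragged box."""
    return min_lat <= rec["lat"] <= max_lat and min_lon <= rec["lon"] <= max_lon

box = dict(min_lat=38.0, max_lat=39.0, min_lon=-122.0, max_lon=-121.0)
hits = [r["name"] for r in records if in_bbox(r, **box)]
print(hits)  # ['Stream gauge A', 'River report C']
```

The user never types coordinates; the box drawn on the map supplies the four bounds, and the search engine returns only content georeferenced inside them.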

Voyager Search can manage very large quantities of spatial data. Making data available and accessible while keeping it secure is a big challenge. A geospatially enabled search solution is critical not only to providing online access to geospatial intelligence, but also to broadening the analytical expertise of the organization. Voyager accomplishes that by providing the tools to index a wide variety of content and to add a layer of geospatial intelligence to the index.

The process for narrowing results from a search of geospatial data differs from that of other types of data. Search results from a demographics database, for example, might be narrowed by selecting only one income category. In a spatial search, “refine your search” might mean “view only content in a specific geography” in order to look at one aspect of the results.

Voyager Search can also create new files derived from geospatial and image data. For example, a petroleum engineer could look for seismic shockwave data from a certain time interval for use in an analysis. Emergency responders might look for recent aerial photographs taken in their area and run them through an imagery analysis process to look for hot spots that would indicate the most recent fire perimeter. Voyager Search offers that capability, known as “geoprocessing,” which includes a variety of ways to manipulate geospatial data at the desktop.

These are just a few of the new search applications currently on the market. Galaxy Consulting has over 17 years of experience in enterprise search and enterprise search applications.

Sunday, April 30, 2017

E-Discovery Tools

Electronic discovery, or e-discovery, refers to discovery in legal proceedings such as litigation or government investigations where the information sought is in electronic format. The ever-increasing amount of litigation, greater volumes of data, and a move toward adding in-house e-discovery capabilities all require strong e-discovery tools.

Data is scattered throughout companies and has become progressively more difficult to manage. Companies are dealing with big data, data in shared repositories such as Box.com, data on mobile devices, etc.

Data must be protected during e-discovery just as it is during any other business activity. The degree of security risk depends on the nature of the data. Standard business contracts might not be highly sensitive and thus create minimal risk, but exposure of intellectual property that represents the crown jewels of a company could be a major risk.

When legal hold is used effectively, companies can meet their preservation duties and then do targeted collections as needed in the case. A good hold process plus targeted collections can significantly reduce the amount of information that must be reviewed by attorneys, which accounts for 70 percent of e-discovery costs.

Another value proposition in using an automated legal hold solution that is integrated with collections and first-pass review is the ability to re-purpose a collection.

Cloud offerings could be used to centralize all this data in one place for efficient reuse and risk management.

Several trends are contributing to strong growth in e-discovery tools. In addition to a group of large e-discovery vendors, many smaller vendors have products that are working well for their customers, and there is also room for new entrants that improve performance or address specific needs.

Each product has particular strengths, and that wide array offers options that can be used very selectively or in conjunction with each other to meet a company’s goals.

Legal holds are required when a company might reasonably anticipate litigation and therefore should not delete information that might be relevant to it.

Legal Hold Pro

This application has templates for hold notices and a database with the contact information of employees who are custodians of data. The system can also be used to track the information and people affected, automate the interviews with custodians, send reminders and release holds when appropriate. It allows users to check the information of terminated employees to see if it might be subject to a hold, and to review responses from custodians to create the collection plan.

The same collection and review tagging can be used again by adding only the incremental data generated since the original collection.

As a cloud product, Legal Hold Pro is quick and easy to launch, and is updated frequently.

Technology-assisted review (TAR)

Once a set of documents is located that may be responsive to the e-discovery request, it needs to be searched. The effective use of human skills in conjunction with computer capabilities is a key ingredient in reducing the volume of data that needs to be reviewed by attorneys or other legal professionals.

Technology-assisted review (TAR), also called predictive coding, is a method for training a computer to spot documents that may be relevant and distinguish them from those that are not.

Catalyst

Catalyst provides e-discovery software and services.

Catalyst Insight is a secure cloud-based platform where clients can search, review, mark and produce documents. It can be augmented with Insight Predict, a predictive ranking TAR 2.0 solution that uses continuous active learning (CAL) to speed the review process by allowing technology to work alongside the judgments that human reviewers make. The solution brings the most relevant documents to the top of the list rather than working in a linear fashion.

The company’s TAR 2.0 software is specially designed for e-discovery. Some of the early TAR products were re-purposed machine learning tools. They can work in situations where the target documents are a large proportion of the total, but if you are looking for the one percent that are ‘hot docs,’ then they are not as effective. With TAR 2.0, attorneys and legal professionals who are subject matter experts do the initial coding for relevancy. Each of their judgments about the relevancy of a document is fed back to the system as a means of “training” to identify others that also might be relevant.

In the case of earlier versions of TAR, adding new documents caused the random sampling assumptions to no longer be correct. Unlike earlier products, which had a finite learning phase and then a production phase, TAR 2.0 allows new coding to be immediately incorporated into the algorithm for searching the document repository so that it is correctly tuned to the current problem domain.
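
The continuous feedback loop described above can be illustrated with a toy sketch. This is not Catalyst's algorithm: the "model" below is just term overlap with documents already coded relevant, where a real TAR system uses a statistical classifier. The documents are invented. The point is the mechanism: every reviewer judgment immediately changes how the remaining documents are ranked.

```python
# Invented document collection, keyed by document ID.
docs = {
    1: "merger agreement pricing terms",
    2: "lunch menu office party",
    3: "pricing terms confidential merger",
    4: "holiday schedule",
}

relevant_terms = set()  # terms seen in documents coded relevant so far

def code(doc_id, is_relevant):
    """A reviewer's judgment is fed straight back into the ranking."""
    if is_relevant:
        relevant_terms.update(docs[doc_id].split())

def rank(remaining):
    """Order remaining docs by term overlap with relevant docs so far."""
    return sorted(remaining, key=lambda d: -len(set(docs[d].split()) & relevant_terms))

code(1, True)             # reviewer marks doc 1 relevant
print(rank([2, 3, 4]))    # doc 3 rises to the top: it shares three terms with doc 1
```

Because ranking consults the current state of `relevant_terms` on every call, newly added documents and new coding decisions are incorporated immediately, with no separate training and production phases.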

It allows every decision made by an attorney to be put to maximum use, letting humans do what they do best and the computer do what it does best, which is to quickly surface the relevant documents.

One practical limitation of early versions of TAR was that they could not handle small volumes of documents, because the usual percentage of samples did not provide enough examples from which the computer could learn. This was improved in later versions of the tool.

Recommind

In 2006, the federal rules for discovery changed to include discovery of electronic information. E-discovery includes the collection, processing and analysis of e-mail and other electronic documents that might be relevant to a case, including determination of whether the documents are indeed relevant.

What sets Recommind apart from many industry solutions is its ability to prioritize records and pull together similar records.

Recommind’s Axcelerate product can research, collate and assemble electronic records into reports. The electronic records for a single case can sometimes number into the millions.

Axcelerate’s adaptive batching expedites the feedback loop on search or analytics-based document sets, making continued batching not just automatic, but also conditional on the relevancy found through sampling. That enables a law firm to determine by batch if certain records are indeed relevant to a case, rather than reviewing them individually.

Magnum Software

Opus 2 Magnum allows users to quickly search, annotate and link to portions of documents. The collaboration capability is quite robust. Users can share their work product with any other users or groups of users via a one-click e-mail alert.

The alert automatically includes a direct link to the note and passage so the recipient can log in from anywhere, review the remarks and continue the discussion thread within Opus 2 Magnum. Additionally, multiple users can “chat” within the application.

The application performs best with smaller files rather than loading everything into one large database, but Magnum can also scale to larger file sizes.

Exterro

This is an excellent tool for eDiscovery. It covers eDiscovery and other records management needs in a single platform. A Genome data mapping module can be added, creating an excellent solution for data mapping.

With the increasing number of records and need to keep track of them and pull them together efficiently, the demand for KM technology for records and information management will continue to grow.

Galaxy Consulting has 17 years of experience in ensuring that the e-discovery process goes smoothly.

Sunday, January 22, 2017

Five Trends of Knowledge Management

Many issues affect knowledge management. The five most important are big data, cybersecurity, mobility, social analytics, and customer engagement.

The availability of big data has opened many options for understanding everything from customer preferences to medical outcomes.

Amidst all that data, concerns about security have grown, so cybersecurity is taking on new importance. Mobility has become pervasive and affects nearly every element in KM, while social analytics is providing insights at a personal level that were never possible before.

Finally, although those four factors feed into many KM objectives, enhancing customer engagement has taken a place at the top of the priority list for virtually every company and is likely to remain there for some time.

Big data

The most dramatic trend impacting knowledge management is harvesting and analyzing big data. An esoteric phenomenon just a few years ago with a new set of technologies and terminology, big data is now wrapped into the strategic plans of many organizations, and not just the big ones.

There are a few applications to help with this challenge.

One of them is Hadoop, which can help integrate complex sets of data to support business decisions and marketing efforts.

Actian Analytics Platform is a big data analytics solution that is accessible and affordable for small businesses, but also scalable to large ones. It can be used to target the right customers. It can also be used to generate an economic case for potential buyers.

For example, Yahoo uses Actian to segment millions of users across 10,000 variables, looking for clues that will help predict customer behavior. Amazon uses Actian to provide the core technology components for its cloud-based data warehouse.

The technology can pull together diverse data in near real time as it flows through the data pipeline, supporting marketing, customer engagement, risk assessment and many other applications. At both ends of the spectrum, from startups to large-scale users, big data is the central force in converting large amounts of data into decision-supporting information.

Cybersecurity

With so much information at large, unauthorized access to it has the potential to be destructive. Knowledge management is focused on information; what makes KM so important is that people can find information and analyze it better. In the past, it was hard to find out who was buying products and how they felt about them. Now an enormous amount of information is available, which has benefits. But that information can also be stolen and used for financial gain.

The cybersecurity market is expected to increase from $95.6 billion in 2014 to $155.7 billion by 2019, a compound annual growth rate of 10.3% over that period. This amount includes network, endpoint, application, content and wireless security as well as many other types of technology. Innovative products are emerging in response to increased threats.

The volume of data, including an entire new collection from the Internet of Things, the challenges of mobile devices, greater use of the cloud for data storage and the broad impact of consumer concern are all sparking the growth.

Cybercrime comes in many forms, from stealing credit card numbers out of a merchant’s database to identity theft of consumers. A common strategy is for a cyberthief to obtain some publicly available information about an individual and use it to open an account or figure out a password that provides them access to an account. Users need to be vigilant about changing their passwords and making them strong. Technological safeguards can be put into place, but security depends a great deal on the human effort.

Mobile devices add another element of risk. They are much easier to lose or to steal, and often contain sensitive information such as bank passwords. Technological advances such as the ability to remotely disable a phone will continue to emerge to protect users from the impact of cybertheft. However, the result of users being careless with physical security, such as leaving a laptop in an unlocked car, remains a threat.

Companies can mitigate the impact on their customers by limiting the responsibility of users in the event of fraud or identity theft. Industries are growing up around providing insurance for such scenarios, either to the merchant or the customer.

Mobility

Although mobility brings hazards, it has brought even more advantages, and it will continue to drive the pervasiveness of knowledge management. Increasingly, knowledge management solutions, including content management, process management and analytics, have mobile versions of the solution. No longer a miniaturized version of the desktop browser, mobile apps are delivering usable KM applications.

Mobility is also forging new paths. For example, Apple Pay allows use of the smartphone as a wallet.

One mistake merchants make in designing mobile apps is to try to duplicate a physical purchase experience on a mobile device. Merchants should not necessarily automate an existing process, but instead should look at the experience holistically. Mobile experiences have to be simpler and as good as, if not better than, the non-mobile experience in order to gain loyalty from the customer.

Barriers remain in the use of mobile devices for enterprise applications, but the barriers also represent opportunities. In a study of U.S. and U.K. information technology decision makers conducted by Vanson Bourne, respondents reported that although more than 400 enterprise applications were typically deployed in each organization, only 22% of them could be easily accessed on mobile devices.

One reason for that is the diversity of enterprise applications. Some are custom, some are SaaS and some are off-the-shelf, and the technology for accessing each one is different. Therefore, development of mobile apps for such applications is needed, but organizations are hampered by the high cost. More efficient development techniques would be a big benefit.

The proliferation of mobile devices has also spurred growth in a number of other supporting sectors besides mobile application management (MAM), including mobile content management (MCM) and mobile device management (MDM). Each of them has a touchpoint to knowledge management and should be viewed in conjunction with an overall KM strategy.

Social analytics

Social analytics is a booming market, expected to triple over the next five years to nearly $9 billion, a growth rate of nearly 25% per year. Initially based on simple counts of the number of times a brand was mentioned in social media, analytics has evolved to the point where sophisticated algorithms support the use of social data for targeted marketing and for initiating customer service.

Social analytics has moved from hindsight to insight and now to foresight, with predictive capabilities. SAS social media solutions include integration and storage of social data, general text analytics and analysis of comments for sentiment, and a social conversation module that can work directly or integrate with third-party engagement solutions.

Real-time analysis allows marketing or brand campaigns to be synchronized with the topic threads that are emerging. Decision trees allow ‘what-if’ scenarios such as the impact of increasing the frequency of an ad, or combining customer segments. These analyses allow users to determine the relationships among various factors and to present visualizations of the relationships for better marketing decisions.

The value of social media analytics is also increased by combining it with data such as purchasing information from the data warehouse to compare customers’ stated intentions with actual behavior. There is tremendous growth in analyzing social media information along with data from the Internet of Things, which measures physical activity, to build a profile not just of transactions but of tone and behavior along the customer journey.

Social media analytics should not be isolated. The information should be tightly connected to upstream data so different departments can use it to drive the customer experience.

Customer engagement

The driving force for all of the above is customer engagement - collecting and managing big data, keeping information secure, enabling mobility and analyzing social media inputs. The ultimate goal is to engage the customer, whether for marketing, customer support, participation in loyalty programs or some other outcome.

The key for customer engagement is omni-channel. Whether the interaction is initiated by the customer or the organization, customers want options in the delivery channels.

Customer engagement is not a static business area. The feedback obtained through social analytics and traditional business intelligence can be merged to explain both what customers are doing and why. That information can guide the delivery of marketing materials and help provide better customer service.

Galaxy Consulting has 17 years of experience in knowledge management. We have led knowledge management initiatives. Contact us for a free consultation.

Saturday, December 31, 2016

Search in the Land of Information Silos

Information access and retrieval within most organizations is a work in progress. There might be a general search system for marketing information, and probably one or more database search systems.

The larger the organization, the greater the number of information retrieval systems. Each laptop and mobile device has a search system. Mobile phone apps sport their own search systems. The lawyers in an organization may have different search systems for specific types of legal matters. The enterprise resource planning (ERP) users have a search system. When it comes to enterprise search, there are many silos.

A “silo” is a content collection available only to certain users. Given the reality of silos, providing access to “all” information may be an impractical idea. “All” may not mean all, or even most, of the available information. Big data is easy to talk about but difficult to make accessible. The same challenge exists for images, audio recordings, and engineering drawings whose details are hidden in a proprietary system’s database.

Search that is variously called universal, unified or federated search offers a solution to the challenge of information silos. The term meta-search is often used to describe an integrating function that passes the user’s query across discrete content indexes and returns a single results list to the user. Endeca, Inxight Software, Northern Light, Sagemaker and Vivisimo are search applications that can be used for universal, unified or federated search in an organization.
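
The meta-search integrating function can be sketched as follows. The two backend functions below are invented stand-ins for real silo connectors; the point is the fan-out of one query to every index and the merge into a single ranked results list.

```python
# Hypothetical per-silo search backends; real connectors would call an
# intranet index, a CRM database, a file share index, and so on.
def search_intranet(query):
    return [("intranet", "Policy handbook", 0.9)] if "policy" in query else []

def search_crm(query):
    return [("crm", "Policy renewal notes", 0.7)] if "policy" in query else []

def federated_search(query, backends):
    """Pass the query to each silo's index and merge into one list."""
    results = []
    for backend in backends:
        results.extend(backend(query))
    # Single results list, best score first, regardless of source silo.
    return sorted(results, key=lambda r: -r[2])

hits = federated_search("policy", [search_intranet, search_crm])
print([title for _, title, _ in hits])  # ['Policy handbook', 'Policy renewal notes']
```

In practice, merging is the hard part: each silo scores relevance differently, so the scores must be normalized before one ranked list can honestly be presented to the user.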

The initial query might not unlock the information stored in the system’s index. The facets, topics and suggestions make it easy for the user to click through the links without having to craft additional queries.

Behind the curtains, federated search results require some maintenance. A user does not want to know the file format in which the information he or she needs is stored. The user wants answers. Early federating systems like WAIS relied on standards for content representation. Today, however, there are many “standards,” and content processing systems must be able to process content in the hundreds of formats found in organizations.

It is important to deliver a system that makes an organization’s disparate types of digital content available.

There are barriers to unified, federated or integrated search.

Some digital content cannot be included in a general purpose search system for security, business or legal reasons. Technical content such as chemical structure information at a pharmaceutical company requires special purpose systems. The same need applies to product manufacturing data, legal information and engineering drawings.

Most search applications exclude video streams from the index. If video is indexed, the system processes the text included in the digital file or indexing provided by the video owner.

The cost of creating connectors to connect with certain content types could be too high, or license fees could be required to gain access to the file formats.

The computational burden required to process certain types of content might exceed the organization’s ability to fund the content processing. Big data, for example, requires a computing capability able to handle the Twitter stream, RSS feeds and telemetry data from tracking devices. Cost could be prohibitive for processing all content types.

The most important challenge is the need for confidentiality. The legal department does not want unauthorized access to information related to a legal matter out of its control.

Some government contracts require that, for certain types of government work, the information related to the project be kept separate. Common sense dictates that plans for a new product and its pricing remain protected. If someone needs access to that information, a different search system may be used to ensure confidentiality.

Even in the absence of business or legal requirements, some professionals do not want to share content. That may be a management problem. When a manager locks up information in a no-access silo, the indexing scripts simply skip the flagged server.

To summarize, silos of information present a challenge for organizations to process and use effectively. In the enterprise, integration should take place across silos of content.

Galaxy Consulting has 17 years of experience in integrating information silos using universal, unified or federated search. We have experience with these search applications. Contact us for a free, no-obligation consultation!

Wednesday, November 30, 2016

Three Values of Big Data

Big Data is everywhere. But to harness its potential, organizations should understand the challenges that come with collecting and analyzing Big Data. 

The three values that are important in managing big data are volume, velocity, and variety. These three factors serve as guidance for Big Data management, highlighting what businesses should look for in solutions.

But even as organizations have started to get a handle on these three V’s, two other V’s, veracity and value, are just as important, if not more so.

Volume is the ability to ingest, process, and store very large data sets. The definition of "very large" can vary by business and depends on the particular circumstances of the business problem, as well as the volumes that business has previously handled.

Volume can also be defined as the number of rows, or the number of events that are happening in the real world that are getting captured in some way, a row at a time. Accordingly, the more rows that you have, the bigger the data set is going to be.

Bigger Volumes, Higher Velocities

In today’s digital age, having huge volumes of data is hardly rare. The proliferation of mobile devices ensures that companies can gather more data on consumers than ever before, and the rise of the Internet of Things will only increase this plethora of data. Moreover, businesses will have even more information on customers as they begin to use one-on-one messaging channels to interact directly with them.

The sheer volume of data available to us is greater than ever before. In fact, in many ways, nearly every human action can be quantified and logged in a bank of data that’s growing at an incredibly fast rate. All of this data can be turned into actionable insights that drive business decisions and can help transform every customer interaction, create operational efficiency, and more.

This increase in data volume is paired with a simultaneous increase in speed: both the volume of data and the rate at which it arrives are growing. These increases have forced IT staff to spend more time figuring out how to process and analyze that data.

Velocity is the key V of the three V’s. For example, a customer will visit a company’s site or use its mobile application but only for a short amount of time. The business may have just seconds to gather customer information and deliver a relevant response based on that information, usually just one message or offer.

This quick turnaround requires you to process all of that real-time behavioral data as fast as possible. If you only learn the day after that a customer was on your Web site, you are no longer able to reach them. One aspect of a successful customer journey is being able to send the right message at the right time to the right customer. Timeliness and relevancy are the foundation of delivering personalized customer experiences in real time.

A Variety of Formats

Data sets come in a variety of formats, and the number of data types continues to grow. Radio-frequency identification (the use of electromagnetic fields to gather information from tags attached to objects), smart metering (devices that monitor information on energy consumption for billing purposes), and the ubiquity of mobile devices with geo-location capabilities are only a few examples of diverse sources of consumer information.

All of these technologies have their own methods of capturing and publishing data, which adds to the complexity of the information environment.

But overcoming these data complexities could be well worth it. Having a large variety of data is crucial for creating a holistic customer view. Access to data such as a customer’s purchasing history, personal preferences based on social media postings, exercising habits, caloric intake, and time spent in the car can help companies understand that customer on a deeper level, and thus build experiences that are tailored to that customer.

But this diversity of data sources can be a blessing and a curse. A blessing because businesses have an increasingly large range of channels from which to pull customer information, but a curse because it can be difficult to filter through that information to find the most valuable content.

Variety is somewhat overstated in discussions of Big Data. Audio and video are examples of data formats that can be particularly difficult to analyze. Usually, companies try to come up with an intermediate representation of that data, and then apply old or new algorithms to that representation to extract signals, however signal is defined for the business problem they're trying to solve.

Volume, velocity, and variety are undoubtedly important to managing customer information, but companies should also keep in mind other important aspects of Big Data if they want to make the most of it.

Data tools such as Apache Hadoop and Apache Spark have enabled new methods of data processing that were previously out of reach for most organizations. While the growing volume of data, the time needed to process it, and the sheer number of input sources pose challenges for businesses, all three can largely be addressed through technology.

New V's Emerge

Investment in Big Data has begun to stabilize and enter a maturity phase over the past year. It will take time for infrastructure and architectures to mature, and best practices should be developed and refined against these architectures.

Organizations should consider how to use Big Data to bring about specific outcomes, in other words, organizations should examine the challenges of Big Data from a business perspective as opposed to a technical one. A framework that incorporates the business-oriented characteristics of veracity and value can help enterprises harness Big Data to achieve specific goals.

Not all data is the same, but organizations may not be paying enough attention to changes within individual data sets. Contextualizing the structure of the data stream is essential. This includes determining whether it is regular and dependable or subject to change from record to record, or even with each individual transaction. Organizations need to determine how the nature and context of data content in all its forms, text, audio, or video, can be interpreted in a way that makes it useful for analytics.

This is where the veracity, or trustworthiness, of data comes in. Determining trustworthiness is particularly important when it comes to third-party data, which should pass through a set of edits and validation rules.

Veracity entails verifying that data is suitable for its intended purpose, and usable within a given analytic model. Organizations should use several measurements to determine the trustworthiness and usefulness of a given data set. Establishing the degree of confidence in data is crucial so that analytic outputs based on that data can be a stimulus for business change.

Important metrics for evaluating and cleaning up data records are:
  • completeness measurements, or the percentage of instances of recorded data versus all available data within a business ecosystem or market (or the percentage of missing fields within a data record);
  • uniqueness measurements, or the percentage of alternate or duplicate data records;
  • accessibility measurements, or the number of business processes and personnel that can benefit from access to specific data, or that can actually access that data;
  • relevancy measurements, or the number of business processes that utilize or could benefit from specific data;
  • scarcity measurements, or the probability that other organizations including competitors and partners have access to the same data (the scarcer the data, the more it has impact).
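As a rough illustration of how the first two of these measurements might be computed, here is a minimal Python sketch; the customer records and field names are hypothetical:

```python
def completeness(records, fields):
    """Percentage of non-missing field values across all records."""
    total = len(records) * len(fields)
    filled = sum(1 for r in records for f in fields if r.get(f) not in (None, ""))
    return 100.0 * filled / total if total else 0.0

def uniqueness(records, key_fields):
    """Percentage of records that are duplicates on the given key fields."""
    seen = set()
    duplicates = 0
    for r in records:
        key = tuple(r.get(f) for f in key_fields)
        if key in seen:
            duplicates += 1
        else:
            seen.add(key)
    return 100.0 * duplicates / len(records) if records else 0.0

customers = [
    {"id": 1, "email": "a@example.com", "phone": "555-0100"},
    {"id": 2, "email": "b@example.com", "phone": ""},          # missing phone
    {"id": 3, "email": "a@example.com", "phone": "555-0100"},  # duplicate email
]

print(completeness(customers, ["email", "phone"]))  # 5 of 6 fields filled
print(uniqueness(customers, ["email"]))             # 1 of 3 records is a duplicate
```

A real data-quality pipeline would profile much larger record sets and more metrics, but the ratios work the same way.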

Value is Paramount

While veracity can't be overlooked, value is the most important factor. The first three V's are really about architecture, infrastructure, and representation of data, things that are important to IT organizations but far less interesting to business stakeholders.

The business stakeholders really don't care about the first three V's; they only care about the value they can extract from the data. Executives often expect the analytical teams at their organizations to hide the first three V's (volume, velocity, and variety) and only generate the last V, the value that is fundamental to the success of the business.

The concept of value is essential for organizations to succeed in monetizing their data assets. Value is a property that helps identify the purpose, scenario, or business outcomes that analytic solutions seek to address. It helps to confirm what questions are to be answered and what actions will be taken as a result, and defines what benefits are anticipated from collecting and analyzing the data.

Value is a motivating force when it comes to developing new and innovative ideas that can be tested by exploring data in different ways.

The ability to pull valuable information from Big Data and use that information to build a holistic view of the customer is absolutely critical. It’s no longer just an option to develop one-to-one relationships with customers; it’s a requirement. And to build that relationship, companies have to leverage all the customer information they can to personalize every interaction with them.

By using such information to lead customers on a personal journey, companies can help ensure that customers will stay with them long term, and even become brand advocates. Value is derived from making the data actionable. Organizations can have all the information about a customer, but it's what they can do with it that drives value for the business.

The Three V’s model of volume, velocity, and variety is useful for organizations that are just beginning to take control of their data, and certainly should not be forgotten by enterprises that have advanced further in their management of customer information.

The first three V's are equally important. In the digital age, companies have accumulated more data than ever before, are pulling data from a variety of sources, and are increasing the rate at which that data flows. A combination of these three factors can help organizations create relevant, personal, one-on-one customer interactions.

Deriving value is the ultimate business goal for any enterprise. The standard Three V’s model does not satisfactorily identify any data properties from a business usage perspective. Even though Big Data, and data in general, provides organizations with a lot of capabilities, the challenge for businesses is to make sure that they adapt how they think about the business processes, how they report on them, and how they define key performance indicators.

Organizations should try to get to the value. They need to turn that data into value. It's figuring out how to use that data to optimize business processes. In the end, the Three V's model for Big Data is a useful starting point. But then it becomes about the ultimate goal, the one organizations must not lose sight of: driving value.

Galaxy Consulting has 17 years of experience in Big Data management. We are at the forefront of deriving value from Big Data.

Friday, November 11, 2016

Voice Search

Roughly 56% of teenagers and 41% of adults use voice search on their mobile phones every day, according to Northstar Research. 

For example, modern consumers in Boston are much more likely to ask Google Now, Siri, Cortana, or Amazon’s Alexa to find the nearest coffee shop than they are to type "coffee shops near Boylston Street in Boston" into a search bar on Google's homepage.

This truly creates a challenge for search engine providers and for the providers of those personal assistants. But as consumers increasingly turn to voice search on their mobile phones, those Boston coffee shops will now have to rethink their search engine optimization (SEO) strategy if they hope to show up in voice search results.

Search is a science, and the rules are different for text and voice search.

It will only become more vital that business data on a company web site such as store locations, hours, and contact information is accurate and up to date. Businesses will also need to make sure that they are portrayed accurately on local review sites like Yelp.

Companies also need to consider how and where consumers are conducting their voice-based searches. When using computer-based search, it's assumed that a user is at a computer, so there is more screen space and more time to search. When using a mobile device, it's assumed a user is out, time is short, and the user needs access to quick bits of information on a small screen.

Therefore web sites need to be designed so that they dynamically adjust to fit any screen the consumer is using.

Google and Bing use mobile-friendliness as a ranking factor in their SEO algorithms. Both Microsoft and Google offer tools to help companies determine whether their sites are mobile-friendly. The tools look at factors such as loading speed, the width of page content, the readability of text on the page, the spacing of links and other elements on the page, and the use of plug-ins.

When it comes to voice search, web content that delivers the answers consumers want, in the quickest way possible, will ultimately win. The information should be concise and to the point, with more of an emphasis on usefulness than visual appeal.

Web content should be presented in more of a natural, conversational style and structured more like FAQs, answering the questions consumers might pose in voice search queries without requiring them to click on additional links or take other actions. Voice searches might be initiated in the car while someone is driving.

Companies also need to consider how consumers ask for information through voice search. More often than not, voice search queries are phrased using the same types of who, what, when, where, how, and why questions that are part of natural conversations. During these conversational, natural language search queries, consumers do not typically use the same keywords or metadata that are the hallmarks of text-based searches. Using basic keywords to set SEO parameters alone is no longer enough.

Use long-tail keywords. Rather than relying on a single word or phrase, long-tail keywords involve multiple keyword phrases that are very specific to whatever the company is selling.
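As a crude sketch of why conversational, question-style queries need different handling than exact keywords, the following Python snippet matches a voice query to FAQ entries by overlapping content words. The FAQ entries and stop-word list are invented for illustration; a production search engine would use much richer ranking:

```python
# Words too common to carry meaning in a conversational query.
STOP_WORDS = {"the", "a", "is", "are", "what", "where", "when", "how",
              "who", "why", "near", "me", "in", "to", "do", "i"}

def tokens(text):
    """Lowercased content words of a phrase, minus punctuation and stop words."""
    return {w.strip("?,.").lower() for w in text.split()} - STOP_WORDS

def best_match(query, faq_entries):
    """Return the FAQ entry sharing the most content words with the query."""
    q = tokens(query)
    return max(faq_entries, key=lambda entry: len(q & tokens(entry["question"])))

faq = [
    {"question": "What are your store hours?", "answer": "Open 7am-9pm daily."},
    {"question": "Where is the nearest coffee shop?", "answer": "123 Boylston Street, Boston."},
]

print(best_match("Where can I find a coffee shop near me?", faq)["answer"])
# → 123 Boylston Street, Boston.
```

Note that the match succeeds because the FAQ question is phrased in the same natural language as the spoken query, which is exactly why FAQ-style content tends to surface in voice search.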

Companies need to teach their systems and the search engines a very specialized lexicon that corresponds to their product and service names.

SEO rules keep changing and so SEO strategy needs to change accordingly.

Galaxy Consulting has many years of experience with search. Contact us for a free consultation.

Sunday, October 2, 2016

Content Localization

Content is the fuel of your organization. Content spills into all areas of an organization and fuels absolutely everything with which customers interact, whether it's a commerce web site, marketing or sales materials, or any other channels.

Preferences for content vary by continent. Companies looking to expand operations overseas will inevitably face cultural and language barriers. They will have to think about how content is resonating with global audiences and they will face the challenge of determining which content to adjust to meet the needs of local markets.

To help ensure a quality customer experience, the top global brands employ a localization strategy to adapt their online content for regional specificity. Content should clearly resonate with companies' international audiences.

Content localization is the process of modifying content to make it usable for a new locale. Often, this includes translating the content from the source language into the language used in the locale. It is an integral part of adapting a product for a particular market. For example, if a company is currently selling products in the US and is expanding into France, the company would need to translate the content into French.

Content translation and localization are not the same. Translation is the process of changing an original language of content into a different language. Content localization is a more specialized process of adapting content for regional or local consumption. It goes beyond translation to modify the source language and other site elements to appeal to the customer’s cultural preferences in their own target language. Localization is about refining your message and curating your brand to meet the cultural, functional, and language expectations of your global markets.

Why localize content? There are a few reasons:
  • catering to local customers - to show your local customers that you care about them and to bring more value to your products;
  • reduce risks - to avoid liability by not using words that might be offensive in a different country;
  • enhance marketing - having a great campaign or advertisement is useless if your target audience doesn’t speak English well enough to understand your marketing message;
  • increase sales - localized content will sell more often because consumers will be able to fully understand what they are buying.

Content localization includes other processes in addition to translation. For example, U.S. and French currencies differ, so you might need to change references from dollars to euros. It is also possible to localize information without translating it, for example by changing examples, currency, geographically specific information, and perhaps color schemes.
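As a minimal sketch of this non-translation side of localization, the per-locale currency and date formats below are hypothetical stand-ins; a real project would typically rely on a library such as Babel or the platform's locale facilities:

```python
from datetime import date

# Hypothetical locale configurations for illustration only.
LOCALES = {
    "en_US": {"currency": "${amount:,.2f}", "date": "{m}/{d}/{y}"},
    "fr_FR": {"currency": "{amount:,.2f} €", "date": "{d}/{m}/{y}"},
}

def format_price(amount, locale):
    """Render a price using the locale's currency pattern."""
    return LOCALES[locale]["currency"].format(amount=amount)

def format_date(d, locale):
    """Render a date using the locale's day/month ordering."""
    return LOCALES[locale]["date"].format(d=d.day, m=d.month, y=d.year)

print(format_price(1299.5, "en_US"))            # $1,299.50
print(format_date(date(2016, 10, 2), "fr_FR"))  # 2/10/2016 (day before month)
```

A full formatter would also swap digit separators (French uses a comma as the decimal mark), which is exactly the kind of detail localization libraries handle for you.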

These are standard best practices that help to minimize the cost of localization:

Content Development:
  • source text should use clear, simple, concise language;
  • do not use jargon, idiom, metaphors, or other creative language;
  • use simple words;
  • use simple language structure;
  • use consistent terminology;
  • limit text in graphics, and put text into an editable layer;
  • use legends rather than embedded text for graphics.

Formatting:
  • use predictable structure;
  • use styles or structured authoring;
  • do not use embedded formatting.

Delivery

Include a language flag on content so the system can deliver content to readers in the language they want.
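One common way to implement such a flag is a language attribute on each content record, with a fallback to a default language when no localized version exists. The field names and records here are illustrative:

```python
def deliver(content_items, doc_id, lang, default_lang="en"):
    """Return the requested document in the reader's language,
    falling back to the default language if no localized version exists."""
    by_lang = {c["lang"]: c for c in content_items if c["doc_id"] == doc_id}
    return by_lang.get(lang) or by_lang.get(default_lang)

catalog = [
    {"doc_id": "faq-42", "lang": "en", "body": "How do I reset my password?"},
    {"doc_id": "faq-42", "lang": "fr", "body": "Comment réinitialiser mon mot de passe ?"},
]

print(deliver(catalog, "faq-42", "fr")["body"])  # French version exists
print(deliver(catalog, "faq-42", "de")["body"])  # no German version: falls back to English
```

The same idea scales up in content management systems, where the language flag is usually part of the document metadata and the fallback chain may have several levels.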

Localization
  • use professional translators;
  • use translation memory systems;
  • work with localization professionals to identify potential problems;
  • look for opportunities to change source content instead of requiring changes in each target language.

When creating graphics, make sure that they are culturally neutral and acceptable for a global audience. These are the best practices for graphics:
  • use color carefully because colors' meanings vary across cultures;
  • images of people are not appropriate in some cultures and in other cultures, images of women are not acceptable;
  • images of hands or feet are not appropriate in some cultures;
  • when designing icons, make sure that they are universally understood.

Some examples of cultural content include:
  • colors, shapes, sizes, styles;
  • images, icons, graphics;
  • societal codes, e.g., humor, etiquette, rituals, myths, symbols;
  • societal values, power, relationships, beliefs.

Some examples of functional content include:
  • date and time formats, telephone numbers, contact information;
  • weights, measurements, geographical references;
  • language and linguistic content; product descriptions, reviews.

Content translation and localization also differ on a tactical level. While simple translation may be appropriate for some content types in certain markets, localization is most often required for adapting highly emotive, creative marketing content so that it clearly resonates across locales.

There are several content types, from marketing collateral to legal and technical information and user-generated forum content. For reasons of efficiency and cost, it is best practice to map these content types to the most appropriate translation or localization methods.

Not all of a company's content needs to be translated. Companies located in a single region should make it a priority to determine which content should be tailored for other regions. For example, blogs and tweets are typically less significant than marketing and training materials, help sheets, business forms, support email, and FAQs.

It is generally easier to select the best fit when you consider your audience and the content’s source and intent. Other parameters include volume, update frequency, content lifecycle, and budgetary considerations. Depending on your language service provider’s (LSP) capabilities, there are several methods from which to choose. When making these decisions, it’s best to consult an experienced LSP that offers a wide range of services.

Following these best practices will help to reduce the overall cost of localization. Translators very often use translation memory tools, and following these best practices helps those tools recognize repeated patterns.

Companies can try to discover which locales and languages provide the greatest return on investment. Identify languages that make up 90% of global online economic opportunities.

Be sure to follow industry best practices to reduce complexity, speed time-to-market, control costs and ensure quality localized content for all of your global markets. Companies will need to adopt stronger content strategies, keep closer tabs on the ways in which customers are consuming content, and take into consideration consumer preferences based on geographical region and digital trends.

Achieving global excellence means improving local experiences and getting better insights into those markets.

Wednesday, July 27, 2016

Navigating Big Data

Big Data is an ever-evolving term which is used to describe the vast amount of unstructured data. Published reports have indicated that 90% of the world’s data was created during the past two years alone.

Whether it’s coming from social media sites such as Twitter, Instagram, or Facebook, or from countless other Web sites, mobile devices, laptops, or desktops, data is being generated at an astonishing rate. Making use of Big Data has gone from a desire to a necessity. The business demands require its use.

Big Data can serve organizations in many ways. Ironically, though, with such a wealth of information at a company's disposal, the possibilities border on the limitless, and that can be a problem. Data is not going to automatically bend to a company's will. On the contrary, it has the potential to stir up organizations from within if not used correctly. If a company doesn't set some ground rules and figure out how to choose the appropriate data to work with, as well as how to make it align with the organization's goals, it's unlikely to get anything worthwhile out of it.

There are three layers of Big Data analytics, two of which lead to insights. The first of these, and the most basic, is descriptive analytics, which simply summarize the state of a situation. They can be presented in the form of dashboards, and they tell a person what's going on, but they don't predict what will happen as a result. Predictive analytics forecast what will likely happen, while prescriptive analytics guide users to action. Predictive and prescriptive analytics provide insights.

Presenting the analytics on a clean, readable user interface is vital but is sometimes ignored. Users get frustrated when they see content that they can't decipher. A canned dashboard does not work for users. They need to know what action they have to take. Users demand a sophisticated alert engine that will tell them very contextually what actions to take.

Using such analytics, ZestFinance was able to glean this insight: those who failed to properly use uppercase and lowercase letters while filling out loan applications were more likely to default on them later on. Knowing this helped them identify a way to improve on traditional underwriting methods, pushing them to incorporate updated models that took this correlation into consideration. As a result, the company was able to reduce the loan default rate by 40% and increase market share by 25%.

Unfortunately, insights have a shelf life. They must be interpretable, relevant, and novel. Once an insight has been incorporated into a strategy, it's no longer an insight, and the benefits it generates will cease to make a noticeable difference over time.

Getting the Right Data

To get the right data leading to truly beneficial insights, a company must employ a sophisticated plan for its collection. Having a business case around the usage of data is the first important step. A company should figure out what goals it would like to meet, how and why data is crucial to reaching them, and how this effort can help increase revenue and decrease costs.

Data relevance is key, and what is important to a company is determined by the problems it is trying to solve. There is useful data and data that is not useful; it is important to distinguish between them and weed out the latter. Collecting more than what is useful and needed is impractical.

Often data accumulates before stakeholders have outlined a set of goals. It is collected irrespective of any specific problem, question, or purpose. Data warehouses and processing tools such as Hadoop, NoSQL, InfoGrid, Impala, and Storm make it especially easy for companies to quickly attain large amounts of data. Companies are also at liberty to add on third-party data sources to enrich the profiles they already have, from companies such as Dun & Bradstreet. Unfortunately, most of the data, inevitably, is irrelevant. The key is to find data that pertains to the problem.

Big Data is nothing if not available, and it takes minimal effort to collect. But unfortunately, it will not be of use to anyone if it's not molded to meet the particular demands of those using it. Some people are under the impression that they are going to get a lot of information simply from having data. But businesses don't really need Big Data - information and insight are what they need. While a vast amount of data might be floating around in the physical and digital universes, the information it contains may be considerably less substantial.

While it might seem advisable to collect as much information as possible, some of that information just might not be relevant. Relevant insights, on the other hand, allow companies to act on information and create beneficial changes.

It is a good idea to set parameters for data collection by identifying the right sources early on. It could be a combination of internal and external data sources. Determine some metrics that you monitor on an ongoing basis. Having the key performance indicators (KPIs) in place will help companies identify the right data sources, the types of data sources that can help solve their problems.

Technology plays a key role in harnessing Big Data. Companies should figure out what kinds of technology make sense for them. Choice of technology should be based on company's requirements.

Data collection is an ongoing process that can be adjusted over time. As the business needs change, newer data sources are integrated, and newer business groups or lines of businesses are brought in as stakeholders, the dynamics and qualities of data collection will change. So this needs to be treated not as a one-time initiative, but as an ongoing program in which you continually enrich and enhance your data quality.

Companies should continually monitor the success of their data usage and implementation to ensure they're getting what they need out of it. There should be a constant feedback stream so that a company knows where it stands in relation to certain key metrics it has outlined.

Risks

Companies must always be aware of the risks involved in using data. Companies shouldn't use prescriptive analytics when there is significant room for error. It takes good judgment, of course, to determine when the payoffs outweigh the potential risks. Unfortunately, it's not always possible to get a prescriptive read on a situation. There are certain limitations. For one thing, collecting hard data from the future is impossible.

People and Processes

Big Data adoption often becomes a change management issue and companies often steer clear of it. When a company implements something that's more data-driven, there's a lot of resistance to it.

Like most initiatives that propose technology as a central asset, Big Data adoption can create conflicts among the various departments of an organization. People may struggle to accept data, but they are also unwilling to give up control of it. To avoid such clashes, companies should make it clear from the outset which department owns the data. Putting the owner in charge of the data, having this person or department outline the business rules and how they should be applied to customers would be helpful to overcome this issue.

These are two good tips to follow: Give credit where credit is due and don't dehumanize the job. Don’t attribute the success to the data, but to the person who does something with the data. Remember that change can't just come from the top down. Big Data adoption requires more than executive support. It needs buy-in from everyone.

Saturday, July 23, 2016

Successful Change Management in Content and Knowledge Management

It is becoming rare to find paper documents in filing cabinets or electronic documents on shared network drives. Filing cabinets and shared network drives have been replaced by content management systems, knowledge base applications, and collaboration tools in the majority of organizations.

At a certain point, it's inevitable that organizations have to make adjustments to keep up with the times, and users must constantly adapt to the tools of an evolving world. After all, if customers are using advanced technology, it makes sense that companies should be interacting with them using tools that are up to date as well.

If technology adoption is to have an effect on an organization, users' commitment becomes a required element. But getting that kind of cooperation is not always a simple task. Users might not immediately take to the new processes without some resistance.

Though it's counter-intuitive that anyone would resist technology designed to make their job easier, resistance is an unavoidable element of content and knowledge management initiatives. Organizations should anticipate a number of challenges and do their best to ease their users' resistance through transition and change management.

Drawing on our 16 years of experience successfully managing user adoption and change management in content and knowledge management initiatives, these are our guidelines for overcoming challenges in these areas.

1. Communicate the Goals

There may be myriad practical reasons why the change in how your organization manages its content needs to be put in place. Before proposing any major change, establish clear reasons for why the change is being proposed and how it is going to enhance users' experience.

For users to understand how technology is going to help them, they need to understand what their future will look like with this technology in place. What it amounts to is this: if you can't articulate the benefits of making the change, you have no business making it.

It is very important to create a consistent narrative that instills confidence in users, and to choose carefully the language you use to deliver this narrative. Avoid using the term "change management". The reason is that employees hear "change management" as "Whatever you have done until now is wrong, and now we are going to put you on the right track." That is not a good message.

You may want to use the term "cause management" which attributes any need for adjustment within a company to a cause. Under this approach, organizations would make an effort to craft a story that communicates the idea that this is the outcome that will best benefit the company.

Highlighting what is not going to change can be a source of encouragement for users. This way you are introducing consistency while asking users to evolve.

2. Fear of Change is not Necessarily Fear of Technology

Technology itself is not usually the reason that employees are resistant to change. People are becoming less resistant to using technology. Problems begin to surface when employees are not given enough notice about what technology they are expected to use.

Even before settling on a particular technology, organizations should give their employees an outline of the problems they are trying to fix. This gives employees the opportunity to provide input and make suggestions about what types of processes they would like to see streamlined and how they envision their ideal work environment. Though organizations might not always have the budget for what the employees have in mind, they will at least be involving them and making them feel as though they are part of the equation from the outset.

Also important is that workers are given the time to develop the kinds of skills necessary to make full use of technology. It takes employees some conditioning to see how new technology and procedures can be of aid to them. If you can be proactive about teaching people these new skills and how to use the technology in small segments, this definitely can accelerate the change.

3. New Technology Can Bruise the Ego

All employees are proud of their work. They like to feel as though they possess an innate talent, and that there's a reason they're doing what they've chosen to dedicate so much of their time to. Regardless of age or experience level, there are certain natural emotions that might come into play when companies are proposing changes. If employees are led to believe that so much of what they spent a great deal of time mastering can be transferred to anyone with ease, they might resent it on an emotional level that they might not even share.

Thus, it would be a good idea to communicate how the technology is going to help them work together and be more connected.

4. Technology is not Only for Managers

It goes without saying that technology should never force people to do more work than they are already doing. If you force people to use a system that makes their jobs worse, they are going to do everything they can to avoid it.

Employees should never feel as though technology is being deployed solely for the benefit of the managers. Granted, a content management system gives managers more visibility into work processes, but the central message managers should be sending is that the technology is there to help employees do their jobs better.

It is helpful to show that higher management is using the technology as well, to drive home the idea that it is being adopted universally across the organization.

5. Deploy Gradually

When it comes to deploying the systems that employees are going to be using regularly over an extended period of time, it is a good idea to steer clear of an abrupt implementation in favor of a more gradual one.

Use pilot periods. During these periods, a small subset of the company is selected to test the technology and share their experiences with others. Keeping employees updated via email, meetings, or other internal communication channels can be helpful, as it lets people know what to expect. Likewise, gathering user testimonials and videos in which those who have piloted the product attest to its benefits can prove useful.

However, it is important to be deliberate when deciding who will participate in such trial periods. While it might be tempting to recruit the most enthusiastic and vocal representatives of the company to test the materials, it is a better idea to choose a mix of testers: a subset of users that represents those who will ultimately be expected to use the new technology. Asking for volunteers is advised, of course, but testers should also be drawn from those who are less keen on trying it.

Including people who are not technology experts is a good idea, because it helps drive home the point that anyone can use the solution effectively. It also reinforces the idea that there will be support and training opportunities available.

If the right group of people is selected for the pilot program, they can generate excitement about the system and show how the program has helped them do their jobs.

One small factor to keep in mind about the pilot period, however, is the capacity of the system. Since the system will eventually serve many more users, the experience that the small pilot group reports might differ from what is waiting further down the line. For example, a system that works fine with ten users on it may not work as quickly with 200,000 users connected to it. You need to be able to account for that.

6. Maintain the Change

Change management is not as simple as preparing employees for the transition that is about to be introduced. It has just as much to do with ensuring that employees don't revert to outdated and inefficient methods as it does with ensuring that they begin to use the new system in the first place. Managing resistance is a process, not a series of events.

Because it's a process, managers should be very careful to communicate the fact that the improvements might not come all at once, but rather in small increments. Incentives can also act as fruitful aids in encouraging adoption. For this very reason, gamification applications have been gaining popularity because they allow employees to compete against one another and display to the rest of the company how well they have done by showing off their achievements.

It is important to build employees' confidence and a positive environment. Set specific event days to encourage use of the new technology. Typically held once a month, these are known as blitz days. The idea is to set aside a period during which everybody uses the technology in a fun environment. At the end of the day, users share their results. The message is: if this can be done on one particular day, why can't it be done every day? Over time, the benefits of these events can be substantial.

Change is ongoing. As time goes on, the window for technological change is becoming much narrower than it used to be, with updates occurring far more frequently. For some people, it might seem that just as they are getting used to one change, another one is on the way. Organizations need to create an infrastructure that supports continuous change.

Characteristics for Driving Change
  • Be outwardly focused - avoid being locked into one area of the company. Look for ways to make an impact across the organization.
  • Be persuasive - be clever and persuasive enough to gain the support of users.
  • Be persistent - do not give up. Work continually through the channels of the organization to ensure that new systems and processes are factored into the organization's way of working.

Sunday, June 26, 2016

Better Business Operations with Better Data

Businesses today understand that data is an important enterprise asset, relied on by employees to deliver on customers' needs, make business decisions, and more.

Yet too few organizations realize that addressing data quality is necessary to improve customer satisfaction. A recent Forrester survey shows that fewer than 20% of companies see data management as a factor in improving customer relationships. This is a very troubling number.

Not paying attention to data quality can have a big impact both on companies and the customers they serve. Following are just two examples:

Garbage in/garbage out erodes customer satisfaction. Customer service agents need the right data about their customers, their purchases, and prior service history presented to them at the right point in the service cycle to deliver answers. When their tool sets pull data from low-quality sources, decision quality suffers, leading to significant rework and customer frustration.

Lack of trust in data has a negative impact on employees' productivity. Employees begin to question the validity of underlying data when data inconsistencies and quality issues are left unchecked. This means employees will often ask a customer to validate product, service, and customer data during an interaction, which makes the interaction less personal, increases call times, and instills in the customer a lack of trust in the company.

The bottom line: high-quality customer data is required to support every point in the customer journey and ultimately deliver the best possible customer experience to increase loyalty and revenue. So how can organizations most effectively manage their data quality?

While content management systems (CMS) can play a role in this process, they can't solve the data-quality issue by themselves. A common challenge in organizations in their content management initiatives is the inability to obtain a complete trusted view of the content. To get started on the data-quality journey, consider this five-step process:

1. Don't view poor data quality as a disease. It is often a symptom of broken processes. Using data-quality solutions to fix data without addressing changes in the CMS will yield limited results: CMS users will find a workaround and create other data-quality issues. Balance new data-quality services with user-experience testing to fix the business processes that are causing data-quality issues.

2. Be specific about bad data's impact on business effectiveness. Business stakeholders have plenty of data-quality frustrations. Often they will describe poor data as "missing," "inaccurate," or "duplicate." Step beyond these adjectives to find out why these data-quality issues affect business processes and engagement with customers. These stories provide the foundation for business cases, highlight which data to focus on, and show how to prioritize data-quality efforts.

3. Scope the data-quality problem. Many data-quality programs begin with a broad profiling of data conditions. Get ahead of bottom-up approaches that are disconnected from CMS processes. Assess data conditions in the context of business processes to determine the size of the issue in terms of bad data and its impact at each decision point or step in a business process. This links data closely to business-process efficiency and effectiveness, often measured through key performance indicators in operations and at executive levels.
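To make scoping concrete, the kind of profiling described above can start very simply. The sketch below is a minimal illustration, not a production tool: the record layout and field names are hypothetical, and a real assessment would run against the CMS or source-of-record data in the context of a specific business process.

```python
# Minimal data-profiling sketch. The customer records and field names
# below are hypothetical, used only to illustrate counting missing
# and duplicate values per field.
from collections import Counter

customers = [
    {"id": 1, "email": "ann@example.com", "phone": "555-0100"},
    {"id": 2, "email": "",                "phone": "555-0101"},
    {"id": 3, "email": "ann@example.com", "phone": ""},
]

def profile(records, fields):
    """For each field, count missing values and duplicated values."""
    report = {}
    for field in fields:
        values = [r.get(field, "") for r in records]
        missing = sum(1 for v in values if not v)
        # Each value occurring c times contributes c - 1 duplicates.
        dupes = sum(c - 1 for c in Counter(v for v in values if v).values())
        report[field] = {"missing": missing, "duplicates": dupes}
    return report

print(profile(customers, ["email", "phone"]))
# → {'email': {'missing': 1, 'duplicates': 1}, 'phone': {'missing': 1, 'duplicates': 0}}
```

Counts like these only become meaningful when tied back to a process step, for example, how many service calls touch a customer record with a missing phone number.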

4. Pick the business process to support. For every business process supported by CMS, different data and customer views can be created and used. Use the scoping analysis to educate CMS stakeholders on business processes most affected and the dependencies between processes on commonly used data. Include business executives in the discussion as a way to get commitment and a decision on where to start.

5. Define recognizable success by improving data quality. Data-quality efforts are a key component of data governance that should be treated as a sustainable program, not a technology project. The goal is always to achieve better business outcomes. Identify qualitative and quantitative factors that demonstrate business success and operational success. Take a snapshot of today's CMS and data-quality conditions and continuously monitor and assess them over time. This will validate efforts as effective and create a platform to expand data-quality programs and maintain ongoing support from business stakeholders and executives.
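One simple way to take the snapshot-and-monitor approach from step 5 is to track a basic quality score over time. The sketch below is an assumed example: the completeness metric, snapshot dates, and field names are illustrative, and a real program would track several such indicators tied to the KPIs chosen in step 3.

```python
# Hedged sketch: tracking a simple completeness score across
# successive snapshots so improvement (or regression) is visible.
def completeness(records, fields):
    """Fraction of required-field slots that are populated."""
    total = len(records) * len(fields)
    filled = sum(1 for r in records for f in fields if r.get(f))
    return filled / total if total else 1.0

# Hypothetical monthly snapshots of the same customer data set.
snapshots = {
    "2016-01": [{"email": "a@x.com", "phone": ""},
                {"email": "",        "phone": ""}],
    "2016-06": [{"email": "a@x.com", "phone": "555-0100"},
                {"email": "b@x.com", "phone": ""}],
}

for month, records in sorted(snapshots.items()):
    print(month, round(completeness(records, ["email", "phone"]), 2))
# → 2016-01 0.25
# → 2016-06 0.75
```

Publishing a trend line like this to business stakeholders is one way to make data-quality progress recognizable and sustain support for the program.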

Galaxy Consulting has over 16 years of experience helping organizations make the best use of their data and improve its quality. Please contact us today for a free consultation!