Saturday, September 30, 2017

Advances in Digital Asset Management

Digital assets continue to grow in importance as enterprise assets, but it remains difficult to manage them from capture to distribution, as well as to provide subsequent access to stored content.

Marketing and communications teams are creating such huge volumes of digital assets that they are being overwhelmed by the sheer number of media files.

Many companies don’t have a central repository for their rich media assets. The files are typically scattered across multiple locations, including Dropbox, SharePoint, and shared drives.

Digital assets serve multiple use cases, such as training and education, marketing, and corporate communication, and these functions are spread across different departments.

Improving management of digital assets can have a rapid and significant return on investment, so organizations should be motivated to address the issue. Moreover, being able to find a particular video and also information within that video is growing in importance.

It is important to be able to find digital assets, re-purpose small parts of them, and keep track of the digital rights for the assets.

Digital asset management (DAM) systems can greatly help organizations manage and use their digital assets.

MediaValet

One such system is MediaValet.

MediaValet is a digital asset management system that is worth considering. It is a cloud product and does not require any internal IT support, so the system can be up and running very quickly. The company is very supportive of customers' needs. The system offers flexibility in both finding and re-purposing digital assets. The entire user interface was redesigned in Version 3.0.

The latest version of MediaValet has a new menu feature that simplifies the metadata entry process by allowing users to check boxes for the desired metadata. A controlled vocabulary is used for keywords, which is a highly recommended practice.
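
As a rough illustration of the idea, the sketch below shows how controlled-vocabulary tagging might be enforced in code; the vocabulary, function, and asset names are hypothetical and do not reflect MediaValet's actual API.

    # A minimal sketch of controlled-vocabulary tagging (hypothetical vocabulary and asset names;
    # not MediaValet's actual API).
    CONTROLLED_VOCABULARY = {"product-launch", "trade-show", "training", "logo", "headshot"}

    def tag_asset(asset_metadata: dict, keywords: list) -> dict:
        """Attach only keywords that exist in the controlled vocabulary."""
        accepted = [kw for kw in keywords if kw in CONTROLLED_VOCABULARY]
        rejected = [kw for kw in keywords if kw not in CONTROLLED_VOCABULARY]
        asset_metadata["keywords"] = sorted(set(asset_metadata.get("keywords", [])) | set(accepted))
        if rejected:
            print(f"Rejected free-form keywords: {rejected}")
        return asset_metadata

    asset = {"title": "Spring campaign hero video"}
    print(tag_asset(asset, ["product-launch", "springy-vibes"]))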

Videos can be placed in a lightbox, which allows users to structure and sequence the final product. Music is also stored in MediaValet. It is possible to send a link to assets in the system for review by other users.

MediaValet handles all media types, including HD video formats, and can manage file sizes up to 200 GB per asset.

There is a new feature that gives users the ability to search dialog within videos. This allows users to find any spoken word within their video library with a keyword search. Once the word is found, the user can jump right to the sequence in the video where the searched word is spoken. The video starts a few seconds before the target word and continues for as long as the user wishes.
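
Conceptually, this kind of in-video search relies on a time-coded transcript index. The sketch below is a simplified illustration of that idea, assuming hypothetical transcript data; it is not MediaValet's actual implementation.

    # A simplified illustration of keyword search over time-coded video transcripts
    # (hypothetical data; not MediaValet's actual implementation).
    transcripts = {
        "all_hands_q3.mp4": [(12.4, "welcome"), (45.0, "revenue"), (312.7, "roadmap")],
        "safety_training.mp4": [(8.2, "hazard"), (95.5, "revenue"), (140.0, "checklist")],
    }

    def find_spoken_word(word: str, lead_in_seconds: float = 3.0):
        """Return (video, start_time) pairs, starting a few seconds before the word is spoken."""
        hits = []
        for video, words in transcripts.items():
            for timestamp, token in words:
                if token == word.lower():
                    hits.append((video, max(0.0, timestamp - lead_in_seconds)))
        return hits

    print(find_spoken_word("revenue"))  # e.g. [('all_hands_q3.mp4', 42.0), ('safety_training.mp4', 92.5)]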

Videos Management

Video has been isolated from other enterprise content because it needs to be managed differently. By enabling videos to be checked into DAM systems, organizations can let users gain access to video content along with their other rich-media assets.

Enterprises can gain more if they begin to view video not as a separate entity, but as a knowledge asset just as they do documents, graphics and structured content. Video is a great way to capture tacit knowledge and integrate it with enterprise content.

Portals for knowledge management and training, certification, compliance and many other applications are fueling the need for more efficient management of these assets. Videos need to be as accessible as any other knowledge asset in an organization.

Mediasite

Video content management (VCM) platforms designed for enterprise use are an emerging market. These platforms are intended not only for managing stored content but also for live streaming.

Mediasite by Sonic Foundry captures and distributes video both live and on demand. An extension of technology developed at Carnegie Mellon University, Mediasite is typically used in a presentation room or lecture hall.

Mediasite is a purpose-built family of hardware and software that creates a workflow from presentation to a URL. It provides software to organize and curate the video, creates metadata automatically and OCRs all the text on the screen. It also provides closed captioning by automatically uploading the video files to the customer’s captioning service provider. The product can operate on premises or in the cloud.

Many companies are using Mediasite for training and corporate communication. The system can support an enterprise video initiative that enables employees to create user-generated content. In addition, numerous law schools and medical schools are recording lectures so that students can listen to them again or access a lecture they missed when it was presented.

A university can place Mediasite technology in its classrooms so that every lecture can be recorded. Some classrooms have sophisticated tracking cameras or fixed cameras, while others use laptop webcams. All have screen capture for the presentations and microphones for audio.

Making video as useful as text in the enterprise has been a challenge. Because Mediasite indexes the videos, captures the screen presentations, and makes them searchable, users can find topics of interest. By clicking on a slide below the video window, users can sync to the presenter's spoken presentation.
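
The slide-to-video synchronization can be pictured as a simple mapping from slide numbers to playback positions. The following sketch assumes hypothetical timings and is not Mediasite's actual player API.

    # A tiny sketch of slide-to-video synchronization: clicking a slide seeks the player
    # to the moment that slide appeared (hypothetical timings; not Mediasite's player API).
    slide_timings = {1: 0.0, 2: 95.0, 3: 260.5, 4: 410.0}   # slide number -> seconds into the lecture

    def on_slide_click(slide_number: int) -> float:
        """Return the playback position the video player should seek to."""
        return slide_timings[slide_number]

    print(on_slide_click(3))  # 260.5 - jump to where the presenter discussed slide 3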

Enhancing SharePoint Video Management

As a content management and collaboration platform, Microsoft SharePoint is a likely place to store videos, but it is limited in its ability to manage them, including searching for them.

Ramp Video Management for SharePoint

Ramp Video Management for SharePoint is a video content solution that stores, distributes, streams, and automatically creates metadata that facilitates search of video content. Ramp can manage video and audio content from virtually any source, including recordings generated by Web conferencing platforms such as WebEx.

Ramp software ingests video and audio content and uses natural language processing to convert the audio to text, then automatically creates time-coded metadata for the video. That processing allows keyword searching of the content and easy identification of the segments of interest to a user.
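
One way to picture this processing is an inverted index that maps each recognized word to the time-coded segments in which it occurs. The sketch below assumes a hypothetical transcript structure and is not Ramp's actual pipeline.

    # A rough sketch of turning speech-to-text output into time-coded, searchable metadata
    # (hypothetical transcript structure; not Ramp's actual processing pipeline).
    from collections import defaultdict

    # Each entry: (start_second, end_second, recognized_text) for one transcript segment.
    segments = [
        (0.0, 14.5, "Welcome to the quarterly webinar on network security"),
        (14.5, 31.0, "Today we will cover firewall configuration and patching"),
        (31.0, 50.2, "Questions about patching can go in the chat"),
    ]

    def build_keyword_index(segments):
        """Map every word to the segments in which it occurs."""
        index = defaultdict(list)
        for start, end, text in segments:
            for word in set(text.lower().split()):
                index[word].append((start, end))
        return index

    index = build_keyword_index(segments)
    print(index["patching"])  # [(14.5, 31.0), (31.0, 50.2)]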

Because many enterprise networks do not have wide area networks (WANs) adequate to reach outlying regional areas, Ramp has joined the Riverbed-Ready Technology Alliance program. Riverbed technology supports a private caching network within a company. When multiple users access the same content, it is cached locally to relieve traffic on the WAN and improve performance.

With the phasing out of Microsoft’s Windows Media Server and Silverlight, companies are seeking alternatives for multi-casting video, long accepted as an efficient approach to stream video to multiple recipients.

Ramp introduced the Ramp Multicast Engine (RME), which uses a company’s existing WAN infrastructure to support live video delivery of HTTP Live Streaming (HLS) video to iPhones, iPads and other mobile devices, which standard multicast solutions cannot do.

Digital Asset Management technology helps organizations leverage their existing investment, whether that is their network, their document management system or other resource. On the customer side, understanding the different use cases and the resulting implications for technology requirements should be a priority.

Galaxy Consulting has over 17 years of experience in digital asset management. Please contact us for a free consultation.

Tuesday, August 1, 2017

Successful Knowledge Management

The main goal of knowledge management (KM) is to improve the internal processes of the organization so that it operates better, faster, cheaper, safer, or cleaner.

It is important to create a KM strategy. A KM strategy is about ensuring that KM processes are as good as they can be throughout the organization.

Regardless of what industry your organization operates in, it is likely that you are concerned about operational efficiency and effectiveness, which means that operational excellence is a cornerstone of your KM strategy.

The KM strategy should include the development and deployment of continually improving KM practices, process innovation, the use of communities of practice and knowledge bases, and the standardization of processes wherever possible.

In a customer service organization, KM aims to improve the delivery of knowledge to the customers and to the people who work with those customers on a day-to-day basis, so that customer relationships are maintained, service levels are high, and sales volumes are increased.

In a not-for-profit or non-governmental organization, your customers are the beneficiaries of your KM programs. Similar ideas apply in this circumstance as in a for-profit organization.

The crucial knowledge of an organization is that of the products or services that the organization offers, as well as knowledge about the customers themselves, the market, competitors and other participants in the sector. The majority of this knowledge will be internal with some external knowledge (knowledge from outside the organization) which is needed to fully understand the client, the market/environment, the competitors, etc.

Your KM strategy should include the creation of a reliable knowledge base of products or services for use by your sales force, service force, call center, or other employees, allied with close attention to customer relationship management (CRM).

There may also be elements of your strategy focused on the processes of selling and bidding, as even the best product or service will not make money if you can’t sell it. If your organization is in the service sector or is largely concerned with marketing and selling, customer knowledge is likely to be the cornerstone of your KM strategy.

Customer knowledge also applies to internal customers, for example, the IT department’s help desk for internal use. The help desk will need to be able to address employee technology issues based on what services and equipment an employee is using. There is less focus on sales and marketing in working with internal customer knowledge, but the other issues and concerns exist in this case also.

An innovation focus for KM involves the creation of new knowledge in order to create new products and services. The crucial knowledge is knowledge of the technology and of the marketplace. Much of this knowledge will be external, which is what primarily differentiates an innovation strategy from other KM strategies.

The strategy should include knowledge-creating activities such as business-driven action learning, think tanks, deep dives and other creativity processes, as well as knowledge-gathering activities such as technology watch and market research. 

There may also be elements of your strategy focused on reducing the cycle time for new products, as even the best product will not make money if it takes too long to get it to the market. If your company is in the high-tech, bio-tech or pharmaceutical sectors, or any other sector with a focus on research and development and/or new products, then innovation is likely to be the cornerstone of your KM strategy.

A growth and change focus for KM involves replicating existing success in new markets or with new staff. It is critical to identify lessons learned and successful practices, so that good practices can be duplicated and mistakes learned from, and to transfer existing knowledge to new staff. 

New staff needs to be integrated efficiently and effectively with adequate training and knowledge transfer so that they become valuable members of the team as quickly as possible. Regardless of what industry you are in, growth and dealing with changing market and organization conditions are often considerations in your KM strategy.

Strategic Focus

In reality, companies may have elements of all four focus areas. They may be concerned about operating their companies efficiently, while also developing customer knowledge and retaining a focus on creating new products. 

However, the KM strategy should primarily address the most important of these four. Don’t spread yourself too thin; don’t try to do everything all at once. Instead, pick the most important driver, and devote your attention to developing an effective KM solution that addresses this focus area.

Doers vs. makers vs. sellers

Some companies do things, some make things, and some sell things. Different organizational focus, different approach to KM. The doers are concerned with operational efficiency, the makers are concerned either with operational efficiency or product innovation (depending on the product and the market they are in), and the sellers are concerned with customer knowledge.

Most organizations are a mix of doing and making, and all sell something, but the point is that depending on the market you are in and the type of product or service you have, you will have a different focus to your KM strategy. One of the main differences in KM strategies is the amount of attention placed on practice knowledge vs. product knowledge.

If an organization does things, its KM approach is all about the development and improvement of practice. The strategy would be to develop policies and procedures, develop communities of practice, and focus on operational excellence and continual practice improvement.

The same is true for the professional services sector and the oil and gas sector. In the case of the oil companies, selling the product requires little knowledge about oil (except for those few specialists concerned with selling crude oil to refineries), and the main focus for KM is on practice improvement. The KM framework involves communities of practice, best practices, practice owners, and practice improvement.

A typical product-based maker organization would be an aircraft or car manufacturer. They make things. Their KM approach is all about the development and improvement of the product. They develop product guidelines for their engineers, their sales staff, and their service staff. For example, an Electronic Book of Knowledge could serve as a source of information about automobile components, and Tech Clubs could serve as communities of practice.

In a maker organization, the experts are more likely to be experts on a product than on a practice area. With the more complex products, where design knowledge is critical, KM can become knowledge-based engineering, with design rationale embedded into CAD files and other design products.

If an organization is focused primarily on product learning, much of that learning is shared with the product manufacturer. For a product-based organization, the entire focus is on knowledge of the product and product improvement.

The danger in KM comes when you try to impose a solution where it doesn’t apply. For example, imposing a maker KM solution onto a doer business, or an operational excellence KM solution onto an innovation business. This is why the best practice is to choose one area of focus for your KM strategy, and work with the parts of the business where that focus area is important.

Workforce demographics

Another factor that can influence your KM strategy is the demographic composition of your workforce. In a Western engineering-based organization, for example, the economy is static, and the population growth is stable. The workforce is largely made up of baby boomers. A large proportion of the workforce is over 50, with many staff approaching retirement. Within a company, very high levels of knowledge are dispersed around the organization, scattered around many teams and locations.

Communities of practice are important in a situation like this, so that employees can ask each other for advice and receive advice from anywhere. Experienced staff collaborate with each other to create new knowledge out of their shared expertise. The biggest risk to many engineering-based organizations of this kind is knowledge loss, as so much of the workforce will retire soon.

In a Far Eastern engineering-based organization, the economy is growing, the population is growing, there is a hunger for prosperity, and engineering is also a growth area. The workforce is predominantly young with many of them employed less than two years in the company. There are only very few real experts and a host of inexperienced staff. 

Experience is a rare commodity, and is centralized within the company, retained within the centers of excellence and the small expert groups. Here the issue is not collaboration, but rapid integration and enhanced training. The risk is not retention of knowledge, it is deployment of knowledge.

These two demographic profiles would lead you to take two different approaches to your KM strategy. It is possible to combine the demographic view with the focus areas described previously.

Create your KM strategy based on a combination of the four focus areas and the two demographic types, with the addition of another demographic type, a balanced workforce with a good spread of young and experienced staff.

Galaxy Consulting has over 17 years of experience creating and implementing KM strategies. Please contact us for a free consultation!

Thursday, June 29, 2017

Connect Content With People

Despite the promise of new technologies, effective enterprise content management feels more elusive than ever for a lot of companies. Part of the challenge is that there is simply more content produced than there used to be.

While organizations still have huge quantities of documents, employees are embracing new ways to record and share information, ranging from wikis and social media to websites and videos.

These new formats open the door for more dynamic, interactive communication. But they also create content management hurdles, especially since the content tends to be unstructured and trapped in different systems.

But even beyond that, the way people access and consume content is changing to reflect a workforce that is increasingly hurried, mobile and collaborative. No one wants to read long reports or manuals any more. Employees demand just the information they need in the format they want. They assume the enterprise search function will work like Google, regardless of technical constraints, security restrictions, and cultural norms about how different parts of the business share information. And they expect content to be delivered to them seamlessly no matter where they are, what network they are on, or what device they’re using.

Organizations need to anticipate content needs and link people to the best information to support their jobs.

Putting Content in the Flow of Work

The best practice organizations use a range of tools in pursuit of the elusive goal of “findability”. It is everything from advanced search solutions to intricately crafted taxonomies and hand-picked search results for particular keywords.

However, making it easy for employees to retrieve content from repositories is only half the battle. To get the most value out of content, the best practice organizations must put it directly in the path of employees so they can access it in the context of what they are doing at a given moment.

Many best practice organizations build templates, guidelines, best practices and FAQs directly into business processes and applications and all either mandate or strongly encourage employees to use that content in the context of their jobs.

What does it mean to build content directly into business processes and applications? It depends on how an organization is structured and the type of work it does. For companies that operate according to well-defined business processes, it might mean arranging content in line with process documentation so people can drill down into information on specific tasks and activities.

But if a company's work revolves around client projects, it may want to push content directly to project team sites so relevant resources are immediately available to people in the field delivering projects. Still other companies embed how-to information, FAQs, and tips and tricks into software applications where employees do their work.

One of the most direct ways to incorporate content into the flow of work is by inserting it directly into enterprise applications where employees enter data, complete tasks and interact with customers. Companies can integrate procedures, job aids, templates and links to customer folders into the account management system and service portal so client service consultants have relevant information at their fingertips.

Underwriting content is similarly embedded in software for sales quotes and renewals. Content integration is customized by job role so that guidance is based on the specific role and workflow in question. The integration points provide a fast track to relevant resources, ensuring that employees benefit from collective knowledge even if they’re in the middle of a quote or speaking with a customer.
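
A simple way to picture role-based content integration is a lookup keyed by job role and workflow step. The sketch below uses hypothetical roles, steps, and job aids; the real integration points are product-specific.

    # A minimal sketch of surfacing content by job role and workflow step inside an application
    # (hypothetical roles, steps, and links; the real integration points are product-specific).
    embedded_content = {
        ("client-service", "renewal-quote"): ["Renewal pricing template", "Underwriting guidelines"],
        ("client-service", "new-account"): ["Onboarding checklist", "Welcome letter template"],
        ("underwriter", "renewal-quote"): ["Risk appetite matrix", "Referral thresholds"],
    }

    def content_for(role: str, workflow_step: str):
        """Return the job aids to show in the sidebar for this role at this step."""
        return embedded_content.get((role, workflow_step), [])

    print(content_for("client-service", "renewal-quote"))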

Provide Mobile Access with Mobile Apps

The other key to putting content into the flow of work is to make sure that people can access that content when they are working, even if they are not in a traditional office environment.

All the best practice organizations provide native mobile applications enabling access to enterprise content, regardless of location. Those apps are significantly more effective than mobile browsers, which are how most organizations currently provide access to content.

Content apps offer the same search and filtering capabilities available on desktop computers, which means employees have similar experiences in both contexts. Some apps are read-only, but others let people upload data or comments.

Regardless of the specific functionality provided, the best practice organizations agree that mobile access increases engagement and makes it more likely that employees will access content when they need it.

Mobile Security

Of course, when you start putting enterprise IP and data on smartphones and tablets, security becomes very important. The best practice organizations have thought a lot about protecting content in the mobile environment, and each has its own strategy, such as a secure firewall that requires login credentials.

But despite the challenges, mobile access is necessary. Organizations can’t ignore the risks, but they have to move forward. Employees expect to interact with enterprise information from their devices, and employers must deliver mobile access if they want to reap the full benefit of their content.

The best practice organizations do everything they can to ensure that relevant content is delivered directly to employees in the course of their work.

Galaxy Consulting has 17 years of experience in connecting content with people and delivering it to employees when they need it to do their jobs.

Wednesday, May 31, 2017

Specialized Strategies for Enterprise Search

Enterprise search is the practice of making content from multiple enterprise sources, such as databases and intranets, searchable to a defined audience. "Enterprise search" is also used to describe the software used to search for information within an enterprise.

Enterprise search systems index data and documents from a variety of sources such as file systems, intranets, content management systems, e-mail, and databases. Many enterprise search systems integrate structured and unstructured data in their collections. Enterprise search systems also use access controls to enforce a security policy on their users.
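
Conceptually, access control in enterprise search means filtering every result against the user's entitlements. The sketch below illustrates the idea with a hypothetical in-memory document store; production systems delegate this to the search engine's security filters.

    # A minimal sketch of access-controlled search over an enterprise index
    # (hypothetical document store; real systems rely on the search engine's ACL filters).
    documents = [
        {"id": 1, "title": "Employee handbook", "allowed_groups": {"all-staff"}},
        {"id": 2, "title": "M&A due diligence memo", "allowed_groups": {"legal", "executives"}},
        {"id": 3, "title": "Product roadmap", "allowed_groups": {"product", "executives"}},
    ]

    def search(query: str, user_groups: set):
        """Return only documents the user is entitled to see and that match the query."""
        query = query.lower()
        return [
            doc for doc in documents
            if doc["allowed_groups"] & user_groups and query in doc["title"].lower()
        ]

    print(search("roadmap", {"all-staff"}))             # [] - no access
    print(search("roadmap", {"product", "all-staff"}))  # the roadmap document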

Enterprise search as a standalone application for locating documents throughout the enterprise is still going strong, but many search engines are now embedded in applications that people use as their primary work environment. Search solutions are also used on more complex tasks such as locating relevant information found in either databases or diverse document repositories.

New search tools are emerging for searching geospatial data and processing it with other types of information to provide an enriched and informative blend.

SAVO

One of the new enterprise search applications is SAVO.

SAVO focuses on driving greater sales productivity, and one way it does this is by providing the content that sales reps need, in context. Content can be stored in such a way that users can access it either locally or remotely. The SAVO platform provides search capabilities in a content repository, which contains approved information designed to support salespeople as they do their jobs. The search function is configured with a profile for each salesperson to push out the information that will be most useful to each individual.

Using profiles also reinforces the overall selling methodology and model. The information pushed out might include a list of documents for a client meeting, a video clip, information by product line, or a summary of facts about a competitor. To support the sales reps' conversations, SAVO provides context-relevant information.

Although content for different phases of the sales cycle is predefined, if unexpected questions arise, sales reps need to have the right information at their fingertips.

By setting up rules for tagging and reviewing information as it is added to the repository, SAVO is able to keep the right information in front of the sales rep. All the information they need is in the backend. The strategy is not to present the rep with a blank search box. Instead, at the frontend, the rep is asked a series of questions such as what product they are dealing with, what type of document they want, whether it’s a customer-facing presentation or a brochure. In two or three clicks, they have reduced a potential list of 60 documents to about five.
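
The guided-question approach can be pictured as successive facet filters applied to the content library. The sketch below uses hypothetical facets and documents and is not SAVO's actual data model.

    # An illustrative sketch of guided, facet-based narrowing instead of a blank search box
    # (hypothetical facets and content; not SAVO's actual data model).
    content_library = [
        {"title": "Competitor battle card", "product": "analytics", "doc_type": "brochure"},
        {"title": "Analytics pitch deck", "product": "analytics", "doc_type": "presentation"},
        {"title": "CRM pricing sheet", "product": "crm", "doc_type": "brochure"},
    ]

    def narrow(items, **facets):
        """Apply each answered question (facet) as a filter, shrinking the result list step by step."""
        for field, value in facets.items():
            items = [item for item in items if item.get(field) == value]
        return items

    # Two clicks: product line, then document type.
    print(narrow(content_library, product="analytics", doc_type="presentation"))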

Also searchable on the SAVO platform is previously tacit knowledge that has been captured in forums or in comments. Referred to as “tribal knowledge” by SAVO, the content is generated by employees who post questions and answers on a subject matter expert moderated forum within SAVO. That content can be sought on a proactive basis by the sales reps when they have a specific question. Between the profiled information and the responses to ad hoc queries, the sales reps are able to respond with accurate and timely information.

The search function is enabled on mobile devices as well as the desktop. Because sales is both a global and regional activity, SAVO supports multiple languages, including ideographic languages such as Japanese and Chinese, and the search function is available for all the foreign language versions of the product.

The search capability provided by SAVO is useful not only for salespeople in the field but also in onboarding. It helps new employees who are learning the ropes, because they can access instructional material, reference material and conversations posted by more experienced sales reps.

DtSearch

DtSearch is effective because it is a mature product with a very stable API.

Search applications with complex data models pose special challenges. One such example was a search application Contegra built for Carrier Corporation. The number of parts is large, and many parts have revisions to them that have been built over the years. Serial numbers, model numbers and other identifiers must be incorporated. Also, the system had to be able to identify what product the customer originally received because that would affect the replacement part needed. In addition to searching the database, the application must be able to locate technical documentation, brochures and other files in a variety of formats.

DtSearch is able to search across SQL databases and PDF documents, and then search for parts referenced within documents. It sounds like a simple task to recognize part numbers, but when you have 50,000 of them with complex relationships and want to search quickly, using the right search tool is very important.
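
Part-number matching typically depends on normalizing away separators and case differences before comparing values across sources. The following sketch illustrates that idea with hypothetical normalization rules; it is not dtSearch's internal logic.

    # A simplified sketch of matching part numbers across different sources
    # (hypothetical normalization rules; not dtSearch's internal logic).
    import re

    def normalize_part_number(raw: str) -> str:
        """Strip separators and case so 'HC-5012 Rev B' and 'hc5012revb' compare equal."""
        return re.sub(r"[\s\-_.]", "", raw).lower()

    database_parts = {normalize_part_number(p): p for p in ["HC-5012 Rev B", "FN-220", "KT 880-A"]}

    def find_parts_in_text(document_text: str):
        """Report catalog parts whose normalized form appears in a document."""
        flattened = normalize_part_number(document_text)
        return [original for norm, original in database_parts.items() if norm in flattened]

    print(find_parts_in_text("Replace the fan assembly FN 220 and gasket kt880a."))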

It is also able to address the search needs of different users. Engineers see the world as a set of different systems, such as engines and drive systems, which have specific code numbers. There are hierarchical taxonomies, model numbers, and serial numbers. Retrieving the exact information for each installation is critical when the action being taken is a certain type of repair or service. Each user group can have a different perspective on the data that requires it to be searched and viewed differently.

Particularly in specialized applications, the upfront work is critical. You would want to be sure that when the users do a keyword search, the information is presented in a meaningful context, so they can narrow it down to the exact part and the nature of the document, whether it is a technical publication, leaflet or catalog. A good content model enables both the field service staff and external customers to access information in a self-service mode, which cuts costs for the company.

Geospatial Search

Geospatial information has become increasingly important for many different applications and analyses, ranging from marketing to agriculture. Yet the management of geospatial information has lagged behind that of other kinds of information. Although maps and imagery can be stored in a repository like any other digital files and searched according to indexed metadata, the ability to perform more complex searches on the data and process it once retrieved has been limited.

Voyager Search is a search solution designed specifically to manage spatial information. It combines modern search technologies with a unique understanding of geospatial data.

It includes an indexing solution that can extract information from nearly any type of geospatial data, regardless of format. The product later added support for non-spatial documents such as PDFs and various Microsoft Office formats. It also offers a solution that enriches data by linking documents to a map.

Users are able to define a geographical area on a map and then search for relevant information about it. The user can find all the river data or stream flow data in that area, for example, or reports by dragging a box on a map. This ability is not available in other search engines.
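
A "drag a box on the map" search boils down to testing whether each record's coordinates fall inside the user-drawn rectangle. The bare-bones sketch below uses hypothetical records; production systems use spatial indexes such as R-trees rather than linear scans.

    # A bare-bones sketch of a bounding-box ("drag a box on the map") search
    # (hypothetical records; production systems use spatial indexes, not linear scans).
    stream_gauges = [
        {"name": "Clear Creek gauge", "lon": -105.22, "lat": 39.74},
        {"name": "Cache la Poudre gauge", "lon": -105.08, "lat": 40.59},
        {"name": "Rio Grande gauge", "lon": -106.46, "lat": 37.67},
    ]

    def search_in_box(records, min_lon, min_lat, max_lon, max_lat):
        """Return records whose coordinates fall inside the user-drawn rectangle."""
        return [
            r for r in records
            if min_lon <= r["lon"] <= max_lon and min_lat <= r["lat"] <= max_lat
        ]

    # A box drawn around the Denver / Fort Collins area.
    print(search_in_box(stream_gauges, -105.5, 39.5, -104.5, 40.8))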

Voyager Search can manage very large quantities of spatial data. Making data available and accessible while keeping it secure is a big challenge. A geospatially enabled search solution is critical not only to providing online access to geospatial intelligence, but also to broadening the analytical expertise of the organization. Voyager accomplishes that by providing the tools to index a wide variety of content and to add a layer of geospatial intelligence to the index.

The process for narrowing results from a search of geospatial data differs from that of other types of data. Search results from a demographics database, for example, might be narrowed by selecting only one income category. In a spatial search, “refine your search” might mean “view only content in a specific geography” in order to look at one aspect of the results.

Voyager Search can also create new files derived from geospatial and image data. For example, a petroleum engineer could look for seismic shockwave data from a certain time interval for use in an analysis. Emergency responders might look for recent aerial photographs taken in their area and run them through an imagery analysis process to look for hot spots that would indicate the most recent fire perimeter. Voyager Search offers that capability, known as "geoprocessing," which includes a variety of ways to manipulate geospatial data at the desktop.

These are just a few of the new search applications currently on the market. Galaxy Consulting has over 17 years of experience in enterprise search and enterprise search applications.

Sunday, April 30, 2017

E-Discovery Tools

Electronic discovery, or e-discovery, refers to discovery in legal proceedings, such as litigation or government investigations, where the information sought is in electronic format. The ever-increasing amount of litigation, greater volumes of data, and a move toward adding in-house e-discovery capabilities require strong tools for e-discovery.

Data is scattered throughout companies and has become progressively more difficult to manage. Companies are dealing with big data, data in shared repositories such as Box.com, data on mobile devices, etc.

Data must be protected during e-discovery just as it is during any other business activity. The degree of security risk depends on the nature of the data. Standard business contracts might not be highly sensitive and thus create minimal risk, but exposure of intellectual property that represents the crown jewels of a company could be a major risk.

When legal holds are used effectively, companies can meet their preservation duties and then do targeted collections as needed in the case. A good hold process plus targeted collections can significantly reduce the amount of information that must be reviewed by attorneys, a step which accounts for 70 percent of e-discovery costs.

Another value proposition in using an automated legal hold solution that is integrated with collections and first-pass review is the ability to re-purpose a collection.

Cloud offerings could be used to centralize all this data in one place for efficient reuse and risk management.

Several trends are contributing to strong growth in tools for the e-discovery. In addition to a group of large e-discovery vendors, many smaller vendors have products that are working well for their customers, and there is also room for new entrants that improve performance or address specific needs.

Each product has particular strengths, and that wide array offers options that can be used very selectively or in conjunction with each other to meet a company’s goals.

Sometimes, legal holds are required. They are required when a company might reasonably expect litigation and therefore should not delete information that might be relevant to that litigation.

Legal Hold Pro

This application has templates built into the system and a database with contact information for the employees who are custodians of data. The system can also be used to track the information and people affected, automate the interviews with custodians, send reminders, and release holds when appropriate. It allows users to check the information of terminated employees to see if it might be subject to a hold, and to review responses from custodians in order to create the collection plan.

The same collection and review tagging could be used again by adding only the incremental data generated since the original collection.

As a cloud product, Legal Hold Pro is quick and easy to launch, and is updated frequently.

Technology-assisted review (TAR)

Once a set of documents that may be responsive to the e-discovery request is located, it needs to be searched. The effective use of human skills in conjunction with computer capabilities is a key ingredient in reducing the volume of data that needs to be reviewed by attorneys or other legal professionals.

Technology-assisted review (TAR), also called predictive coding, is a method for training a computer to spot documents that may be relevant and distinguish them from those that are not.

Catalyst

Catalyst provides e-discovery software and services.

Catalyst Insight is a secure cloud-based platform where clients can search, review, mark and produce documents. It can be augmented with Insight Predict, a predictive ranking TAR 2.0 solution that uses continuous active learning (CAL) to speed the review process by allowing technology to work alongside the judgments that human reviewers make. The solution brings the most relevant documents to the top of the list rather than working in a linear fashion.

The company’s TAR 2.0 software is specially designed for e-discovery. Some of the early TAR products were re-purposed machine learning tools. They can work in situations where the target documents are a large proportion of the total, but if you are looking for the one percent that are ‘hot docs,’ then they are not as effective. With TAR 2.0, attorneys and legal professionals who are subject matter experts do the initial coding for relevancy. Each of their judgments about the relevancy of a document is fed back to the system as a means of “training” to identify others that also might be relevant.

In the case of earlier versions of TAR, adding new documents caused the random sampling assumptions to no longer be correct. Unlike earlier products, which had a finite learning phase and then a production phase, TAR 2.0 allows new coding to be immediately incorporated into the algorithm for searching the document repository so that it is correctly tuned to the current problem domain.

It allows every decision made by an attorney to be put to maximum use, allowing humans to do what they do best, and then let the computer do what it does best, which is to quickly surface the relevant documents.
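
A continuous active learning loop can be pictured as: train on the judgments made so far, rank the unreviewed documents, surface the best candidate, record the reviewer's decision, and repeat. The sketch below illustrates this with scikit-learn and toy data; it is a generic illustration of CAL, not Catalyst's Insight Predict algorithm.

    # A conceptual sketch of a continuous active learning (CAL) loop using scikit-learn
    # (toy data and a generic classifier; not Catalyst's Insight Predict algorithm).
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression

    documents = [
        "merger discussion with outside counsel",
        "lunch menu for the cafeteria",
        "draft term sheet for the acquisition",
        "fantasy football league standings",
        "due diligence checklist for the deal",
        "office holiday party planning",
    ]
    vectorizer = TfidfVectorizer()
    features = vectorizer.fit_transform(documents)

    labels = {}                    # doc index -> 1 (relevant) or 0 (not relevant), from attorney review
    labels[0], labels[1] = 1, 0    # seed judgments

    for _ in range(2):             # each pass: retrain, then surface the highest-ranked unreviewed doc
        model = LogisticRegression().fit(features[list(labels)], [labels[i] for i in labels])
        scores = model.predict_proba(features)[:, 1]
        unreviewed = [i for i in range(len(documents)) if i not in labels]
        best = max(unreviewed, key=lambda i: scores[i])
        print("Review next:", documents[best])
        # Simulated attorney judgment; in practice a human reviewer codes the document.
        labels[best] = 1 if "deal" in documents[best] or "acquisition" in documents[best] else 0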

One practical limitation of early versions of TAR was that they could not handle small volumes of documents, because the usual percentage of samples did not provide enough examples from which the computer could learn. This was improved in later versions of the tool.

Recommind

In 2006, the federal rules for discovery changed to include discovery of electronic information. E-discovery includes the collection, processing and analysis of e-mail and other electronic documents that might be relevant to a case, including determination of whether the documents are indeed relevant.

What sets Recommind apart from many industry solutions is its ability to prioritize records and pull together similar records.

Recommind’s Axcelerate product can research, collate and assemble electronic records into reports. The electronic records for a single case can sometimes number into the millions.

Axcelerate’s adaptive batching expedites the feedback loop on search or analytics-based document sets, making continued batching not just automatic, but also conditional on the relevancy found through sampling. That enables a law firm to determine by batch if certain records are indeed relevant to a case, rather than reviewing them individually.

Magnum Software

Opus 2 Magnum allows users to quickly search, annotate, and link to portions of documents. The collaboration capability is quite robust. Users can share their work product with any other user or group of users via a one-click e-mail alert.

The alert automatically includes a direct link to the note and passage so the recipient can log in from anywhere, review the remarks and continue the discussion thread within Opus 2 Magnum. Additionally, multiple users can “chat” within the application.

The application works best with smaller files rather than loading everything into one large database, but Magnum can also scale to larger file sizes.

Exterro

This is an excellent tool for e-discovery. It addresses e-discovery and other records management needs in a single platform. A Genome data mapping module can be added, which creates an excellent solution for data mapping.

With the increasing number of records and need to keep track of them and pull them together efficiently, the demand for KM technology for records and information management will continue to grow.

Galaxy Consulting has 17 years of experience in ensuring that the e-discovery process goes smoothly.

Sunday, January 22, 2017

Five Trends of Knowledge Management

Many issues affect knowledge management. The five most important are big data, cybersecurity, mobility, social analytics, and customer engagement.

The availability of big data has opened many options for understanding everything from customer preferences to medical outcomes.

Amidst all that data, concerns about security have grown, so cybersecurity is taking on new importance. Mobility has become pervasive and affects nearly every element in KM, while social analytics is providing insights at a personal level that were never possible before.

Finally, although those four factors feed into many KM objectives, enhancing customer engagement has taken a place at the top of the priority list for virtually every company and is likely to remain there for some time.

Big data

The most dramatic trend impacting knowledge management is harvesting and analyzing big data. An esoteric phenomenon just a few years ago with a new set of technologies and terminology, big data is now wrapped into the strategic plans of many organizations, and not just the big ones.

There are a few applications that help with this challenge.

One of them is Hadoop. It can help integrate complex sets of data to support business decisions and marketing efforts.

Actian Analytics Platform is a big data analytics solution that is accessible and affordable for small businesses, but also scalable to large ones. It can be used to target the right customers. It can also be used to generate an economic case for potential buyers.

For example, Yahoo uses Actian to segment millions of users across 10,000 variables, looking for clues that will help predict customer behavior. Amazon uses Actian to provide the core technology components for its cloud-based data warehouse.

The technology can pull together diverse data in near real time as it flows through the data pipeline, supporting marketing, customer engagement, risk assessment, and many other applications. At both ends of the spectrum, from startups to large-scale users, big data is the central force in converting large amounts of data into decision-supporting information.

Cybersecurity

With so much information at large, unauthorized access to it has the potential to be destructive. Knowledge management is focused on information. What makes KM so important is that people can get information and analyze it better. In the past, it was hard to find out who was buying products and how they felt about them. Now an enormous amount of information is available, which has benefits. But the information can also be stolen and used for financial gain.

The cybersecurity market is expected to increase from $95.6 billion in 2014 to $155.7 billion by 2019, resulting in a 10.3% per year increase during that time period. This amount includes network, endpoint, application, content and wireless security as well as many other types of technology. Innovative products are emerging in response to increased threats.

The volume of data, including an entire new collection from the Internet of Things, the challenges of mobile devices, greater use of the cloud for data storage and the broad impact of consumer concern are all sparking the growth.

Cybercrime comes in many forms, from stealing credit card numbers out of a merchant’s database to identity theft of consumers. A common strategy is for a cyberthief to obtain some publicly available information about an individual and use it to open an account or figure out a password that provides them access to an account. Users need to be vigilant about changing their passwords and making them strong. Technological safeguards can be put into place, but security depends a great deal on the human effort.

Mobile devices add another element of risk. They are much easier to lose or to steal, and often contain sensitive information such as bank passwords. Technological advances such as the ability to remotely disable a phone will continue to emerge to protect users from the impact of cybertheft. However, the result of users being careless with physical security, such as leaving a laptop in an unlocked car, remains a threat.

Companies can mitigate the impact on their customers by limiting the responsibility of users in the event of fraud or identity theft. Industries are growing up around providing insurance for such scenarios, either to the merchant or the customer.

Mobility

Although mobility brings hazards, it has brought even more advantages, and it will continue to drive the pervasiveness of knowledge management. Increasingly, knowledge management solutions, including content management, process management and analytics, have mobile versions of the solution. No longer a miniaturized version of the desktop browser, mobile apps are delivering usable KM applications.

Mobility is also forging new paths. For example, Apple Pay allows use of the smartphone as a wallet.

One mistake merchants make in designing mobile apps is to try to duplicate a physical purchase experience on a mobile device. Merchants should not necessarily automate an existing process, but instead should look at the experience holistically. Mobile experiences have to be simpler and as good as, if not better than, the non-mobile experience in order to gain loyalty from the customer.

Barriers remain in the use of mobile devices for enterprise applications, but the barriers also represent opportunities. In a study of U.S. and U.K. information technology decision makers conducted by Vanson Bourne, respondents reported that although more than 400 enterprise applications were typically deployed in each organization, only 22% of them could be easily accessed on mobile devices.

One reason for that is the diversity of enterprise applications. Some are custom, some are SaaS and some are off-the-shelf, and the technology for accessing each one is different. Therefore, development of mobile apps for such applications is needed, but organizations are hampered by the high cost. More efficient development techniques would be a big benefit.

The proliferation of mobile devices has also boosted a number of other supporting sectors besides mobile application management (MAM), including mobile content management (MCM) and mobile device management (MDM). Each of them has a touchpoint to knowledge management and should be viewed in conjunction with an overall KM strategy.

Social analytics

Social analytics is a booming market that is expected to triple over the next five years to nearly $9 billion, a growth rate of nearly 25% per year. Initially based on simple counts of the number of times a brand was mentioned in social media, analytics has evolved to the point where it uses sophisticated algorithms that support the use of social data for targeted marketing and for initiating customer service.

Social analytics moved from hindsight to insight and now to foresight, with predictive capabilities. SAS social media solutions include integration and storage of social data, general text analytics and analysis of comments for sentiment, and a social conversation module that can work directly or integrate with third-party engagement solutions.

Real-time analysis allows marketing or brand campaigns to be synchronized with the topic threads that are emerging. Decision trees allow "what-if" scenarios, such as the impact of increasing the frequency of an ad or of combining customer segments. These analyses allow users to determine the relationships among various factors and to present visualizations of those relationships for better marketing decisions.

The value of social media analytics is also increased by combining it with data such as purchasing information from the data warehouse, to compare customers’ stated intentions with actual behavior. There is tremendous growth in analyzing social media information along with data from the Internet of Things which measures physical activity to build a profile not just of transactions but of tone and behavior along the customer journey.

Social media analytics should not be isolated. The information should be tightly connected to upstream data so different departments can use it to drive the customer experience.

Customer engagement

The driving force for all of the above is customer engagement - collecting and managing big data, keeping information secure, enabling mobility and analyzing social media inputs. The ultimate goal is to engage the customer, whether for marketing, customer support, participation in loyalty programs or some other outcome.

The key for customer engagement is omni-channel. Whether the interaction is initiated by the customer or the organization, customers want options in the delivery channels.

Customer engagement is not a static business area. The feedback obtained through social analytics and traditional business intelligence can be merged to explain both what customers are doing and why. That information can guide the delivery of marketing materials and help provide better customer service.

Galaxy Consulting has 17 years of experience in knowledge management. We have led knowledge management initiatives. Contact us for a free consultation.

Saturday, December 31, 2016

Search in the Land of Information Silos

Information access and retrieval within most organizations is a work in progress. There might be a general search system for marketing information, and probably one or more database search systems.

The larger the organization, the greater the number of information retrieval systems. Each laptop and mobile device has a search system. Mobile phone apps sport their own search systems. The lawyers in an organization may have different search systems for specific types of legal matters. The enterprise resource planning (ERP) users have a search system. When it comes to enterprise search, there are many silos.

A "silo" is a content collection available only to certain users. In the face of the reality of silos, the idea of providing access to "all" information might be impractical. "All" may not mean all, or even most, of the available information. Big data is easy to talk about but difficult to make accessible. The same challenge exists for images, audio recordings, and engineering drawings whose details are hidden in a proprietary system's database.

Search that is variously called universal, unified, or federated search is one solution to the challenge of information silos. The term meta-search is often used to describe an integrating function that passes the user's query across discrete content indexes and returns a single results list to the user. Endeca, Inxight Software, Northern Light, Sagemaker, and Vivisimo are search applications that can be used for universal, unified, or federated search in an organization.

The initial query might not unlock the information stored in the system's index. Facets, topics, and suggestions make it easy for the user to click through the links without having to craft additional queries.

Behind the scenes, federated search requires some maintenance. A user does not want to know the file format in which the information he or she needs is stored. The user wants answers. Early federating systems like WAIS relied on standards for content representation. Today, however, there are many "standards," and content processing systems must be able to process content in the hundreds of formats found in organizations.
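
At its simplest, a federated search layer fans one query out to several repository connectors and merges the hits into a single ranked list. The sketch below uses hypothetical connectors and scores; a real system must also normalize relevance scores across sources.

    # A skeletal illustration of federated (meta) search: one query fans out to several
    # repository connectors and the results are merged into a single ranked list
    # (hypothetical connectors and scores; real systems also normalize relevance across sources).
    def search_sharepoint(query):
        return [{"source": "SharePoint", "title": "Brand guidelines 2016", "score": 0.72}]

    def search_file_share(query):
        return [{"source": "File share", "title": "Brand assets folder index", "score": 0.65}]

    def search_dam(query):
        return [{"source": "DAM", "title": "Logo pack (EPS, PNG)", "score": 0.88}]

    def federated_search(query):
        """Query every silo, then merge and sort the combined hits for a single results page."""
        results = []
        for connector in (search_sharepoint, search_file_share, search_dam):
            results.extend(connector(query))
        return sorted(results, key=lambda hit: hit["score"], reverse=True)

    for hit in federated_search("brand logo"):
        print(f'{hit["score"]:.2f}  [{hit["source"]}] {hit["title"]}')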

It is important to deliver a system that makes an organization’s disparate types of digital content available.

There are barriers to unified, federated or integrated search.

Some digital content cannot be included in a general purpose search system for security, business or legal reasons. Technical content such as chemical structure information at a pharmaceutical company requires special purpose systems. The same need applies to product manufacturing data, legal information and engineering drawings.

Most search applications exclude video streams from the index. If video is indexed, the system processes the text included in the digital file or indexing provided by the video owner.

The cost of creating connectors to connect with certain content types could be too high, or license fees could be required to gain access to the file formats.

The computational burden required to process certain types of content might exceed the organization’s ability to fund the content processing. Big data, for example, requires a computing capability able to handle the Twitter stream, RSS feeds and telemetry data from tracking devices. Cost could be prohibitive for processing all content types.

The most important challenge is the need for confidentiality. The legal department does not want unauthorized access to information related to a legal matter out of its control.

Some government contracts require that, for certain types of government work, the information related to the project be kept separate. Common sense dictates that plans for a new product and its pricing remain protected. If someone needs access to that information, a different search system may be used to ensure confidentiality.

Even in the absence of business or legal requirements, some professionals do not want to share content. That may be a management problem. When a manager locks up information in a no-access silo, a software script will skip the flagged server.

To summarize, silos of information present a challenge for organizations to process and use effectively. In the enterprise, integration should take place across silos of content.

Galaxy Consulting has 17 years of experience in integrating information silos using universal, unified, or federated search. We have experience with these search applications. Contact us for a free, no-obligation consultation!

Wednesday, November 30, 2016

Three Values of Big Data

Big Data is everywhere. But to harness its potential, organizations should understand the challenges that come with collecting and analyzing Big Data. 

The three values that are important in managing big data are volume, velocity, and variety. These three factors serve as guidance for Big Data management, highlighting what businesses should look for in solutions.

But even as organizations have started to get a handle on these three V's, two other V's, veracity and value, are just as important, if not more so.

Volume is the ability to ingest, process, and store very large data sets. The definition of "very large" varies by business and depends on the particular circumstances of the business problem, as well as on the volumes previously handled by that business.

Volume can also be defined as the number of rows, or the number of events that are happening in the real world that are getting captured in some way, a row at a time. Accordingly, the more rows that you have, the bigger the data set is going to be.

Bigger Volumes, Higher Velocities

In today’s digital age, having huge volumes of data is hardly rare. The proliferation of mobile devices ensures that companies can gather more data on consumers than ever before, and the rise of the Internet of Things will only increase this plethora of data. Moreover, businesses will have even more information on customers as they begin to use one-on-one messaging channels to interact directly with them.

The sheer volume of data available to us is greater than ever before. In fact, in many ways, nearly every human action can be quantified and logged in a bank of data that’s growing at an incredibly fast rate. All of this data can be turned into actionable insights that drive business decisions and can help transform every customer interaction, create operational efficiency, and more.

This increase in data volume is paired with a simultaneous increase in speed: both the volume of data and the rate at which it arrives are growing. These increases have forced IT staff to spend more time figuring out how to process and analyze that data.

Velocity is the key V of the three V’s. For example, a customer will visit a company’s site or use its mobile application but only for a short amount of time. The business may have just seconds to gather customer information and deliver a relevant response based on that information, usually just one message or offer.

This quick turnaround time requires you to process all of that real-time behavioral data as fast as possible. If you only learn the day after that your customer was on your Web site, you can no longer reach them at the right moment. One aspect of a successful customer journey is being able to send the right message at the right time to the right customer. Timeliness and relevancy are the foundation of delivering personalized customer experiences in real time.
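
One way to picture this is an event handler that must choose a relevant message within a tight time budget while the visitor is still in the session. The sketch below is purely illustrative, with a hypothetical event and offer catalog; in practice this logic runs inside a streaming or decisioning platform.

    # An illustrative sketch of reacting to a behavioral event while the visitor is still on the site
    # (hypothetical event and offer catalog; real systems run this inside a streaming platform).
    import time

    offers_by_interest = {
        "running-shoes": "10% off trail runners today",
        "laptops": "Free sleeve with any laptop purchase",
    }

    def handle_event(event: dict, budget_seconds: float = 2.0):
        """Choose a relevant offer quickly; if the time budget is blown, send nothing rather than a late message."""
        started = time.monotonic()
        offer = offers_by_interest.get(event.get("viewed_category", ""))
        if offer is None or time.monotonic() - started > budget_seconds:
            return None
        return f'Show "{offer}" to visitor {event["visitor_id"]}'

    print(handle_event({"visitor_id": "A1029", "viewed_category": "running-shoes"}))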

A Variety of Formats

Data sets come in a variety of formats, and the number of data types continues to grow. Radio-frequency identification (the use of electromagnetic fields to gather information from tags attached to objects), smart metering (devices that monitor information on energy consumption for billing purposes), and the ubiquity of mobile devices with geo-location capabilities are only a few examples of the diverse sources of consumer information.

All of these technologies have their own methods of capturing and publishing data, which adds to the complexity of the information environment.

But overcoming these data complexities could be well worth it. Having a large variety of data is crucial for creating a holistic customer view. Access to data such as a customer’s purchasing history, personal preferences based on social media postings, exercising habits, caloric intake, and time spent in the car can help companies understand that customer on a deeper level, and thus build experiences that are tailored to that customer.

But this diversity of data sources can be a blessing and a curse. A blessing because businesses have an increasingly large range of channels from which to pull customer information, but a curse because it can be difficult to filter through that information to find the most valuable content.

Variety is somewhat overstated in discussions of Big Data. Audio and video are examples of data formats that can be particularly difficult to analyze. Usually, companies try to come up with an intermediate representation of that data and then apply existing or new algorithms to that representation to extract signals, whatever "signal" means for the business problem they are trying to solve.
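
As a rough illustration of what such an intermediate representation might look like, the sketch below converts an audio clip into MFCC features using the librosa library, producing a fixed-size numeric vector that conventional algorithms can work with. The file name is a placeholder, and this is an assumption for illustration only, not a description of any particular vendor's pipeline.

# A minimal sketch: turn raw audio into an intermediate numeric representation
# (MFCC features) that standard analytic algorithms can consume.
# Assumes the librosa and numpy packages; "speech_sample.wav" is a placeholder file.
import librosa
import numpy as np

# Load the audio clip (librosa resamples to 22,050 Hz by default).
signal, sample_rate = librosa.load("speech_sample.wav")

# Compute Mel-frequency cepstral coefficients: a compact representation
# of the audio's spectral content over time.
mfcc = librosa.feature.mfcc(y=signal, sr=sample_rate, n_mfcc=13)

# Collapse the time axis into a single fixed-length feature vector,
# which can then feed clustering, classification, or search.
feature_vector = np.mean(mfcc, axis=1)
print(feature_vector.shape)  # (13,)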

Volume, velocity, and variety are undoubtedly important to managing customer information, but companies should also keep other aspects of big data in mind if they want to make the most of it.

Data tools such as Apache Hadoop and Apache Spark have enabled new methods of data processing that were previously out of reach for most organizations. While the growing volume of data, the time needed to process it, and the sheer number of input sources pose challenges for businesses, all three can largely be addressed through technology.
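
As a purely illustrative sketch of the kind of processing these tools make practical, the PySpark snippet below aggregates a large set of customer events by channel. The input path and column names are assumptions, not part of any specific deployment.

# Illustrative PySpark sketch: aggregate a large event data set by channel.
# The input path and the "channel" and "revenue" columns are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("customer-events").getOrCreate()

# Spark distributes reading and processing across a cluster, so the same
# code works whether the file holds thousands or billions of rows.
events = spark.read.parquet("s3://example-bucket/customer_events/")

summary = (
    events.groupBy("channel")
          .agg(F.count("*").alias("event_count"),
               F.sum("revenue").alias("total_revenue"))
)

summary.show()
spark.stop()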

New V's Emerge

Investment in Big Data has begun to stabilize and enter a maturity phase over the past year. It will take time for infrastructure and architectures to mature, and best practices should be developed and refined against these architectures.

Organizations should consider how to use Big Data to bring about specific outcomes; in other words, they should examine the challenges of Big Data from a business perspective rather than a technical one. A framework that incorporates the business-oriented characteristics of veracity and value can help enterprises harness Big Data to achieve specific goals.

Not all data is the same, but organizations may not be paying enough attention to changes within individual data sets. Contextualizing the structure of the data stream is essential. This includes determining whether it is regular and dependable or subject to change from record to record, or even with each individual transaction. Organizations need to determine how the nature and context of data content in all its forms (text, audio, or video) can be interpreted in a way that makes it useful for analytics.

This is where veracity, the trustworthiness of data, comes in. Determining trustworthiness is particularly important when it comes to third-party data, which typically passes through a set of edits and validation rules before it is used.

Veracity entails verifying that data is suitable for its intended purpose, and usable within a given analytic model. Organizations should use several measurements to determine the trustworthiness and usefulness of a given data set. Establishing the degree of confidence in data is crucial so that analytic outputs based on that data can be a stimulus for business change.

Important metrics for evaluating and cleaning up data records include the following (a rough sketch of how the first two might be computed appears after the list):
  • completeness measurements, or the percentage of instances of recorded data versus all available data within a business ecosystem or market (or the percentage of missing fields within a data record);
  • uniqueness measurements, or the percentage of alternate or duplicate data records;
  • accessibility measurements, or the number of business processes and personnel that can benefit from access to specific data, or that can actually access that data;
  • relevancy measurements, or the number of business processes that utilize or could benefit from specific data;
  • scarcity measurements, or the probability that other organizations, including competitors and partners, have access to the same data (the scarcer the data, the more impact it has).
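As a rough sketch of how completeness and uniqueness might be measured in practice, the snippet below uses the pandas library on a tiny, made-up customer table; the column names and example records are assumptions for illustration only.

# A rough sketch of completeness and uniqueness checks with pandas.
# The customer records and column names here are made up for illustration.
import pandas as pd

customers = pd.DataFrame({
    "customer_id": [1, 2, 2, 3, 4],
    "email": ["a@example.com", None, None, "c@example.com", "d@example.com"],
    "city": ["Boston", "Paris", "Paris", None, "Lyon"],
})

# Completeness: percentage of non-missing fields per column.
completeness = customers.notna().mean() * 100
print(completeness)

# Uniqueness: percentage of duplicate records (here, duplicated customer_id).
duplicate_pct = customers.duplicated(subset="customer_id").mean() * 100
print(f"Duplicate records: {duplicate_pct:.1f}%")
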
Value is Paramount

While veracity can’t be overlooked, value is the most important factor. The first three V’s are really about architecture, infrastructure, and the representation of data, things that matter to IT organizations but are far less interesting to business stakeholders.

The business stakeholders really don’t care about the first three; they only care about the value they can extract from the data. Executives often expect the analytical teams at their organizations to hide the first three V’s (volume, velocity, and variety) and only deliver the last V - the value that is fundamental to the success of the business.

The concept of value is essential for organizations to succeed in monetizing their data assets. Value is a property that helps identify the purpose, scenario, or business outcomes that analytic solutions seek to address. It helps to confirm what questions are to be answered and what actions will be taken as a result, and defines what benefits are anticipated from collecting and analyzing the data.

Value is a motivating force when it comes to developing new and innovative ideas that can be tested by exploring data in different ways.

The ability to pull valuable information from Big Data and use that information to build a holistic view of the customer is absolutely critical. It’s no longer just an option to develop one-to-one relationships with customers; it’s a requirement. And to build that relationship, companies have to leverage all the customer information they can to personalize every interaction with them.

By using such information to lead customers on a personal journey, companies can help ensure that customers will stay with them long term, and even become brand advocates. Value is derived from making the data actionable. Organizations can have all the information about a customer, but it’s what they can do with it that drives value for the business.

The Three V’s model of volume, velocity, and variety is useful for organizations that are just beginning to take control of their data, and certainly should not be forgotten by enterprises that have advanced further in their management of customer information.

The first three V’s are equally important. In the digital age, companies have accumulated more data than ever before, are pulling data from a variety of sources, and are seeing that data flow at ever higher rates. A combination of these three factors can help organizations create relevant, personal, one-on-one customer interactions.

Deriving value is the ultimate business goal for any enterprise. The standard Three V’s model does not satisfactorily identify any data properties from a business usage perspective. Even though Big Data, and data in general, provides organizations with a lot of capabilities, the challenge for businesses is to make sure that they adapt how they think about the business processes, how they report on them, and how they define key performance indicators.

Organizations should try to get to the value. They need to turn data into value by figuring out how to use it to optimize business processes. In the end, the Three V’s model for Big Data is a useful starting point. But then it becomes about the ultimate goal, the one organizations must not lose sight of: driving value.

Galaxy Consulting has 17 years of experience in big data management. We are at the forefront of driving value from big data.

Friday, November 11, 2016

Voice Search

Roughly 56% of teenagers and 41% of adults use voice search on their mobile phones every day, according to Northstar Research. 

For example, modern consumers in Boston are much more likely to ask Google Now, Siri, Cortana, or Amazon’s Alexa to find the nearest coffee shop than they are to type "coffee shops near Boylston Street in Boston" into a search bar on Google's homepage.

This truly creates a challenge for search engine providers and for the providers of those personal assistants. But as consumers increasingly turn to voice search on their mobile phones, those Boston coffee shops will now have to rethink their search engine optimization (SEO) strategy if they hope to show up in voice search results.

Search is a science, and the rules are different for text and voice search.

It will only become more vital that business data on a company web site, such as store locations, hours, and contact information, is accurate and up to date. Businesses will also need to make sure that they are portrayed accurately on local review sites like Yelp.

Companies also need to consider how and where consumers are conducting their voice-based searches. When using computer-based search, it's assumed that a user is at a computer, so there is more screen space and more time to search. When using a mobile device, it's assumed a user is out, time is short, and the user needs access to quick bits of information on a small screen.

Therefore web sites need to be designed so that they dynamically adjust to fit any screen the consumer is using.

Google and Bing use mobile-friendliness as a ranking factor in their SEO algorithms. Both Microsoft and Google offer tools to help companies determine whether their sites are mobile-friendly. The tools look at factors such as loading speed, the width of page content, the readability of text on the page, the spacing of links and other elements on the page, and the use of plug-ins.

When it comes to voice search, web content that delivers the answers consumers want, in the quickest way possible, will ultimately win. The information should be concise and to the point, with more of an emphasis on usefulness than visual appeal.

Web content should be presented in more of a natural, conversational style and structured more like FAQs, answering the questions consumers might pose in voice search queries without requiring them to click on additional links or take other actions. Voice searches might be initiated in the car while someone is driving.

Companies also need to consider how consumers ask for information through voice search. More often than not, voice search queries are phrased using the same types of who, what, when, where, how, and why questions that are part of natural conversations. During these conversational, natural language search queries, consumers do not typically use the same keywords or metadata that are the hallmarks of text-based searches. Using basic keywords to set SEO parameters alone is no longer enough.

Use long-tail keywords. Rather than a single word or short phrase, long-tail keywords are longer, more specific phrases that closely match whatever the company is selling.

Companies need to teach their systems and the search engines a very specialized lexicon that corresponds to their product and service names.

SEO rules keep changing and so SEO strategy needs to change accordingly.

Galaxy Consulting has many years of experience with search. Contact us for a free consultation.

Sunday, October 2, 2016

Content Localization

Content is the fuel of your organization. Content spills into all areas of an organization and fuels absolutely everything with which customers interact, whether it's a commerce web site, marketing or sales materials, or any other channels.

Preferences for content vary by continent. Companies looking to expand operations overseas will inevitably face cultural and language barriers. They will have to think about how content is resonating with global audiences and they will face the challenge of determining which content to adjust to meet the needs of local markets.

To help ensure a quality customer experience, the top global brands employ a localization strategy to adapt their online content for regional specificity. Content should clearly resonate with companies' international audiences.

Content localization is the process of modifying content to make it usable for a new locale. Often, this includes translating the content from the source language into the language used in the locale. It is an integral part of adapting a product for a particular market. For example, if a company currently sells products in the US and is expanding into France, it would need to translate its content into French.

Content translation and localization are not the same. Translation is the process of changing an original language of content into a different language. Content localization is a more specialized process of adapting content for regional or local consumption. It goes beyond translation to modify the source language and other site elements to appeal to the customer’s cultural preferences in their own target language. Localization is about refining your message and curating your brand to meet the cultural, functional, and language expectations of your global markets.

Why localize content? There are a few reasons:
  • cater to local customers - to show your local customers that you care about them and to bring more value to your products;
  • reduce risk - to avoid liability by not using words that might be offensive in a different country;
  • enhance marketing - having a great campaign or advertisement is useless if your target audience doesn’t speak English well enough to understand your marketing message;
  • increase sales - localized content sells more often because consumers can fully understand what they are buying.
Content localization includes other processes in addition to translation. For example, U.S. and French currencies are different, so you might need to change references from dollars to euros. It is also possible to localize information without translating it, for example by changing examples, currency, geographically specific information, and perhaps color schemes.
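
As a small illustration of this kind of non-translation localization, the sketch below uses the Babel library to render the same price and date for a U.S. and a French locale. The amount and date are placeholder values, and the commented outputs are approximate.

# A small sketch of locale-specific formatting with the Babel library.
# The price and date are placeholder values; outputs shown are approximate.
from datetime import date
from babel.numbers import format_currency
from babel.dates import format_date

price = 1299.99
release = date(2016, 10, 2)

# Same content, adapted per locale: currency symbol, separators, and date format change.
print(format_currency(price, "USD", locale="en_US"))  # $1,299.99
print(format_currency(price, "EUR", locale="fr_FR"))  # 1 299,99 €
print(format_date(release, locale="en_US"))           # Oct 2, 2016
print(format_date(release, locale="fr_FR"))           # 2 oct. 2016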

These are standard best practices that help to minimize the cost of localization:

Content Development:
  • source text should use clear, simple, concise language;
  • do not use jargon, idiom, metaphors, or other creative language;
  • use simple words;
  • use simple language structure;
  • use consistent terminology;
  • limit text in graphics, and put text into an editable layer;
  • use legends rather than embedded text for graphics.
Formatting:
  • use predictable structure;
  • use styles or structured authoring;
  • do not use embedded formatting.
Delivery:

Include a language flag on content so the system can deliver content to readers in the language they want.
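
One way such a language flag might work, as a purely illustrative sketch: each content record carries language-tagged variants, and the delivery layer picks the reader's language with a fallback to the source language. The structure, tags, and text below are assumptions for illustration.

# Illustrative sketch of delivering content by language flag.
# The content records and language tags are hypothetical.
content = {
    "welcome-banner": {
        "en": "Welcome to our store!",
        "fr": "Bienvenue dans notre boutique !",
        "de": "Willkommen in unserem Geschäft!",
    }
}

def deliver(content_id: str, reader_language: str, fallback: str = "en") -> str:
    """Return the localized text for a reader, falling back to the source language."""
    variants = content[content_id]
    return variants.get(reader_language, variants[fallback])

print(deliver("welcome-banner", "fr"))  # French variant
print(deliver("welcome-banner", "es"))  # No Spanish variant, falls back to English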

Localization:
  • use professional translators;
  • use translation memory systems;
  • work with localization professionals to identify potential problems;
  • look for opportunities to change source content instead of requiring changes in each target language.
When creating graphics, make sure that they are culturally neutral and acceptable for a global audience. These are the best practices for graphics:
  • use color carefully because the meaning of colors varies across cultures;
  • images of people are not appropriate in some cultures, and in others, images of women are not acceptable;
  • images of hands or feet are not appropriate in some cultures;
  • when designing icons, make sure that they are universally understood.
Some examples of cultural content include:
  • colors, shapes, sizes, styles;
  • images, icons, graphics;
  • societal codes, i.e., humor, etiquette, rituals, myths, symbols;
  • societal values, power, relationships, beliefs.
Some examples of functional content include:
  • date and time formats, telephone numbers, contact information;
  • weights, measurements, geographical references;
  • language and linguistic content; product descriptions, reviews.
Content translation and localization also differ on a tactical level. While simple translation may be appropriate for some content types in certain markets, localization is most often required for adapting highly emotive, creative marketing content so that it clearly resonates across locales.

There are several content types, from marketing collateral to legal and technical information and user-generated forum content. For reasons of efficiency and cost, it is best practice to map these content types to the most appropriate translation or localization methods.

Not all of a company's content needs to be translated. Companies operating from a single location should make it a priority to determine which content should be tailored for various regions. For example, blogs and tweets are typically not as significant as marketing and training materials, help sheets, business forms, support email, and FAQs.

It is generally easier to select the best fit when you consider your audience and the content’s source and intent. Other parameters include volume, update frequency, content lifecycle, and budgetary considerations. Depending on your language service provider’s (LSP) capabilities, there are several methods from which to choose. When making these decisions, it’s best to consult an experienced LSP that offers a wide range of services.

Following these best practices will help to reduce the overall cost of localization. Translators very often use translation memory tools, and following these practices helps those tools recognize recurring patterns.

Companies can try to discover which locales and languages provide the greatest return on investment, for example by identifying the languages that account for roughly 90% of global online economic opportunity.

Be sure to follow industry best practices to reduce complexity, speed time-to-market, control costs and ensure quality localized content for all of your global markets. Companies will need to adopt stronger content strategies, keep closer tabs on the ways in which customers are consuming content, and take into consideration consumer preferences based on geographical region and digital trends.

Achieving global excellence means improving local experiences and getting better insights into those markets.