Tuesday, February 24, 2015

508 Compliance and Content Management

Section 508 compliance can seem tricky and confusing, but implementing it in content management systems is vital for many organizations. According to the United States Census Bureau, about one in five Americans has some sort of disability. That is far too large a group of people to ignore.

All federal and state agencies of the U.S. government are required to meet Section 508 standards for accessibility. The law, enacted in 1998 as an amendment to the Rehabilitation Act, requires that all technology used by the federal government be accessible to people with disabilities. This includes those with visual, auditory, physical or cognitive impairments.

Assistive Technology

Those with disabilities often use assistive technology. This includes tools like screen readers for visually impaired users. JAWS and Microsoft Narrator are two popular software titles used for this purpose.

The Standards of Section 508

There were 16 standards developed to make information and technology accessible to users with disabilities. These standards are the core requirements to making your content section 508 compliant with the federal government. There is no one automated solution that will magically make your site compliant, unfortunately.

These days, it is still a mix of manual and automated processes that drive a site’s compliance. Among the key requirements:

  • Sites must be usable by keyboard alone (without a mouse).
  • All screens should work with screen readers, with alt text and descriptions provided for images.
  • Closed captioning (or transcripts) should be available for audio and video content.
  • Online forms should be able to be completed using assistive technology or the keyboard alone.
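One small slice of that mix can be automated. As an illustration only (not a full compliance audit), here is a minimal sketch using Python's standard html.parser that flags images missing an alt attribute; the sample HTML is hypothetical:

```python
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Flags <img> tags that lack an alt attribute -- one small,
    automatable piece of a Section 508 review."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            # Note: an empty alt="" is allowed for decorative images,
            # so we only flag images with no alt attribute at all.
            if "alt" not in attrs:
                self.missing.append(attrs.get("src", "(no src)"))

checker = AltTextChecker()
checker.feed('<p><img src="logo.png" alt="Company logo">'
             '<img src="chart.png"></p>')
print(checker.missing)  # → ['chart.png']
```

A real review would still need a human to judge whether the alt text is actually meaningful, which is exactly why compliance remains a mix of manual and automated work.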

Good Tips for Usability

Good technology applies great user experience, user interaction and design principles. This is often referred to as UX/UI in the creative and IT industries. The two terms evolved from the guidance involved in usability principles, an area of expertise that has grown over several decades.

Good usability means the user is able to quickly perform a task with little to no difficulty. This often involves employing sensible and logical navigation, understandable required action items, well-defined terms, clean design and smooth workflows. There are other options to consider that vary from project to project, but these are the most basic considerations when dealing with usability as it pertains to Section 508 compliance.

Software and Technology Function Essentials for Section 508

The software and technology fundamentals here are designed to make computing life a bit easier for those with a disability, and your CMS should follow these guidelines. Remember that everything should be executable from the keyboard alone, without a mouse, because some people cannot use one. That includes shortcuts, object and image manipulation, and dropdown list operation, to name a few. StickyKeys, FilterKeys, MouseKeys and High Contrast are some useful operating system functions for this.

Your organization should also maintain a well-defined on-screen focus indicator. An indicator that moves with the other interface elements is the preferred method for your CMS, and focus changes should be exposed to assistive technology.

Web browsing comes naturally to most of us who can use all five senses, but someone with an impairment will have trouble. That is why it is important to make sufficient information about the user interface, such as the identity, operation and state of its elements, available to assistive technology. An image that represents a program element (an icon) should also carry text that defines what it does, and the meaning of icons or bitmap images used to define elements of an application should stay consistent.

Textual information should be provided through the operating system’s (OS) functionality for text display. Text content, text input caret location and text attributes are the minimum textual information that should be displayed in the CMS.

Display options also need to be adjustable for those with vision impairment. For example, applications should never override a user’s selected contrast levels or color selections on the screen. Display functions should accommodate those with a disability. When there is animation, it should also be available to users as non-animated content or information, and the user should be able to choose the presentation mode of this content prior to viewing it.

Organization is also key for some. For instance, for Section 508 compliance in the CMS, it is important to label items clearly so they may be easily understood. Color coding or highlighting should not be the sole means of conveying information, indicating an action, prompting a response or distinguishing a visual element. A wide range of color and contrast options also helps users stay organized with large loads of information that may need to be categorized in a certain fashion within folders, directories, etc.

There are also some key elements to avoid. One good example is blinking or flashing elements. Not only can they be annoying and distracting, but they can also be harmful to the end user. If an element must blink, its frequency must fall outside the range of 2 Hz to 55 Hz; the standards prohibit anything in between.
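The blink-frequency rule is simple enough to check mechanically. This is a minimal sketch (the function name and interval values are illustrative, not from any real toolkit):

```python
def blink_frequency_ok(interval_seconds):
    """Return True if a blink interval avoids the prohibited
    2 Hz - 55 Hz band from the Section 508 software standards."""
    hz = 1.0 / interval_seconds
    # Allowed: 2 Hz or slower, or 55 Hz and faster.
    return not (2.0 < hz < 55.0)

print(blink_frequency_ok(1.0))   # 1 Hz blink: slow enough to be allowed -> True
print(blink_frequency_ok(0.1))   # 10 Hz blink: inside the prohibited band -> False
```

Of course, the safest choice is simply not to blink or flash elements at all.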

Assistive technology should also be available to users who are filling out digital forms. It should be able to assist with accessing information, modifying information and submitting information.

Contextual information is also very important. Users with all five senses may take certain elements for granted, but these items should be accessible to those with disabilities too. Information conveyed in color should also be available without it, through markup or context clues. Documents should be readable without a style sheet. Redundant text links should exist for each active region of a server-side image map (replacing image elements with text elements). Table data should be clearly defined, including row and column headers. Even frames should be clearly defined as elements.

There are many other standards and compliance rules of thumb to follow. It is best to consult with an expert on maintaining section 508 compliance.

Monday, February 9, 2015

Using Big Data Efficiently in 2015

Will 2015 be the year your enterprise is finally able to harness all of the customer data it has compiled over the years? Will there be ways to organize and use this information to impact the bottom line? Indeed, this data has become a form of capital for enterprises. So what will change in 2015?

Big Data Brands to Watch

The areas to watch are secure storage and backup with encryption, reliable data management and data visualization (DV); these are the key ingredients of next-generation big data software.

As far as vendors are concerned, there are several players in the space, including Twitter-owned Lucky Sort, Tableau, Advanced Visual Systems, JasperSoft, Pentaho, Infogram, Tibco, 1010data, Salesforce, IBM, SAP, Hewlett-Packard, SAS, Oracle, Dell and Cisco Systems. These are a mix of independents and majors, but all have solid reputations in the industry. Choosing among them depends on numerous factors like budget, IT systems already in place, preferences, requirements, etc.

The Coming Influx of Big Data

Big data must be useful, and professionals within all sorts of organizations are actively seeking out ways to use the data they have collected, rather than simply storing it.

Is your organization prepared for the influx of new users and devices that will flood the Internet and electronic communications, encapsulating customer interactions more than ever before? Many enterprises could be unprepared for the massive wave of data coming as billions of devices join the Internet. More devices, not just smartphones and computers, will be connected, bringing more data into organizations' servers. 

Gartner reportedly estimated the Internet of Things, or IoT, market at 26 billion devices by 2020 and Cisco thinks it will add $14 trillion in economic value by 2020. These devices include everything from household and office electronics and appliances to industrial manufacturing equipment. IoT will increase big data exponentially. It will hit pretty much every industry in a big way, but planning and preparing for the road ahead can ensure at least some adaptability for 2015 and beyond.

How to Deal With Data

Assess your organization's needs thoroughly including a checklist of IT systems in place and what needs or opportunities exist there. IT management will likely find a multitude of ways to incorporate new systems or upgrades through the right software options. Try to find robust, dynamic systems that are tailored to the way information is used or may be used within and outside the organization. Also, explore ways to improve customer relationships through the targeting and taxonomy of their data. Big data will be a more useful asset in 2015.

After you have taken the time to make an assessment of need and checklist for problem areas, you have to implement changes so that you can make the most of your information. You want to absolutely make sure that the data that you have collected from customers, suppliers, personnel and others is accessible, useful and organized. 

For example, good search software that can access thousands of records and display results based on varying factors is a great way to handle the problem of search. Great search software is sometimes already part of your organization’s CMS or other data-handling software; it may just not be fully utilized to make search more useful or easier.
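At its core, this kind of search is usually built on an inverted index, which maps each term to the records that contain it. A toy sketch of the idea, with hypothetical record data:

```python
from collections import defaultdict

def build_index(records):
    """Map each lower-cased word to the ids of the records containing it."""
    index = defaultdict(set)
    for rec_id, text in records.items():
        for word in text.lower().split():
            index[word].add(rec_id)
    return index

records = {  # hypothetical CMS records
    1: "Quarterly sales report",
    2: "Sales forecast for Q2",
    3: "HR onboarding checklist",
}
index = build_index(records)
print(sorted(index["sales"]))  # → [1, 2]
```

Real enterprise search engines add ranking, stemming, synonyms and security trimming on top, but the record lookup starts from the same structure.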

Using Enterprise Search

Enterprise search applications vary by brand, but you may recognize a few of the names immediately from the larger tech firms such as IBM, HP and Microsoft. There are also open source options. Other vendors include Oracle, LucidWorks, Lexmark Perceptive Software, Sinequa and others. SharePoint, Microsoft’s collaboration platform, is probably one of the most popular options. Google and IBM are also top companies in search technology. Many systems support multiple languages too.

HP is a great example of useful search for the enterprise. Its flagship system is appropriately named Autonomy. Autonomy can index, or “crawl” (a search software term), millions of records, including various types of content like documents, audio, video and social media. Employees and customers have come to expect a great system of search within their companies, as expectations for technology have continued to climb due to the surge in search application use (such as Google searches on the web).

There are some important facets to search applications that should be noted. The HP Autonomy system, for instance, is capable of searching based on concept and context. This is becoming much more important in the era of big data. Searching through such large volumes of data requires some scrutiny to access the right information assets. Enterprise search applications can help with this obstacle.

Start with Little Data

It has been suggested that to deal with big data, you must first deal with little data. We are talking about metadata, of course. Metadata is information that offers insight into content, helping to optimize search; essentially, it is information about information. Metadata can provide the context and concept information we referenced earlier in search applications.
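Concretely, metadata lets you filter content by what it is about rather than by scanning its body text. A minimal sketch, with hypothetical document records and field names:

```python
documents = [  # hypothetical content items with metadata attached
    {"title": "Invoice 1042", "type": "invoice", "dept": "finance", "year": 2014},
    {"title": "Press release", "type": "announcement", "dept": "marketing", "year": 2015},
    {"title": "Invoice 1043", "type": "invoice", "dept": "finance", "year": 2015},
]

def find(docs, **criteria):
    """Return documents whose metadata matches every given field."""
    return [d for d in docs if all(d.get(k) == v for k, v in criteria.items())]

# "Information about information": filter on fields, not full text.
for doc in find(documents, type="invoice", year=2015):
    print(doc["title"])  # → Invoice 1043
```

The fields here (type, dept, year) stand in for whatever controlled metadata schema your organization actually defines.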

Working with metadata can help with the overall process of keeping data organized and easy to access. The smaller pieces of information come together to become big information sets. Your team must start there to adequately solve this information overload problem.

Start by analyzing the exact needs or perceived functionality of the information. Taxonomy and terminology can be critical. Defining terms and putting them into contextual and conceptual order will help to provide a road map to access and utilize all of your team’s critical business data. This way, your data will actually become more valuable, too. Your information assets need to be managed in order to fully take advantage of them.

Some Big Data Tips

Here are some general tips to help with organizing and using your data assets:

  • Perform usability testing of your organization’s tools for data management.
  • Develop a compliance and governance model for handling information.
  • Develop a master data management (MDM) plan to reinforce and promote compliance.
  • Assess taxonomy and develop a controlled vocabulary to keep data structured.
  • Compress files (such as PDF documents) when necessary to save on storage cost.
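The controlled-vocabulary tip in particular lends itself to automation: incoming tags can be checked against the approved term list so stray variants get caught early. A minimal sketch, with made-up example terms:

```python
# Hypothetical controlled vocabulary of approved content tags.
CONTROLLED_VOCABULARY = {"invoice", "contract", "report", "policy"}

def validate_tags(tags):
    """Split tags into those in the controlled vocabulary and those not."""
    valid = [t for t in tags if t in CONTROLLED_VOCABULARY]
    unknown = [t for t in tags if t not in CONTROLLED_VOCABULARY]
    return valid, unknown

valid, unknown = validate_tags(["invoice", "invoices", "policy"])
print(unknown)  # → ['invoices'] -- a variant to map back onto the vocabulary
```

In practice the unknown list would feed a review queue where a taxonomist decides whether each term is a typo, a synonym or a genuinely new concept.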

Thursday, January 29, 2015

Taxonomy in the Age of Agile and Shadow IT

Marketing Meets IT and Merges

Information presented to customers must be capable of meeting near-instant information demands in a multitude of perspectives. This includes all end users, internal and external. Organizations will need to continue to be versatile and clever in their approach to data management.

Digital marketers are becoming quite clever in dealing with data, using it to persuade their (or their clients’) customers or potential customers. Taking principles from usability, marketers now use terms such as “user design”, “user experience” (UX) and “user interaction” (UI), and develop specialty roles to turn data into the most pleasant user experience possible. The food, beverage and hospitality industries aren’t the only ones experiencing this shift. Every industry is.

Build to Further Understand

Taxonomy and information architecture are not just about designing a great way to organize content for the end user. Taxonomy can also be used to understand data. Information, its structure and its agility are keys to modern design techniques where such vast volumes of data exist.

Your organization can benefit from developing a taxonomy for its information. Designing your information flow isn’t always as easy as it sounds. Consulting with a trusted professional who can analyze various aspects of your business is often needed to alleviate the stresses of a complex business taxonomy. A specialist can take your data and help you make sense of it. Whether you are a hospital that needs to define protocols for accessing patient data or a retail website seeking comprehensive analysis of web traffic information, an internal audit of information systems can help get you on the right track toward efficiency.

Start with assessing your digital (and even non-digital) tools to determine problem areas within the organization, such as incomplete records or inconsistent rules or terms. The way in which systems communicate with customers, employees or other stakeholders is important to consider as well. Check that these systems can perform essential functions properly and that proper access and other rules are clearly defined. A master data management solution will help with this. Many Fortune 1000 companies are going this route to deal with their organization’s information.

Agile Systems Will Assist in Achieving Maximum Comprehension

The right information asset management tools make all the difference. Having software for terms and concepts will give users within the organization the right context for use. Thesaurus management, ontology software, metadata or cataloging software, auto-categorization, search, and other tools used in concert will maintain stability.

For example, to deal appropriately with taxonomy, an agile system including auto-categorization and search tools (including text mining) would contain pre-installed, user-editable and non-editable taxonomies; be able to auto-generate editable taxonomies; and support the import of editable and non-editable taxonomies. To be agile for any number of end user types, these capabilities must work in several varying combinations.
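To make the auto-categorization idea concrete, here is a deliberately simple, keyword-based sketch; real products use statistical or linguistic methods, and the taxonomy below is an invented example of the user-editable kind described above:

```python
# Hypothetical, user-editable taxonomy: category -> trigger terms.
taxonomy = {
    "finance": {"invoice", "budget", "payment"},
    "hr": {"hiring", "benefits", "onboarding"},
}

def categorize(text):
    """Assign every category whose trigger terms appear in the text."""
    words = set(text.lower().split())
    return sorted(c for c, terms in taxonomy.items() if words & terms)

print(categorize("Invoice payment schedule for new hiring budget"))
# → ['finance', 'hr']
```

Because the taxonomy is just editable data, end users can add or remove trigger terms without touching the categorization logic, which is the agility being described.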

The same principle applies to other software categories like content management or thesaurus software. Search functions, for example, are more useful to the end user when they contain spell-checking functions or multiple display options. Remember, these principles apply to both the internal and external users. Remember that information architecture mastered on the inside will translate better to the outside.

Taxonomy and information architecture should be the foundation of an enterprise view of the customer. So how must an organization view its own vocabulary? Ensure that your master data management is interpreting terms consistently and includes context for those terms. Without context, people are often left searching for clues instead of getting the information they came for.

Maintaining Culture While Establishing Order

The above phrase sounds more like a political statement than one of taxonomy. However, when you think about it from a business perspective, it makes some sense. When you are implementing your agile system for taxonomy and information architecture, you don't want to disrupt the critical flow of business nor the information that is required to actually do business.

What you do want is to be able to open the pipeline of information further to increase productivity and enable efficient processes within the organization. That statement sums up the general need to implement an agile system for handling information.

Part of the solution to this is governance and compliance controls. By introducing hard controls for governance and compliance, you are forming a backbone with controls for how systems are using and integrating data. Your taxonomy, metadata and other information may connect business processes or even use content to complete or help complete a variety of tasks.

The exact structure of any organization varies from enterprise to enterprise, and so does its culture; these differences have to be accounted for. Being able to harness information collectively, selectively and to varying degrees is what will make the major difference in opening that pipeline up while keeping controls in place to establish order.

Taxonomy Soup: Collaboration, Integration and Access

Here is an analogy from social science: just as the language of a region or culture may vary in dialect, so too does the language of business. The language of business can be quite diverse. Between industries, organizations, departments, fields of study and practice, and a wide variety of other factors, confusion exists. The real benefit of a system with agility is the ability to communicate more efficiently. The combination of collaboration, integration and access is the key ingredient in making the perfect taxonomy soup.

The ability to sort terms is very important and will become even more important in the big data era. A great system will be one that can differentiate taxonomy with due diligence. Collaborating will become more efficient this way.

By integrating data that does not fit into the dialect of terms, the organization will be able to make better use of its information assets, whatever they may be. This includes getting all of the information into the right places and ultimately into the right hands in the correct way. Policies and procedures are important examples of such data.

Analyze Your Needs Carefully

Take a look around your organization. Take notes on every detail you can to make an informed decision about what to implement and where. Consulting with a professional is the next step. The aforementioned details of creating an agile environment for taxonomy and information architecture within an enterprise of nearly any size are helpful in beginning to form a strategy for handling your enterprise information. Consulting a professional will help alleviate the overwhelming and burdensome task of data complexity.

Monday, January 12, 2015

Making Information Easier to Find Becomes Ever More Important

Taxonomy is becoming so much more important in the digital age that entire enterprises may one day develop out of the need just to classify information. The many ways we have traditionally classified content have grown exponentially in the digital age, to a size once unimagined and still growing.

The Library of Congress and other libraries, large and small, have turned to digital tools to classify and re-classify information about books, documents, texts and even multimedia content. The Internet was, of course, developed to help more easily share and organize documents and other content across a computer network. Now, here we are with the cloud and big data.

Where to begin?

The proportion of data-to-enterprise, or even data-to-individual, can become difficult or even unmanageable without the right tools or experience to guide you. As humans and consumers, we tend to expect that our options will be categorized into specific types based on the larger type. For instance, if we buy a computer, the choices are usually as follows: brand, device type, operating system, etc. 

Then we get into Apple vs Dell, Desktop vs Tablet or BlackBerry vs Android. The more immediate platforms that come to mind are Google or Bing search engines, hashtags or networks on social media like Twitter and Facebook. These are the most recent consumer examples of classifying information in a multitude of ways using software. Enterprises of all industries, however, are becoming more dependent on systems to help them manage information to scale.

The time it takes members of an organization to find important or relevant information is productivity lost. It also adds to personnel frustration, even at management level.

Time to Give Industries Options for Information Management

Companies are responding to the needs of industry. Taxonomy, metadata, ontology, data virtualization and data governance are some of the key areas of need for many organizations dealing with vast amounts of data coming from customers, partners, legal or other channels.

TopQuadrant, which recently released a web-based taxonomy solution, is an example of how these enterprise needs are so far being addressed, according to KMWorld.

TopBraid, the software referenced, is able to help end users reference data with more easily accessible visual models of the data, laid out in a clean way. 

Greater emphasis on visual representation of data is becoming an IT industry-wide way to tackle some of the problems associated with extracting and explaining complex data sets. Asian countries have, in fact, had a great deal of success using visual models to teach mathematics and ease students into new topics, something many American educational settings have struggled with.

TopQuadrant is just one recent example. There are tools and software being developed in the market to deal with this exponentially growing challenge.

Taxonomy Time for Taxonomists

So what does a taxonomist do that can help arrange and set a standard for all of this enterprise information we are dealing with in ever increasing amounts? Well, for one, taxonomists are tasked not only with the categorization of terms, but also with the governance and definition of those terms.

They often use commercial software dedicated to this work, such as a thesaurus or taxonomy management application. Some of these tools can also be developed internally, as long as the result fits the organization's particular needs.

Sometimes taxonomy management tools are part of another suite or application in which taxonomy is a feature. In the case of Drupal, a content management tool for building and maintaining websites, taxonomy is used to define and classify content, which can then be configured to display nodes, pages and so on to the end user. Other software, such as spreadsheets, can sometimes be used as well.

Lastly, open source software for taxonomy and ontology are becoming available for use as well.
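Whatever the tool, the underlying model is usually a hierarchy of terms, much like Drupal's vocabulary-and-term structure. A minimal sketch with an invented term hierarchy:

```python
# Hypothetical term hierarchy: term -> parent (None for a top-level term).
terms = {
    "content": None,
    "articles": "content",
    "pages": "content",
    "news": "articles",
}

def ancestors(term):
    """Walk up the hierarchy from a term to its root (broader terms first)."""
    chain = []
    parent = terms.get(term)
    while parent is not None:
        chain.append(parent)
        parent = terms.get(parent)
    return chain

print(ancestors("news"))  # → ['articles', 'content']
```

Knowing a term's ancestors is what lets a CMS roll content tagged "news" up into listings for "articles" or "content" automatically.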

The benefits to having a system or person that maintains taxonomy within the enterprise are several. One benefit is that information is organized, as I have already alluded to. Another is that this information can also be made easier to find for customers, personnel, vendors, supply partners, etc., which I have also discussed. One reason that we have not discussed is standardization. 

This refers to terminology and jargon within your particular company. Every company can create a manual of terms, glossary, thesaurus, etc. But a taxonomist or someone working with taxonomy software can refine this process and create a standard that efficiently works across the company, so everyone is in compliance. It is kind of like having a style guide, but only for key terms of the business. 

Compliance is another key benefit to all of this. Regulations need to be followed and adhered to. There are other legal and regulatory impacts that information has, and taxonomy, ontology and information management are a few ways to stay ahead of the mess. Information audits can be a great way to find holes in your system and develop ways to patch those holes for greater governance and compliance. All of this can save us time and money on our business operations in one way or another.

Techniques for Creating a Great Information Structure

Taxonomy and information management start with a few basic techniques to help guide end users to the information they are trying to reach. The less time spent navigating, the better.

Not only are terms important, but so are their relationships. Sometimes information can be found using one term or another, depending on the scenario. If you are looking for fast blue cars, for example, you could search cars by color or by speed.
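That "blue cars that are fast" example is really faceted search: each attribute is an independent filter that can be combined with the others. A minimal sketch, with hypothetical inventory records:

```python
cars = [  # hypothetical inventory records
    {"model": "A", "color": "blue", "top_speed_mph": 180},
    {"model": "B", "color": "red", "top_speed_mph": 190},
    {"model": "C", "color": "blue", "top_speed_mph": 120},
]

def facet_search(items, color=None, min_speed=None):
    """Filter by any combination of facets; unused facets are ignored."""
    results = items
    if color is not None:
        results = [c for c in results if c["color"] == color]
    if min_speed is not None:
        results = [c for c in results if c["top_speed_mph"] >= min_speed]
    return results

# Search by color alone, speed alone, or both together.
print([c["model"] for c in facet_search(cars, color="blue", min_speed=150)])
# → ['A']
```

The point is that the same records can be reached by more than one path, which is exactly what good term relationships enable.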

Standards should also weed out content that is irrelevant or invalid. Other types of information related to terms can be used in conjunction with taxonomy. There should be clear hierarchies of information within the enterprise as well. 

All of this data should be able to be used with other tools like content management, indexing, search and others. It should always support ANSI/NISO Z39.19 or ISO 2788 thesaurus standards. Different classes of information may apply to sets or subsets even. 
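The thesaurus standards mentioned above (ANSI/NISO Z39.19, ISO 2788) define standard relationship types: broader term (BT), narrower term (NT), related term (RT) and "used for" (UF) for non-preferred synonyms. A toy sketch in that spirit, with an invented entry:

```python
# Hypothetical thesaurus entry using Z39.19-style relationship labels:
# BT = broader term, NT = narrower terms, RT = related terms,
# UF = "used for" (non-preferred synonyms).
thesaurus = {
    "automobiles": {
        "BT": "vehicles",
        "NT": ["sedans", "coupes"],
        "RT": ["engines"],
        "UF": ["cars", "autos"],
    },
}

def preferred_term(word):
    """Map a non-preferred synonym to its preferred thesaurus entry."""
    for term, entry in thesaurus.items():
        if word in entry.get("UF", []):
            return term
    return word if word in thesaurus else None

print(preferred_term("cars"))  # → automobiles
```

Resolving synonyms to one preferred term is what keeps indexing and search consistent no matter which word a user happens to type.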

Make sure that any software you use will generate reports for you on analytics (of terms) and so forth. This is very important.

There are a variety of ontology and thesaurus options available. They are available in a multitude of platform formats. Here are a few: MultiTes Pro (Microsoft Windows), Cognatrix (Apple Mac OS X), One-2-One (Windows) and TheW32 (multi-platform). There don't appear to be many options for the mobile platforms yet (iOS, Android, BlackBerry and Windows Phone). 

The information management problem in the world of big data seems pervasive, but there is a growing trend toward developing new ways of dealing with it. Now is the time to start creating a plan and a system for dealing with taxonomy, ontology and information management, to help your organization's users access data more quickly, efficiently and sensibly.

The more content builds up, the more the organization needs to change, adapt and, most importantly, handle the big data involved.

Tuesday, December 30, 2014

Latest Applications in Enterprise Search

In my previous post, I described the future of enterprise search. In this post, I will describe a few new search applications that could be interesting.

Concept Searching

Founded in 2002, Concept Searching provides software products that deliver automatic semantic metadata generation, auto-classification, and powerful taxonomy management tools. Concept Searching is the only platform independent statistical metadata generation and classification software company in the world that uses concept extraction and compound term processing to significantly improve access to unstructured information. The Concept Searching Microsoft suite of technologies runs in all versions of SharePoint, Office 365, and OneDrive for Business.

The technologies are being used to improve search outcomes, deploy an enterprise metadata repository, enable effective records management, identify and secure sensitive information, improve governance and compliance, support social tagging, collaboration and text analytics, facilitate eDiscovery, and drive intelligent migration.

Concept Searching, developer of the Smart Content Framework™, provides organizations with a method to mitigate risk, automate processes, manage information, protect privacy, and address compliance issues. This infrastructure framework utilizes a set of technologies that encompasses the entire portfolio of unstructured information assets, resulting in increased organizational performance and agility.

Lexalytics, Inc.

Lexalytics provides enterprise and hosted text analytics software to transform unstructured text into structured data. The software extracts entities (people, places, companies, products, etc.), sentiment, quotes, opinions, and themes (generally noun phrases) from text. Text is considered unstructured data which comprises somewhere between 31% and 85% of what is stored in any given enterprise.

Lexalytics is an OEM vendor of text analytics and sentiment analysis technology for social media monitoring, brand management, and voice-of-customer industries. The software uses natural language processing technology to extract the above-mentioned items from social media and forums; the voice of the customer in surveys, emails, and call-center feedback, traditional media, pharmaceutical research and development, internal enterprise documents, and others.

Lexalytics provides a text mining engine that is used by a number of search partners, such as Coveo, Playence and Oracle, to add additional metadata to their search. This is additional intelligence around "just what do those words actually mean?" In other words, the engine boosts the value of search by providing more information to the index. This enables other applications and helps search be "smarter".
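To give a feel for what sentiment extraction means (this is emphatically not Lexalytics' actual method, just a toy lexicon-based sketch with invented word lists), a scorer can count positive versus negative words:

```python
import string

# Toy sentiment lexicon -- real engines use far richer NLP models.
POSITIVE = {"great", "love", "excellent", "fast"}
NEGATIVE = {"slow", "broken", "terrible", "hate"}

def sentiment_score(text):
    """Count positive vs. negative words; the sign gives the overall tone."""
    words = [w.strip(string.punctuation) for w in text.lower().split()]
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

print(sentiment_score("Great product but terrible, slow support"))  # → -1
```

Commercial engines layer negation handling, intensity weighting and context on top of this idea, which is why the raw word-count approach is only a starting point.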

MaxxCAT

MaxxCAT provides enterprise search solutions for corporate intranets, web sites, databases, file systems and applications, and other environments that require rapid document retrieval from multiple data sources. The flagship products offered by MaxxCAT are the SB-250 series and the EX-5000 series network search appliances. Also available is a series of cloud-enabled storage appliances.

Basis Technology

Founded in 1995, this software company specializes in applying artificial intelligence techniques to understanding documents written in different languages. Their software enhances parsing tools by classifying the role of words and provides metadata on the role of words to other algorithms. Software from Basis Technology will, for instance, identify the language of an incoming stream of characters and then identify the parts of each sentence like the subject or the direct object.

The company is best known for its Rosette Linguistics Platform, which uses natural language processing techniques to improve information retrieval, text mining, search engines and other applications. The tool is used by major search engines and translators to create normalized forms of text. Basis Technology software is also used by forensic analysts to search through files for words, tokens, phrases or numbers that may be important to investigators.

dtSearch

Founded in 1991, this company specializes in text retrieval software. Its current range of software includes products for enterprise desktop search, Intranet/Internet spidering and search, and search engines for developers (SDK) to integrate into other software applications.

LTU technologies

Founded in 1999, this company is in the field of image recognition for commercial and government customers. The company provides technologies for image matching, similarity and color search for integration into applications for mobile, media intelligence and advertisement tracking, ecommerce and stock photography, brand and copyright protection, law enforcement and more.

Sematext Group, Inc.

This company's product SSA - Site Search Analytics - continuously monitors, measures, and improves the search experience. It identifies top queries, problematic zero-hit queries, common misspellings, etc. It measures and compares search relevance and improves conversion rates. It is available on-premises and in the cloud.

Exorbyte

This is a privately held software company which was founded in 2000 in Konstanz, Germany, with an additional office in the United Kingdom (Bristol). The company develops intelligent software for search and analysis of structured and semi-structured data.

Their product MatchMaker is a leading error-tolerant search and match platform for very large master data volumes. The multiple award-winning software technology searches and matches like a human – but dramatically faster, in much more complex configurations, and without relying on restrictive match keys or similar methods. It is available on-premises and in the cloud.

Federal authorities, insurance agencies, ICT firms and more use this software for identity resolution in diverse, data-intensive business processes such as input management, enterprise search and data quality. It is easy to customize and integrate.

Inbenta

Founded in 2005, this company provides enterprise semantic search technology based on artificial intelligence and natural language processing. It offers intuitive search solutions and intelligent content support for websites and corporate intranets.

Content Analyst Company

This is a privately held software company which develops concept-aware text analytics software called CAAT, which is licensed to software product companies for use in eDiscovery. In 2013, five CAAT-powered products were named in the Gartner eDiscovery Magic Quadrant Report, and the analyst firm 451 Group referred to CAAT as "the hottest product in eDiscovery".

Content Analyst's CAAT analytics software is a machine learning system based on latent semantic indexing technology. CAAT provides several text analytics capabilities using both supervised learning and unsupervised learning methods including concept search, categorization, conceptual clustering, email conversation threading, language identification, near-duplicate identification, auto summarization and difference highlighting.
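Near-duplicate identification, one of the capabilities listed above, is typically built on set similarity rather than exact matching. The sketch below is a minimal, generic illustration of that idea (not Content Analyst's actual implementation): documents are reduced to sets of character shingles and compared with the Jaccard coefficient, so two documents that differ in only a few words still score as near-duplicates.

```python
def shingles(text, k=3):
    """Character k-shingles of a whitespace-normalized string."""
    s = " ".join(text.lower().split())
    return {s[i:i + k] for i in range(max(len(s) - k + 1, 1))}

def jaccard(a, b):
    """Jaccard similarity between two shingle sets: |A ∩ B| / |A ∪ B|."""
    return len(a & b) / len(a | b) if a | b else 1.0

doc1 = "The quarterly report was filed on Monday."
doc2 = "The quarterly report was filed on Tuesday."
doc3 = "Completely unrelated memo about travel policy."

sim12 = jaccard(shingles(doc1), shingles(doc2))
sim13 = jaccard(shingles(doc1), shingles(doc3))
print(round(sim12, 2), round(sim13, 2))  # near-duplicates score high, unrelated text low
```

Production systems add hashing (e.g. MinHash) so that millions of documents can be compared without building every pairwise intersection, but the underlying similarity measure is the same.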

SearchYourCloud

With SearchYourCloud and its patented, federated search technology, a single search request in Outlook simultaneously and transparently searches your email, desktop and all of your cloud storage sources and delivers highly targeted results. You get exactly the information you need with just one query.
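Federated search of this kind generally means fanning a single query out to several source connectors and merging the scored results into one list. The sketch below illustrates that pattern with hypothetical stand-in functions for email, desktop and cloud sources; it is not based on SearchYourCloud's actual API, and real implementations would query the sources in parallel.

```python
# Hypothetical per-source search functions standing in for real connectors.
# Each returns (source, title, relevance_score) tuples.
def search_email(q):
    return [("email", "Re: " + q, 0.9)]

def search_desktop(q):
    return [("desktop", q + " notes.docx", 0.7)]

def search_cloud(q):
    return [("cloud", q + " draft.pdf", 0.8)]

def federated_search(query, sources):
    """Fan the query out to every source, then merge results by score."""
    results = []
    for source in sources:
        results.extend(source(query))
    return sorted(results, key=lambda r: r[2], reverse=True)

hits = federated_search("budget", [search_email, search_desktop, search_cloud])
print([h[0] for h in hits])  # sources in descending relevance order
```

The hard part in practice is score normalization: each backend ranks on its own scale, so merged relevance needs a common footing before sorting.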

Docurated

Docurated aggregates all your documents in one place, turning them into a searchable and customizable database, and it also provides Dropbox integration. It accelerates sales in companies looking for fast growth by making the best marketing content readily available to sales teams around the world. Docurated works with your existing content stores and uses machine learning to enable your team to find and re-use the most effective content with no manual tagging or uploading.

Docurated is a next-generation visual knowledge management platform which solves the information retrieval problem for leading companies like Clorox, Omnicom, Netflix, the Weather Channel, and many others. It enables sales, marketing, and technology teams to surface and use the exact chart or slide they need, no matter where it is stored, without slogging through folders and files. Docurated seamlessly integrates with existing folder-based repositories.

Lucene

Apache Lucene is free, open source information retrieval software. It is supported by the Apache Software Foundation and is released under the Apache Software License. While suitable for any application which requires full text indexing and searching capability, Lucene has been widely recognized for its utility in the implementation of Internet search engines and local, single-site searching.

At the core of Lucene's logical architecture is the idea of a document containing fields of text. This flexibility allows Lucene's API to be independent of the file format. Text from PDFs, HTML, Microsoft Word, and OpenDocument documents, as well as many others (except images), can all be indexed as long as their textual information can be extracted.
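The document-with-fields model is easy to picture with a toy inverted index. The sketch below is a conceptual illustration in Python, not Lucene's real (Java) API: each document is a set of named text fields, and postings are keyed by (field, term) pairs, which is what lets a query target one field without being independent of where the text originally came from.

```python
from collections import defaultdict

class ToyIndex:
    """Minimal inverted index over documents with named text fields,
    illustrating the document/field model at the core of Lucene."""

    def __init__(self):
        self.postings = defaultdict(set)  # (field, term) -> set of doc ids
        self.docs = {}

    def add(self, doc_id, fields):
        """Index a document given as {field_name: text}."""
        self.docs[doc_id] = fields
        for field, text in fields.items():
            for term in text.lower().split():
                self.postings[(field, term)].add(doc_id)

    def search(self, field, term):
        """Return ids of documents whose `field` contains `term`."""
        return sorted(self.postings.get((field, term.lower()), set()))

idx = ToyIndex()
idx.add(1, {"title": "Lucene in Action", "body": "full text indexing and search"})
idx.add(2, {"title": "Enterprise Search", "body": "site search with Lucene"})
print(idx.search("body", "lucene"))  # only documents whose body mentions the term
```

Because indexing only ever sees field names and extracted text, the same structure works whether the text came from a PDF, an HTML page, or a Word document.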

These are just a few of the search applications currently on the market. There are many others. Choosing the right application should be based on your organization's requirements.

Future of Enterprise Search

Enterprise search is a developing industry. In this post, I will describe the latest developments in enterprise search.

Effective enterprise search represents one of the most challenging areas in business today. The whole area of search has been revolutionized by Google. Employees now expect to be able to locate relevant data as easily as they navigate the web through Google. When this ease of search is not replicated in organizations' systems, it can be quite frustrating. As we create more content than ever before, the importance of effective search across the enterprise continues to grow.

Until recently, much of the enterprise search technology remained unchanged. The general purpose enterprise search offerings were fairly similar in technology and scope. There are now many software companies who direct their efforts towards enterprise search. The future will bring shorter innovation cycles, continuous user experience improvements, deeper integration with first- and third-party applications and more ETL-like (extract, transform and load) functionality to handle poor quality content.

In the second half of the 2000s, the enterprise search companies were absorbed by the large software companies:
  • Microsoft acquired FAST Search in 2008
  • Adobe acquired Mercado in 2009
  • Dassault Systèmes acquired Exalead in 2010
  • Hewlett Packard acquired Autonomy in 2011
  • Oracle acquired Endeca in 2011
  • IBM acquired Vivisimo in 2012
User experience is a broad topic in itself, with active trends including:
  • Richer information about the user to determine context, such as their business context, social context, mobile device sensors, location, speech recognition, preferences and historical usage.
  • Advances in visualization such as HTML 5.
  • Natural language processing as in the trends seen with Wolfram Alpha and smart phone digital assistants, such as Apple’s Siri, Microsoft’s Cortana and Google Now.
  • Richer results that look less like a page of links and more like answers to questions.
  • Elements of knowledge management that add meaning to queries and results.
  • Enterprise search products will become increasingly and more deeply integrated with existing platforms, allowing more types of content to be searchable and in more meaningful ways. It will also become less of a dark art and more of a platform for discovery and analysis.
The future of enterprise search seems destined to continue with simple keyword and Boolean searching, augmented by faceted navigation based on metadata. Virtually every e-commerce web site today offers guided navigation based on metadata.

This ubiquitous model now appears in most of the leading enterprise search products and users immediately understand how a simple text query can quickly be focused to a specific domain by clicking on a metadata filter. This updated search model is increasing demand for auto-classification products which can generate descriptive metadata automatically based on an analysis of the document’s unstructured content.
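The query-plus-metadata-filter model is straightforward to sketch. The toy example below (hypothetical documents and field names) shows a keyword search narrowed by a facet filter, along with the per-value counts that guided-navigation interfaces display next to each filter link.

```python
docs = [
    {"title": "Q3 sales proposal", "dept": "Sales", "type": "Proposal"},
    {"title": "Hiring procedure", "dept": "HR", "type": "Procedure"},
    {"title": "Q3 sales forecast", "dept": "Sales", "type": "Report"},
]

def faceted_search(query, **filters):
    """Keyword match on title, then narrow the hits by metadata filters."""
    hits = [d for d in docs if query.lower() in d["title"].lower()]
    for field, value in filters.items():
        hits = [d for d in hits if d.get(field) == value]
    return hits

def facet_counts(hits, field):
    """Counts shown next to each filter value in a guided-navigation UI."""
    counts = {}
    for d in hits:
        counts[d[field]] = counts.get(d[field], 0) + 1
    return counts

hits = faceted_search("sales")
print(facet_counts(hits, "type"))            # {'Proposal': 1, 'Report': 1}
print(faceted_search("sales", type="Report"))  # the text query focused by one click
```

The auto-classification products mentioned above exist precisely to populate fields like `dept` and `type` automatically, since hand-tagging every document rarely scales.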

Open source software has made significant improvements, displacing many of the traditional search vendors. Lucene and its supporting companies like LucidWorks provide solid search functionality at a hard-to-beat price. Where vendors are seeing success is in four main areas:
  • Providing functionality beyond typical "search" – extending to facets, true knowledge management, multimedia search, and other functionality.
  • Focusing on vertical-specific applications like fraud and supply-chain management.
  • Working with larger, more conservative enterprises.
  • Providing a SaaS, one-stop-shop for zero (or low) touch functionality.
A few major factors are going to drive the industry going forward:
  • Open source will continue to get better and drive out inefficiency in the market.
  • More, better information about the searcher: location awareness, profile sharing, time dependence, deeper understanding of the context and content of the search. With this information, you can provide better, more relevant results. 
  • Lower tolerance for hassle: people expect search to "just work" – not understanding that it can be just as complicated as any other major IT initiative. By having low-touch solutions, SaaS providers will make major progress in the small/medium business world.
  • Search all the things!: Integrated understanding of objects, video, speech, as well as traditional semantic sources like text will combine together better into a whole that allows for information retrieval no matter what the format.
Another area for future development is machine-to-machine consumption and sharing of information. Search providers are increasingly applying advanced analytics to text and other media so that users’ needs are more fully met through relevant search results. Search will be increasingly entity-centric and collaborative.

The future of search will include more semantic understanding of both content and queries. For example, Exorbyte is focused on searching structured master data – people, products and places – and its ability to query this data without the use of restrictive match keys, for both lexicographical and semantic similarity, is globally unique.

The future of search also runs through natural language processing, along with the capability to provide advanced information analysis at indexing time.

The facility to search within the document itself is becoming vital. The Docurated platform provides instant access to the most relevant page or slide without even having to open the document.

Effective enterprise search can eradicate inefficiency. Enterprise search will become instant and intuitive, paving the way for increased productivity across the enterprise.

In my next post, I will highlight a few search applications that could be worth looking into...

Sunday, November 30, 2014

SharePoint 2013 Improvements

In this post, I will describe a few improved features in SharePoint 2013.

Cross-Site Publishing

SharePoint 2013 has cross-site publishing. In the previous versions of SharePoint, it was not possible to easily share content across sites. Using cross-site publishing, users can separate authoring and publishing into different site collections: authored content goes into an indexable "catalog", and you can then use FAST to index and deliver dynamic content on a loosely coupled front end.

This feature is required for services like personalization, localization, metadata-driven topic pages, etc. An example of its use is a product catalog in an e-commerce environment. It can be used more generally for all dynamic content. Note that cross-site publishing is not available in SharePoint Online.

Here is how it works. First, you designate a list or a library as a "catalog". FAST then indexes that content and makes it available to publishing site collections via a new content search web part (CSWP). There are a few good features for creating and customizing CSWP instances, including some browser-based configuration. Run-time queries should execute faster against the FAST index than against a SharePoint database.

The cross-site publishing feature could significantly improve your content reuse capabilities by enabling you to publish to multiple site collections.

Templates

Creating templates still begins with a master page which is an ASP.NET construction that defines the basic page structure such as headers and footers, navigation, logos, search box, etc. In previous versions, master pages tended to contain a lot of parts by default, and branding a SharePoint publishing site was somewhat tricky.

SharePoint 2013 has a new Design Manager module, which is essentially a WYSIWYG master page/page layout builder. Design Manager is essentially an ASP.NET and JavaScript code generator. You upload HTML and CSS files that you create and preview offline. After you add more components in the UI (for example, specialized web parts), Design Manager generates the associated master page. Page layouts get converted to SharePoint-specific JavaScript that the platform uses to render the dynamic components on the page.

You can generate and propagate a design package to reuse designs across site collections. There are template snippets that enable you to apply layouts within a design package, but they are not reusable across design packages.

This process is more straightforward than in previous versions, but it is still likely to involve a developer.

Contributing Content

SharePoint 2013 enables contributors to add more complex, non-web part elements like embedded code and video that does not have to be based on a specific web part. This feature is called "embed code". Note that if you are using cross-site publishing with its search based delivery, widget behavior may be tricky and could require IT support.

With respect to digital asset management, SharePoint has had the ability to store digital assets. However, once you got past uploading an FLV or PNG file, there was little you could do to leverage it. SharePoint 2013 brings a new video content type, with automatic and manual thumbnailing.

The image renditions capability has also improved. It allows you to contribute a full-fidelity image to a library, and then render a derivative of that image when it is served through a web page.

Other added features include better mobile detection/mobile site development and an improved editing experience.

Metadata and Tagging Services

SharePoint 2013 has solid metadata and tagging services with an improved and simplified term store. However, there is still no versioning, version control or workflow for terms.

A big improvement is that, using FAST, you can leverage metadata in the delivery environment much more readily than in previous versions. You can use metadata-based navigation structures (as opposed to folder hierarchies), and deploy automated category pages and link lists based on how items are tagged.

Saturday, November 15, 2014

Enterprise Content Management and Mobile Devices

With mobile devices becoming increasingly powerful, users want to access their documents while on the move. iPads and other tablets in particular have become very popular. Increasingly, employers allow employees to bring mobile devices of their choice to work.

"Bring Your Own Device" (BYOD) policy became wide spread in organizations and users started expecting and demanding new features that would enable them to work on their documents from mobile devices. Therefore, the necessity to have mobile access to content has greatly increased in recent years.

As with most technology, mobile and cloud applications are driving the next generation of capabilities in ECM tools. The key capabilities in ECM tools are the ability to access documents via mobile devices, ability to sync documents across multiple devices, and the ability to work on documents offline.

Most tools provide a mobile Web-based application that allows users to access documents from a mobile’s Web browser. That is handy when users use a device for which the tool provides no dedicated application.

The capabilities of mobile applications vary across different tools. In some cases, the mobile application is very basic, allowing users to perform only read-only operations. In other cases, users can perform more complex tasks such as creating workflows, editing documents, changing permissions or adding comments.

Solutions and Vendors

Solutions emerged that specialize in cloud based file sharing capabilities (CFS). Dropbox, Google Drive, Box.com, and Syncplicity (acquired by EMC) provide services for cloud-based file sharing, sync, offline work, and some collaboration for enterprises.

There is considerable overlap of services between these CFS vendors and traditional document management (DM) vendors. CFS vendors build better document management capabilities (such as library services), and DM vendors build (or acquire) cloud-based file sharing, sync, and collaboration services. Customers invested in DM tools frequently consider deploying relevant technology for cloud file sharing and sync scenarios. Similarly, many customers want to extend their usage of CFS platforms for basic document management services.

DM vendors actively trying to address these needs include Alfresco (via Alfresco Cloud), EMC, Microsoft (via SkyDrive/Office 365), Nuxeo (via Nuxeo Connect), and OpenText (via Tempo Box). Collaboration/social vendors like Jive, Microsoft, and Salesforce have also entered the enterprise file sharing market. Other large platform vendors include Citrix, which acquired ShareFile. Oracle, IBM, and HP are about to enter this market as well.

Key Features

Number of Devices - The number of devices that the ECM vendor provides mobile applications for is very important. Most tools provide specific native applications for Apple’s iPhone and iPad (based on the iOS operating system) and Android-based phones and tablets. Some also differentiate between the iPhone and iPad and provide separate applications for those two devices. Some provide applications for other devices such as those based on Windows and BlackBerry.

File sync and offline capabilities - Many users use more than one device to get work done. They might use a laptop in the office, a desktop at home, and a tablet and a phone while traveling. They need to access files from all of those devices, and it is important that an ECM tool can synchronize files across different devices.

Users increasingly expect capabilities for advanced file sharing, including cloud and hybrid cloud-based services. Most tools do that by providing a sync app for your desktop/laptop, which then syncs your files from the cloud-based storage to your local machine.

Most tools require users to create a dedicated folder and move files to that dedicated folder, which is then synced. A few tools like Syncplicity allow users to sync from any existing folder on their machine.

A dedicated folder can be better managed and seems to be a cleaner solution. However, it means that users need to move files around which can cause duplication. The other approach of using any folder as a sync folder allows users to keep working on files in their usual location. That is convenient, but if users reach a stage when they have too many folders scattered around on their laptop and other synced machines, they might have some manageability issues.

Some tools allow users to selectively sync. Rather than syncing the entire cloud drive, users can decide which folders to sync. That is useful when users are in a slow speed area or they have other bandwidth-related constraints. In some cases, they can also decide whether they want a one-way sync or a bi-directional sync. Once they have the files synced up and available locally, they typically can work offline as well. When they go online, their changes are synced back to the cloud.
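Selective sync can be pictured as a filter applied before copying: only the folders the user opted into come down from the cloud drive. The sketch below is a simplified, one-way illustration using throwaway directories as stand-ins for the cloud drive and the local machine; real sync clients also track timestamps, detect conflicts, and handle the bi-directional case.

```python
import os
import shutil
import tempfile

def selective_sync(cloud_root, local_root, selected_folders):
    """One-way selective sync: copy only the selected top-level folders
    from the cloud drive to the local machine, skipping everything else."""
    synced = []
    for folder in sorted(os.listdir(cloud_root)):
        if folder not in selected_folders:
            continue  # user chose not to sync this folder
        shutil.copytree(os.path.join(cloud_root, folder),
                        os.path.join(local_root, folder),
                        dirs_exist_ok=True)
        synced.append(folder)
    return synced

# Demo on temporary directories standing in for cloud storage and a laptop.
cloud = tempfile.mkdtemp()
local = tempfile.mkdtemp()
for name in ("Projects", "Archive", "Photos"):
    os.makedirs(os.path.join(cloud, name))

result = selective_sync(cloud, local, {"Projects", "Photos"})
print(result)  # only the selected folders come down
```

Running the same function again is safe because `dirs_exist_ok=True` lets an existing local folder be refreshed in place, which is the behavior users expect from repeated syncs.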

Most tools that provide dedicated mobile applications can also sync files on mobile devices. However, mobile syncing is usually tricky due to the closed nature of mobile device file systems.

While most ECM and DM vendors provide some varying capabilities for mobile access, not all of them can effectively offer file sync across multiple devices.

Your options should be based on your users' requirements. Assess them very carefully before deciding on a suitable solution for your organization.

Friday, October 31, 2014

Success in Enterprise Content Management Implementations

A successful enterprise content management (ECM) implementation requires an ongoing partnership between IT, compliance, and business managers.

Strict top-down initiatives that leave little room for users' requirements result in ECM systems that users don’t want to use.

Similarly, an ad hoc, overly decentralized approach leads to inconsistent policies and procedures, which in turn lead to disorganized, ungoverned, unfindable content. In either extreme, the ECM initiative ends in failure.

Whether your organization uses an agile, waterfall or mixed approach to ECM deployment, ECM leaders must think about program initiation, planning, deployment, and ongoing improvement as a process and not as isolated events. Team composition will change over the course of ECM project planning and roll-out, as different skill sets are needed.

For example, a business analyst is a key member of the team early in the project when developing a business case and projecting the total cost of the project, while the legal department will need to get involved when documenting e-discovery requirements.

But, there is often no clear location in the org chart for fundamental content management responsibilities, and that can contribute to weakened strategy, governance and return on investment (ROI).

Approach to ECM

Successful ECM initiatives balance corporate governance needs with the desire of business units to be efficient and competitive, and to meet cost and revenue targets.

Organizations should determine the balance of centralized versus decentralized decision making authority by the level of industry regulation, jurisdiction, corporate culture and autonomy of business units or field offices.

A central ECM project team of content management, business process, and technology experts should define strategy and objectives and align with the technology vision. Local subject matter experts in business units or regional offices can then be responsible for the execution and translation of essential requirements into localized policies and procedures, along with the business unit’s content management goals.

Business managers can help to measure the current state of productivity, set goals for improvement, and contribute to a business case or a forecast of the total cost of CMS ownership over a number of years. A trainer will be needed during pilot and roll-out to help with change management and system orientation. The legal department should approve updates to retention schedules and disposition policies as practices shift away from classification schemes designed for paper to a more automated, metadata-driven approach.

Project Roles

The following roles are essential for an ECM project:
  • Steering committee is responsible for project accountability and vision. Their role is to define an overall vision for an ECM project and outline processes and procedures to ensure integrity of information.
  • Project manager is responsible for the ECM project management during CMS deployment. The project manager's role is to create project plans and timetables, identify risks and dependencies, liaise with business units, executive sponsors, IT, and other teams.
  • Business analyst is responsible for outlining the desired state of CMS implementation and success metrics. This role is to gather business and technical requirements by engaging with business, technical, and legal/compliance stakeholders. They need to identify the current state of operations and outline the desired future state by adopting a CMS system.
  • Information architect's role is to define and communicate the standards to support the infrastructure, configuration, and development of the ECM application.
  • System administrators' role is to define and implement an approach to on-premises, cloud, or hybrid infrastructure to support a CMS.
  • CMS administrator is responsible for the operation of the CMS. This role is to define and implement processes and procedures to maintain the operation of the CMS.
  • User experience specialist's role is to define standards for usability and consistency across devices and applications, and create reusable design and templates to drive users' adoption.
  • Records and information managers' role is to define and deploy taxonomies, identify metadata requirements, and to develop retention, disposition, and preservation schedules.
Core competencies will be supplemented by developers, trainers, quality assurance, documentation, and other specialists at various phases of the ECM deployment project. It is important to provide leadership during the deployment of a CMS. The team should bring technical knowledge about repositories, standards and service-oriented architectures, combined with business process acumen and awareness of corporate compliance obligations.

Information architects will be important participants during both the planning and deployment phases of the project. Communication and process expertise are essential for ongoing success. IT, information architect, and information managers should learn the vocabulary, pain points, and needs of business units, and help translate users' requirements to technical solutions so that the deployed CMS could help to improve current processes.

Compliance subject matter experts should communicate the implications and rationale of any change in process or obligations to users responsible for creating or capturing content.

Project plans, budgets and timetables should include time for coaching, communication, and both formal and informal training. Even simple file sharing technology will require some investment in training and orientation when processes or policies are changed.

Strategic Asset

ECM is a long-term investment, not a one-time technology installation project. Enterprises can often realize short-term ROI by automating manual processes or high-risk noncompliance issues, but the real payoff comes when an enterprise treats content as a strategic asset.

A strong ECM project team demonstrates leadership, communication skills and openness to iteration, setting the foundation for long-term value from the deployment efforts.

For example, a company aligned its deployment and continuous improvement work by adopting more agile approaches to project delivery, as well as a willingness to adopt business metrics (faster time to market for new products) instead of technology management metrics (number of documents created per week). That change allowed the company to better serve the document sharing and collaboration needs of its sales teams in the field.

The project team must engage directly with the user community to create systems that make work processes better. It is a good idea to include hands-on participation and validation with a pilot group.

Recommended Practices

Follow best practices from completed ECM projects. Review processes, applications, forms, and capture screens to identify areas of friction when people capture or share content. User experience professionals have design and testing experience, and they need to be included in the ECM deployment team.

User participation is valuable throughout the ECM deployment project. Direct input on process bottlenecks, tool usability and real-world challenges helps prioritize requirements, select technologies and create meaningful training materials.

Senior managers who participate on a steering committee, or are stakeholders in an information governance strategy, should allow their teams to allocate adequate time for participation. That might mean attending focus groups, holding interviews, attending demos and training, or experimenting with new tools.

Be Proactive

A sustainable and successful ECM initiative will be responsive to the changing behavior of customers, partners and prospects, changing needs of users, and corporate and business unit objectives. Stay current with ECM and industry trends. ECM project team members should keep one eye on the future and be open to learning about industry best practices.

Businesses will continue to adopt mobile, cloud and social technologies for customer and employee communication. Anticipate new forms of digital content and incorporate them into the ECM program strategy proactively, not reactively.

Proactively push vendors for commitments and road maps to accommodate those emerging needs. Stay alert to emerging vendors or alternative approaches if the needs of business stakeholders are shifting faster than current ECM technology. Aim for breadth as well as depth of knowledge, and encourage team members to explore areas adjacent to ECM to acquire related knowledge and think more holistically.

Saturday, September 6, 2014

Managed Metadata in SharePoint - Part Two

In part one of this post, I described using metadata in SharePoint. In this part two, I will describe metadata management.

Managed metadata makes it easier for Term Store Administrators to maintain and adapt your metadata as business needs evolve. You can update a term set easily. And, new or updated terms automatically become available when you associate a Managed Metadata column with that term set. For example, if you merge multiple terms into one term, content that is tagged with these terms is automatically updated to reflect this change. You can specify multiple synonyms (or labels) for individual terms. If your site is multilingual, you can also specify multilingual labels for individual terms.
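The merge-and-synonym behavior described above can be pictured as a small lookup structure: synonym labels resolve to a canonical term, and merges are followed until the surviving term is reached. The toy sketch below is not SharePoint's actual object model; it simply illustrates why content tagged with a merged term automatically reflects the change without retagging.

```python
class ToyTermStore:
    """Minimal term-store sketch: terms with synonym labels, plus merging.
    Tags resolve through synonyms and merges to the current surviving term."""

    def __init__(self):
        self.synonyms = {}  # label -> canonical term
        self.merged = {}    # old term -> surviving term

    def add_term(self, term, labels=()):
        self.synonyms[term] = term
        for label in labels:
            self.synonyms[label] = term

    def merge(self, old, survivor):
        """Merge `old` into `survivor`; existing tags keep working."""
        self.merged[old] = survivor

    def resolve(self, label):
        term = self.synonyms.get(label, label)
        while term in self.merged:  # follow merges to the current term
            term = self.merged[term]
        return term

store = ToyTermStore()
store.add_term("Human Resources", labels=["HR", "Personnel"])
store.add_term("People Operations")
store.merge("Human Resources", "People Operations")
print(store.resolve("HR"))  # a tag via the old label reflects the merged term
```

Because tags are resolved through the store at read time rather than copied into each item, a merge or a new synonym takes effect everywhere at once, which is exactly the maintainability benefit managed metadata provides.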

Managing metadata

Managing metadata effectively requires careful thought and planning. Think about the kind of information that you want to use to manage the content of lists and libraries, and think about the way that the information is used in the organization. You can create term sets of metadata terms for many different kinds of information.

For example, you might have a single content type for a document. Each document can have metadata that identifies many of the relevant facts about it, such as these examples:
  • Document purpose - is it a sales proposal? An engineering specification? A Human Resources procedure?
  • Document author, and names of people who changed it
  • Date of creation, date of approval, date of most recent modification
  • Department responsible for any budgetary implications of the document
  • Audience
Activities that are involved with managing metadata:
  • Planning and configuring
  • Managing terms, term sets, and groups
  • Specifying properties for metadata
Planning and configuring managed metadata

Your organization may want to do careful planning before you start to use managed metadata. The amount of planning that you must do depends on how formal your taxonomy is. It also depends on how much control you want to impose on metadata.

If you want to let users help develop your taxonomy, then you can just have users add keywords to items, and then organize these into term sets as necessary.

If your organization wants to use managed term sets to implement formal taxonomies, then it is important to involve key stakeholders in planning and development. After the key stakeholders in the organization agree upon the required term sets, you can use the Term Store Management Tool to import or create your term sets. You can also use the tool to manage the term sets as users start to work with the metadata. If your web application is configured correctly, and you have the appropriate permissions, you can go to the Term Store Management Tool by following these steps:

1. Select Settings and then choose Site Settings.
2. Select Term store management under Site Administration.

Managing terms, term sets, and groups

The Term Store Management Tool provides a tree control that you can use to perform most tasks. Your user role for this tool determines the tasks that you can perform. To work in the Term Store Management Tool, you must be a Farm Administrator or a Term Store Administrator. Or, you can be a designated Group Manager or Contributor for term sets.

To take actions on an item in the hierarchy, follow these steps:

1. Point to the name of the Managed Metadata Service application, group, term set, or term that you want to change, and then click the arrow that appears.
2. Select the actions that you want from the menu.

For example, if you are a Term Store Administrator or a Group Manager you can create, import, or delete term sets in a group. Term set contributors can create new term sets.

Properties for terms and term sets

At each level of the hierarchy, you can configure specific properties for a group, term set, or term by using the properties pane in the Term Store Management Tool. For example, if you are configuring a term set, you can specify information such as Name, Description, Owner, Contact, and Stakeholders in the pane available on the General tab. You can also specify whether you want a term set to be open or closed to new submissions from users. Or, you can choose the Intended Use tab and specify whether the term set should be available for tagging or site navigation.