Marketing teams and corporate strategists are constantly on the lookout for reports on online user behavior, both to facilitate effective decision making and to enhance the customer experience. Web properties are the best way to reach customers and to collect this information first hand. There is a myriad of open source and commercial tools for such web analysis. Some of these products are SaaS based, whereas others can be implemented in-house. A few of the products available in the market are –
- Adobe SiteCatalyst
- Google Analytics – Standard and Premium
- IBM Coremetrics
- Open Web Analytics
- Digital Analytix
When choosing one of these products, enterprises should keep the following in mind as part of their selection process –
- Data capture. The collection capabilities of the tool are critical because they determine how easy it is to implement. These tools, most of which collect data through the browser, must capture details that enable BI: new visitors, unique visitors, visits by browser, OS, and network, visits by geographic location, average visit duration, visits from mobile devices, pages per visit, rich and emerging media, and so on. The tools must accommodate custom variables so that information can be tracked in a customized way. Additionally, they must support APIs that can be used to extract meaningful reports from the raw collected data to facilitate better decision making.
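To make the data-capture point concrete, here is a minimal sketch of turning raw collected events into the kind of report an extraction API might serve. The event fields and function names are illustrative assumptions, not any specific vendor's API:

```python
from collections import Counter

# Hypothetical raw events, as a browser-based collection mechanism might
# record them (field names are assumptions for illustration only).
events = [
    {"visitor_id": "v1", "browser": "Chrome", "country": "US"},
    {"visitor_id": "v2", "browser": "Firefox", "country": "IN"},
    {"visitor_id": "v1", "browser": "Chrome", "country": "US"},
    {"visitor_id": "v3", "browser": "Chrome", "country": "DE"},
]

def visits_by_dimension(events, dimension):
    """Aggregate raw events into a simple visits-by-<dimension> report."""
    return Counter(e[dimension] for e in events)

def unique_visitors(events):
    """Count distinct visitor ids across all events."""
    return len({e["visitor_id"] for e in events})

print(visits_by_dimension(events, "browser"))  # Counter({'Chrome': 3, 'Firefox': 1})
print(unique_visitors(events))                 # 3
```

The same aggregation works for any captured dimension (country, OS, device type), which is why flexible raw-data APIs matter more than any fixed report catalog.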
- Reporting. What types of reports can be generated? How can they be viewed effectively? Can administrators create custom reports? Points to consider include the dashboard view, the kind of report categorization allowed, whether a time duration can be selected, and whether the tool supports conversion reports, custom reports, bounce rate reports, engagement reports on page depth, and reports on the technology visitors used to access the property. Knowing these features helps the enterprise estimate how much customization will be needed after the tool is selected.
- Data freshness. This refers to how soon the collected data is reflected in the reporting tool. Some tools lag by up to 24 hours, some by 2 hours, and some report in real time. Knowing how old or new the data is helps in tuning the results and in knowing when changes will take effect.
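The freshness guarantee can be checked operationally by comparing the latest reported timestamp against the tool's stated lag. A small sketch, with all times and lag windows assumed for illustration:

```python
from datetime import datetime, timedelta, timezone

def is_fresh_enough(last_report_time, now, max_lag_hours):
    """True if the latest reported data is within the tool's stated lag window."""
    return (now - last_report_time) <= timedelta(hours=max_lag_hours)

# Illustrative timestamps: data last reported 3 hours ago.
now = datetime(2024, 1, 2, 12, 0, tzinfo=timezone.utc)
last_report = datetime(2024, 1, 2, 9, 0, tzinfo=timezone.utc)

print(is_fresh_enough(last_report, now, 24))  # True  -- fine for a 24-hour tool
print(is_fresh_enough(last_report, now, 2))   # False -- stale for a 2-hour tool
```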
- Commercials. Budgeting is another important consideration during tool selection. Some tools are free, whereas others have commercial implications. A few offer usage-based pricing, while others charge a transaction-based fee.
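Comparing the two pricing models is straightforward arithmetic once the property's traffic profile is known. The rates below are made-up numbers for illustration; real vendor pricing varies widely:

```python
def usage_based_cost(monthly_hits, rate_per_million):
    """Cost under usage-based pricing (charged per million hits)."""
    return (monthly_hits / 1_000_000) * rate_per_million

def transaction_based_cost(monthly_transactions, fee_per_transaction):
    """Cost under transaction-based pricing (charged per tracked transaction)."""
    return monthly_transactions * fee_per_transaction

# Hypothetical traffic profile and rates -- assumptions, not vendor quotes.
hits, txns = 50_000_000, 20_000
print(usage_based_cost(hits, 100.0))       # 5000.0
print(transaction_based_cost(txns, 0.30))  # 6000.0
```

A high-traffic, low-conversion property and a low-traffic, high-conversion property can land on opposite sides of this comparison, so the model should be evaluated against the property's own numbers.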
- Data for enrichment. Because some tools do not allow personally identifiable information (PII) to be transmitted for collection, a lot of information must be decoded to extract exact meaning from the reports. Web properties normally run on a CMS or portal platform that has at least a minimal user-centric data collection mechanism in place, covering details such as a user's last login time, how long the user stayed on the property, data around the registration workflow, and data indicating how active users are on the property. Such information can be derived from the portal or CMS platform itself, depending on where it is collected.
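One common enrichment pattern is joining PII-free analytics rows with the user-centric data held by the CMS or portal, keyed on an anonymized visitor id. A minimal sketch, with all ids and fields assumed for illustration:

```python
# Hypothetical PII-free rows from the analytics tool.
analytics_rows = [
    {"visitor_id": "a1b2", "visits": 5},
    {"visitor_id": "c3d4", "visits": 2},
]

# Hypothetical user-centric data held by the CMS/portal platform.
cms_profiles = {
    "a1b2": {"last_login": "2024-01-02", "registered": True},
    "c3d4": {"last_login": "2023-12-20", "registered": False},
}

def enrich(rows, profiles):
    """Merge each analytics row with its CMS profile, if one exists."""
    return [{**row, **profiles.get(row["visitor_id"], {})} for row in rows]

for row in enrich(analytics_rows, cms_profiles):
    print(row)
```

The join happens entirely on the enterprise's side, so no PII ever has to be sent to the analytics vendor.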
When implementing these tools, the tracking code needs to be injected into the presentation layer, so any change requires going through the development team's release process. This coupling can be removed by introducing an injection tool that manages the tracking code separately from the application code.
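To illustrate why the presentation layer is involved at all, here is a server-side sketch of a page template with the tracking snippet substituted in at render time. The template, snippet URL, and variable names are assumptions for illustration:

```python
from string import Template

# Hypothetical page template: the tracking snippet lives in the
# presentation layer, so changing it normally means a development release.
page_template = Template("""<html><body>
<h1>$title</h1>
$tracking_snippet
</body></html>""")

# Keeping the snippet in configuration rather than hard-coded in the page
# is a first step toward isolating tag changes from application releases.
tracking_snippet = '<script src="https://analytics.example.com/tag.js"></script>'

html = page_template.substitute(title="Home", tracking_snippet=tracking_snippet)
print(html)
```

If the snippet value is loaded from configuration or an external service instead of being edited in the template itself, marketing can change the tag without a code deployment.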
Watch out for my next blog post, where I will discuss how to isolate these two processes.